Sample records for process capability information

  1. Information Processing Capabilities in Performers Differing in Levels of Motor Skill

    DTIC Science & Technology

    1979-01-01

Craik, F. I. M., & Lockhart, R. S. Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 1972, 11, 671-684…ARI Technical Report: Information Processing Capabilities in Performers Differing in Levels of Motor Skill, by Robert N. Singer…INFORMATION PROCESSING CAPABILITIES IN PERFORMERS DIFFERING IN LEVELS OF MOTOR SKILL. INTRODUCTION: In the human behaving systems model developed by Singer, Gerson, and…

  2. Information processing of earth resources data

    NASA Technical Reports Server (NTRS)

    Zobrist, A. L.; Bryant, N. A.

    1982-01-01

    Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.

  3. Bibliographic Post-Processing with the TIS Intelligent Gateway: Analytical and Communication Capabilities.

    ERIC Educational Resources Information Center

    Burton, Hilary D.

    TIS (Technology Information System) is an intelligent gateway system capable of performing quantitative evaluation and analysis of bibliographic citations using a set of Process functions. Originally developed by Lawrence Livermore National Laboratory (LLNL) to analyze information retrieved from three major federal databases, DOE/RECON,…

  4. Toward information management in corporations (4)

    NASA Astrophysics Data System (ADS)

    Yamamoto, Takeo

The roles of personal computers (PCs) and workstations (WSs) in developing corporate information systems are discussed. The history and state of the art for PCs and WSs are reviewed. Checkpoints for introducing PCs and WSs are Japanese word-processing capabilities, multimedia capabilities, and network capabilities.

  5. A biopolymer transistor: electrical amplification by microtubules.

    PubMed

    Priel, Avner; Ramos, Arnolt J; Tuszynski, Jack A; Cantiello, Horacio F

    2006-06-15

Microtubules (MTs) are important cytoskeletal structures engaged in a number of specific cellular activities, including vesicular traffic, cell cyto-architecture and motility, cell division, and information processing within neuronal processes. MTs have also been implicated in higher neuronal functions, including memory and the emergence of "consciousness". How MTs handle and process electrical information, however, is heretofore unknown. Here we show new electrodynamic properties of MTs. Isolated, taxol-stabilized MTs behave as biomolecular transistors capable of amplifying electrical information. Electrical amplification by MTs can lead to the enhancement of dynamic information and processivity in neurons; an MT can thus be conceptualized as an "ionic-based" transistor, which may affect, among other known functions, neuronal computational capabilities.

  6. Information Technology. DOD Needs to Strengthen Management of Its Statutorily Mandated Software and System Process Improvement Efforts

    DTIC Science & Technology

    2009-09-01

ASD(NII)/CIO: Assistant Secretary of Defense for Networks and Information Integration/Chief Information Officer; CMMI: Capability Maturity Model…a Web-based portal to share knowledge about software process-related methodologies, such as the SEI's Capability Maturity Model Integration (CMMI), the SEI's IDEAL model, and Lean Six Sigma. For example, the portal features content areas such as software acquisition management, the SEI CMMI…

  7. Vortex information display system program description manual. [data acquisition from laser Doppler velocimeters and real time operation]

    NASA Technical Reports Server (NTRS)

    Conway, R.; Matuck, G. N.; Roe, J. M.; Taylor, J.; Turner, A.

    1975-01-01

    A vortex information display system is described which provides flexible control through system-user interaction for collecting wing-tip-trailing vortex data, processing this data in real time, displaying the processed data, storing raw data on magnetic tape, and post processing raw data. The data is received from two asynchronous laser Doppler velocimeters (LDV's) and includes position, velocity, and intensity information. The raw data is written onto magnetic tape for permanent storage and is also processed in real time to locate vortices and plot their positions as a function of time. The interactive capability enables the user to make real time adjustments in processing data and provides a better definition of vortex behavior. Displaying the vortex information in real time produces a feedback capability to the LDV system operator allowing adjustments to be made in the collection of raw data. Both raw data and processing can be continually upgraded during flyby testing to improve vortex behavior studies. The post-analysis capability permits the analyst to perform in-depth studies of test data and to modify vortex behavior models to improve transport predictions.

  8. KSC Technical Capabilities Website

    NASA Technical Reports Server (NTRS)

    Nufer, Brian; Bursian, Henry; Brown, Laurette L.

    2010-01-01

    This document is the website pages that review the technical capabilities that the Kennedy Space Center (KSC) has for partnership opportunities. The purpose of this information is to make prospective customers aware of the capabilities and provide an opportunity to form relationships with the experts at KSC. The technical capabilities fall into these areas: (1) Ground Operations and Processing Services, (2) Design and Analysis Solutions, (3) Command and Control Systems / Services, (4) Materials and Processes, (5) Research and Technology Development and (6) Laboratories, Shops and Test Facilities.

  9. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  10. Development of a multi-disciplinary ERTS user program in the state of Ohio. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Baldridge, P. E.; Weber, C.; Schaal, G.; Wilhelm, C.; Wurelic, G. E.; Stephan, J. G.; Ebbert, T. F.; Smail, H. E.; Mckeon, J.; Schmidt, N. (Principal Investigator)

    1977-01-01

The author has identified the following significant results. A current uniform land inventory was derived, in part, from LANDSAT data. The State has the ability to convert processed land information from LANDSAT to the Ohio Capability Analysis Program (OCAP). The OCAP is a computer information and mapping system comprised of various programs used to digitally store, analyze, and display land capability information. More accurate processing of LANDSAT data could lead to reasonably accurate, useful land allocation models. It was feasible to use LANDSAT data to investigate minerals, pollution, land use, and resource inventory.

  11. Electrochemical Probing through a Redox Capacitor To Acquire Chemical Information on Biothiols

    PubMed Central

    2016-01-01

    The acquisition of chemical information is a critical need for medical diagnostics, food/environmental monitoring, and national security. Here, we report an electrochemical information processing approach that integrates (i) complex electrical inputs/outputs, (ii) mediators to transduce the electrical I/O into redox signals that can actively probe the chemical environment, and (iii) a redox capacitor that manipulates signals for information extraction. We demonstrate the capabilities of this chemical information processing strategy using biothiols because of the emerging importance of these molecules in medicine and because their distinct chemical properties allow evaluation of hypothesis-driven information probing. We show that input sequences can be tailored to probe for chemical information both qualitatively (step inputs probe for thiol-specific signatures) and quantitatively. Specifically, we observed picomolar limits of detection and linear responses to concentrations over 5 orders of magnitude (1 pM–0.1 μM). This approach allows the capabilities of signal processing to be extended for rapid, robust, and on-site analysis of chemical information. PMID:27385047

  12. Electrochemical Probing through a Redox Capacitor To Acquire Chemical Information on Biothiols.

    PubMed

    Liu, Zhengchun; Liu, Yi; Kim, Eunkyoung; Bentley, William E; Payne, Gregory F

    2016-07-19

    The acquisition of chemical information is a critical need for medical diagnostics, food/environmental monitoring, and national security. Here, we report an electrochemical information processing approach that integrates (i) complex electrical inputs/outputs, (ii) mediators to transduce the electrical I/O into redox signals that can actively probe the chemical environment, and (iii) a redox capacitor that manipulates signals for information extraction. We demonstrate the capabilities of this chemical information processing strategy using biothiols because of the emerging importance of these molecules in medicine and because their distinct chemical properties allow evaluation of hypothesis-driven information probing. We show that input sequences can be tailored to probe for chemical information both qualitatively (step inputs probe for thiol-specific signatures) and quantitatively. Specifically, we observed picomolar limits of detection and linear responses to concentrations over 5 orders of magnitude (1 pM-0.1 μM). This approach allows the capabilities of signal processing to be extended for rapid, robust, and on-site analysis of chemical information.
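    The linear response over five orders of magnitude reported in both versions of this abstract can be pictured as a straight-line fit in log-concentration space. Below is a minimal sketch of such a calibration check; the data are synthetic and the linear model is an assumption for illustration, not the authors' measurements.

    ```python
    # Hedged sketch: fit a calibration line over 1 pM to 0.1 uM
    # (five decades), assuming signal linear in log10(concentration).
    import numpy as np

    conc = np.logspace(-12, -7, 6)             # 1 pM ... 0.1 uM, mol/L (synthetic)
    signal = 2.0 * np.log10(conc) + 30.0       # hypothetical linear response
    slope, intercept = np.polyfit(np.log10(conc), signal, 1)
    print(f"slope={slope:.2f}, intercept={intercept:.2f}")  # recovers 2.00, 30.00
    ```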

  13. Conserving analyst attention units: use of multi-agent software and CEP methods to assist information analysis

    NASA Astrophysics Data System (ADS)

    Rimland, Jeffrey; McNeese, Michael; Hall, David

    2013-05-01

Although the capability of computer-based artificial intelligence techniques for decision-making and situational awareness has seen notable improvement over the last several decades, the current state-of-the-art still falls short of creating computer systems capable of autonomously making complex decisions and judgments in many domains where data is nuanced and accountability is high. However, there is a great deal of potential for hybrid systems in which software applications augment human capabilities by focusing the analyst's attention on relevant information elements based on both a priori knowledge of the analyst's goals and the processing/correlation of a series of data streams too numerous and heterogeneous for the analyst to digest without assistance. Researchers at Penn State University are exploring ways in which an information framework influenced by Klein's Recognition-Primed Decision (RPD) model, Endsley's model of situational awareness, and the Joint Directors of Laboratories (JDL) data fusion process model can be implemented through a novel combination of Complex Event Processing (CEP) and Multi-Agent Software (MAS). Though originally designed for stock market and financial applications, the high-performance, data-driven nature of CEP techniques provides a natural complement to the proven capabilities of MAS systems for modeling naturalistic decision-making, performing process adjudication, and optimizing networked processing and cognition via the use of "mobile agents." This paper addresses the challenges and opportunities of such a framework for augmenting human observational capability as well as enabling the ability to perform collaborative context-aware reasoning in both human teams and hybrid human/software agent teams.
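    The core CEP idea this abstract relies on is a continuously evaluated pattern over an event stream that surfaces only items worth an analyst's attention. A minimal sketch follows; the event fields and threshold rule are illustrative, not the authors' actual adjudication logic.

    ```python
    # Hedged sketch: filter a raw event stream down to analyst alerts.
    events = [
        {"source": "sensor_a", "type": "contact", "score": 0.20},
        {"source": "sensor_b", "type": "contact", "score": 0.95},
        {"source": "sensor_a", "type": "status",  "score": 0.10},
    ]

    def correlate(stream, min_score=0.9):
        """Yield only high-salience contact events instead of the raw stream."""
        for event in stream:
            if event["type"] == "contact" and event["score"] >= min_score:
                yield event

    for alert in correlate(events):
        print(f"analyst alert: {alert['source']} score={alert['score']}")
    ```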

  14. Oak Ridge Computerized Hierarchical Information System (ORCHIS) status report, July 1973

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brooks, A.A.

    1974-01-01

    This report summarizes the concepts, software, and contents of the Oak Ridge Computerized Hierarchical Information System. This data analysis and text processing system was developed as an integrated, comprehensive information processing capability to meet the needs of an on-going multidisciplinary research and development organization. (auth)

  15. Improvement of Organizational Performance and Instructional Design: An Analogy Based on General Principles of Natural Information Processing Systems

    ERIC Educational Resources Information Center

    Darabi, Aubteen; Kalyuga, Slava

    2012-01-01

    The process of improving organizational performance through designing systemic interventions has remarkable similarities to designing instruction for improving learners' performance. Both processes deal with subjects (learners and organizations correspondingly) with certain capabilities that are exposed to novel information designed for producing…

  16. Neural network for processing both spatial and temporal data with time based back-propagation

    NASA Technical Reports Server (NTRS)

    Villarreal, James A. (Inventor); Shelton, Robert O. (Inventor)

    1993-01-01

Neural networks are computing systems modeled after the paradigm of the biological brain. For years, researchers using various forms of neural networks have attempted to model the brain's information processing and decision-making capabilities. Neural network algorithms have impressively demonstrated the capability of modeling spatial information. On the other hand, the application of parallel distributed models to the processing of temporal data has been severely restricted. The invention introduces a novel technique which adds the dimension of time to the well-known back-propagation neural network algorithm. In the space-time neural network disclosed herein, the synaptic weights between two artificial neurons (processing elements) are replaced with an adaptable-adjustable filter. Instead of a single synaptic weight, the invention provides a plurality of weights representing not only association, but also temporal dependencies. In this case, the synaptic weights are the coefficients to the adaptable digital filters. Novelty is believed to lie in the disclosure of a processing element and a network of the processing elements which are capable of processing temporal as well as spatial data.
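    The core idea of the disclosure, replacing each scalar synaptic weight with a short adaptable digital filter, can be sketched in a few lines. The tap count and sigmoid activation below are assumptions for illustration, not details from the patent.

    ```python
    # Hedged sketch: a processing element whose synapses are FIR filters,
    # so the output depends on recent input history, not just the current input.
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class FilterSynapseNeuron:
        def __init__(self, n_inputs, n_taps, rng):
            # One small FIR filter (n_taps coefficients) per input line,
            # instead of a single weight per input.
            self.coeffs = rng.normal(scale=0.1, size=(n_inputs, n_taps))
            self.history = np.zeros((n_inputs, n_taps))  # tapped delay lines

        def step(self, x):
            # Shift each delay line right and insert the newest sample.
            self.history = np.roll(self.history, 1, axis=1)
            self.history[:, 0] = x
            # Each synapse output is the dot product of its taps and history.
            return sigmoid(np.sum(self.coeffs * self.history))

    rng = np.random.default_rng(1)
    neuron = FilterSynapseNeuron(n_inputs=3, n_taps=4, rng=rng)
    for t in range(10):                     # feed a short input sequence
        y = neuron.step(rng.normal(size=3))
    print(f"output after 10 time steps: {y:.3f}")
    ```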

  17. Informing future NRT satellite distribution capabilities: Lessons learned from NASA's Land Atmosphere NRT capability for EOS (LANCE)

    NASA Astrophysics Data System (ADS)

    Davies, D.; Murphy, K. J.; Michael, K.

    2013-12-01

NASA's Land Atmosphere Near real-time Capability for EOS (Earth Observing System) (LANCE) provides data and imagery from the Terra, Aqua, and Aura satellites in less than 3 hours from satellite observation, to meet the needs of the near real-time (NRT) applications community. This article describes the architecture of LANCE and outlines the modifications made to achieve the 3-hour latency requirement, with a view to informing future NRT satellite distribution capabilities. It also describes how latency is determined. LANCE is a distributed system that builds on the existing EOS Data and Information System (EOSDIS) capabilities. To achieve the NRT latency requirement, many components of the EOS satellite operations, ground, and science processing systems have been made more efficient without compromising the quality of science data processing. The EOS Data and Operations System (EDOS) processes the NRT stream with higher priority than the science data stream in order to minimize latency. In addition to expediting transfer times, the key difference between the NRT Level 0 products and those for standard science processing is the data used to determine the precise location and tilt of the satellite. Standard products use definitive geo-location (attitude and ephemeris) data provided daily, whereas NRT products use predicted geo-location provided by the instrument Global Positioning System (GPS) or an approximation of navigational data (depending on platform). Level 0 data are processed into higher-level products at designated Science Investigator-led Processing Systems (SIPS). The processes used by LANCE have been streamlined and adapted to work with datasets as soon as they are downlinked from satellites or transmitted from ground stations. Level 2 products that require ancillary data have modified production rules that relax the requirements for ancillary data, thus reducing processing times. Looking to the future, experience gained from LANCE can provide valuable lessons on satellite and ground system architectures and on how the delivery of NRT products from other NASA missions might be achieved.

  18. Innovative railroad information displays : executive summary

    DOT National Transportation Integrated Search

    1998-01-01

The objectives of this study were to explore the potential of advanced digital technology, novel concepts of information management, geographic information databases, and display capabilities in order to enhance planning and decision-making process…

  19. The Contribution of Cognitive Engineering to the Effective Design and Use of Information Systems.

    ERIC Educational Resources Information Center

    Garg-Janardan, Chaya; Salvendy, Gavriel

    1986-01-01

    Examines the role of human information processing and decision-making capabilities and limitations in the design of effective human-computer interfaces. Several cognitive engineering principles that should guide the design process are outlined. (48 references) (Author/CLB)

  20. Social Information Processing and Emotional Understanding in Children with LD

    ERIC Educational Resources Information Center

    Bauminger, Nirit; Edelsztein, Hany Schorr; Morash, Janice

    2005-01-01

    The present study aimed to comprehensively examine social cognition processes in children with and without learning disabilities (LD), focusing on social information processing (SIP) and complex emotional understanding capabilities such as understanding complex, mixed, and hidden emotions. Participants were 50 children with LD (age range 9.4-12.7;…

  1. The Effects of Transcranial Direct Current Stimulation (tDCS) on Multitasking Throughput Capacity

    PubMed Central

    Nelson, Justin; McKinley, Richard A.; Phillips, Chandler; McIntire, Lindsey; Goodyear, Chuck; Kreiner, Aerial; Monforton, Lanie

    2016-01-01

    Background: Multitasking has become an integral attribute associated with military operations within the past several decades. As the amount of information that needs to be processed during these high level multitasking environments exceeds the human operators' capabilities, the information throughput capacity reaches an asymptotic limit. At this point, the human operator can no longer effectively process and respond to the incoming information resulting in a plateau or decline in performance. The objective of the study was to evaluate the efficacy of a non-invasive brain stimulation technique known as transcranial direct current stimulation (tDCS) applied to a scalp location over the left dorsolateral prefrontal cortex (lDLPFC) to improve information processing capabilities during a multitasking environment. Methods: The study consisted of 20 participants from Wright-Patterson Air Force Base (16 male and 4 female) with an average age of 31.1 (SD = 4.5). Participants were randomly assigned into two groups, each consisting of eight males and two females. Group one received 2 mA of anodal tDCS and group two received sham tDCS over the lDLPFC on their testing day. Results: The findings indicate that anodal tDCS significantly improves the participants' information processing capability resulting in improved performance compared to sham tDCS. For example, the multitasking throughput capacity for the sham tDCS group plateaued near 1.0 bits/s at the higher baud input (2.0 bits/s) whereas the anodal tDCS group plateaued near 1.3 bits/s. Conclusion: The findings provided new evidence that tDCS has the ability to augment and enhance multitasking capability in a human operator. Future research should be conducted to determine the longevity of the enhancement of transcranial direct current stimulation on multitasking performance, which has yet to be accomplished. PMID:27965553

  2. The Effects of Transcranial Direct Current Stimulation (tDCS) on Multitasking Throughput Capacity.

    PubMed

    Nelson, Justin; McKinley, Richard A; Phillips, Chandler; McIntire, Lindsey; Goodyear, Chuck; Kreiner, Aerial; Monforton, Lanie

    2016-01-01

    Background: Multitasking has become an integral attribute associated with military operations within the past several decades. As the amount of information that needs to be processed during these high level multitasking environments exceeds the human operators' capabilities, the information throughput capacity reaches an asymptotic limit. At this point, the human operator can no longer effectively process and respond to the incoming information resulting in a plateau or decline in performance. The objective of the study was to evaluate the efficacy of a non-invasive brain stimulation technique known as transcranial direct current stimulation (tDCS) applied to a scalp location over the left dorsolateral prefrontal cortex (lDLPFC) to improve information processing capabilities during a multitasking environment. Methods: The study consisted of 20 participants from Wright-Patterson Air Force Base (16 male and 4 female) with an average age of 31.1 (SD = 4.5). Participants were randomly assigned into two groups, each consisting of eight males and two females. Group one received 2 mA of anodal tDCS and group two received sham tDCS over the lDLPFC on their testing day. Results: The findings indicate that anodal tDCS significantly improves the participants' information processing capability resulting in improved performance compared to sham tDCS. For example, the multitasking throughput capacity for the sham tDCS group plateaued near 1.0 bits/s at the higher baud input (2.0 bits/s) whereas the anodal tDCS group plateaued near 1.3 bits/s. Conclusion: The findings provided new evidence that tDCS has the ability to augment and enhance multitasking capability in a human operator. Future research should be conducted to determine the longevity of the enhancement of transcranial direct current stimulation on multitasking performance, which has yet to be accomplished.
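    The throughput measure reported in both versions of this abstract is information correctly processed per unit time, in bits/s. Below is a minimal sketch of one plausible scoring rule, input rate times response accuracy; this is an assumption for illustration, and the authors' exact metric may differ.

    ```python
    # Hedged sketch: throughput as the fraction of presented information
    # that is correctly processed, scaled by the input rate.
    def throughput_bits_per_s(input_rate_bits_s, n_presented, n_correct):
        """Information successfully processed per second (bits/s)."""
        return input_rate_bits_s * (n_correct / n_presented)

    # At the 2.0 bits/s input rate, answering half the items correctly
    # yields the ~1.0 bits/s plateau seen in the sham group.
    print(throughput_bits_per_s(2.0, n_presented=120, n_correct=60))  # -> 1.0
    ```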

  3. Geographic Information System (GIS) capabilities in traffic accident information management: a qualitative approach.

    PubMed

    Ahmadi, Maryam; Valinejadi, Ali; Goodarzi, Afshin; Safari, Ameneh; Hemmat, Morteza; Majdabadi, Hesamedin Askari; Mohammadi, Ali

    2017-06-01

Traffic accidents are one of the more important national and international issues, and their consequences are significant at the political, economic, and social levels in a country. Management of traffic accident information requires information systems capable of analyzing and accessing spatial and descriptive data. The aim of this study was to determine the capabilities of a Geographic Information System (GIS) in management of traffic accident information. This qualitative cross-sectional study was performed in 2016. In the first step, GIS capabilities were identified via literature retrieved from the Internet and based on the included criteria. Review of the literature was performed until data saturation was reached; a form was used to extract the capabilities. In the second step, the study population comprised hospital managers, police, emergency personnel, statisticians, and IT experts in trauma, emergency, and police centers. Sampling was purposive. Data was collected using a questionnaire based on the first step data; validity and reliability were determined by content validity and a Cronbach's alpha of 75%. Data was analyzed using the decision Delphi technique. GIS capabilities were identified in ten categories and 64 sub-categories. Importing and processing spatial and descriptive data, and analyzing those data, were the most important capabilities of GIS in traffic accident information management. Storing and retrieving descriptive and spatial data; providing statistical analysis in table, chart, and zoning format; managing ill-structured issues; determining the cost effectiveness of decisions; and prioritizing their implementation were the most important capabilities of GIS that can be efficient in the management of traffic accident information.

  4. Decision Analysis Methods Used to Make Appropriate Investments in Human Exploration Capabilities and Technologies

    NASA Technical Reports Server (NTRS)

    Williams-Byrd, Julie; Arney, Dale C.; Hay, Jason; Reeves, John D.; Craig, Douglas

    2016-01-01

NASA is transforming human spaceflight. The Agency is shifting from an exploration-based program with human activities in low Earth orbit (LEO) and targeted robotic missions in deep space to a more sustainable and integrated pioneering approach. Through pioneering, NASA seeks to address national goals to develop the capacity for people to work, learn, operate, live, and thrive safely beyond Earth for extended periods of time. However, pioneering space involves daunting technical challenges of transportation, maintaining health, and enabling crew productivity for long durations in remote, hostile, and alien environments. Prudent investments in capability and technology developments, based on mission need, are critical for enabling a campaign of human exploration missions. There are a wide variety of capabilities and technologies that could enable these missions, so it is a major challenge for NASA's Human Exploration and Operations Mission Directorate (HEOMD) to make knowledgeable portfolio decisions. It is critical for this pioneering initiative that these investment decisions are informed with a prioritization process that is robust and defensible. It is NASA's role to invest in targeted technologies and capabilities that would enable exploration missions even though specific requirements have not been identified. To inform these investment decisions, NASA's HEOMD has supported a variety of analysis activities that prioritize capabilities and technologies. These activities are often based on input from subject matter experts within the NASA community who understand the technical challenges of enabling human exploration missions. This paper will review a variety of processes and methods that NASA has used to prioritize and rank capabilities and technologies applicable to human space exploration. The paper will show the similarities in the various processes and showcase instances where customer-specified priorities force modifications to the process. Specifically, this paper will describe the processes that the NASA Langley Research Center (LaRC) Technology Assessment and Integration Team (TAIT) has used for several years and how those processes have been customized to meet customer needs while staying robust and defensible. This paper will show how HEOMD uses these analysis results to assist with making informed portfolio investment decisions. The paper will also highlight which human exploration capabilities and technologies typically rank high regardless of the specific design reference mission. The paper will conclude by describing future capability and technology ranking activities that will continue to leverage subject matter expert (SME) input while also incorporating more model-based analysis.
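    Prioritization processes of this kind typically include a weighted-scoring step. Below is a minimal sketch of such a step; the criteria, weights, candidate names, and scores are hypothetical, not NASA's actual figures.

    ```python
    # Hedged sketch: rank candidate capabilities by a weighted sum of
    # SME-assigned criterion scores (all values illustrative).
    criteria_weights = {"mission_need": 0.5, "maturity": 0.2, "cost_risk": 0.3}

    candidates = {
        "in-space propulsion":      {"mission_need": 9, "maturity": 4, "cost_risk": 5},
        "closed-loop life support": {"mission_need": 8, "maturity": 6, "cost_risk": 7},
        "autonomous rendezvous":    {"mission_need": 7, "maturity": 8, "cost_risk": 8},
    }

    def weighted_total(scores):
        return sum(criteria_weights[c] * s for c, s in scores.items())

    for name, scores in sorted(candidates.items(),
                               key=lambda kv: weighted_total(kv[1]),
                               reverse=True):
        print(f"{weighted_total(scores):.1f}  {name}")
    ```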

  5. A rapid prototyping/artificial intelligence approach to space station-era information management and access

    NASA Technical Reports Server (NTRS)

    Carnahan, Richard S., Jr.; Corey, Stephen M.; Snow, John B.

    1989-01-01

Applications of rapid prototyping and Artificial Intelligence techniques to problems associated with Space Station-era information management systems are described. In particular, the work is centered on issues related to: (1) intelligent man-machine interfaces applied to scientific data user support, and (2) the requirement that intelligent information management systems (IIMS) be able to efficiently process metadata updates concerning types of data handled. The advanced IIMS represents functional capabilities driven almost entirely by the needs of potential users. The volume of scientific data projected to be generated in the Space Station era is likely to be significantly greater than that currently processed and analyzed. Information about scientific data must be presented clearly, concisely, and with support features to allow users at all levels of expertise efficient and cost-effective data access. Additionally, mechanisms for allowing more efficient IIMS metadata update processes must be addressed. The work reported covers the following IIMS design aspects: IIMS data and metadata modeling, including the automatic updating of IIMS-contained metadata; IIMS user-system interface considerations, including significant problems associated with remote access, user profiles, and on-line tutorial capabilities; and development of an IIMS query and browse facility, including the capability to deal with spatial information. A working prototype has been developed and is being enhanced.

  6. Quantum simulator review

    NASA Astrophysics Data System (ADS)

    Bednar, Earl; Drager, Steven L.

    2007-04-01

Quantum information processing's objective is to utilize revolutionary computing capability based on harnessing the paradigm shift offered by quantum computing to solve classically hard and computationally challenging problems. Some of our computationally challenging problems of interest include: the capability for rapid image processing, rapid optimization of logistics, protecting information, secure distributed simulation, and massively parallel computation. Currently, one important problem with quantum information processing is that the implementation of quantum computers is difficult to realize due to poor scalability and a high prevalence of errors. Therefore, we have supported the development of Quantum eXpress and QuIDD Pro, two quantum computer simulators running on classical computers for the development and testing of new quantum algorithms and processes. This paper examines the different methods used by these two quantum computing simulators. It reviews both simulators, highlighting each simulator's background, interface, and special features. It also demonstrates the implementation of current quantum algorithms on each simulator. It concludes with summary comments on both simulators.
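    At their core, classical simulators of quantum computers apply unitary gates to a state vector. Below is a minimal sketch of that principle; both tools reviewed in the abstract use far more sophisticated internal representations, so this illustrates only the basic mechanism.

    ```python
    # Hedged sketch: apply a single-qubit gate to a 2-qubit state vector.
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

    def apply_gate(state, gate, target, n_qubits):
        """Apply a 1-qubit gate to qubit `target` of an n-qubit state vector."""
        op = np.array([[1.0]])
        for q in range(n_qubits):  # build the full 2^n x 2^n operator
            op = np.kron(op, gate if q == target else np.eye(2))
        return op @ state

    state = np.zeros(4); state[0] = 1.0            # start in |00>
    state = apply_gate(state, H, target=0, n_qubits=2)
    print(np.round(state, 3))  # equal superposition on qubit 0: [0.707 0 0.707 0]
    ```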

  7. Intelligent On-Board Processing in the Sensor Web

    NASA Astrophysics Data System (ADS)

    Tanner, S.

    2005-12-01

    Most existing sensing systems are designed as passive, independent observers. They are rarely aware of the phenomena they observe, and are even less likely to be aware of what other sensors are observing within the same environment. Increasingly, intelligent processing of sensor data is taking place in real-time, using computing resources on-board the sensor or the platform itself. One can imagine a sensor network consisting of intelligent and autonomous space-borne, airborne, and ground-based sensors. These sensors will act independently of one another, yet each will be capable of both publishing and receiving sensor information, observations, and alerts among other sensors in the network. Furthermore, these sensors will be capable of acting upon this information, perhaps altering acquisition properties of their instruments, changing the location of their platform, or updating processing strategies for their own observations to provide responsive information or additional alerts. Such autonomous and intelligent sensor networking capabilities provide significant benefits for collections of heterogeneous sensors within any environment. They are crucial for multi-sensor observations and surveillance, where real-time communication with external components and users may be inhibited, and the environment may be hostile. In all environments, mission automation and communication capabilities among disparate sensors will enable quicker response to interesting, rare, or unexpected events. Additionally, an intelligent network of heterogeneous sensors provides the advantage that all of the sensors can benefit from the unique capabilities of each sensor in the network. The University of Alabama in Huntsville (UAH) is developing a unique approach to data processing, integration and mining through the use of the Adaptive On-Board Data Processing (AODP) framework. AODP is a key foundation technology for autonomous internetworking capabilities to support situational awareness by sensors and their on-board processes. The two primary research areas for this project are (1) the on-board processing and communications framework itself, and (2) data mining algorithms targeted to the needs and constraints of the on-board environment. The team is leveraging its experience in on-board processing, data mining, custom data processing, and sensor network design. Several unique UAH-developed technologies are employed in the AODP project, including EVE, an EnVironmEnt for on-board processing, and the data mining tools included in the Algorithm Development and Mining (ADaM) toolkit.
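    The publish/receive behavior this abstract describes maps naturally onto a publish/subscribe pattern. Below is a minimal sketch; the class, topic names, and callback are illustrative only and are not part of the AODP framework's actual API.

    ```python
    # Hedged sketch: sensors publish alerts; subscribed sensors react,
    # e.g. by re-tasking an instrument toward the reported location.
    from collections import defaultdict

    class SensorBus:
        def __init__(self):
            self.subscribers = defaultdict(list)

        def subscribe(self, topic, callback):
            self.subscribers[topic].append(callback)

        def publish(self, topic, message):
            for callback in self.subscribers[topic]:
                callback(message)

    bus = SensorBus()
    bus.subscribe("alert/fire",
                  lambda m: print(f"satellite re-tasked to {m['loc']}"))
    bus.publish("alert/fire", {"loc": (34.7, -86.6), "confidence": 0.9})
    ```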

  8. Systems Thinking for the Enterprise: A Thought Piece

    NASA Astrophysics Data System (ADS)

    Rebovich, George

This paper suggests a way of managing the acquisition of capabilities for large-scale government enterprises that is different from traditional "specify and build" approaches commonly employed by U.S. government agencies in acquiring individual systems or systems of systems (SoS). Enterprise capabilities evolve through the emergence and convergence of information and other technologies and their integration into social, institutional and operational organizations and processes. Enterprise capabilities evolve whether or not the enterprise has processes in place to actively manage them. Thus the critical role of enterprise system engineering (ESE) processes should be to shape, enhance and accelerate the "natural" evolution of enterprise capabilities. ESE processes do not replace or add a layer to traditional system engineering (TSE) processes used in developing individual systems or SoS. ESE processes should complement TSE processes by shaping outcome spaces and stimulating interactions among enterprise participants through market-like mechanisms to reward those that create innovation which moves and accelerates the evolution of the enterprise.

  9. Interactive information processing for NASA's mesoscale analysis and space sensor program

    NASA Technical Reports Server (NTRS)

    Parker, K. G.; Maclean, L.; Reavis, N.; Wilson, G.; Hickey, J. S.; Dickerson, M.; Karitani, S.; Keller, D.

    1985-01-01

The Atmospheric Sciences Division (ASD) of the Systems Dynamics Laboratory at NASA's Marshall Space Flight Center (MSFC) is currently involved in interactive information processing for the Mesoscale Analysis and Space Sensor (MASS) program. Specifically, the ASD is engaged in the development and implementation of new space-borne remote sensing technology to observe and measure mesoscale atmospheric processes. These space measurements and conventional observational data are being processed together to gain an improved understanding of the mesoscale structure and the dynamical evolution of the atmosphere relative to cloud development and precipitation processes. To satisfy its vast data processing requirements, the ASD has developed a Researcher Computer System consisting of three primary computer systems, which provides over 20 scientists with a wide range of capabilities for processing and displaying large volumes of remote sensing data. Each of the computers performs a specific function according to its unique capabilities.

  10. Recent advances in nuclear magnetic resonance quantum information processing.

    PubMed

    Criger, Ben; Passante, Gina; Park, Daniel; Laflamme, Raymond

    2012-10-13

    Quantum information processors have the potential to drastically change the way we communicate and process information. Nuclear magnetic resonance (NMR) has been one of the first experimental implementations of quantum information processing (QIP) and continues to be an excellent testbed to develop new QIP techniques. We review the recent progress made in NMR QIP, focusing on decoupling, pulse engineering and indirect nuclear control. These advances have enhanced the capabilities of NMR QIP, and have useful applications in both traditional NMR and other QIP architectures.

  11. Display system for imaging scientific telemetric information

    NASA Technical Reports Server (NTRS)

    Zabiyakin, G. I.; Rykovanov, S. N.

    1979-01-01

A system for imaging scientific telemetric information, based on the M-6000 minicomputer and the SIGD graphic display, is described. Two-dimensional graphic display of telemetric information, and interaction with the computer in the analysis and processing of telemetric parameters displayed on the screen, are provided. The running parameter information output method is presented. User capabilities in the analysis and processing of telemetric information imaged on the display screen and the user language are discussed and illustrated.

  12. Geographic Information System (GIS) capabilities in traffic accident information management: a qualitative approach

    PubMed Central

    Ahmadi, Maryam; Valinejadi, Ali; Goodarzi, Afshin; Safari, Ameneh; Hemmat, Morteza; Majdabadi, Hesamedin Askari; Mohammadi, Ali

    2017-01-01

Background Traffic accidents are one of the more important national and international issues, and their consequences are significant at the political, economic, and social levels in a country. Management of traffic accident information requires information systems capable of analyzing and accessing spatial and descriptive data. Objective The aim of this study was to determine the capabilities of a Geographic Information System (GIS) in management of traffic accident information. Methods This qualitative cross-sectional study was performed in 2016. In the first step, GIS capabilities were identified via literature retrieved from the Internet and based on the included criteria. Review of the literature was performed until data saturation was reached; a form was used to extract the capabilities. In the second step, the study population comprised hospital managers, police, emergency personnel, statisticians, and IT experts in trauma, emergency, and police centers. Sampling was purposive. Data was collected using a questionnaire based on the first step data; validity and reliability were determined by content validity and a Cronbach's alpha of 75%. Data was analyzed using the decision Delphi technique. Results GIS capabilities were identified in ten categories and 64 sub-categories. Importing and processing spatial and descriptive data, and analyzing those data, were the most important capabilities of GIS in traffic accident information management. Conclusion Storing and retrieving descriptive and spatial data; providing statistical analysis in table, chart, and zoning format; managing ill-structured issues; determining the cost effectiveness of decisions; and prioritizing their implementation were the most important capabilities of GIS that can be efficient in the management of traffic accident information. PMID:28848627

  13. Improvements to information management systems simulator

    NASA Technical Reports Server (NTRS)

    Bilek, R. W.

    1972-01-01

    The performance of personnel in the augmentation and improvement of the interactive IMSIM information management simulation model is summarized. With this augmented model, NASA now has even greater capabilities for the simulation of computer system configurations, data processing loads imposed on these configurations, and executive software to control system operations. Through these simulations, NASA has an extremely cost effective capability for the design and analysis of computer-based data management systems.

  14. Evaluation of VICAR software capability for land information support system needs. [Elk River quadrangle, Idaho

    NASA Technical Reports Server (NTRS)

    Yao, S. S. (Principal Investigator)

    1981-01-01

    A preliminary evaluation of the processing capability of the VICAR software for land information support system needs is presented. The geometric and radiometric properties of four sets of LANDSAT data taken over the Elk River, Idaho quadrangle were compared. Storage of data sets, the means of location, pixel resolution, and radiometric and geometric characteristics are described. Recommended modifications of VICAR programs are presented.

  15. Encoding Standards for Linguistic Corpora.

    ERIC Educational Resources Information Center

    Ide, Nancy

    The demand for extensive reusability of large language text collections for natural languages processing research requires development of standardized encoding formats. Such formats must be capable of representing different kinds of information across the spectrum of text types and languages, capable of representing different levels of…

  16. 30 CFR 1227.103 - What must a State's delegation proposal contain?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... processing, including compatibility with ONRR automated systems, electronic commerce capabilities, and data storage capabilities; (B) Accessing reference data; (C) Contacting production or royalty reporters; (D...) Maintaining security of confidential and proprietary information; and (H) Providing data to other Federal...

  17. 30 CFR 1227.103 - What must a State's delegation proposal contain?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... processing, including compatibility with ONRR automated systems, electronic commerce capabilities, and data storage capabilities; (B) Accessing reference data; (C) Contacting production or royalty reporters; (D...) Maintaining security of confidential and proprietary information; and (H) Providing data to other Federal...

  18. 30 CFR 1227.103 - What must a State's delegation proposal contain?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... processing, including compatibility with ONRR automated systems, electronic commerce capabilities, and data storage capabilities; (B) Accessing reference data; (C) Contacting production or royalty reporters; (D...) Maintaining security of confidential and proprietary information; and (H) Providing data to other Federal...

  19. On the definition of the concepts thinking, consciousness, and conscience.

    PubMed Central

    Monin, A S

    1992-01-01

    A complex system (CS) is defined as a set of elements, with connections between them, singled out of the environment, capable of getting information from the environment, capable of making decisions (i.e., of choosing between alternatives), and having purposefulness (i.e., an urge towards preferable states or other goals). Thinking is a process that takes place (or which can take place) in some of the CS and consists of (i) receiving information from the environment (and from itself), (ii) memorizing the information, (iii) the subconscious, and (iv) consciousness. Life is a process that takes place in some CS and consists of functions i and ii, as well as (v) reproduction with passing of hereditary information to progeny, and (vi) oriented energy and matter exchange with the environment sufficient for the maintenance of all life processes. Memory is a complex of processes of placing information in memory banks, keeping it there, and producing it according to prescriptions available in the system or to inquiries arising in it. Consciousness is a process of realization by the thinking CS of some set of algorithms consisting of the comparison of its knowledge, intentions, decisions, and actions with reality--i.e., with accumulated and continuously received internal and external information. Conscience is a realization of an algorithm of good and evil pattern recognition. PMID:1631060

  20. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory authorities such as Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005 provide regulatory support for the application of statistical process control for better process control and understanding. In this study risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study will point to an area where the application of quality improvement and quality risk assessment principles for achievement of six sigma-capable processes is possible. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing of trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts provides the least capable and most variable process that is liable for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
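    The capability indices behind the sigma-level assessment described in this abstract are standard: Cp = (USL − LSL)/6σ, Cpk = min(USL − μ, μ − LSL)/3σ, with the short-term sigma level commonly taken as 3·Cpk. Below is a minimal sketch; the tablet-weight data and specification limits are hypothetical.

    ```python
    # Hedged sketch: compute Cp, Cpk, and an approximate sigma level
    # for normally distributed, in-control measurements.
    import numpy as np

    def capability_indices(samples, lsl, usl):
        """Return (Cp, Cpk) for measurements against spec limits."""
        mu = np.mean(samples)
        sigma = np.std(samples, ddof=1)              # sample std deviation
        cp = (usl - lsl) / (6 * sigma)               # potential capability
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # capability with centering
        return cp, cpk

    rng = np.random.default_rng(0)
    weights = rng.normal(loc=250.0, scale=2.0, size=100)  # tablet weights, mg
    cp, cpk = capability_indices(weights, lsl=242.5, usl=257.5)
    # Common shorthand: short-term "sigma level" ~ 3 * Cpk.
    print(f"Cp={cp:.2f}  Cpk={cpk:.2f}  sigma level~{3*cpk:.1f}")
    ```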

  1. Space processing applications payload equipment study. Volume 2C: Data acquisition and process control

    NASA Technical Reports Server (NTRS)

    Kayton, M.; Smith, A. G.

    1974-01-01

    The services provided by the Spacelab Information Management System are discussed. The majority of the services are provided by the common-support subsystems in the Support Module furnished by the Spacelab manufacturer. The information processing requirements for the space processing applications (SPA) are identified. The requirements and capabilities for electric power, display and control panels, recording and telemetry, intercom, and closed circuit television are analyzed.

  2. Low-cost Landsat digital processing system for state and local information systems

    NASA Technical Reports Server (NTRS)

    Hooper, N. J.; Spann, G. W.; Faust, N. L.; Paludan, C. T. N.

    1979-01-01

    The paper details a minicomputer-based system which is well within the budget of many state, regional, and local agencies that previously could not afford digital processing capability. In order to achieve this goal a workable small-scale Landsat system is examined to provide low-cost automated processing. It is anticipated that the alternative systems will be based on a single minicomputer, but that the peripherals will vary depending on the capability emphasized in a particular system.

  3. The Use of a UNIX-Based Workstation in the Information Systems Laboratory

    DTIC Science & Technology

    1989-03-01

    system. The conclusions of the research and the resulting recommendations are presented in Chapter III. These recommendations include how to manage...required to run the program on a new system, these should not be significant changes. 2. Processing Environment The UNIX processing environment is...interactive with multi-tasking and multi-user capabilities. Multi-tasking refers to the fact that many programs can be run concurrently. This capability

  4. The Convergence of Information Technology, Data, and Management in a Library Imaging Program

    ERIC Educational Resources Information Center

    France, Fenella G.; Emery, Doug; Toth, Michael B.

    2010-01-01

    Integrating advanced imaging and processing capabilities in libraries, archives, and museums requires effective systems and information management to ensure that the large amounts of digital data about cultural artifacts can be readily acquired, stored, archived, accessed, processed, and linked to other data. The Library of Congress is developing…

  5. Information management - Assessing the demand for information

    NASA Technical Reports Server (NTRS)

    Rogers, William H.

    1991-01-01

    Information demand is defined in terms of both information content (what information) and form (when, how, and where it is needed). Providing the information richness required for flight crews to be informed without overwhelming their information processing capabilities will require a great deal of automated intelligence. It is seen that the essence of this intelligence is comprehending and capturing the demand for information.

  6. Future Directions in Navy Electronic System Reliability and Survivability.

    DTIC Science & Technology

    1981-06-01

…a maintenance policy is proposed as one remedy to these problems. To implement this policy, electronic systems which are very reliable and which include health…distribute vital data, data-processing capability, and communication capability through the use of intraship and intership networks. The capability to…

  7. Information Technologies for the 1980's: Lasers and Microprocessors.

    ERIC Educational Resources Information Center

    Mathews, William D.

    This discussion of the development and application of lasers and microprocessors to information processing stresses laser communication in relation to capacity, reliability, and cost and the advantages of this technology to real-time information access and information storage. The increased capabilities of microprocessors are reviewed, and a…

  8. AOIPS - An interactive image processing system. [Atmospheric and Oceanic Information Processing System]

    NASA Technical Reports Server (NTRS)

    Bracken, P. A.; Dalton, J. T.; Quann, J. J.; Billingsley, J. B.

    1978-01-01

    The Atmospheric and Oceanographic Information Processing System (AOIPS) was developed to help applications investigators perform required interactive image data analysis rapidly and to eliminate the inefficiencies and problems associated with batch operation. This paper describes the configuration and processing capabilities of AOIPS and presents unique subsystems for displaying, analyzing, storing, and manipulating digital image data. Applications of AOIPS to research investigations in meteorology and earth resources are featured.

  9. MINIS: Multipurpose Interactive NASA Information System

    NASA Technical Reports Server (NTRS)

    1976-01-01

The Multipurpose Interactive NASA Information System (MINIS) was developed in response to the need for a data management system capable of operation on several different minicomputer systems. The desired system had to be capable of performing the functions of a LANDSAT photo descriptive data retrieval system while remaining general in terms of other acceptable user definable data bases. The system also had to be capable of performing data base updates and providing user-formatted output reports. The resultant MINI System provides all of these capabilities and several other features to complement the data management system. The MINI System is currently implemented on two minicomputer systems and is in the process of being installed on another minicomputer system. The MINIS is operational on four different data bases.

  10. IT Acquisition: Expediting the Process to Deliver Business Capabilities to the DoD Enterprise. Revised Edition

    DTIC Science & Technology

    2012-07-01

effectively manage delivery of information capabilities. Under IT 360, they will need to incorporate constantly evolving, market-driven commercial systems...traditional acquisition system; under IT 360, these processes are largely obsolete and create oversight ambiguities. • Congress requires that funds be...2004). Furthermore, because the product is not available on the commercial market, the development of any complementary updates will also need to be

  11. Bringing the Ocean to the Precollege Classroom through field Investigations at a National Underwater Laboratory

    DTIC Science & Technology

    1998-09-30

was to use field experiences to 1) enhance educator capability in science content and skills, 2) immerse school systems in an inquiry-driven, active learning process, and 3) establish links to real-time scientific information in support of classroom activities. Participants' capability in marine…

  12. Factors Affecting Relationships between the Contextual Variables and the Information Characteristics of Accounting Information Systems.

    ERIC Educational Resources Information Center

    Choe, Jong-Min; Lee, Jinjoo

    1993-01-01

    Reports on a study of accounting information systems that explored the interactions among influence factors (e.g., user participation in the development process, top management support, capability of information systems personnel, and existence of steering committees), contextual variables (e.g., organizational structure and task characteristics),…

  13. Transforming Information Literacy Conversations to Enhance Student Learning: New Curriculum Dialogues

    ERIC Educational Resources Information Center

    Salisbury, Fiona A.; Karasmanis, Sharon; Robertson, Tracy; Corbin, Jenny; Hulett, Heather; Peseta, Tai L.

    2012-01-01

    Information literacy is an essential component of the La Trobe University inquiry/research graduate capability and it provides the skill set needed for students to take their first steps on the path to engaging with academic information and scholarly communication processes. A deep learning approach to information literacy can be achieved if…

  14. Information Operations Primer

    DTIC Science & Technology

    2010-11-01

altering drugs) but must be influenced indirectly through the physical and information dimensions. c. Information Operations modify the three dimensions...restoration of information systems by incorporating protection, detection, and reaction capabilities. (2) Physical Security is that part of security...wargamed using the traditional friendly action, expected enemy reaction, and friendly counteraction methodology. The wargaming process must also occur

  15. Managing Information Overload for Senior Leaders in the 21st Century

    ERIC Educational Resources Information Center

    Jackson, Jason M.

    2013-01-01

    Information overload is a state in which information input exceeds an individual's processing capability, and its adverse effects, such as becoming less productive, making bad decisions, and becoming highly selective, are growing. Guided by Glaser and Strauss' work on grounded theory, this study examined the adverse impact of…

  16. Information services and information processing

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Attempts made to design and extend space system capabilities are reported. Special attention was given to establishing user needs for information or services which might be provided by space systems. Data given do not attempt to detail scientific, technical, or economic bases for the needs expressed by the users.

  17. Thrust and parry of the SIOP (single integrated operational plan) and SDI (strategic defense initiative). Research report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zank, G.D.

    1989-05-01

    The relationship between strategic offensive capabilities (reflected in the SIOP) and emerging strategic defensive capabilities (reflected by SDI) is not being adequately addressed. A summary of the existing nuclear war planning process is provided, and an analogous defensive process is postulated. Parallels and differences between the two processes are discussed. Potential areas for information exchange and cooperation are identified to enhance deterrence and improve war fighting potential. Operational, technical and political issues requiring resolution are raised and recommendations to resolve these issues are made.

  18. An innovative approach to capability-based emergency operations planning

    PubMed Central

    Keim, Mark E

    2013-01-01

    This paper describes the innovative use of information technology for assisting disaster planners with an easily accessible method for writing and improving evidence-based emergency operations plans. This process is used to identify all key objectives of the emergency response according to capabilities of the institution, community or society. The approach then uses a standardized, objective-based format, along with a consensus-based method for drafting capability-based operational-level plans. This information is then integrated within a relational database to allow for ease of access and enhanced functionality to search, sort, and filter an emergency operations plan according to user need and technological capacity. This integrated approach is offered as an effective option for integrating best practices of planning with the efficiency, scalability and flexibility of modern information and communication technology. PMID:28228987
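
    A rough sketch of the kind of relational storage the paper describes may help make this concrete. The schema and data below are hypothetical illustrations, not taken from the article; the sketch uses Python's sqlite3 to show how capability-tagged plan objectives can be searched, sorted, and filtered:

      # Minimal sketch (assumed schema, not from the paper): capability-based
      # plan objectives in a relational database, filterable by user need.
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("""
          CREATE TABLE objective (
              id INTEGER PRIMARY KEY,
              capability TEXT NOT NULL,   -- e.g. 'evacuation', 'mass care'
              description TEXT NOT NULL,
              priority INTEGER NOT NULL   -- 1 = highest
          )""")
      conn.executemany(
          "INSERT INTO objective (capability, description, priority) VALUES (?, ?, ?)",
          [("evacuation", "Identify primary and alternate evacuation routes", 1),
           ("mass care", "Open shelters within 6 hours of activation", 2),
           ("evacuation", "Arrange transport for residents without vehicles", 2)])

      # Filter the plan down to one capability, ordered by priority.
      for row in conn.execute(
              "SELECT description, priority FROM objective "
              "WHERE capability = ? ORDER BY priority", ("evacuation",)):
          print(row)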

  19. An innovative approach to capability-based emergency operations planning.

    PubMed

    Keim, Mark E

    2013-01-01

    This paper describes the innovative use of information technology for assisting disaster planners with an easily accessible method for writing and improving evidence-based emergency operations plans. This process is used to identify all key objectives of the emergency response according to capabilities of the institution, community or society. The approach then uses a standardized, objective-based format, along with a consensus-based method for drafting capability-based operational-level plans. This information is then integrated within a relational database to allow for ease of access and enhanced functionality to search, sort, and filter an emergency operations plan according to user need and technological capacity. This integrated approach is offered as an effective option for integrating best practices of planning with the efficiency, scalability and flexibility of modern information and communication technology.

  20. Synthetic Analog and Digital Circuits for Cellular Computation and Memory

    PubMed Central

    Purcell, Oliver; Lu, Timothy K.

    2014-01-01

    Biological computation is a major area of focus in synthetic biology because it has the potential to enable a wide range of applications. Synthetic biologists have applied engineering concepts to biological systems in order to construct progressively more complex gene circuits capable of processing information in living cells. Here, we review the current state of computational genetic circuits and describe artificial gene circuits that perform digital and analog computation. We then discuss recent progress in designing gene circuits that exhibit memory, and how memory and computation have been integrated to yield more complex systems that can both process and record information. Finally, we suggest new directions for engineering biological circuits capable of computation. PMID:24794536

  1. NOUS: A Knowledge Graph Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knowledge graphs represent information as entities and relationships between them. For tasks such as natural language question answering or automated analysis of text, a knowledge graph provides valuable context to establish the specific type of entities being discussed. It allows us to derive better context about newly arriving information and leads to intelligent reasoning capabilities. We address two primary needs: A) automated construction of knowledge graphs is a technically challenging, expensive process; and B) the ability to synthesize new information by monitoring newly emerging knowledge is a transformational capability that does not exist in state-of-the-art systems.
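
    A minimal sketch of the underlying data model may be useful; the triples and helper below are invented for illustration and are not drawn from NOUS. A knowledge graph can be held as (subject, relation, object) triples and queried for the context of a newly mentioned entity:

      # Minimal sketch (illustrative, not the NOUS implementation): a knowledge
      # graph as (subject, relation, object) triples, with a lookup that
      # supplies context for an entity mentioned in newly arriving text.
      from collections import defaultdict

      triples = [
          ("Curiosity", "isA", "MarsRover"),
          ("Curiosity", "operatedBy", "NASA"),
          ("MarsRover", "isA", "Spacecraft"),
      ]

      index = defaultdict(list)
      for subj, rel, obj in triples:
          index[subj].append((rel, obj))

      def context(entity):
          """Known relations for an entity, e.g. to establish its type."""
          return index.get(entity, [])

      print(context("Curiosity"))  # [('isA', 'MarsRover'), ('operatedBy', 'NASA')]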

  2. A user need study and system plan for an Arizona Natural Resources Information System report to the Arizona state legislature

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A survey instrument was developed and implemented in order to evaluate the current needs for natural resource information in Arizona and to determine which state agencies have information systems capable of coordinating, accessing and analyzing the data. Data and format requirements were determined for the following categories: air quality, animals, cultural resources, geology, land use, soils, water, vegetation, ownership, and social and economic aspects. Hardware and software capabilities were assessed and a data processing plan was developed. Possible future applications with the next generation LANDSAT were also identified.

  3. 78 FR 12271 - Wireline Competition Bureau Seeks Additional Comment In Connect America Cost Model Virtual Workshop

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-22

    ... Competition Bureau seeks public input on additional questions relating to modeling voice capability and Annual... submitting comments and additional information on the rulemaking process, see the SUPPLEMENTARY INFORMATION section of this document. FOR FURTHER INFORMATION CONTACT: Katie King, Wireline Competition Bureau at (202...

  4. Data Requirements for Oceanic Processes in the Open Ocean, Coastal Zone, and Cryosphere

    NASA Technical Reports Server (NTRS)

    Nagler, R. G.; Mccandless, S. W., Jr.

    1978-01-01

    The type of information system that is needed to meet the requirements of ocean, coastal, and polar region users was examined. The requisite qualities of the system are: (1) availability, (2) accessibility, (3) responsiveness, (4) utility, (5) continuity, and (6) NASA participation. The system would not displace existing capabilities, but would have to integrate and expand the capabilities of existing systems and resolve the deficiencies that currently exist in producer-to-user information delivery options.

  5. Integrating Thematic Web Portal Capabilities into the NASA Earthdata Web Infrastructure

    NASA Technical Reports Server (NTRS)

    Wong, Minnie; Baynes, Kathleen E.; Huang, Thomas; McLaughlin, Brett

    2015-01-01

    This poster will present the process of integrating thematic web portal capabilities into the NASA Earthdata web infrastructure, with examples from the Sea Level Change Portal. The Sea Level Change Portal will be a source of current NASA research, data and information regarding sea level change. The portal will provide sea level change information through articles, graphics, videos and animations, an interactive tool to view and access sea level change data and a dashboard showing sea level change indicators.

  6. Real-time face and gesture analysis for human-robot interaction

    NASA Astrophysics Data System (ADS)

    Wallhoff, Frank; Rehrl, Tobias; Mayer, Christoph; Radig, Bernd

    2010-05-01

    Human communication relies on a large number of different communication mechanisms like spoken language, facial expressions, or gestures. Facial expressions and gestures are among the main nonverbal communication mechanisms and pass large amounts of information between human dialog partners. Therefore, to allow for intuitive human-machine interaction, real-time capable processing and recognition of facial expressions and of hand and head gestures are of great importance. We present a system that is tackling these challenges. The input features for the dynamic head gestures and facial expressions are obtained from a sophisticated three-dimensional model, which is fitted to the user in a real-time capable manner. Applying this model, different kinds of information are extracted from the image data and afterwards handed over to a real-time capable data-transferring framework, the so-called Real-Time DataBase (RTDB). In addition to the head- and facial-related features, low-level image features regarding the human hand (optical flow, Hu moments) are also stored in the RTDB for the evaluation of hand gestures. In general, the input of a single camera is sufficient for the parallel evaluation of the different gestures and facial expressions. The real-time capable recognition of the dynamic hand and head gestures is performed via different Hidden Markov Models, which have proven to be a quick and real-time capable classification method. For the facial expressions, on the other hand, classical decision trees or more sophisticated support vector machines are used for the classification. The results of the classification processes are again handed over to the RTDB, where other processes (like a Dialog Management Unit) can easily access them without any blocking effects. In addition, an adjustable amount of history can be stored by the RTDB buffer unit.
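
    As a hedged sketch of how HMM-based gesture classification works in general (the two-state models and parameters below are toy values, not the authors' trained models), an observation sequence is scored under each gesture's HMM with the forward algorithm and the best-scoring gesture wins:

      # Toy sketch: classify a quantized feature sequence by forward-algorithm
      # log-likelihood under per-gesture HMMs (parameters are invented).
      import numpy as np

      def log_likelihood(obs, pi, A, B):
          """Scaled forward algorithm for a discrete-observation HMM.
          pi: (N,) initial state probs; A: (N, N) transition probs;
          B: (N, M) emission probs; obs: sequence of symbol indices."""
          alpha = pi * B[:, obs[0]]
          ll = 0.0
          for symbol in obs[1:]:
              scale = alpha.sum()              # rescale to avoid underflow
              ll += np.log(scale)
              alpha = (alpha / scale) @ A * B[:, symbol]
          return ll + np.log(alpha.sum())

      models = {  # two 2-state models over a 3-symbol alphabet
          "wave":  (np.array([0.6, 0.4]),
                    np.array([[0.7, 0.3], [0.4, 0.6]]),
                    np.array([[0.8, 0.1, 0.1], [0.1, 0.1, 0.8]])),
          "point": (np.array([0.5, 0.5]),
                    np.array([[0.9, 0.1], [0.2, 0.8]]),
                    np.array([[0.1, 0.8, 0.1], [0.3, 0.4, 0.3]])),
      }
      obs = [0, 2, 0, 2, 2]                    # quantized hand-motion features
      print(max(models, key=lambda g: log_likelihood(obs, *models[g])))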

  7. Integrated Systems Health Management for Intelligent Systems

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Melcher, Kevin

    2011-01-01

    The implementation of an integrated system health management (ISHM) capability is fundamentally linked to the management of data, information, and knowledge (DIaK) with the purposeful objective of determining the health of a system. Management implies storage, distribution, sharing, maintenance, processing, reasoning, and presentation. ISHM is akin to having a team of experts who are all individually and collectively observing and analyzing a complex system, and communicating effectively with each other in order to arrive at an accurate and reliable assessment of its health. In this chapter, concepts, procedures, and approaches are presented as a foundation for implementing an ISHM capability relevant to intelligent systems. The capability stresses integration of DIaK from all elements of a system, emphasizing an advance toward an on-board, autonomous capability. Both ground-based and on-board ISHM capabilities are addressed. The information presented is the result of many years of research, development, and maturation of technologies, and of prototype implementations in operational systems.

  8. A Maturity Model for Assessing the Use of ICT in School Education

    ERIC Educational Resources Information Center

    Solar, Mauricio; Sabattin, Jorge; Parada, Victor

    2013-01-01

    This article describes an ICT-based and capability-driven model for assessing ICT in education capabilities and maturity of schools. The proposed model, called ICTE-MM (ICT in School Education Maturity Model), has three elements supporting educational processes: information criteria, ICT resources, and leverage domains. Changing the traditional…

  9. Information Dominance in Military Decision Making.

    DTIC Science & Technology

    1999-06-04

    This study considers how ABCS (Army Battle Command System) capabilities achieve information dominance and how they influence the military decision...making process. The work examines how ABCS enables commanders and staffs to achieve information dominance at the brigade and battalion levels. Further...future digitized systems that will gain information dominance for the future commander. It promotes the continued development of information dominance technologies

  10. Image processing mini manual

    NASA Technical Reports Server (NTRS)

    Matthews, Christine G.; Posenau, Mary-Anne; Leonard, Desiree M.; Avis, Elizabeth L.; Debure, Kelly R.; Stacy, Kathryn; Vonofenheim, Bill

    1992-01-01

    The intent is to provide an introduction to the image processing capabilities available at the Langley Research Center (LaRC) Central Scientific Computing Complex (CSCC). Various image processing software components are described. Information is given concerning the use of these components in the Data Visualization and Animation Laboratory at LaRC.

  11. Navy Information Dominance Corps: Human Capital Strategy 2012-2017

    DTIC Science & Technology

    2012-01-01

    Information Dominance (ID) is the operational advantage gained from fully integrating information functions, capabilities, resources and people to...and information domains. The human component of ID is the Information Dominance Corps (IDC) and it has three core functions in this mission. First, it...processes, delivery of a Corps-wide learning continuum, and cultivation of an identifiable, inclusive Information Dominance culture and ethos. This

  12. Handbook of automated data collection methods for the National Transit Database

    DOT National Transportation Integrated Search

    2003-10-01

    In recent years, with the increasing sophistication and capabilities of information processing technologies, there has been a renewed interest on the part of transit systems to tap the rich information potential of the National Transit Database (NTD)...

  13. Portability scenarios for intelligent robotic control agent software

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2014-06-01

    Portability scenarios are critical in ensuring that a piece of AI control software will run effectively across the collection of craft that it is required to control. This paper presents scenarios for control software that is designed to control multiple craft with heterogeneous movement and functional characteristics. For each prospective target-craft type, its capabilities, mission function, location, communications capabilities and power profile are presented and performance characteristics are reviewed. This work will inform future decision making regarding software capabilities, hardware control capabilities and processing requirements.

  14. Acquisition of Programming Skills

    DTIC Science & Technology

    1990-04-01

    skills (e.g., arithmetic reasoning, word knowledge, information processing speed); and c) passive versus active learning style. Ability measures...concurrent storage and processing an individual was capable of doing), and an active learning style. Implications of the findings for the development of

  15. Transitioning mine warfare to network-centric sensor analysis: future PMA technologies & capabilities

    NASA Astrophysics Data System (ADS)

    Stack, J. R.; Guthrie, R. S.; Cramer, M. A.

    2009-05-01

    The purpose of this paper is to outline the requisite technologies and enabling capabilities for network-centric sensor data analysis within the mine warfare community. The focus includes both automated processing and the traditional human-centric post-mission analysis (PMA) of tactical and environmental sensor data. This is motivated by first examining the high-level network-centric guidance and noting the breakdown in the process of distilling actionable requirements from this guidance. Examples are provided that illustrate the intuitive and substantial capability improvement resulting from processing sensor data jointly in a network-centric fashion. Several candidate technologies are introduced, including the ability to fully process multi-sensor data given only partial overlap in sensor coverage and the ability to incorporate target identification information in stride. Finally, the critical enabling capabilities are outlined, including open architecture, open business, and a concept of operations. The ability to process multi-sensor data in a network-centric fashion is a core enabler of the Navy's vision and will become a necessity with the increasing number of manned and unmanned sensor systems and the requirement for their simultaneous use.

  16. Health Information Technologies-Academic and Commercial Evaluation (HIT-ACE) methodology: description and application to clinical feedback systems.

    PubMed

    Lyon, Aaron R; Lewis, Cara C; Melvin, Abigail; Boyd, Meredith; Nicodimos, Semret; Liu, Freda F; Jungbluth, Nathaniel

    2016-09-22

    Health information technologies (HIT) have become nearly ubiquitous in the contemporary healthcare landscape, but information about HIT development, functionality, and implementation readiness is frequently siloed. Theory-driven methods of compiling, evaluating, and integrating information from the academic and commercial sectors are necessary to guide stakeholder decision-making surrounding HIT adoption and to develop pragmatic HIT research agendas. This article presents the Health Information Technologies-Academic and Commercial Evaluation (HIT-ACE) methodology, a structured, theory-driven method for compiling and evaluating information from multiple sectors. As an example demonstration of the methodology, we apply HIT-ACE to mental and behavioral health measurement feedback systems (MFS). MFS are a specific class of HIT that support the implementation of routine outcome monitoring, an evidence-based practice. HIT-ACE is guided by theories and frameworks related to user-centered design and implementation science. The methodology involves four phases: (1) coding academic and commercial materials, (2) developer/purveyor interviews, (3) linking putative implementation mechanisms to HIT capabilities, and (4) experimental testing of capabilities and mechanisms. In the current demonstration, phase 1 included a systematic process to identify MFS in mental and behavioral health using academic literature and commercial websites. Using user-centered design, implementation science, and feedback frameworks, the HIT-ACE coding system was developed, piloted, and used to review each identified system for the presence of 38 capabilities and 18 additional characteristics via a consensus coding process. Bibliometric data were also collected to examine the representation of the systems in the scientific literature. As an example, results are presented for the application of HIT-ACE phase 1 to MFS, wherein 49 separate MFS were identified, reflecting a diverse array of characteristics and capabilities. Preliminary findings demonstrate the utility of HIT-ACE to represent the scope and diversity of a given class of HIT beyond what can be identified in the academic literature. Phase 2 data collection is expected to confirm and expand the information presented, and phases 3 and 4 will provide more nuanced information about the impact of specific HIT capabilities. In all, HIT-ACE is expected to support adoption decisions and additional HIT development and implementation research.

  17. Using lesson study to integrate information literacy throughout the curriculum.

    PubMed

    Stombaugh, Angie; Sperstad, Rita; Vanwormer, Arin; Jennings, Eric; Kishel, Hans; Vogh, Bryan

    2013-01-01

    To develop evidence-based practice skills, students need to be capable of retrieving various levels of scholarly information, evaluating its usefulness, and applying it to clinical practice. The authors discuss the process of developing an information literacy curriculum for a cohort of students over a 5-semester nursing program using lesson study.

  18. NATO initial common operational picture capability project

    NASA Astrophysics Data System (ADS)

    Fanti, Laura; Beach, David

    2002-08-01

    The Common Operational Picture (COP) capability can be defined as the ability to display on a single screen integrated views of the Recognized Maritime, Air and Ground Pictures, enriched by other tactical data, such as theater plans, assets, intelligence and logistics information. The purpose of the COP capability is to provide military forces a comprehensive view of the battle space, thereby enhancing situational awareness and the decision-making process across the military command and control spectrum. The availability of a COP capability throughout the command structure is a high priority operational requirement in NATO. A COP capability for NATO is being procured and implemented in an incremental way within the NATO Automated Information System (Bi-SC AIS) Functional Services programme under the coordination of the NATO Consultation, Command and Control Agency (NC3A) Integrated Programme Team 5 (IPT5). The NATO Initial COP (iCOP) capability project, first step of this evolutionary procurement, will provide an initial COP capability to NATO in a highly pragmatic and low-risk fashion, by using existing operational communications infrastructure and NATO systems, i.e. the NATO-Wide Integrated Command and Control Software for Air Operations (ICC), the Maritime Command and Control Information System (MCCIS), and the Joint Operations and Intelligence Information System (JOIIS), which will provide respectively the Recognized Air, Maritime and Ground Pictures. This paper gives an overview of the NATO Initial COP capability project, including its evolutionary implementation approach, and describes the technical solution selected to satisfy the urgent operational requirement in a timely and cost effective manner.

  19. Implementing Information Assurance - Beyond Process

    DTIC Science & Technology

    2009-01-01

    disabled or properly configured. Tools and scripts are available to expedite the configuration process on some platforms. For example, approved Windows...in the System Security Plan (SSP) or Information Security Plan (ISP). Any PPSs not required for operation by the system must be disabled. This...Services must be disabled. Implementing an IM capability within the boundary carries many policy and documentation requirements. Username and passwords

  20. Capabilities of radar as they might relate to entomological studies

    NASA Technical Reports Server (NTRS)

    Skolnik, M. I.

    1979-01-01

    A tutorial background of radar capabilities and its potential for insect research is provided. The basic principles and concepts of radar were reviewed. Information on current radar equipment was examined. Specific issues related to insect research included: target cross-section, radar frequency, tracking, target recognition and false alarms, clutter reduction, radar transmitter power, and ascertained atmospheric processes.

  1. Management Sciences Division Annual Report (9th)

    DTIC Science & Technology

    1992-01-01

    Actuarial Process Consolidation and Review...Malfunction Code Reduction...Sun Work Stations...Actuarial Process Consolidation and...Information System (WSMIS). Dyna-METRIC is used for wartime supply support capability assessments. The Aircraft Sustainability Model (ASM) is the

  2. Synthetic analog and digital circuits for cellular computation and memory.

    PubMed

    Purcell, Oliver; Lu, Timothy K

    2014-10-01

    Biological computation is a major area of focus in synthetic biology because it has the potential to enable a wide range of applications. Synthetic biologists have applied engineering concepts to biological systems in order to construct progressively more complex gene circuits capable of processing information in living cells. Here, we review the current state of computational genetic circuits and describe artificial gene circuits that perform digital and analog computation. We then discuss recent progress in designing gene networks that exhibit memory, and how memory and computation have been integrated to yield more complex systems that can both process and record information. Finally, we suggest new directions for engineering biological circuits capable of computation. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Modelling environmental variables for geohazards and georesources assessment to support sustainable land-use decisions in Zaragoza (Spain)

    NASA Astrophysics Data System (ADS)

    Lamelas, M. T.; Hoppe, A.; de la Riva, J.; Marinoni, O.

    2009-10-01

    Land-use decisions are usually made on the basis of a variety of criteria. While it is common practice to integrate economic, ecological and social (triple bottom line) criteria, explicit geoscientific factors are relatively rarely considered. If a planned land use involves an interaction with the geosphere, geoscientific aspects should play a more important role in the process. With the objective of facilitating sustainable land-use decision-making, a research project was initiated. The area around the city of Zaragoza, in the Ebro Basin of northern Spain, was chosen due to its high degree of industrialisation and urbanization. The area is exposed to several geohazards (e.g., sinkholes and erosion) that may have significant negative effects on current and future land uses. Geographical Information System (GIS) technologies are used to process the complex geoscientific information. Further GIS analysis comprised the creation of an erosion susceptibility map that follows the ITC (International Institute for Geo-Information Science and Earth Observation) system of terrain analysis. The agricultural capability of the soil was determined using the Microleis System. We identify geomorphologic units that show high susceptibility to erosion and high agricultural potential and suggest a method to implement this information in a land-use planning process. Degraded slopes developed upon Tertiary rocks show the highest susceptibility to erosion and low values of agricultural capability, whereas the flat valley bottoms and irrigated flood plains have the highest values of agricultural capability.

  4. Hardware for dynamic quantum computing.

    PubMed

    Ryan, Colm A; Johnson, Blake R; Ristè, Diego; Donovan, Brian; Ohki, Thomas A

    2017-10-01

    We describe the hardware, gateware, and software developed at Raytheon BBN Technologies for dynamic quantum information processing experiments on superconducting qubits. In dynamic experiments, real-time qubit state information is fed back or fed forward within a fraction of the qubits' coherence time to dynamically change the implemented sequence. The hardware presented here covers both control and readout of superconducting qubits. For readout, we created a custom signal processing gateware and software stack on commercial hardware to convert pulses in a heterodyne receiver into qubit state assignments with minimal latency, alongside data-taking capability. For control, we developed custom hardware with gateware and software for pulse sequencing and steering information distribution that is capable of arbitrary control flow in a fraction of superconducting qubit coherence times. Both readout and control platforms make extensive use of field-programmable gate arrays to enable tailored qubit control systems in a reconfigurable fabric suitable for iterative development.
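
    A schematic of the readout chain may clarify the idea; the parameters below are illustrative and the code is a software stand-in for the FPGA gateware, not the BBN implementation. A heterodyne record is demodulated at the intermediate frequency, integrated, and thresholded into a state assignment:

      # Schematic sketch (illustrative parameters): demodulate a heterodyne
      # readout record at the IF, integrate, and threshold on the I quadrature.
      import numpy as np

      fs, f_if, n = 1.0e9, 50.0e6, 1000          # sample rate, IF, record length
      t = np.arange(n) / fs

      def assign_state(record, threshold=0.0):
          ref = np.exp(-2j * np.pi * f_if * t)   # digital local oscillator
          iq = np.mean(record * ref)             # demodulate and integrate
          return 0 if iq.real > threshold else 1 # linear discriminator

      rng = np.random.default_rng(0)
      ground  =  np.cos(2 * np.pi * f_if * t) + 0.5 * rng.standard_normal(n)
      excited = -np.cos(2 * np.pi * f_if * t) + 0.5 * rng.standard_normal(n)
      print(assign_state(ground), assign_state(excited))  # -> 0 1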

  5. MISSE in the Materials and Processes Technical Information System (MAPTIS )

    NASA Technical Reports Server (NTRS)

    Burns, DeWitt; Finckenor, Miria; Henrie, Ben

    2013-01-01

    Materials International Space Station Experiment (MISSE) data is now being collected and distributed through the Materials and Processes Technical Information System (MAPTIS) at Marshall Space Flight Center in Huntsville, Alabama. MISSE data has been instrumental in many programs and continues to be an important source of data for the space community. To facilitate greater access to the MISSE data, the International Space Station (ISS) program office and MAPTIS are working to gather this data into a central location. The MISSE database contains information about materials, samples, and flights, along with pictures, PDFs, Excel files, Word documents, and other file types. Major capabilities of the system are: access control, browsing, searching, reports, and record comparison. The search capability extends into any searchable files, so data can still be retrieved even if the desired metadata has not been associated. Other functionality will continue to be added to the MISSE database as the Athena Platform is expanded.

  6. Global Access-controlled Transfer e-frame (GATe)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-05-30

    Global Access-controlled Transfer e-frame (GATe) was designed to take advantage of the patterns that occur during an electronic record transfer process. The e-frame (or electronic framework or platform) is the foundation for developing secure information transfer to meet classified and unclassified business processes and is particularly useful when there is a need to share information with various entities in a controlled and secure environment. It can share, search, upload, download and retrieve sensitive information, and it provides reporting capabilities.

  7. A Proposed Operational Concept for the Defense Communications Operations Support System.

    DTIC Science & Technology

    1986-01-01

    Artificial Intelligence AMA Automatic Message Accounting AMIE AUTODIN Management Index System AMPE Automated Message Processing Exchange ANCS AUTOVON Network...Support IMPRESS Impact/Restoral System INFORM Information Retrieval System IOC Initial Operational Capability IRU Intelligent Remote Unit I-S/A AMPE

  8. Bringing Business Intelligence to Health Information Technology Curriculum

    ERIC Educational Resources Information Center

    Zheng, Guangzhi; Zhang, Chi; Li, Lei

    2015-01-01

    Business intelligence (BI) and healthcare analytics are the emerging technologies that provide analytical capability to help healthcare industry improve service quality, reduce cost, and manage risks. However, such component on analytical healthcare data processing is largely missed from current healthcare information technology (HIT) or health…

  9. PROCESS INTENSIFICATION: MICROWAVE INITIATED REACTIONS USING A CONTINUOUS FLOW REACTOR

    EPA Science Inventory

    The concept of process intensification has been used to develop a continuous narrow channel reactor at Clarkson capable of carrying out reactions under isothermal conditions whilst being exposed to microwave (MW) irradiation thereby providing information on the true effect of mi...

  10. Composing Models of Geographic Physical Processes

    NASA Astrophysics Data System (ADS)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central for geographic information science; yet geographic information systems (GIS) lack capabilities to represent process related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed by basic models of geographic physical processes, which is shown by means of an example.
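
    As a concrete instance of the two equivalent descriptions (the example is illustrative, not taken from the paper), one-dimensional diffusion can be written either as a partial differential equation or as the explicit partial difference equation obtained on a grid with spacing \Delta x and time step \Delta t:

      % Continuous description: the 1-D diffusion (heat) equation
      \frac{\partial u}{\partial t} = D \, \frac{\partial^{2} u}{\partial x^{2}}

      % Equivalent explicit partial difference form on a grid
      u_{i}^{t+1} = u_{i}^{t}
          + \frac{D \, \Delta t}{(\Delta x)^{2}}
            \left( u_{i+1}^{t} - 2\,u_{i}^{t} + u_{i-1}^{t} \right)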

  11. Joint Intelligence Operations Center (JIOC) Baseline Business Process Model & Capabilities Evaluation Methodology

    DTIC Science & Technology

    2012-03-01

    Targeting Review Board OPLAN Operations Plan OPORD Operations Order OPSIT Operational Situation OSINT Open Source Intelligence OV...Analysis Evaluate FLTREPs MISREPs Unit Assign Assets Feedback Asset Shortfalls Multi-Int Collection Political & Embassy Law Enforcement HUMINT OSINT...Embassy Information OSINT Manage Theater HUMINT Law Enforcement Collection Sort Requests Platform Information Agency Information M-I Collect

  12. Washington Community Colleges Factbook. Addendum B: A Description of the Community College Management Information System.

    ERIC Educational Resources Information Center

    Meier, Terre; Bundy, Larry

    The Management Information System (MIS) of the Washington State system of community colleges was designed to be responsive to legislative and district requests for information and to enhance the State Board's capabilities to manage the community college system and integrate its budgeting and planning processes. The MIS consists of seven…

  13. White-Light Optical Information Processing and Holography.

    DTIC Science & Technology

    1983-05-03

    Processing, White-Light Holography, Image Subtraction, Image Deblurring, Coherence Requirement, Apparent Transfer Function, Source Encoding, Signal...in this period, also demonstrated several color image processing capabilities. Among those are broadband color image deblurring and color image...Broadband Image Deblurring; 2.5 Color Image Subtraction; 2.6 Rainbow Holographic Aberrations; 2.7

  14. Man as an Information Processor: A Bibliography (1972-1976).

    DTIC Science & Technology

    1977-09-01

    Alloway (Eds.), Communication and affect: Language and thought. New York, NY: Academic Press, 1973, 200. Craik, F. I., & Lockhart, R. S. Levels of...Mazuryk, G. F., & Lockhart, R. S. Negative recency and levels of processing in free recall. Canadian Journal of Psychology, 1974, 28(1), 114-123...capability -- to accomplish work in the area of information and decision processes at both the exploratory development and advanced development levels

  15. SUPERFUND TREATABILITY CLEARINGHOUSE: FULL ...

    EPA Pesticide Factsheets

    This treatability study reports on the results of one of a series of field trials using various remedial action technologies that may be capable of restoring Herbicide Orange (HO)/Dioxin contaminated sites. A full-scale field trial using a rotary kiln incinerator capable of processing up to 6 tons per hour of dioxin contaminated soil was conducted at the Naval Construction Battalion Center, Gulfport, MS.

  16. The agent-based spatial information semantic grid

    NASA Astrophysics Data System (ADS)

    Cui, Wei; Zhu, YaQiong; Zhou, Yong; Li, Deren

    2006-10-01

    Analyzing the characteristics of multi-agent systems and geographic ontology, the concept of the Agent-based Spatial Information Semantic Grid (ASISG) is defined and the architecture of the ASISG is advanced. ASISG is composed of multi-agents and geographic ontology. The multi-agent systems comprise User Agents, a General Ontology Agent, Geo-Agents, Broker Agents, Resource Agents, Spatial Data Analysis Agents, Spatial Data Access Agents, a Task Execution Agent and a Monitor Agent. The architecture of ASISG has three layers: the fabric layer, the grid management layer and the application layer. The fabric layer, which is composed of the Data Access Agent, Resource Agent and Geo-Agent, encapsulates the data of spatial information systems so that it exhibits a conceptual interface for the grid management layer. The grid management layer, which is composed of the General Ontology Agent, Task Execution Agent, Monitor Agent and Data Analysis Agent, uses a hybrid method to manage all resources that are registered in the General Ontology Agent, which is described by a general ontology system. The hybrid method combines resource dissemination and resource discovery: dissemination pushes resources from Local Ontology Agents to the General Ontology Agent, and discovery pulls resources from the General Ontology Agent to Local Ontology Agents. A Local Ontology Agent is derived from a special domain and describes the semantic information of a local GIS. The Local Ontology Agents can be filtered to construct a virtual organization that provides a global scheme. The virtual organization lightens the burden on users because they need not search for information site by site manually. The application layer, which is composed of the User Agent, Geo-Agent and Task Execution Agent, provides a corresponding interface to a domain user. The functions that ASISG should provide are: 1) Integrating different spatial information systems at the semantic level: the grid management layer establishes a virtual environment that seamlessly integrates all GIS nodes. 2) Searching and querying at the semantic level: when the resource management system searches data on different spatial information systems, it transfers the meaning of different Local Ontology Agents rather than accessing data directly. 3) Transparent data access: users can access information from a remote site as if it were a local disk, because the General Ontology Agent automatically links data through the Data Agents that connect ontology concepts to GIS data. 4) The capability of processing massive spatial data: storing, accessing and managing massive spatial data from TB to PB; efficiently analyzing and processing spatial data to produce models, information and knowledge; and providing 3D and multimedia visualization services. 5) The capability of high-performance computing and processing on spatial information: solving spatial problems with high precision, high quality, and on a large scale, and processing spatial information in real time or on time, with high speed and high efficiency. 6) The capability of sharing spatial resources: distributed heterogeneous spatial information resources are shared, integrated and interoperated at the semantic level, so as to make the best use of spatial information resources such as computing resources, storage devices, spatial data (integrated from GIS, RS and GPS), spatial applications and services, and GIS platforms. 7) The capability of integrating legacy GIS systems: an ASISG can not only be used to construct new advanced spatial application systems, but can also integrate legacy GIS systems, so as to preserve extensibility and inheritance and protect users' investment. 8) The capability of collaboration: large-scale spatial information applications and services always involve different departments in different geographic places, so remote and uniform services are needed. 9) The capability of supporting integration of heterogeneous systems: large-scale spatial information systems are always synthetic applications, so ASISG should provide interoperation and consistency by adopting open, applied technology standards. 10) The capability of adapting to dynamic changes: business requirements, application patterns, management strategies and IT products change endlessly for any department, so ASISG should be self-adaptive. Two examples are provided in this paper; they show in detail how to design a semantic grid based on multi-agent systems and ontology. In conclusion, a semantic grid for spatial information systems could improve the integration and interoperability of the spatial information grid.
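
    A minimal sketch of the dissemination/discovery pattern may help; the class and resource names below are invented for illustration and are not from the ASISG design:

      # Toy sketch: local agents push resource descriptions to a general
      # registry (dissemination) and pull matching resources back (discovery).
      class GeneralOntologyAgent:
          def __init__(self):
              self.registry = {}                      # concept -> resources

          def disseminate(self, concept, resource):   # push from a local agent
              self.registry.setdefault(concept, []).append(resource)

          def discover(self, concept):                # pull by a local agent
              return self.registry.get(concept, [])

      general = GeneralOntologyAgent()
      general.disseminate("LandCoverMap", "gis-node-A/raster/2006")
      general.disseminate("RoadNetwork", "gis-node-B/vector/roads")
      print(general.discover("LandCoverMap"))         # ['gis-node-A/raster/2006']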

  17. General intelligence predicts memory change across sleep.

    PubMed

    Fenn, Kimberly M; Hambrick, David Z

    2015-06-01

    Psychometric intelligence (g) is often conceptualized as the capability for online information processing, but it is also possible that intelligence is related to offline processing of information. Here, we investigated the relationship between psychometric g and sleep-dependent memory consolidation. Participants studied paired associates and were tested after a 12-hour retention interval that consisted entirely of wake or included a regular sleep phase. We calculated the number of word-pairs that were gained and lost across the retention interval. In a separate session, participants completed a battery of cognitive ability tests to assess g. In the wake group, g was not correlated with either memory gain or memory loss. In the sleep group, we found that g correlated positively with memory gain and negatively with memory loss. Participants with a higher level of general intelligence showed more memory gain and less memory loss across sleep. Importantly, the correlation between g and memory loss was significantly stronger in the sleep condition than in the wake condition, suggesting that the relationship between g and memory loss across time is specific to time intervals that include sleep. The present research suggests that g not only reflects the capability for online cognitive processing, but also reflects the capability for offline processes that operate during sleep.
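
    A small sketch of the gain/loss computation the study describes (the numbers below are toy values, not the study's data): pairs recalled at the second test but not the first count as gains, the reverse as losses, and each is correlated with g:

      # Toy sketch: memory gain/loss across the retention interval and their
      # correlations with psychometric g (all values invented for illustration).
      import numpy as np

      recall_before = np.array([[1, 1, 0, 1, 0],   # one row per participant
                                [1, 0, 0, 1, 1],
                                [0, 1, 1, 1, 0]])
      recall_after  = np.array([[1, 1, 1, 1, 0],
                                [1, 0, 0, 0, 1],
                                [1, 1, 1, 1, 1]])
      g = np.array([0.3, -0.5, 1.1])               # standardized g scores

      gain = ((recall_after == 1) & (recall_before == 0)).sum(axis=1)
      loss = ((recall_after == 0) & (recall_before == 1)).sum(axis=1)
      print(np.corrcoef(g, gain)[0, 1], np.corrcoef(g, loss)[0, 1])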

  18. The composite load spectra project

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Ho, H.; Kurth, R. E.

    1990-01-01

    Probabilistic methods and generic load models capable of simulating the load spectra that are induced in space propulsion system components are being developed. Four engine component types (the transfer ducts, the turbine blades, the liquid oxygen posts and the turbopump oxidizer discharge duct) were selected as representative hardware examples. The composite load spectra that simulate the probabilistic loads for these components are typically used as the input loads for a probabilistic structural analysis. The knowledge-based system approach used for the composite load spectra project provides an ideal environment for incremental development. The intelligent database paradigm employed in developing the expert system provides a smooth coupling between the numerical processing and the symbolic (information) processing. Large volumes of engine load information and engineering data are stored in database format and managed by a database management system. Numerical procedures for probabilistic load simulation and database management functions are controlled by rule modules. Rules were hard-wired as decision trees into rule modules to perform process control tasks. There are modules to retrieve load information and models. There are modules to select loads and models to carry out quick load calculations or make an input file for full duty-cycle time dependent load simulation. The composite load spectra expert system implemented today is capable of performing intelligent rocket engine load spectra simulation. Further development of the expert system will provide tutorial capability for users to learn from it.

  19. SiC/SiC Composites for 1200 C and Above

    NASA Technical Reports Server (NTRS)

    DiCarlo, J. A.; Yun, H.-M.; Morscher, G. N.; Bhatt, R. T.

    2004-01-01

    The successful replacement of metal alloys by ceramic matrix composites (CMC) in high-temperature engine components will require the development of constituent materials and processes that can provide CMC systems with enhanced thermal capability along with the key thermostructural properties required for long-term component service. This chapter presents information concerning processes and properties for five silicon carbide (SiC) fiber-reinforced SiC matrix composite systems recently developed by NASA that can operate under mechanical loading and oxidizing conditions for hundreds of hours at 1204, 1315, and 1427 C, temperatures well above current metal capability. This advanced capability stems in large part from specific NASA-developed processes that significantly improve the creep-rupture and environmental resistance of the SiC fiber as well as the thermal conductivity, creep resistance, and intrinsic thermal stability of the SiC matrices.

  20. Feasibility of Computer Processing of Technical Information on the Design of Instructional Systems. Final Report for the Period 1 July 1972 through 31 March 1973.

    ERIC Educational Resources Information Center

    Scheffler, F. L.; And Others

    A feasibility study examined the capability of a computer-based system's handling of technical information pertinent to the design of instructional systems. Structured interviews were held to assess the information needs of both researchers and practitioners and an investigation was conducted of 10 computer-based information storage and retrieval…

  1. Training Community Modeling and Simulation Business Plan: 2009 Edition

    DTIC Science & Technology

    2010-04-01

    strategic information assurance; Provide crisis action procedures training; Provide the IC SOF-specific training at the operational level...information and products • Collaborative analysis processes • Dissemination of information throughout a command and to subordinates by redundant means...centric M&S capabilities will improve training for information warfare, assist with training for homeland defense operations, crisis-management planning

  2. Towards an automated intelligence product generation capability

    NASA Astrophysics Data System (ADS)

    Smith, Alison M.; Hawes, Timothy W.; Nolan, James J.

    2015-05-01

    Creating intelligence information products is a time-consuming and difficult process for analysts faced with identifying key pieces of information relevant to a complex set of information requirements. Complicating matters, these key pieces of information exist in multiple modalities scattered across data stores, buried in huge volumes of data. This results in the current predicament analysts find themselves in: information retrieval and management consumes huge amounts of time that could be better spent performing analysis. The persistent growth in data accumulation rates will only increase the amount of time spent on these tasks without a significant advance in automated solutions for information product generation. We present a product generation tool, Automated PrOduct Generation and Enrichment (APOGEE), which aims to automate the information product creation process in order to shift the bulk of the analysts' effort from data discovery and management to analysis. APOGEE discovers relevant text, imagery, video, and audio for inclusion in information products using semantic and statistical models of unstructured content. APOGEE's mixed-initiative interface, supported by highly responsive backend mechanisms, allows analysts to dynamically control the product generation process, ensuring a maximally relevant result. The combination of these capabilities results in significant reductions in the time it takes analysts to produce information products while helping to increase overall coverage. Through evaluation with a domain expert, APOGEE has been shown to have the potential to cut product generation time by 20x. The result is a flexible end-to-end system that can be rapidly deployed in new operational settings.

  3. Report of the Panel on Computer and Information Technology

    NASA Technical Reports Server (NTRS)

    Lundstrom, Stephen F.; Larsen, Ronald L.

    1984-01-01

    Aircraft have become more and more dependent on computers (information processing) for improved performance and safety. It is clear that this activity will grow, since information processing technology has advanced by a factor of 10 every 5 years for the past 35 years and will continue to do so. Breakthroughs in device technology, from vacuum tubes through transistors to integrated circuits, contribute to this rapid pace. This progress is nearly matched by similar, though not as dramatic, advances in numerical software and algorithms. Progress has not been easy. Many technical and nontechnical challenges were surmounted. The outlook is for continued growth in capability but will require surmounting new challenges. The technology forecast presented in this report has been developed by extrapolating current trends and assessing the possibilities of several high-risk research topics. In the process, critical problem areas that require research and development emphasis have been identified. The outlook assumes a positive perspective; the projected capabilities are possible by the year 2000, and adequate resources will be made available to achieve them. Computer and information technology forecasts and the potential impacts of this technology on aeronautics are identified. Critical issues and technical challenges underlying the achievement of forecasted performance and benefits are addressed.

  4. A Forest Fire Sensor Web Concept with UAVSAR

    NASA Astrophysics Data System (ADS)

    Lou, Y.; Chien, S.; Clark, D.; Doubleday, J.; Muellerschoen, R.; Zheng, Y.

    2008-12-01

    We developed a forest fire sensor web concept with a UAVSAR-based smart sensor and onboard automated response capability that will allow us to monitor fire progression based on coarse initial information provided by an external source. This autonomous disturbance detection and monitoring system combines the unique capabilities of imaging radar with high-throughput onboard processing technology and onboard automated response capability based on specific science algorithms. In this forest fire sensor web scenario, a fire is initially located by MODIS/RapidFire or a ground-based fire observer. This information is transmitted to the UAVSAR onboard automated response system (CASPER). CASPER generates a flight plan to cover the alerted fire area and executes the flight plan. The onboard processor generates a fuel load map from raw radar data, which, used with wind and elevation information, predicts the likely fire progression. CASPER then autonomously alters the flight plan to track the fire progression, providing this information to the fire fighting team on the ground. We can also relay the precise fire location to other remote sensing assets with autonomous response capability, such as the Earth Observing-1 (EO-1) hyper-spectral imager, to acquire the fire data.

  5. Seismographs, sensors, and satellites: Better technology for safer communities

    USGS Publications Warehouse

    Groat, C.G.

    2004-01-01

    In the past 25 years, our ability to measure, monitor, and model the processes that lead to natural disasters has increased dramatically. Equally important has been the improvement in our technological capability to communicate information about hazards to those whose lives may be affected. These innovations in tracking and communicating the changes-floods, earthquakes, wildfires, volcanic eruptions-in our dynamic planet, supported by a deeper understanding of earth processes, enable us to expand our predictive capabilities and point the way to a safer future. ?? 2004 Elsevier Ltd. All rights reserved.

  6. Y-12 Integrated Materials Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alspaugh, D. H.; Hickerson, T. W.

    2002-06-03

    The Integrated Materials Management System, when fully implemented, will provide the Y-12 National Security Complex with advanced inventory information and analysis capabilities and enable effective assessment, forecasting and management of nuclear materials, critical non-nuclear materials, and certified supplies. These capabilities will facilitate future Y-12 stockpile management work, enhance interfaces to existing National Nuclear Security Administration (NNSA) corporate-level information systems, and enable interfaces to planned NNSA systems. In the current national nuclear defense environment where, for example, weapons testing is not permitted, material managers need better, faster, more complete information about material properties and characteristics. They now must manage non-special nuclear material at the same high level as they have managed SNM, and information capabilities about both must be improved. The full automation and integration of business activities related to nuclear and non-nuclear materials that will be put into effect by the Integrated Materials Management System (IMMS) will significantly improve and streamline the process of providing vital information to Y-12 and NNSA managers. This overview looks at the kinds of information improvements targeted by the IMMS project, related issues, the proposed information architecture, and the progress to date in implementing the system.

  7. Technical and Management Information System (TMIS)

    NASA Technical Reports Server (NTRS)

    Rau, Timothy R.

    1987-01-01

    The TMIS goals developed to support the Space Station Program (SSP) mission requirements are outlined. The TMIS will provide common capabilities to all SSP centers and facilitate the flow of technical and management information throughout the program as well as SSP decision-making processes. A summary is presented of the various TMIS phases.

  8. Automating Technical Processes and Reference Services Using SPIRES.

    ERIC Educational Resources Information Center

    Buckley, Joseph James

    1983-01-01

    Examines the capabilities, cost-effectiveness, and flexibility of the Stanford Public Information Retrieval System (SPIRES), an online information retrieval system producing a variety of printed products, and notes its use in the Title I Evaluation Clearinghouse, advantages of SPIRES, programing, and availability. Eleven references and a five-item…

  9. Information Robots and Manipulators.

    ERIC Educational Resources Information Center

    Katys, G. P.; And Others

    In the modern conception, a robot is a complex automatic cybernetic system capable of executing various operations in the sphere of human activity and combining, in various respects, the imitative capacity of the physical and mental activity of man. Robots are a class of automatic information systems intended for search, collection, processing, and…

  10. Distributed decision support for the 21st century mission space

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2002-07-01

    The past decade has produced significant changes in the conduct of military operations: increased humanitarian missions, asymmetric warfare, the reliance on coalitions and allies, stringent rules of engagement, concern about casualties, and the need for sustained air operations. Future mission commanders will need to assimilate a tremendous amount of information, make quick-response decisions, and quantify the effects of those decisions in the face of uncertainty. Integral to this process is creating situational assessment (understanding the mission space), simulation to analyze alternative futures, current capabilities, planning assessments, course-of-action assessments, and a common operational picture (keeping everyone on the same sheet of paper). Decision support tools in a distributed collaborative environment offer the capability of decomposing these complex multitask processes and distributing them over a dynamic set of execution assets. Decision support technologies can semi-automate activities, such as planning an operation, that have a reasonably well-defined process and provide machine-level interfaces to refine the myriad of information that is not currently fused. The marriage of information and simulation technologies provides the mission commander with a collaborative virtual environment for planning and decision support.

  11. Mediation, Alignment, and Information Services for Semantic interoperability (MAISSI): A Trade Study

    DTIC Science & Technology

    2007-06-01

    Modeling Notation (BPMN) • Business Process Definition Metamodel (BPDM) A Business Process (BP) is a defined sequence of steps to be executed in...enterprise applications, to evaluate the capabilities of suppliers, and to compare against the competition. BPMN standardizes flowchart diagrams that

  12. Distributed collaborative environments for virtual capability-based planning

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2003-09-01

    Distributed collaboration is an emerging technology that will significantly change how decisions are made in the 21st century. Collaboration involves two or more geographically dispersed individuals working together to share and exchange data, information, knowledge, and actions. The marriage of information, collaboration, and simulation technologies provides the decision maker with a collaborative virtual environment for planning and decision support. This paper reviews research focused on applying an open-standards, agent-based framework with integrated modeling and simulation to a new Air Force initiative in capability-based planning, and on the ability to implement it in a distributed virtual environment. The Virtual Capability Planning effort will provide decision-quality knowledge for Air Force resource allocation and investment planning, including examination of proposed capabilities and the cost of alternative approaches, the impact of technologies, identification of primary risk drivers, and creation of executable acquisition strategies. The transformed Air Force business processes are enabled by iterative use of constructive and virtual modeling, simulation, and analysis together with information technology. These tools are applied collaboratively via a technical framework by all the affected stakeholders: warfighter, laboratory, product center, logistics center, test center, and primary contractor.

  13. Diagnosis and Prognosis of Weapon Systems

    NASA Technical Reports Server (NTRS)

    Nolan, Mary; Catania, Rebecca; deMare, Gregory

    2005-01-01

    The Prognostics Framework is a set of software tools with an open architecture that affords a capability to integrate various prognostic software mechanisms and to provide information for operational and battlefield decision-making and logistical planning pertaining to weapon systems. The Prognostics Framework is also a system-level health-management software system that (1) receives data from performance-monitoring and built-in-test sensors and from other prognostic software and (2) processes the received data to derive a diagnosis and a prognosis for a weapon system. This software relates the diagnostic and prognostic information to the overall health of the system, to the ability of the system to perform specific missions, and to needed maintenance actions and maintenance resources. In the development of the Prognostics Framework, effort was focused primarily on extending previously developed model-based diagnostic-reasoning software to add prognostic reasoning capabilities, including capabilities to perform statistical analyses and to utilize information pertaining to deterioration of parts, failure modes, time sensitivity of measured values, mission criticality, historical data, and trends in measurement data. As thus extended, the software offers an overall health-monitoring capability.
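
    The trend-based prognostic reasoning described above can be illustrated with a minimal sketch. The Python fragment below is an illustrative assumption, not the Prognostics Framework's actual API: it fits a linear degradation trend to a monitored parameter and extrapolates to a failure threshold to estimate remaining useful life; the function name, data, and threshold are all hypothetical.

```python
# Hypothetical sketch of trend-based prognosis: fit a linear degradation
# trend to a monitored parameter and extrapolate to a failure threshold.
# Function name, data, and threshold are illustrative assumptions.
import numpy as np

def estimate_rul(times, values, failure_threshold):
    """Estimate time remaining until the fitted trend crosses the threshold."""
    slope, intercept = np.polyfit(times, values, 1)  # linear degradation model
    if slope <= 0:
        return float("inf")  # no upward degradation trend detected
    t_fail = (failure_threshold - intercept) / slope
    return max(t_fail - times[-1], 0.0)

# Example: a bearing temperature drifting toward an assumed 120 C limit.
t = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
temp = np.array([70.0, 74.0, 79.0, 83.0, 88.0])
print(f"Estimated remaining useful life: {estimate_rul(t, temp, 120.0):.1f} hours")
```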

  14. Structures and Materials Experimental Facilities and Capabilities Catalog

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G. (Compiler); Kurtz-Husch, Jeanette D. (Compiler)

    2000-01-01

    The NASA Center of Excellence for Structures and Materials at Langley Research Center is responsible for conducting research and developing usable technology in the areas of advanced materials and processing technologies, durability, damage tolerance, structural concepts, advanced sensors, intelligent systems, aircraft ground operations, reliability, prediction tools, performance validation, aeroelastic response, and structural dynamics behavior for aerospace vehicles. Supporting the research activities is a complementary set of facilities and capabilities documented in this report. Because of the volume of information, the entry for each capability was restricted in most cases to one page. Specific questions from potential customers or partners should be directed to the points of contact provided with the various capabilities. Grouping of the equipment is by location as opposed to function. Geographical information on the various buildings housing the equipment is also provided. Since this is the first time such an inventory has been collected at Langley, it is by no means complete; it is estimated that over 90 percent of the equipment capabilities at hand are included, but equipment is continuously being updated and will be reported in the future.

  15. Clinical Summarization Capabilities of Commercially-available and Internally-developed Electronic Health Records

    PubMed Central

    Laxmisan, A.; McCoy, A.B.; Wright, A.; Sittig, D.F.

    2012-01-01

    Objective Clinical summarization, the process by which relevant patient information is electronically summarized and presented at the point of care, is of increasing importance given the increasing volume of clinical data in electronic health record systems (EHRs). There is a paucity of research on electronic clinical summarization, including the capabilities of currently available EHR systems. Methods We compared different aspects of general clinical summary screens used in twelve different EHR systems using a previously described conceptual model: AORTIS (Aggregation, Organization, Reduction, Interpretation and Synthesis). Results We found a wide variation in the EHRs’ summarization capabilities: all systems were capable of simple aggregation and organization of limited clinical content, but only one demonstrated an ability to synthesize information from the data. Conclusion Improvement of the clinical summary screen functionality for currently available EHRs is necessary. Further research should identify strategies and methods for creating easy to use, well-designed clinical summary screens that aggregate, organize and reduce all pertinent patient information as well as provide clinical interpretations and synthesis as required. PMID:22468161
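
    The AORTIS stages named above lend themselves to a small illustration. The following toy pipeline is a sketch under assumed data structures, not any EHR vendor's implementation: it runs laboratory observations through Aggregation, Organization, Reduction, Interpretation, and Synthesis.

```python
# Toy sketch of the AORTIS stages (Aggregation, Organization, Reduction,
# Interpretation, Synthesis). Data model, limits, and rules are
# illustrative assumptions, not any EHR's API.
from collections import defaultdict

def aggregate(observations):              # Aggregation: pull values together
    return list(observations)

def organize(observations):               # Organization: group by test name
    groups = defaultdict(list)
    for name, value in observations:
        groups[name].append(value)
    return groups

def reduce_(groups):                       # Reduction: keep most recent value
    return {name: values[-1] for name, values in groups.items()}

def interpret(latest, limits):             # Interpretation: flag abnormal values
    return {name: (value, "HIGH" if value > limits.get(name, float("inf")) else "ok")
            for name, value in latest.items()}

def synthesize(flags):                     # Synthesis: one-line clinical summary
    abnormal = [name for name, (_, flag) in flags.items() if flag == "HIGH"]
    return "Abnormal: " + ", ".join(abnormal) if abnormal else "All results within limits"

obs = [("glucose", 95), ("glucose", 180), ("sodium", 140)]
print(synthesize(interpret(reduce_(organize(aggregate(obs))), {"glucose": 125})))
```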

  16. Mission informed needed information: discoverable, available sensing sources (MINI-DASS): the operators and process flows the magic rabbits must negotiate

    NASA Astrophysics Data System (ADS)

    Kolodny, Michael A.

    2017-05-01

    Today's battlefield space is extremely complex, dealing with an enemy that is neither well-defined nor well-understood. Adversaries are comprised of widely-distributed, loosely-networked groups engaging in nefarious activities. Situational understanding is needed by decision makers; understanding of adversarial capabilities and intent is essential. The information needed at any time depends on the mission/task at hand. Information sources potentially providing mission-relevant information are disparate and numerous; they include sensors, social networks, fusion engines, the internet, etc. Management of these multi-dimensional informational sources is critical. This paper presents a new approach being undertaken to answer the challenge of enhancing battlefield understanding by optimizing the utilization of available informational sources (means) for required missions/tasks, as well as determining the "goodness" of the information acquired in meeting the capabilities needed. Requirements are usually expressed in terms of a presumed technology solution (e.g., imagery). A metaphor of the "magic rabbits" was conceived to remove presumed technology solutions from requirements by claiming the "required" technology is obsolete. Instead, intelligent "magic rabbits" are used to provide needed information. The question then becomes: "WHAT INFORMATION DO YOU NEED THE RABBITS TO PROVIDE YOU?" This paper describes a new approach called Mission-Informed Needed Information - Discoverable, Available Sensing Sources (MINI-DASS) that designs a process that builds information acquisition missions and determines what the "magic rabbits" need to provide in a manner that is machine understandable. Also described are the Missions and Means Framework (MMF) model used, the process flow utilized, the approach to developing an ontology of information-source means, and the approach for determining the value of the information acquired.

  17. Information systems on human resources for health: a global review

    PubMed Central

    2012-01-01

    Background Although attainment of the health-related Millennium Development Goals relies on countries having adequate numbers of human resources for health (HRH) and their appropriate distribution, global understanding of the systems used to generate information for monitoring HRH stock and flows, known as human resources information systems (HRIS), is minimal. While HRIS are increasingly recognized as integral to health system performance assessment, baseline information regarding their scope and capability around the world has been limited. We conducted a review of the available literature on HRIS implementation processes in order to draw this baseline. Methods Our systematic search initially retrieved 11 923 articles in four languages published in peer-reviewed and grey literature. Following the selection of those articles which detailed HRIS implementation processes, reviews of their contents were conducted using two-person teams, each assigned to a national system. A data abstraction tool was developed and used to facilitate objective assessment. Results Ninety-five articles with relevant HRIS information were reviewed, mostly from the grey literature, which comprised 84 % of all documents. The articles represented 63 national HRIS and two regionally integrated systems. Whereas a high percentage of countries reported the capability to generate workforce supply and deployment data, few systems were documented as being used for HRH planning and decision-making. Of the systems examined, only 23 % explicitly stated they collect data on workforce attrition. The majority of countries experiencing crisis levels of HRH shortages (56 %) did not report data on health worker qualifications or professional credentialing as part of their HRIS. Conclusion Although HRIS are critical for evidence-based human resource policy and practice, there is a dearth of information about these systems, including their current capabilities. The absence of standardized HRIS profiles (including documented processes for data collection, management, and use) limits understanding of the availability and quality of information that can be used to support effective and efficient HRH strategies and investments at the national, regional, and global levels. PMID:22546089

  18. Input Processing at First Exposure to a Sign Language

    ERIC Educational Resources Information Center

    Ortega, Gerardo; Morgan, Gary

    2015-01-01

    There is growing interest in learners' cognitive capacities to process a second language (L2) at first exposure to the target language. Evidence suggests that L2 learners are capable of processing novel words by exploiting phonological information from their first language (L1). Hearing adult learners of a sign language, however, cannot fall back…

  19. Design requirements for operational earth resources ground data processing

    NASA Technical Reports Server (NTRS)

    Baldwin, C. J.; Bradford, L. H.; Burnett, E. S.; Hutson, D. E.; Kinsler, B. A.; Kugle, D. R.; Webber, D. S.

    1972-01-01

    Realistic tradeoff data and evaluation techniques were studied that permit conceptual design of operational earth resources ground processing systems. Methodology for determining user requirements that utilize the limited information available from users is presented along with definitions of sensor capabilities projected into the shuttle/station era. A tentative method is presented for synthesizing candidate ground processing concepts.

  20. Conceptual information processing: A robust approach to KBS-DBMS integration

    NASA Technical Reports Server (NTRS)

    Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond

    1987-01-01

    Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivates the investigation of information processing paradigms. One such paradigm is concept based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.

  1. Matching Alternative Addresses: a Semantic Web Approach

    NASA Astrophysics Data System (ADS)

    Ariannamazi, S.; Karimipour, F.; Hakimpour, F.

    2015-12-01

    Rapid development of crowd-sourcing, or volunteered geographic information (VGI), provides opportunities for authoritative bodies that deal with geospatial information. Heterogeneity of multiple data sources and inconsistency of data types are key characteristics of VGI datasets. The expansion of cities has resulted in a growing number of POIs in OpenStreetMap, a well-known VGI source, which causes the datasets to become outdated in short periods of time. These changes to spatial and aspatial attributes of features, such as names and addresses, may cause confusion or ambiguity in processes that require a feature's literal information, such as addressing and geocoding. VGI sources neither conform to specific vocabularies nor remain in a specific schema for long periods of time. As a result, the integration of VGI sources is crucial and inevitable in order to avoid duplication and the waste of resources. Information integration can be used to match features and qualify different annotation alternatives for disambiguation. This study enhances the search capabilities of geospatial tools with applications able to understand user terminology in pursuit of an efficient way of finding desired results. The Semantic Web is a capable tool for developing technologies that deal with lexical and numerical calculations and estimations. There is a vast amount of literal-spatial data representing the capability of linguistic information in knowledge modeling, but these resources need to be harmonized based on Semantic Web standards. The process of making addresses homogeneous generates a helpful tool based on spatial data integration and lexical annotation matching and disambiguation.
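
    The lexical-matching step implied by this abstract can be sketched simply. The fragment below is an illustrative assumption (the paper's actual Semantic Web pipeline also involves ontology alignment and spatial reasoning): it scores alternative address strings by normalized token overlap (Jaccard similarity) and returns the best candidate above an assumed threshold.

```python
# Hedged sketch of matching alternative address strings by normalized
# token overlap (Jaccard similarity). Threshold and normalization rules
# are assumptions for illustration only.
import re

def tokens(address):
    return set(re.findall(r"[a-z0-9]+", address.lower()))

def jaccard(a, b):
    ta, tb = tokens(a), tokens(b)
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def match(addr, candidates, threshold=0.5):
    scored = [(jaccard(addr, c), c) for c in candidates]
    best = max(scored)
    return best[1] if best[0] >= threshold else None

print(match("12 Azadi Ave, Tehran",
            ["Azadi Avenue 12, Tehran", "5 Enghelab St, Tehran"]))
```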

  2. Connecting Biology to Electronics: Molecular Communication via Redox Modality.

    PubMed

    Liu, Yi; Li, Jinyang; Tschirhart, Tanya; Terrell, Jessica L; Kim, Eunkyoung; Tsao, Chen-Yu; Kelly, Deanna L; Bentley, William E; Payne, Gregory F

    2017-12-01

    Biology and electronics are both expert at accessing, analyzing, and responding to information. Biology uses ions, small molecules, and macromolecules to receive, analyze, store, and transmit information, whereas electronic devices receive input in the form of electromagnetic radiation, process the information using electrons, and then transmit output as electromagnetic waves. Generating the capabilities to connect biological and electronic modalities offers exciting opportunities to shape the future of biosensors, point-of-care medicine, and wearable/implantable devices. Redox reactions offer unique opportunities for bio-device communication that spans the molecular modalities of biology and the electrical modality of devices. Here, an approach is adopted that searches for redox information through interactive electrochemical probing analogous to sonar. The capabilities of this approach to access global chemical information as well as information on specific redox-active chemical entities are illustrated using recent examples. An example is also provided of the use of synthetic biology to recognize external molecular information, process this information through intracellular signal transduction pathways, and generate output responses that can be detected by electrical modalities. Finally, exciting results in the use of redox reactions to actuate biology are provided to illustrate that synthetic biology offers the potential to guide biological response through electrical cues. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. NASA's Earth Observing System Data and Information System - EOSDIS

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram K.

    2011-01-01

    This slide presentation reviews the work of NASA's Earth Observing System Data and Information System (EOSDIS), a petabyte-scale archive of environmental data that supports global climate change research. The Earth Science Data Systems provide end-to-end capabilities to deliver data and information products to users in support of understanding the Earth system. The presentation contains photographs from space of recent events (e.g., the effects of the tsunami in Japan and the wildfires in Australia). It also includes details of the Data Centers that provide the data to EOSDIS and Science Investigator-led Processing Systems. Information about the Land, Atmosphere Near-real-time Capability for EOS (LANCE) and some of the uses that the system has made possible are reviewed. Also included is information about how to access the data, and evolutionary plans for the future of the system.

  4. Creating Communications, Computing, and Networking Technology Development Road Maps for Future NASA Human and Robotic Missions

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Hayden, Jeffrey L.

    2005-01-01

    For human and robotic exploration missions in the Vision for Exploration, roadmaps are needed for capability development and investments based on advanced technology developments. A roadmap development process was undertaken for the communications and networking capabilities and technologies needed for future human and robotic missions. The underlying processes are derived from work carried out during development of the future space communications architecture, and NASA's Space Architect Office (SAO) defined formats and structures for accumulating data. Interrelationships were established among emerging requirements, the capability analysis and technology status, and performance data. After developing an architectural communications and networking framework structured around the assumed needs for human and robotic exploration in the vicinity of Earth and the Moon, along the path to Mars, and in the vicinity of Mars, information was gathered from expert participants. This information was used to identify the capabilities expected from the new infrastructure and the technological gaps in the way of obtaining them. We define realistic, long-term space communication architectures based on emerging needs and translate the needs into the interfaces, functions, and computer processing that will be required. In developing our roadmapping process, we defined requirements for achieving end-to-end activities that will be carried out by future NASA human and robotic missions. This paper describes: 1) the architectural framework developed for analysis; 2) our approach to gathering and analyzing data from NASA, industry, and academia; 3) an outline of the technology research to be done, including milestones for technology research and demonstrations with timelines; and 4) the technology roadmaps themselves.

  5. Reference Service and Bounded Rationality: Helping Students with Research.

    ERIC Educational Resources Information Center

    Chu, Felix T.

    1994-01-01

    In university libraries, reference librarians often get ambiguous questions to which they try to give appropriate answers. Because of limitations on resources, time, and mental capability for information processing, the decision-making process involved in answering reference questions becomes bounded by the rationality of these constraints.…

  6. Near Real-time Scientific Data Analysis and Visualization with the ArcGIS Platform

    NASA Astrophysics Data System (ADS)

    Shrestha, S. R.; Viswambharan, V.; Doshi, A.

    2017-12-01

    Scientific multidimensional data are generated from a variety of sources and platforms. These datasets are mostly produced by earth observation and/or modeling systems. Agencies like NASA, NOAA, USGS, and ESA produce large volumes of near real-time observation, forecast, and historical data that drive fundamental research and its applications in larger aspects of humanity, from basic decision making to disaster response. A common big data challenge for organizations working with multidimensional scientific data and imagery collections is the time and resources required to manage and process such large volumes and varieties of data. The challenge of adopting data-driven real-time visualization and analysis, as well as the need to share these large datasets, workflows, and information products with wider and more diverse communities, brings an opportunity to use the ArcGIS platform to handle such demand. In recent years, a significant effort has been put into expanding the capabilities of ArcGIS to support multidimensional scientific data across the platform. New capabilities in ArcGIS to support scientific data management, processing, and analysis, as well as creating information products from large volumes of data using the image server technology, are becoming widely used in earth science and across other domains. We will discuss and share the challenges associated with big data faced by the geospatial science community and how we have addressed these challenges in the ArcGIS platform. We will share a few use cases, such as NOAA High-Resolution Rapid Refresh (HRRR) data, that demonstrate how we access large collections of near real-time data (stored on-premise or in the cloud), disseminate them dynamically, process and analyze them on the fly, and serve them to a variety of geospatial applications. We will also share how on-the-fly processing using raster function capabilities can be extended to create persisted data and information products using raster analytics capabilities that exploit distributed computing in an enterprise environment.

  7. Real-time physiological monitoring with distributed networks of sensors and object-oriented programming techniques

    NASA Astrophysics Data System (ADS)

    Wiesmann, William P.; Pranger, L. Alex; Bogucki, Mary S.

    1998-05-01

    Remote monitoring of physiologic data from individual high-risk workers distributed over time and space is a considerable challenge. This is often due to an inadequate capability to accurately integrate large amounts of data into usable information in real time. In this report, we have used the vertical and horizontal organization of the 'fireground' as a framework to design a distributed network of sensors. In this system, sensor output is linked through a hierarchical object-oriented programming process to accurately interpret physiological data, incorporate these data into a synchronous model, and relay processed data, trends, and predictions to members of the fire incident command structure. There are several unique aspects to this approach. The first includes a process to account for variability in vital parameter values for each individual's normal physiologic response by including an adaptive network in each data process. This information is used by the model in an iterative process to baseline a 'normal' physiologic response to a given stress for each individual and to detect deviations that indicate dysfunction or a significant insult. The second unique capability of the system orders the information for each user, including the subject, local company officers, medical personnel, and the incident commanders. Information can be retrieved and used for training exercises and after-action analysis. Finally, this system can easily be adapted to existing communication and processing links while incorporating the best parts of current models through the use of object-oriented programming techniques. These modern software techniques are well suited to handling multiple data processes independently over time in a distributed network.
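
    The adaptive per-individual baselining idea can be sketched in a few lines. The class below is a hypothetical illustration, not the fireground system itself: an exponential moving average learns each worker's 'normal' reading, and values far outside an assumed band raise an alert before the slowly adapting baseline absorbs them.

```python
# Illustrative sketch (assumed parameters, not the actual system) of an
# adaptive per-individual physiological baseline with deviation alerts.
class AdaptiveBaseline:
    def __init__(self, alpha=0.1, band=25.0):
        self.alpha = alpha     # learning rate for the running baseline
        self.band = band       # allowed deviation before alerting (bpm)
        self.baseline = None

    def update(self, reading):
        if self.baseline is None:
            self.baseline = reading            # first reading defines 'normal'
        deviation = reading - self.baseline
        alert = abs(deviation) > self.band
        # Adapt slowly so a genuine insult is flagged before it is absorbed.
        self.baseline += self.alpha * deviation
        return alert, deviation

monitor = AdaptiveBaseline()
for hr in [82, 85, 88, 90, 131]:
    alert, dev = monitor.update(hr)
    if alert:
        print(f"ALERT: heart rate {hr} bpm deviates {dev:+.0f} bpm from baseline")
```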

  8. High-Resolution Characterization of UMo Alloy Microstructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Devaraj, Arun; Kovarik, Libor; Joshi, Vineet V.

    2016-11-30

    This report highlights the capabilities and procedure for high-resolution characterization of UMo fuels at PNNL. Uranium-molybdenum (UMo) fuel processing steps, from casting to forming the final fuel, directly affect the microstructure of the fuel, which in turn dictates the in-reactor performance of the fuel under irradiation. In order to understand the influence of processing on UMo microstructure, microstructure characterization techniques are necessary. Higher-resolution characterization techniques like transmission electron microscopy (TEM) and atom probe tomography (APT) are needed to interrogate the details of the microstructure. The findings from TEM and APT are also directly beneficial for developing predictive multiscale modeling tools that can predict the microstructure as a function of process parameters. This report provides background on focused-ion-beam-based TEM and APT sample preparation, TEM and APT analysis procedures, and the unique information achievable through such advanced characterization capabilities for UMo fuels, from a fuel fabrication capability viewpoint.

  9. Aerothermodynamic Flight Simulation Capabilities for Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Miller, Charles G.

    1998-01-01

    Aerothermodynamics, encompassing aerodynamics, aeroheating, and fluid dynamics and physical processes, is the genesis for the design and development of advanced space transportation vehicles and provides crucial information to other disciplines such as structures, materials, propulsion, avionics, and guidance, navigation and control. Sources of aerothermodynamic information are ground-based facilities, Computational Fluid Dynamics (CFD) and engineering computer codes, and flight experiments. Utilization of this aerothermodynamic triad provides the optimum aerothermodynamic design to safely satisfy mission requirements while reducing design conservatism, risk and cost. The iterative aerothermodynamic process for initial screening/assessment of aerospace vehicle concepts, optimization of aerolines to achieve/exceed mission requirements, and benchmark studies for final design and establishment of the flight data book are reviewed. Aerothermodynamic methodology centered on synergism between ground-based testing and CFD predictions is discussed for various flow regimes encountered by a vehicle entering the Earth's atmosphere from low Earth orbit. An overview of the resources/infrastructure required to provide accurate/credible aerothermodynamic information in a timely manner is presented. Impacts on Langley's aerothermodynamic capabilities due to recent programmatic changes such as Center reorganization, downsizing, outsourcing, industry (as opposed to NASA) led programs, and so forth are discussed. Sample applications of these capabilities to high Agency priority, fast-paced programs such as Reusable Launch Vehicle (RLV)/X-33 Phases I and II, X-34, Hyper-X and X-38 are presented and lessons learned discussed. Lastly, enhancements in ground-based testing/CFD capabilities necessary to partially/fully satisfy future requirements are addressed.

  10. Command and Control Common Semantic Core Required to Enable Net-centric Operations

    DTIC Science & Technology

    2008-05-20

    automated processing capability. A former US Marine Corps component C4 director during Operation Iraqi Freedom identified the problems of 1) uncertainty... interoperability improvements to warfighter community processes, thanks to ubiquitous automated processing, are likely high and somewhat easier to quantify. A... synchronized with the actions of other partners/warfare communities. This requires high-quality information, rapid sharing and automated processing – which

  11. Learning, Unlearning and Relearning--Knowledge Life Cycles in Library and Information Science Education

    ERIC Educational Resources Information Center

    Bedford, Denise A. D.

    2015-01-01

    The knowledge life cycle is applied to two core capabilities of library and information science (LIS) education--teaching, and research and development. The knowledge claim validation, invalidation and integration steps of the knowledge life cycle are translated to learning, unlearning and relearning processes. Mixed methods are used to determine…

  12. Computers and Information Flow.

    ERIC Educational Resources Information Center

    Patrick, R. L.

    This paper is designed to fill the need for an easily understood introduction to the computing and data processing field for the layman who has, or can expect to have, some contact with it. Information provided includes the unique terminology and jargon of the field, the various types of computers and the scope of computational capabilities, and…

  13. Small Business Innovations (Automated Information)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Bruce G. Jackson & Associates Document Director is an automated tool that combines word processing and database management technologies to offer the flexibility and convenience of text processing with the linking capability of database management. Originally developed for NASA, it provides a means to collect and manage information associated with requirements development. The software system was used by NASA in the design of the Assured Crew Return Vehicle, as well as by other government and commercial organizations including the Southwest Research Institute.

  14. Large-scale neuromorphic computing systems

    NASA Astrophysics Data System (ADS)

    Furber, Steve

    2016-10-01

    Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.

  15. Fusing Sensor Paradigms to Acquire Chemical Information: An Integrative Role for Smart Biopolymeric Hydrogels

    PubMed Central

    Kim, Eunkyoung; Liu, Yi; Ben-Yoav, Hadar; Winkler, Thomas E.; Yan, Kun; Shi, Xiaowen; Shen, Jana; Kelly, Deanna L.; Ghodssi, Reza; Bentley, William E.

    2017-01-01

    The Information Age transformed our lives but it has had surprisingly little impact on the way chemical information (e.g., from our biological world) is acquired, analyzed and communicated. Sensor systems are poised to change this situation by providing rapid access to chemical information. This access will be enabled by technological advances from various fields: biology enables the synthesis, design and discovery of molecular recognition elements as well as the generation of cell-based signal processors; physics and chemistry are providing nano-components that facilitate the transmission and transduction of signals rich with chemical information; microfabrication is yielding sensors capable of receiving these signals through various modalities; and signal processing analysis enhances the extraction of chemical information. The authors contend that integral to the development of functional sensor systems will be materials that (i) enable the integrative and hierarchical assembly of various sensing components (for chemical recognition and signal transduction) and (ii) facilitate meaningful communication across modalities. It is suggested that stimuli-responsive self-assembling biopolymers can perform such integrative functions, and redox provides modality-spanning communication capabilities. Recent progress toward the development of electrochemical sensors to manage schizophrenia is used to illustrate the opportunities and challenges for enlisting sensors for chemical information processing. PMID:27616350

  16. Defence Test and Evaluation Roadmap

    DTIC Science & Technology

    2008-01-01

    T&E can be employed to prove, demonstrate or assess the ability of proposed and existing capability systems, new or upgraded, to satisfy specified... T&E is a process to obtain information to support the objective assessment of a Capability System with known confidence, and to confirm whether... for the ADF is a 'balanced, networked, and deployable force, staffed by dedicated and professional people, that operates within a culture of

  17. Synthetic genetic polymers capable of heredity and evolution.

    PubMed

    Pinheiro, Vitor B; Taylor, Alexander I; Cozens, Christopher; Abramov, Mikhail; Renders, Marleen; Zhang, Su; Chaput, John C; Wengel, Jesper; Peak-Chew, Sew-Yeu; McLaughlin, Stephen H; Herdewijn, Piet; Holliger, Philipp

    2012-04-20

    Genetic information storage and processing rely on just two polymers, DNA and RNA, yet whether their role reflects evolutionary history or fundamental functional constraints is currently unknown. With the use of polymerase evolution and design, we show that genetic information can be stored in and recovered from six alternative genetic polymers based on simple nucleic acid architectures not found in nature [xeno-nucleic acids (XNAs)]. We also select XNA aptamers, which bind their targets with high affinity and specificity, demonstrating that beyond heredity, specific XNAs have the capacity for Darwinian evolution and folding into defined structures. Thus, heredity and evolution, two hallmarks of life, are not limited to DNA and RNA but are likely to be emergent properties of polymers capable of information storage.

  18. Techniques and potential capabilities of multi-resolutional information (knowledge) processing

    NASA Technical Reports Server (NTRS)

    Meystel, A.

    1989-01-01

    A concept of nested hierarchical (multi-resolutional, pyramidal) information (knowledge) processing is introduced for a variety of systems, including data and/or knowledge bases, vision, control, and manufacturing systems, industrial automated robots, and (self-programmed) autonomous intelligent machines. A set of practical recommendations is presented using a case study of a multiresolutional object representation. It is demonstrated here that any intelligent module transforms (sometimes irreversibly) the knowledge it deals with, and this transformation affects the subsequent computation processes, e.g., those of decision and control. Several types of knowledge transformation are reviewed. Conditions are analyzed whose satisfaction is required for the organization and processing of redundant information (knowledge) in multi-resolutional systems. Providing a definite degree of redundancy is one of these conditions.
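
    A minimal concrete instance of a nested multi-resolutional representation is an image pyramid, sketched below under assumed 2x2-averaging semantics: each level is a coarser, deliberately redundant transformation of the one beneath it, in the spirit of the knowledge transformations discussed above.

```python
# Minimal sketch of a multi-resolutional (pyramidal) representation:
# each level halves resolution by 2x2 averaging. Coarse levels support
# fast approximate decisions; fine levels refine them. The redundancy
# across levels is deliberate. Parameters are illustrative assumptions.
import numpy as np

def build_pyramid(image, levels):
    pyramid = [image]
    for _ in range(levels - 1):
        img = pyramid[-1]
        h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2  # crop to even size
        coarse = img[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        pyramid.append(coarse)
    return pyramid

image = np.arange(64, dtype=float).reshape(8, 8)
for level, img in enumerate(build_pyramid(image, 3)):
    print(f"level {level}: shape {img.shape}")
```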

  19. Guidelines to Data Processing Management.

    ERIC Educational Resources Information Center

    Data Processing Management Association, Park Ridge, IL.

    This is a revised and updated version of an earlier published set of guidelines. As in the instance of the first edition, this volume contains contributions by some of the most capable consultants in the information processing field. Their comments are based on sound, proved judgment tested in day-to-day operations at installations throughout the…

  20. Quantity and unit extraction for scientific and technical intelligence analysis

    NASA Astrophysics Data System (ADS)

    David, Peter; Hawes, Timothy

    2017-05-01

    Scientific and Technical (S and T) intelligence analysts consume huge amounts of data to understand how scientific progress and engineering efforts affect current and future military capabilities. One of the most important types of information S and T analysts exploit is the quantities discussed in their source material. Frequencies, ranges, size, weight, power, and numerous other properties and measurements describing the performance characteristics of systems, and the engineering constraints that define them, must be culled from source documents before quantified analysis can begin. Automating the process of finding and extracting the relevant quantities from a wide range of S and T documents is difficult because information about quantities and their units is often contained in unstructured text with ad hoc conventions used to convey their meaning. Currently, even a simple task such as searching for documents discussing RF frequencies in a band of interest is a labor-intensive and error-prone process. This research addresses the challenges facing development of a document processing capability that extracts quantities and units from S and T data, and shows how Natural Language Processing algorithms can be used to overcome these challenges.
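
    The extraction task can be sketched with a simple pattern-based approach. The regular expression below is an illustrative assumption (a production S and T pipeline would need far richer NLP and unit handling); it pulls single values and ranges with a small, assumed unit vocabulary out of free text.

```python
# Hedged sketch of quantity-and-unit extraction: a regex captures numeric
# values, optional ranges ("2.7 to 3.1"), and a small assumed unit list.
import re

UNIT = r"(?:GHz|MHz|kHz|Hz|km|m|cm|mm|kg|g|W|kW|MW|dB)"
PATTERN = re.compile(
    rf"(?P<low>\d+(?:\.\d+)?)\s*(?:-|to)?\s*(?P<high>\d+(?:\.\d+)?)?\s*(?P<unit>{UNIT})\b"
)

text = "The radar operates at 2.7 to 3.1 GHz with 25 kW peak power over 150 km."
for m in PATTERN.finditer(text):
    low, high, unit = m.group("low"), m.group("high"), m.group("unit")
    span = f"{low}-{high}" if high else low
    print(f"quantity: {span} {unit}")
```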

  1. An optoelectronic system for fringe pattern analysis

    NASA Astrophysics Data System (ADS)

    Sciammarella, C. A.; Ahmadshahi, M.

    A system capable of retrieving and processing information recorded in fringe patterns is reported. The principal components are described as well as the architecture in which they are assembled. An example of application is given.

  2. Exploiting the Capabilities of NASA's Giovanni System for Oceanographic Education

    NASA Technical Reports Server (NTRS)

    Acker, James G.; Petrucio, Emil; Leptoukh, Gregory; Shen, Suhung

    2007-01-01

    The NASA Goddard Earth Science Data and Information Services Center (GES DISC) Giovanni system [GES DISC Interactive Online Visualization ANd aNalysis Infrastructure] has significant capabilities for oceanographic education and independent research utilizing ocean color radiometry data products. Giovanni allows Web-based data discovery and basic analyses, and can be used both for guided illustration of a variety of marine processes and phenomena, and for independent research investigations. Giovanni's capabilities are particularly suited for advanced secondary school science and undergraduate (college) education. This presentation will describe a variety of ways that Giovanni can be used for oceanographic education. Auxiliary information resources that can be utilized will also be described. Several testimonies of Giovanni usage for instruction will be provided, and a recent case history of Giovanni utilization for instruction and research at the undergraduate level is highlighted.

  3. Considerations Affecting Satellite and Space Probe Research with Emphasis on the "Scout" as a Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Posner, Jack (Editor)

    1961-01-01

    This report reviews a number of the factors which influence space flight experiments. Included are discussions of payload considerations, payload design and packaging, environmental tests, launch facilities, tracking and telemetry requirements, data acquisition, processing and analysis procedures, communication of information, and project management. Particular emphasis is placed on the "Scout" as a launching vehicle. The document includes a description of the geometry of the "Scout" as well as its flight capabilities and limitations. Although oriented toward the "Scout" vehicle and its payload capabilities, the information presented is sufficiently general to be equally applicable to most space vehicle systems.

  4. Sensing Super-Position: Human Sensing Beyond the Visual Spectrum

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Schipper, John F.

    2007-01-01

    The coming decade of fast, cheap, and miniaturized electronics and sensory devices opens new pathways for the development of sophisticated equipment to overcome limitations of the human senses. This paper addresses the technical feasibility of augmenting human vision through Sensing Super-position mixed with natural human sensing. The current implementation of the device translates visual and other passive or active sensory instruments into sounds, which become relevant when the visual resolution is insufficient for very difficult and particular sensing tasks. A successful Sensing Super-position meets many human and pilot-vehicle system requirements. The system can be further developed into a cheap, portable, and low-power device, taking into account the limited capabilities of the human user as well as the typical characteristics of his dynamic environment. The system operates in real time, giving the desired information for the particular augmented sensing tasks. The Sensing Super-position device increases perceived image resolution, obtained via an auditory representation as well as the visual representation. Auditory mapping is performed to distribute an image in time. The three-dimensional spatial brightness and multi-spectral maps of a sensed image are processed using real-time image processing techniques (e.g., histogram normalization) and transformed into a two-dimensional map of an audio signal as a function of frequency and time. This paper details the approach of developing Sensing Super-position systems as a way to augment the human vision system by exploiting the capabilities of the human hearing system as an additional neural input. The human hearing system is capable of learning to process and interpret extremely complicated and rapidly changing auditory patterns. The known capabilities of the human hearing system to learn and understand complicated auditory patterns provided the basic motivation for developing an image-to-sound mapping system. The human brain is superior to most existing computer systems in rapidly extracting relevant information from blurred, noisy, and redundant images. From a theoretical viewpoint, this means that the available bandwidth is not exploited in an optimal way. While image-processing techniques can manipulate, condense and focus the information (e.g., Fourier transforms), keeping the mapping as direct and simple as possible might also reduce the risk of accidentally filtering out important clues. After all, a perfectly non-redundant sound representation in particular is prone to loss of relevant information in the imperfect human hearing system. Also, a complicated non-redundant image-to-sound mapping may well be far more difficult to learn and comprehend than a straightforward mapping, while the mapping system would increase in complexity and cost. This work will demonstrate some basic information processing for optimal information capture for head-mounted systems.
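
    The image-to-sound mapping described above can be sketched digitally. The fragment below is a simplified illustration under assumed parameters, not the actual device's mapping: each image column becomes a slice of time, each row a sine frequency, and pixel brightness sets the amplitude at that frequency.

```python
# Simplified sketch of an image-to-sound mapping: columns map to time,
# rows to frequencies, brightness to amplitude. Frequency range, timing,
# and sample rate are illustrative assumptions.
import numpy as np

def image_to_audio(image, sample_rate=8000, col_duration=0.05,
                   f_min=200.0, f_max=4000.0):
    rows, cols = image.shape
    freqs = np.linspace(f_max, f_min, rows)   # top rows map to high pitches
    t = np.arange(int(sample_rate * col_duration)) / sample_rate
    audio = []
    for c in range(cols):
        # Each pixel scales a sine at its row's frequency; sum the column.
        tones = image[:, c, None] * np.sin(2 * np.pi * freqs[:, None] * t)
        audio.append(tones.sum(axis=0))
    signal = np.concatenate(audio)
    return signal / (np.abs(signal).max() or 1.0)  # normalize to [-1, 1]

img = np.random.rand(16, 32)          # stand-in for a sensed brightness map
print(image_to_audio(img).shape)      # 32 columns x 400 samples = (12800,)
```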

  5. An overview of the National Space Science data Center Standard Information Retrieval System (SIRS)

    NASA Technical Reports Server (NTRS)

    Shapiro, A.; Blecher, S.; Verson, E. E.; King, M. L. (Editor)

    1974-01-01

    A general overview is given of the National Space Science Data Center (NSSDC) Standard Information Retrieval System. It describes, in general terms, the information system that contains the data files and the software system that processes and manipulates the files maintained at the Data Center. Emphasis is placed on providing users with an overview of the capabilities and uses of the NSSDC Standard Information Retrieval System (SIRS). The examples given are taken from the files at the Data Center. Detailed information about NSSDC data files is documented in a set of File Users Guides, with one user's guide prepared for each file processed by SIRS. Detailed information about SIRS is presented in the SIRS Users Guide.

  6. Autonomous cloud based site monitoring through hydro geophysical data assimilation, processing and result delivery

    NASA Astrophysics Data System (ADS)

    Versteeg, R.; Johnson, D. V.; Rodzianko, A.; Zhou, H.; Dafflon, B.; Leger, E.; de Kleine, M.

    2017-12-01

    Understanding of processes in the shallow subsurface requires that geophysical, biogeochemical, hydrological and remote sensing datasets are assimilated, processed and interpreted. Multiple enabling software capabilities for process understanding have been developed by the science community. These include information models (ODM2), reactive transport modeling (PFLOTRAN, Modflow, CLM, Landlab), geophysical inversion (E4D, BERT), parameter estimation (PEST, DAKOTA), and visualization (ViSiT, Paraview, D3, QGIS), as well as numerous tools written in Python and R for petrophysical mapping, stochastic modeling, data analysis and so on. These capabilities use data collected with sensors and analytical tools developed by multiple manufacturers, which produce many different measurements. While scientists obviously leverage tools, capabilities and lessons learned from one site at other sites, the current approach to site characterization and monitoring is very labor intensive and does not scale well. Our objective is to be able to monitor many (hundreds to thousands of) sites. This requires that monitoring can be done in a near-real-time, affordable, auditable and essentially autonomous manner. For this we have developed a modular, vertically integrated, cloud-based software framework designed from the ground up for effective site and process monitoring. This software framework (PAF - Predictive Assimilation Framework) is multitenant software that automates ingestion, processing and visualization of hydrological, geochemical and geophysical (ERT/DTS) data. The core organizational element of PAF is the project/user pair, in which the capabilities available to users are controlled by a combination of available data and access permissions. All PAF capabilities are exposed through APIs, making it easy to quickly add new components. PAF is fully integrated with newly developed autonomous electrical geophysical hardware and thus allows for automation of electrical geophysical ingestion and processing, and for co-analysis and visualization of the raw and processed data with other data of interest (e.g. soil temperature, soil moisture, precipitation). We will demonstrate current PAF capabilities and discuss future efforts.

  7. Wood transportation systems-a spin-off of a computerized information and mapping technique

    Treesearch

    William W. Phillips; Thomas J. Corcoran

    1978-01-01

    A computerized mapping system originally developed for planning the control of the spruce budworm in Maine has been extended into a tool for planning road network development and optimizing transportation costs. A budgetary process and a mathematical linear programming routine are used interactively with the mapping and information retrieval capabilities of the system...

  8. Digital Avionics Information System (DAIS): Mid-1980's Maintenance Task Analysis. Final Report.

    ERIC Educational Resources Information Center

    Czuchry, Andrew J.; And Others

    The fundamental objective of the Digital Avionics Information System (DAIS) Life Cycle Cost (LCC) Study is to provide the Air Force with an enhanced in-house capability to incorporate LCC considerations during all stages of the system acquisition process. The purpose of this report is to describe the technical approach, results, and conclusions…

  9. Practical Applications of Space Systems, Supporting Paper 13: Information Services and Information Processing.

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC. Assembly of Engineering.

    This report summarizes the findings of one of fourteen panels that studied progress in space science applications and defined user needs potentially capable of being met by space-system applications. The study was requested by the National Aeronautics and Space Administration (NASA) and was conducted by the Space Applications Board. The panels…

  10. Processing multilevel secure test and evaluation information

    NASA Astrophysics Data System (ADS)

    Hurlburt, George; Hildreth, Bradley; Acevedo, Teresa

    1994-07-01

    The Test and Evaluation Community Network (TECNET) is building a Multilevel Secure (MLS) system. This system features simultaneous access to classified and unclassified information and easy access through widely available communications channels. It provides the necessary separation of classification levels, assured through the use of trusted system design techniques, security assessments and evaluations. This system enables cleared T&E users to view and manipulate classified and unclassified information resources either using a single terminal interface or multiple windows in a graphical user interface. TECNET is in direct partnership with the National Security Agency (NSA) to develop and field the MLS TECNET capability in the near term. The centerpiece of this partnership is a state-of-the-art Concurrent Systems Security Engineering (CSSE) process. In developing the MLS TECNET capability, TECNET and NSA are providing members, with various expertise and diverse backgrounds, to participate in the CSSE process. The CSSE process is founded on the concepts of both Systems Engineering and Concurrent Engineering. Systems Engineering is an interdisciplinary approach to evolve and verify an integrated and life cycle balanced set of system product and process solutions that satisfy customer needs (ASD/ENS-MIL STD 499B 1992). Concurrent Engineering is design and development using the simultaneous, applied talents of a diverse group of people with the appropriate skills. Harnessing diverse talents to support CSSE requires active participation by team members in an environment that both respects and encourages diversity.

  11. Technical Standards for Command and Control Information Systems (CCISs) and Information Technology

    DTIC Science & Technology

    1994-02-01

    formatting, transmitting, receiving, and processing imagery and imagery-related information. The NITFS is in essence the suite of individual standards... also known as Limited Operational Capability-Europe) and the German Joint Analysis System Military Intelligence (JASMIN). Among the approaches being... essence, the other systems utilize a one-level address space where addressing consists of identifying the fire support unit. However, AFATDS utilizes a two

  12. Concepts for a global resources information system

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.; Urena, J. L.

    1984-01-01

    The objective of the Global Resources Information System (GRIS) is to establish an effective and efficient information management system to meet the data access requirements of NASA and NASA-related scientists conducting large-scale, multi-disciplinary, multi-mission scientific investigations. Using standard interfaces and operating guidelines, diverse data systems can be integrated to provide the capabilities to access and process multiple geographically dispersed data sets and to develop the necessary procedures and algorithms to derive global resource information.

  13. IEEE Computer Society/Software Engineering Institute Watts S. Humphrey Software Process Achievement (SPA) Award 2016: Nationwide

    DTIC Science & Technology

    2017-04-05

    Contents include: 1 Business Imperatives (1.1 Deliver the Right Work; 1.2 Deliver the Right Way; 1.3 Deliver with an Engaged Workforce) and 2 Challenges and Opportunities (2.1 Responding to Demand; 2.2 Standards and Capabilities; 2.3 Information Technology...). Nationwide Information Technology (IT) is comprised of seven offices

  14. Designing an effective microbial forensics program for law enforcement and national security purposes.

    PubMed

    Murch, Randall S

    2014-06-01

    Forensic capabilities that provide lead information and support investigative, intelligence, prosecution, and policy decisions can be invaluable for responding to and resolving bioterrorism events. Attributing biological attacks through scientific and other resources and processes is an important goal, for which science can be instrumental. Some believe that effective microbial forensics capabilities, along with others, can even deter adversaries from using biological weapons. For those nations that do not have such capabilities, or wish to integrate or upgrade them, thoughtful analysis and consideration of certain design principles will increase the likelihood that success will be attained.

  15. Preprocessing of emotional visual information in the human piriform cortex.

    PubMed

    Schulze, Patrick; Bestgen, Anne-Kathrin; Lech, Robert K; Kuchinke, Lars; Suchan, Boris

    2017-08-23

    This study examines the processing of visual information by the olfactory system in humans. Recent data point to the processing of visual stimuli by the piriform cortex, a region mainly known as part of the primary olfactory cortex. Moreover, the piriform cortex generates predictive templates of olfactory stimuli to facilitate olfactory processing. This study fills the gap relating to the question whether this region is also capable of preprocessing emotional visual information. To gain insight into the preprocessing and transfer of emotional visual information into olfactory processing, we recorded hemodynamic responses during affective priming using functional magnetic resonance imaging (fMRI). Odors of different valence (pleasant, neutral and unpleasant) were primed by images of emotional facial expressions (happy, neutral and disgust). Our findings are the first to demonstrate that the piriform cortex preprocesses emotional visual information prior to any olfactory stimulation and that the emotional connotation of this preprocessing is subsequently transferred and integrated into an extended olfactory network for olfactory processing.

  16. The ends of uncertainty: Air quality science and planning in Central California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fine, James

    Air quality planning in Central California is complicated and controversial despite millions of dollars invested to improve scientific understanding. This research describes and critiques the use of photochemical air quality simulation modeling studies in planning to attain standards for ground-level ozone in the San Francisco Bay Area and the San Joaquin Valley during the 1990s. Data are gathered through documents and interviews with planners, modelers, and policy-makers at public agencies and with representatives from the regulated and environmental communities. Interactions amongst organizations are diagramed to identify significant nodes of interaction. Dominant policy coalitions are described through narratives distinguished by their uses of and responses to uncertainty, their exposures to risks, and their responses to the principles of conservatism, civil duty, and caution. Policy narratives are delineated using aggregated respondent statements to describe and understand advocacy coalitions. I found that models impacted the planning process significantly, but were used not purely for their scientific capabilities. Modeling results provided justification for decisions based on other constraints and political considerations. Uncertainties were utilized opportunistically by stakeholders instead of managed explicitly. Ultimately, the process supported the partisan views of those in control of the modeling. Based on these findings, as well as a review of model uncertainty analysis capabilities, I recommend modifying the planning process to allow for the development and incorporation of uncertainty information, while addressing the need for inclusive and meaningful public participation. By documenting an actual air quality planning process, these findings provide insights about the potential for using new scientific information and understanding to achieve environmental goals, most notably the analysis of uncertainties in modeling applications. Concurrently, needed uncertainty information is identified and capabilities to produce it are assessed. Practices to facilitate incorporation of uncertainty information are suggested based on research findings, as well as theory from the literatures of the policy sciences, decision sciences, science and technology studies, consensus-based and communicative planning, and modeling.

  17. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

    This paper describes the first scalable implementation of a text processing engine used in visual analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.
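
    The reported near-linear scaling is what one expects when the per-document work (tokenization, term counting) is embarrassingly parallel and only a cheap reduce step remains serial. As a schematic illustration only, and not the engine described in the paper, here is a map-reduce-style term count using Python's multiprocessing; the corpus and worker count are assumed:

        from collections import Counter
        from multiprocessing import Pool

        def count_terms(doc):
            """Map step: per-document term counting, independent across workers."""
            return Counter(doc.lower().split())

        if __name__ == "__main__":
            docs = ["gene expression in tumors", "tumor suppressor pathways"] * 1000
            with Pool(4) as pool:                    # one worker per core (assumed 4)
                partials = pool.map(count_terms, docs)
            totals = sum(partials, Counter())        # reduce step: merge partial counts
            print(totals.most_common(3))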

  18. Satellite on-board real-time SAR processor prototype

    NASA Astrophysics Data System (ADS)

    Bergeron, Alain; Doucet, Michel; Harnisch, Bernd; Suess, Martin; Marchese, Linda; Bourqui, Pascal; Desnoyers, Nicholas; Legros, Mathieu; Guillot, Ludovic; Mercier, Luc; Châteauneuf, François

    2017-11-01

    A Compact Real-Time Optronic SAR Processor has been successfully developed and tested up to a Technology Readiness Level of 4 (TRL4), the breadboard validation in a laboratory environment. SAR, or Synthetic Aperture Radar, is an active system allowing day and night imaging independent of the cloud coverage of the planet. The SAR raw data is a set of complex data for range and azimuth, which cannot be compressed. Specifically, for planetary missions and unmanned aerial vehicle (UAV) systems with limited communication data rates this is a clear disadvantage. SAR images are typically processed electronically applying dedicated Fourier transformations. This, however, can also be performed optically in real-time. Originally the first SAR images were optically processed. The optical Fourier processor architecture provides inherent parallel computing capabilities allowing real-time SAR data processing and thus the ability for compression and strongly reduced communication bandwidth requirements for the satellite. SAR signal return data are in general complex data. Both amplitude and phase must be combined optically in the SAR processor for each range and azimuth pixel. Amplitude and phase are generated by dedicated spatial light modulators and superimposed by an optical relay set-up. The spatial light modulators display the full complex raw data information over a two-dimensional format, one for the azimuth and one for the range. Since the entire signal history is displayed at once, the processor operates in parallel yielding real-time performances, i.e. without resulting bottleneck. Processing of both azimuth and range information is performed in a single pass. This paper focuses on the onboard capabilities of the compact optical SAR processor prototype that allows in-orbit processing of SAR images. Examples of processed ENVISAT ASAR images are presented. Various SAR processor parameters such as processing capabilities, image quality (point target analysis), weight and size are reviewed.
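
    At its core, SAR focusing is matched filtering evaluated in the Fourier domain, which the optical architecture performs for the whole signal history at once. As a software point of reference only (not the optronic processor itself), below is a minimal numpy sketch of Fourier-domain pulse compression for one range line; the chirp parameters and single-target scene are illustrative assumptions:

        import numpy as np

        fs, T, B = 100e6, 10e-6, 20e6            # sample rate, pulse length, bandwidth
        t = np.arange(0, T, 1 / fs)
        chirp = np.exp(1j * np.pi * (B / T) * t**2)   # linear-FM transmit pulse

        n = 4096
        rx = np.zeros(n, dtype=complex)          # simulated raw echo line
        d = int(2e-6 * fs)                       # point target delayed by 2 us
        rx[d:d + len(chirp)] = chirp
        rx += 0.1 * (np.random.randn(n) + 1j * np.random.randn(n))

        H = np.conj(np.fft.fft(chirp, n))        # matched-filter transfer function
        compressed = np.fft.ifft(np.fft.fft(rx) * H)
        print("target bin:", np.argmax(np.abs(compressed)))   # ~200, the 2 us delay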

  19. Library Information-Processing System

    NASA Technical Reports Server (NTRS)

    1985-01-01

    System works with Library of Congress MARC II format. System composed of subsystems that provide wide range of library information-processing capabilities. Format is American National Standards Institute (ANSI) format for machine-readable bibliographic data. Adaptable to any medium-to-large library.

  20. Knowledge Retrieval Solutions.

    ERIC Educational Resources Information Center

    Khan, Kamran

    1998-01-01

    Excalibur RetrievalWare offers true knowledge retrieval solutions. Its fundamental technologies, Adaptive Pattern Recognition Processing and Semantic Networks, have capabilities for knowledge discovery and knowledge management of full-text, structured and visual information. The software delivers a combination of accuracy, extensibility,…

  1. Industrial implementation of spatial variability control by real-time SPC

    NASA Astrophysics Data System (ADS)

    Roule, O.; Pasqualini, F.; Borde, M.

    2016-10-01

    Advanced technology nodes require more and more information to set up the wafer process correctly. The critical dimension of components decreases following Moore's law. At the same time, the intra-wafer dispersion linked to the spatial non-uniformity of tool processes cannot decrease in the same proportion. APC systems (Advanced Process Control) are being developed in the wafer fab to automatically adjust and tune wafer processing, based on extensive process context information. They can generate and monitor complex intra-wafer process profile corrections between different process steps. This leads us to bring spatial variability under control, in real time, with our SPC system (Statistical Process Control). This paper will outline the architecture of an integrated process control system for shape monitoring in 3D, implemented in the wafer fab.

  2. Graded, Dynamically Routable Information Processing with Synfire-Gated Synfire Chains.

    PubMed

    Wang, Zhuo; Sornborger, Andrew T; Tao, Louis

    2016-06-01

    Coherent neural spiking and local field potentials are believed to be signatures of the binding and transfer of information in the brain. Coherent activity has now been measured experimentally in many regions of mammalian cortex. Recently experimental evidence has been presented suggesting that neural information is encoded and transferred in packets, i.e., in stereotypical, correlated spiking patterns of neural activity. Due to their relevance to coherent spiking, synfire chains are one of the main theoretical constructs that have been appealed to in order to describe coherent spiking and information transfer phenomena. However, for some time, it has been known that synchronous activity in feedforward networks asymptotically either approaches an attractor with fixed waveform and amplitude, or fails to propagate. This has limited the classical synfire chain's ability to explain graded neuronal responses. Recently, we have shown that pulse-gated synfire chains are capable of propagating graded information coded in mean population current or firing rate amplitudes. In particular, we showed that it is possible to use one synfire chain to provide gating pulses and a second, pulse-gated synfire chain to propagate graded information. We called these circuits synfire-gated synfire chains (SGSCs). Here, we present SGSCs in which graded information can rapidly cascade through a neural circuit, and show a correspondence between this type of transfer and a mean-field model in which gating pulses overlap in time. We show that SGSCs are robust in the presence of variability in population size, pulse timing and synaptic strength. Finally, we demonstrate the computational capabilities of SGSC-based information coding by implementing a self-contained, spike-based, modular neural circuit that is triggered by streaming input, processes the input, then makes a decision based on the processed information and shuts itself down.
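
    To make the gating mechanism concrete, the sketch below is a minimal discrete-time rate model under simplifying assumptions: one gating window per layer, a threshold-linear transfer, and a graded amplitude copied forward one layer per window. Chain length, window length, and weight are illustrative, not the paper's parameters.

        import numpy as np

        layers, tau = 8, 10              # chain length, gating-window length (steps)
        T = layers * tau
        r = np.zeros((layers, T))        # mean population-rate amplitude per layer
        r[0, 0:tau] = 0.6                # graded input amplitude in the first window
        w = 1.0                          # feedforward weight; w = 1 preserves amplitude

        # While layer j is gated, it reads the amplitude held by layer j-1 during the
        # previous window and holds it for tau steps (threshold-linear transfer).
        for j in range(1, layers):
            upstream = r[j - 1, (j - 1) * tau : j * tau].mean()
            r[j, j * tau : (j + 1) * tau] = max(w * upstream, 0.0)

        print(r[-1].max())               # ~0.6: the graded value has cascaded through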

  3. Programmable DNA-Mediated Multitasking Processor.

    PubMed

    Shu, Jian-Jun; Wang, Qi-Wen; Yong, Kian-Yan; Shao, Fangwei; Lee, Kee Jin

    2015-04-30

    Because of DNA's appealing features as a material, including its minuscule size, defined structural repeat, and rigidity, programmable DNA-mediated processing is a promising computing paradigm that employs DNA as an information-storing and -processing substrate to tackle computational problems. The massive parallelism of DNA hybridization offers the potential to improve multitasking capability and yield a tremendous speed-up over conventional electronic processors with their stepwise signal cascades. As an example of multitasking capability, we present an in vitro programmable DNA-mediated optimal route planning processor as a functional unit embedded in contemporary navigation systems. The novel programmable DNA-mediated processor has several advantages over existing silicon-mediated methods, such as massive data storage and simultaneous processing using far less material than conventional silicon devices.

  4. Putting Integrated Systems Health Management Capabilities to Work: Development of an Advanced Caution and Warning System for Next-Generation Crewed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Mccann, Robert S.; Spirkovska, Lilly; Smith, Irene

    2013-01-01

    Integrated System Health Management (ISHM) technologies have advanced to the point where they can provide significant automated assistance with real-time fault detection, diagnosis, guided troubleshooting, and failure consequence assessment. To exploit these capabilities in actual operational environments, however, ISHM information must be integrated into operational concepts and associated information displays in ways that enable human operators to process and understand the ISHM system information rapidly and effectively. In this paper, we explore these design issues in the context of an advanced caution and warning system (ACAWS) for next-generation crewed spacecraft missions. User interface concepts for depicting failure diagnoses, failure effects, redundancy loss, "what-if" failure analysis scenarios, and resolution of ambiguity groups are discussed and illustrated.

  5. Digital Process and Product: Engaging the Next Generation of Art Education Researchers

    ERIC Educational Resources Information Center

    Baer, Stephanie A.; Danker, Stephanie

    2017-01-01

    As art teacher educators, we want our students to be passionate, informed advocates for art education and capable of conducting action research as artist/teacher/researchers. Students are constantly in the process of understanding what it means to teach with and through the arts. In our art education program, we work to exemplify this complex…

  6. Frequency domain laser velocimeter signal processor

    NASA Technical Reports Server (NTRS)

    Meyers, James F.; Murphy, R. Jay

    1991-01-01

    A new scheme for processing signals from laser velocimeter systems is described. The technique utilizes the capabilities of advanced digital electronics to yield a signal processor operating in the frequency domain, maximizing the information obtainable from each signal burst. This allows a sophisticated approach to signal detection and processing, with a more accurate measurement of the chirp frequency resulting in an eight-fold increase in measurable signals over the present high-speed burst counter technology. Further, the required signal-to-noise ratio is reduced by a factor of 32, allowing measurements within boundary layers of wind tunnel models. Measurement accuracy is also increased by up to a factor of five.
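
    As an illustration of the frequency-domain principle, the sketch below simulates a noisy Doppler burst and recovers its frequency from the peak of a windowed FFT, the basic per-burst operation of such a processor. The sample rate, burst shape, and noise level are assumed values:

        import numpy as np

        fs, n = 100e6, 1024                    # sample rate and record length (assumed)
        t = np.arange(n) / fs
        f_dop = 12.3e6                         # "true" Doppler frequency to recover
        envelope = np.exp(-(((t - t.mean()) / 1e-6) ** 2))   # Gaussian burst envelope
        burst = envelope * np.cos(2 * np.pi * f_dop * t)
        burst += 0.5 * np.random.randn(n)      # low SNR, where counters struggle

        spectrum = np.abs(np.fft.rfft(burst * np.hanning(n)))
        freqs = np.fft.rfftfreq(n, 1 / fs)
        peak = spectrum[1:].argmax() + 1       # skip the DC bin
        print(f"estimated frequency: {freqs[peak] / 1e6:.2f} MHz")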

  7. Earth Science System of the Future: Observing, Processing, and Delivering Data Products Directly to Users

    NASA Technical Reports Server (NTRS)

    Crisp, David; Komar, George (Technical Monitor)

    2001-01-01

    Advancement of our predictive capabilities will require new scientific knowledge, improvement of our modeling capabilities, and new observation strategies to generate the complex data sets needed by coupled modeling networks. New observation strategies must support remote sensing from a variety of vantage points and will include "sensorwebs" of small satellites in low Earth orbit, large aperture sensors in Geostationary orbits, and sentinel satellites at L1 and L2 to provide day/night views of the entire globe. Onboard data processing and high speed computing and communications will enable near real-time tailoring and delivery of information products (i.e., predictions) directly to users.

  8. Aircraft Icing Weather Data Reporting and Dissemination System

    NASA Technical Reports Server (NTRS)

    Bass, Ellen J.; Minsk, Brian; Lindholm, Tenny; Politovich, Marcia; Reehorst, Andrew (Technical Monitor)

    2002-01-01

    The long-term operational concept of this research is to develop an onboard aircraft system that assesses and reports atmospheric icing conditions automatically and in a timely manner in order to improve aviation safety and the efficiency of aircraft operations via improved real-time and forecast weather products. The idea is to use current measurement capabilities on aircraft equipped with icing sensors and in-flight data communication technologies as a reporting source. Without requiring expensive avionics upgrades, aircraft data must be processed and available for downlink. Ideally, the data from multiple aircraft can then be integrated (along with other real-time and modeled data) on the ground such that aviation-centered icing hazard metrics for volumes of airspace can be assessed. As the effect of icing on different aircraft types can vary, the information should be displayed in meaningful ways such that multiple types of users can understand the information. That is, information must be presented in a manner to allow users to understand the icing conditions with respect to individual concerns and aircraft capabilities. This research provides progress toward this operational concept by: identifying an aircraft platform capable of digitally capturing, processing, and downlinking icing data; identifying the required in situ icing data processing; investigating the requirements for routing the icing data for use by weather products; developing an icing case study in order to gain insight into major air carrier needs; developing and prototyping icing display concepts based on the National Center for Atmospheric Research's existing diagnostic and forecast experimental icing products; and conducting a usability study for the prototyped icing display concepts.

  9. A systems approach for data compression and latency reduction in cortically controlled brain machine interfaces.

    PubMed

    Oweiss, Karim G

    2006-07-01

    This paper suggests a new approach for data compression during extracutaneous transmission of neural signals recorded by a high-density microelectrode array in the cortex. The approach is based on exploiting the temporal and spatial characteristics of the neural recordings in order to strip the redundancy and infer the useful information early in the data stream. The proposed signal processing algorithms augment current filtering and amplification capability and may be a viable replacement for on-chip spike detection and sorting currently employed to remedy the bandwidth limitations. Temporal processing is devised by exploiting the sparseness capabilities of the discrete wavelet transform, while spatial processing exploits the reduction in the number of physical channels through quasi-periodic eigendecomposition of the data covariance matrix. Our results demonstrate that substantial improvements are obtained in terms of lower transmission bandwidth, reduced latency and optimized processor utilization. We also demonstrate the improvements qualitatively in terms of superior denoising capabilities and higher fidelity of the obtained signals.
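
    A minimal numpy sketch of the two ingredients named above, under simplifying assumptions: a one-level Haar wavelet transform with hard thresholding for temporal sparsification, and an eigendecomposition of the channel covariance matrix to reduce the number of effective channels. Array sizes, the threshold, and the retained-component budget are illustrative:

        import numpy as np

        x = np.random.randn(32, 1000)        # toy recording: 32 channels x 1000 samples

        # Temporal redundancy: one-level Haar transform, then zero small coefficients
        approx = (x[:, ::2] + x[:, 1::2]) / np.sqrt(2)
        detail = (x[:, ::2] - x[:, 1::2]) / np.sqrt(2)
        detail[np.abs(detail) < 0.5] = 0.0   # sparsify; transmit (approx, sparse detail)

        # Spatial redundancy: eigendecomposition of the channel covariance matrix,
        # keeping only the dominant "virtual channels"
        w, V = np.linalg.eigh(np.cov(x))     # eigenvalues ascending
        reduced = V[:, -8:].T @ x            # 8 virtual channels instead of 32
        print(approx.shape, (detail != 0).mean(), reduced.shape)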

  10. Integrating reliability and maintainability into a concurrent engineering environment

    NASA Astrophysics Data System (ADS)

    Phillips, Clifton B.; Peterson, Robert R.

    1993-02-01

    This paper describes the results of a reliability and maintainability study conducted at the University of California, San Diego and supported by private industry. Private industry thought the study was important and provided the university access to innovative tools under cooperative agreement. The current capability of reliability and maintainability tools and how they fit into the design process is investigated. The evolution of design methodologies leading up to today's capability is reviewed for ways to enhance the design process while keeping cost under control. A method for measuring the consequences of reliability and maintainability policy for design configurations in an electronic environment is provided. The interaction of selected modern computer tool sets is described for reliability, maintainability, operations, and other elements of the engineering design process. These tools provide a robust system evaluation capability that brings life cycle performance improvement information to engineers and their managers before systems are deployed, and allow them to monitor and track performance while it is in operation.

  11. The intelligent user interface for NASA's advanced information management systems

    NASA Technical Reports Server (NTRS)

    Campbell, William J.; Short, Nicholas, Jr.; Rolofs, Larry H.; Wattawa, Scott L.

    1987-01-01

    NASA has initiated the Intelligent Data Management Project to design and develop advanced information management systems. The project's primary goal is to formulate, design and develop advanced information systems that are capable of supporting the agency's future space research and operational information management needs. The first effort of the project was the development of a prototype Intelligent User Interface to an operational scientific database, using expert systems and natural language processing technologies. An overview of Intelligent User Interface formulation and development is given.

  12. An Integrated Nursing Management Information System: From Concept to Reality

    PubMed Central

    Pinkley, Connie L.; Sommer, Patricia K.

    1988-01-01

    This paper addresses the transition from the conceptualization of a Nursing Management Information System (NMIS) integrated and interdependent with the Hospital Information System (HIS) to its realization. Concepts of input, throughput, and output are presented to illustrate developmental strategies used to achieve nursing information products. Essential processing capabilities include: 1) ability to interact with multiple data sources; 2) database management, statistical, and graphics software packages; 3) online and batch reporting; and 4) interactive data analysis. Challenges encountered in system construction are examined.

  13. Behavioral and Organizational Considerations in the Design of Information Systems and Processes for Planning and Decision Support,

    DTIC Science & Technology

    1981-06-01

    Extends the analysis and display capability provided by management information systems to include the interpretation and aggregation of information and values.

  14. Information Diffusion in Facebook-Like Social Networks Under Information Overload

    NASA Astrophysics Data System (ADS)

    Li, Pei; Xing, Kai; Wang, Dapeng; Zhang, Xin; Wang, Hui

    2013-07-01

    Research on social networks has received remarkable attention, since many people use social networks to broadcast information and stay connected with their friends. However, due to the information overload in social networks, it becomes increasingly difficult for users to find useful information. This paper takes Facebook-like social networks into account, and models the process of information diffusion under information overload. The term view scope is introduced to model the user information-processing capability under information overload, and the average number of times a message appears in view scopes after it is generated is proposed to characterize the information diffusion efficiency. Through theoretical analysis, we find that factors such as network structure and view scope number have no impact on the information diffusion efficiency, which is a surprising result. To verify the results, we conduct simulations and provide the simulation results, which are perfectly consistent with the theoretical analysis.
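
    To make the view scope notion concrete, below is a toy event-driven simulation: each posted message is pushed to followers' feeds, a reader sees only the top L entries per check, and efficiency is measured as the average number of view-scope appearances per message. The follow graph, feed bound, and all parameters are assumptions for illustration, not the paper's exact model:

        import random
        from collections import deque

        random.seed(0)
        N, L, STEPS = 200, 5, 20000        # users, view-scope size, events (assumed)
        follow = {u: random.sample([v for v in range(N) if v != u], 10)
                  for u in range(N)}       # toy follower lists
        feeds = {u: deque(maxlen=50) for u in range(N)}
        views = {}                         # message id -> view-scope appearances

        for step in range(STEPS):
            author = random.randrange(N)   # someone posts a message...
            views[step] = 0
            for f in follow[author]:
                feeds[f].appendleft(step)
            reader = random.randrange(N)   # ...and someone checks a view scope
            for m in list(feeds[reader])[:L]:
                views[m] += 1

        print(sum(views.values()) / len(views))   # diffusion efficiency estimate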

  15. Population dynamics of mottled sculpin (PISCES) in a variable environment: information theoretic approaches

    Treesearch

    Gary D. Grossman; Robert E Ratajczak; J. Todd Petty; Mark D. Hunter; James T. Peterson; Gael Grenouillet

    2006-01-01

    We used strong inference with Akaike's Information Criterion (AIC) to assess the processes capable of explaining long-term (1984-1995) variation in the per capita rate of change of mottled sculpin (Cottus bairdi) populations in the Coweeta Creek drainage (USA). We sampled two fourth- and one fifth-order sites (BCA [uppermost], BCB, and CC [lowermost])...
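
    For readers unfamiliar with the machinery, strong inference here amounts to fitting each candidate model and ranking by AIC = 2k - 2 ln(L-hat), which for least-squares models reduces to n ln(RSS/n) + 2k. The sketch below ranks three hypothetical predictor sets for a per capita rate of change; the data and predictors are synthetic stand-ins, not the Coweeta series:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 12                                 # twelve annual samples (illustrative)
        flow = rng.normal(size=n)              # stand-in for flow variability
        density = rng.normal(size=n)           # stand-in for prior-year density
        r_t = 0.4 * flow - 0.2 * density + 0.1 * rng.normal(size=n)

        def aic(y, X):
            """AIC for a Gaussian least-squares model: n*ln(RSS/n) + 2k."""
            X = np.column_stack([np.ones(len(y)), X])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            rss = float(np.sum((y - X @ beta) ** 2))
            return len(y) * np.log(rss / len(y)) + 2 * X.shape[1]

        candidates = {"flow only": flow[:, None],
                      "density only": density[:, None],
                      "flow + density": np.column_stack([flow, density])}
        for name, X in candidates.items():
            print(f"{name}: AIC = {aic(r_t, X):.1f}")   # lowest AIC is best supported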

  16. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorensek, M.; Hamm, L.; Garcia, H.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  17. Climbing the ladder: capability maturity model integration level 3

    NASA Astrophysics Data System (ADS)

    Day, Bryce; Lutteroth, Christof

    2011-02-01

    This article details the attempt to form a complete workflow model for an information and communication technologies (ICT) company in order to achieve a Capability Maturity Model Integration (CMMI) maturity rating of 3. During this project, business processes across the company's core and auxiliary sectors were documented and extended using modern enterprise modelling tools and The Open Group Architecture Framework (TOGAF) methodology. Different challenges were encountered with regard to process customisation and tool support for enterprise modelling. In particular, there were problems with the reuse of process models, the integration of different project management methodologies and the integration of the Rational Unified Process development process framework that had to be solved. We report on these challenges and the perceived effects of the project on the company. Finally, we point out research directions that could help to improve the situation in the future.

  18. Preserved learning of novel information in amnesia: evidence for multiple memory systems.

    PubMed

    Gordon, B

    1988-06-01

    Four of five patients with marked global amnesia, and others with new learning impairments, showed normal processing facilitation for novel stimuli (nonwords) and/or for familiar stimuli (words) on a word/nonword (lexical) decision task. The data are interpreted as a reflection of the learning capabilities of in-line neural processing stages with multiple, distinct, informational codes. These in-line learning processes are separate from the recognition/recall memory impaired by amygdalohippocampal/dorsomedial thalamic damage, but probably supplement such memory in some tasks in normal individuals. Preserved learning of novel information seems incompatible with explanations of spared learning in amnesia that are based on the episodic/semantic or memory/habit distinctions, but is consistent with the procedural/declarative hypothesis.

  19. Capability of Rolling Efficiency for 100M High-Speed Rails

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Howard

    2014-03-22

    OG Technologies, Inc. (OGT), along with its academic and industrial partners, proposes this CORE project for the Capability of Rolling Efficiency for 100m high-speed rails. The goal is to establish the competitive advantage, and thus the sustainability, of the US-based rail manufacturers through greatly enhanced efficiency enabled by innovative in-line metrology technology, in-depth process knowledge, and advanced process control, overcoming detrimental factors such as higher labor costs that are saddling the US manufacturing sector. This Phase I project was carried out by an industrial-academia team over 9 months. The R&D team successfully completed all technical tasks and accomplished the objectives for Phase I. In addition to the technical efforts, introductory information about this project and its anticipated progress was disseminated to steel mills interested in the project. The Phase I project has established the technical and commercial basis for additional development. Further work is needed to complete the in-line sensing capability, deepen the metamodeling capability, and support process monitoring and control. The R&D team plans to submit a Phase II proposal based on the findings.

  20. Extending the NIF DISCO framework to automate complex workflow: coordinating the harvest and integration of data from diverse neuroscience information resources

    PubMed Central

    Marenco, Luis N.; Wang, Rixin; Bandrowski, Anita E.; Grethe, Jeffrey S.; Shepherd, Gordon M.; Miller, Perry L.

    2014-01-01

    This paper describes how DISCO, the data aggregator that supports the Neuroscience Information Framework (NIF), has been extended to play a central role in automating the complex workflow required to support and coordinate the NIF’s data integration capabilities. The NIF is an NIH Neuroscience Blueprint initiative designed to help researchers access the wealth of data related to the neurosciences available via the Internet. A central component is the NIF Federation, a searchable database that currently contains data from 231 data and information resources regularly harvested, updated, and warehoused in the DISCO system. In the past several years, DISCO has greatly extended its functionality and has evolved to play a central role in automating the complex, ongoing process of harvesting, validating, integrating, and displaying neuroscience data from a growing set of participating resources. This paper provides an overview of DISCO’s current capabilities and discusses a number of the challenges and future directions related to the process of coordinating the integration of neuroscience data within the NIF Federation. PMID:25018728

  1. Extending the NIF DISCO framework to automate complex workflow: coordinating the harvest and integration of data from diverse neuroscience information resources.

    PubMed

    Marenco, Luis N; Wang, Rixin; Bandrowski, Anita E; Grethe, Jeffrey S; Shepherd, Gordon M; Miller, Perry L

    2014-01-01

    This paper describes how DISCO, the data aggregator that supports the Neuroscience Information Framework (NIF), has been extended to play a central role in automating the complex workflow required to support and coordinate the NIF's data integration capabilities. The NIF is an NIH Neuroscience Blueprint initiative designed to help researchers access the wealth of data related to the neurosciences available via the Internet. A central component is the NIF Federation, a searchable database that currently contains data from 231 data and information resources regularly harvested, updated, and warehoused in the DISCO system. In the past several years, DISCO has greatly extended its functionality and has evolved to play a central role in automating the complex, ongoing process of harvesting, validating, integrating, and displaying neuroscience data from a growing set of participating resources. This paper provides an overview of DISCO's current capabilities and discusses a number of the challenges and future directions related to the process of coordinating the integration of neuroscience data within the NIF Federation.

  2. Diffusion processes of fragmentary information on scale-free networks

    NASA Astrophysics Data System (ADS)

    Li, Xun; Cao, Lang

    2016-05-01

    Compartmental models of diffusion over contact networks have proven representative of real-life propagation phenomena among interacting individuals. However, there is a broad class of collective spreading mechanisms departing from compartmental representations, including those for diffusive objects that can fragment and need not be transmitted as a whole. Here, we consider a continuous-state susceptible-infected-susceptible (SIS) model as an ideal limit-case of diffusion processes of fragmentary information on networks, where individuals possess fractions of the information content and update them by selectively exchanging messages with partners in the vicinity. Specifically, we incorporate local information, such as neighbors' node degrees and carried contents, into the individual partner choice, and examine the roles of a variety of such strategies in the information diffusion process, both qualitatively and quantitatively. Our method provides an effective and flexible route for modulating continuous-state diffusion dynamics on networks and has potential in a wide array of practical applications.
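
    A minimal sketch of such a continuous-state exchange process on a random contact graph, with one example strategy (degree-biased partner choice): each node holds a fraction of the information content and acquires part of whatever a selected neighbor holds. The graph model, update rule, and parameters are illustrative assumptions, not the paper's specification:

        import numpy as np

        rng = np.random.default_rng(0)
        N = 100
        A = (rng.random((N, N)) < 0.05).astype(float)
        A = np.triu(A, 1); A = A + A.T          # undirected random contact graph
        deg = A.sum(1)
        x = np.zeros(N)                         # fraction of content held per node
        x[rng.integers(N)] = 1.0                # a single initial source

        for _ in range(2000):
            i = rng.integers(N)
            nbrs = np.nonzero(A[i])[0]
            if nbrs.size == 0:
                continue
            p = deg[nbrs] / deg[nbrs].sum()     # prefer well-connected partners
            j = rng.choice(nbrs, p=p)
            x[i] = min(x[i] + 0.5 * max(x[j] - x[i], 0.0), 1.0)   # receive a fragment

        print(x.mean())                         # average fraction acquired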

  3. The essential role of reconfiguration capabilities in the implementation of HIV-related health information exchanges.

    PubMed

    Steward, Wayne T; Koester, Kimberly A; Collins, Shane P; Maiorana, Andre; Myers, Janet J

    2012-10-01

    To understand the dynamic capabilities that enabled the six demonstration projects of the Information Technology Networks of Care Initiative to implement health information exchanges (HIEs) tailored to their local HIV epidemics and regional care systems. We conducted 111 semi-structured interviews with project staff and information technology (IT) specialists associated with the demonstration projects, staff from community-based organizations and public health agencies collaborating in the design and implementation of the HIEs, and providers who used each HIE. The dynamic capability framework guided analyses. In the context of a HIE, the framework's components include information systems (the actual technological exchange systems and capacity to update them), absorptive capacity (the ability to implement an operating HIE), reconfiguration capacity (the ability to adapt workflows and clinical practices in response to a HIE), and organizational size and human resources (characteristics likely to affect a clinic's ability to respond). Across the projects, we found evidence for the importance of three dynamic capabilities: information systems, reconfiguration capacity, and organizational size and human resources. However, of these three, reconfiguration capacity was the most salient. Implementation outcomes at all six of the projects were shaped substantially by the degree of attention dedicated to reworking procedures and practices so that HIE usage became routine. Electronic information exchange offers the promise of improved coordination of care. However, implementation of HIEs goes beyond programming and hardware installation challenges, and requires close attention to the needs of the HIEs' end-users. Providers need to discern value from a HIE because their active participation is essential to ensuring that clinic and agency practices and procedures are reconfigured to incorporate new systems into daily work processes. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  4. Using health information technology to manage a patient population in accountable care organizations.

    PubMed

    Wu, Frances M; Rundall, Thomas G; Shortell, Stephen M; Bloom, Joan R

    2016-06-20

    Purpose - The purpose of this paper is to describe the current landscape of health information technology (HIT) in early accountable care organizations (ACOs), the different strategies ACOs are using to develop HIT-based capabilities, and how ACOs are using these capabilities within their care management processes to advance health outcomes for their patient population. Design/methodology/approach - Mixed methods study pairing data from a cross-sectional National Survey of ACOs with in-depth, semi-structured interviews with leaders from 11 ACOs (both completed in 2013). Findings - Early ACOs vary widely in their electronic health record, data integration, and analytic capabilities. The most common HIT capability was drug-drug and drug-allergy interaction checks, with 53.2 percent of respondents reporting that the ACO possessed the capability to a high degree. Outpatient and inpatient data integration was the least common HIT capability (8.1 percent). In the interviews, ACO leaders commented on different HIT development strategies to gain a more comprehensive picture of patient needs and service utilization. ACOs realize the necessity for robust data analytics, and are exploring a variety of approaches to achieve it. Research limitations/implications - Data are self-reported. The qualitative portion was based on interviews with 11 ACOs, limiting generalizability to the universe of ACOs but allowing for a range of responses. Practical implications - ACOs are challenged with the development of sophisticated HIT infrastructure. They may benefit from targeted assistance and incentives to implement health information exchanges with other providers to promote more coordinated care management for their patient population. Originality/value - Using new empirical data, this study increases understanding of the extent of ACOs' current and developing HIT capabilities to support ongoing care management.

  5. Simulation of a Multidimensional Input Quantum Perceptron

    NASA Astrophysics Data System (ADS)

    Yamamoto, Alexandre Y.; Sundqvist, Kyle M.; Li, Peng; Harris, H. Rusty

    2018-06-01

    In this work, we demonstrate the improved data separation capabilities of the Multidimensional Input Quantum Perceptron (MDIQP), a fundamental cell for the construction of more complex Quantum Artificial Neural Networks (QANNs). This is done by using input controlled alterations of ancillary qubits in combination with phase estimation and learning algorithms. The MDIQP is capable of processing quantum information and classifying multidimensional data that may not be linearly separable, extending the capabilities of the classical perceptron. With this powerful component, we get much closer to the achievement of a feedforward multilayer QANN, which would be able to represent and classify arbitrary sets of data (both quantum and classical).

  6. Adaptive Classification of Landscape Process and Function: An Integration of Geoinformatics and Self-Organizing Maps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Andre M.

    2009-07-17

    The advanced geospatial information extraction and analysis capabilities of Geographic Information Systems (GISs) and Artificial Neural Networks (ANNs), particularly Self-Organizing Maps (SOMs), provide a topology-preserving means for reducing and understanding complex data relationships in the landscape. The Adaptive Landscape Classification Procedure (ALCP) is presented as an adaptive and evolutionary capability in which varying types of data can be assimilated to address different management needs such as hydrologic response, erosion potential, habitat structure, instrumentation placement, and various forecast or what-if scenarios. This paper defines how the evaluation and analysis of spatial and/or temporal patterns in the landscape can provide insight into complex ecological, hydrological, climatic, and other natural and anthropogenic-influenced processes. Establishing relationships among high-dimensional datasets through neurocomputing-based pattern recognition methods can help 1) resolve large volumes of data into a structured and meaningful form; 2) provide an approach for inferring landscape processes in areas that have limited data available but exhibit similar landscape characteristics; and 3) discover the value of individual variables or groups of variables that contribute to specific processes in the landscape. Classification of hydrologic patterns in the landscape is demonstrated.
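
    For orientation, a SOM maps high-dimensional attribute vectors onto a low-dimensional grid whose nodes preserve topology: nearby nodes come to encode similar landscape patterns. Below is a minimal numpy SOM training loop over synthetic attribute vectors; the map size, learning schedule, and data are illustrative assumptions, not the ALCP configuration:

        import numpy as np

        rng = np.random.default_rng(1)
        X = rng.random((500, 4))          # e.g., normalized terrain attributes per cell
        rows, cols = 6, 6
        W = rng.random((rows * cols, 4))  # codebook vector for each map node
        grid = np.array([(r, c) for r in range(rows) for c in range(cols)], float)

        for epoch in range(20):
            lr = 0.5 * (1 - epoch / 20)               # decaying learning rate
            sigma = max(3.0 * (1 - epoch / 20), 0.5)  # decaying neighborhood radius
            for xi in X:
                bmu = np.argmin(((W - xi) ** 2).sum(1))   # best-matching unit
                d2 = ((grid - grid[bmu]) ** 2).sum(1)     # distances on the map
                h = np.exp(-d2 / (2 * sigma**2))          # neighborhood function
                W += lr * h[:, None] * (xi - W)           # topology-preserving update

        labels = np.array([np.argmin(((W - xi) ** 2).sum(1)) for xi in X])
        print(np.bincount(labels, minlength=rows * cols)) # pattern classes per node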

  7. Realistic terrain visualization based on 3D virtual world technology

    NASA Astrophysics Data System (ADS)

    Huang, Fengru; Lin, Hui; Chen, Bin; Xiao, Cai

    2009-09-01

    The rapid advances in information technologies, e.g., network, graphics processing, and virtual world, have provided challenges and opportunities for new capabilities in information systems, Internet applications, and virtual geographic environments, especially geographic visualization and collaboration. In order to achieve meaningful geographic capabilities, we need to explore and understand how these technologies can be used to construct virtual geographic environments to help to engage geographic research. The generation of three-dimensional (3D) terrain plays an important part in geographical visualization, computer simulation, and virtual geographic environment applications. The paper introduces concepts and technologies of virtual worlds and virtual geographic environments, explores integration of realistic terrain and other geographic objects and phenomena of natural geographic environment based on SL/OpenSim virtual world technologies. Realistic 3D terrain visualization is a foundation of construction of a mirror world or a sand box model of the earth landscape and geographic environment. The capabilities of interaction and collaboration on geographic information are discussed as well. Further virtual geographic applications can be developed based on the foundation work of realistic terrain visualization in virtual environments.

  8. Realistic terrain visualization based on 3D virtual world technology

    NASA Astrophysics Data System (ADS)

    Huang, Fengru; Lin, Hui; Chen, Bin; Xiao, Cai

    2010-11-01

    The rapid advances in information technologies, e.g., network, graphics processing, and virtual world, have provided challenges and opportunities for new capabilities in information systems, Internet applications, and virtual geographic environments, especially geographic visualization and collaboration. In order to achieve meaningful geographic capabilities, we need to explore and understand how these technologies can be used to construct virtual geographic environments to help to engage geographic research. The generation of three-dimensional (3D) terrain plays an important part in geographical visualization, computer simulation, and virtual geographic environment applications. The paper introduces concepts and technologies of virtual worlds and virtual geographic environments, explores integration of realistic terrain and other geographic objects and phenomena of natural geographic environment based on SL/OpenSim virtual world technologies. Realistic 3D terrain visualization is a foundation of construction of a mirror world or a sand box model of the earth landscape and geographic environment. The capabilities of interaction and collaboration on geographic information are discussed as well. Further virtual geographic applications can be developed based on the foundation work of realistic terrain visualization in virtual environments.

  9. Cogeneration technology alternatives study. Volume 6: Computer data

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The potential technical capabilities of energy conversion systems in the 1985 - 2000 time period were defined with emphasis on systems using coal, coal-derived fuels or alternate fuels. Industrial process data developed for the large energy consuming industries serve as a framework for the cogeneration applications. Ground rules for the study were established and other necessary equipment (balance-of-plant) was defined. This combination of technical information, energy conversion system data ground rules, industrial process information and balance-of-plant characteristics was analyzed to evaluate energy consumption, capital and operating costs and emissions. Data in the form of computer printouts developed for 3000 energy conversion system-industrial process combinations are presented.

  10. FPGA-based real time processing of the Plenoptic Wavefront Sensor

    NASA Astrophysics Data System (ADS)

    Rodríguez-Ramos, L. F.; Marín, Y.; Díaz, J. J.; Piqueras, J.; García-Jiménez, J.; Rodríguez-Ramos, J. M.

    The plenoptic wavefront sensor combines measurements at the pupil and image planes in order to obtain wavefront information simultaneously from different points of view, being capable of sampling the volume above the telescope to extract tomographic information about the atmospheric turbulence. The advantages of this sensor are presented elsewhere at this conference (José M. Rodríguez-Ramos et al.). This paper concentrates on the processing required for pupil-plane phase recovery, and its computation in real time using FPGAs (Field Programmable Gate Arrays). This technology eases the implementation of massively parallel processing and allows tailoring the system to the requirements, maintaining flexibility, speed and cost figures.
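
    Pupil-plane phase recovery is, at heart, a least-squares integration of measured wavefront slopes, and it parallelizes well because it reduces to elementwise operations between Fourier transforms. Below is a minimal numpy sketch of a standard FFT-based reconstructor, offered as a software reference point rather than the authors' FPGA design; the synthetic wavefront and grid are assumptions:

        import numpy as np

        def recover_phase(sx, sy, d=1.0):
            """Least-squares phase estimate from x/y slope maps (FFT method)."""
            ny, nx = sx.shape
            kx = 2 * np.pi * np.fft.fftfreq(nx, d)
            ky = 2 * np.pi * np.fft.fftfreq(ny, d)
            KX, KY = np.meshgrid(kx, ky)
            num = -1j * KX * np.fft.fft2(sx) - 1j * KY * np.fft.fft2(sy)
            den = KX**2 + KY**2
            den[0, 0] = 1.0          # piston is unobservable; leave it at zero
            return np.fft.ifft2(num / den).real

        # Round trip on a synthetic wavefront
        y, x = np.mgrid[0:64, 0:64] / 64.0
        phi = np.sin(2 * np.pi * x) * np.cos(2 * np.pi * y)
        sx, sy = np.gradient(phi, axis=1), np.gradient(phi, axis=0)
        print(np.corrcoef(recover_phase(sx, sy).ravel(), phi.ravel())[0, 1])  # ~1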

  11. Integrated Information Increases with Fitness in the Evolution of Animats

    PubMed Central

    Edlund, Jeffrey A.; Chaumont, Nicolas; Hintze, Arend; Koch, Christof; Tononi, Giulio; Adami, Christoph

    2011-01-01

    One of the hallmarks of biological organisms is their ability to integrate disparate information sources to optimize their behavior in complex environments. How this capability can be quantified and related to the functional complexity of an organism remains a challenging problem, in particular since organismal functional complexity is not well-defined. We present here several candidate measures that quantify information and integration, and study their dependence on fitness as an artificial agent (“animat”) evolves over thousands of generations to solve a navigation task in a simple, simulated environment. We compare the ability of these measures to predict high fitness with more conventional information-theoretic processing measures. As the animat adapts by increasing its “fit” to the world, information integration and processing increase commensurately along the evolutionary line of descent. We suggest that the correlation of fitness with information integration and with processing measures implies that high fitness requires both information processing as well as integration, but that information integration may be a better measure when the task requires memory. A correlation of measures of information integration (but also information processing) and fitness strongly suggests that these measures reflect the functional complexity of the animat, and that such measures can be used to quantify functional complexity even in the absence of fitness data. PMID:22028639

  12. Intra-enterprise telecommunication satellites

    NASA Astrophysics Data System (ADS)

    Henry, A. J.

    1981-11-01

    Information transfer in the mid 1980's is sketched. The use of geostationary satellites for internal requirements of businesses is an important factor in the growth of information transfer. Protection of transferred information is achieved through encryption. The companies which use satellites are those whose telecommunication costs are already significant; who have large computing capabilities including distributed data processing; who use national and international leased circuits; and whose establishments are dispersed. Uses include teleconferencing, voice and data transmission, and text and facsimile communication.

  13. The Human Thalamus Is an Integrative Hub for Functional Brain Networks

    PubMed Central

    Bertolero, Maxwell A.

    2017-01-01

    The thalamus is globally connected with distributed cortical regions, yet the functional significance of this extensive thalamocortical connectivity remains largely unknown. By performing graph-theoretic analyses on thalamocortical functional connectivity data collected from human participants, we found that most thalamic subdivisions display network properties that are capable of integrating multimodal information across diverse cortical functional networks. From a meta-analysis of a large dataset of functional brain-imaging experiments, we further found that the thalamus is involved in multiple cognitive functions. Finally, we found that focal thalamic lesions in humans have widespread distal effects, disrupting the modular organization of cortical functional networks. This converging evidence suggests that the human thalamus is a critical hub region that could integrate diverse information being processed throughout the cerebral cortex as well as maintain the modular structure of cortical functional networks. SIGNIFICANCE STATEMENT The thalamus is traditionally viewed as a passive relay station of information from sensory organs or subcortical structures to the cortex. However, the thalamus has extensive connections with the entire cerebral cortex, which can also serve to integrate information processing between cortical regions. In this study, we demonstrate that multiple thalamic subdivisions display network properties that are capable of integrating information across multiple functional brain networks. Moreover, the thalamus is engaged by tasks requiring multiple cognitive functions. These findings support the idea that the thalamus is involved in integrating information across cortical networks. PMID:28450543

  14. DESIGN AND EVALUATION OF INDIVIDUAL ELEMENTS OF THE INTERFACE FOR AN AGRICULTURAL MACHINE.

    PubMed

    Rakhra, Aadesh K; Mann, Danny D

    2018-01-29

    If a user-centered approach is not used to design information displays, the quantity and quality of information presented to the user may not match the needs of the user, or it may exceed the capability of the human operator for processing and using that information. The result may be an excessive mental workload and reduced situation awareness of the operator, which can negatively affect the machine performance and operational outcomes. The increasing use of technology in agricultural machines may expose the human operator to excessive and undesirable information if the operator's information needs and information processing capabilities are ignored. In this study, a user-centered approach was used to design specific interface elements for an agricultural air seeder. Designs of the interface elements were evaluated in a laboratory environment by developing high-fidelity prototypes. Evaluations of the user interface elements yielded significant improvement in situation awareness (up to 11%; overall mean difference = 5.0 (4.8%), 95% CI (3.5939, 6.4728), p < 0.0001). Mental workload was reduced by up to 19.7% (overall mean difference = -5.2 (-7.9%), n = 30, α = 0.05). Study participants rated the overall performance of the newly designed user-centered interface elements higher in comparison to the previous designs (overall mean difference = 27.3 (189.8%), 99% CI (19.384, 35.150), p < 0.0001). Copyright © by the American Society of Agricultural Engineers.

  15. Processing reafferent and exafferent visual information for action and perception.

    PubMed

    Reichenbach, Alexandra; Diedrichsen, Jörn

    2015-01-01

    A recent study suggests that reafferent hand-related visual information utilizes a privileged, attention-independent processing channel for motor control. This process was termed visuomotor binding to reflect its proposed function: linking visual reafferences to the corresponding motor control centers. Here, we ask whether the advantage of processing reafferent over exafferent visual information is a specific feature of the motor processing stream or whether the improved processing also benefits the perceptual processing stream. Human participants performed a bimanual reaching task in a cluttered visual display, and one of the visual hand cursors could be displaced laterally during the movement. We measured the rapid feedback responses of the motor system as well as matched perceptual judgments of which cursor was displaced. Perceptual judgments were either made by watching the visual scene without moving or made simultaneously to the reaching tasks, such that the perceptual processing stream could also profit from the specialized processing of reafferent information in the latter case. Our results demonstrate that perceptual judgments in the heavily cluttered visual environment were improved when performed based on reafferent information. Even in this case, however, the filtering capability of the perceptual processing stream suffered more from the increasing complexity of the visual scene than the motor processing stream. These findings suggest partly shared and partly segregated processing of reafferent information for vision for motor control versus vision for perception.

  16. Design requirements for SRB production control system. Volume 3: Package evaluation, modification and hardware

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The software package evaluation was designed to analyze commercially available, field-proven production control or manufacturing resource planning management technology and software packages. The analysis was conducted by comparing SRB production control software requirements and the conceptual system design to software package capabilities. The methodology of evaluation and the findings at each stage of evaluation are described. Topics covered include: vendor listing; the request for information (RFI) document; RFI response rate and quality; the RFI evaluation process; and capabilities versus requirements.

  17. Stochastic Feedforward Control Technique

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim

    1990-01-01

    Class of commanded trajectories modeled as stochastic process. Advanced Transport Operating Systems (ATOPS) research and development program conducted by NASA Langley Research Center aimed at developing capabilities for increases in capacities of airports, safe and accurate flight in adverse weather conditions including shear, winds, avoidance of wake vortexes, and reduced consumption of fuel. Advances in techniques for design of modern controls and increased capabilities of digital flight computers coupled with accurate guidance information from Microwave Landing System (MLS). Stochastic feedforward control technique developed within context of ATOPS program.

  18. PIV Data Validation Software Package

    NASA Technical Reports Server (NTRS)

    Blackshire, James L.

    1997-01-01

    A PIV data validation and post-processing software package was developed to provide semi-automated data validation and data reduction capabilities for Particle Image Velocimetry data sets. The software provides three primary capabilities including (1) removal of spurious vector data, (2) filtering, smoothing, and interpolating of PIV data, and (3) calculations of out-of-plane vorticity, ensemble statistics, and turbulence statistics information. The software runs on an IBM PC/AT host computer working either under Microsoft Windows 3.1 or Windows 95 operating systems.
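
    As a concrete illustration of capabilities (1) and (3), the sketch below flags spurious vectors with a normalized median test on 3x3 neighborhoods, replaces them with local medians, and computes out-of-plane vorticity by central differences. The synthetic vortex field, tolerance, and grid are assumptions for the example, not the package's actual algorithms:

        import numpy as np

        y, x = np.mgrid[-1:1:32j, -1:1:32j]
        u, v = -y.copy(), x.copy()        # solid-body vortex: vorticity = 2 everywhere
        u[10, 10] = 25.0                  # one simulated spurious vector

        def median_test(f, tol=2.0, eps=0.1):
            """Flag vectors far from their 3x3 neighborhood median."""
            bad = np.zeros(f.shape, bool)
            for i in range(1, f.shape[0] - 1):
                for j in range(1, f.shape[1] - 1):
                    nb = np.delete(f[i-1:i+2, j-1:j+2].ravel(), 4)  # 8 neighbors
                    med = np.median(nb)
                    scale = np.median(np.abs(nb - med)) + eps
                    bad[i, j] = abs(f[i, j] - med) / scale > tol
            return bad

        bad = median_test(u) | median_test(v)
        for i, j in zip(*np.nonzero(bad)):    # replace outliers with local medians
            u[i, j] = np.median(np.delete(u[i-1:i+2, j-1:j+2].ravel(), 4))
            v[i, j] = np.median(np.delete(v[i-1:i+2, j-1:j+2].ravel(), 4))

        dx = x[0, 1] - x[0, 0]
        vort = np.gradient(v, dx, axis=1) - np.gradient(u, dx, axis=0)  # omega_z
        print("flagged:", bad.sum(), "| mean vorticity (expect ~2):", vort.mean())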

  19. Application of smart optical fiber sensors for structural load monitoring

    NASA Astrophysics Data System (ADS)

    Davies, Heddwyn; Everall, Lorna A.; Gallon, Andrew M.

    2001-06-01

    This paper describes a smart monitoring system, incorporating optical fiber sensing techniques, capable of providing important structural information to designers and users alike. This technology has wide industrial and commercial application in areas including aerospace, civil, maritime and automotive engineering. In order to demonstrate the capability of the sensing system, it has been installed in a 35m free-standing carbon fiber yacht mast, where a complete optical network of strain and temperature sensors was embedded into a composite mast and boom during lay-up. The system was able to monitor the behavior of the composite rig through a range of handling conditions. The resulting strain information can be used by engineers to improve the structural design process. Embedded fiber optic sensors have wide-ranging application for structural load monitoring. Due to their small size, optical fiber sensors can be readily embedded into composite materials. Other advantages include their immediate multiplexing capability and immunity to electro-magnetic interference. The capability of this system has been demonstrated within the maritime and industrial environment, but can be adapted for any application.

  20. DOD Manufacturing Arsenals: Actions Needed to Identify and Sustain Critical Capabilities

    DTIC Science & Technology

    2015-11-01

    to each develop their own unique method. A senior OSD official described the resulting process as unsound. Each manufacturing arsenal declared what...

  1. The Ultimate Big Data Enterprise Initiative: Defining Functional Capabilities for an International Information System (IIS) for Orbital Space Data (OSD)

    NASA Astrophysics Data System (ADS)

    Raygan, R.

    Global collaboration in support of an International Information System (IIS) for Orbital Space Data (OSD) literally requires a global enterprise. As with many information technology enterprise initiatives attempting to corral the desires of business with the budgets and limitations of technology, Space Situational Awareness (SSA) includes many of the same challenges: 1) Adaptive / Intuitive Dash Board that facilitates User Experience Design for a variety of users. 2) Asset Management of hundreds of thousands of objects moving at thousands of miles per hour hundreds of miles in space. 3) Normalization and integration of diverse data in various languages, possibly hidden or protected from easy access. 4) Expectations of near real-time information availability coupled with predictive analysis to affect decisions before critical points of no return, such as Space Object Conjunction Assessment (CA). 5) Data Ownership, management, taxonomy, and accuracy. 6) Integrated metrics and easily modified algorithms for "what if" analysis. This paper proposes an approach to define the functional capabilities for an IIS for OSD. These functional capabilities not only address previously identified gaps in current systems but incorporate lessons learned from other big data, enterprise, and agile information technology initiatives that correlate to the space domain. Viewing the IIS as the "data service provider" allows adoption of existing information technology processes which strengthen governance and ensure service consumers certain levels of service dependability and accuracy.
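
    Of the listed capabilities, conjunction assessment is the easiest to illustrate: over a short screening window, relative motion can be approximated as linear, so the time of closest approach minimizes |dr + dv t|, giving t_ca = -(dr . dv) / (dv . dv). A toy numpy sketch with made-up state vectors, not real catalog data:

        import numpy as np

        # Illustrative state vectors in km and km/s
        r1, v1 = np.array([7000.0, 0.0, 0.0]), np.array([0.0, 7.5, 0.0])
        r2, v2 = np.array([7008.0, 0.0, 0.0]), np.array([-0.8, 7.5, 0.1])

        dr, dv = r2 - r1, v2 - v1
        t_ca = max(0.0, -(dr @ dv) / (dv @ dv))   # time of closest approach, s
        miss = np.linalg.norm(dr + dv * t_ca)     # predicted miss distance, km
        print(f"t_ca = {t_ca:.1f} s, miss distance = {miss:.2f} km")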

  2. Heterogeneous delivering capability promotes traffic efficiency in complex networks

    NASA Astrophysics Data System (ADS)

    Zhu, Yan-Bo; Guan, Xiang-Min; Zhang, Xue-Jun

    2015-12-01

    Traffic is one of the most fundamental dynamical processes in networked systems. With homogeneous node delivery capability, the global dynamic routing strategy proposed by Ling et al. [Phys. Rev. E 81, 016113 (2010)] makes full use of dynamic information during the routing process and can thus reach a quite high network capacity. In this paper, building on the global dynamic routing strategy, we propose a heterogeneous delivery-capability allocation strategy for nodes on scale-free networks that takes node degree into account. We find that the network capacity, as well as other indexes reflecting transport efficiency, is further improved. Our work may be useful for the design of more efficient routing strategies in communication or transportation systems.
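
    One natural heterogeneous rule consistent with the description above is to allocate each node delivering capability proportional to a power of its degree, so that hubs, which global dynamic routing loads most heavily, forward more packets per time step. A toy numpy sketch; the exponent, total capability, and degree sequence are assumptions:

        import numpy as np

        rng = np.random.default_rng(0)
        N, C_total, phi = 1000, 2000, 1.0     # nodes, total capability, exponent
        k = np.clip(rng.zipf(2.5, N), 1, 100).astype(float)   # toy scale-free degrees

        C_hom = np.full(N, C_total / N)            # homogeneous: equal shares
        C_het = C_total * k**phi / np.sum(k**phi)  # heterogeneous: degree-weighted

        hub = k.argmax()
        print(f"hub degree {k[hub]:.0f}: capability {C_hom[hub]:.1f} -> {C_het[hub]:.1f}")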

  3. ABM Drag_Pass Report Generator

    NASA Technical Reports Server (NTRS)

    Fisher, Forest; Gladden, Roy; Khanampornpan, Teerapat

    2008-01-01

    The dragREPORT software was developed in parallel with abmREPORT, which is described in the preceding article; both programs were built on the capabilities created during that process. The tool generates a drag_pass report that summarizes vital information from the MRO aerobraking drag_pass build process, both to facilitate sequence reviews and to provide a high-level summary of the sequence for mission management. The script extracts information from the ENV, SSF, FRF, SCMFmax, and OPTG files and presents it in a single, easy-to-check report containing the majority of parameters needed for cross-check and verification as part of the sequence review process. Prior to dragREPORT, all the needed information was spread across a number of different files, each in a different format. The software is a Perl script that collects vital summary information and build-process details from these source files into a single, concise report used to aid the MPST sequence review process and to give mission management a high-level reference. The software could be adapted for future aerobraking missions to provide similar reports, review, and summary information.
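    The flight tool itself is a Perl script, and the real ENV/SSF/FRF/SCMFmax/OPTG formats are not described in this record; purely as an illustration of the consolidate-many-files-into-one-report pattern, with hypothetical file names and field patterns:

```python
import re
from pathlib import Path

# Illustrative only: file names and field patterns below are invented
# stand-ins, since the actual source-file formats are not published here.
FIELD_PATTERNS = {
    "env":  ("ENV.txt",  re.compile(r"^PERIAPSIS_ALT\s*=\s*(\S+)")),
    "optg": ("OPTG.txt", re.compile(r"^DRAG_DURATION\s*=\s*(\S+)")),
}

def build_report(directory):
    """Pull one named field out of each source file into one report."""
    lines = ["drag-pass summary (sketch)"]
    for key, (fname, pattern) in FIELD_PATTERNS.items():
        for line in Path(directory, fname).read_text().splitlines():
            m = pattern.match(line)
            if m:
                lines.append(f"{key.upper():5s} {m.group(1)}")
                break
    return "\n".join(lines)
```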

  4. Low cost solar array project: Experimental process system development unit for producing semiconductor-grade silicon using silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The design, fabrication, and installation of an experimental process system development unit (EPSDU) were analyzed. Supporting research and development were performed to provide an information data base usable for the EPSDU and for the technological design and economic analysis of potential scale-up of the process. Iterative economic analyses were conducted of the estimated product cost for the production of semiconductor-grade silicon in a facility capable of producing 1000 MT/yr.

  5. Communicating Genetic Risk Information for Common Disorders in the Era of Genomic Medicine

    PubMed Central

    Lautenbach, Denise M.; Christensen, Kurt D.; Sparks, Jeffrey A.; Green, Robert C.

    2013-01-01

    Communicating genetic risk information in ways that maximize understanding and promote health is increasingly important given the rapidly expanding availability and capabilities of genomic technologies. A well-developed literature on risk communication in general provides guidance for best practices, including presentation of information in multiple formats, attention to framing effects, use of graphics, sensitivity to the way numbers are presented, parsimony of information, attentiveness to emotions, and interactivity as part of the communication process. Challenges to communicating genetic risk information include deciding how best to tailor it, streamlining the process, deciding what information to disclose, accepting that communications may have limited influence, and understanding the impact of context. Meeting these challenges has great potential for empowering individuals to adopt healthier lifestyles and improve public health, but will require multidisciplinary approaches and collaboration. PMID:24003856

  6. An Extreme Learning Machine-Based Neuromorphic Tactile Sensing System for Texture Recognition.

    PubMed

    Rasouli, Mahdi; Chen, Yi; Basu, Arindam; Kukreja, Sunil L; Thakor, Nitish V

    2018-04-01

    Despite significant advances in computational algorithms and the development of tactile sensors, artificial tactile sensing is strikingly less efficient and capable than human tactile perception. Inspired by the efficiency of biological systems, we aim to develop a neuromorphic system for tactile pattern recognition. We particularly target texture recognition, as it is one of the most necessary and challenging tasks for artificial sensory systems. Our system consists of a piezoresistive fabric material as the sensor to emulate skin, an interface that produces spike patterns to mimic neural signals from mechanoreceptors, and an extreme learning machine (ELM) chip to analyze spiking activity. Benefiting from the intrinsic advantages of biologically inspired event-driven systems and the massively parallel and energy-efficient processing capabilities of the ELM chip, the proposed architecture offers a fast and energy-efficient alternative for processing tactile information. Moreover, it provides the opportunity for the development of low-cost tactile modules for large-area applications by integration of sensors and processing circuits. We demonstrate the recognition capability of our system in a texture discrimination task, where it achieves a classification accuracy of 92% for categorization of ten graded textures. Our results confirm that there exists a tradeoff between response time and classification accuracy (and information transfer rate). A faster decision can be achieved at early time steps or by using a shorter time window; this, however, results in deterioration of the classification accuracy and information transfer rate. We further observe that there exists a tradeoff between the classification accuracy and the input spike rate (and thus energy consumption). Our work substantiates the importance of developing efficient sparse codes for encoding sensory data to improve energy efficiency. These results have significance for a wide range of wearable, robotic, prosthetic, and industrial applications.
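    The paper's exact encoder is not given in the abstract; as a minimal sketch, a send-on-delta scheme turns an analog pressure trace into mechanoreceptor-like events, and windowed spike counts give the kind of rate features a classifier such as an ELM could consume (threshold and window length are illustrative):

```python
import numpy as np

# Send-on-delta encoder: emit +1/-1 events whenever the analog signal
# moves by `threshold`. Threshold and window length are assumed values.
def delta_encode(signal, threshold=0.05):
    events, last = [], signal[0]
    for t, x in enumerate(signal):
        while x - last >= threshold:
            events.append((t, +1))
            last += threshold
        while last - x >= threshold:
            events.append((t, -1))
            last -= threshold
    return events

def spike_count_features(events, n_steps, window):
    """Windowed event counts: a simple rate code for the classifier."""
    counts = np.zeros(n_steps // window)
    for t, _ in events:
        counts[min(t // window, len(counts) - 1)] += 1
    return counts

rng = np.random.default_rng(0)
trace = np.cumsum(rng.normal(0, 0.02, 1000))   # fake texture sweep
print(spike_count_features(delta_encode(trace), 1000, window=100))
```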

  7. Rotorcraft Diagnostics

    NASA Technical Reports Server (NTRS)

    Haste, Deepak; Azam, Mohammad; Ghoshal, Sudipto; Monte, James

    2012-01-01

    Health management (HM) in any engineering system requires adequate understanding of the system's functioning; a sufficient amount of monitored data; the capability to extract, analyze, and collate information; and the capability to combine understanding and information for HM-related estimation and decision-making. Rotorcraft systems are, in general, highly complex, and obtaining adequate understanding of their functioning is quite difficult because of the proprietary (restricted-access) nature of their designs and dynamic models. Development of an EIM (exact inverse map) solution for rotorcraft requires a process that can overcome the abovementioned difficulties and maximally utilize monitored information for HM facilitation via advanced analytic techniques. The goal was to develop a versatile HM solution for rotorcraft to facilitate Condition Based Maintenance Plus (CBM+) capabilities. The effort was geared toward developing analytic and reasoning techniques and proving the ability to embed the required capabilities on a rotorcraft platform, paving the way for implementing the solution on an aircraft-level system for consolidation and reporting. The solution can be used offboard or embedded directly onto a rotorcraft system. The envisioned solution utilizes available monitored and archived data for real-time fault detection and identification, failure-precursor identification, offline fault detection and diagnostics, health-condition forecasting, optimal guided troubleshooting, and maintenance decision support. A variant of the onboard version is a self-contained hardware and software (HW+SW) package that can be embedded on rotorcraft systems. The HM solution comprises components that gather/ingest data and information, perform information/feature extraction, analyze information in conjunction with the dependency/diagnostic model of the target system, facilitate optimal guided troubleshooting, and offer decision support for optimal maintenance.

  8. Multipurpose Interactive NASA Information Systems (MINIS)

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The Multipurpose Interactive NASA Information System was developed to provide remote, interactive information retrieval capability for various types of data bases to be processed on different types of small and medium size computers. Use of the system for three different data bases is described: (1) LANDSAT photo look-up, (2) land use, and (3) census/socioeconomic. Each of the data base elements is shown together with other detailed information that a user would require to contact the system remotely, to transmit inquiries or commands, and to receive the results of the queries or commands.

  9. Social behavior of bacteria: from physics to complex organization

    NASA Astrophysics Data System (ADS)

    Ben-Jacob, E.

    2008-10-01

    I describe how bacteria develop complex colonial patterns by utilizing intricate communication capabilities, such as quorum sensing, chemotactic signaling and exchange of genetic information (plasmids). Bacteria do not store genetically all the information required for generating the patterns for all possible environments. Instead, additional information is cooperatively generated as required for the colonial organization to proceed. Each bacterium is, by itself, a biotic autonomous system with its own internal cellular informatics capabilities (storage, processing and assessment of information). These afford the cell a certain plasticity to select its response to biochemical messages it receives, including self-alteration and the broadcasting of messages to initiate alterations in other bacteria. Hence, new features can collectively emerge during self-organization from the intra-cellular level to the whole colony. Collectively, bacteria store information, make decisions (e.g. to sporulate) and even learn from past experience (e.g. exposure to antibiotics), features we begin to associate with bacterial social behavior and even rudimentary intelligence. I also take Schrödinger's "feeding on negative entropy" criterion further and propose that, in addition, organisms have to extract latent information embedded in the environment. By latent information we refer to the non-arbitrary spatio-temporal patterns of regularities and variations that characterize the environmental dynamics. In other words, bacteria must be able to sense the environment and perform internal information processing to thrive on the latent information embedded in the complexity of their environment. I then propose that by acting together, bacteria can perform this most elementary cognitive function more efficiently, as illustrated by their cooperative behavior.

  10. Integrating thematic web portal capabilities into the NASA Earthdata Web Infrastructure

    NASA Astrophysics Data System (ADS)

    Wong, M. M.; McLaughlin, B. D.; Huang, T.; Baynes, K.

    2015-12-01

    The National Aeronautics and Space Administration (NASA) acquires and distributes an abundance of Earth science data on a daily basis to a diverse user community worldwide. To assist the scientific community and general public in achieving a greater understanding of the interdisciplinary nature of Earth science and of key environmental and climate change topics, the NASA Earthdata web infrastructure is integrating new methods of presenting and providing access to Earth science information, data, research and results. This poster will present the process of integrating thematic web portal capabilities into the NASA Earthdata web infrastructure, with examples from the Sea Level Change Portal. The Sea Level Change Portal will be a source of current NASA research, data and information regarding sea level change. The portal will provide sea level change information through articles, graphics, videos and animations, an interactive tool to view and access sea level change data, and a dashboard showing sea level change indicators. Earthdata is a part of the Earth Observing System Data and Information System (EOSDIS) project. EOSDIS is a key core capability in NASA's Earth Science Data Systems Program, providing end-to-end capabilities for managing NASA's Earth science data from various sources - satellites, aircraft, field measurements, and various other programs. It comprises twelve Distributed Active Archive Centers (DAACs), Science Computing Facilities (SCFs), data discovery and service access clients (Reverb and Earthdata Search), a dataset directory (Global Change Master Directory - GCMD), near-real-time data (Land Atmosphere Near real-time Capability for EOS - LANCE), Worldview (an imagery visualization interface), Global Imagery Browse Services, the Earthdata Code Collaborative, and a host of other discipline-specific data discovery, data access, data subsetting and visualization tools.

  11. A new approach to power quality and electricity reliability monitoring-case study illustrations of the capabilities of the I-GridTM system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Divan, Deepak; Brumsickle, William; Eto, Joseph

    2003-04-01

    This report describes a new approach for collecting information on power quality and reliability and making it available in the public domain. Making this information readily available in a form that is meaningful to electricity consumers is necessary for enabling more informed private and public decisions regarding electricity reliability. The system dramatically reduces the cost (and expertise) needed for customers to obtain information on the most significant power quality events, called voltage sags and interruptions. The system also offers widespread access to information on power quality collected from multiple sites and the potential for capturing information on the impacts of power quality problems, together enabling a wide variety of analysis and benchmarking to improve system reliability. Six case studies demonstrate selected functionality and capabilities of the system, including: Linking measured power quality events to process interruption and downtime; Demonstrating the ability to correlate events recorded by multiple monitors to narrow and confirm the causes of power quality events; and Benchmarking power quality and reliability on a firm and regional basis.
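    As a sketch of the kind of event the system captures, voltage sags are conventionally flagged when RMS voltage drops below about 0.9 per unit; that threshold follows IEEE 1159-style practice, not a value quoted in this report:

```python
import numpy as np

# Flag intervals where RMS voltage drops below 0.9 per unit, the common
# power-quality definition of a sag (threshold assumed, not from report).
def find_sags(v_rms_pu, dt, threshold=0.9):
    """Return (start_time_s, duration_s, residual_pu) per sag."""
    sags, start = [], None
    for i, v in enumerate(v_rms_pu):
        if v < threshold and start is None:
            start = i
        elif v >= threshold and start is not None:
            sags.append((start * dt, (i - start) * dt,
                         float(v_rms_pu[start:i].min())))
            start = None
    return sags

v = np.ones(600)
v[100:130] = 0.72                 # one synthetic half-second sag
print(find_sags(v, dt=1 / 60))    # ~[(1.67, 0.5, 0.72)]
```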

  12. What Librarians Need to Know to Survive in an Age of Technology.

    ERIC Educational Resources Information Center

    Malinconico, S. Michael

    1992-01-01

    Discusses the changing library environment, with greater reliance on technology; and describes relevant skills for librarians, including communicating with nonlibrarians, working with group processes, understanding the capabilities of information-handling technologies, and developing management and marketing abilities. (21 references) (EA)

  13. Digital Image Processing Overview For Helmet Mounted Displays

    NASA Astrophysics Data System (ADS)

    Parise, Michael J.

    1989-09-01

    Digital image processing provides a means to manipulate an image and presents the user with a variety of display formats that are not available in the analog image processing environment. When performed in real time and presented on a Helmet Mounted Display, system capability and flexibility are greatly enhanced. The information content of a display can be increased by the addition of real-time insets and static windows from secondary sensor sources; near-real-time 3-D imaging from a single sensor can be achieved; graphical information can be added; and enhancement techniques can be employed. Such increased functionality is generating a considerable amount of interest in the military and commercial markets. This paper discusses some of these image processing techniques and their applications.
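    A minimal sketch of the inset/window composition described above, using plain array operations; the coordinates, sizes, and blending factor are arbitrary, as the actual HMD pipeline is not specified in the paper:

```python
import numpy as np

# Composite an inset from a secondary sensor and a static status window
# into the primary frame; coordinates and sizes are arbitrary.
def composite(frame, inset, top, left, alpha=1.0):
    h, w = inset.shape[:2]
    region = frame[top:top + h, left:left + w]
    blended = alpha * inset + (1.0 - alpha) * region
    frame[top:top + h, left:left + w] = blended.astype(frame.dtype)
    return frame

primary = np.zeros((480, 640), dtype=np.uint8)          # main sensor
flir_inset = np.full((120, 160), 200, dtype=np.uint8)   # secondary source
status_bar = np.full((40, 640), 60, dtype=np.uint8)     # static window

composite(primary, flir_inset, top=10, left=470)            # opaque inset
composite(primary, status_bar, top=440, left=0, alpha=0.5)  # translucent
```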

  14. NOAA's Satellite Climate Data Records: The Research to Operations Process and Current State

    NASA Astrophysics Data System (ADS)

    Privette, J. L.; Bates, J. J.; Kearns, E. J.; NOAA's Climate Data Record Program

    2011-12-01

    In support of NOAA's mandate to provide climate products and services to the Nation, the National Climatic Data Center initiated the satellite Climate Data Record (CDR) Program. The Program develops and sustains climate information products derived from satellite data that NOAA has collected over the past 30+ years - the longest sets of continuous global measurements in existence. Data from other satellite programs, including those of NASA, the Department of Defense, and foreign space agencies, are also used. NOAA is now applying advanced analysis techniques to these historic data. This process is unraveling underlying climate trend and variability information and returning new value from the data. However, the transition of complex data processing chains, voluminous data products and documentation into a systematic, configuration-controlled context involves many challenges. In this presentation, we focus on the Program's process for research-to-operations transition and the evolving systems designed to ensure transparency, security, economy and authoritative value. The Program has adopted a two-phase process defined by an Initial Operational Capability (IOC) and a Full Operational Capability (FOC). The principles and procedures for IOC are described, as well as the process for moving CDRs from IOC to FOC. Finally, we describe the state of the CDRs in all phases of the Program, with an emphasis on the seven community-developed CDRs transitioned to NOAA in 2011. Details on CDR access and distribution will be provided.

  15. Lessons Learned for Cx PRACA. Constellation Program Problem Reporting, Analysis and Corrective Action Process and System

    NASA Technical Reports Server (NTRS)

    Kelle, Pido I.; Ratterman, Christian; Gibbs, Cecil

    2009-01-01

    This slide presentation reviews the Constellation Program Problem Reporting, Analysis and Corrective Action Process and System (Cx PRACA). The goal of Cx PRACA is to incorporate lessons learned from the Shuttle, ISS, and Orbiter programs by creating a single tool for managing the PRACA process that clearly defines the scope of PRACA applicability and what must be reported, and that defines the ownership and responsibility for managing the PRACA process, including disposition approval authority. CxP PRACA is a process supported by a single information-gathering data module, which will be integrated with a single CxP information system providing interoperability and import and export capability, making CxP PRACA a more effective and user-friendly technical and management tool.

  16. Femtosecond pulse laser-oriented recording on dental prostheses: a trial introduction.

    PubMed

    Ichikawa, Tetsuo; Hayasaki, Yoshio; Fujita, Keiji; Nagao, Kan; Murata, Masayo; Kawano, Takanori; Chen, JianRong

    2006-12-01

    The purpose of this study was to evaluate the feasibility of using a femtosecond pulse laser processing technique to store information on a dental prosthesis. Commercially pure titanium plates were processed with a femtosecond pulse laser system. The processed surface structure was observed with a reflective illumination microscope, a scanning electron microscope, and an atomic force microscope. The processed area was a nearly conical pit with a clear boundary. At a laser pulse energy of 2 µJ, the diameter and depth were approximately 10 µm and 0.2 µm, respectively; both increased with laser pulse energy. Further, pit depth increased with laser pulse number without any thermal effect. This study showed that the femtosecond pulse processing system was capable of recording personal identification and optional additional information on a dental prosthesis.

  17. Dynamics and regulation of the southern brook trout (Salvelinus fontinalis) population in an Appalachian stream

    Treesearch

    Gary D. Grossman; Robert E. Ratajczak; C. Michael Wagner; J. Todd Petty

    2010-01-01

    1. We used information-theoretic statistics [Akaike's Information Criterion (AIC)] and regression analysis in a multiple-hypothesis-testing approach to assess the processes capable of explaining long-term demographic variation in a lightly exploited brook trout population in Ball Creek, NC. We sampled a 100-m-long second-order site during both spring and autumn 1991–...
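    As a worked illustration of the AIC model-comparison step mentioned above (with toy least-squares polynomial models, not the authors' demographic models):

```python
import numpy as np

# AIC for least-squares models and Akaike weights; the competing
# polynomial "models" are toys standing in for candidate hypotheses.
def aic_gaussian(rss, n, k):
    """n*ln(RSS/n) + 2k, constants dropped (least-squares AIC)."""
    return n * np.log(rss / n) + 2 * k

def akaike_weights(aics):
    d = np.asarray(aics) - np.min(aics)
    w = np.exp(-0.5 * d)
    return w / w.sum()

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 40)
y = 2.0 + 0.5 * x + rng.normal(0, 1, x.size)

aics = []
for degree in (0, 1, 2):              # constant, linear, quadratic
    resid = y - np.polyval(np.polyfit(x, y, degree), x)
    # k counts the polynomial coefficients plus the error variance
    aics.append(aic_gaussian(np.sum(resid ** 2), x.size, k=degree + 2))

print(dict(zip(["constant", "linear", "quadratic"], akaike_weights(aics))))
```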

  18. Aircraft Engine-Monitoring System And Display

    NASA Technical Reports Server (NTRS)

    Abbott, Terence S.; Person, Lee H., Jr.

    1992-01-01

    Proposed Engine Health Monitoring System and Display (EHMSD) provides enhanced means for pilot to control and monitor performance of engines. Processes raw sensor data into information meaningful to pilot. Provides graphical information about performance capabilities, current performance, and operational conditions in components or subsystems of engines. Provides means to control engine thrust directly and innovative means to monitor performance of engine system rapidly and reliably. Features reduce pilot workload and increase operational safety.

  19. Further Investigations of Content Analytic Techniques for Extracting the Differentiating Information Contained in the Narrative Sections of Performance Evaluations for Navy Enlisted Personnel. Technical Report No. 75-1.

    ERIC Educational Resources Information Center

    Ramsey-Klee, Diane M.; Richman, Vivian

    The purpose of this research is to develop content analytic techniques capable of extracting the differentiating information in narrative performance evaluations for enlisted personnel in order to aid in the process of selecting personnel for advancement, duty assignment, training, or quality retention. Four tasks were performed. The first task…

  20. JIMM: the next step for mission-level models

    NASA Astrophysics Data System (ADS)

    Gump, Jamieson; Kurker, Robert G.; Nalepka, Joseph P.

    2001-09-01

    The Simulation Based Acquisition (SBA) process is one in which the planning, design, and test of a weapon system or other product is done through the more effective use of modeling and simulation, information technology, and process improvement. This process results in a product that is produced faster, cheaper, and more reliably than its predecessors. Because the SBA process requires realistic and detailed simulation conditions, it was necessary to develop a simulation tool that would provide a simulation environment acceptable for doing SBA analysis. The Joint Integrated Mission Model (JIMM) was created to help define and meet the analysis, test and evaluation, and training requirements of a Department of Defense program utilizing SBA. Through its generic nature of representing simulation entities, its data analysis capability, and its robust configuration management process, JIMM can be used to support a wide range of simulation applications as both a constructive and a virtual simulation tool. JIMM is a Mission Level Model (MLM). An MLM is capable of evaluating the effectiveness and survivability of a composite force of air and space systems executing operational objectives in a specific scenario against an integrated air and space defense system. Because MLMs are useful for assessing a system's performance in a realistic, integrated threat environment, they are key to implementing the SBA process. JIMM is a merger of the capabilities of one legacy model, the Suppressor MLM, into another, the Simulated Warfare Environment Generator (SWEG) MLM. By creating a more capable MLM, JIMM will not only be a tool to support the SBA initiative, but could also provide the framework for the next generation of MLMs.

  1. Information processing in echo state networks at the edge of chaos.

    PubMed

    Boedecker, Joschka; Obst, Oliver; Lizier, Joseph T; Mayer, N Michael; Asada, Minoru

    2012-09-01

    We investigate information processing in randomly connected recurrent neural networks. It has been shown previously that the computational capabilities of these networks are maximized when the recurrent layer is close to the border between a stable and an unstable dynamics regime, the so-called edge of chaos. The reasons for this maximized performance, however, are not completely understood. We adopt an information-theoretical framework and are for the first time able to quantify the computational capabilities between elements of these networks directly as they undergo the phase transition to chaos. Specifically, we present evidence that both information transfer and storage in the recurrent layer are maximized close to this phase transition, providing an explanation for why guiding the recurrent layer toward the edge of chaos is computationally useful. As a consequence, our study suggests self-organized ways of improving performance in recurrent neural networks, driven by input data. Moreover, the networks we study share important features with biological systems, such as feedback connections and online computation on input streams. A key example is the cerebral cortex, which has been shown to also operate close to the edge of chaos. Consequently, the behavior of model systems as studied here is likely to shed light on why biological systems are tuned into this specific regime.
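    A minimal echo state network sketch showing the knob at issue: the spectral radius of the recurrent weight matrix, which places the reservoir below, near, or beyond the edge of chaos (network size and scalings are arbitrary):

```python
import numpy as np

# Echo state network with the recurrent weights rescaled to a chosen
# spectral radius; sweeping the radius moves the reservoir across the
# stable/chaotic border.
rng = np.random.default_rng(0)
N = 200

def make_reservoir(spectral_radius):
    w = rng.normal(0.0, 1.0, (N, N))
    return w * spectral_radius / np.max(np.abs(np.linalg.eigvals(w)))

def run(w, w_in, inputs):
    x = np.zeros(N)
    states = []
    for u in inputs:
        x = np.tanh(w @ x + w_in * u)   # standard (leak-free) ESN update
        states.append(x.copy())
    return np.array(states)

w_in = rng.normal(0.0, 0.5, N)
u = rng.uniform(-1.0, 1.0, 500)
for rho in (0.5, 0.95, 1.3):            # stable, near-critical, chaotic
    s = run(make_reservoir(rho), w_in, u)
    print(f"rho={rho}: mean |state| = {np.abs(s[-100:]).mean():.3f}")
```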

  2. NASA/SPoRt: GOES-R Activities in Support of Product Development, Management, and Training

    NASA Technical Reports Server (NTRS)

    Fuell, Kevin; Jedlovec, Gary; Molthan, Andrew; Stano, Geoffrey

    2012-01-01

    SPoRT is using current capabilities of MODIS and VIIRS, combined with current GOES (i.e., hybrid imagery), to demonstrate mesoscale capabilities of the future ABI instrument. SPoRT is transitioning RGBs from EUMETSAT standard "recipes" to demonstrate a method to more efficiently handle the increased channels/frequency of ABI. Challenges for RGB production exist, including internal vs. external production, the bit depth needed, and adding quantitative information; SPoRT is forming a group to address these issues. SPoRT is leading efforts on the application of total lightning in operations and educating users of this new capability. Training in many forms is used to support testbed activities and is a key part of the transition process.
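    As a sketch of what an RGB "recipe" amounts to, each color beam is a channel or channel difference clipped to a range and gamma-stretched; the ranges and gammas below only follow the general shape of EUMETSAT-style recipes and are not an official product definition:

```python
import numpy as np

# Illustrative recipe: clip each beam to a range, then gamma-stretch.
def stretch(band, lo, hi, gamma=1.0):
    scaled = np.clip((band - lo) / (hi - lo), 0.0, 1.0)
    return scaled ** (1.0 / gamma)

def rgb_composite(ch_a, ch_b, ch_c):
    r = stretch(ch_a - ch_b, -4.0, 2.0)       # channel difference
    g = stretch(ch_b, 0.0, 10.0, gamma=1.2)   # reflectance (%)
    b = stretch(ch_c, 243.0, 293.0)           # brightness temperature (K)
    return np.dstack([r, g, b])

rng = np.random.default_rng(0)
img = rgb_composite(rng.uniform(-5, 3, (4, 4)),
                    rng.uniform(0, 12, (4, 4)),
                    rng.uniform(240, 300, (4, 4)))
print(img.shape)   # (4, 4, 3)
```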

  3. Optimal nonlinear information processing capacity in delay-based reservoir computers

    NASA Astrophysics Data System (ADS)

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2015-09-01

    Reservoir computing is a recently introduced brain-inspired machine learning paradigm capable of excellent performance in the processing of empirical data. We focus on a particular kind of time-delay-based reservoir computer that has been physically implemented using optical and electronic systems and has shown unprecedented data processing rates. Reservoir computing is well known for the ease of its training scheme, but also for the problematic sensitivity of its performance to architecture parameters. This article addresses the reservoir design problem, which remains the biggest challenge to the applicability of this information processing scheme. More specifically, we use the information available regarding the optimal reservoir working regimes to construct a functional link between the reservoir parameters and its performance. This function is used to explore various properties of the device and to choose the optimal reservoir architecture, thus replacing the tedious and time-consuming parameter scans used so far in the literature.
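    A much-simplified sketch of the delay-based architecture the article studies: one nonlinear node plus a delay line, with an input mask creating N virtual nodes. Real implementations also couple neighboring virtual nodes through the node's inertia, which this sketch omits, and the parameter values are assumed:

```python
import numpy as np

# Single-node delay-based reservoir with N virtual nodes per delay
# period; feedback/input scalings are illustrative values.
rng = np.random.default_rng(0)
N = 50                      # virtual nodes per delay period
ETA, GAMMA = 0.5, 0.05      # feedback and input scaling (assumed)
mask = rng.choice([-1.0, 1.0], N)

def reservoir_states(inputs):
    delay_line = np.zeros(N)
    states = []
    for u in inputs:
        # each virtual node sees its delayed state plus masked input
        delay_line = np.tanh(ETA * delay_line + GAMMA * mask * u)
        states.append(delay_line.copy())
    return np.array(states)    # rows feed a linear readout

X = reservoir_states(rng.uniform(-1.0, 1.0, 200))
print(X.shape)                 # (200, 50)
```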

  4. Optimal nonlinear information processing capacity in delay-based reservoir computers.

    PubMed

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2015-09-11

    Reservoir computing is a recently introduced brain-inspired machine learning paradigm capable of excellent performance in the processing of empirical data. We focus on a particular kind of time-delay-based reservoir computer that has been physically implemented using optical and electronic systems and has shown unprecedented data processing rates. Reservoir computing is well known for the ease of its training scheme, but also for the problematic sensitivity of its performance to architecture parameters. This article addresses the reservoir design problem, which remains the biggest challenge to the applicability of this information processing scheme. More specifically, we use the information available regarding the optimal reservoir working regimes to construct a functional link between the reservoir parameters and its performance. This function is used to explore various properties of the device and to choose the optimal reservoir architecture, thus replacing the tedious and time-consuming parameter scans used so far in the literature.

  5. Optimal nonlinear information processing capacity in delay-based reservoir computers

    PubMed Central

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2015-01-01

    Reservoir computing is a recently introduced brain-inspired machine learning paradigm capable of excellent performance in the processing of empirical data. We focus on a particular kind of time-delay-based reservoir computer that has been physically implemented using optical and electronic systems and has shown unprecedented data processing rates. Reservoir computing is well known for the ease of its training scheme, but also for the problematic sensitivity of its performance to architecture parameters. This article addresses the reservoir design problem, which remains the biggest challenge to the applicability of this information processing scheme. More specifically, we use the information available regarding the optimal reservoir working regimes to construct a functional link between the reservoir parameters and its performance. This function is used to explore various properties of the device and to choose the optimal reservoir architecture, thus replacing the tedious and time-consuming parameter scans used so far in the literature. PMID:26358528

  6. Tunable Low Energy, Compact and High Performance Neuromorphic Circuit for Spike-Based Synaptic Plasticity

    PubMed Central

    Rahimi Azghadi, Mostafa; Iannella, Nicolangelo; Al-Sarawi, Said; Abbott, Derek

    2014-01-01

    Cortical circuits in the brain have long been recognised for their information processing capabilities and have been studied both experimentally and theoretically via spiking neural networks. Neuromorphic engineers are primarily concerned with translating the computational capabilities of biological cortical circuits, using the Spiking Neural Network (SNN) paradigm, into in silico applications that can mimic the behaviour and capabilities of real biological circuits/systems. These capabilities include low power consumption, compactness, and relevant dynamics. In this paper, we propose a new accelerated-time circuit that has several advantages over its previous neuromorphic counterparts in terms of compactness, power consumption, and capability to mimic the outcomes of biological experiments. Circuit simulation results demonstrate that, compared to previously published synaptic plasticity circuits, the new circuit achieves reduced silicon area and lower energy consumption per processed spike. In addition, it can be tuned to closely mimic the outcomes of various spike timing- and rate-based synaptic plasticity experiments. The proposed circuit is also investigated and compared to other designs in terms of tolerance to mismatch and process variation. Monte Carlo simulation results show that the proposed design is much more stable than its previous counterparts in terms of vulnerability to transistor mismatch, which is a significant challenge in analog neuromorphic design. All these features make the proposed design an ideal circuit for use in large-scale SNNs, which aim at implementing neuromorphic systems with an inherent capability to adapt to a continuously changing environment, thus leading to systems with significant learning and computational abilities. PMID:24551089
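    For readers unfamiliar with the experiments such circuits reproduce, a standard pair-based STDP rule is sketched below; the amplitudes and time constants are typical modeling values, not the chip's parameters:

```python
import numpy as np

# Pair-based STDP: the weight change depends on the pre/post spike time
# difference. Values below are generic modeling constants.
A_PLUS, A_MINUS = 0.010, 0.012
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # ms

def stdp_dw(delta_t_ms):
    """delta_t = t_post - t_pre; pre-before-post potentiates,
    post-before-pre depresses."""
    if delta_t_ms >= 0:
        return A_PLUS * np.exp(-delta_t_ms / TAU_PLUS)
    return -A_MINUS * np.exp(delta_t_ms / TAU_MINUS)

for dt in (-40, -10, 10, 40):
    print(f"dt = {dt:+d} ms -> dw = {stdp_dw(dt):+.4f}")
```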

  7. Tunable low energy, compact and high performance neuromorphic circuit for spike-based synaptic plasticity.

    PubMed

    Rahimi Azghadi, Mostafa; Iannella, Nicolangelo; Al-Sarawi, Said; Abbott, Derek

    2014-01-01

    Cortical circuits in the brain have long been recognised for their information processing capabilities and have been studied both experimentally and theoretically via spiking neural networks. Neuromorphic engineers are primarily concerned with translating the computational capabilities of biological cortical circuits, using the Spiking Neural Network (SNN) paradigm, into in silico applications that can mimic the behaviour and capabilities of real biological circuits/systems. These capabilities include low power consumption, compactness, and relevant dynamics. In this paper, we propose a new accelerated-time circuit that has several advantages over its previous neuromorphic counterparts in terms of compactness, power consumption, and capability to mimic the outcomes of biological experiments. Circuit simulation results demonstrate that, compared to previously published synaptic plasticity circuits, the new circuit achieves reduced silicon area and lower energy consumption per processed spike. In addition, it can be tuned to closely mimic the outcomes of various spike timing- and rate-based synaptic plasticity experiments. The proposed circuit is also investigated and compared to other designs in terms of tolerance to mismatch and process variation. Monte Carlo simulation results show that the proposed design is much more stable than its previous counterparts in terms of vulnerability to transistor mismatch, which is a significant challenge in analog neuromorphic design. All these features make the proposed design an ideal circuit for use in large-scale SNNs, which aim at implementing neuromorphic systems with an inherent capability to adapt to a continuously changing environment, thus leading to systems with significant learning and computational abilities.

  8. The Role of Advanced Information System Technology in Remote Sensing for NASA's Earth Science Enterprise in the 21st Century

    NASA Technical Reports Server (NTRS)

    Prescott, Glenn; Komar, George (Technical Monitor)

    2001-01-01

    Future NASA Earth observing satellites will carry high-precision instruments capable of producing large amounts of scientific data. The strategy will be to network these instrument-laden satellites into a web-like array of sensors to facilitate the collection, processing, transmission, storage, and distribution of data and data products - the essential elements of what we refer to as "Information Technology." Many of these information technologies will enable the satellite and ground information systems to function effectively in real time, providing scientists with the capability of customizing data collection activities on a satellite or group of satellites directly from the ground. In future systems, extremely large quantities of data collected by scientific instruments will require the fastest processors, the highest communication channel transfer rates, and the largest data storage capacity to ensure that data flow smoothly from the satellite-based instrument to the ground-based archive. Autonomous systems will control all essential processes and play a key role in coordinating the data flow through space-based communication networks. In this paper, we discuss the critical information technologies for Earth observing satellites that will support the next generation of space-based scientific measurements of planet Earth and ensure that the data and data products provided by these systems are accessible to scientists and the user community in general.

  9. Fast massive preventive security and information communication systems

    NASA Astrophysics Data System (ADS)

    Akopian, David; Chen, Philip; Miryakar, Susheel; Kumar, Abhinav

    2008-04-01

    We present a fast, massive information communication system for data collection from distributed sources such as cell phone users. One very important application is preventive notification systems, where timely notification and evidence communication can help to improve safety and security through wide public involvement, by ensuring easy-to-access and easy-to-communicate information systems. The technology significantly simplifies the response to events and will help, e.g., special agencies to gather crucial information in time and respond as quickly as possible. Cellular phones are nowadays affordable for most residents and have become a common personal accessory. The paper describes several ways to design such systems, including the existing internet access capabilities of cell phones and downloadable specialized software, and we provide examples of such designs. The main idea is to structure information in a predetermined way and communicate it through a centralized gate-server, which automatically processes the information and forwards it to the proper destination. The gate-server eliminates the need to know contact data and the specific local community infrastructure. All cell phones will have self-localizing capability in accordance with the FCC E911 mandate, so the communicated information can be further tagged automatically with location and time.
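    A minimal sketch of the gate-server idea, with invented field names and category codes: the sender supplies only a category, free text, and the handset's location fix, and the server maps the category to a destination so the sender never needs local agency contact information.

```python
import json
import time

# Field names and category codes are invented, for illustration only.
def make_report(category, text, lat, lon):
    return {
        "category": category,                  # e.g. "hazard", "incident"
        "text": text,
        "location": {"lat": lat, "lon": lon},  # from the handset's E911 fix
        "timestamp": time.time(),
    }

def route(report, registry):
    """Gate-server step: choose the destination from the category alone,
    so senders never need to know local contact data."""
    return registry.get(report["category"], registry["default"])

registry = {"hazard": "city-ops@example.org",
            "default": "dispatch@example.org"}
msg = make_report("hazard", "debris on roadway", 29.42, -98.49)
print(route(msg, registry), json.dumps(msg))
```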

  10. LANDSAT demonstration/application and GIS integration in south central Alaska

    NASA Technical Reports Server (NTRS)

    Burns, A. W.; Derrenbacher, W.

    1981-01-01

    Automated geographic information systems were developed for two sites in Southcentral Alaska to serve as tests for both the process of integrating classified LANDSAT data into a comprehensive environmental data base and the process of using automated information in land capability/suitability analysis and environmental planning. The Big Lake test site, located approximately 20 miles north of the City of Anchorage, comprises an area of approximately 150 square miles. The Anchorage Hillside test site, lying approximately 5 miles southeast of the central part of the city, extends over an area of some 25 square miles. Map construction and content is described.

  11. From ecological test site to geographic information system: lessons for the 1980's

    USGS Publications Warehouse

    Alexander, Robert H.

    1981-01-01

    Geographic information systems were common elements in two kinds of interdisciplinary regional demonstration projects in the 1970's. Ecological test sites attempted to provide for more efficient remote-sensing data delivery for regional environmental management. Regional environmental systems analysis attempted to formally describe and model the interacting regional social and environmental processes, including the resource-use decision making process. Lessons for the 1980's are drawn from recent evaluations and assessments of these programs, focusing on cost, rates of system development and technology transfer, program coordination, integrative analysis capability, and the involvement of system users and decision makers.

  12. In situ flash x-ray high-speed computed tomography for the quantitative analysis of highly dynamic processes

    NASA Astrophysics Data System (ADS)

    Moser, Stefan; Nau, Siegfried; Salk, Manfred; Thoma, Klaus

    2014-02-01

    The in situ investigation of dynamic events, ranging from car crashes to ballistics, is often key to the understanding of dynamic material behavior. In many cases the important processes and interactions happen on the scale of milliseconds to microseconds at speeds of 1000 m/s or more, and 3D information is often necessary to fully capture and analyze all relevant effects. High-speed 3D visualization techniques are thus required for in situ analysis. 3D-capable optical high-speed methods are often impaired by luminous effects and dust, while flash x-ray based methods usually deliver only 2D data. In this paper, a novel 3D-capable flash x-ray based method, in situ flash x-ray high-speed computed tomography (HSCT), is presented. The method is capable of producing 3D reconstructions of high-speed processes from an undersampled dataset consisting of only a few (typically 3 to 6) x-ray projections. The major challenges are identified and discussed, and the chosen solution is outlined. The application is illustrated with an exemplary 1000 m/s high-speed impact event on the scale of microseconds. A quantitative analysis of the in situ measurement of the material fragments, with a 3D reconstruction at 1 mm voxel size, is presented and the results are discussed. The results show that the HSCT method allows gaining valuable visual and quantitative mechanical information for the understanding and interpretation of high-speed events.
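    The record does not spell out the reconstruction algorithm; a generic ART (Kaczmarz) iteration, the standard starting point for few-view tomography, illustrates how a severely undersampled projection set can still constrain a reconstruction:

```python
import numpy as np

# ART (Kaczmarz) iterations: project the estimate onto each measured
# ray equation in turn. A is the (rays x voxels) projection matrix,
# b the measured projections; here both are random stand-ins.
def art(A, b, n_iter=50, relax=0.5):
    x = np.zeros(A.shape[1])
    row_norm2 = np.einsum("ij,ij->i", A, A)
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            if row_norm2[i] > 0:
                x += relax * (b[i] - A[i] @ x) / row_norm2[i] * A[i]
    return x

rng = np.random.default_rng(0)
x_true = (rng.uniform(size=64) > 0.8).astype(float)  # sparse "fragments"
A = rng.uniform(size=(12, 64))                       # only 12 ray sums
print(np.round(art(A, A @ x_true), 2)[:8])
```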

  13. Sensor-based architecture for medical imaging workflow analysis.

    PubMed

    Silva, Luís A Bastião; Campos, Samuel; Costa, Carlos; Oliveira, José Luis

    2014-08-01

    The growing use of computer systems in medical institutions has been generating a tremendous quantity of data. While these data have a critical role in assisting physicians in clinical practice, the information that can be extracted goes far beyond that utilization. This article proposes a platform capable of assembling multiple data sources within a medical imaging laboratory through a network of intelligent sensors. The proposed integration framework follows a hybrid SOA architecture based on an information sensor network capable of collecting information from several sources in medical imaging laboratories. Currently, the system supports three types of sensors: DICOM repository metadata, network workflows, and examination reports. Each sensor is responsible for converting unstructured information from its data source into a common format that is then semantically indexed in the framework engine. The platform was deployed in the cardiology department of a central hospital, allowing the identification of process characteristics and user behaviours that were unknown before this solution was put into use.
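    The article's schema is not published; a minimal sketch of the "intelligent sensor" abstraction, with invented class and field names, shows how heterogeneous sources could be normalized into common events for the indexing engine:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

# Class and field names are invented; the article does not publish them.
@dataclass
class Event:
    source: str      # e.g. "dicom-metadata", "network", "reports"
    kind: str        # normalized event type
    payload: dict    # common key/value format for the indexing engine

class Sensor(ABC):
    @abstractmethod
    def poll(self) -> list:
        """Convert one source's unstructured data into common Events."""

class DicomMetadataSensor(Sensor):
    def __init__(self, studies):
        self.studies = studies

    def poll(self):
        return [Event("dicom-metadata", "study-stored",
                      {"modality": s["Modality"],
                       "station": s["StationName"]})
                for s in self.studies]

queue = []
for sensor in [DicomMetadataSensor([{"Modality": "US",
                                     "StationName": "ECHO1"}])]:
    queue.extend(sensor.poll())
print(queue)
```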

  14. Technology transfer and commercialization initiatives at TRI/Austin: Resources and examples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzkanin, G.A.; Dingus, M.L.

    1995-12-31

    Located at TRI/Austin, and operated under a Department of Defense contract, is the Nondestructive Testing Information Analysis Center (NTIAC). This is a full service Information Analysis Center sponsored by the Defense Technical Information Center (DTIC), although services of NTIAC are available to other government agencies, government contractors, industry and academia. The principal objective of NTIAC is to help increase the productivity of the nation's scientists, engineers, and technical managers involved in, or requiring, nondestructive testing by providing broad information analysis services of technical excellence. TRI/Austin is actively pursuing commercialization of several products based on results from outside-funded R and D programs. As a small business, TRI/Austin has limited capabilities for large scale fabrication, production, marketing or distribution. Thus, part of a successful commercialization process involves making appropriate collaboration arrangements with other organizations to augment TRI/Austin's capabilities. Brief descriptions are given here of two recent commercialization efforts at TRI/Austin.

  15. Implementing health information exchange for public health reporting: a comparison of decision and risk management of three regional health information organizations in New York state

    PubMed Central

    Phillips, Andrew B; Wilson, Rosalind V; Kaushal, Rainu; Merrill, Jacqueline A

    2014-01-01

    Health information exchange (HIE) is a significant component of healthcare transformation strategies at both the state and national levels. HIE is expected to improve care coordination, and advance public health, but implementation is massively complex and involves significant risk. In New York, three regional health information organizations (RHIOs) implemented an HIE use case for public health reporting by demonstrating capability to deliver accurate responses to electronic queries via a set of services called the Universal Public Health Node. We investigated process and outcomes of the implementation with a comparative case study. Qualitative analysis was structured around a decision and risk matrix. Although each RHIO had a unique operational model, two common factors influenced risk management and implementation success: leadership capable of agile decision-making and commitment to a strong organizational vision. While all three RHIOs achieved certification for the public health reporting, only one has elected to deploy a production version. PMID:23975626

  16. Implementing health information exchange for public health reporting: a comparison of decision and risk management of three regional health information organizations in New York state.

    PubMed

    Phillips, Andrew B; Wilson, Rosalind V; Kaushal, Rainu; Merrill, Jacqueline A

    2014-02-01

    Health information exchange (HIE) is a significant component of healthcare transformation strategies at both the state and national levels. HIE is expected to improve care coordination, and advance public health, but implementation is massively complex and involves significant risk. In New York, three regional health information organizations (RHIOs) implemented an HIE use case for public health reporting by demonstrating capability to deliver accurate responses to electronic queries via a set of services called the Universal Public Health Node. We investigated process and outcomes of the implementation with a comparative case study. Qualitative analysis was structured around a decision and risk matrix. Although each RHIO had a unique operational model, two common factors influenced risk management and implementation success: leadership capable of agile decision-making and commitment to a strong organizational vision. While all three RHIOs achieved certification for the public health reporting, only one has elected to deploy a production version.

  17. Web-GIS platform for monitoring and forecasting of regional climate and ecological changes

    NASA Astrophysics Data System (ADS)

    Gordov, E. P.; Krupchatnikov, V. N.; Lykosov, V. N.; Okladnikov, I.; Titov, A. G.; Shulgina, T. M.

    2012-12-01

    The growing volume of environmental data from sensors and model outputs makes the development of a software infrastructure, based on modern information and telecommunication technologies, for the support of integrated scientific research in the Earth sciences an urgent and important task (Gordov et al., 2012; van der Wel, 2005). It should be considered that the inherent heterogeneity of datasets obtained from different sources and institutions not only hampers the interchange of data and analysis results but also complicates their intercomparison, decreasing the reliability of analysis results. Modern geophysical data processing techniques, however, allow different technological solutions to be combined in organizing such information resources. It is now generally accepted that an information-computational infrastructure should rely on the combined usage of web and GIS technologies for creating applied information-computational web systems (Titov et al., 2009; Gordov et al., 2010; Gordov, Okladnikov and Titov, 2011). Using these approaches for the development of internet-accessible thematic information-computational systems, and arranging data and knowledge interchange between them, is a very promising way to create a distributed information-computational environment supporting multidisciplinary regional and global research in the Earth sciences, including analysis of climate changes and their impact on the spatial-temporal distribution and state of vegetation.

    An experimental software and hardware platform is presented that supports the operation of a web-oriented production and research center for regional climate change investigations, combining a modern Web 2.0 approach, GIS functionality, and capabilities for running climate and meteorological models, processing large geophysical datasets, visualization, joint software development by distributed research groups, scientific analysis, and the education of students and post-graduate students. The platform software (Shulgina et al., 2012; Okladnikov et al., 2012) includes dedicated modules for numerical processing of regional and global modeling results for subsequent analysis and visualization. Data preprocessing, model runs, and visualization of results for the WRF and «Planet Simulator» models integrated into the platform are also provided. All functions of the center are accessible to a user through a web portal using a common graphical web browser, in the form of an interactive graphical user interface that provides, in particular, visualization of processing results, selection of a geographical region of interest (pan and zoom), and data layer manipulation (order, enable/disable, feature extraction). The platform provides users with capabilities for heterogeneous geophysical data analysis, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes in the framework of different multidisciplinary studies (Shulgina et al., 2011). Using it, even an unskilled user without specific knowledge can perform computational processing and visualization of large meteorological, climatological and satellite monitoring datasets through a unified graphical web interface.

  18. Total Quality Management Implementation at the Defense Technical Information Center

    DTIC Science & Technology

    1989-09-01

    industry user groups, as well as the contracting process to emphasize the benefits of TQM to the private sector. 5. Demonstrate an Uncompromising... worklife enhancements are important elements in creating the environment in which DTIC personnel will grow, gain new experiences and capabilities, and

  19. 75 FR 4031 - Streamlining Hard-Copy Postage Statement Processing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-26

    ... account via the Business Customer Gateway. Additional information on the Business Customer Gateway is found at https://gateway.usps.com/bcg or by contacting their district Manager, Business Mail Entry. In... mailer or agent. Postal facilities with PostalOne! capability would enter mailing data electronically and...

  20. On the formal definition of the systems' interoperability capability: an anthropomorphic approach

    NASA Astrophysics Data System (ADS)

    Zdravković, Milan; Luis-Ferreira, Fernando; Jardim-Goncalves, Ricardo; Trajanović, Miroslav

    2017-03-01

    The extended view of enterprise information systems in the Internet of Things (IoT) introduces additional complexity to interoperability problems. In response to this, the problem of systems' interoperability is revisited by taking into account different aspects of philosophy, psychology, linguistics and artificial intelligence, namely by analysing the potential analogies between the processes of human and system communication. The capability to interoperate, as a property of the system, is then defined as a complex ability to seamlessly sense and perceive a stimulus from its environment (assumedly, a message from any other system), make an informed decision about this perception and, consequently, articulate a meaningful and useful action or response based on this decision. Although this capability is defined on the basis of existing interoperability theories, the proposed approach excludes the assumption that the two interoperating systems are aware of each other's co-existence. It thus establishes links between research on the interoperability of systems and research on intelligent software agents, as one of the systems' digital identities.

  1. Infrared Spectroscopic Imaging: The Next Generation

    PubMed Central

    Bhargava, Rohit

    2013-01-01

    Infrared (IR) spectroscopic imaging seemingly matured as a technology in the mid-2000s, with commercially successful instrumentation and reports in numerous applications. Recent developments, however, have transformed our understanding of the recorded data, provided capability for new instrumentation, and greatly enhanced the ability to extract more useful information in less time. These developments are summarized here in three broad areas (data recording, interpretation of recorded data, and information extraction), and their critical review is employed to project emerging trends. Overall, the convergence of selected components from hardware, theory, algorithms, and applications is one trend. Instead of similar, general-purpose instrumentation, another trend is likely to be diverse and application-targeted designs of instrumentation driven by emerging component technologies. The recent renaissance in both fundamental science and instrumentation will likely spur investigations at the confluence of conventional spectroscopic analyses and optical physics for improved data interpretation. While chemometrics has dominated data processing, a trend will likely lie in the development of signal processing algorithms to optimally extract spectral and spatial information prior to conventional chemometric analyses. Finally, the sum of these recent advances is likely to provide unprecedented capability in measurement and scientific insight, which will present new opportunities for the applied spectroscopist. PMID:23031693

  2. System enhancements of Mesoscale Analysis and Space Sensor (MASS) computer system

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.; Karitani, S.

    1985-01-01

    Interactive information processing for the Mesoscale Analysis and Space Sensor (MASS) program is reported, including the development and implementation of new spaceborne remote sensing technology to observe and measure atmospheric processes. Space measurements and conventional observational data are processed together to gain an improved understanding of the mesoscale structure and dynamical evolution of the atmosphere relative to cloud development and precipitation processes. A research computer system consisting of three primary computers (HP-1000F, Perkin-Elmer 3250, and Harris/6) was developed, providing a wide range of capabilities for interactively processing and displaying large volumes of remote sensing data. The development of a MASS data base management and analysis system on the HP-1000F computer, and the extension of these capabilities through integration with the Perkin-Elmer and Harris/6 computers using MSFC's Apple III microcomputer workstations, are described. The objectives are to design hardware enhancements for computer integration and to provide data conversion and transfer between machines.

  3. MIRADS-2 Implementation Manual

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The Marshall Information Retrieval and Display System (MIRADS), a data base management system designed to provide the user with a set of generalized file capabilities, is presented. The system provides a wide variety of ways to process the contents of the data base, including capabilities to search, sort, compute, update, and display the data. The process of creating, defining, and loading a data base is generally called the loading process. The steps in the loading process, which include (1) structuring, (2) creating, (3) defining, and (4) implementing the data base for use by MIRADS, are defined. The execution of several computer programs is required to successfully complete all steps of the loading process. The MIRADS library must be established as a cataloged mass-storage file as the first step in MIRADS implementation, and the procedure for establishing this library is given. The system is currently operational for the UNIVAC 1108 computer system utilizing the Executive Operating System; all procedures relate to the use of MIRADS on the U-1108 computer.

  4. Space Station Mission Planning System (MPS) development study. Volume 2

    NASA Technical Reports Server (NTRS)

    Klus, W. J.

    1987-01-01

    The process and existing software used for Spacelab payload mission planning were studied. A complete baseline definition of the Spacelab payload mission planning process was established, along with a definition of existing software capabilities for potential extrapolation to the Space Station. This information was used as a basis for defining system requirements to support Space Station mission planning. The Space Station mission planning concept was reviewed for the purpose of identifying areas where artificial intelligence concepts might offer substantially improved capability. Three specific artificial intelligence concepts were to be investigated for applicability: natural language interfaces; expert systems; and automatic programming. The advantages and disadvantages of interfacing an artificial intelligence language with existing FORTRAN programs or of converting totally to a new programming language were identified.

  5. Description and operational status of the National Transonic Facility computer complex

    NASA Technical Reports Server (NTRS)

    Boyles, G. B., Jr.

    1986-01-01

    This paper describes the National Transonic Facility (NTF) computer complex and its support of tunnel operations. The research data acquisition and reduction capabilities are discussed, along with the types of data that can be acquired and presented. Pretest, test, and posttest capabilities are also outlined, along with the computer complex's ability to monitor the tunnel control processes and provide the tunnel operators with the information needed to control the tunnel. Planned enhancements to the computer complex in support of future testing are presented.

  6. Learning class descriptions from a data base of spectral reflectance with multiple view angles

    NASA Technical Reports Server (NTRS)

    Kimes, Daniel S.; Harrison, Patrick R.; Harrison, P. A.

    1992-01-01

    A learning program has been developed which combines 'learning by example' with the generate-and-test paradigm to furnish a robust learning environment capable of handling error-prone data. The program is shown to be capable of learning class descriptions from positive and negative training examples of spectral and directional reflectance data taken from soil and vegetation. The program, which uses AI techniques to automate very tedious processes, found the sequence of relationships containing the most important information for distinguishing the classes.
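
    As a rough illustration of the generate-and-test, learning-by-example loop described above, the sketch below induces a conjunctive class description from positive and negative examples. The attribute names, data, and scoring rule are hypothetical stand-ins, not the program's actual representation.

    ```python
    from itertools import combinations

    # Toy training data: reflectance observations reduced to discrete
    # attributes (hypothetical names; the paper's features differ).
    positives = [{"band": "nir_high", "angle": "nadir"},
                 {"band": "nir_high", "angle": "oblique"}]
    negatives = [{"band": "nir_low", "angle": "nadir"},
                 {"band": "nir_low", "angle": "oblique"}]

    def covers(rule, example):
        """A rule is a dict of attribute -> required value; it covers an
        example when every condition matches."""
        return all(example.get(attr) == val for attr, val in rule.items())

    def generate_candidates(examples):
        """GENERATE step: enumerate conjunctive rules built from the
        attribute-value pairs seen in the positive examples."""
        pairs = {(a, v) for ex in examples for a, v in ex.items()}
        for r in range(1, len(pairs) + 1):
            for combo in combinations(sorted(pairs), r):
                yield dict(combo)

    def score(rule):
        """TEST step: reward covered positives, penalize covered negatives.
        Preferring the best score, rather than demanding perfect consistency,
        gives some robustness to error-prone data."""
        tp = sum(covers(rule, ex) for ex in positives)
        fp = sum(covers(rule, ex) for ex in negatives)
        return tp - fp

    best = max(generate_candidates(positives), key=score)
    print("learned class description:", best)   # {'band': 'nir_high'}
    ```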

  7. Problem-Solving Environments (PSEs) to Support Innovation Clustering

    NASA Technical Reports Server (NTRS)

    Gill, Zann

    1999-01-01

    This paper argues that there is a need for high-level concepts to inform the development of Problem-Solving Environment (PSE) capability. A traditional approach to PSE implementation is to (1) assemble a collection of tools, (2) integrate the tools, and (3) assume that collaborative work begins after the PSE is assembled. I argue for starting from the opposite premise: promoting human collaboration and observing that process comes first, followed by the development of supporting tools, and finally by the evolution of PSE capability through input from collaborating project teams.

  8. Information literacy of U.S. and Indian engineering undergraduates.

    PubMed

    Taraban, Roman; Suar, Damodar; Oliver, Kristin

    2013-12-01

    To be competitive, contemporary engineers must be capable of both processing and communicating information effectively. Available research suggests that Indian students would be disadvantaged in information literacy in their language of instruction (English) compared to U.S. students because English is not Indian students' native language. Compared to U.S. students, Indian students were predicted (a) to apply practical text-processing strategies to a greater extent than analytic strategies and (b) to endorse the direct transmission of information over critical, interpretive analysis of information. Two validated scales measuring self-reported use of reading strategies and beliefs about interpreting and critiquing written information were administered to engineering students at an Indian Institute of Technology in their freshman to senior years. Neither prediction was supported: Indian students reported applying analytic strategies over pragmatic strategies and were more disposed to critically analyze information than to accept it passively. Further, Indian students reported being more analytic and more reflective in their reading behaviors than U.S. engineering students. Additional data indicated that U.S. and Indian students' text-processing strategies and beliefs are associated with the texts that they read and with their academic behaviors.

  9. OPTICAL PROCESSING OF INFORMATION: Potential applications of quasi-cw partially coherent radiation in optical data recording and processing

    NASA Astrophysics Data System (ADS)

    Volkov, L. V.; Larkin, A. I.

    1994-04-01

    Theoretical and experimental investigations are reported of the potential applications of quasi-cw partially coherent radiation in optical systems based on diffraction-interference principles. It is shown that the spectral characteristics of quasi-cw radiation influence the data-handling capabilities of a holographic correlator and of a partially coherent holographic system for data acquisition. Relevant experimental results are reported.

  10. Requirements for company-wide management

    NASA Technical Reports Server (NTRS)

    Southall, J. W.

    1980-01-01

    Computing system requirements were developed for company-wide management of information and computer programs in an engineering data processing environment. The requirements are essential to the successful implementation of a computer-based engineering data management system; they exceed the capabilities provided by the commercially available data base management systems. These requirements were derived from a study entitled The Design Process, which was prepared by design engineers experienced in development of aerospace products.

  11. Improving Air Force Imagery Reconnaissance Support to Ground Commanders.

    DTIC Science & Technology

    1983-06-03

    ...reconnaissance support in Southeast Asia due to the long response times of film recovery and processing capabilities and inadequate command and control... Reconnaissance is an integral part of the C3I information explosion. Traditional silver halide film products, chemically processed and manually distributed, are... being replaced with electronic near-real-time (NRT) imaging sensors. The term "imagery" now includes not only conventional film-based products (black...

  12. A DNA-based pattern classifier with in vitro learning and associative recall for genomic characterization and biosensing without explicit sequence knowledge.

    PubMed

    Lee, Ju Seok; Chen, Junghuei; Deaton, Russell; Kim, Jin-Woo

    2014-01-01

    Genetic material extracted from in situ microbial communities has high promise as an indicator of biological system status. However, the challenge is to access genomic information from all organisms at the population or community scale to monitor the biosystem's state. Hence, there is a need for a better diagnostic tool that provides a holistic view of a biosystem's genomic status. Here, we introduce an in vitro methodology for genomic pattern classification of biological samples that taps large amounts of genetic information from all genes present and uses that information to detect changes in genomic patterns and classify them. We developed a biosensing protocol, termed Biological Memory, that has in vitro computational capabilities to "learn" and "store" genomic sequence information directly from genomic samples without knowledge of their explicit sequences, and that discovers differences in vitro between previously unknown inputs and learned memory molecules. The Memory protocol was designed and optimized based upon (1) common in vitro recombinant DNA operations using 20-base random probes, including polymerization, nuclease digestion, and magnetic bead separation, to capture a snapshot of the genomic state of a biological sample as a DNA memory and (2) the thermal stability of DNA duplexes between new input and the memory to detect similarities and differences. For efficient read out, a microarray was used as an output method. When the microarray-based Memory protocol was implemented to test its capability and sensitivity using genomic DNA from two model bacterial strains, i.e., Escherichia coli K12 and Bacillus subtilis, results indicate that the Memory protocol can "learn" input DNA, "recall" similar DNA, differentiate between dissimilar DNA, and detect relatively small concentration differences in samples. This study demonstrated not only the in vitro information processing capabilities of DNA, but also its promise as a genomic pattern classifier that could access information from all organisms in a biological system without explicit genomic information. The Memory protocol has high potential for many applications, including in situ biomonitoring of ecosystems, screening for diseases, biosensing of pathological features in water and food supplies, and non-biological information processing of memory devices, among many others.
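
    The learn/recall behavior described above lends itself to a simple in-silico analogue. In the sketch below, string operations stand in for the wet-lab steps: "learning" samples 20-base probes from a genome, and "recall" scores a new sample by the fraction of memory probes it contains, a crude proxy for duplex thermal stability. All parameters are illustrative, not the published protocol.

    ```python
    import random

    random.seed(1)
    BASES = "ACGT"

    def random_genome(n):
        return "".join(random.choice(BASES) for _ in range(n))

    def learn_memory(genome, probe_len=20, n_probes=500):
        """'Learn': capture a snapshot of the genome as a set of probe-length
        subsequences sampled at random positions (an in-silico stand-in for
        random-probe polymerization and magnetic bead separation)."""
        starts = (random.randrange(len(genome) - probe_len) for _ in range(n_probes))
        return {genome[s:s + probe_len] for s in starts}

    def recall_similarity(memory, sample):
        """'Recall': fraction of memory probes with an exact match in the new
        input -- a rough software proxy for duplex thermal stability."""
        return sum(probe in sample for probe in memory) / len(memory)

    genome_a = random_genome(50_000)
    genome_b = random_genome(50_000)
    memory = learn_memory(genome_a)
    print("self recall :", recall_similarity(memory, genome_a))   # 1.0
    print("cross recall:", recall_similarity(memory, genome_b))   # ~0.0
    ```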

  13. Potential capabilities for compression of information of certain data processing systems

    NASA Technical Reports Server (NTRS)

    Khodarev, Y. K.; Yevdokimov, V. P.; Pokras, V. M.

    1974-01-01

    This article undertakes to study a generalized block diagram of a data collection and processing system of a spacecraft, in which a number of sensors or outputs of scientific instruments are cyclically interrogated by a commutator; methods of writing the supplementary information in a frame, using the example of a hypothetical telemetry system; and the influence of the statistics of the number of active channels in a frame on the frame compression factor. The separation of the data compression factor of the spacecraft collection and processing system into two parts, as used in this work, allows the compression factor of an active frame to be determined not only from the statistics of channel activity in the telemetry frame, but also from the method of introducing the additional address and time information into each frame.
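
    To make the two-part factorization concrete, here is a toy calculation under assumed parameters: an N-channel frame of w-bit words is compressed to its k active channels, each carrying an a-bit channel address, plus t bits of frame time code; the overall factor is then an expectation over the statistics of k. All numbers are hypothetical, not the paper's telemetry format.

    ```python
    # Hypothetical frame parameters: channel count, word width, address bits,
    # and per-frame time-code bits.
    N, w, a, t = 128, 8, 7, 32

    def active_frame_compression(k):
        """Compression factor of a frame with k active channels."""
        original_bits = N * w
        compressed_bits = k * (w + a) + t
        return original_bits / compressed_bits

    # Overall factor: expectation over an assumed distribution of activity.
    activity = {5: 0.4, 10: 0.3, 20: 0.2, 40: 0.1}   # k -> probability
    overall = sum(p * active_frame_compression(k) for k, p in activity.items())
    print(f"compression factor for k=10: {active_frame_compression(10):.1f}")
    print(f"expected overall factor:     {overall:.1f}")
    ```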

  14. Processing and analysis of commercial satellite image data of the nuclear accident near Chernobyl, U. S. S. R

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadowski, F.G.; Covington, S.J.

    1987-01-01

    Advanced digital processing techniques were applied to Landsat-5 Thematic Mapper (TM) data and SPOT high-resolution visible (HRV) panchromatic data to maximize the utility of images of a nuclear power plant emergency at Chernobyl in the Soviet Ukraine. The results of the data processing and analysis illustrate the spectral and spatial capabilities of the two sensor systems and provide information about the severity and duration of the events occurring at the power plant site.

  15. The Influence of Learning Strategies in the Acquisition, Retention, and Transfer of a Procedural Task.

    DTIC Science & Technology

    1979-08-01

    Lockhart, R. S. Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 1972, 11, 671-684. Craik, F. I. M... learning and memory research. In F. I. M. Craik & L. S. Cermak (Eds.), Levels of processing and theories of memory. Hillsdale, N.J.: Erlbaum, 1978... R. N., Gerson, R. F., & Kim, K. Information processing capabilities in performers differing in levels of motor skill. (Tech. Rep. TR-79-A4).

  16. Semantic Service Matchmaking in the ATM Domain Considering Infrastructure Capability Constraints

    NASA Astrophysics Data System (ADS)

    Moser, Thomas; Mordinyi, Richard; Sunindyo, Wikan Danar; Biffl, Stefan

    In a service-oriented environment business processes flexibly build on software services provided by systems in a network. A key design challenge is the semantic matchmaking of business processes and software services in two steps: 1. Find for one business process the software services that meet or exceed the BP requirements; 2. Find for all business processes the software services that can be implemented within the capability constraints of the underlying network, which poses a major problem since even for small scenarios the solution space is typically very large. In this chapter we analyze requirements from mission-critical business processes in the Air Traffic Management (ATM) domain and introduce an approach for semi-automatic semantic matchmaking for software services, the “System-Wide Information Sharing” (SWIS) business process integration framework. A tool-supported semantic matchmaking process like SWIS can provide system designers and integrators with a set of promising software service candidates and therefore strongly reduces the human matching effort by focusing on a much smaller space of matchmaking candidates. We evaluate the feasibility of the SWIS approach in an industry use case from the ATM domain.
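
    The two-step matchmaking above can be read as a filter-then-assign problem. The sketch below mimics it with invented requirement and capacity fields; it is not the SWIS schema or algorithm, and its brute-force second step illustrates why the solution space grows quickly enough to need tool support.

    ```python
    from itertools import product

    # Hypothetical inputs: each business process states minimum QoS needs,
    # each software service states what it offers and what network bandwidth
    # its use would consume. Field names are illustrative only.
    processes = {"flight_plan_sync": {"latency_ms": 200},
                 "weather_feed":     {"latency_ms": 1000}}
    services = {"svcA": {"latency_ms": 150, "bandwidth": 40},
                "svcB": {"latency_ms": 100, "bandwidth": 70},
                "svcC": {"latency_ms": 800, "bandwidth": 10}}
    NETWORK_BANDWIDTH = 90  # shared infrastructure capability constraint

    def meets(req, offer):
        return offer["latency_ms"] <= req["latency_ms"]

    # Step 1: per-process candidates that meet or exceed the BP requirements.
    candidates = {bp: [s for s, o in services.items() if meets(req, o)]
                  for bp, req in processes.items()}

    # Step 2: pick one candidate per process such that the combined load fits
    # within the network's capability constraints (brute force here).
    feasible = [dict(zip(candidates, choice))
                for choice in product(*candidates.values())
                if sum(services[s]["bandwidth"] for s in choice) <= NETWORK_BANDWIDTH]
    print("candidates per process:", candidates)
    print("feasible assignments  :", feasible)
    ```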

  17. Orchestration of Molecular Information through Higher Order Chemical Recognition

    NASA Astrophysics Data System (ADS)

    Frezza, Brian M.

    Broadly defined, higher-order chemical recognition is the process whereby discrete chemical building blocks capable of specifically binding to cognate moieties are covalently linked into oligomeric chains. These chains, or sequences, are then able to recognize and bind to their cognate sequences with a high degree of cooperativity. Principally speaking, DNA and RNA are the most readily obtained examples of this chemical phenomenon, and function via Watson-Crick cognate pairing: guanine pairs with cytosine and adenine with thymine (DNA) or uracil (RNA), in an anti-parallel manner. While the theoretical principles, techniques, and equations derived herein apply generally to any higher-order chemical recognition system, in practice we utilize DNA oligomers as a model-building material to experimentally investigate and validate our hypotheses. Historically, general purpose information processing has been a task limited to semiconductor electronics. Molecular computing on the other hand has been limited to ad hoc approaches designed to solve highly specific and unique computation problems, often involving components or techniques that cannot be applied generally in a manner suitable for precise and predictable engineering. Herein, we provide a fundamental framework for harnessing higher-order recognition in a modular and programmable fashion to synthesize molecular information processing networks of arbitrary construction and complexity. This document provides a solid foundation for routinely embedding computational capability into chemical and biological systems where semiconductor electronics are unsuitable for practical application.
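
    Because anti-parallel Watson-Crick pairing is the recognition primitive the chapter builds on, its core can be stated in a few lines of code. The cooperativity check below is a deliberately crude stand-in for real duplex thermodynamics.

    ```python
    COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}  # DNA; RNA pairs A with U

    def cognate(seq):
        """Return the cognate (reverse complement) of a DNA oligomer: the
        sequence that binds it in the anti-parallel orientation."""
        return "".join(COMPLEMENT[b] for b in reversed(seq))

    def recognizes(a, b, min_run=4):
        """Crude cooperativity check: do the strands share a perfect
        cognate run of at least min_run bases?"""
        cb = cognate(b)
        return any(a[i:i + min_run] in cb for i in range(len(a) - min_run + 1))

    probe = "GATTACA"
    print(cognate(probe))                      # TGTAATC
    print(recognizes(probe, cognate(probe)))   # True
    ```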

  18. Modeling of information diffusion in Twitter-like social networks under information overload.

    PubMed

    Li, Pei; Li, Wei; Wang, Hui; Zhang, Xin

    2014-01-01

    Due to the existence of information overload in social networks, it becomes increasingly difficult for users to find useful information according to their interests. This paper takes Twitter-like social networks into account and proposes models to characterize the process of information diffusion under information overload. Users are classified into different types according to their in-degrees and out-degrees, and user behaviors are generalized into two categories: generating and forwarding. View scope is introduced to model the user information-processing capability under information overload, and the average number of times a message appears in view scopes after it is generated by a given type of user is adopted to characterize the information diffusion efficiency, which is calculated theoretically. To verify the accuracy of the theoretical analysis, we conduct simulations and provide the simulation results, which are perfectly consistent with the theoretical analysis. These results are important for understanding the diffusion dynamics in social networks, and this analysis framework can be extended to consider more realistic situations.
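
    A toy agent-based rendering of this model is straightforward: users hold newest-first feeds, read only a fixed view scope, and forward with some probability, while competing traffic pushes messages out of view. The graph, view scope, and rates below are arbitrary choices; the paper's analytical results are not reproduced here.

    ```python
    import random

    random.seed(0)
    N_USERS, VIEW_SCOPE, P_FORWARD, STEPS = 200, 10, 0.05, 30

    # Random graph: followers[u] are the users who receive u's messages.
    followers = {u: random.sample([v for v in range(N_USERS) if v != u], 8)
                 for u in range(N_USERS)}
    feeds = {u: [] for u in range(N_USERS)}   # newest-first timelines
    forwarded = set()                         # users who already forwarded

    def deliver(msg, author):
        for f in followers[author]:
            feeds[f].insert(0, msg)

    appearances = 0
    deliver("tracked", author=0)              # generating behavior
    for _ in range(STEPS):
        for u in range(N_USERS):
            view = feeds[u][:VIEW_SCOPE]      # overload: only the view scope is read
            if "tracked" in view:
                appearances += 1
                if u not in forwarded and random.random() < P_FORWARD:
                    forwarded.add(u)
                    deliver("tracked", author=u)   # forwarding behavior
            # competing traffic pushes older messages out of the view scope
            for _ in range(random.randint(0, 3)):
                feeds[u].insert(0, object())

    print("view-scope appearances of the tracked message:", appearances)
    ```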

  19. Optical information processing for NASA's space exploration

    NASA Technical Reports Server (NTRS)

    Chao, Tien-Hsin; Ochoa, Ellen; Juday, Richard

    1990-01-01

    The development status of optical processing techniques at NASA-JPL, NASA-Ames, and NASA-Johnson is evaluated with a view to their potential applications in future NASA planetary exploration missions. It is projected that such optical processing systems can yield major reductions in mass, volume, and power requirements relative to exclusively electronic systems of comparable processing capabilities. Attention is given to high-order neural networks for distortion-invariant classification and pattern recognition, multispectral imaging using an acoustooptic tunable filter, and an optical matrix processor for control problems.

  1. Creation of a full color geologic map by computer: A case history from the Port Moller project resource assessment, Alaska Peninsula: A section in Geologic studies in Alaska by the U.S. Geological Survey, 1988

    USGS Publications Warehouse

    Wilson, Frederic H.

    1989-01-01

    Graphics programs on computers can facilitate the compilation and production of geologic maps, including full color maps of publication quality. This paper describes the application of two different programs, GSMAP and ARC/INFO, to the production of a geologic map of the Port Moller and adjacent 1:250,000-scale quadrangles on the Alaska Peninsula. GSMAP was used at first because of easy digitizing on inexpensive computer hardware. Limitations in its editing capability led to transfer of the digital data to ARC/INFO, a Geographic Information System, which has better editing and also added data analysis capability. Although these improved capabilities are accompanied by increased complexity, the availability of ARC/INFO's data analysis capability provides unanticipated advantages. It allows digital map data to be processed as one of multiple data layers for mineral resource assessment. As a result of the development of both software packages, it is now easier to apply both to geologic map production. Both systems accelerate the drafting and revision of maps and enhance the compilation process. Additionally, ARC/INFO's analysis capability enhances the geologist's ability to develop answers to questions of interest that were previously difficult or impossible to obtain.

  2. Non-Discretionary Access Control for Decentralized Computing Systems

    DTIC Science & Technology

    1977-05-01

    Semaphores are inherently read-write objects to all users. Reed and Kanodia <Reed77> propose a scheme for process synchronization using... [table-of-contents fragment: capabilities; UNIX-style naming; garbage collection; synchronization without writing; downgrading information] ...an intruder could not follow the rapid exchange of messages and would be unable to extract information. Farber and Larsen describe synchronization and...

  3. Essentials of LIDAR multiangle data processing methodology for smoke polluted atmospheres

    Treesearch

    V. A. Kovalev; A. Petkov; C. Wold; S. Urbanski; W. M. Hao

    2009-01-01

    Mobile scanning lidar is the most appropriate tool for monitoring wildfire smoke-plume dynamics and optical properties. Lidar is the only remote sensing instrument capable of obtaining detailed three-dimensional range-resolved information for smoke distributions and optical properties over ranges of 10+ km at different wavelengths simultaneously.

  4. An Eclectic Model for Assessing E-Learning Readiness in the Iranian Universities

    ERIC Educational Resources Information Center

    Darab, B.; Montazer, Gh. A.

    2011-01-01

    Information technologies have caused the accumulation and interaction of knowledge to be increasingly reshaped with significant ramifications affecting the processes of acquisition, communication and dissemination of knowledge in almost all societies. In the meantime, assessing the capabilities of the educational system for the successful…

  5. Intelligence-Driven Border Security: A Promethean View of U.S. Border Patrol Intelligence Operations

    DTIC Science & Technology

    2015-12-01

    Subject terms: USBP agent, intelligence (BPA-I); information sharing; capability gap analysis process (CGAP); Tucson Sector Red Team.

  6. Past developments and future directions for the AHP in natural resources

    Treesearch

    Daniel L. Schmoldt; G.A. Mendoza; Jyrki Kangas

    2001-01-01

    The analytic hierarchy process (AHP) possesses certain characteristics that make it a useful tool for natural resource decision making. The AHP’s capabilities include: participatory decision making, problem structuring and alternative development, group facilitation, consensus building, fairness, qualitative and quantitative information, conflict resolution, decision...
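
    The AHP's numeric core, deriving priority weights from a reciprocal pairwise-comparison matrix via its principal eigenvector and checking judgment consistency, fits in a few lines. The criteria and comparison values below are invented for illustration.

    ```python
    import numpy as np

    # Pairwise comparisons (Saaty 1-9 scale) among three hypothetical criteria,
    # e.g. habitat value, timber yield, recreation. A[i, j] says how much more
    # important criterion i is than j; the matrix is reciprocal.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                  # priority vector

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)      # consistency index
    ri = 0.58                                 # random index for n = 3 (Saaty)
    print("weights:", weights.round(3))       # ~[0.648, 0.230, 0.122]
    print("consistency ratio:", round(ci / ri, 3))   # < 0.1 is acceptable
    ```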

  7. Managing the data explosion

    USGS Publications Warehouse

    Hooper, Richard P.; Aulenbach, Brent T.

    1993-01-01

    The 'data explosion' brought on by electronic sensors and automatic samplers can strain the capabilities of existing water-quality data-management systems just when they're needed most to process the information. The U.S. Geological Survey has responded to the problem by setting up an innovative system that allows rapid data analysis.

  8. The Company of Others: Generating Knowhow in Later Life

    ERIC Educational Resources Information Center

    Kimberley, Helen; Golding, Barry; Simons, Bonnie

    2016-01-01

    This paper explores some important aspects of the generation of practical knowledge through later life. It is about the relationship between knowledge generation, agency and capability, developed informally through the life experiences in and through the Company of Others. It emphasises how the everyday processes of socialisation create invaluable…

  9. Investigating Storage and Retrieval Processes of Directed Forgetting: A Model-Based Approach

    ERIC Educational Resources Information Center

    Rummel, Jan; Marevic, Ivan; Kuhlmann, Beatrice G.

    2016-01-01

    Intentional forgetting of previously learned information is an adaptive cognitive capability of humans but its cognitive underpinnings are not yet well understood. It has been argued that it strongly depends on the presentation method whether forgetting instructions alter storage or retrieval stages (Basden, Basden, & Gargano, 1993). In…

  10. Doctrine Development Process in the Kenya Army: Bridging the Gap

    DTIC Science & Technology

    2014-06-13

    ...concepts, and principles. It must broadly follow three doctrine development phases: the collection/information-gathering phase; the formulation and... a capable lead organization. The organization must eliminate terminological and utility confusion among doctrine, concepts, and principles. It must... (Figure: The Relationship Between Military Doctrine, Concept, and Principle.)

  11. The Perception of Human Resources Enterprise Architecture within the Department of Defense

    ERIC Educational Resources Information Center

    Delaquis, Richard Serge

    2012-01-01

    The Clinger Cohen Act of 1996 requires that all major Federal Government Information Technology (IT) systems prepare an Enterprise Architecture prior to IT acquisitions. Enterprise Architecture, like house blueprints, represents the system build, capabilities, processes, and data across the enterprise of IT systems. Enterprise Architecture is used…

  12. Structural and process factors affecting the implementation of antimicrobial resistance prevention and control strategies in U.S. hospitals.

    PubMed

    Chou, Ann F; Yano, Elizabeth M; McCoy, Kimberly D; Willis, Deanna R; Doebbeling, Bradley N

    2008-01-01

    To address increases in the incidence of infection with antimicrobial-resistant pathogens, the National Foundation for Infectious Diseases and Centers for Disease Control and Prevention proposed two sets of strategies to (a) optimize antibiotic use and (b) prevent the spread of antimicrobial resistance and control transmission. However, little is known about the implementation of these strategies. Our objective is to explore organizational structural and process factors that facilitate the implementation of National Foundation for Infectious Diseases/Centers for Disease Control and Prevention strategies in U.S. hospitals. We surveyed 448 infection control professionals from a national sample of hospitals. Anchored in the Donabedian model, which defines quality in terms of structural and process factors, with the structural domain further informed by a contingency approach, we modeled the degree to which National Foundation for Infectious Diseases and Centers for Disease Control and Prevention strategies were implemented as a function of formalization and standardization of protocols, centralization of decision-making hierarchy, information technology capabilities, culture, communication mechanisms, and interdepartmental coordination, controlling for hospital characteristics. Formalization, standardization, centralization, institutional culture, provider-management communication, and information technology use were associated with optimal antibiotic use and enhanced implementation of strategies that prevent and control antimicrobial resistance spread (all p < .001). However, interdepartmental coordination for patient care was inversely related to antibiotic use, in contrast to antimicrobial resistance spread prevention and control (p < .0001). Formalization and standardization may eliminate staff role conflict, whereas centralized authority may minimize ambiguity. Culture and communication likely promote internal trust, whereas information technology use helps integrate and support these organizational processes. These findings suggest concrete strategies for evaluating current capabilities to implement effective practices and foster and sustain a culture of patient safety.

  13. A learnable parallel processing architecture towards unity of memory and computing

    NASA Astrophysics Data System (ADS)

    Li, H.; Gao, B.; Chen, Z.; Zhao, Y.; Huang, P.; Ye, H.; Liu, L.; Liu, X.; Kang, J.

    2015-08-01

    Developing energy-efficient parallel information processing systems beyond von Neumann architecture is a long-standing goal of modern information technologies. The widely used von Neumann computer architecture separates memory and computing units, which leads to energy-hungry data movement when computers work. In order to meet the need of efficient information processing for the data-driven applications such as big data and Internet of Things, an energy-efficient processing architecture beyond von Neumann is critical for the information society. Here we show a non-von Neumann architecture built of resistive switching (RS) devices named “iMemComp”, where memory and logic are unified with single-type devices. Leveraging nonvolatile nature and structural parallelism of crossbar RS arrays, we have equipped “iMemComp” with capabilities of computing in parallel and learning user-defined logic functions for large-scale information processing tasks. Such architecture eliminates the energy-hungry data movement in von Neumann computers. Compared with contemporary silicon technology, adder circuits based on “iMemComp” can improve the speed by 76.8% and the power dissipation by 60.3%, together with a 700 times aggressive reduction in the circuit area.
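
    The memory-logic unification in resistive switches is often demonstrated with stateful material-implication (IMPLY) logic, in which a gate's result is written into a device's resistance state (cf. Borghetti et al., Nature 2010). The Boolean simulation below illustrates that general idea, building NAND from IMPLY and a full adder from NAND; it is a conceptual stand-in, not the "iMemComp" circuit itself.

    ```python
    # Conceptual simulation of stateful material-implication (IMPLY) logic,
    # one common way resistive-switching devices serve as both memory and
    # logic. Device physics is abstracted away entirely.

    def imply(p, q):
        """q <- p IMPLY q, computed in place on the stored bit q."""
        return (not p) or q

    def nand(p, q):
        s = False            # initialize the work device (FALSE operation)
        s = imply(p, s)      # s = NOT p
        s = imply(q, s)      # s = (NOT q) OR (NOT p) = NAND(p, q)
        return s

    # NAND is functionally complete, so any logic (e.g. an adder) follows.
    def full_adder(a, b, cin):
        def xor(x, y):       # XOR from four NANDs
            t = nand(x, y)
            return nand(nand(x, t), nand(y, t))
        s = xor(xor(a, b), cin)
        cout = nand(nand(a, b), nand(cin, xor(a, b)))
        return s, cout

    for bits in [(0, 0, 0), (1, 0, 1), (1, 1, 1)]:
        print(bits, "->", tuple(map(int, full_adder(*map(bool, bits)))))
    ```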

  14. An eHealth Capabilities Framework for Graduates and Health Professionals: Mixed-Methods Study

    PubMed Central

    McGregor, Deborah; Keep, Melanie; Janssen, Anna; Spallek, Heiko; Quinn, Deleana; Jones, Aaron; Tseris, Emma; Yeung, Wilson; Togher, Leanne; Solman, Annette; Shaw, Tim

    2018-01-01

    Background The demand for an eHealth-ready and adaptable workforce is placing increasing pressure on universities to deliver eHealth education. At present, eHealth education is largely focused on components of eHealth rather than considering a curriculum-wide approach. Objective This study aimed to develop a framework that could be used to guide health curriculum design based on current evidence, and stakeholder perceptions of eHealth capabilities expected of tertiary health graduates. Methods A 3-phase, mixed-methods approach incorporated the results of a literature review, focus groups, and a Delphi process to develop a framework of eHealth capability statements. Results Participants (N=39) with expertise or experience in eHealth education, practice, or policy provided feedback on the proposed framework, and following the fourth iteration of this process, consensus was achieved. The final framework consisted of 4 higher-level capability statements that describe the learning outcomes expected of university graduates across the domains of (1) digital health technologies, systems, and policies; (2) clinical practice; (3) data analysis and knowledge creation; and (4) technology implementation and codesign. Across the capability statements are 40 performance cues that provide examples of how these capabilities might be demonstrated. Conclusions The results of this study inform a cross-faculty eHealth curriculum that aligns with workforce expectations. There is a need for educational curriculum to reinforce existing eHealth capabilities, adapt existing capabilities to make them transferable to novel eHealth contexts, and introduce new learning opportunities for interactions with technologies within education and practice encounters. As such, the capability framework developed may assist in the application of eHealth by emerging and existing health care professionals. Future research needs to explore the potential for integration of findings into workforce development programs. PMID:29764794

  15. Space Shuttle Payload Information Source

    NASA Technical Reports Server (NTRS)

    Griswold, Tom

    2000-01-01

    The Space Shuttle Payload Information Source Compact Disk (CD) is a joint NASA and USA project to introduce Space Shuttle capabilities, payload services and accommodations, and the payload integration process. The CD will be given to new payload customers or to organizations outside of NASA considering using the Space Shuttle as a launch vehicle. The information is high-level in a visually attractive format with a voice over. The format is in a presentation style plus 360 degree views, videos, and animation. Hyperlinks are provided to connect to the Internet for updates and more detailed information on how payloads are integrated into the Space Shuttle.

  16. Landsat 4 results and their implications for agricultural surveys

    NASA Technical Reports Server (NTRS)

    Erickson, J. D.; Bizzell, R. M.; Pitts, D. E.; Thompson, D. R.

    1983-01-01

    Progress on defining the minimum Landsat-4 data characteristics needed for agricultural information in the U.S., and on assessing the value-added capability of current technology to extract that level of information, is reported. Emphasis is placed on the Thematic Mapper (TM) data and the ground processing facilities. TM data from all 7 bands for a rural Arkansas scene were examined in terms of radiometric, spatial, and geometric fidelity characteristics. Another scene sensed over Iowa was analyzed using three two-channel data sets. Although the TM data were an improvement over MSS data, no value differential was perceived. However, the development of further analysis techniques is still necessary to determine the actual worth of the improved sensor capabilities available with the TM, which effectively contains an MSS within itself.

  17. Framework Support For Knowledge-Based Software Development

    NASA Astrophysics Data System (ADS)

    Huseth, Steve

    1988-03-01

    The advent of personal engineering workstations has brought substantial information processing power to the individual programmer. Advanced tools and environment capabilities supporting the software lifecycle are just beginning to become generally available. However, many of these tools are addressing only part of the software development problem by focusing on rapid construction of self-contained programs by a small group of talented engineers. Additional capabilities are required to support the development of large programming systems where a high degree of coordination and communication is required among large numbers of software engineers, hardware engineers, and managers. A major player in realizing these capabilities is the framework supporting the software development environment. In this paper we discuss our research toward a Knowledge-Based Software Assistant (KBSA) framework. We propose the development of an advanced framework containing a distributed knowledge base that can support the data representation needs of tools, provide environmental support for the formalization and control of the software development process, and offer a highly interactive and consistent user interface.

  18. Technology for On-Chip Qubit Control with Microfabricated Surface Ion Traps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Highstrete, Clark; Scott, Sean Michael; Nordquist, Christopher D.

    2013-11-01

    Trapped atomic ions are a leading physical system for quantum information processing. However, scalability and operational fidelity remain limiting technical issues often associated with optical qubit control. One promising approach is to develop on-chip microwave electronic control of ion qubits based on the atomic hyperfine interaction. This project developed expertise and capabilities at Sandia toward on-chip electronic qubit control in a scalable architecture. The project developed a foundation of laboratory capabilities, including trapping the 171Yb+ hyperfine ion qubit and developing an experimental microwave coherent control capability. Additionally, the project investigated the integration of microwave device elements with surface ion traps utilizing Sandia's state-of-the-art MEMS microfabrication processing. This effort culminated in a device design for a multi-purpose ion trap experimental platform for investigating on-chip microwave qubit control, laying the groundwork for further funded R&D to develop on-chip microwave qubit control in an architecture that is suitable for engineering development.

  19. Information Systems for NASA's Aeronautics and Space Enterprises

    NASA Technical Reports Server (NTRS)

    Kutler, Paul

    1998-01-01

    The aerospace industry is being challenged to reduce costs and development time as well as utilize new technologies to improve product performance. Information technology (IT) is the key to providing revolutionary solutions to the challenges posed by the increasing complexity of NASA's aeronautics and space missions and the sophisticated nature of the systems that enable them. The NASA Ames vision is to develop technologies enabling the information age, expanding the frontiers of knowledge for aeronautics and space, improving America's competitive position, and inspiring future generations. Ames' missions to accomplish that vision include: 1) performing research to support the American aviation community through the unique integration of computation, experimentation, simulation and flight testing; 2) studying the health of our planet, understanding living systems in space and the origins of the universe, and developing technologies for space flight; and 3) researching, developing, and delivering information technologies and applications. Information technology may be defined as the use of advanced computing systems to generate data, analyze data, transform data into knowledge, and serve as an aid in the decision-making process. The knowledge from transformed data can be displayed in visual, virtual, and multimedia environments. The decision-making process can be fully autonomous or aided by cognitive processes, i.e., computational aids designed to leverage human capacities. IT systems can learn as they go, developing the capability to make decisions or aid the decision-making process on the basis of experiences gained using limited data inputs. In the future, information systems will be used to aid space mission synthesis, support virtual aerospace system design, aid damaged aircraft during landing, perform robotic surgery, and monitor the health and status of spacecraft and planetary probes. NASA Ames, through the Center of Excellence for Information Technology Office, is leading the effort in pursuit of revolutionary, IT-based approaches to satisfying NASA's aeronautics and space requirements. The objective of the effort is to incorporate information technologies within each of the Agency's four Enterprises, i.e., Aeronautics and Space Transportation Technology, Earth Science, Human Exploration and Development of Space, and Space Science. The end results of these efforts for Enterprise programs and projects should be reduced cost, enhanced mission capability, and expedited mission completion.

  1. A ride in the time machine: information management capabilities health departments will need.

    PubMed

    Foldy, Seth; Grannis, Shaun; Ross, David; Smith, Torney

    2014-09-01

    We have proposed needed information management capabilities for future US health departments predicated on trends in health care reform and health information technology. Regardless of whether health departments provide direct clinical services (and many will), they will manage unprecedented quantities of sensitive information for the public health core functions of assurance and assessment, including population-level health surveillance and metrics. Absent improved capabilities, health departments risk vestigial status, with consequences for vulnerable populations. Developments in electronic health records, interoperability and information exchange, public information sharing, decision support, and cloud technologies can support information management if health departments have appropriate capabilities. The need for national engagement in and consensus on these capabilities and their importance to health department sustainability make them appropriate for consideration in the context of accreditation.

  2. Organizational Capabilities for Integrating Care: A Review of Measurement Tools.

    PubMed

    Evans, Jenna M; Grudniewicz, Agnes; Baker, G Ross; Wodchis, Walter P

    2016-12-01

    The success of integrated care interventions is highly dependent on the internal and collective capabilities of the organizations in which they are implemented. Yet, organizational capabilities are rarely described, understood, or measured with sufficient depth and breadth in empirical studies or in practice. Assessing these capabilities can contribute to understanding why some integrated care interventions are more effective than others. We identified, organized, and assessed survey instruments that measure the internal and collective organizational capabilities required for integrated care delivery. We conducted an expert consultation and searched Medline and Google Scholar databases for survey instruments measuring factors outlined in the Context and Capabilities for Integrating Care Framework. A total of 58 instruments were included in the review and assessed based on their psychometric properties, practical considerations, and applicability to integrated care efforts. This study provides a bank of psychometrically sound instruments for describing and comparing organizational capabilities. Greater use of these instruments across integrated care interventions and studies can enhance standardized comparative analyses and inform change management. Further research is needed to build an evidence base for these instruments and to explore the associations between organizational capabilities and integrated care processes and outcomes.

  3. The application of information theory for the research of aging and aging-related diseases.

    PubMed

    Blokh, David; Stambler, Ilia

    2017-10-01

    This article reviews the application of information-theoretical analysis, employing measures of entropy and mutual information, for the study of aging and aging-related diseases. The research of aging and aging-related diseases is particularly suitable for the application of information theory methods, as aging processes and related diseases are multi-parametric, with continuous parameters coexisting alongside discrete parameters, and with the relations between the parameters being as a rule non-linear. Information theory provides unique analytical capabilities for the solution of such problems, with unique advantages over common linear biostatistics. Among the age-related diseases, information theory has been used in the study of neurodegenerative diseases (particularly using EEG time series for diagnosis and prediction), cancer (particularly for establishing individual and combined cancer biomarkers), diabetes (mainly utilizing mutual information to characterize the diseased and aging states), and heart disease (mainly for the analysis of heart rate variability). Few works have employed information theory for the analysis of general aging processes and frailty, as underlying determinants and possible early preclinical diagnostic measures for aging-related diseases. Generally, the use of information-theoretical analysis not only permits establishing the (non-linear) correlations between diagnostic or therapeutic parameters of interest, but may also provide a theoretical insight into the nature of aging and related diseases by establishing measures of variability, adaptation, regulation, or homeostasis within a system of interest. It may be hoped that the increased use of such measures in research may considerably increase diagnostic and therapeutic capabilities and the fundamental mathematical understanding of aging and disease.
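
    As a concrete instance of the measures discussed, the snippet below estimates mutual information between two discretized clinical parameters from their empirical joint distribution. The data are fabricated purely for illustration.

    ```python
    from collections import Counter
    from math import log2

    def mutual_information(x, y):
        """Mutual information I(X;Y) in bits between two discrete sequences,
        estimated from the empirical joint distribution; the kind of
        nonlinear-association measure the review describes."""
        n = len(x)
        px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
        return sum((c / n) * log2((c / n) / ((px[a] / n) * (py[b] / n)))
                   for (a, b), c in pxy.items())

    # Toy example: a discretized biomarker that tracks age group.
    age_group = [0, 0, 0, 1, 1, 1, 2, 2, 2, 2]
    biomarker = [0, 0, 1, 1, 1, 0, 2, 2, 2, 1]
    print(f"I(age; biomarker) = {mutual_information(age_group, biomarker):.3f} bits")
    ```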

  4. An introduction to the Marshall information retrieval and display system

    NASA Technical Reports Server (NTRS)

    1974-01-01

    An on-line terminal oriented data storage and retrieval system is presented which allows a user to extract and process information from stored data bases. The use of on-line terminals for extracting and displaying data from the data bases provides a fast and responsive method for obtaining needed information. The system consists of general purpose computer programs that provide the overall capabilities of the total system. The system can process any number of data files via a Dictionary (one for each file) which describes the data format to the system. New files may be added to the system at any time, and reprogramming is not required. Illustrations of the system are shown, and sample inquiries and responses are given.
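
    The Dictionary-per-file arrangement generalizes naturally: one generic routine can query any file whose layout its Dictionary describes, with no reprogramming. The sketch below imitates that idea with an invented fixed-width layout; the actual MIRADS dictionary format is not reproduced here.

    ```python
    # A generic retrieval routine driven by a per-file Dictionary: the
    # Dictionary (hypothetical format) describes fixed-width fields, so a
    # new data base needs only a new Dictionary, not new code.
    DICTIONARY = {               # field -> (start column, width, type)
        "part_no": (0, 6, str),
        "qty":     (6, 4, int),
        "site":    (10, 8, str),
    }

    RECORDS = [
        "AX1042  12 HUNTSVLE",
        "BK0007 140 MICHOUD ",
    ]

    def parse(line):
        return {f: ty(line[s:s + w].strip()) for f, (s, w, ty) in DICTIONARY.items()}

    def search(records, **criteria):
        """SEARCH-style inquiry: records whose fields equal the given values."""
        return [r for r in map(parse, records)
                if all(r[k] == v for k, v in criteria.items())]

    print(search(RECORDS, site="HUNTSVLE"))
    ```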

  5. NATIONAL WATER INFORMATION SYSTEM OF THE U. S. GEOLOGICAL SURVEY.

    USGS Publications Warehouse

    Edwards, Melvin D.

    1985-01-01

    The National Water Information System (NWIS) has been designed as an interactive, distributed data system. It will integrate the existing, diverse data-processing systems into a common system. It will also provide easier, more flexible use as well as more convenient access and expanded computing, dissemination, and data-analysis capabilities. The NWIS is being implemented as part of a Distributed Information System (DIS) being developed by the Survey's Water Resources Division. The NWIS will be implemented on each node of the distributed network for the local processing, storage, and dissemination of hydrologic data collected within the node's area of responsibility. The processor at each node will also be used to perform hydrologic modeling, statistical data analysis, text editing, and some administrative work.

  6. 10 Steps to Building an Architecture for Space Surveillance Projects

    NASA Astrophysics Data System (ADS)

    Gyorko, E.; Barnhart, E.; Gans, H.

    Space surveillance is an increasingly complex task, requiring the coordination of a multitude of organizations and systems, while dealing with competing capabilities, proprietary processes, differing standards, and compliance issues. In order to fully understand space surveillance operations, analysts and engineers need to analyze and break down their operations and systems using what are essentially enterprise architecture processes and techniques. These techniques can be daunting to the first- time architect. This paper provides a summary of simplified steps to analyze a space surveillance system at the enterprise level in order to determine capabilities, services, and systems. These steps form the core of an initial Model-Based Architecting process. For new systems, a well defined, or well architected, space surveillance enterprise leads to an easier transition from model-based architecture to model-based design and provides a greater likelihood that requirements are fulfilled the first time. Both new and existing systems benefit from being easier to manage, and can be sustained more easily using portfolio management techniques, based around capabilities documented in the model repository. The resulting enterprise model helps an architect avoid 1) costly, faulty portfolio decisions; 2) wasteful technology refresh efforts; 3) upgrade and transition nightmares; and 4) non-compliance with DoDAF directives. The Model-Based Architecting steps are based on a process that Harris Corporation has developed from practical experience architecting space surveillance systems and ground systems. Examples are drawn from current work on documenting space situational awareness enterprises. The process is centered on DoDAF 2 and its corresponding meta-model so that terminology is standardized and communicable across any disciplines that know DoDAF architecting, including acquisition, engineering and sustainment disciplines. Each step provides a guideline for the type of data to collect, and also the appropriate views to generate. The steps include 1) determining the context of the enterprise, including active elements and high level capabilities or goals; 2) determining the desired effects of the capabilities and mapping capabilities against the project plan; 3) determining operational performers and their inter-relationships; 4) building information and data dictionaries; 5) defining resources associated with capabilities; 6) determining the operational behavior necessary to achieve each capability; 7) analyzing existing or planned implementations to determine systems, services and software; 8) cross-referencing system behavior to operational behavioral; 9) documenting system threads and functional implementations; and 10) creating any required textual documentation from the model.

  7. Microtechnology management considering test and cost aspects for stacked 3D ICs with MEMS

    NASA Astrophysics Data System (ADS)

    Hahn, K.; Wahl, M.; Busch, R.; Grünewald, A.; Brück, R.

    2018-01-01

    Innovative automotive systems require complex semiconductor devices currently only available in consumer-grade quality. The European project TRACE will develop and demonstrate methods, processes, and tools to facilitate the use of Consumer Electronics (CE) components so that they can be deployed more rapidly in the life-critical automotive domain. Consumer electronics increasingly use heterogeneous system integration methods and "More than Moore" technologies, which are capable of combining different circuit domains (analog, digital, RF, MEMS) and which are integrated within SiP or 3D stacks. Making these technologies, or at least some of their process steps, available under automotive electronics requirements is an important goal to keep pace with the growing demand for information processing within cars. The approach presented in this paper aims at a technology management and recommendation system that covers technology data, functional and non-functional constraints, and application scenarios, and that will incorporate test planning and cost consideration capabilities.

  8. The value and validation of broad spectrum biosensors for diagnosis and biodefense

    PubMed Central

    Metzgar, David; Sampath, Rangarajan; Rounds, Megan A; Ecker, David J

    2013-01-01

    Broad spectrum biosensors capable of identifying diverse organisms are transitioning from the realm of research into the clinic. These technologies simultaneously capture signals from a wide variety of biological entities using universal processes. Specific organisms are then identified through bioinformatic signature-matching processes. This is in contrast to currently accepted molecular diagnostic technologies, which utilize unique reagents and processes to detect each organism of interest. This paradigm shift greatly increases the breadth of molecular diagnostic tools with little increase in biochemical complexity, enabling simultaneous diagnostic, epidemiologic, and biothreat surveillance capabilities at the point of care. This, in turn, offers the promise of increased biosecurity and better antimicrobial stewardship. Efficient realization of these potential gains will require novel regulatory paradigms reflective of the generalized, information-based nature of these assays, allowing extension of empirical data obtained from readily available organisms to support broader reporting of rare, difficult to culture, or extremely hazardous organisms. PMID:24128433

  9. Developing Nationally Competitive NASA Research Capability in West Virginia

    NASA Technical Reports Server (NTRS)

    Calzonetti, Frank J.

    1997-01-01

    In May, 1995 West Virginia EPSCOR was awarded $150,000 to support activities to develop research capabilities in West Virginia in support of the National Aeronautics and Space Administration (NASA). These funds were used to support three projects: 1) Information Processing and the Earth Observing System, directed by Dr. Stuart Tewksbury of West Virginia University; 2) Development of Optical Materials for Atmospheric Sensing Experiments, directed by Dr. Nancy Giles of West Virginia University; and 3) Development of Doppler Global Velocimeter (DGV) for Aeronautical and Combustion Studies, directed by Dr. John Kuhlman of West Virginia University. The funding provides the means to develop capability in each of these areas. This report summarizes the technical accomplishments in each project supported under this award.

  10. Practice Evaluation Strategies Among Social Workers: Why an Evidence-Informed Dual-Process Theory Still Matters.

    PubMed

    Davis, Thomas D

    2017-01-01

    Practice evaluation strategies range in style from the formal-analytic tools of single-subject designs, rapid assessment instruments, algorithmic steps in evidence-informed practice, and computer software applications, to the informal-interactive tools of clinical supervision, consultation with colleagues, use of client feedback, and clinical experience. The purpose of this article is to provide practice researchers in social work with an evidence-informed theory that is capable of explaining both how and why social workers use practice evaluation strategies to self-monitor the effectiveness of their interventions in terms of client change. The author delineates the theoretical contours and consequences of what is called dual-process theory. Drawing on evidence-informed advances in the cognitive and social neurosciences, the author identifies among everyday social workers a theoretically stable, informal-interactive tool preference that is a cognitively necessary, sufficient, and stand-alone preference that requires neither the supplementation nor balance of formal-analytic tools. The author's delineation of dual-process theory represents a theoretical contribution in the century-old attempt to understand how and why social workers evaluate their practice the way they do.

  11. Corral framework: Trustworthy and fully functional data intensive parallel astronomical pipelines

    NASA Astrophysics Data System (ADS)

    Cabral, J. B.; Sánchez, B.; Beroiz, M.; Domínguez, M.; Lares, M.; Gurovich, S.; Granitto, P.

    2017-07-01

    Data processing pipelines represent an important slice of the astronomical software library and comprise chains of processes that transform raw data into valuable information via data reduction and analysis. In this work we present Corral, a Python framework for astronomical pipeline generation. Corral features a Model-View-Controller design pattern on top of an SQL relational database capable of handling custom data models, processing stages, and communication alerts, and it also provides automatic quality and structural metrics based on unit testing. The Model-View-Controller pattern provides a separation between the user logic and the data models, while delivering multiprocessing and distributed computing capabilities. Corral represents an improvement over commonly found data processing pipelines in astronomy, since the design pattern frees the programmer from dealing with processing-flow and parallelization issues, allowing them to focus on the specific algorithms needed for the successive data transformations, and at the same time provides a broad measure of quality over the created pipeline. Corral and working examples of pipelines that use it are available to the community at https://github.com/toros-astro.
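
    The following schematic suggests how the Model-View-Controller split plays out in such a pipeline: a model class describes the persisted data, and a step (controller) selects rows by state and transforms them. Class and attribute names are illustrative only; consult the repository above for Corral's real API.

    ```python
    # Schematic of the MVC separation in a pipeline framework (illustrative
    # names, not Corral's actual classes). The Model says what the data are;
    # the Step selects rows by state and transforms them; a framework would
    # run many such steps in parallel against a shared SQL database.
    from dataclasses import dataclass

    @dataclass
    class LightCurve:            # Model: persisted in the relational DB
        id: int
        state: str               # "raw" -> "calibrated" -> ...
        flux: list

    class CalibrateStep:         # Controller: one processing stage
        conditions = {"state": "raw"}          # which rows this step consumes

        def process(self, obj: LightCurve):
            obj.flux = [f - min(obj.flux) for f in obj.flux]  # toy reduction
            obj.state = "calibrated"           # hand off to the next stage

    db = [LightCurve(1, "raw", [5.0, 7.0, 6.0])]
    step = CalibrateStep()
    for row in [r for r in db
                if all(getattr(r, k) == v for k, v in step.conditions.items())]:
        step.process(row)
    print(db)
    ```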

  12. Learning classification with auxiliary probabilistic information

    PubMed Central

    Nguyen, Quang; Valizadegan, Hamed; Hauskrecht, Milos

    2012-01-01

    Finding ways of incorporating auxiliary information or auxiliary data into the learning process has been a topic of active data mining and machine learning research in recent years. In this work we study and develop a new framework for the classification learning problem in which, in addition to class labels, the learner is provided with auxiliary (probabilistic) information that reflects how strongly the expert feels about the class label. This approach can be extremely useful for many practical classification tasks that rely on subjective label assessment and where the cost of acquiring additional auxiliary information is negligible when compared to the cost of example analysis and labelling. We develop classification algorithms capable of using the auxiliary information to make the learning process more efficient in terms of sample complexity. We demonstrate the benefit of the approach on a number of synthetic and real-world data sets by comparing it to learning with class labels only. PMID:25309141
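
    One simple reading of auxiliary probabilistic information is as a per-example weight during fitting, so that confidently labelled examples count for more. The sketch below does this with scikit-learn's sample_weight; it illustrates the general idea rather than the authors' algorithm, and the confidence values are simulated.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)

    # Auxiliary probabilistic information: simulated expert confidence that
    # each label is correct -- high far from the class boundary, low near it,
    # so ambiguous examples are down-weighted during fitting.
    confidence = 1 / (1 + np.exp(-3 * np.abs(X[:, 0])))

    plain = LogisticRegression().fit(X, y)
    weighted = LogisticRegression().fit(X, y, sample_weight=confidence)
    print("plain    coef:", plain.coef_.round(2))
    print("weighted coef:", weighted.coef_.round(2))
    ```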

  13. Army Program Value Added Analysis 90-97 (VAA 90-97)

    DTIC Science & Technology

    1991-08-01

    affordability or duplication of capability. The AHP process appears to hold the greatest possibilities in this regard. 1-11. OTHER KEY FINDINGS a. The...to provide the logical skeleton in which to build an alternative's effectiveness value. The analytical hierarchy process (AHP) is particularly...likely to be, at first cut, very fuzzy. Thus, the issue clarification step is inherently iterative. As the analyst gathers more and more information in

  14. Atmospheric Release Advisory Capability Pilot Project at Two Nuclear Power Plants and Associated State Offices of Emergency Preparedness.

    DTIC Science & Technology

    1983-01-01

    assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process ...disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial products, process, or service...by trade name, trademark, manufacturer, or otherwise, does not necessarily constitute or imply its endorsement, recommendation, or favoring by the U

  15. Modelling health care processes for eliciting user requirements: a way to link a quality paradigm and clinical information system design.

    PubMed

    Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M

    2000-01-01

    Hospital information systems have to support quality improvement objectives. The design issues of health care information systems can be classified into three categories: 1) time-oriented and event-labelled storage of patient data; 2) contextual support of decision-making; 3) capabilities for modular upgrading. The elicitation of requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualize clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the field of blood transfusion. An object-oriented data model of a process has been defined in order to identify its main components: activity, sub-process, resources, constraints, guidelines, parameters and indicators. Although some aspects of activity, such as "where", "what else", and "why", are poorly represented by the data model alone, this method of requirement elicitation fits the dynamics of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for this approach to be generalised within the organisation, for the processes to be interrelated, and for their characteristics to be shared.
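
    The components of the object-oriented process model enumerated above (activity, sub-process, resources, constraints, guidelines, parameters, indicators) can be made concrete with a short sketch; the field names simply mirror the list in the abstract, and the transfusion example values are invented.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Activity:
    """One unit of work inside a care process."""
    name: str
    resources: List[str] = field(default_factory=list)    # staff, devices
    constraints: List[str] = field(default_factory=list)  # e.g. regulations
    guidelines: List[str] = field(default_factory=list)   # applicable protocols
    parameters: List[str] = field(default_factory=list)   # data to capture
    indicators: List[str] = field(default_factory=list)   # quality metrics

@dataclass
class Process:
    """A care process: its activities plus any nested sub-processes."""
    name: str
    activities: List[Activity] = field(default_factory=list)
    sub_processes: List["Process"] = field(default_factory=list)

# Invented example from the paper's application field, blood transfusion.
transfusion = Process(
    name="blood transfusion",
    activities=[Activity(
        name="pre-transfusion check",
        resources=["nurse", "bedside scanner"],
        constraints=["two-person identity verification"],
        parameters=["unit_id", "patient_id"],
        indicators=["% of units traced end-to-end"],
    )],
)
print(transfusion.activities[0].name)
```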

  16. Experiences using Visualization Techniques to Present Requirements, Risks to Them, and Options for Risk Mitigation

    NASA Technical Reports Server (NTRS)

    Feather, Martin S.; Cornford, Steven L.; Kiper, James D.; Menzies, Tim

    2006-01-01

    For several years we have been employing a risk-based decision process to guide development and application of advanced technologies, and for research and technology portfolio planning. The process is supported by custom software, in which visualization plays an important role. During requirements gathering, visualization is used to help scrutinize the status (completeness, extent) of the information. During decision making based on the gathered information, visualization is used to help decision-makers understand the space of options and their consequences. In this paper we summarize the visualization capabilities that we have employed, indicating when and how they have proven useful.

  17. Scheduling Onboard Processing for the Proposed HyspIRI Mission

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Mclaren, David; Rabideau, Gregg; Mandl, Daniel; Hengemihle, Jerry

    2011-01-01

    The proposed HyspIRI mission is evaluating an X-band Direct Broadcast (DB) capability that would enable data to be delivered to ground stations virtually as they are acquired. However, the HyspIRI VSWIR and TIR instruments will produce 1 Gbps of data while the DB capability is 15 Mbps, a 60x oversubscription. In order to address this data volume mismatch, a DB concept has been developed that determines which data to downlink based on both: 1. the type of surface the spacecraft is overflying, and 2. onboard processing of the data to detect events. For example, when the spacecraft is overflying polar regions it might downlink a snow/ice product. Additionally, the onboard software will search for thermal signatures indicative of a volcanic event or wildfire and downlink summary information (extent, spectra) when detected. The process of determining which products to generate when, based on request prioritization and onboard processing and downlink constraints, is inherently a prioritized scheduling problem; we describe work to develop an automated solution to this problem.
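
    At its core, the product-selection step described here is a greedy prioritized fill of the downlink budget. The sketch below shows that formulation with invented product names, priorities, and sizes; the actual HyspIRI scheduler is more elaborate than this.

```python
def schedule_downlink(candidates, capacity_mbits):
    """Greedy prioritized scheduling: take products in priority order
    (1 = highest) until the downlink budget for the pass is exhausted."""
    chosen, used = [], 0.0
    for name, priority, size in sorted(candidates, key=lambda c: c[1]):
        if used + size <= capacity_mbits:
            chosen.append(name)
            used += size
    return chosen, used

# Invented example: an onboard-detected volcanic event outranks routine products.
candidates = [
    ("volcano_summary", 1, 120.0),
    ("snow_ice_product", 2, 400.0),
    ("raw_swath_chunk", 3, 900.0),
]
print(schedule_downlink(candidates, capacity_mbits=600.0))
```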

  18. An intelligent approach to welding robot selection

    NASA Astrophysics Data System (ADS)

    Milano, J.; Mauk, S. D.; Flitter, L.; Morris, R.

    1993-10-01

    In a shipyard where multiple stationary and mobile workcells are employed in the fabrication of components of complex sub-assemblies, efficient operation requires an intelligent method of scheduling jobs and selecting workcells based on optimum throughput and cost. Achieving this global solution requires the successful organization of resource availability, process requirements, and process constraints. The Off-line Planner (OLP) of the Programmable Automated Weld System (PAWS) is capable of advanced modeling of weld processes and environments as well as the generation of complete weld procedures. These capabilities involve the integration of advanced Computer Aided Design (CAD), path planning, and obstacle detection and avoidance techniques, as well as the synthesis of complex design and process information. These existing capabilities provide the basis of the functionality required for the successful implementation of an intelligent weld robot selector and material flow planner. Current efforts are focused on robot selection via the dynamic routing of components to the appropriate workcells. It is proposed that this problem is a variant of the “Traveling Salesman Problem” (TSP), which has been proven to belong to a larger set of optimization problems termed nondeterministic polynomial complete (NP-complete). In this paper, a heuristic approach utilizing recurrent neural networks is explored as a rapid means of producing a near-optimal, if not optimal, weld robot selection.
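
    The paper's approach uses a recurrent (Hopfield-style) neural network; purely to make the TSP-like routing formulation concrete, the sketch below swaps in a much simpler nearest-neighbor heuristic over invented workcell coordinates.

```python
import math

def nearest_neighbor_route(cells, start):
    """Nearest-neighbor heuristic for routing a component through workcells:
    repeatedly visit the closest not-yet-visited cell."""
    route, current = [start], start
    unvisited = set(cells) - {start}
    while unvisited:
        nxt = min(unvisited, key=lambda c: math.dist(cells[current], cells[c]))
        route.append(nxt)
        current = nxt
        unvisited.remove(nxt)
    return route

# Invented shop-floor coordinates for a staging area and three workcells.
cells = {"staging": (0, 0), "cell_A": (3, 1), "cell_B": (1, 4), "cell_C": (5, 5)}
print(nearest_neighbor_route(cells, "staging"))
```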

  19. Study of Fluid Experiment System (FES)/CAST/Holographic Ground System (HGS)

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Cummings, Rick; Jones, Brian

    1992-01-01

    The use of holographic and schlieren optical techniques for studying concentration gradients in solidification processes has been employed by several investigators over the years. The HGS facility at MSFC has been a primary resource in researching this capability. Consequently, scientific personnel have been able to utilize these techniques in both ground-based research and in space experiments. An important event in the scientific utilization of the HGS facilities was the TGS crystal growth and the casting and solidification technology (CAST) experiments that were flown on the International Microgravity Laboratory (IML) mission in March of this year. The preparation and processing of these space observations are the primary experiments reported in this work. This project provides ground-based studies to optimize the holographic techniques used to acquire information about the crystal growth processes flown on IML. Since the ground-based studies will be compared with the space-based experimental results, it is necessary to conduct sufficient ground-based studies to best determine how the experiment worked in space. The current capabilities in computer-based systems for image processing and numerical computation have assisted in those efforts. As anticipated, this study has shown that these advanced computing capabilities are helpful in the data analysis of such experiments.

  20. Aerobraking Maneuver (ABM) Report Generator

    NASA Technical Reports Server (NTRS)

    Fisher, Forrest; Gladden, Roy; Khanampornpan, Teerapat

    2008-01-01

    abmREPORT Version 3.1 is a Perl script that extracts vital summarization information from the Mars Reconnaissance Orbiter (MRO) aerobraking ABM build process. This information facilitates sequence reviews and provides a high-level summarization of the sequence for mission management. The script extracts information from the ENV, SSF, FRF, SCMFmax, and OPTG files and from burn magnitude configuration files, and presents it in a single, easy-to-check report that provides the majority of the parameters necessary for cross-check and verification during the sequence review process. This means that needed information, formerly spread across a number of different files, each in a different format, is all available in this one application. This program was built on the capabilities developed in dragReport, and the two scripts then evolved as the tools continued to be developed in parallel.

  1. Evaluation of APREQ/CFR Coordination Procedures for Charlotte Douglas International Airport

    NASA Technical Reports Server (NTRS)

    Stevens, Lindsay K. S.; Parke, Bonny K.; Chevalley, Eric; Lee, Hanbong; Martin, Lynne H.; Jobe, Kimberly K.; Verma, Savita A.; Dulchinos, Victoria Lee

    2017-01-01

    NASA has been collaborating with the Federal Aviation Administration (FAA) and aviation industry partners to develop and demonstrate new concepts and technologies for Integrated Arrival, Departure, and Surface (IADS) traffic management capabilities under the Airspace Technology Demonstration 2 (ATD-2) project. One of the goals of the IADS capabilities in the ATD-2 project is to increase predictability and throughput by improving TMI compliance. The IADS capabilities that will impact TMI compliance are built upon previous NASA research, the Precision Departure Release Capability (PDRC). The proposed paper will evaluate the APREQ/CFR process between the ATC Tower and Center, and information sharing between the ATC Tower and the airline Ramp tower. Subjective measures collected from the HITL surveys (e.g., workload, situational awareness, acceptability, usability) and performance metrics such as TMI, TMAT, and pushback advisory compliance from APREQ/CFR flights will be reported.

  2. NASA's Applied Sciences: Natural Disasters Program

    NASA Technical Reports Server (NTRS)

    Kessler, Jason L.

    2010-01-01

    Fully utilize current and near-term airborne and spaceborne assets and capabilities. NASA spaceborne instruments are for research but can be applied to natural disaster response as appropriate. NASA airborne instruments can be targeted specifically for disaster response. Could impact research programs. Better flow of information improves disaster response. Catalog capability, product, applicable disaster, points of contact. Ownership needs to come from the highest level of NASA - unpredictable and irregular nature of disasters requires contingency funding for disaster response. Build-in transfer of applicable natural disaster research capabilities to operational functionality at other agencies (e.g., USFS, NOAA, FEMA...) at the outset, whenever possible. For the Decadal Survey Missions, opportunities exist to identify needs and requirements early in the mission design process. Need to understand additional needs and commitments for meeting the needs of the disaster community. Opportunity to maximize disaster response and mitigation from the Decadal Survey Missions. Additional needs or capabilities may require agency contributions.

  3. A Context-Aware Model to Provide Positioning in Disaster Relief Scenarios

    PubMed Central

    Moreno, Daniel; Ochoa, Sergio F.; Meseguer, Roc

    2015-01-01

    The effectiveness of the work performed during disaster relief efforts is highly dependent on the coordination of activities conducted by the first responders deployed in the affected area. Such coordination, in turn, depends on an appropriate management of geo-referenced information. Therefore, enabling first responders to count on positioning capabilities during these activities is vital to increase the effectiveness of the response process. The positioning methods used in this scenario must assume a lack of infrastructure-based communication and electrical energy, which usually characterizes affected areas. Although positioning systems such as the Global Positioning System (GPS) have been shown to be useful, we cannot assume that all devices deployed in the area (or most of them) will have positioning capabilities by themselves. Typically, many first responders carry devices that are not capable of performing positioning on their own, but that require such a service. In order to help increase the positioning capability of first responders in disaster-affected areas, this paper presents a context-aware positioning model that allows mobile devices to estimate their position based on information gathered from their surroundings. The performance of the proposed model was evaluated using simulations, and the obtained results show that mobile devices without positioning capabilities were able to use the model to estimate their position. Moreover, the accuracy of the positioning model has been shown to be suitable for conducting most first response activities. PMID:26437406
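
    One minimal instance of estimating position from surrounding devices, not the paper's full context-aware model, is a weighted centroid of positions reported by GPS-capable neighbors, weighting each by received signal strength. All readings below are invented.

```python
def weighted_centroid(neighbors):
    """Estimate a device's position as the weighted centroid of positions
    reported by neighbors in radio range.
    neighbors: list of (x, y, weight), weight growing with signal strength."""
    total = sum(w for _, _, w in neighbors)
    cx = sum(x * w for x, _, w in neighbors) / total
    cy = sum(y * w for _, y, w in neighbors) / total
    return cx, cy

# Invented readings from three GPS-capable first responders nearby.
print(weighted_centroid([(10.0, 20.0, 0.8), (14.0, 18.0, 0.5), (12.0, 25.0, 0.2)]))
```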

  4. Multisource information fusion applied to ship identification for the recognized maritime picture

    NASA Astrophysics Data System (ADS)

    Simard, Marc-Alain; Lefebvre, Eric; Helleur, Christopher

    2000-04-01

    The Recognized Maritime Picture (RMP) is defined as a composite picture of activity over a maritime area of interest. In simplistic terms, building an RMP comes down to finding whether an object of interest, a ship in our case, is there or not, determining what it is, determining what it is doing, and determining if some type of follow-on action is required. The Canadian Department of National Defence currently has access to, or may in the near future have access to, a number of civilian, military and allied information or sensor systems to accomplish these purposes. These systems include automatic self-reporting positional systems, air patrol surveillance systems, high frequency surface radars, electronic intelligence systems, radar space systems and high frequency direction finding sensors. The ability to make full use of these systems is limited by the existing capability to fuse data from all sources in a timely, accurate and complete manner. This paper presents an information fusion system under development that correlates and fuses these information and sensor data sources. This fusion system, named the Adaptive Fuzzy Logic Correlator, correlates the information in batch but fuses and constructs ship tracks sequentially. It applies standard Kalman filter techniques and fuzzy logic correlation techniques. We propose a set of recommendations that should improve the ship identification process. In particular, it is proposed to utilize as many non-redundant sources of information as possible that address specific vessel attributes. Another important recommendation states that the information fusion and data association techniques should be capable of dealing with incomplete and imprecise information. Some fuzzy logic techniques capable of tolerating imprecise and dissimilar data are proposed.
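
    The abstract names standard Kalman filtering as the track-maintenance workhorse. For reference, here is the textbook one-dimensional predict/update cycle; this is a generic sketch with illustrative noise values, not the Adaptive Fuzzy Logic Correlator itself.

```python
def kalman_1d(x, P, z, Q=1e-3, R=0.5):
    """One predict/update cycle of a 1-D Kalman filter.
    x, P: prior state estimate and its variance; z: new measurement;
    Q, R: process and measurement noise variances (illustrative values)."""
    P = P + Q                  # predict: uncertainty grows between reports
    K = P / (P + R)            # Kalman gain: how much to trust the measurement
    x = x + K * (z - x)        # update: blend prediction and measurement
    P = (1.0 - K) * P
    return x, P

x, P = 0.0, 1.0
for z in [1.1, 0.9, 1.05, 0.98]:   # invented noisy position reports for a ship
    x, P = kalman_1d(x, P, z)
print(round(x, 3), round(P, 3))
```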

  5. Advanced Query and Data Mining Capabilities for MaROS

    NASA Technical Reports Server (NTRS)

    Wang, Paul; Wallick, Michael N.; Allard, Daniel A.; Gladden, Roy E.; Hy, Franklin H.

    2013-01-01

    The Mars Relay Operational Service (MaROS) comprises a number of tools to coordinate, plan, and visualize various aspects of the Mars Relay network. These tools are organized in several architectural levels: a Web-based user interface, a back-end "ReSTlet" built in Java, and databases that store the data as they are received from the network. As part of MaROS, the innovators have developed and implemented a feature set that operates on several levels of the software architecture. This new feature is an advanced querying capability, available through either the Web-based user interface or a back-end REST interface, for accessing all of the data gathered from the network. This software is not meant to replace the REST interface, but to augment and expand the range of available data. The current REST interface provides specific data that are used by the MaROS Web application to display and visualize the information; however, the information returned from the REST interface has typically been pre-processed to return only a subset of the entire repository, particularly the information that is of interest to the GUI (graphical user interface). The new, advanced query and data mining capabilities allow users to retrieve the raw data and/or to perform their own data processing. The query language used to access the repository is a restricted subset of the structured query language (SQL) that can be built safely from the Web user interface, or entered as freeform SQL by a user. The results are returned in CSV (Comma Separated Values) format for easy export to third-party tools and applications that can be used for data mining or user-defined visualization and interpretation. This is the first time that a service is capable of providing access to all cross-project relay data from a single Web resource. Because MaROS contains data for a variety of missions from the Mars network, spanning both NASA and ESA, the software also establishes an access control list (ACL) on each data record in the database repository to enforce user access permissions through a multilayered approach.
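
    A hedged sketch of the two mechanisms named above: validating a restricted SQL subset before execution, and returning results as CSV. The validation rule shown (a single plain SELECT, no comments) is an assumption for illustration, not MaROS's actual grammar, and the relay table is invented.

```python
import csv, io, re, sqlite3

def run_restricted_query(conn, sql):
    """Reject anything but a single plain SELECT, then return results as CSV."""
    if "--" in sql or not re.fullmatch(r"\s*SELECT\b[^;]*;?\s*", sql, re.IGNORECASE):
        raise ValueError("only single SELECT statements are allowed")
    cur = conn.execute(sql)
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow([col[0] for col in cur.description])   # header row
    writer.writerows(cur.fetchall())
    return buf.getvalue()

# Invented relay-pass table, standing in for the real repository.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE passes (orbiter TEXT, lander TEXT, mbits REAL)")
conn.execute("INSERT INTO passes VALUES ('MRO', 'MSL', 350.0)")
print(run_restricted_query(conn, "SELECT orbiter, mbits FROM passes"))
```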

  6. Associations between structural capabilities of primary care practices and performance on selected quality measures.

    PubMed

    Friedberg, Mark W; Coltin, Kathryn L; Safran, Dana Gelb; Dresser, Marguerite; Zaslavsky, Alan M; Schneider, Eric C

    2009-10-06

    Recent proposals to reform primary care have encouraged physician practices to adopt such structural capabilities as performance feedback and electronic health records. Whether practices with these capabilities have higher performance on measures of primary care quality is unknown. To measure associations between structural capabilities of primary care practices and performance on commonly used quality measures. Cross-sectional analysis. Massachusetts. 412 primary care practices. During 2007, 1 physician from each participating primary care practice (median size, 4 physicians) was surveyed about structural capabilities of the practice (responses representing 308 practices were obtained). Data on practice structural capabilities were linked to multipayer performance data on 13 Healthcare Effectiveness Data and Information Set (HEDIS) process measures in 4 clinical areas: screening, diabetes, depression, and overuse. Frequently used multifunctional electronic health records were associated with higher performance on 5 HEDIS measures (3 in screening and 2 in diabetes), with statistically significant differences in performance ranging from 3.1 to 7.6 percentage points. Frequent meetings to discuss quality were associated with higher performance on 3 measures of diabetes care (differences ranging from 2.3 to 3.1 percentage points). Physician awareness of patient experience ratings was associated with higher performance on screening for breast cancer and cervical cancer (1.9 and 2.2 percentage points, respectively). No other structural capabilities were associated with performance on more than 1 measure. No capabilities were associated with performance on depression care or overuse. Structural capabilities of primary care practices were assessed by physician survey. Among the investigated structural capabilities of primary care practices, electronic health records were associated with higher performance across multiple HEDIS measures. Overall, the modest magnitude and limited number of associations between structural capabilities and clinical performance suggest the importance of continuing to measure the processes and outcomes of care for patients. The Commonwealth Fund.

  7. Situational awareness in the commercial aircraft cockpit - A cognitive perspective

    NASA Technical Reports Server (NTRS)

    Adams, Marilyn J.; Pew, Richard W.

    1990-01-01

    A cognitive theory is presented that has relevance for the definition and assessment of situational awareness in the cockpit. The theory asserts that maintenance of situation awareness is a constructive process that demands mental resources in competition with ongoing task performance. Implications of this perspective for assessing and improving situational awareness are discussed. It is concluded that the goal of inserting advanced technology into any system is that it results in an increase in the effectiveness, timeliness, and safety with which the system's activities can be accomplished. The inherent difficulties of the multitask situation are very often compounded by the introduction of automation. To maximize situational awareness, the dynamics and capabilities of such technologies must be designed with thorough respect for the dynamics and capabilities of human information-processing.

  8. Personalised physical exercise regime for chronic patients through a wearable ICT platform.

    PubMed

    Angelidis, Pantelis A

    2010-01-01

    Today's state of the art in exercise physiology, professional athletics and sports practice in general clearly shows that the best results depend on the personalisation and continuous update of the recommendations provided to an athlete in training, a sports enthusiast or a person whose medical condition demands regular physical exercise. The vital signs information gathered in telemonitoring systems can be better evaluated and exploited if processed along with data from the subject's electronic health records, training history and performance statistics. In this context, the current paper intends to exploit modern smart miniaturised systems and advanced information systems towards the development of an infrastructure for continuous, non-invasive acquisition and advanced processing of vital signs information. In particular, it will look into wearable electronics embedded in textiles, capable of performing regular or exceptional measurements of vital physiological parameters and communicating them to an application server for further processing.

  9. Coding principles of the canonical cortical microcircuit in the avian brain

    PubMed Central

    Calabrese, Ana; Woolley, Sarah M. N.

    2015-01-01

    Mammalian neocortex is characterized by a layered architecture and a common or “canonical” microcircuit governing information flow among layers. This microcircuit is thought to underlie the computations required for complex behavior. Despite the absence of a six-layered cortex, birds are capable of complex cognition and behavior. In addition, the avian auditory pallium is composed of adjacent information-processing regions with genetically identified neuron types and projections among regions comparable with those found in the neocortex. Here, we show that the avian auditory pallium exhibits the same information-processing principles that define the canonical cortical microcircuit, long thought to have evolved only in mammals. These results suggest that the canonical cortical microcircuit evolved in a common ancestor of mammals and birds and provide a physiological explanation for the evolution of neural processes that give rise to complex behavior in the absence of cortical lamination. PMID:25691736

  10. The automated Army ROTC Questionnaire (ARQ)

    NASA Technical Reports Server (NTRS)

    Young, David L. H.

    1991-01-01

    The Reserve Officer Training Corps Cadet Command (ROTCCC) takes applications for its officer training program from college students and Army enlisted personnel worldwide. Each applicant is required to complete a set of application forms prior to acceptance into the ROTC program. These forms are covered by several regulations that govern the eligibility of potential applicants and guide the applicant through the application process. Eligibility criteria change as Army regulations are periodically revised, and outdated information results in a loss of applications attributable to frustration and error. ROTCCC asked for an inexpensive and reliable way of automating its application process. After reviewing the process, it was determined that an expert system with good end-user interface capabilities could be used to solve a large part of the problem. The system captures the knowledge contained within the regulations, enables the quick distribution and implementation of eligibility criteria changes, and distributes the expertise of the admissions personnel to the education centers and colleges. The expert system uses a modified version of CLIPS that was streamlined to make the most efficient use of its capabilities. A user interface with windowing capabilities provides the applicant with a simple and effective way to input his/her personal data.
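
    The rules-as-data idea behind the CLIPS expert system can be shown in a few lines: when eligibility criteria live in a table, a regulation change becomes a table edit rather than a program rewrite. The criteria below are wholly invented, not actual ROTC regulations, and this plain-Python sketch is not the CLIPS implementation.

```python
# Each rule is (description, predicate over an applicant record).
# Criteria are invented examples; the point is that rules live in data.
RULES = [
    ("minimum age 17", lambda a: a["age"] >= 17),
    ("maximum age 27", lambda a: a["age"] <= 27),
    ("US citizen", lambda a: a["citizen"]),
    ("GPA at least 2.5", lambda a: a["gpa"] >= 2.5),
]

def check_eligibility(applicant):
    """Return the list of failed rules (an empty list means eligible)."""
    return [desc for desc, ok in RULES if not ok(applicant)]

failures = check_eligibility({"age": 19, "citizen": True, "gpa": 2.1})
print(failures or "eligible")     # -> ['GPA at least 2.5']
```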

  11. Human Processing of Knowledge from Texts: Acquisition, Integration, and Reasoning.

    ERIC Educational Resources Information Center

    Thorndyke, Perry W.; Hayes-Roth, Barbara

    This report documents a series of studies on how undergraduate students learn from and reason with textual information. The studies described were undertaken to produce models that could serve as the basis for designing computer systems capable of structuring and presenting text material in optimal formats. Divided into sections, the report…

  12. MIRADS-2 user's manual

    NASA Technical Reports Server (NTRS)

    1975-01-01

    An on-line data storage and retrieval system which allows the user to extract and process information from stored data bases is described. The capabilities of the system are provided by a general purpose computer program containing several functional modules. The modules contained in MIRADS are briefly described along with user terminal operation procedures and MIRADS commands.

  13. Social Information in Court Decisions of Compulsory Child Adoption in Israel

    ERIC Educational Resources Information Center

    Ben-David, Vered

    2011-01-01

    Ambiguity over the concepts of "parental capability" and "the child's best interests" in the Israeli adoption law, and a lack of sufficient professional knowledge can lead to bias in the professional decision-making process regarding child adoption. This study investigates the idea that judges do not use only legal…

  14. A Computational Account of Children's Analogical Reasoning: Balancing Inhibitory Control in Working Memory and Relational Representation

    ERIC Educational Resources Information Center

    Morrison, Robert G.; Doumas, Leonidas A. A.; Richland, Lindsey E.

    2011-01-01

    Theories accounting for the development of analogical reasoning tend to emphasize either the centrality of relational knowledge accretion or changes in information processing capability. Simulations in LISA (Hummel & Holyoak, 1997, 2003), a neurally inspired computer model of analogical reasoning, allow us to explore how these factors may…

  15. Comparative Minicolumnar Morphometry of Three Distinguished Scientists

    ERIC Educational Resources Information Center

    Casanova, Manuel F.; Switala, Andrew E.; Trippe, Juan; Fitzgerald, Michael

    2007-01-01

    It has been suggested that the cell minicolumn is the smallest module capable of information processing within the brain. In this case series, photomicrographs of six regions of interests (Brodmann areas 4, 9, 17, 21, 22, and 40) were analyzed by computerized image analysis for minicolumnar morphometry in the brains of three distinguished…

  16. A Framework for Mobile Apps in Colleges and Universities: Data Mining Perspective

    ERIC Educational Resources Information Center

    Singh, Archana; Ranjan, Jayanthi

    2016-01-01

    The Enterprise mobility communication technology provides easy and quick accessibility to data and information integrated into one single touch point device. This device incorporates or integrates all the processes into small applications or App and thus increases the workforce capability of knowledge workers. "App" which is a small set…

  17. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    USGS Publications Warehouse

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC’s worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output. The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC’s 24/7 analyst staff with a suite of seismic analysis graphical user interfaces. In addition to the NEIC’s monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC’s quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.

  18. Biological complexity and adaptability of simple mammalian olfactory memory systems.

    PubMed

    Brennan, P; Keverne, E B

    2015-03-01

    Chemosensory systems play vital roles in the lives of most mammals, including the detection and identification of predators, as well as sex and reproductive status and the identification of individual conspecifics. All of these capabilities require a process of recognition involving a combination of innate (kairomonal/pheromonal) and learned responses. Across very different phylogenies, the mechanisms for pheromonal and odour learning have much in common. They are frequently associated with plasticity of GABA-ergic feedback at the initial level of processing the chemosensory information, which enhances its pattern separation capability. Association of odourant features into an odour object primarily involves the anterior piriform cortex for non-social odours. However, the medial amygdala appears to be involved in both the recognition of social odours and their association with chemosensory information sensed by the vomeronasal system. Unusually, not only the sensory neurons themselves but also the GABA-ergic interneurons in the olfactory bulb are continually being replaced, with implications for the induction and maintenance of learned chemosensory responses.

  19. Digital and biological computing in organizations.

    PubMed

    Kampfner, Roberto R

    2002-01-01

    Michael Conrad unveiled many of the fundamental characteristics of biological computing. Underlying the behavioral variability and the adaptability of biological systems are these characteristics, including the ability of biological information processing to exploit quantum features at the atomic level, the powerful 3-D pattern recognition capabilities of macromolecules, the computational efficiency, and the ability to support biological function. Among many other things, Conrad formalized and explicated the underlying principles of biological adaptability, characterized the differences between biological and digital computing in terms of a fundamental tradeoff between adaptability and programmability of information processing, and discussed the challenges of interfacing digital computers and human society. This paper is about the encounter of biological and digital computing. The focus is on the nature of the biological information processing infrastructure of organizations and how it can be extended effectively with digital computing. In order to achieve this goal effectively, however, we need to properly embed digital computing into the information processing aspects of human and social behavior and intelligence, which are fundamentally biological. Conrad's legacy provides a firm, strong, and inspiring foundation for this endeavor.

  20. A Contrast in Use of Metrics in Earth Science Data Systems

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram; Behnke, Jeanne; Hines-Watts, Tonjua

    2007-01-01

    In recent years there has been a surge in the number of systems for processing, archiving and distributing remotely sensed data. Such systems, working independently as well as in collaboration, have been contributing greatly to the advances in the scientific understanding of the Earth system, as well as utilization of the data for nationally and internationally important applications. Among such systems, we consider those that are developed by or under the sponsorship of NASA to fulfill one of its strategic objectives: "Study Earth from space to advance scientific understanding and meet societal needs." NASA's Earth science data systems are of varying size and complexity depending on the requirements they are intended to meet. Some data systems are regarded as NASA's "Core Capabilities" that provide the basic infrastructure for processing, archiving and distributing a set of data products to a large and diverse user community in a robust and reliable manner. Other data systems constitute "Community Capabilities". These provide specialized and innovative services to data users and/or research products offering new scientific insight. Such data systems are generally supported by NASA through peer reviewed competition. Examples of Core Capabilities are 1. Earth Observing Data and Information System (EOSDIS) with its Distributed Active Archive Centers (DAACs), Science Investigator-led Processing Systems (SIPSs), and the EOS Clearing House (ECHO); 2. Tropical Rainfall Measurement Mission (TRMM) Science Data and Information System (TSDIS); 3. Ocean Data Processing System (ODPS); and 4. CloudSat Data Processing Center. Examples of Community Capabilities are projects under the Research, Education and Applications Solutions Network (REASoN), and Advancing Collaborative Connections for Earth System Science (ACCESS) Programs. In managing these data system capabilities, it is necessary to have well-established goals and to measure progress relative to them. Progress is measured through "metrics", which can be a combination of quantitative as well as qualitative assessments. The specific metrics of interest depend on the user of the metrics as well as the type of data system. The users of metrics can be data system managers, program managers, a funding agency, or the public. Data system managers need metrics for assessing and improving the performance of the system and for future planning. Program managers need metrics to assess progress and the value of the data systems sponsored by them. Also, there is a difference between the metrics needed for core capabilities, which tend to be more complex, larger and longer-term, and those needed for community capabilities, which tend to be simpler, smaller and shorter-term. Even among community capabilities there are differences; hence the same set of metrics does not apply to all. Some provide data products to users, some provide services that enable better utilization of data or interoperability among other systems, and some are a part of a larger project where provision of data or services is only a minor activity. There is also a contrast between metrics used for internal and external purposes. Examples of internal purposes are: ensuring that the system meets its requirements, and planning for evolution and growth. Examples of external purposes are: providing to sponsors indicators of success of the systems, demonstrating the contributions of the system to overall program success, etc.
This paper will consider the EOSDIS, REASoN and ACCESS programs to show the various types of metrics needed and how they need to be tailored to the types of data systems while maintaining the overall management goals of measuring progress and contributions made by the data systems.

  1. A Contrast in Use of Metrics in Earth Science Data Systems

    NASA Astrophysics Data System (ADS)

    Ramapriyan, H. K.; Behnke, J.; Hines-Watts, T. M.

    2007-12-01

    In recent years there has been a surge in the number of systems for processing, archiving and distributing remotely sensed data. Such systems, working independently as well as in collaboration, have been contributing greatly to the advances in the scientific understanding of the Earth system, as well as utilization of the data for nationally and internationally important applications. Among such systems, we consider those that are developed by or under the sponsorship of NASA to fulfill one of its strategic objectives: "Study Earth from space to advance scientific understanding and meet societal needs." NASA's Earth science data systems are of varying size and complexity depending on the requirements they are intended to meet. Some data systems are regarded as NASA's Core Capabilities that provide the basic infrastructure for processing, archiving and distributing a set of data products to a large and diverse user community in a robust and reliable manner. Other data systems constitute Community Capabilities. These provide specialized and innovative services to data users and/or research products offering new scientific insight. Such data systems are generally supported by NASA through peer reviewed competition. Examples of Core Capabilities are 1. Earth Observing Data and Information System (EOSDIS) with its Distributed Active Archive Centers (DAACs), Science Investigator-led Processing Systems (SIPSs), and the EOS Clearing House (ECHO); 2. Tropical Rainfall Measurement Mission (TRMM) Science Data and Information System (TSDIS); 3. Ocean Data Processing System (ODPS); and 4. CloudSat Data Processing Center. Examples of Community Capabilities are projects under the Research, Education and Applications Solutions Network (REASoN), and Advancing Collaborative Connections for Earth System Science (ACCESS) Programs. In managing these data system capabilities, it is necessary to have well-established goals and to measure progress relative to them. Progress is measured through metrics, which can be a combination of quantitative as well as qualitative assessments. The specific metrics of interest depend on the user of the metrics as well as the type of data system. The users of metrics can be data system managers, program managers, a funding agency, or the public. Data system managers need metrics for assessing and improving the performance of the system and for future planning. Program managers need metrics to assess progress and the value of the data systems sponsored by them. Also, there is a difference between the metrics needed for core capabilities, which tend to be more complex, larger and longer-term, and those needed for community capabilities, which tend to be simpler, smaller and shorter-term. Even among community capabilities there are differences; hence the same set of metrics does not apply to all. Some provide data products to users, some provide services that enable better utilization of data or interoperability among other systems, and some are a part of a larger project where provision of data or services is only a minor activity. There is also a contrast between metrics used for internal and external purposes. Examples of internal purposes are: ensuring that the system meets its requirements, and planning for evolution and growth. Examples of external purposes are: providing to sponsors indicators of success of the systems, demonstrating the contributions of the system to overall program success, etc.
This paper will consider the EOSDIS, REASoN and ACCESS programs to show the various types of metrics needed and how they need to be tailored to the types of data systems while maintaining the overall management goals of measuring progress and contributions made by the data systems.

  2. Active glass-type human augmented cognition system considering attention and intention

    NASA Astrophysics Data System (ADS)

    Kim, Bumhwi; Ojha, Amitash; Lee, Minho

    2015-10-01

    Human cognition is the result of an interaction of several complex cognitive processes with limited capabilities. Therefore, the primary objective of human cognitive augmentation is to assist and expand these limited human cognitive capabilities, independently or together. In this study, we propose a glass-type human augmented cognition system which attempts to actively assist human memory functions by providing relevant, necessary and intended information while constantly assessing the intention of the user. To achieve this, we exploit selective attention and intention processes. Although the system can be used in various real-life scenarios, we test its performance in a person identification scenario. To detect the intended face, the system analyses gaze points and changes in pupil size to determine the intention of the user. Gaze points and a change in pupil size together indicate that the user intends to know the identity of, and information about, the person in question. The system then retrieves several clues through a speech recognition system, retrieves relevant information about the face, and finally displays it through a head-mounted display. We present the performance of several components of the system. Our results show that active, relevant assistance based on the user's intention significantly enhances memory functions.
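
    The intention test implied above can be sketched as a simple conjunction: a sustained fixation on a face combined with pupil dilation beyond baseline triggers identity retrieval. The thresholds below are invented for illustration; the abstract does not specify the system's actual classifier.

```python
def intends_identification(fixation_ms, pupil_mm, baseline_mm,
                           min_fixation_ms=800, dilation_ratio=1.10):
    """Heuristic intention test: the user is implicitly asking "who is that?"
    when gaze dwells on a face long enough AND the pupil dilates past baseline.
    Thresholds are illustrative assumptions, not values from the paper."""
    dwelling = fixation_ms >= min_fixation_ms
    dilated = pupil_mm >= baseline_mm * dilation_ratio
    return dwelling and dilated

print(intends_identification(fixation_ms=950, pupil_mm=4.6, baseline_mm=4.0))  # True
print(intends_identification(fixation_ms=300, pupil_mm=4.6, baseline_mm=4.0))  # False
```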

  3. Supporting users through integrated retrieval, processing, and distribution systems at the land processes distributed active archive center

    USGS Publications Warehouse

    Kalvelage, T.; Willems, Jennifer

    2003-01-01

    The design of the EOS Data and Information System (EOSDIS) to acquire, archive, manage and distribute Earth observation data to the broadest possible user community is discussed. Several integrated retrieval, processing and distribution capabilities are explained, the value of these functions to users is described, and potential future improvements are laid out. Users are interested in having the retrieval, processing and archiving systems integrated so that they can get the data they want in the format and through the delivery mechanism of their choice.

  4. Double ionization in R-matrix theory using a two-electron outer region

    NASA Astrophysics Data System (ADS)

    Wragg, Jack; Parker, J. S.; van der Hart, H. W.

    2015-08-01

    We have developed a two-electron outer region for use within R-matrix theory to describe double ionization processes. The capability of this method is demonstrated for single-photon double ionization of He in the photon energy region between 80 and 180 eV. The cross sections are in agreement with established data. The extended R-matrix with time dependence method also provides information on higher-order processes, as demonstrated by the identification of signatures for sequential double ionization processes involving an intermediate He+ state with n = 2.

  5. AOIPS water resources data management system

    NASA Technical Reports Server (NTRS)

    Vanwie, P.

    1977-01-01

    The text and computer-generated displays used to demonstrate the AOIPS (Atmospheric and Oceanographic Information Processing System) water resources data management system are investigated. The system was developed to assist hydrologists in analyzing the physical processes occurring in watersheds. It was designed to alleviate some of the problems encountered while investigating the complex interrelationships of variables such as land-cover type, topography, precipitation, snow melt, surface runoff, evapotranspiration, and streamflow rates. The system has an interactive image processing capability and a color video display to display results as they are obtained.

  6. Building a base map with AutoCAD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flarity, S.J.

    1989-12-01

    The fundamental step in the exploration process is building a base map. Consequently, any serious computer exploration program should be capable of providing base maps. Data used in constructing base maps are available from commercial sources such as Tobin and Petroleum Information. These data sets include line and well data, the line data being latitude-longitude vectors, and the well data being identifying text information for wells and their locations. AutoCAD is a commercial program useful in building base maps. Its features include infinite zoom and pan capability, layering, block definition, text dialog boxes, and a command language, AutoLisp. AutoLisp provides more power by allowing the geologist to modify the way the program works. Three AutoLisp routines presented here allow geologists to construct a geologic base map from raw Tobin data. The first program, WELLS.LSP, sets up the map environment for the subsequent programs, WELLADD.LSP and LINEADD.LSP. WELLADD.LSP reads the Tobin data and spots the well symbols and the identifying information. LINEADD.LSP performs the same task on line and textual information contained within the data set.

  7. Dynamic Reconfiguration of a RGBD Sensor Based on QoS and QoC Requirements in Distributed Systems.

    PubMed

    Munera, Eduardo; Poza-Lujan, Jose-Luis; Posadas-Yagüe, Juan-Luis; Simó-Ten, José-Enrique; Noguera, Juan Fco Blanes

    2015-07-24

    The inclusion of embedded sensors into a networked system provides useful information for many applications. A Distributed Control System (DCS) is one of the clearest examples where processing and communications are constrained by the clients' requirements and the capacity of the system. An embedded sensor with advanced processing and communications capabilities supplies high-level information, abstracting away the data acquisition process and object recognition mechanisms. The implementation of an embedded sensor/actuator as a Smart Resource permits clients to access sensor information through distributed network services. Smart Resources can offer sensor services as well as computing, communications and peripheral access by implementing a self-aware adaptation mechanism which adapts the execution profile to the context. On the other hand, information integrity must be ensured when computing processes are dynamically adapted. Therefore, the processing must be adapted to perform tasks within a certain lapse of time while always ensuring a minimum process quality. In the same way, communications must try to reduce data traffic without excluding relevant information. The main objective of the paper is to present a dynamic configuration mechanism to adapt the sensor processing and communication to the clients' requirements in the DCS. This paper describes an implementation of a Smart Resource based on a Red, Green, Blue, and Depth (RGBD) sensor in order to test the dynamic configuration mechanism presented.
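
    A hedged sketch of the self-aware adaptation idea: the Smart Resource picks the richest processing profile whose cost still meets the client's deadline, degrading gracefully instead of missing it. Profile names, costs, and quality scores are invented.

```python
# Invented profiles: (name, processing cost in ms, output quality score).
PROFILES = [
    ("full_object_recognition", 120.0, 1.00),
    ("depth_plus_blobs", 45.0, 0.70),
    ("raw_depth_only", 12.0, 0.40),
]

def pick_profile(deadline_ms, min_quality=0.4):
    """Choose the highest-quality profile that fits the client's deadline
    while never dropping below a minimum acceptable process quality."""
    feasible = [p for p in PROFILES if p[1] <= deadline_ms and p[2] >= min_quality]
    if not feasible:
        raise RuntimeError("no profile meets the QoS/QoC requirements")
    return max(feasible, key=lambda p: p[2])

print(pick_profile(deadline_ms=50.0))   # -> ('depth_plus_blobs', 45.0, 0.7)
```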

  8. Design of a graphical user interface for an intelligent multimedia information system for radiology research

    NASA Astrophysics Data System (ADS)

    Taira, Ricky K.; Wong, Clement; Johnson, David; Bhushan, Vikas; Rivera, Monica; Huang, Lu J.; Aberle, Denise R.; Cardenas, Alfonso F.; Chu, Wesley W.

    1995-05-01

    With the increase in the volume and distribution of images and text available in PACS and medical electronic health-care environments, it becomes increasingly important to maintain indexes that summarize the content of these multimedia documents. Such indexes are necessary to quickly locate relevant patient cases for research, patient management, and teaching. The goal of this project is to develop an intelligent document retrieval system that allows researchers to request patient cases based on document content. Thus we wish to retrieve patient cases from electronic information archives based on a combined specification of patient demographics, low-level radiologic findings (size, shape, number), intermediate-level radiologic findings (e.g., atelectasis, infiltrates) and/or high-level pathology constraints (e.g., well-differentiated small cell carcinoma). The cases could be distributed among multiple heterogeneous databases such as PACS, RIS, and HIS. Content-based retrieval systems go beyond the capabilities of simple keyword or string-matching retrieval systems. These systems require a knowledge base to comprehend the generality/specificity of a concept (thus knowing the subclasses or related concepts of a given concept) and knowledge of the various string representations for each concept (i.e., synonyms, lexical variants, etc.). We have previously reported on a data integration mediation layer that allows transparent access to multiple heterogeneous distributed medical databases (HIS, RIS, and PACS). The data access layer of our architecture currently has limited query processing capabilities: given a patient hospital identification number, the access mediation layer collects all documents in the RIS and HIS and returns this information to a specified workstation location. In this paper we report on our efforts to extend the query processing capabilities of the system by creating custom query interfaces, an intelligent query processing engine, and a document-content index that can be generated automatically (i.e., with no manual authoring or changes to the normal clinical protocols).

  9. Enabling search over encrypted multimedia databases

    NASA Astrophysics Data System (ADS)

    Lu, Wenjun; Swaminathan, Ashwin; Varna, Avinash L.; Wu, Min

    2009-02-01

    Performing information retrieval tasks while preserving data confidentiality is a desirable capability when a database is stored on a server maintained by a third-party service provider. This paper addresses the problem of enabling content-based retrieval over encrypted multimedia databases. Search indexes, along with multimedia documents, are first encrypted by the content owner and then stored onto the server. Through jointly applying cryptographic techniques, such as order preserving encryption and randomized hash functions, with image processing and information retrieval techniques, secure indexing schemes are designed to provide both privacy protection and rank-ordered search capability. Retrieval results on an encrypted color image database and security analysis of the secure indexing schemes under different attack models show that data confidentiality can be preserved while retaining very good retrieval performance. This work has promising applications in secure multimedia management.
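
    One building block named above, randomized/keyed hashing of index terms, can be illustrated with Python's standard hmac module: the owner stores keyed hashes of terms, so the server matches queries without learning the terms themselves. This is a bare sketch of that one idea, not the paper's full scheme (which also uses order-preserving encryption to support rank-ordered search).

```python
import hashlib
import hmac

KEY = b"content-owner-secret"    # held by the content owner, never the server

def tag(term: str) -> str:
    """Keyed hash of an index term; the server stores and compares only tags."""
    return hmac.new(KEY, term.lower().encode(), hashlib.sha256).hexdigest()

# Owner side: build the encrypted index (document id -> set of term tags).
index = {
    "img_001": {tag("sunset"), tag("beach")},
    "img_002": {tag("mountain"), tag("snow")},
}

# Server side: match a query tag without ever seeing the plaintext term.
def search(query_tag):
    return [doc for doc, tags in index.items() if query_tag in tags]

print(search(tag("beach")))      # -> ['img_001']
```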

  10. A database management capability for Ada

    NASA Technical Reports Server (NTRS)

    Chan, Arvola; Danberg, SY; Fox, Stephen; Landers, Terry; Nori, Anil; Smith, John M.

    1986-01-01

    The data requirements of mission critical defense systems have been increasing dramatically. Command and control, intelligence, logistics, and even weapons systems are being required to integrate, process, and share ever increasing volumes of information. To meet this need, systems are now being specified that incorporate data base management subsystems for handling storage and retrieval of information. It is expected that a large number of the next generation of mission critical systems will contain embedded data base management systems. Since the use of Ada has been mandated for most of these systems, it is important to address the issues of providing data base management capabilities that can be closely coupled with Ada. A comprehensive distributed data base management project has been investigated. The key deliverables of this project are three closely related prototype systems implemented in Ada. These three systems are discussed.

  11. GWM-VI: groundwater management with parallel processing for multiple MODFLOW versions

    USGS Publications Warehouse

    Banta, Edward R.; Ahlfeld, David P.

    2013-01-01

    Groundwater Management–Version Independent (GWM–VI) is a new version of the Groundwater Management Process of MODFLOW. The Groundwater Management Process couples groundwater-flow simulation with a capability to optimize stresses on the simulated aquifer based on an objective function and constraints imposed on stresses and aquifer state. GWM–VI extends prior versions of Groundwater Management in two significant ways—(1) it can be used with any version of MODFLOW that meets certain requirements on input and output, and (2) it is structured to allow parallel processing of the repeated runs of the MODFLOW model that are required to solve the optimization problem. GWM–VI uses the same input structure for files that describe the management problem as that used by prior versions of Groundwater Management. GWM–VI requires only minor changes to the input files used by the MODFLOW model. GWM–VI uses the Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER-API) to implement both version independence and parallel processing. GWM–VI communicates with the MODFLOW model by manipulating certain input files and interpreting results from the MODFLOW listing file and binary output files. Nearly all capabilities of prior versions of Groundwater Management are available in GWM–VI. GWM–VI has been tested with MODFLOW-2005, MODFLOW-NWT (a Newton formulation for MODFLOW-2005), MF2005-FMP2 (the Farm Process for MODFLOW-2005), SEAWAT, and CFP (Conduit Flow Process for MODFLOW-2005). This report provides sample problems that demonstrate a range of applications of GWM–VI and the directory structure and input information required to use the parallel-processing capability.
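
    The parallel-processing capability described, many independent flow-model runs per optimization iteration, maps naturally onto a worker pool. The sketch below uses Python's multiprocessing as a stand-in for GWM-VI's JUPITER-API mechanism, with an invented objective standing in for an actual MODFLOW run.

```python
from multiprocessing import Pool

def run_flow_model(stress_rates):
    """Stand-in for one model run: score a candidate set of well pumping rates.
    (Invented objective: total pumping, penalized past a head-constraint proxy.)"""
    pumping = sum(stress_rates)
    penalty = max(0.0, pumping - 100.0) ** 2
    return pumping - penalty

if __name__ == "__main__":
    # Candidate stress patterns the optimizer wants evaluated this iteration.
    candidates = [(30.0, 40.0), (50.0, 55.0), (20.0, 75.0), (60.0, 45.0)]
    with Pool(processes=4) as pool:
        scores = pool.map(run_flow_model, candidates)
    best_score, best_rates = max(zip(scores, candidates))
    print("best candidate:", best_rates, "score:", round(best_score, 1))
```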

  12. Shuttle Abort Flight Management (SAFM) - Application Overview

    NASA Technical Reports Server (NTRS)

    Hu, Howard; Straube, Tim; Madsen, Jennifer; Ricard, Mike

    2002-01-01

    One of the most demanding tasks that must be performed by the Space Shuttle flight crew is the process of determining whether, when and where to abort the vehicle should engine or system failures occur during ascent or entry. Current Shuttle abort procedures involve paging through complicated paper checklists to decide on the type of abort and where to abort. Additional checklists then lead the crew through a series of actions to execute the desired abort. This process is even more difficult and time consuming in the absence of ground communications, since the ground flight controllers have analysis tools and information that are currently not available in the Shuttle cockpit. Crew workload, specifically for abort procedures, will be greatly reduced with the implementation of the Space Shuttle Cockpit Avionics Upgrade (CAU) project. The intent of CAU is to maximize crew situational awareness and reduce workload through enhanced controls and displays, and an onboard abort assessment and determination capability. SAFM was developed to help satisfy the CAU objectives by providing the crew with dynamic information about the capability of the vehicle to perform a variety of abort options during ascent and entry. This paper presents an overview of the SAFM application. As shown in Figure 1, SAFM processes the vehicle navigation state and other guidance information to provide the CAU displays with evaluations of abort options, as well as landing site recommendations. This is accomplished by three main SAFM components: the Sequencer Executive, the Powered Flight Function, and the Glided Flight Function. The Sequencer Executive dispatches the Powered and Glided Flight Functions to evaluate the vehicle's capability to execute the current mission (or current abort), as well as more than 15 hypothetical abort options or scenarios. Scenarios are sequenced and evaluated throughout powered and glided flight. Abort scenarios evaluated include Abort to Orbit (ATO), Transatlantic Abort Landing (TAL), East Coast Abort Landing (ECAL) and Return to Launch Site (RTLS). Sequential and simultaneous engine failures are assessed, and landing footprint information is provided during actual entry scenarios as well as hypothetical "loss of thrust now" scenarios during ascent.

  13. NASA information sciences and human factors program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Data Systems Program consists of research and technology devoted to controlling, processing, storing, manipulating, and analyzing space-derived data. The objectives of the program are to provide the technology advancements needed to enable affordable utilization of space-derived data, to increase substantially the capability for future missions of on-board processing and recording and to provide high-speed, high-volume computational systems that are anticipated for missions such as the evolutionary Space Station and Earth Observing System.

  14. Checkout/demonstration application program for the SEL 840MP Multi-Processing Control System: Version 1 (MPCS/1)

    NASA Technical Reports Server (NTRS)

    Anderson, W. F.; Conway, J. R.; Keller, L. C.

    1972-01-01

    This report describes the characteristics of the application program developed to verify and demonstrate the SEL 840MP Multi-Processing Control System - Version 1 (MPCS/1). The application program emphasizes the display support and task control capabilities. It is further intended to be used as an aid to familiarization with MPCS/1, and it complements the information provided in the MPCS/1 Users Guide, Volumes I and II.

  15. Defense Acquisition Structures and Capabilities Review. Addendum. National Defense Authorization Act Fiscal Year 2006 Section 814 Report

    DTIC Science & Technology

    2007-06-01

    National Security Agency (NSA), one significant shortfall in coordinating requirements occurs with respect to NSA and the Information Assurance...funding issues and potential performance and schedule problems. A formal review process for all NSA requirements should therefore be implemented to...issues between Service networks to permit true "joint access." j. Establish a formal review process for all NSA, or any other non-DoD, requirements. 3

  16. magHD: a new approach to multi-dimensional data storage, analysis, display and exploitation

    NASA Astrophysics Data System (ADS)

    Angleraud, Christophe

    2014-06-01

    The ever-increasing amount of data and processing capability - following the well-known Moore's law - is challenging the way scientists and engineers currently exploit large datasets. Scientific visualization tools, although quite powerful, are often too generic and provide abstract views of phenomena, thus preventing cross-discipline fertilization. On the other hand, Geographic Information Systems allow nice and visually appealing maps to be built, but these often become very cluttered as more layers are added. Moreover, the introduction of time as a fourth analysis dimension, allowing the analysis of time-dependent phenomena such as meteorological or climate models, is encouraging real-time data exploration techniques in which spatial-temporal points of interest are detected through the integration of moving images by the human brain. Magellium has been involved in high-performance image processing chains for satellite image processing, as well as scientific signal analysis and geographic information management, since its creation in 2003. We believe that recent work on big data, GPUs, and peer-to-peer collaborative processing can open a new breakthrough in data analysis and display that will serve many new applications in collaborative scientific computing, environment mapping, and understanding. The magHD (for Magellium Hyper-Dimension) project aims at developing software solutions that bring highly interactive tools for complex dataset analysis and exploration to commodity hardware, targeting small to medium-scale clusters with expansion capabilities to large cloud-based clusters.

  17. Warfighter information services: lessons learned in the intelligence domain

    NASA Astrophysics Data System (ADS)

    Bray, S. E.

    2014-05-01

    A vision was presented in a previous paper of how a common set of services within a framework could be used to provide all the information processing needs of Warfighters. Central to that vision was the concept of a "Virtual Knowledge Base". The paper presents an implementation of these ideas in the intelligence domain. Several innovative technologies were employed in the solution, which are presented and their benefits explained. The project was successful, validating many of the design principles for such a system which had been proposed in earlier work. Many of these principles are discussed in detail, explaining lessons learned. The results showed that it is possible to make vast improvements in the ability to exploit available data, making it discoverable and queryable wherever it is from anywhere within a participating network; and to exploit machine reasoning to make faster and better inferences from available data, enabling human analysts to spend more of their time doing more difficult analytical tasks rather than searching for relevant data. It was also demonstrated that a small number of generic Information Processing services can be combined and configured in a variety of ways (without changing any software code) to create "fact-processing" workflows, in this case to create different intelligence analysis capabilities. It is yet to be demonstrated that the same generic services can be reused to create analytical/situational awareness capabilities for logistics, operations, planning or other military functions but this is considered likely.

  18. Cochlea-inspired sensing node for compressive sensing

    NASA Astrophysics Data System (ADS)

    Peckens, Courtney A.; Lynch, Jerome P.

    2013-04-01

    While sensing technologies for structural monitoring applications have made significant advances over the last several decades, there is still room for improvement in terms of computational efficiency, as well as overall energy consumption. The biological nervous system can offer a potential solution to address these current deficiencies. The nervous system is capable of sensing and aggregating information about the external environment through very crude processing units known as neurons. Neurons effectively communicate in an extremely condensed format by encoding information into binary electrical spike trains, thereby reducing the amount of raw information sent throughout a neural network. Due to its unique signal processing capabilities, the mammalian cochlea and its interaction with the biological nervous system is of particular interest for devising compressive sensing strategies for dynamic engineered systems. The cochlea uses a novel method of place theory and frequency decomposition, thereby allowing for rapid signal processing within the nervous system. In this study, a low-power sensing node is proposed that draws inspiration from the mechanisms employed by the cochlea and the biological nervous system. As such, the sensor is able to perceive and transmit a compressed representation of the external stimulus with minimal distortion. Each sensor represents a basic building block, with function similar to the neuron, and can form a network with other sensors, thus enabling a system that can convey input stimulus in an extremely condensed format. The proposed sensor is validated through a structural monitoring application of a single degree of freedom structure excited by seismic ground motion.
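
    A minimal sketch of the encoding idea described above, assuming a simple band-pass filter bank (place theory) followed by threshold-triggered spike generation; the band edges, threshold, and test signal are illustrative, not the sensor's actual parameters.

        # Cochlea-style sketch: decompose a signal into frequency bands and emit
        # binary spikes when band energy crosses a threshold, yielding a highly
        # condensed spike-raster representation of the input.
        import numpy as np
        from scipy.signal import butter, lfilter

        def bandpass(signal, lo, hi, fs):
            b, a = butter(2, [lo / (fs / 2), hi / (fs / 2)], btype="band")
            return lfilter(b, a, signal)

        def encode_spikes(signal, fs, bands, threshold=0.5):
            # returns an (n_bands, n_samples) binary spike raster
            raster = np.zeros((len(bands), len(signal)), dtype=np.uint8)
            for i, (lo, hi) in enumerate(bands):
                env = np.abs(bandpass(signal, lo, hi, fs))
                raster[i] = env > threshold * env.max()
            return raster

        fs = 1000.0
        t = np.arange(0, 1.0, 1 / fs)
        x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
        raster = encode_spikes(x, fs, bands=[(2, 10), (30, 50)])
        print(raster.sum(axis=1))   # spikes per band: condensed view of the input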

  19. An examination of iconic memory in children with autism spectrum disorders.

    PubMed

    McMorris, Carly A; Brown, Stephanie M; Bebko, James M

    2013-08-01

    Iconic memory is the ability to accurately recall a number of items after a very brief visual exposure. Previous research has examined these capabilities in typically developing (TD) children and individuals with intellectual disabilities (ID); however, there is limited research on these abilities in children with Autism Spectrum Disorders (ASD). Twenty-one TD and eighteen ASD children were presented with circular visual arrays of letters for 100 ms and were asked to recall as many letters as possible or a single letter that was cued for recall. Groups did not differ in the number of items recalled, the rate of information decay, or speed of information processing. These findings suggest that iconic memory is an intact skill for children with ASD, a result that has implications for subsequent information processing.

  20. Knowledge-based expert systems and a proof-of-concept case study for multiple sequence alignment construction and analysis.

    PubMed

    Aniba, Mohamed Radhouene; Siguenza, Sophie; Friedrich, Anne; Plewniak, Frédéric; Poch, Olivier; Marchler-Bauer, Aron; Thompson, Julie Dawn

    2009-01-01

    The traditional approach to bioinformatics analyses relies on independent task-specific services and applications, using different input and output formats, often idiosyncratic, and frequently not designed to inter-operate. In general, such analyses were performed by experts who manually verified the results obtained at each step in the process. Today, the amount of bioinformatics information continuously being produced means that handling the various applications used to study this information presents a major data management and analysis challenge to researchers. It is now impossible to manually analyse all this information and new approaches are needed that are capable of processing the large-scale heterogeneous data in order to extract the pertinent information. We review the recent use of integrated expert systems aimed at providing more efficient knowledge extraction for bioinformatics research. A general methodology for building knowledge-based expert systems is described, focusing on the unstructured information management architecture, UIMA, which provides facilities for both data and process management. A case study involving a multiple alignment expert system prototype called AlexSys is also presented.

  1. Knowledge-based expert systems and a proof-of-concept case study for multiple sequence alignment construction and analysis

    PubMed Central

    Aniba, Mohamed Radhouene; Siguenza, Sophie; Friedrich, Anne; Plewniak, Frédéric; Poch, Olivier; Marchler-Bauer, Aron

    2009-01-01

    The traditional approach to bioinformatics analyses relies on independent task-specific services and applications, using different input and output formats, often idiosyncratic, and frequently not designed to inter-operate. In general, such analyses were performed by experts who manually verified the results obtained at each step in the process. Today, the amount of bioinformatics information continuously being produced means that handling the various applications used to study this information presents a major data management and analysis challenge to researchers. It is now impossible to manually analyse all this information and new approaches are needed that are capable of processing the large-scale heterogeneous data in order to extract the pertinent information. We review the recent use of integrated expert systems aimed at providing more efficient knowledge extraction for bioinformatics research. A general methodology for building knowledge-based expert systems is described, focusing on the unstructured information management architecture, UIMA, which provides facilities for both data and process management. A case study involving a multiple alignment expert system prototype called AlexSys is also presented. PMID:18971242

  2. 76 FR 4708 - Agency Information Collection Activities: Submission for OMB Review; Comment Request, OMB No...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-26

    ... assess disaster logistics planning and response capabilities and identify areas of relative strength and...; Logistics Capability Assessment Tool (LCAT) AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice...: Collection of Information Title: Logistics Capability Assessment Tool (LCAT). Type of Information Collection...

  3. VISSR Atmospheric Sounder (VAS) Research Review

    NASA Technical Reports Server (NTRS)

    Greaves, J. R. (Editor)

    1983-01-01

    The VAS, an experimental instrument flown onboard the Geostationary Operational Environmental Satellite (GOES), is capable of achieving multispectral imagery of atmospheric temperature, water vapor, and cloudiness patterns over short time intervals. In addition, the instrument provides an atmospheric sounding capability from geosynchronous orbit. The VAS demonstration is an effort to evaluate the VAS instrument's performance and to demonstrate the capabilities of a VAS prototype system to provide useful geosynchronous satellite data in support of weather forecasting and atmospheric research. The demonstration evaluates the performance of the VAS instruments on GOES-4, -5, and -6; develops research-oriented and prototype/operational VAS data processing systems; determines the accuracy of certain basic and derived meteorological parameters that can be obtained from the VAS instrument; and assesses the utility of VAS-derived information in analyzing severe weather situations.

  4. A survey of tools and resources for the next generation analyst

    NASA Astrophysics Data System (ADS)

    Hall, David L.; Graham, Jake; Catherman, Emily

    2015-05-01

    We have previously argued that a combination of trends in information technology (IT) and the changing habits of people using IT provides opportunities for the emergence of a new generation of analysts that can perform effective intelligence, surveillance, and reconnaissance (ISR) using a "do it yourself" (DIY) or "armchair" approach (see D.L. Hall and J. Llinas (2014)). Key technology advances include: i) new sensing capabilities, including the use of micro-scale sensors and ad hoc deployment platforms such as commercial drones; ii) advanced computing capabilities in mobile devices that allow advanced signal and image processing and modeling; iii) intelligent interconnections due to advances in "web N" capabilities; and iv) global interconnectivity and increasing bandwidth. In addition, the changing habits of digital natives reflect new ways of collecting and reporting information, sharing information, and collaborating in dynamic teams. This paper provides a survey and assessment of tools and resources to support this emerging analysis approach. The tools range from large-scale commercial tools such as IBM i2 Analyst Notebook, Palantir, and GeoSuite to emerging open source tools such as GeoViz and DECIDE from university research centers. The tools include geospatial visualization tools, social network analysis tools, and decision aids. A summary of tools is provided along with links to web sites for tool access.

  5. Assessing COSMO-SkyMed capability for crops identification and monitoring

    NASA Astrophysics Data System (ADS)

    Guarini, R.; Dini, L.

    2015-12-01

    In the last decade, it has become possible to better understand the impact of agricultural practices on global environmental change at different spatial (from local to global) and temporal (from seasonal to decadal) scales. This has been achieved thanks to: the large datasets continuously acquired by Earth Observation (EO) satellites; the improved capabilities of remote sensing techniques in extracting valuable information from EO datasets; new EO data policies allowing unrestricted data usage; network technologies that allow national, international, and market-derived information to be shared quickly and easily; and increasingly capable computing technology that allows large amounts of data to be processed more easily and at decreasing cost. To better understand the environmental impacts of agriculture and to monitor the consequences of human agricultural activities on the biosphere, scientists need to better identify crops and monitor crop conditions over time and space. Traditionally, NDVI time-series maps derived from optical sensors have been used for this purpose; as is well known, however, this important source of information is limited by cloud cover. Unlike passive systems, synthetic aperture radar (SAR) systems are almost insensitive to atmospheric influences and are therefore especially suitable for crop identification and condition monitoring. Among the SAR systems currently in orbit, the Italian Space Agency (ASI) COSMO-SkyMed® (CSK®) constellation (X-band, frequency 9.6 GHz, wavelength 3.1 cm), particularly because of its high revisit capability (up to four images in 16 days with the same acquisition geometry), seems especially suitable for providing information in addition and/or as an alternative to optical EO systems. To assess the capability of the CSK® constellation to identify crops and to monitor crop conditions, ASI started the "AGRICIDOT" project in 2013. Some of the main project achievements will be presented at the congress.

  6. The Global Emergency Observation and Warning System

    NASA Technical Reports Server (NTRS)

    Bukley, Angelia P.; Mulqueen, John A.

    1994-01-01

    Based on an extensive characterization of natural hazards, and an evaluation of their impacts on humanity, a set of functional technical requirements for a global warning and relief system was developed. Since no technological breakthroughs are required to implement a global system capable of performing the functions required to provide sufficient information for prevention, preparedness, warning, and relief from natural disaster effects, a system is proposed which would combine the elements of remote sensing, data processing, information distribution, and communications support on a global scale for disaster mitigation.

  7. Informed Consent to Treatment in Psychiatry

    PubMed Central

    Neilson, Grainne; Chaimowitz, Gary

    2015-01-01

    Summary Patients have a right to be informed and actively involved in their health care. Fundamental to a person’s dignity and autonomy is the right to make decisions about their psychiatric treatment, including their right to refuse unwanted treatments, providing that the refusal is a capable one. It is important that psychiatrists have an awareness of the ethical underpinnings of consent and the legislated requirements related to consent, including precedent cases. Consent may change over time and for different conditions and circumstances. Consent must be an ongoing process.

  8. Lyceum: A Multi-Protocol Digital Library Gateway

    NASA Technical Reports Server (NTRS)

    Maa, Ming-Hokng; Nelson, Michael L.; Esler, Sandra L.

    1997-01-01

    Lyceum is a prototype scalable query gateway that provides a logically central interface to multi-protocol and physically distributed, digital libraries of scientific and technical information. Lyceum processes queries to multiple syntactically distinct search engines used by various distributed information servers from a single logically central interface without modification of the remote search engines. A working prototype (http://www.larc.nasa.gov/lyceum/) demonstrates the capabilities, potentials, and advantages of this type of meta-search engine by providing access to over 50 servers covering over 20 disciplines.
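
    A minimal sketch of the gateway idea, with hypothetical backends and query syntaxes standing in for Lyceum's actual servers: one logical query is translated into each engine's native form and fanned out concurrently, without modifying the remote engines.

        # Meta-search fan-out sketch; backend names and syntaxes are invented.
        from concurrent.futures import ThreadPoolExecutor

        BACKENDS = {
            "ntrs": lambda q: f"author:{q['author']} AND title:{q['title']}",
            "wais": lambda q: f"{q['author']} {q['title']}",
        }

        def search_backend(name, query):
            native = BACKENDS[name](query)   # per-engine query translation
            # a real gateway would send `native` over the wire; here we echo it
            return [{"server": name, "query": native}]

        def gateway(query):
            with ThreadPoolExecutor() as pool:
                hits = pool.map(lambda n: search_backend(n, query), BACKENDS)
            return [h for batch in hits for h in batch]

        print(gateway({"author": "smith", "title": "aeroelasticity"}))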

  9. Data-Base Software For Tracking Technological Developments

    NASA Technical Reports Server (NTRS)

    Aliberti, James A.; Wright, Simon; Monteith, Steve K.

    1996-01-01

    Technology Tracking System (TechTracS) is a computer program developed for storing and retrieving information on technology, and related patent information, developed under the auspices of NASA Headquarters and NASA's field centers. The contents of the data base include multiple scanned still images and QuickTime movies as well as text. TechTracS includes word-processing, report-editing, chart-and-graph-editing, and search-editing subprograms. Extensive keyword searching capabilities enable rapid location of technologies, innovators, and companies. The system performs routine functions automatically and serves multiple users.

  10. Cell phones as imaging sensors

    NASA Astrophysics Data System (ADS)

    Bhatti, Nina; Baker, Harlyn; Marguier, Joanna; Berclaz, Jérôme; Süsstrunk, Sabine

    2010-04-01

    Camera phones are ubiquitous, and consumers have been adopting them faster than any other technology in modern history. When connected to a network, though, they are capable of more than just picture taking: Suddenly, they gain access to the power of the cloud. We exploit this capability by providing a series of image-based personal advisory services. These are designed to work with any handset over any cellular carrier using commonly available Multimedia Messaging Service (MMS) and Short Message Service (SMS) features. Targeted at the unsophisticated consumer, these applications must be quick and easy to use, not requiring download capabilities or preplanning. Thus, all application processing occurs in the back-end system (i.e., as a cloud service) and not on the handset itself. Presenting an image to an advisory service in the cloud, a user receives information that can be acted upon immediately. Two of our examples involve color assessment - selecting cosmetics and home décor paint palettes; the third provides the ability to extract text from a scene. In the case of the color imaging applications, we have shown that our service rivals the advice quality of experts. The result of this capability is a new paradigm for mobile interactions - image-based information services exploiting the ubiquity of camera phones.

  11. Weather data dissemination to aircraft

    NASA Technical Reports Server (NTRS)

    Mcfarland, Richard H.; Parker, Craig B.

    1990-01-01

    Documentation exists that shows weather to be responsible for approximately 40 percent of all general aviation accidents with fatalities. Weather data products available on the ground are becoming more sophisticated and greater in number. Although many of these data are critical to aircraft safety, they currently must be transmitted verbally to the aircraft. This process is labor intensive and provides a low rate of information transfer. Consequently, the pilot is often forced to make life-critical decisions based on incomplete and outdated information. Automated transmission of weather data from the ground to the aircraft can provide the aircrew with accurate data in near-real time. The current National Airspace System Plan calls for such an uplink capability to be provided by the Mode S Beacon System data link. Although this system has a very advanced data link capability, it will not be capable of providing adequate weather data to all airspace users in its planned configuration. This paper delineates some of the important weather data uplink system requirements, and describes a system which is capable of meeting these requirements. The proposed system utilizes a run-length coding technique for image data compression and a hybrid phase and amplitude modulation technique for the transmission of both voice and weather data on existing aeronautical Very High Frequency (VHF) voice communication channels.
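
    To illustrate the compression step named above, here is a minimal byte-oriented run-length encoder and decoder. It shows the general technique only; the paper's exact scheme for weather imagery is not specified here, so this representation is an assumption.

        # Run-length coding sketch: long runs of identical bytes, common in
        # weather imagery, collapse to (count, value) pairs. Illustrative only.
        def rle_encode(data):
            runs = []
            for b in data:
                if runs and runs[-1][1] == b and runs[-1][0] < 255:
                    runs[-1] = (runs[-1][0] + 1, b)
                else:
                    runs.append((1, b))
            return runs

        def rle_decode(runs):
            return b"".join(bytes([value]) * count for count, value in runs)

        row = bytes([0] * 40 + [3] * 7 + [0] * 13)      # sparse image row
        runs = rle_encode(row)
        assert rle_decode(runs) == row
        print(f"{len(row)} bytes -> {len(runs)} runs")  # 60 bytes -> 3 runs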

  12. An observational study of the relationship between meaningful use-based electronic health information exchange, interoperability, and medication reconciliation capabilities.

    PubMed

    Elysee, Gerald; Herrin, Jeph; Horwitz, Leora I

    2017-10-01

    Stagnation in hospitals' adoption of data integration functionalities, coupled with a reduction in the number of operational health information exchanges, could become a significant impediment to hospitals' adoption of 3 critical capabilities: electronic health information exchange, interoperability, and medication reconciliation, in which electronic systems are used to assist with resolving medication discrepancies and improving patient safety. Against this backdrop, we assessed the relationships between the 3 capabilities. We conducted an observational study applying the partial least squares-structural equation modeling technique to 27 variables obtained from the 2013 American Hospital Association annual survey Information Technology (IT) supplement, which describes health IT capabilities. We included 1330 hospitals. In confirmatory factor analysis, 15 of the 27 variables achieved loading values greater than 0.548 at P < .001, and as such were validated as the building blocks of the 3 capabilities. Subsequent path analysis showed a significant, positive, and cyclic relationship between the capabilities, in that decreases in hospitals' adoption of one would lead to decreases in adoption of the others. These results show that the capability for high-quality medication reconciliation may be impeded by lagging adoption of interoperability and health information exchange capabilities. Policies focused on improving one or more of these capabilities may have ancillary benefits.

  13. Rotorcraft Conceptual Design Environment

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne; Sinsay, Jeffrey

    2009-01-01

    Requirements for a rotorcraft conceptual design environment are discussed, from the perspective of a government laboratory. Rotorcraft design work in a government laboratory must support research, by producing technology impact assessments and defining the context for research and development; and must support the acquisition process, including capability assessments and quantitative evaluation of designs, concepts, and alternatives. An information manager that will enable increased fidelity of analysis early in the design effort is described. This manager will be a framework to organize information that describes the aircraft, and enable movement of that information to and from analyses. Finally, a recently developed rotorcraft system analysis tool is described.

  14. Rotorcraft Conceptual Design Environment

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne; Sinsay, Jeffrey D.

    2010-01-01

    Requirements for a rotorcraft conceptual design environment are discussed, from the perspective of a government laboratory. Rotorcraft design work in a government laboratory must support research, by producing technology impact assessments and defining the context for research and development; and must support the acquisition process, including capability assessments and quantitative evaluation of designs, concepts, and alternatives. An information manager that will enable increased fidelity of analysis early in the design effort is described. This manager will be a framework to organize information that describes the aircraft, and enable movement of that information to and from analyses. Finally, a recently developed rotorcraft system analysis tool is described.

  15. The Effect of Visual Information on the Manual Approach and Landing

    NASA Technical Reports Server (NTRS)

    Wewerinke, P. H.

    1982-01-01

    The effect of visual information, in combination with basic display information, on approach performance was investigated. A pre-experimental model analysis was performed in terms of the optimal control model. The resulting aircraft approach performance predictions were compared with the results of a moving-base simulator program. The results illustrate that the model provides a meaningful description of the visual (scene) perception process involved in the complex (multi-variable, time-varying) manual approach task, with a useful predictive capability. The theoretical framework was shown to allow a straightforward investigation of the complex interaction of a variety of task variables.

  16. Stable isotope probing in the metagenomics era: a bridge towards improved bioremediation

    PubMed Central

    Uhlik, Ondrej; Leewis, Mary-Cathrine; Strejcek, Michal; Musilova, Lucie; Mackova, Martina; Leigh, Mary Beth; Macek, Tomas

    2012-01-01

    Microbial biodegradation and biotransformation reactions are essential to most bioremediation processes, yet the specific organisms, genes, and mechanisms involved are often not well understood. Stable isotope probing (SIP) enables researchers to directly link microbial metabolic capability to phylogenetic and metagenomic information within a community context by tracking isotopically labeled substances into phylogenetically and functionally informative biomarkers. SIP is thus applicable as a tool for the identification of active members of the microbial community and associated genes integral to the community functional potential, such as biodegradative processes. The rapid evolution of SIP over the last decade and integration with metagenomics provides researchers with a much deeper insight into potential biodegradative genes, processes, and applications, thereby enabling an improved mechanistic understanding that can facilitate advances in the field of bioremediation. PMID:23022353

  17. Problems of systems dataware using optoelectronic measuring means of linear displacement

    NASA Astrophysics Data System (ADS)

    Bazykin, S. N.; Bazykina, N. A.; Samohina, K. S.

    2017-10-01

    Problems of dataware for systems using optoelectronic means of measuring linear displacement are considered in this article. A classification of the known physical effects realized by means of information-measuring systems is given. An analysis of information flows in technical systems was conducted from the standpoint of determining inaccuracies in measurement and management. Despite the successes achieved in the automation of machine-building and instrument-building equipment in the field of dataware for technical systems, unresolved problems remain concerning the qualitative aspect of the production process. It is shown that this problem can be solved using optoelectronic laser information-measuring systems. Such information-measuring systems are capable of not only executing measuring functions but also solving problems of management and control during processing, thereby guaranteeing the quality of the final product.

  18. Fast reversible learning based on neurons functioning as anisotropic multiplex hubs

    NASA Astrophysics Data System (ADS)

    Vardi, Roni; Goldental, Amir; Sheinin, Anton; Sardi, Shira; Kanter, Ido

    2017-05-01

    Neural networks are composed of neurons and synapses, which are responsible for learning in a slow adaptive dynamical process. Here we experimentally show that neurons act like independent anisotropic multiplex hubs, which relay and mute incoming signals following their input directions. Theoretically, the observed information routing enriches the computational capabilities of neurons by allowing, for instance, equalization among different information routes in the network, as well as high-frequency transmission of complex time-dependent signals constructed via several parallel routes. In addition, such hubs adaptively eliminate very noisy neurons from the dynamics of the network, preventing the masking of information transmission. The timescales for these features are several seconds at most, as opposed to the imprinting of information by synaptic plasticity, a process which takes minutes or more. These results open the horizon to the understanding of fast and adaptive learning in the brain's higher cognitive functionalities.

  19. What should we measure? Conceptualizing usage in health information exchange

    PubMed Central

    Jasperson, Jon

    2010-01-01

    Under the provisions of the Health Information Technology for Economic & Clinical Health act providers need to demonstrate their ‘meaningful use’ of electronic health record systems' health information exchange (HIE) capability. HIE usage is not a simple construct, but the choice of its measurement must attend to the users, context, and objectives of the system being examined. This review examined how usage is reported in the existing literature and also what conceptualizations of usage might best reflect the nature and objectives of HIE. While existing literature on HIE usage included a diverse set of measures, most were theoretically weak, did not attend to the interplay of measure, level of analysis and architectural strategy, and did not reflect how HIE usage affected the actual process of care. Attention to these issues will provide greater insight into the effects of previously inaccessible information on medical decision-making and the process of care. PMID:20442148

  20. A federated capability-based access control mechanism for internet of things (IoTs)

    NASA Astrophysics Data System (ADS)

    Xu, Ronghua; Chen, Yu; Blasch, Erik; Chen, Genshe

    2018-05-01

    The prevalence of the Internet of Things (IoT) allows heterogeneous embedded smart devices to collaboratively provide intelligent services with or without human intervention. While enabling large-scale IoT-based applications like Smart Grid and Smart Cities, IoT also raises concerns about privacy and security. Among the top security challenges that IoT faces, access authorization is critical for resource and information protection. Traditional access control approaches, like Access Control Lists (ACL), Role-based Access Control (RBAC), and Attribute-based Access Control (ABAC), are not able to provide scalable, manageable, and efficient mechanisms that meet the requirements of IoT systems. The extraordinarily large number of nodes, their heterogeneity, and their dynamicity necessitate more fine-grained, lightweight mechanisms for IoT devices. In this paper, a federated capability-based access control (FedCAC) framework is proposed to enable effective access control of devices, services, and information in large-scale IoT systems. The federated capability delegation mechanism, based on a propagation tree, is illustrated for access permission propagation. An identity-based capability token management strategy is presented, which involves registration, propagation, and revocation of the access authorization. By delegating centralized authorization decision-making policy to a local domain delegator, the access authorization process is conducted locally on the service provider, integrating situational awareness (SAW) and customized contextual conditions. Implemented and tested on both resource-constrained devices, like smart sensors and Raspberry Pi, and non-resource-constrained devices, like laptops and smartphones, our experimental results demonstrate the feasibility of the proposed FedCAC approach to offer a scalable, lightweight, and fine-grained access control solution for IoT systems connected to a system network.
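
    A minimal sketch of an identity-based capability token with delegation and revocation, in the spirit of the description above; the field names, the HMAC construction, and the checks are assumptions, not FedCAC's actual token format or protocol.

        # Capability-token sketch: the domain delegator signs tokens; access
        # succeeds only if the token verifies, is unexpired and unrevoked, and
        # grants the requested right on the requested resource.
        import hashlib
        import hmac
        import json
        import time

        DOMAIN_KEY = b"local-domain-delegator-secret"   # held by the delegator

        def issue_token(subject, resource, rights, ttl=3600, parent=None):
            body = {"sub": subject, "res": resource, "rights": sorted(rights),
                    "exp": time.time() + ttl, "parent": parent}
            payload = json.dumps(body, sort_keys=True).encode()
            tag = hmac.new(DOMAIN_KEY, payload, hashlib.sha256).hexdigest()
            return {"body": body, "tag": tag}

        def check_access(token, resource, right, revoked=frozenset()):
            payload = json.dumps(token["body"], sort_keys=True).encode()
            expected = hmac.new(DOMAIN_KEY, payload, hashlib.sha256).hexdigest()
            if not hmac.compare_digest(token["tag"], expected):
                return False                             # forged or altered
            b = token["body"]
            if b["sub"] in revoked or time.time() > b["exp"]:
                return False                             # revoked or expired
            return b["res"] == resource and right in b["rights"]

        root = issue_token("sensor-17", "/telemetry", {"read"})
        child = issue_token("phone-3", "/telemetry", {"read"}, parent=root["tag"])
        print(check_access(child, "/telemetry", "read"))  # True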

  1. Natural language processing systems for capturing and standardizing unstructured clinical information: A systematic review.

    PubMed

    Kreimeyer, Kory; Foster, Matthew; Pandey, Abhishek; Arya, Nina; Halford, Gwendolyn; Jones, Sandra F; Forshee, Richard; Walderhaug, Mark; Botsis, Taxiarchis

    2017-09-01

    We followed a systematic approach based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses to identify existing clinical natural language processing (NLP) systems that generate structured information from unstructured free text. Seven literature databases were searched with a query combining the concepts of natural language processing and structured data capture. Two reviewers screened all records for relevance during two screening phases, and information about clinical NLP systems was collected from the final set of papers. A total of 7149 records (after removing duplicates) were retrieved and screened, and 86 were determined to fit the review criteria. These papers contained information about 71 different clinical NLP systems, which were then analyzed. The NLP systems address a wide variety of important clinical and research tasks. Certain tasks are well addressed by the existing systems, while others remain as open challenges that only a small number of systems attempt, such as extraction of temporal information or normalization of concepts to standard terminologies. This review has identified many NLP systems capable of processing clinical free text and generating structured output, and the information collected and evaluated here will be important for prioritizing development of new approaches for clinical NLP.

  2. Optical security system for the protection of personal identification information.

    PubMed

    Doh, Yang-Hoi; Yoon, Jong-Soo; Choi, Kyung-Hyun; Alam, Mohammad S

    2005-02-10

    A new optical security system for the protection of personal identification information is proposed. First, authentication of the encrypted personal information is carried out by primary recognition of a personal identification number (PIN) with the proposed multiplexed minimum average correlation energy phase-encrypted (MMACE_p) filter. The MMACE_p filter, synthesized with phase-encrypted training images, can increase the discrimination capability and prevent the leak of personal identification information. After the PIN is recognized, speedy authentication of personal information can be achieved through one-to-one optical correlation by means of the optical wavelet filter. The possibility of information counterfeiting can be significantly decreased with the double-identification process. Simulation results demonstrate the effectiveness of the proposed technique.
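
    As a rough illustration of correlation-based recognition, the toy example below matches a probe against stored templates by locating the strongest correlation peak. It uses a plain matched filter over random stand-in templates, not the actual MMACE_p synthesis or phase encryption.

        # Toy correlation recognizer: the probe correlates most strongly with
        # its own (noisy) template, mimicking peak-based PIN digit recognition.
        import numpy as np

        rng = np.random.default_rng(0)
        refs = {d: rng.normal(size=(16, 16)) for d in "0123456789"}  # stand-in templates

        def correlate(a, b):
            fa, fb = np.fft.fft2(a), np.fft.fft2(b)
            return np.fft.ifft2(fa * np.conj(fb)).real

        def recognize(image):
            scores = {d: correlate(image, t).max() for d, t in refs.items()}
            return max(scores, key=scores.get)

        probe = refs["7"] + 0.2 * rng.normal(size=(16, 16))  # noisy view of "7"
        print(recognize(probe))                              # -> '7'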

  3. Sight-Reading Expertise: Cross-Modality Integration Investigated Using Eye Tracking

    ERIC Educational Resources Information Center

    Drai-Zerbib, Veronique; Baccino, Thierry; Bigand, Emmanuel

    2012-01-01

    It is often said that experienced musicians are capable of hearing what they read (and vice versa). This suggests that they are able to process and to integrate multimodal information. The present study investigates this issue with an eye-tracking technique. Two groups of musicians chosen on the basis of their level of expertise (experts,…

  4. Ideas Tried, Lessons Learned, and Improvements to Make: A Journey in Moving a Spreadsheet-Intensive Course Online

    ERIC Educational Resources Information Center

    Berardi, Victor L.

    2012-01-01

    Using information systems to solve business problems is increasingly required of everyone in an organization, not just technical specialists. In the operations management class, spreadsheet usage has intensified with the focus on building decision models to solve operations management concerns such as forecasting, process capability, and inventory…

  5. E-Learning and Higher Education: Understanding and Supporting Organisational Change in New Zealand

    ERIC Educational Resources Information Center

    Marshall, Stephen

    2012-01-01

    Over an 18-month period four New Zealand educational institutions--a university, a private tertiary enterprise, a wananga, and an institute of technology/polytechnic--have engaged in a process of change influenced by technology. Their e-learning capability was benchmarked using the E-Learning Maturity Model, and this information was used to…

  6. Diameter sensors for tree-length harvesting systems

    Treesearch

    T.P. McDonald; Robert B. Rummer; T.E. Grift

    2003-01-01

    Most cut-to-length (CTL) harvesters provide sensors for measuring diameter of trees as they are cut and processed. Among other uses, this capability provides a data collection tool for marketing of logs in real time. Logs can be sorted and stacked based on up-to-date market information, then transportation systems optimized to route wood to proper destinations at...

  7. DynAMo: A Modular Platform for Monitoring Process, Outcome, and Algorithm-Based Treatment Planning in Psychotherapy

    PubMed Central

    Laireiter, Anton Rupert

    2017-01-01

    Background In recent years, the assessment of mental disorders has become more and more personalized. Modern advancements such as Internet-enabled mobile phones and increased computing capacity make it possible to tap sources of information that have long been unavailable to mental health practitioners. Objective Software packages that combine algorithm-based treatment planning, process monitoring, and outcome monitoring are scarce. The objective of this study was to assess whether the DynAMo Web application can fill this gap by providing a software solution that can be used by both researchers to conduct state-of-the-art psychotherapy process research and clinicians to plan treatments and monitor psychotherapeutic processes. Methods In this paper, we report on the current state of a Web application that can be used for assessing the temporal structure of mental disorders using information on their temporal and synchronous associations. A treatment planning algorithm automatically interprets the data and delivers priority scores of symptoms to practitioners. The application is also capable of monitoring psychotherapeutic processes during therapy and of monitoring treatment outcomes. This application was developed using the R programming language (R Core Team, Vienna) and the Shiny Web application framework (RStudio, Inc, Boston). It is made entirely from open-source software packages and thus is easily extensible. Results The capabilities of the proposed application are demonstrated. Case illustrations are provided to exemplify its usefulness in clinical practice. Conclusions With the broad availability of Internet-enabled mobile phones and similar devices, collecting data on psychopathology and psychotherapeutic processes has become easier than ever. The proposed application is a valuable tool for capturing, processing, and visualizing these data. The combination of dynamic assessment and process- and outcome monitoring has the potential to improve the efficacy and effectiveness of psychotherapy. PMID:28729233

  8. Low NO sub x heavy fuel combustor concept program

    NASA Technical Reports Server (NTRS)

    Russell, P.; Beal, G.; Hinton, B.

    1981-01-01

    A gas turbine technology program to improve and optimize the staged rich/lean low NOx combustor concept is described. Subscale combustor tests were run to develop the design information needed to optimize the fuel preparation, rich burn, quick air quench, and lean burn steps of the combustion process. The program provides information for the design of high-pressure, full-scale gas turbine combustors capable of providing environmentally clean combustion of minimally processed and synthetic fuels. It is concluded that liquid fuel atomization and mixing, rich zone stoichiometry, rich zone liner cooling, rich zone residence time, and quench zone stoichiometry are important considerations in the design and scale-up of the rich/lean combustor.

  9. Mechanical logic switches based on DNA-inspired acoustic metamaterials with ultrabroad low-frequency band gaps

    NASA Astrophysics Data System (ADS)

    Zheng, Bowen; Xu, Jun

    2017-11-01

    Mechanical information processing and control have attracted great attention in recent years. A challenging pursuit is to achieve broad functioning frequency ranges, especially in the low-frequency domain. Here, we propose a design of mechanical logic switches based on DNA-inspired chiral acoustic metamaterials, which are capable of exhibiting ultrabroad band gaps in the low-frequency domain. Logic operations can be easily performed by applying constraints at different locations, and the functioning frequency ranges can be low, broad, and tunable. This work may have an impact on the development of mechanical information processing, programmable materials, stress wave manipulation, and the isolation of noise and harmful vibration.

  10. Tomography and Purification of the Temporal-Mode Structure of Quantum Light

    NASA Astrophysics Data System (ADS)

    Ansari, Vahid; Donohue, John M.; Allgaier, Markus; Sansoni, Linda; Brecht, Benjamin; Roslund, Jonathan; Treps, Nicolas; Harder, Georg; Silberhorn, Christine

    2018-05-01

    High-dimensional quantum information processing promises capabilities beyond the current state of the art, but addressing individual information-carrying modes presents a significant experimental challenge. Here we demonstrate effective high-dimensional operations in the time-frequency domain of nonclassical light. We generate heralded photons with tailored temporal-mode structures through the pulse shaping of a broadband parametric down-conversion pump. We then implement a quantum pulse gate, enabled by dispersion-engineered sum-frequency generation, to project onto programmable temporal modes, reconstructing the quantum state in seven dimensions. We also manipulate the time-frequency structure by selectively removing temporal modes, explicitly demonstrating the effectiveness of engineered nonlinear processes for the mode-selective manipulation of quantum states.

  11. Neural processing of gravity information

    NASA Technical Reports Server (NTRS)

    Schor, Robert H.

    1992-01-01

    The goal of this project was to use the linear acceleration capabilities of the NASA Vestibular Research Facility (VRF) at Ames Research Center to directly examine encoding of linear accelerations in the vestibular system of the cat. Most previous studies, including my own, have utilized tilt stimuli, which at very low frequencies (e.g., 'static tilt') can be considered a reasonably pure linear acceleration (e.g., 'down'); however, higher frequencies of tilt, necessary for understanding the dynamic processing of linear acceleration information, necessarily involves rotations which can stimulate the semicircular canals. The VRF, particularly the Long Linear Sled, has promise to provide controlled pure linear accelerations at a variety of stimulus frequencies, with no confounding angular motion.

  12. Basic multisensory functions can be acquired after congenital visual pattern deprivation in humans.

    PubMed

    Putzar, Lisa; Gondan, Matthias; Röder, Brigitte

    2012-01-01

    People treated for bilateral congenital cataracts offer a model to study the influence of visual deprivation in early infancy on visual and multisensory development. We investigated cross-modal integration capabilities in cataract patients using a simple detection task that provided redundant information to two different senses. In both patients and controls, redundancy gains were consistent with coactivation models, indicating an integrated processing of modality-specific information. This finding is in contrast with recent studies showing impaired higher-level multisensory interactions in cataract patients. The present results suggest that basic cross-modal integrative processes for simple short stimuli do not depend on visual and/or crossmodal input since birth.

  13. Military clouds: utilization of cloud computing systems at the battlefield

    NASA Astrophysics Data System (ADS)

    Süleyman, Sarıkürk; Volkan, Karaca; İbrahim, Kocaman; Ahmet, Şirzai

    2012-05-01

    Cloud computing is known as a novel information technology (IT) concept, which involves facilitated and rapid access to networks, servers, data storage media, applications, and services via the Internet with minimum hardware requirements. The use of information systems and technologies on the battlefield is not new. Information superiority is a force multiplier and is crucial to mission success. Recent advances in information systems and technologies provide new means for decision makers and users to gain information superiority. These developments in information technologies have led to a new term, known as network centric capability. Like network centric capable systems, cloud computing systems are operational today. In the near future, extensive use of military clouds on the battlefield is predicted. Integrating cloud computing logic into network centric applications will increase the flexibility, cost-effectiveness, efficiency, and accessibility of network-centric capabilities. In this paper, the cloud computing and network centric capability concepts are defined. Some commercial cloud computing products and applications are mentioned. Network centric capable applications are covered. Cloud computing supported battlefield applications are analyzed. The effects of cloud computing systems on network centric capability and on the information domain in future warfare are discussed. Battlefield opportunities and novelties which might be introduced to network centric capability by cloud computing systems are researched. The role of military clouds in future warfare is proposed in this paper. It was concluded that military clouds will be indispensable components of the future battlefield. Military clouds have the potential to improve network centric capabilities, increase situational awareness on the battlefield, and facilitate the attainment of information superiority.

  14. Utilization of extended bayesian networks in decision making under uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Eeckhout, Edward M; Leishman, Deborah A; Gibson, William L

    2009-01-01

    A Bayesian network tool (called IKE, for Integrated Knowledge Engine) has been developed to assess the probability of undesirable events. The tool allows indications and observables from sensors and/or intelligence to feed directly into hypotheses of interest, thus allowing one to quantify the probability and uncertainty of these events resulting from very disparate evidence. For example, the probability that a facility is processing nuclear fuel or assembling a weapon can be assessed by examining the processes required, establishing the observables that should be present, and then assembling information from intelligence, sensors, and other information sources related to the observables. IKE also has the capability to determine tasking plans, that is, to prioritize which observable should be collected next to most quickly ascertain the 'true' state and drive the probability toward 'zero' or 'one.' This optimization capability is called 'evidence marshaling.' One example to be discussed is a denied-facility monitoring situation; there is concern, due to some intelligence or other data, that certain processes are being executed at the site. We show how additional pieces of evidence then establish, with some degree of certainty, the likelihood of these processes as each piece of evidence is obtained. This example shows how both intelligence and sensor data can be incorporated into the analysis. A second example involves real-time perimeter security. For this demonstration we used seismic, acoustic, and optical sensors linked back to IKE. We show how these sensors identified and assessed the likelihood of 'intruder' versus friendly vehicles.
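
    A minimal sketch of the evidence-marshaling idea: Bayes-update a hypothesis as observables arrive, and task next the observable whose outcome is expected to push the posterior furthest from 0.5 toward 'zero' or 'one.' All numbers and observable names are illustrative assumptions, not IKE's actual models.

        # Evidence marshaling sketch: single-hypothesis Bayesian updating plus a
        # simple "which observable to collect next" score.
        def posterior(prior, lt, lf):
            # lt = P(observable seen | hypothesis true), lf = P(seen | false)
            num = lt * prior
            return num / (num + lf * (1.0 - prior))

        def expected_decisiveness(prior, lt, lf):
            # average distance of the posterior from 0.5 over both outcomes
            p_obs = lt * prior + lf * (1 - prior)
            p_seen = posterior(prior, lt, lf)
            p_not = posterior(prior, 1 - lt, 1 - lf)
            return p_obs * abs(p_seen - 0.5) + (1 - p_obs) * abs(p_not - 0.5)

        # observable -> (P(seen | true), P(seen | false)); values are invented
        observables = {"thermal plume": (0.8, 0.2), "truck traffic": (0.6, 0.4)}
        prior = 0.3
        best = max(observables, key=lambda o: expected_decisiveness(prior, *observables[o]))
        print("collect next:", best)                  # the more decisive observable
        prior = posterior(prior, *observables[best])  # update once evidence arrives
        print("updated probability:", round(prior, 3))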

  15. Algorithms exploiting ultrasonic sensors for subject classification

    NASA Astrophysics Data System (ADS)

    Desai, Sachi; Quoraishee, Shafik

    2009-09-01

    Proposed here is a series of techniques exploiting micro-Doppler ultrasonic sensors capable of characterizing detected mammalian targets based on their physiological movements, captured as a series of robust features. A combination of unique and conventional digital signal processing techniques is employed, arranged in such a manner that they become capable of classifying a series of walkers. These feature-extraction processes develop a robust feature space capable of discriminating the various movements generated by bipeds and quadrupeds, further subdivided into large or small. These movements can be exploited to provide specific information about a given signature, dividing it into a series of subset signatures, with wavelets used to generate start/stop times. A series of spectrograms of the signature shows distinct differences, and using kurtosis we generate an envelope detector capable of isolating each of the corresponding step cycles generated during a walk. A walk cycle is defined as one complete sequence of walking or running, from the foot pushing off the ground to its return to the ground. This timing information segments the events, readily seen in the spectrogram but obstructed in the temporal domain, into individual walk sequences. Each walking sequence is then translated into a three-dimensional waterfall plot defining the expected energy value associated with the motion at a particular instance of time and frequency. This value is repeatable for each particular class and can be employed to discriminate the events. Highly reliable classification is realized with a classifier trained on a candidate sample space derived from the gyrations created by the motion of actors of interest. The classifier developed herein provides a capability to classify events as adult humans, child humans, horses, and dogs at potentially high rates based on the tested sample space. The algorithm developed and described will provide utility to an underused sensor modality for human intrusion detection, which currently suffers from a high rate of false alarms. The active ultrasonic sensor, coupled in a multi-modal sensor suite with binary, less descriptive sensors such as seismic devices, realizes a greater detection accuracy for persons of interest for homeland security purposes.
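
    A minimal sketch of the kurtosis-based envelope idea described above: impulsive gait returns raise the kurtosis of a sliding window, and threshold crossings yield approximate start/stop times of individual step events. The window length, threshold, and synthetic signal are assumptions, not the paper's parameters.

        # Sliding-window kurtosis "envelope" for segmenting impulsive step events.
        import numpy as np
        from scipy.stats import kurtosis

        def kurtosis_envelope(signal, win=64):
            hop = win // 2
            return np.array([kurtosis(signal[i:i + win])
                             for i in range(0, len(signal) - win, hop)])

        def segment_steps(signal, win=64, thresh=1.0):
            hop = win // 2
            active = (kurtosis_envelope(signal, win) > thresh).astype(int)
            edges = np.flatnonzero(np.diff(active))
            # pair rising/falling edges into approximate (start, stop) samples
            return [(int(s) * hop, int(e) * hop) for s, e in zip(edges[::2], edges[1::2])]

        rng = np.random.default_rng(1)
        x = rng.normal(scale=0.1, size=4000)
        for c in (800, 1800, 2800):            # three impulsive "footfall" returns
            x[c:c + 6] += 3.0 * np.hanning(6)
        print(segment_steps(x))                # approximate start/stop of each step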

  16. Psychology of knowledge representation.

    PubMed

    Grimm, Lisa R

    2014-05-01

    Every cognitive enterprise involves some form of knowledge representation. Humans represent information about the external world and internal mental states, like beliefs and desires, and use this information to meet goals (e.g., classification or problem solving). Unfortunately, researchers do not have direct access to mental representations. Instead, cognitive scientists design experiments and implement computational models to develop theories about the mental representations present during task performance. There are several main types of mental representation and corresponding processes that have been posited: spatial, feature, network, and structured. Each type has a particular structure and a set of processes that are capable of accessing and manipulating information within the representation. The structure and processes determine what information can be used during task performance and what information has not been represented at all. As such, the different types of representation are likely used to solve different kinds of tasks. For example, structured representations are more complex and computationally demanding, but are good at representing relational information. Researchers interested in human psychology would benefit from considering how knowledge is represented in their domain of inquiry.

  17. Research Priorities in Environmental Risk Assessment. Workshop on Research Needs in Environmental Toxicology and Chemistry Held in Breckenridge, Colorado on August 16-21, 1987

    DTIC Science & Technology

    1988-06-30

    accordance with SETAC's goal of providing a forum for communication among professionals involved with the use, protection, and management of the...templated action. Risk assessment provides technical input to risk management, the process of making decisions about the acceptability of risks and the need... management and computerized information-processing capabilities needed for risk assessment is also essential. Aquatic Toxicology In order to quantify and

  18. Coherent perfect rotation

    NASA Astrophysics Data System (ADS)

    Crescimanno, Michael; Dawson, Nathan J.; Andrews, James H.

    2012-09-01

    Two classes of conservative, linear, optical rotary effects (optical activity and Faraday rotation) are distinguished by their behavior under time reversal. Faraday rotation, but not optical activity, is capable of coherent perfect rotation, by which we mean the complete transfer of counterpropagating coherent light fields into their orthogonal polarization. Unlike coherent perfect absorption, however, this process is explicitly energy conserving and reversible. Our study highlights the necessity of time-reversal-odd processes (not just absorption) and coherence in perfect mode conversion and thus informs the optimization of active multiport optical devices.

  19. Image Processing

    NASA Technical Reports Server (NTRS)

    1987-01-01

    A new spinoff product was derived from Geospectra Corporation's expertise in processing LANDSAT data in a software package. Called ATOM (for Automatic Topographic Mapping), it's capable of digitally extracting elevation information from stereo photos taken by spaceborne cameras. ATOM offers a new dimension of realism in applications involving terrain simulations, producing extremely precise maps of an area's elevations at a lower cost than traditional methods. ATOM has a number of applications involving defense training simulations and offers utility in architecture, urban planning, forestry, petroleum and mineral exploration.

  20. Electrical features of new DNC, CNC system viewed

    NASA Astrophysics Data System (ADS)

    Fritzsch, W.; Kochan, D.; Schaller, J.; Zander, H. J.

    1985-03-01

    Control structures capable of solving the problems of a flexible minimal-labor manufacturing process are analyzed. The present state of development of equipment technology is described, and possible ways of modeling control processes are surveyed. Concepts which are frequently interpreted differently in various specialized disciplines are systematized, with a view toward creating the prerequisites for interdisciplinary cooperation. Problems and information flow during the preparatory and performance phases of manufacturing are examined with respect to coupling CAD/CAM functions. Mathematical modeling for direct numerical control is explored.

  1. Information Technology and the Autonomous Control of a Mars In-Situ Propellant Production System

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R.; Sridhar, K. R.; Larson, William E.; Clancy, Daniel J.; Peschur, Charles; Briggs, Geoffrey A.; Zornetzer, Steven F. (Technical Monitor)

    1999-01-01

    With the rapidly increasing performance of information technology, i.e., computer hardware and software systems, as well as networks and communication systems, a new capability is being developed that holds the clear promise of greatly increased exploration capability, along with dramatically reduced design, development, and operating costs. These new intelligent systems technologies, utilizing knowledge-based software and very high performance computer systems, will provide new design and development tools, scheduling mechanisms, and vehicle and system health monitoring capabilities. In addition, specific technologies such as neural nets will provide a degree of machine intelligence and associated autonomy which has previously been unavailable to the mission and spacecraft designer and to the system operator. One of the most promising applications of these new information technologies is to the area of in situ resource utilization. Useful resources such as oxygen, compressed carbon dioxide, water, methane, and buffer gases can be extracted and/or generated from planetary atmospheres, such as the Martian atmosphere. These products, when used for propulsion and life-support needs, can provide significant savings in the launch mass and costs for both robotic and crewed missions. In the longer term, the utilization of indigenous resources is an enabling technology that is vital to sustaining a long-duration human presence on Mars. This paper will present the concepts that are currently under investigation and development for mining the Martian atmosphere, such as temperature-swing adsorption, zirconia electrolysis, etc., to create propellants and life-support materials. This description will be followed by an analysis of the information technology and control needs for the reliable and autonomous operation of such processing plants in a fault tolerant manner, as well as the approach being taken for the development of the controlling software. Finally, there will be a brief discussion of the verification and validation process so crucial to the implementation of mission-critical software.

  2. PANTHER. Pattern ANalytics To support High-performance Exploitation and Reasoning.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Czuchlewski, Kristina Rodriguez; Hart, William E.

    Sandia has approached the analysis of big datasets with an integrated methodology that uses computer science, image processing, and human factors to exploit critical patterns and relationships in large datasets despite the variety and rapidity of information. The work is part of a three-year LDRD Grand Challenge called PANTHER (Pattern ANalytics To support High-performance Exploitation and Reasoning). To maximize data analysis capability, Sandia pursued scientific advances across three key technical domains: (1) geospatial-temporal feature extraction via image segmentation and classification; (2) geospatial-temporal analysis capabilities tailored to identify and process new signatures more efficiently; and (3) domain-relevant models of human perception and cognition informing the design of analytic systems. Our integrated results include advances in geographical information systems (GIS) in which we discover activity patterns in noisy, spatial-temporal datasets using geospatial-temporal semantic graphs. We employed computational geometry and machine learning to allow us to extract and predict spatial-temporal patterns and outliers from large aircraft and maritime trajectory datasets. We automatically extracted static and ephemeral features from real, noisy synthetic aperture radar imagery for ingestion into a geospatial-temporal semantic graph. We worked with analysts and investigated analytic workflows to (1) determine how experiential knowledge evolves and is deployed in high-demand, high-throughput visual search workflows, and (2) better understand visual search performance and attention. Through PANTHER, Sandia's fundamental rethinking of key aspects of geospatial data analysis permits the extraction of much richer information from large amounts of data. The project results enable analysts to examine mountains of historical and current data that would otherwise go untouched, while also gaining meaningful, measurable, and defensible insights into overlooked relationships and patterns. The capability is directly relevant to the nation's nonproliferation remote-sensing activities and has broad national security applications for military and intelligence-gathering organizations.

  3. Lessons Learned from the Development of an Example Precision Information Environment for International Safeguards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gastelum, Zoe N.; Henry, Michael J.; Burtner, IV, E. R.

    The International Atomic Energy Agency (IAEA) is interested in increasing capabilities of IAEA safeguards inspectors to access information that would improve their situational awareness on the job. A mobile information platform could potentially provide access to information, analytics, and technical and logistical support to inspectors in the field, as well as providing regular updates to analysts at IAEA Headquarters in Vienna or at satellite offices. To demonstrate the potential capability of such a system, Pacific Northwest National Laboratory (PNNL) implemented a number of example capabilities within a PNNL-developed precision information environment (PIE), using a tablet as a mobile information platform. PNNL’s safeguards proof-of-concept PIE intends to: demonstrate novel applications of mobile information platforms to international safeguards use cases; demonstrate proof-of-principle capability implementation; and provide “vision” for capabilities that could be implemented. This report documents the lessons learned from this two-year development activity for the Precision Information Environment for International Safeguards (PIE-IS), describing the developed capabilities, technical challenges, and considerations for future development, so that developers building a similar system for the IAEA or other safeguards agencies might benefit from our work.

  4. Exploration Medical Capability - Technology Watch

    NASA Technical Reports Server (NTRS)

    Krihak, Michael; Watkins, Sharmila; Barr, Yael; Barsten, Kristina; Fung, Paul; Baumann, David

    2011-01-01

    The objectives of the Technology Watch process are to identify emerging, high-impact technologies that augment current ExMC development efforts, and to work with academia, industry, and other government agencies to accelerate the development of medical care and research capabilities for the mitigation of potential health issues that could occur during space exploration missions. The establishment of collaborations with these entities is beneficial to technology development, assessment and/or insertion. Such collaborations also further NASA's goal to provide a safe and healthy environment for human exploration. The Tech Watch project addresses requirements and capabilities identified by knowledge and technology gaps that are derived from a discrete set of medical conditions that are most likely to occur on exploration missions. These gaps are addressed through technology readiness level assessments, market surveys, collaborations and distributed innovation opportunities. Ultimately, these gaps need to be closed with respect to exploration missions, and may be achieved through technology development projects. Information management is a key aspect of this process, where Tech Watch related meetings, research articles, collaborations and partnerships are tracked by the HRP's Exploration Medical Capabilities (ExMC) Element. In 2011, ExMC will be introducing the Tech Watch external website and evidence wiki that will provide access to ExMC technology and knowledge gaps, technology needs and requirements documents.

  5. Detection of Subsurface Defects in Levees in Correlation to Weather Conditions Utilizing Ground Penetrating Radar

    NASA Astrophysics Data System (ADS)

    Martinez, I. A.; Eisenmann, D.

    2012-12-01

    Ground Penetrating Radar (GPR) has been used for many years in successful subsurface detection of conductive and non-conductive objects in all types of material, including different soils and concrete. Typical defect detection is based on subjective examination of processed scans, using data collection and analysis software to acquire and analyze the data, often requiring developed expertise or an awareness of how a GPR works while collecting data. Processing programs, such as GSSI's RADAN analysis software, are then used to validate the collected information. Iowa State University's Center for Nondestructive Evaluation (CNDE) has built a test site, resembling a typical levee used near rivers, which contains known sub-surface targets of varying size, depth, and conductivity. Scientists at CNDE have developed software with enhanced capabilities to decipher a hyperbola's magnitude and amplitude for GPR signal processing. With this capability, the signal processing and defect detection performance of GPR has the potential to be greatly improved. This study will examine the effects of test parameters, antenna frequency (400 MHz), data manipulation methods (including data filters and restricting the depth range the chosen antenna's signal can reach), and real-world conditions at this test site (such as varying weather conditions), with the goal of improving GPR test sensitivity for differing soil conditions.
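
    The hyperbolic signature that the CNDE software deciphers follows directly from two-way travel-time geometry, which also shows why moisture (and hence weather) matters: it changes the soil's wave velocity. A minimal sketch, assuming a homogeneous half-space and invented values:

```python
# Why a buried point target traces a hyperbola in a GPR B-scan: two-way
# travel time from antenna position x to a target at (x0, depth d) in a
# medium with wave velocity v is t(x) = (2 / v) * sqrt(d**2 + (x - x0)**2).
import math

def travel_time_ns(x_m, x0_m, depth_m, v_m_per_ns):
    """Two-way travel time in nanoseconds for an antenna at x_m."""
    return 2.0 * math.sqrt(depth_m**2 + (x_m - x0_m)**2) / v_m_per_ns

V_SOIL = 0.1   # m/ns, a typical damp-soil value; soil moisture shifts
               # this, which is why weather conditions affect the scan
apex = travel_time_ns(2.0, x0_m=2.0, depth_m=0.5, v_m_per_ns=V_SOIL)
flank = travel_time_ns(3.0, x0_m=2.0, depth_m=0.5, v_m_per_ns=V_SOIL)
print(apex, round(flank, 1))  # 10.0 ns at the apex, ~22.4 ns on the flank
```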

  6. Materials management information systems.

    PubMed

    1996-01-01

    The hospital materials management function--ensuring that goods and services get from a source to an end user--encompasses many areas of the hospital and can significantly affect hospital costs. Performing this function in a manner that will keep costs down and ensure adequate cash flow requires effective management of a large amount of information from a variety of sources. To effectively coordinate such information, most hospitals have implemented some form of materials management information system (MMIS). These systems can be used to automate or facilitate functions such as purchasing, accounting, inventory management, and patient supply charges. In this study, we evaluated seven MMISs from seven vendors, focusing on the functional capabilities of each system and the quality of the service and support provided by the vendor. This Evaluation is intended to (1) assist hospitals purchasing an MMIS by educating materials managers about the capabilities, benefits, and limitations of MMISs and (2) educate clinical engineers and information system managers about the scope of materials management within a healthcare facility. Because software products cannot be evaluated in the same manner as most devices typically included in Health Devices Evaluations, our standard Evaluation protocol was not applicable for this technology. Instead, we based our ratings on our observations (e.g., during site visits), interviews we conducted with current users of each system, and information provided by the vendor (e.g., in response to a request for information [RFI]). We divided the Evaluation into the following sections: Section 1. Responsibilities and Information Requirements of Materials Management: Provides an overview of typical materials management functions and describes the capabilities, benefits, and limitations of MMISs. Also includes the supplementary article, "Inventory Cost and Reimbursement Issues" and the glossary, "Materials Management Terminology." Section 2. The MMIS Selection Process: Outlines steps to follow and describes factors to consider when selecting an MMIS. Also includes our Materials Management Process Evaluation and Needs Assessment Worksheet (which is also available online through ECRInet(TM)) and a list of suggested interview questions to be used when gathering user experience information for systems under consideration. Section 3A. MMIS Vendor Profiles: Presents information for the evaluated systems in a standardized, easy-to-compare format. Profiles include an Executive Summary describing our findings, a discussion of user comments, a listing of MMIS specifications, and information on the vendor's business background. Section 3B. Discussion of Vendor Profile Conclusions and Ratings: Presents our ratings and summarizes our rationale for all evaluated systems. Also includes a blank Vendor Profile Template to be used when gathering information on other vendors and systems. We found that, in general, all of the evaluated systems are able to meet most of the functional needs of a materials management department. However, we did uncover significant differences in the quality of service and support provided by each vendor, and our ratings reflect these differences: we rated two of the systems Acceptable--Preferred and four of the systems Acceptable. We have not yet rated the seventh system because our user experience information may not reflect the vendor's new ownership and management. 
When this vendor provides the references we requested, we will interview users and supply a rating. We caution readers against basing purchasing decisions solely on our ratings. Each hospital must consider the unique needs of its users and its overall strategic plans--a process that can be aided by using our Process Evaluation and Needs Assessment Worksheet. Our conclusions can then be used to narrow down the number of vendors under consideration...

  7. Rare-earth-doped materials with application to optical signal processing, quantum information science, and medical imaging technology

    NASA Astrophysics Data System (ADS)

    Cone, R. L.; Thiel, C. W.; Sun, Y.; Böttger, Thomas; Macfarlane, R. M.

    2012-02-01

    Unique spectroscopic properties of isolated rare earth ions in solids offer optical linewidths rivaling those of trapped single atoms and enable a variety of recent applications. We design rare-earth-doped crystals, ceramics, and fibers with persistent or transient "spectral hole" recording properties for applications including high-bandwidth optical signal processing, where light and our solids replace the high-bandwidth portion of the electronics; quantum cryptography and information science, including the goal of storage and recall of single photons; and medical imaging technology for the 700-900 nm therapeutic window. Ease of optically manipulating rare-earth ions in solids enables capturing complex spectral information in 10^5 to 10^8 frequency bins. Combining spatial holography and spectral hole burning provides a capability for processing high-bandwidth RF and optical signals with sub-MHz spectral resolution and bandwidths of tens to hundreds of GHz for applications including range-Doppler radar and high-bandwidth RF spectral analysis. Simply stated, one can think of these crystals as holographic recording media capable of distinguishing up to 10^8 different colors. Ultra-narrow spectral holes also serve as a vibration-insensitive sub-kHz frequency reference for laser frequency stabilization to a part in 10^13 over tens of milliseconds. The unusual properties and applications of spectral hole burning of rare earth ions in optical materials are reviewed. Experimental results on the promising Tm3+:LiNbO3 material system are presented and discussed for medical imaging applications. Finally, a new application of these materials as dynamic optical filters for laser noise suppression is discussed, along with experimental demonstrations and theoretical modeling of the process.

  8. A Multi-mission Event-Driven Component-Based System for Support of Flight Software Development, ATLO, and Operations first used by the Mars Science Laboratory (MSL) Project

    NASA Technical Reports Server (NTRS)

    Dehghani, Navid; Tankenson, Michael

    2006-01-01

    This paper details an architectural description of the Mission Data Processing and Control System (MPCS), an event-driven, multi-mission set of ground data processing components providing uplink, downlink, and data management capabilities, with the Mars Science Laboratory (MSL) project as its first target mission. MPCS is developed based on a set of small reusable components, implemented in Java, each designed with a specific function and well-defined interfaces. An industry standard messaging bus is used to transfer information among system components. Components generate standard messages which are used to capture system information, as well as triggers to support the event-driven architecture of the system. Event-driven systems are highly desirable for processing high-rate telemetry (science and engineering) data, and for supporting automation for many mission operations processes.
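
    The event-driven pattern described here can be sketched compactly. MPCS itself uses Java components on an industry standard messaging bus; the Python sketch below, with hypothetical topic names, only illustrates how components react to published messages rather than polling for data.

```python
# Minimal publish/subscribe sketch of an event-driven component bus.
# Topic names and handlers are hypothetical; this is not MPCS code.
from collections import defaultdict

class MessageBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        # Deliver the message to every component subscribed to the topic.
        for handler in self._subscribers[topic]:
            handler(message)

bus = MessageBus()
# Downlink components react to telemetry events instead of polling.
bus.subscribe("telemetry.frame", lambda m: print("archive:", m))
bus.subscribe("telemetry.frame", lambda m: print("display:", m))
bus.publish("telemetry.frame", {"id": 42, "rate": "high"})
```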

  9. The UARS and open data system concept and analysis study. Executive summary

    NASA Technical Reports Server (NTRS)

    Mittal, M.; Nebb, J.; Woodward, H.

    1983-01-01

    Alternative concepts for a common design for the UARS and OPEN Central Data Handling Facility (CDHF) are offered. The designs are consistent with requirements shared by UARS and OPEN and the data storage and data processing demands of these missions. Because more detailed information is available for UARS, the design approach was to size the system and to select components for a UARS CDHF, but in a manner that does not optimize the CDHF at the expense of OPEN. Costs for alternative implementations of the UARS designs are presented showing that the system design does not restrict the implementation to a single manufacturer. Processing demands on the alternative UARS CDHF implementations are discussed. With this information at hand together with estimates for OPEN processing demands, it is shown that any shortfall in system capability for OPEN support can be remedied by either component upgrades or array processing attachments rather than a system redesign.

  10. Open Source Clinical NLP - More than Any Single System.

    PubMed

    Masanz, James; Pakhomov, Serguei V; Xu, Hua; Wu, Stephen T; Chute, Christopher G; Liu, Hongfang

    2014-01-01

    The number of Natural Language Processing (NLP) tools and systems for processing clinical free-text has grown as interest and processing capability have surged. Unfortunately any two systems typically cannot simply interoperate, even when both are built upon a framework designed to facilitate the creation of pluggable components. We present two ongoing activities promoting open source clinical NLP. The Open Health Natural Language Processing (OHNLP) Consortium was originally founded to foster a collaborative community around clinical NLP, releasing UIMA-based open source software. OHNLP's mission currently includes maintaining a catalog of clinical NLP software and providing interfaces to simplify the interaction of NLP systems. Meanwhile, Apache cTAKES aims to integrate best-of-breed annotators, providing a world-class NLP system for accessing clinical information within free-text. These two activities are complementary. OHNLP promotes open source clinical NLP activities in the research community and Apache cTAKES bridges research to the health information technology (HIT) practice.

  11. An eHealth Capabilities Framework for Graduates and Health Professionals: Mixed-Methods Study.

    PubMed

    Brunner, Melissa; McGregor, Deborah; Keep, Melanie; Janssen, Anna; Spallek, Heiko; Quinn, Deleana; Jones, Aaron; Tseris, Emma; Yeung, Wilson; Togher, Leanne; Solman, Annette; Shaw, Tim

    2018-05-15

    The demand for an eHealth-ready and adaptable workforce is placing increasing pressure on universities to deliver eHealth education. At present, eHealth education is largely focused on components of eHealth rather than considering a curriculum-wide approach. This study aimed to develop a framework that could be used to guide health curriculum design based on current evidence, and stakeholder perceptions of eHealth capabilities expected of tertiary health graduates. A 3-phase, mixed-methods approach incorporated the results of a literature review, focus groups, and a Delphi process to develop a framework of eHealth capability statements. Participants (N=39) with expertise or experience in eHealth education, practice, or policy provided feedback on the proposed framework, and following the fourth iteration of this process, consensus was achieved. The final framework consisted of 4 higher-level capability statements that describe the learning outcomes expected of university graduates across the domains of (1) digital health technologies, systems, and policies; (2) clinical practice; (3) data analysis and knowledge creation; and (4) technology implementation and codesign. Across the capability statements are 40 performance cues that provide examples of how these capabilities might be demonstrated. The results of this study inform a cross-faculty eHealth curriculum that aligns with workforce expectations. There is a need for educational curriculum to reinforce existing eHealth capabilities, adapt existing capabilities to make them transferable to novel eHealth contexts, and introduce new learning opportunities for interactions with technologies within education and practice encounters. As such, the capability framework developed may assist in the application of eHealth by emerging and existing health care professionals. Future research needs to explore the potential for integration of findings into workforce development programs.

  12. Information fusion: telling the story (or threat narrative)

    NASA Astrophysics Data System (ADS)

    Fenstermacher, Laurie

    2014-06-01

    Today's operators face a "double whammy" - the need to process increasing amounts of information, including "Twitter-INT" (social information such as Facebook, YouTube videos, blogs, Twitter) as well as the need to discern threat signatures in new security environments, including those in which the airspace is contested. To do this will require the Air Force to "fuse and leverage its vast capabilities in new ways." For starters, the integration of quantitative and qualitative information must be done in a way that preserves important contextual information, since the goal increasingly is to identify and mitigate violence before it occurs. To do so requires a more nuanced understanding of the environment being sensed, including the human environment, ideally from the "emic" perspective; that is, from the perspective of that individual or group. This requires not only data and information that informs the understanding of how the individuals and/or groups see themselves and others (social identity) but also information on how that identity filters information in their environment which, in turn, shapes their behaviors. The goal is to piece together the individual and/or collective narratives regarding threat, the threat narrative, from various sources of information. Is there a threat? If so, what is it? What is motivating the threat? What is the intent of those who pose the threat, and what are their capabilities and their vulnerabilities? This paper will describe preliminary investigations regarding the application of a prototype hybrid information fusion method based on the threat narrative framework.

  13. A computer program for uncertainty analysis integrating regression and Bayesian methods

    USGS Publications Warehouse

    Lu, Dan; Ye, Ming; Hill, Mary C.; Poeter, Eileen P.; Curtis, Gary

    2014-01-01

    This work develops a new functionality in UCODE_2014 to evaluate Bayesian credible intervals using the Markov Chain Monte Carlo (MCMC) method. The MCMC capability in UCODE_2014 is based on the FORTRAN version of the differential evolution adaptive Metropolis (DREAM) algorithm of Vrugt et al. (2009), which estimates the posterior probability density function of model parameters in high-dimensional and multimodal sampling problems. The UCODE MCMC capability provides eleven prior probability distributions and three ways to initialize the sampling process. It evaluates parametric and predictive uncertainties and it has parallel computing capability based on multiple chains to accelerate the sampling process. This paper tests and demonstrates the MCMC capability using a 10-dimensional multimodal mathematical function, a 100-dimensional Gaussian function, and a groundwater reactive transport model. The use of the MCMC capability is made straightforward and flexible by adopting the JUPITER API protocol. With the new MCMC capability, UCODE_2014 can be used to calculate three types of uncertainty intervals, which all can account for prior information: (1) linear confidence intervals which require linearity and Gaussian error assumptions and typically 10s–100s of highly parallelizable model runs after optimization, (2) nonlinear confidence intervals which require a smooth objective function surface and Gaussian observation error assumptions and typically 100s–1,000s of partially parallelizable model runs after optimization, and (3) MCMC Bayesian credible intervals which require few assumptions and commonly 10,000s–100,000s or more partially parallelizable model runs. Ready access allows users to select methods best suited to their work, and to compare methods in many circumstances.
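
    The contrast between interval types becomes clearer with a toy example. UCODE_2014 relies on the multi-chain DREAM algorithm; the sketch below substitutes a deliberately simplified single-chain random-walk Metropolis sampler (with invented data) to show how a Bayesian credible interval is read off the posterior samples.

```python
# Simplified single-chain random-walk Metropolis sketch of how MCMC yields
# Bayesian credible intervals. UCODE_2014 uses the multi-chain DREAM
# algorithm; this toy version only illustrates the idea.
import math
import random

def log_posterior(theta, data, sigma=1.0):
    # Gaussian likelihood for a mean parameter, flat prior.
    return -sum((x - theta) ** 2 for x in data) / (2 * sigma ** 2)

def metropolis(data, n_samples=20000, step=0.5, theta0=0.0, seed=1):
    random.seed(seed)
    theta, lp = theta0, log_posterior(theta0, data)
    samples = []
    for _ in range(n_samples):
        prop = theta + random.gauss(0.0, step)
        lp_prop = log_posterior(prop, data)
        if math.log(random.random()) < lp_prop - lp:  # accept/reject
            theta, lp = prop, lp_prop
        samples.append(theta)
    return samples[n_samples // 2:]  # discard burn-in

data = [1.8, 2.1, 2.4, 1.9, 2.2]
draws = sorted(metropolis(data))
lo = draws[int(0.025 * len(draws))]
hi = draws[int(0.975 * len(draws))]
print(f"95% credible interval for the mean: ({lo:.2f}, {hi:.2f})")
```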

  14. The Defense Threat Reduction Agency's Technical Nuclear Forensics Research and Development Program

    NASA Astrophysics Data System (ADS)

    Franks, J.

    2015-12-01

    The Defense Threat Reduction Agency (DTRA) Technical Nuclear Forensics (TNF) Research and Development (R&D) Program's overarching goal is to design, develop, demonstrate, and transition advanced technologies and methodologies that improve the interagency operational capability to provide forensics conclusions after the detonation of a nuclear device. This goal is attained through the execution of three focus areas covering the span of the TNF process to enable strategic decision-making (attribution): (1) Nuclear Forensic Materials Exploitation - development of targeted technologies, methodologies and tools enabling the timely collection, analysis and interpretation of detonation materials; (2) Prompt Nuclear Effects Exploitation - improved ground-based capabilities to collect prompt nuclear device outputs and effects data for rapid, complementary and corroborative information; and (3) Nuclear Forensics Device Characterization - development of a validated and verified capability to reverse model a nuclear device with high confidence from observables (e.g., prompt diagnostics, sample analysis, etc.) seen after an attack. This presentation will outline DTRA's TNF R&D strategy and current investments, with efforts focusing on: (1) introducing new technical data collection capabilities (e.g., ground-based prompt diagnostics sensor systems; innovative debris collection and analysis); (2) developing new TNF process paradigms and concepts of operations to decrease timelines and uncertainties, and increase results confidence; (3) enhanced validation and verification (V&V) of capabilities through technology evaluations and demonstrations; and (4) updated weapon output predictions to account for the modern threat environment. A key challenge to expanding these efforts to a global capability is the need for increased post-detonation TNF international cooperation, collaboration and peer reviews.

  15. Information revolution in nursing and health care: educating for tomorrow's challenge.

    PubMed

    Kooker, B M; Richardson, S S

    1994-06-01

    Current emphasis on the national electronic highway and a national health database for comparative health care reporting demonstrates society's increasing reliance on information technology. The efficient electronic processing and managing of data, information, and knowledge are critical for survival in tomorrow's health care organization. To take a leadership role in this information revolution, informatics nurse specialists must possess competencies that incorporate information science, computer science, and nursing science for successful information system development. In selecting an appropriate informatics educational program or to hire an individual capable of meeting this challenge, nurse administrators must look for the following technical knowledge and skill set: information management principles, system development life cycle, programming languages, file design and access, hardware and network architecture, project management skills, and leadership abilities.

  16. Convergence in full motion video processing, exploitation, and dissemination and activity based intelligence

    NASA Astrophysics Data System (ADS)

    Phipps, Marja; Lewis, Gina

    2012-06-01

    Over the last decade, intelligence capabilities within the Department of Defense/Intelligence Community (DoD/IC) have evolved from ad hoc, single source, just-in-time, analog processing; to multi source, digitally integrated, real-time analytics; to multi-INT, predictive Processing, Exploitation and Dissemination (PED). Full Motion Video (FMV) technology and motion imagery tradecraft advancements have greatly contributed to Intelligence, Surveillance and Reconnaissance (ISR) capabilities during this timeframe. Imagery analysts have exploited events, missions and high value targets, generating and disseminating critical intelligence reports within seconds of occurrence across operationally significant PED cells. Now, we go beyond FMV, enabling All-Source Analysts to effectively deliver ISR information in a multi-INT sensor rich environment. In this paper, we explore the operational benefits and technical challenges of an Activity Based Intelligence (ABI) approach to FMV PED. Existing and emerging ABI features within FMV PED frameworks are discussed, to include refined motion imagery tools, additional intelligence sources, activity relevant content management techniques and automated analytics.

  17. Expanding NASA's Land, Atmosphere Near real-time Capability for EOS

    NASA Astrophysics Data System (ADS)

    Davies, D.; Michael, K.; Masuoka, E.; Ye, G.; Schmaltz, J. E.; Harrison, S.; Ziskin, D.; Durbin, P. B.; Protack, S.; Rinsland, P. L.; Slayback, D. A.; Policelli, F. S.; Olsina, O.; Fu, G.; Ederer, G. A.; Ding, F.; Braun, J.; Gumley, L.; Prins, E. M.; Davidson, C. C.; Wong, M. M.

    2017-12-01

    NASA's Land, Atmosphere Near real-time Capability for EOS (LANCE) is a virtual system that provides near real-time EOS data and imagery to meet the needs of scientists and application users interested in monitoring a wide variety of natural and man-made phenomena in near real-time. Over the last year: near real-time products and imagery from MOPITT, MISR, OMPS and VIIRS (Land and Atmosphere) have been added; the Fire Information for Resource Management System (FIRMS) has been updated and LANCE has begun the process of integrating the Global NRT flood product. In addition, following the AMSU-A2 instrument anomaly in September 2016, AIRS-only products have replaced the NRT level 2 AIRS+AMSU products. This presentation provides a brief overview of LANCE, describes the new products that are recently available and contains a preview of what to expect in LANCE over the coming year. For more information visit: https://earthdata.nasa.gov/lance

  18. The electrophotonic silicon biosensor

    NASA Astrophysics Data System (ADS)

    Juan-Colás, José; Parkin, Alison; Dunn, Katherine E.; Scullion, Mark G.; Krauss, Thomas F.; Johnson, Steven D.

    2016-09-01

    The emergence of personalized and stratified medicine requires label-free, low-cost diagnostic technology capable of monitoring multiple disease biomarkers in parallel. Silicon photonic biosensors combine high-sensitivity analysis with scalable, low-cost manufacturing, but they tend to measure only a single biomarker and provide no information about their (bio)chemical activity. Here we introduce an electrochemical silicon photonic sensor capable of highly sensitive and multiparameter profiling of biomarkers. Our electrophotonic technology consists of microring resonators optimally n-doped to support high Q resonances alongside electrochemical processes in situ. The inclusion of electrochemical control enables site-selective immobilization of different biomolecules on individual microrings within a sensor array. The combination of photonic and electrochemical characterization also provides additional quantitative information and unique insight into chemical reactivity that is unavailable with photonic detection alone. By exploiting both the photonic and the electrical properties of silicon, the sensor opens new modalities for sensing on the microscale.

  19. Coal conversion processes and analysis methodologies for synthetic fuels production. [technology assessment and economic analysis of reactor design for coal gasification

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Information to identify viable coal gasification and utilization technologies is presented. Analysis capabilities required to support design and implementation of coal based synthetic fuels complexes are identified. The potential market in the Southeast United States for coal based synthetic fuels is investigated. A requirements analysis to identify the types of modeling and analysis capabilities required to conduct and monitor coal gasification project designs is discussed. Models and methodologies to satisfy these requirements are identified and evaluated, and recommendations are developed. Requirements for development of technology and data needed to improve gasification feasibility and economies are examined.

  20. Visual pattern image sequence coding

    NASA Technical Reports Server (NTRS)

    Silsbee, Peter; Bovik, Alan C.; Chen, Dapang

    1990-01-01

    The visual pattern image coding (VPIC) configurable digital image-coding process is capable of coding with visual fidelity comparable to the best available techniques, at compression ratios (30-40:1) that exceed all other technologies. These capabilities are associated with unprecedented coding efficiencies; coding and decoding operations are entirely linear with respect to image size and entail a complexity that is 1-2 orders of magnitude faster than any previous high-compression technique. The visual pattern image sequence coding considered here extends the advantages of static VPIC to the reduction of information along an additional, temporal dimension, achieving unprecedented image sequence coding performance.
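
    The flavor of pattern-based block coding can be sketched briefly. This is not the published VPIC algorithm, only an invented miniature with a two-pattern codebook: visually uniform blocks are coded as a single mean intensity, and edge blocks as a pattern index plus two intensities.

```python
# Toy illustration of the block-pattern principle behind VPIC-style coding
# (not the published algorithm): each 4x4 block is coded either as a mean
# intensity (uniform block) or as the index of the best-matching binary
# edge pattern plus two intensities.
import numpy as np

PATTERNS = np.array([                       # tiny hypothetical codebook
    [[0]*4]*2 + [[1]*4]*2,                  # horizontal edge
    [[0, 0, 1, 1]]*4,                       # vertical edge
], dtype=float)

def code_block(block, edge_threshold=10.0):
    mean = block.mean()
    if block.std() < edge_threshold:        # visually uniform: 1 value
        return ("uniform", mean)
    binary = (block > mean).astype(float)   # which pattern fits best?
    errs = [np.abs(binary - p).sum() for p in PATTERNS]
    k = int(np.argmin(errs))
    lo = block[PATTERNS[k] == 0].mean()
    hi = block[PATTERNS[k] == 1].mean()
    return ("pattern", k, lo, hi)

block = np.array([[10]*4, [12]*4, [80]*4, [82]*4], dtype=float)
print(code_block(block))  # ('pattern', 0, 11.0, 81.0)
```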

  1. Building net-centric data strategies in support of a transformational MIW capability

    NASA Astrophysics Data System (ADS)

    Cramer, M. A.; Stack, J.

    2010-04-01

    The Mine Warfare (MIW) Community of Interest (COI) was established to develop data strategies in support of a future information-based architecture for naval MIW. As these strategies are developed and deployed, the ability of these data-focused efforts to enable technology insertion is becoming increasingly evident. This paper explores and provides concrete examples of the ways in which these data strategies support the technology insertion process for software-based systems and ultimately contribute to the establishment of an Open Business Model virtual environment. It is through the creation of such a collaborative research platform that a truly transformational MIW capability can be realized.

  2. NASA Biomedical Informatics Capabilities and Needs

    NASA Technical Reports Server (NTRS)

    Johnson-Throop, Kathy A.

    2009-01-01

    To improve on-orbit clinical capabilities by developing and providing operational support for intelligent, robust, reliable, and secure, enterprise-wide and comprehensive health care and biomedical informatics systems with increasing levels of autonomy, for use on Earth, low Earth orbit & exploration class missions. Biomedical Informatics is an emerging discipline that has been defined as the study, invention, and implementation of structures and algorithms to improve communication, understanding and management of medical information. The end objective of biomedical informatics is the coalescing of data, knowledge, and the tools necessary to apply that data and knowledge in the decision-making process, at the time and place that a decision needs to be made.

  3. Elliptical orbit performance computer program

    NASA Technical Reports Server (NTRS)

    Myler, T. R.

    1981-01-01

    A FORTRAN coded computer program which generates and plots elliptical orbit performance capability of space boosters for presentation purposes is described. Orbital performance capability of space boosters is typically presented as payload weight as a function of perigee and apogee altitudes. The parameters are derived from a parametric computer simulation of the booster flight which yields the payload weight as a function of velocity and altitude at insertion. The process of converting from velocity and altitude to apogee and perigee altitude and plotting the results as a function of payload weight is mechanized with the ELOPE program. The program theory, user instruction, input/output definitions, subroutine descriptions and detailed FORTRAN coding information are included.
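
    The conversion ELOPE mechanizes rests on standard two-body relations. A minimal sketch, assuming injection with zero flight-path angle (velocity horizontal at insertion); the program's actual formulation is not given in the record:

```python
# Two-body conversion from insertion state to apogee/perigee altitudes,
# assuming zero flight-path angle at injection. Not ELOPE's internals.
import math

MU = 398600.4418   # Earth GM, km^3/s^2
RE = 6378.137      # Earth equatorial radius, km

def apsides_from_insertion(alt_km, v_kms):
    r = RE + alt_km
    a = 1.0 / (2.0 / r - v_kms**2 / MU)       # vis-viva: semi-major axis
    h = r * v_kms                              # specific angular momentum
    e = math.sqrt(max(0.0, 1.0 - h**2 / (MU * a)))
    r_per, r_apo = a * (1 - e), a * (1 + e)
    return r_per - RE, r_apo - RE              # perigee, apogee altitudes

# Example: 7.9 km/s horizontal at 200 km altitude.
perigee, apogee = apsides_from_insertion(200.0, 7.9)
print(round(perigee, 1), round(apogee, 1))  # ~200 km perigee, ~600 km apogee
```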

  4. Process-based tolerance assessment of connecting rod machining process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa; Surendra Babu, B.

    2016-06-01

    Process tolerancing based on process capability studies is an optimistic and pragmatic approach to determining manufacturing process tolerances. On adopting the define-measure-analyze-improve-control approach, the process potential capability index (Cp) and the process performance capability index (Cpk) values of identified process characteristics of the connecting rod machining process are shown to exceed the industry benchmark of 1.33, i.e., the four sigma level. The tolerance chain diagram methodology is applied to the connecting rod in order to verify the manufacturing process tolerances at various operations of the connecting rod manufacturing process. This paper bridges the gap between the existing dimensional tolerances obtained via tolerance charting and process capability studies of the connecting rod component. Finally, a process tolerancing comparison is performed using tolerance capability expert software.
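
    The two indices have standard definitions, so the step from sample measurements to capability values is easy to sketch (the specification limits and measurements below are hypothetical): Cp compares the tolerance band to the process spread, while Cpk additionally penalizes an off-center mean.

```python
# Standard process capability indices named in the record, computed from
# a sample of a machined dimension. Spec limits and data are hypothetical.
import statistics

def capability_indices(samples, lsl, usl):
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)            # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)               # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # accounts for centering
    return cp, cpk

# Connecting-rod bore diameter (mm), nominal 50.00 +/- 0.05.
bores = [50.004, 49.998, 50.011, 49.992, 50.006, 50.001, 49.995, 50.008]
cp, cpk = capability_indices(bores, lsl=49.95, usl=50.05)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")  # > 1.33 means beyond four sigma
```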

  5. Overview of the NASA Wallops Flight Facility Mobile Range Control System

    NASA Technical Reports Server (NTRS)

    Davis, Rodney A.; Semancik, Susan K.; Smith, Donna C.; Stancil, Robert K.

    1999-01-01

    The NASA GSFC's Wallops Flight Facility (WFF) Mobile Range Control System (MRCS) is based on the functionality of the WFF Range Control Center at Wallops Island, Virginia. The MRCS provides real time instantaneous impact predictions, real time flight performance data, and other critical information needed by mission and range safety personnel in support of range operations at remote launch sites. The MRCS integrates a PC telemetry processing system (TELPro), a PC radar processing system (PCDQS), multiple Silicon Graphics display workstations (IRIS), and communication links within a mobile van for worldwide support of orbital, suborbital, and aircraft missions. This paper describes the MRCS configuration; the TELPro's capability to provide single/dual telemetry tracking and vehicle state data processing; the PCDQS' capability to provide real time positional data and instantaneous impact prediction for up to 8 data sources; and the IRIS' user interface for setup/display options. With portability, PC-based data processing, high resolution graphics, and flexible multiple source support, the MRCS system is proving to be responsive to the ever-changing needs of a variety of increasingly complex missions.
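
    Of the quantities the MRCS computes, the instantaneous impact prediction is the most safety-critical: if propulsion ended now, where would the vehicle land? The sketch below is a deliberately simplified drag-free, flat-Earth version with invented numbers; operational impact predictions also account for drag, Earth curvature, and rotation.

```python
# Simplified no-drag instantaneous impact prediction (IIP) sketch, the
# kind of quantity the MRCS displays for range safety. Flat-Earth and
# constant-gravity assumptions; values are hypothetical.
import math

G = 9.81  # m/s^2

def impact_point(x, alt, vx, vz):
    """Downrange impact position (m) for state (x, alt), velocity (vx, vz)."""
    # Solve alt + vz*t - 0.5*G*t**2 = 0 for the positive root.
    t = (vz + math.sqrt(vz**2 + 2 * G * alt)) / G
    return x + vx * t

print(round(impact_point(x=1000.0, alt=5000.0, vx=300.0, vz=200.0)))
# ~18480 m downrange after a ~58 s ballistic fall
```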

  6. Functional Fault Model Development Process to Support Design Analysis and Operational Assessment

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.

    2016-01-01

    A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.
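
    The core abstraction is small enough to sketch: a directed graph from failure modes to effects, with a qualitative propagation that reports which instrumented points would observe a given failure. The node names below are hypothetical; this is not NASA's FFM tooling.

```python
# Sketch of the core FFM idea: failure effects propagate along directed
# paths from a failure mode's origin to observation points.
from collections import deque

# edges: failure mode or effect -> downstream effects (hypothetical)
GRAPH = {
    "valve_stuck":   ["low_fuel_flow"],
    "low_fuel_flow": ["chamber_pressure_drop"],
    "chamber_pressure_drop": ["pc_sensor_low", "thrust_shortfall"],
}
OBSERVABLE = {"pc_sensor_low"}  # points instrumented with sensors

def observed_effects(failure_mode):
    """Qualitatively simulate propagation; return observable symptoms."""
    seen, queue, symptoms = set(), deque([failure_mode]), []
    while queue:
        node = queue.popleft()
        if node in seen:
            continue
        seen.add(node)
        if node in OBSERVABLE:
            symptoms.append(node)
        queue.extend(GRAPH.get(node, []))
    return symptoms

print(observed_effects("valve_stuck"))  # ['pc_sensor_low']
```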

  7. Spectral OCT with speckle contrast reduction for evaluation of the healing process after PRK and transepithelial PRK.

    PubMed

    Kaluzny, Bartlomiej J; Szkulmowski, Maciej; Bukowska, Danuta M; Wojtkowski, Maciej

    2014-04-01

    We evaluate Spectral OCT (SOCT) with a speckle contrast reduction technique using a resonant scanner for assessment of corneal surface changes after excimer laser photorefractive keratectomy (PRK), and we compare the healing process between conventional PRK and transepithelial PRK (TransPRK). The measurements were performed before and after the surgery. The results obtained show that SOCT with resonant-scanner speckle contrast reduction is capable of providing information regarding the healing process after PRK. The main difference between the healing processes of PRK and TransPRK, as assessed by SOCT, was the time to cover the stroma with epithelium, which was shorter in the TransPRK group.

  8. Parameterization of spectra

    NASA Technical Reports Server (NTRS)

    Cornish, C. R.

    1983-01-01

    Following reception and analog-to-digital (A/D) conversion, atmospheric radar backscatter echoes need to be processed so as to obtain desired information about atmospheric processes and to eliminate or minimize contaminating contributions from other sources. Various signal processing techniques have been implemented at mesosphere-stratosphere-troposphere (MST) radar facilities to estimate parameters of interest from received spectra. Such estimation techniques need to be both accurate and sufficiently efficient to be within the capabilities of the particular data-processing system. The various techniques used to parameterize the spectra of received signals are reviewed herein. Noise estimation, electromagnetic interference, data smoothing, correlation, and the Doppler effect are among the specific points addressed.
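
    One widely used parameterization of a radar Doppler spectrum is its low-order moments after noise subtraction: echo power (0th moment), mean Doppler velocity (1st), and spectral width (2nd). A minimal sketch with a synthetic spectrum:

```python
# Spectral-moment parameterization of a Doppler spectrum: subtract a
# noise floor, then take moments for echo power, mean velocity, and
# width. The spectrum and noise estimate here are synthetic.
import numpy as np

def spectral_moments(velocities, power, noise_level):
    s = np.clip(power - noise_level, 0.0, None)  # noise-subtracted signal
    p0 = s.sum()                                  # total echo power
    v_mean = (velocities * s).sum() / p0          # mean Doppler velocity
    width = np.sqrt(((velocities - v_mean) ** 2 * s).sum() / p0)
    return p0, v_mean, width

v = np.linspace(-10, 10, 128)                     # velocity bins, m/s
spec = 5.0 * np.exp(-0.5 * ((v - 2.0) / 1.5) ** 2) + 0.2  # echo + noise
print(spectral_moments(v, spec, noise_level=0.2))
# approx (total power, 2.0 m/s mean velocity, 1.5 m/s width)
```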

  9. Bidding-based autonomous process planning and scheduling

    NASA Astrophysics Data System (ADS)

    Gu, Peihua; Balasubramanian, Sivaram; Norrie, Douglas H.

    1995-08-01

    Improving productivity through computer integrated manufacturing systems (CIMS) and concurrent engineering requires that the islands of automation in an enterprise be completely integrated. The first step in this direction is to integrate design, process planning, and scheduling. This can be achieved through a bidding-based process planning approach. The product is represented in a STEP model with detailed design and administrative information, including design specifications, batch size, and due dates. Upon arrival at the manufacturing facility, the product is registered with the shop floor manager, which is essentially a coordinating agent. The shop floor manager broadcasts the product's requirements to the machines. The shop contains autonomous machines that have knowledge about their functionality, capabilities, tooling, and schedule. Each machine has its own process planner and responds to the product's request in a way consistent with its capabilities and capacities. When more than one machine offers certain process(es) for the same requirements, they enter into negotiation. Based on processing time, due date, and cost, one of the machines wins the contract. The successful machine updates its schedule and advises the product to request raw material for processing. The concept was implemented using a multi-agent system, with task decomposition and planning achieved through contract nets. Examples are included to illustrate the approach.
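
    The negotiation the paper describes maps naturally onto a contract-net sketch: the shop floor manager broadcasts a requirement, each machine that can offer the process returns a bid, and an award rule picks the winner. The machine data and the specific award rule below are hypothetical simplifications.

```python
# Contract-net bidding sketch: broadcast a requirement, collect bids from
# capable machines, award by an agreed rule. Data are hypothetical.
class Machine:
    def __init__(self, name, processes, hourly_cost, backlog_h):
        self.name, self.processes = name, processes
        self.hourly_cost, self.backlog_h = hourly_cost, backlog_h

    def bid(self, process, hours, due_h):
        if process not in self.processes:
            return None                      # cannot offer this process
        finish = self.backlog_h + hours
        if finish > due_h:
            return None                      # would miss the due date
        return {"machine": self.name, "finish_h": finish,
                "cost": hours * self.hourly_cost}

def award(machines, process, hours, due_h):
    bids = [b for m in machines if (b := m.bid(process, hours, due_h))]
    # Negotiation rule (illustrative): earliest finish, then lowest cost.
    return min(bids, key=lambda b: (b["finish_h"], b["cost"]), default=None)

shop = [Machine("mill_1", {"milling", "drilling"}, 60, 5),
        Machine("mill_2", {"milling"}, 45, 12)]
print(award(shop, "milling", hours=3, due_h=10))
# {'machine': 'mill_1', 'finish_h': 8, 'cost': 180}
```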

  10. Supporting virtual enterprise design by a web-based information model

    NASA Astrophysics Data System (ADS)

    Li, Dong; Barn, Balbir; McKay, Alison; de Pennington, Alan

    2001-10-01

    Development of IT and its applications has led to significant changes in business processes. To pursue agility, flexibility, and best service to customers, enterprises focus on their core competence and dynamically build relationships with partners to form virtual enterprises as customer-driven temporary demand chains/networks. Building the networked enterprise requires responsive, interactive decisions rather than a single-direction partner selection process. Benefits and risks in the combination should be systematically analysed, and aggregated information about the value-adding abilities and risks of networks needs to be derived from the interactions of all partners. In this research, a hierarchical information model to assess partnerships for designing virtual enterprises was developed. Internet techniques have been applied to the evaluation process so that interactive decisions can be visualised and made responsively during the design process. The assessment is based on a process that allows each partner to respond to the requirements of the virtual enterprise by planning its operational process as a bidder. The assessment is then produced by computing an aggregated value representing the prospects of the partner combination under the current bidding. The final design is the combination of partners with the greatest total value-adding capability and the lowest risk.
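
    The aggregation step can be illustrated with a small sketch. The weights, scores, and the rule that a network inherits the risk of its weakest link are invented stand-ins for the paper's hierarchical model; the point is only how per-partner assessments combine into a comparable network value.

```python
# Aggregating per-partner value and risk into a comparable network score.
# Weights, scores, and the aggregation rule are hypothetical.
def network_score(partners, w_value=0.7, w_risk=0.3):
    """Weighted value minus risk for a candidate partner combination."""
    value = sum(p["value"] for p in partners) / len(partners)
    risk = max(p["risk"] for p in partners)  # chain as risky as worst link
    return w_value * value - w_risk * risk

combo_a = [{"name": "maker",  "value": 0.9, "risk": 0.4},
           {"name": "carrier", "value": 0.7, "risk": 0.2}]
combo_b = [{"name": "maker2", "value": 0.8, "risk": 0.1},
           {"name": "carrier", "value": 0.7, "risk": 0.2}]

best = max([combo_a, combo_b], key=network_score)
print([p["name"] for p in best])  # ['maker2', 'carrier'] wins on low risk
```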

  11. Emergency Response Fire-Imaging UAS Missions over the Southern California Wildfire Disaster

    NASA Technical Reports Server (NTRS)

    DelFrate, John H.

    2008-01-01

    Objectives include: Demonstrate capabilities of UAS to overfly and collect sensor data on widespread fires throughout Western US. Demonstrate long-endurance mission capabilities (20-hours+). Image multiple fires (greater than 4 fires per mission), to showcase extendable mission configuration and ability to either linger over key fires or station over disparate regional fires. Demonstrate new UAV-compatible, autonomous sensor for improved thermal characterization of fires. Provide automated, on-board, terrain and geo-rectified sensor imagery over OTH satcom links to national fire personnel and Incident commanders. Deliver real-time imagery (within 10-minutes of acquisition). Demonstrate capabilities of OTS technologies (GoogleEarth) to serve and display mission-critical sensor data, coincident with other pertinent data elements to facilitate information processing (WX data, ground asset data, other satellite data, R/T video, flight track info, etc).

  12. Emergency Response Fire-Imaging UAS Missions over the Southern California Wildfire Disaster

    NASA Technical Reports Server (NTRS)

    Cobleigh, Brent R.

    2007-01-01

    Objectives include: Demonstrate capabilities of UAS to overfly and collect sensor data on widespread fires throughout Western US. Demonstrate long-endurance mission capabilities (20-hours+). Image multiple fires (greater than 4 fires per mission), to showcase extendable mission configuration and ability to either linger over key fires or station over disparate regional fires. Demonstrate new UAV-compatible, autonomous sensor for improved thermal characterization of fires. Provide automated, on-board, terrain and geo-rectified sensor imagery over OTH satcom links to national fire personnel and Incident commanders. Deliver real-time imagery (within 10-minutes of acquisition). Demonstrate capabilities of OTS technologies (GoogleEarth) to serve and display mission-critical sensor data, coincident with other pertinent data elements to facilitate information processing (WX data, ground asset data, other satellite data, R/T video, flight track info, etc).

  13. Session on High Speed Civil Transport Design Capability Using MDO and High Performance Computing

    NASA Technical Reports Server (NTRS)

    Rehder, Joe

    2000-01-01

    Since the inception of CAS in 1992, NASA Langley has been conducting research into applying multidisciplinary optimization (MDO) and high performance computing toward reducing aircraft design cycle time. The focus of this research has been the development of a series of computational frameworks and associated applications that increased in capability, complexity, and performance over time. The culmination of this effort is an automated high-fidelity analysis capability for a high speed civil transport (HSCT) vehicle, installed on a network of heterogeneous computers with a computational framework built using Common Object Request Broker Architecture (CORBA) and Java. The main focus of the research in the early years was the development of the Framework for Interdisciplinary Design Optimization (FIDO) and associated HSCT applications. While the FIDO effort was eventually halted, work continued on HSCT applications of ever increasing complexity. The current application, HSCT4.0, employs high fidelity CFD and FEM analysis codes. For each analysis cycle, the vehicle geometry and computational grids are updated using new values for design variables. Processes for aeroelastic trim, loads convergence, displacement transfer, stress and buckling, and performance have been developed. In all, a total of 70 processes are integrated in the analysis framework. Many of the key processes include automatic differentiation capabilities to provide sensitivity information that can be used in optimization. A software engineering process was developed to manage this large project. Defining the interactions among 70 processes turned out to be an enormous, but essential, task. A formal requirements document was prepared that defined data flow among processes and subprocesses. A design document was then developed that translated the requirements into actual software design. A validation program was defined and implemented to ensure that codes integrated into the framework produced the same results as their standalone counterparts. Finally, a Commercial Off the Shelf (COTS) configuration management system was used to organize the software development. A computational environment, CJOpt, based on CORBA and the Java programming language, has been developed as a framework for multidisciplinary analysis and optimization. The environment exploits the parallelisms inherent in the application and distributes the constituent disciplines on machines best suited to their needs. In CJOpt, a discipline code is "wrapped" as an object. An interface to the object identifies the functionality (services) provided by the discipline, defined in Interface Definition Language (IDL) and implemented using Java. The results of using the HSCT4.0 capability are described. A summary of lessons learned is also presented. The use of some of the processes, codes, and techniques by industry is highlighted. The application of the methodology developed in this research to other aircraft is described. Finally, we show how the experience gained is being applied to entirely new vehicles, such as the Reusable Space Transportation System. Additional information is contained in the original.
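
    The "discipline code wrapped as an object" idea is the architectural heart of CJOpt. The real framework defines services in CORBA IDL and implements them in Java; the Python sketch below, with invented disciplines, only illustrates the pattern of uniform interfaces contributing to a shared analysis state.

```python
# Sketch of the "discipline wrapped as an object" pattern behind CJOpt.
# Disciplines, formulas, and values are invented for illustration only.
class Discipline:
    name = "abstract"
    def run(self, design): ...

class Aerodynamics(Discipline):
    name = "aero"
    def run(self, design):
        return {"lift": 0.5 * design["area"] * design["speed"] ** 2}

class Structures(Discipline):
    name = "struct"
    def run(self, design):
        return {"stress": design["load"] / design["area"]}

def analysis_cycle(disciplines, design):
    """One pass of the integrated analysis: each wrapped discipline
    contributes its outputs to a shared state."""
    state = dict(design)
    for d in disciplines:
        state.update(d.run(state))
    return state

print(analysis_cycle([Aerodynamics(), Structures()],
                     {"area": 20.0, "speed": 3.0, "load": 5000.0}))
```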

  14. Multi-kilowatt modularized spacecraft power processing system development

    NASA Technical Reports Server (NTRS)

    Andrews, R. E.; Hayden, J. H.; Hedges, R. T.; Rehmann, D. W.

    1975-01-01

    A review of existing information pertaining to spacecraft power processing systems and equipment was accomplished with a view towards applicability to the modularization of multi-kilowatt power processors. Power requirements for future spacecraft were determined from the NASA mission model-shuttle systems payload data study which provided the limits for modular power equipment capabilities. Three power processing systems were compared to evaluation criteria to select the system best suited for modularity. The shunt regulated direct energy transfer system was selected by this analysis for a conceptual design effort which produced equipment specifications, schematics, envelope drawings, and power module configurations.

  15. Advanced software development workstation. Knowledge base design: Design of knowledge base for flight planning application

    NASA Technical Reports Server (NTRS)

    Izygon, Michel E.

    1992-01-01

    The development process of the knowledge base for the generation of Test Libraries for Mission Operations Computer (MOC) Command Support focused on a series of information gathering interviews. These knowledge capture sessions are supporting the development of a prototype for evaluating the capabilities of INTUIT on such an application. The prototype includes functions related to POCC (Payload Operation Control Center) processing. It prompts the end-users for input through a series of panels and then generates the Meds associated with the initialization and the update of hazardous command tables for a POCC Processing TLIB.

  16. A Theoretical and Experimental Analysis of the Outside World Perception Process

    NASA Technical Reports Server (NTRS)

    Wewerinke, P. H.

    1978-01-01

    The outside scene is often an important source of information for manual control tasks. Important examples of these are car driving and aircraft control. This paper deals with modelling this visual scene perception process on the basis of linear perspective geometry and relative motion cues. Model predictions of a variety of visual approach tasks, utilizing psychophysical threshold data from baseline experiments and the literature, are compared with experimental data. Both the performance and workload results illustrate that the model provides a meaningful description of the outside world perception process, with a useful predictive capability.

  17. Quantum technologies with hybrid systems

    PubMed Central

    Kurizki, Gershon; Bertet, Patrice; Kubo, Yuimaru; Mølmer, Klaus; Petrosyan, David; Rabl, Peter; Schmiedmayer, Jörg

    2015-01-01

    An extensively pursued current direction of research in physics aims at the development of practical technologies that exploit the effects of quantum mechanics. As part of this ongoing effort, devices for quantum information processing, secure communication, and high-precision sensing are being implemented with diverse systems, ranging from photons, atoms, and spins to mesoscopic superconducting and nanomechanical structures. Their physical properties make some of these systems better suited than others for specific tasks; thus, photons are well suited for transmitting quantum information, weakly interacting spins can serve as long-lived quantum memories, and superconducting elements can rapidly process information encoded in their quantum states. A central goal of the envisaged quantum technologies is to develop devices that can simultaneously perform several of these tasks, namely, reliably store, process, and transmit quantum information. Hybrid quantum systems composed of different physical components with complementary functionalities may provide precisely such multitasking capabilities. This article reviews some of the driving theoretical ideas and first experimental realizations of hybrid quantum systems and the opportunities and challenges they present and offers a glance at the near- and long-term perspectives of this fascinating and rapidly expanding field. PMID:25737558

  18. Quantum technologies with hybrid systems.

    PubMed

    Kurizki, Gershon; Bertet, Patrice; Kubo, Yuimaru; Mølmer, Klaus; Petrosyan, David; Rabl, Peter; Schmiedmayer, Jörg

    2015-03-31

    An extensively pursued current direction of research in physics aims at the development of practical technologies that exploit the effects of quantum mechanics. As part of this ongoing effort, devices for quantum information processing, secure communication, and high-precision sensing are being implemented with diverse systems, ranging from photons, atoms, and spins to mesoscopic superconducting and nanomechanical structures. Their physical properties make some of these systems better suited than others for specific tasks; thus, photons are well suited for transmitting quantum information, weakly interacting spins can serve as long-lived quantum memories, and superconducting elements can rapidly process information encoded in their quantum states. A central goal of the envisaged quantum technologies is to develop devices that can simultaneously perform several of these tasks, namely, reliably store, process, and transmit quantum information. Hybrid quantum systems composed of different physical components with complementary functionalities may provide precisely such multitasking capabilities. This article reviews some of the driving theoretical ideas and first experimental realizations of hybrid quantum systems and the opportunities and challenges they present and offers a glance at the near- and long-term perspectives of this fascinating and rapidly expanding field.

  19. Quantum technologies with hybrid systems

    NASA Astrophysics Data System (ADS)

    Kurizki, Gershon; Bertet, Patrice; Kubo, Yuimaru; Mølmer, Klaus; Petrosyan, David; Rabl, Peter; Schmiedmayer, Jörg

    2015-03-01

    An extensively pursued current direction of research in physics aims at the development of practical technologies that exploit the effects of quantum mechanics. As part of this ongoing effort, devices for quantum information processing, secure communication, and high-precision sensing are being implemented with diverse systems, ranging from photons, atoms, and spins to mesoscopic superconducting and nanomechanical structures. Their physical properties make some of these systems better suited than others for specific tasks; thus, photons are well suited for transmitting quantum information, weakly interacting spins can serve as long-lived quantum memories, and superconducting elements can rapidly process information encoded in their quantum states. A central goal of the envisaged quantum technologies is to develop devices that can simultaneously perform several of these tasks, namely, reliably store, process, and transmit quantum information. Hybrid quantum systems composed of different physical components with complementary functionalities may provide precisely such multitasking capabilities. This article reviews some of the driving theoretical ideas and first experimental realizations of hybrid quantum systems and the opportunities and challenges they present and offers a glance at the near- and long-term perspectives of this fascinating and rapidly expanding field.

  20. Top-down modulation of visual and auditory cortical processing in aging.

    PubMed

    Guerreiro, Maria J S; Eck, Judith; Moerel, Michelle; Evers, Elisabeth A T; Van Gerven, Pascal W M

    2015-02-01

    Age-related cognitive decline has been accounted for by an age-related deficit in top-down attentional modulation of sensory cortical processing. In light of recent behavioral findings showing that age-related differences in selective attention are modality dependent, our goal was to investigate the role of sensory modality in age-related differences in top-down modulation of sensory cortical processing. This question was addressed by testing younger and older individuals in several memory tasks while undergoing fMRI. Throughout these tasks, perceptual features were kept constant while attentional instructions were varied, allowing us to devise all combinations of relevant and irrelevant, visual and auditory information. We found no top-down modulation of auditory sensory cortical processing in either age group. In contrast, we found top-down modulation of visual cortical processing in both age groups, and this effect did not differ between age groups. That is, older adults enhanced cortical processing of relevant visual information and suppressed cortical processing of visual distractors during auditory attention to the same extent as younger adults. The present results indicate that older adults are capable of suppressing irrelevant visual information in the context of cross-modal auditory attention, and thereby challenge the view that age-related attentional and cognitive decline is due to a general deficit in the ability to suppress irrelevant information. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. International Virtual Observatory System for Water Resources Information

    NASA Astrophysics Data System (ADS)

    Leinenweber, Lewis; Bermudez, Luis

    2013-04-01

    Sharing, accessing, and integrating hydrologic and climatic data have been identified as a critical need for some time. The current state of data portals, standards, technologies, activities, and expertise can be leveraged to develop an initial operational capability for a virtual observatory system. This system will make it possible to link observations data with stream networks and models, and to resolve semantic inconsistencies among communities. Prototyping a virtual observatory system is an inter-disciplinary, inter-agency and international endeavor. The Open Geospatial Consortium (OGC), within the OGC Interoperability Program, provides the process and expertise to run such a collaborative effort. The OGC serves as a global forum for the collaboration of developers and users of spatial data products and services, and advances the development of international standards for geospatial interoperability. The project coordinated by OGC that is advancing an international virtual observatory system for water resources information is called the Climatology-Hydrology Information Sharing Pilot, Phase 1 (CHISP-1). It includes observations and forecasts in the U.S. and Canada, leveraging current networks and capabilities. It is designed to support the following use cases: 1) Hydrologic modeling for historical and near-future stream flow and groundwater conditions. Requires the integration of trans-boundary stream flow and groundwater well data, as well as national river networks (US NHD and Canada NHN) from multiple agencies. Emphasis will be on time series data and real-time flood monitoring. 2) Modeling and assessment of nutrient load into the lakes. Requires accessing water-quality data from multiple agencies and integrating it with stream flow information for calculating loads. Emphasis on discrete sampled water quality observations, linking those to specific NHD stream reaches and catchments, and additional metadata for sampled data. The key objectives of these use cases are: 1) To link observations data to the stream network, enabling queries of conditions upstream from a given location to return all relevant gages and well locations. This is currently not practical with the data sources available. 2) To bridge differences in semantics across information models and processes used by the various data producers, to improve the hydrologic and water quality modeling capabilities. Other expected benefits to be derived from this project include: - Leverage of a large body of existing data holdings and related activities of multiple agencies in the US and Canada. - Influence on data and metadata standards used internationally for web-based information sharing, through multiple-agency cooperation and the OGC standards setting process. - Reduction of procurement risk through partnership-based development of an initial operating capability versus the cost of building a fully operational system using a traditional "waterfall approach". - Identification and clarification of what is possible, and of the key technical and non-technical barriers to continued progress in sharing and integrating hydrologic and climatic information. - Promotion of understanding and strengthened ties within the hydro-climatic community. This is anticipated to be the first phase of a multi-phase project, with future work on forecasting the hydrologic consequences of extreme weather events, and enabling more sophisticated water quality modeling.
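
    As a toy illustration of the first key objective above, linking observations to the stream network reduces to an upstream traversal of a directed graph. The reach and gage names below are invented, not CHISP-1 data.

```python
# Minimal sketch of use-case objective 1: given a stream network as a
# directed graph (edges point downstream), return all gaging stations
# upstream of a query reach. Network and station names are invented.

from collections import defaultdict

downstream_of = {           # reach -> reach immediately downstream
    "A": "C", "B": "C", "C": "E", "D": "E", "E": "outlet",
}
gages = {"A": ["gage-101"], "D": ["gage-202"], "E": ["gage-303"]}

# Invert the mapping so we can walk upstream.
upstream_of = defaultdict(list)
for reach, dst in downstream_of.items():
    upstream_of[dst].append(reach)

def gages_upstream(reach):
    found, stack = [], list(upstream_of[reach])
    while stack:
        r = stack.pop()
        found.extend(gages.get(r, []))
        stack.extend(upstream_of[r])
    return found

print(gages_upstream("E"))   # ['gage-202', 'gage-101'] (order may vary)
```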

  2. Evolutionary Capability Delivery of Coast Guard Manpower System

    DTIC Science & Technology

    2014-06-01

    Office IID iterative incremental development model IT information technology MA major accomplishment MRA manpower requirements analysis MRD manpower...CG will need to ensure that development is low risk. The CG uses Manpower Requirements Analyses (MRAs) to collect the necessary manpower data to...of users. The CG uses two business processes to manage human capital: Manpower Requirements Analysis (MRA) and Manpower Requirements

  3. Speaker Localisation Using Time Difference of Arrival

    DTIC Science & Technology

    2008-04-01

    School of Electrical and Electronic Engineering of the University of Adelaide. His area of expertise and interest is in Signal Processing including audio ...support of Theatre intelligence capabilities. His recent research interests include: information visualisation, text and data mining, and speech and...by: steering microphone arrays to improve the quality of audio pickup for recording, communication and transcription; enhancing the separation – and

  4. Total Ownership Cost Reduction Case Study: AEGIS Microwave Power Tubes

    DTIC Science & Technology

    2006-05-31

    processes. The center would also maintain crucial capabilities and knowledge required for test and evaluation, Logistics, and for certain...of the research presented herein. Reproduction of all or part of this report is authorized.

  5. National Commission on New Technological Uses of Copyrighted Works, Meeting Number Three (White Plains and New York, N.Y., December 18-19, 1975).

    ERIC Educational Resources Information Center

    Library of Congress, Washington, DC. Copyright Office.

    The first day of the National Commission on New Technological Uses of Copyrighted Works' (CONTU) third meeting was spent at IBM processing headquarters, where commissioners learned about the history and terminology of computers, information storage methods, computer capabilities, computers and copyrights, future trends, and costs. On the second…

  6. The George C. Marshall Space Flight Center High Reynolds Number Wind Tunnel Technical Handbook

    NASA Technical Reports Server (NTRS)

    Gwin, H. S.

    1975-01-01

    The High Reynolds Number Wind Tunnel at the George C. Marshall Space Flight Center is described. The following items are presented to illustrate the operation and capabilities of the facility: facility descriptions and specifications, operational and performance characteristics, model design criteria, instrumentation and data recording equipment, data processing and presentation, and preliminary test information required.

  7. The Emerging Role of the Data Base Manager. Report No. R-1253-PR.

    ERIC Educational Resources Information Center

    Sawtelle, Thomas K.

    The Air Force Logistics Command (AFLC) is revising and enhancing its data-processing capabilities with the development of a large-scale, multi-site, on-line, integrated data base information system known as the Advanced Logistics System (ALS). A data integrity program is to be built around a Data Base Manager (DBM), an individual or a group of…

  8. A Comparison of Product Realization Frameworks

    DTIC Science & Technology

    1993-10-01

    software (integrated FrameMaker). Also included are BOLD for on-line documentation delivery, printer/plotter support, and network licensing support. AMPLE...are built with DSS. Documentation tools include an on-line information system (BOLD), text editing (Notepad), word processing (integrated FrameMaker)...within an application. FrameMaker is fully integrated with the Falcon Framework to provide consistent documentation capabilities within engineering

  9. Integrating Sensor-Collected Intelligence

    DTIC Science & Technology

    2008-11-01

    collecting, processing, data storage and fusion, and the dissemination of information collected by Intelligence, Surveillance, and Reconnaissance (ISR...Grid – Bandwidth Expansion (GIG-BE) program) to provide the capability to transfer data from sensors to accessible storage and satellite and airborne...based ISR is much more fragile. There was a purposeful drawdown of these systems following the Cold War and modernization programs were planned to

  10. Personalised and Self Regulated Learning in the Web 2.0 Era: International Exemplars of Innovative Pedagogy Using Social Software

    ERIC Educational Resources Information Center

    McLoughlin, Catherine; Lee, Mark J. W.

    2010-01-01

    Research findings in recent years provide compelling evidence of the importance of encouraging student control over the learning process as a whole. The socially based tools and technologies of the Web 2.0 movement are capable of supporting informal conversation, reflexive dialogue and collaborative content generation, enabling access to a wide…

  11. Rand Arroyo Center 2014

    DTIC Science & Technology

    2015-01-01

    field effective command and control systems within the framework of current policies and processes. Cost Considerations in Cloud Computing ...www.rand.org/t/PE113 Finds that cloud provider costs can vary compared with traditional information system alternatives because of different cost structures...for analysts evaluating new cloud investments. FOCUS ON Capabilities Development and Acquisition

  12. Damage Precursor Identification via Microstructure-Sensitive Nondestructive Evaluation

    NASA Astrophysics Data System (ADS)

    Wisner, Brian John

    Damage in materials is a complex and stochastic process bridging several time and length scales. This dissertation focuses on investigating the damage process in a particular class of precipitate-hardened aluminum alloys which is widely used in automotive and aerospace applications. Most emphasis in the literature has been given either to their ductility for manufacturing purposes or to fracture for performance considerations. In this dissertation, emphasis is placed on using nondestructive evaluation (NDE) combined with mechanical testing and characterization methods applied at a scale where damage incubation and initiation occur. Specifically, a novel setup built inside a Scanning Electron Microscope (SEM) and retrofitted with characterization and NDE capabilities was developed with the goal of tracking the early stages of the damage process in this type of material. The characterization capabilities include Electron Backscatter Diffraction (EBSD) and Energy Dispersive Spectroscopy (EDS), X-ray micro-computed tomography (μ-CT), and nanoindentation, in addition to microscopy achieved by the Secondary Electron (SE) and Back Scatter Electron (BSE) detectors. The mechanical testing inside the SEM was achieved with the use of an appropriate stage that fitted within its chamber and is capable of applying both axial and bending monotonic and cyclic loads. The NDE capabilities, beyond the microscopy and μ-CT, include the methods of Acoustic Emission and Digital Image Correlation (DIC). This setup was used to identify damage precursors in this material system and their evolution over time and space. The experimental results were analyzed by a custom signal processing scheme that involves both feature-based analyses and a machine learning method to relate recorded microstructural data to damage in this material. Extensions of the presented approach to include information from computational methods, as well as its applicability to other material systems, are discussed.
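
    The dissertation's custom processing scheme is not reproduced here. The following sketch (synthetic waveforms, scikit-learn assumed available) only illustrates the general pattern the abstract names: feature extraction from acoustic emission hits followed by an unsupervised machine learning step.

```python
# Generic sketch of the two ingredients the abstract names: feature-based
# analysis of acoustic emission (AE) waveforms followed by machine
# learning. Synthetic signals stand in for real AE hits.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

def ae_features(waveform, fs=1e6):
    """A few classical AE hit features: amplitude, energy, peak frequency."""
    spectrum = np.abs(np.fft.rfft(waveform))
    freqs = np.fft.rfftfreq(waveform.size, d=1 / fs)
    return [np.max(np.abs(waveform)),          # peak amplitude
            np.sum(waveform ** 2),             # energy
            freqs[np.argmax(spectrum)]]        # peak frequency

# Two synthetic hit populations (e.g., two candidate damage sources).
hits = [np.sin(2 * np.pi * 200e3 * np.arange(256) / 1e6) * rng.uniform(0.5, 1)
        for _ in range(20)]
hits += [np.sin(2 * np.pi * 50e3 * np.arange(256) / 1e6) * rng.uniform(2, 3)
         for _ in range(20)]

X = np.array([ae_features(h) for h in hits])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)   # hits separate into two candidate damage-source clusters
```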

  13. Computer-aided biochemical programming of synthetic microreactors as diagnostic devices.

    PubMed

    Courbet, Alexis; Amar, Patrick; Fages, François; Renard, Eric; Molina, Franck

    2018-04-26

    Biological systems have evolved efficient sensing and decision-making mechanisms to maximize fitness in changing molecular environments. Synthetic biologists have exploited these capabilities to engineer control over information and energy processing in living cells. While engineered organisms pose important technological and ethical challenges, de novo assembly of non-living biomolecular devices could offer promising avenues toward various real-world applications. However, assembling biochemical parts into functional information processing systems has remained challenging due to extensive multidimensional parameter spaces that must be sampled comprehensively in order to identify robust, specification-compliant molecular implementations. We introduce a systematic methodology based on automated computational design and microfluidics enabling the programming of synthetic cell-like microreactors embedding biochemical logic circuits, or protosensors, to perform accurate biosensing and biocomputing operations in vitro according to temporal logic specifications. We show that proof-of-concept protosensors integrating diagnostic algorithms detect specific patterns of biomarkers in human clinical samples. Protosensors may enable novel approaches to medicine and represent a step toward autonomous micromachines capable of precisely interfacing with human physiology or other complex biological environments, ecosystems, or industrial bioprocesses. © 2018 The Authors. Published under the terms of the CC BY 4.0 license.
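
    As a toy, non-authoritative illustration of what a protosensor computes (marker names and thresholds invented, not from the paper), a diagnostic specification reduces to thresholded biomarker readings combined by Boolean logic:

```python
# Toy illustration of the kind of decision a protosensor implements:
# biomarker concentrations are thresholded and combined by Boolean logic.
# Thresholds and marker names are invented, not taken from the paper.

def diagnostic_logic(sample):
    glucose_high = sample["glucose_mM"] > 10.0
    lactate_high = sample["lactate_mM"] > 2.5
    # e.g., flag only the joint pattern (an AND gate over two markers)
    return glucose_high and lactate_high

print(diagnostic_logic({"glucose_mM": 12.1, "lactate_mM": 3.0}))  # True
print(diagnostic_logic({"glucose_mM": 12.1, "lactate_mM": 1.0}))  # False
```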

  14. Establishment of Imaging Spectroscopy of Nuclear Gamma-Rays based on Geometrical Optics

    PubMed Central

    Tanimori, Toru; Mizumura, Yoshitaka; Takada, Atsushi; Miyamoto, Shohei; Takemura, Taito; Kishimoto, Tetsuro; Komura, Shotaro; Kubo, Hidetoshi; Kurosawa, Shunsuke; Matsuoka, Yoshihiro; Miuchi, Kentaro; Mizumoto, Tetsuya; Nakamasu, Yuma; Nakamura, Kiseki; Parker, Joseph D.; Sawano, Tatsuya; Sonoda, Shinya; Tomono, Dai; Yoshikawa, Kei

    2017-01-01

    Since the discovery of nuclear gamma-rays, their imaging has been limited to pseudo imaging, such as the Compton Camera (CC) and the coded mask. Pseudo imaging does not keep physical information (intensity, or brightness in optics) along a ray, and thus is capable of no more than qualitative imaging of bright objects. To attain quantitative imaging, cameras that realize geometrical optics are essential, which would be, for nuclear MeV gammas, only possible via complete reconstruction of the Compton process. Recently we have revealed that the “Electron Tracking Compton Camera” (ETCC) provides a well-defined Point Spread Function (PSF). The information of an incoming gamma is kept along a ray with the PSF, which is equivalent to geometrical optics. Here we present an imaging-spectroscopic measurement with the ETCC. Our results highlight the intrinsic difficulty with CCs in performing accurate imaging, and show that the ETCC surmounts this problem. The imaging capability also helps the ETCC suppress the noise level dramatically by ~3 orders of magnitude without a shielding structure. Furthermore, full reconstruction of the Compton process with the ETCC provides spectra free of Compton edges. These results mark the first proper imaging of nuclear gammas based on genuine geometrical optics. PMID:28155870

  15. Establishment of Imaging Spectroscopy of Nuclear Gamma-Rays based on Geometrical Optics.

    PubMed

    Tanimori, Toru; Mizumura, Yoshitaka; Takada, Atsushi; Miyamoto, Shohei; Takemura, Taito; Kishimoto, Tetsuro; Komura, Shotaro; Kubo, Hidetoshi; Kurosawa, Shunsuke; Matsuoka, Yoshihiro; Miuchi, Kentaro; Mizumoto, Tetsuya; Nakamasu, Yuma; Nakamura, Kiseki; Parker, Joseph D; Sawano, Tatsuya; Sonoda, Shinya; Tomono, Dai; Yoshikawa, Kei

    2017-02-03

    Since the discovery of nuclear gamma-rays, their imaging has been limited to pseudo imaging, such as the Compton Camera (CC) and the coded mask. Pseudo imaging does not keep physical information (intensity, or brightness in optics) along a ray, and thus is capable of no more than qualitative imaging of bright objects. To attain quantitative imaging, cameras that realize geometrical optics are essential, which would be, for nuclear MeV gammas, only possible via complete reconstruction of the Compton process. Recently we have revealed that the "Electron Tracking Compton Camera" (ETCC) provides a well-defined Point Spread Function (PSF). The information of an incoming gamma is kept along a ray with the PSF, which is equivalent to geometrical optics. Here we present an imaging-spectroscopic measurement with the ETCC. Our results highlight the intrinsic difficulty with CCs in performing accurate imaging, and show that the ETCC surmounts this problem. The imaging capability also helps the ETCC suppress the noise level dramatically by ~3 orders of magnitude without a shielding structure. Furthermore, full reconstruction of the Compton process with the ETCC provides spectra free of Compton edges. These results mark the first proper imaging of nuclear gammas based on genuine geometrical optics.

  16. Characterization of selected elementary motion detector cells to image primitives.

    PubMed

    Benson, Leslie A; Barrett, Steven F; Wright, Cameron H G

    2008-01-01

    Developing a visual sensing system, complete with motion processing hardware and software, would have many applications to current technology. It could be mounted on many autonomous vehicles to provide information about the navigational environment, as well as obstacle avoidance features. Incorporating the motion processing capabilities into the sensor requires a new approach to the algorithm implementation. This research, like that of many others, has turned to nature for inspiration. Elementary motion detector (EMD) cells are involved in a biological preprocessing network that provides information to the motion processing lobes of the housefly Musca domestica. This paper describes the response of the photoreceptor inputs to the EMDs. The inputs to the EMD components are tested as they are stimulated with varying image primitives. This is the first of many steps in characterizing the EMD response to image primitives.
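
    EMD cells are commonly modeled as Hassenstein-Reichardt correlators: delay one photoreceptor channel, multiply it with the neighboring channel, and subtract the mirrored subunit. The following numpy sketch implements that textbook model, not the authors' measurements.

```python
# Sketch of the standard Hassenstein-Reichardt model of an elementary
# motion detector: two neighboring photoreceptors, one delayed copy each,
# opponent multiplication. A positive mean output signals one direction.

import numpy as np

def emd_response(left, right, delay=3):
    """left/right: photoreceptor time series from two adjacent points."""
    l_d = np.roll(left, delay)    # delayed left channel
    r_d = np.roll(right, delay)   # delayed right channel
    l_d[:delay] = r_d[:delay] = 0
    return l_d * right - r_d * left   # opponent subunit correlation

t = np.arange(200)
stimulus = lambda phase: np.sin(2 * np.pi * 0.05 * t - phase)
# Moving grating: the right receptor sees the left signal with a lag.
out = emd_response(stimulus(0.0), stimulus(0.5))
print(out.mean())   # > 0 for this direction; the sign flips if motion reverses
```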

  17. Secure VM for Monitoring Industrial Process Controllers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Dipankar; Ali, Mohammad Hassan; Abercrombie, Robert K

    2011-01-01

    In this paper, we examine the biological immune system as an autonomic system for self-protection, which has evolved over millions of years, probably through an extensive process of redesigning, testing, tuning and optimization. The powerful information processing capabilities of the immune system, such as feature extraction, pattern recognition, learning, memory, and its distributive nature, provide rich metaphors for its artificial counterpart. Our study focuses on building an autonomic defense system, using some immunological metaphors for information gathering, analyzing, decision making and launching threat and attack responses. In order to detect Stuxnet-like malware, we propose to add a secure VM (or dedicated host) to the SCADA network to monitor behavior and all software updates. This on-going research effort is not to mimic nature but to explore and learn valuable lessons useful for self-adaptive cyber defense systems.
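
    One classic artificial-immune-system mechanism consistent with the metaphors above is negative selection. The sketch below is a generic textbook illustration with toy binary patterns, not the authors' SCADA monitoring system.

```python
# Sketch of negative selection: enumerate candidate detectors, censor any
# that match normal "self" behavior, then flag observations the surviving
# detectors match. Purely illustrative toy patterns.

from itertools import product

SELF = {(1, 0, 0), (0, 1, 0), (0, 0, 1)}     # normal behavior patterns

def matches(detector, pattern, threshold=3):
    """Match = agreement on at least `threshold` of the positions."""
    return sum(d == p for d, p in zip(detector, pattern)) >= threshold

# Censoring phase: keep only detectors that match no self pattern.
detectors = [d for d in product((0, 1), repeat=3)
             if not any(matches(d, s) for s in SELF)]

# Monitoring phase: anything a detector matches is, by construction, non-self.
for observed in [(1, 0, 0), (1, 1, 1)]:
    flagged = any(matches(d, observed) for d in detectors)
    print(observed, "-> anomalous" if flagged else "-> self")
```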

  18. A 14 × 14 μm2 footprint polarization-encoded quantum controlled-NOT gate based on hybrid waveguide

    PubMed Central

    Wang, S. M.; Cheng, Q. Q.; Gong, Y. X.; Xu, P.; Sun, C.; Li, L.; Li, T.; Zhu, S. N.

    2016-01-01

    Photonic quantum information processing systems have been widely used in communication, metrology and lithography. The recent emphasis on the miniaturized photonic platform is thus motivated by the urgent need for realizing large-scale information processing and computing. Although integrated quantum logic gates and quantum algorithms based on path encoding have been successfully demonstrated, the technology for handling another commonly used encoding, polarization-encoded qubits, has yet to be fully developed. Here, we show the implementation of a polarization-dependent beam-splitter in the hybrid waveguide system. With precise design, the polarization-encoded controlled-NOT gate can be implemented using only a single such polarization-dependent beam-splitter, with a significant reduction of the overall device footprint to 14 × 14 μm2. The experimental demonstration of the highly integrated controlled-NOT gate sets the stage for developing large-scale quantum information processing systems. Our hybrid design also establishes new capabilities in controlling the polarization modes in integrated photonic circuits. PMID:27142992

  19. A 14 × 14 μm(2) footprint polarization-encoded quantum controlled-NOT gate based on hybrid waveguide.

    PubMed

    Wang, S M; Cheng, Q Q; Gong, Y X; Xu, P; Sun, C; Li, L; Li, T; Zhu, S N

    2016-05-04

    Photonic quantum information processing systems have been widely used in communication, metrology and lithography. The recent emphasis on the miniaturized photonic platform is thus motivated by the urgent need for realizing large-scale information processing and computing. Although integrated quantum logic gates and quantum algorithms based on path encoding have been successfully demonstrated, the technology for handling another commonly used encoding, polarization-encoded qubits, has yet to be fully developed. Here, we show the implementation of a polarization-dependent beam-splitter in the hybrid waveguide system. With precise design, the polarization-encoded controlled-NOT gate can be implemented using only a single such polarization-dependent beam-splitter, with a significant reduction of the overall device footprint to 14 × 14 μm(2). The experimental demonstration of the highly integrated controlled-NOT gate sets the stage for developing large-scale quantum information processing systems. Our hybrid design also establishes new capabilities in controlling the polarization modes in integrated photonic circuits.
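
    The gate logic described in the two records above can be checked numerically. The following numpy sketch shows only the abstract CNOT action on polarization-encoded basis states, with |H> = [1,0] and |V> = [0,1]; it does not model the hybrid waveguide optics.

```python
# The logic the device implements, checked numerically: a CNOT flips the
# target qubit iff the control is |V>. This is the abstract gate action
# only, not a model of the photonic implementation.

import numpy as np

H = np.array([1, 0]); V = np.array([0, 1])
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

for name_c, c in [("H", H), ("V", V)]:
    for name_t, t in [("H", H), ("V", V)]:
        out = CNOT @ np.kron(c, t)     # two-qubit state as a 4-vector
        print(f"|{name_c}{name_t}> -> {out}")
# |VH> -> |VV> and |VV> -> |VH>; states with control |H> pass unchanged.
```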

  20. Contingency theoretic methodology for agent-based web-oriented manufacturing systems

    NASA Astrophysics Data System (ADS)

    Durrett, John R.; Burnell, Lisa J.; Priest, John W.

    2000-12-01

    The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, computing, and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change the way they view the software design process: from solving a problem to dynamically creating teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty and organizational-information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the domain of the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software will be discussed.

  1. NEIS (NASA Environmental Information System)

    NASA Technical Reports Server (NTRS)

    Cook, Beth

    1995-01-01

    The NASA Environmental Information System (NEIS) is a tool to support the functions of the NASA Operational Environment Team (NOET). The NEIS is designed to provide a central environmental technology resource drawing on all NASA centers' capabilities, and to support program managers who must ultimately deliver hardware compliant with performance specifications and environmental requirements. The NEIS also tracks environmental regulations, usages of materials and processes, and new technology developments. It has proven to be a useful instrument for channeling information throughout the aerospace community, NASA, other federal agencies, educational institutions, and contractors. The associated paper will discuss the dynamic databases within the NEIS, and the usefulness it provides for environmental compliance efforts.

  2. Laboratory Information Systems.

    PubMed

    Henricks, Walter H

    2015-06-01

    Laboratory information systems (LISs) supply mission-critical capabilities for the vast array of information-processing needs of modern laboratories. LIS architectures include mainframe, client-server, and thin client configurations. The LIS database software manages a laboratory's data. LIS dictionaries are database tables that a laboratory uses to tailor an LIS to the unique needs of that laboratory. Anatomic pathology LIS (APLIS) functions play key roles throughout the pathology workflow, and laboratories rely on LIS management reports to monitor operations. This article describes the structure and functions of APLISs, with emphasis on their roles in laboratory operations and their relevance to pathologists. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Deep Learning for Real-Time Capable Object Detection and Localization on Mobile Platforms

    NASA Astrophysics Data System (ADS)

    Particke, F.; Kolbenschlag, R.; Hiller, M.; Patiño-Studencki, L.; Thielecke, J.

    2017-10-01

    Industry 4.0 is one of the most formative terms in current times. Subjects of research are, in particular, smart and autonomous mobile platforms, which enormously lighten the workload and optimize production processes. In order to interact with humans, the platforms need in-depth knowledge of the environment. Hence, they are required to detect a variety of static and non-static objects. The goal of this paper is to propose an accurate and real-time capable object detection and localization approach for use on mobile platforms. A method is introduced to use the powerful detection capabilities of a neural network for the localization of objects. Therefore, detection information from a neural network is combined with depth information from an RGB-D camera, which is mounted on a mobile platform. As the detection network, YOLO Version 2 (YOLOv2) is used on a mobile robot. In order to find a detected object in the depth image, the bounding boxes predicted by YOLOv2 are mapped to the corresponding regions of the depth image. This provides a powerful and extremely fast approach for establishing a real-time-capable Object Locator. In the evaluation part, the localization approach turns out to be very accurate. Nevertheless, it is dependent on the detected object itself and some additional parameters, which are analysed in this paper.
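
    A minimal sketch of the localization step described above, with synthetic depth data: in practice the box would come from YOLOv2 and the frame from the RGB-D camera, and the median statistic here is one reasonable choice rather than necessarily the authors' exact method.

```python
# Sketch of the localization step: map a detector bounding box onto the
# registered depth image and take a robust depth estimate for the object.

import numpy as np

def locate(depth_img, box):
    """box = (x, y, w, h) in pixel coordinates of the depth frame."""
    x, y, w, h = box
    patch = depth_img[y:y + h, x:x + w].astype(float)
    patch = patch[patch > 0]            # drop invalid (zero) depth pixels
    distance = np.median(patch)         # median resists background pixels
    u, v = x + w / 2, y + h / 2         # box center in the image
    return u, v, distance

depth = np.full((480, 640), 3000)       # background at 3 m (depth in mm)
depth[200:280, 300:380] = 1200          # object at 1.2 m
print(locate(depth, (290, 190, 100, 100)))   # -> (340.0, 240.0, 1200.0)
```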

  4. Shining a light on planetary processes using synchrotron techniques

    NASA Astrophysics Data System (ADS)

    Brand, H. E. A.; Kimpton, J. A.

    2017-12-01

    The Australian Synchrotron is a world-class national research facility that uses accelerator technology to produce X-rays and infrared for research. It is available for researchers from all institutions and disciplines. This contribution is intended to inform the community of the current capabilities at the facility using examples drawn from planetary research across the beamlines. Examples will include: formation of jarosite minerals with a view to Mars; studies of Micrometeorites; and large volume CT imaging of geological samples. A suite of new beamlines has been proposed for the growth of the facility and one of these, ADS, the Advanced Diffraction and Scattering beamline, is intended to be a high energy X-ray diffraction beamline capable of reaching extreme conditions and carrying out challenging in situ experiments. There is an opportunity to develop complex new sample environments which could be of relevance to shock metamorphic processes and this will form part of the discussion.

  5. Real-Time Mapping alert system; characteristics and capabilities

    USGS Publications Warehouse

    Torres, L.A.; Lambert, S.C.; Liebermann, T.D.

    1995-01-01

    The U.S. Geological Survey has an extensive hydrologic network that records and transmits precipitation, stage, discharge, and other water-related data on a real-time basis to an automated data processing system. Data values are recorded on electronic data collection platforms at field sampling sites. These values are transmitted by means of orbiting satellites to receiving ground stations, and by way of telecommunication lines to a U.S. Geological Survey office where they are processed on a computer system. Data that exceed predefined thresholds are identified as alert values. The current alert status at monitoring sites within a state or region is of critical importance during floods, hurricanes, and other extreme hydrologic events. This report describes the characteristics and capabilities of a series of computer programs for real-time mapping of hydrologic data. The software provides interactive graphics display and query of hydrologic information from the network in a real-time, map-based, menu-driven environment.
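
    A minimal sketch of the alert-flagging step the report describes, with invented site identifiers and thresholds:

```python
# Incoming real-time values are compared against predefined per-site
# thresholds; values that exceed them become alert values to highlight
# on the map display. Site IDs and numbers are invented for illustration.

thresholds = {"08158000": 12.0, "08167000": 9.5}     # stage, ft
latest = {"08158000": 13.2, "08167000": 4.1}

alerts = {site: val for site, val in latest.items()
          if val > thresholds.get(site, float("inf"))}
print(alerts)   # {'08158000': 13.2} -> sites to highlight on the map
```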

  6. Surface water-groundwater exchange in transitional coastal environments by airborne electromagnetics: The Venice Lagoon example

    NASA Astrophysics Data System (ADS)

    Viezzoli, A.; Tosi, L.; Teatini, P.; Silvestri, S.

    2010-01-01

    A comprehensive investigation of the mixing between salt/fresh surficial water and groundwater in transitional environments is an issue of paramount importance considering the ecological, cultural, and socio-economic relevance of coastal zones. Acquiring information, which can improve the process understanding, is often logistically challenging, and generally expensive and slow in these areas. Here we investigate the capability of airborne electromagnetics (AEM) at the margin of the Venice Lagoon, Italy. The quasi-3D interpretation of the AEM outcome by the spatially constrained inversion (SCI) methodology allows us to accurately distinguish several hydrogeological features down to a depth of about 200 m. For example, the extent of the saltwater intrusion in coastal aquifers and the transition between the upper salt saturated and the underlying fresher sediments below the lagoon bottom are detected. The research highlights the AEM capability to improve the hydrogeological characterization of subsurface processes in worldwide lagoons, wetlands, deltas.

  7. Design and development of linked data from the National Map

    USGS Publications Warehouse

    Usery, E. Lynn; Varanka, Dalia E.

    2012-01-01

    The development of linked data on the World-Wide Web provides the opportunity for the U.S. Geological Survey (USGS) to supply its extensive volumes of geospatial data, information, and knowledge in a machine-interpretable form and reach users and applications that heretofore have been unavailable. To pilot a process to take advantage of this opportunity, the USGS is developing an ontology for The National Map and converting selected data from nine research test areas to a Semantic Web format to support machine processing and linked data access. In a case study, the USGS has developed initial methods that allow legacy vector- and raster-formatted geometry, attributes, and spatial relationships to be accessed in a linked data environment while maintaining the capability to generate graphic or image output from semantic queries. An initial USGS approach to developing the ontology, linked data, and initial query capability from The National Map databases is presented.

  8. Constraints, Approach, and Status of Mars Surveyor 2001 Landing Site Selection

    NASA Technical Reports Server (NTRS)

    Golombek, M.; Bridges, N.; Briggs, G.; Gilmore, M.; Haldemann, A.; Parker, T.; Saunders, R.; Spencer, D.; Smith, J.; Soderblom, L.

    1999-01-01

    There are many similarities between the Mars Surveyor '01 (MS '01) landing site selection process and that of Mars Pathfinder. The selection process includes two parallel activities in which engineers define and refine the capabilities of the spacecraft through design, testing and modeling and scientists define a set of landing site constraints based on the spacecraft design and landing scenario. As for Pathfinder, the safety of the site is without question the single most important factor, for the simple reason that failure to land safely yields no science and exposes the mission and program to considerable risk. The selection process must be thorough and defensible and capable of surviving multiple withering reviews similar to the Pathfinder decision. On Pathfinder, this was accomplished by attempting to understand the surface properties of sites using available remote sensing data sets and models based on them. Science objectives are factored into the selection process only after the safety of the site is validated. Finally, as for Pathfinder, the selection process is being done in an open environment with multiple opportunities for community involvement including open workshops, with education and outreach opportunities. Additional information is contained in the original extended abstract.

  9. From Process to Product: Your Risk Process at Work

    NASA Technical Reports Server (NTRS)

    Kundrot, Craig E.; Fogarty, Jenifer; Charles, John; Buquo, Lynn; Sibonga, Jean; Alexander, David; Horn, Wayne G.; Edwards, J. Michelle

    2010-01-01

    The Space Life Sciences Directorate (SLSD) and Human Research Program (HRP) at the NASA/Johnson Space Center work together to address and manage the human health and performance risks associated with human space flight. This includes all human system requirements before, during, and after space flight, providing for research, and managing the risk of adverse long-term health outcomes for the crew. We previously described the framework and processes developed for identifying and managing these human system risks. The focus of this panel is to demonstrate how the implementation of the framework and associated processes has provided guidance in the management and communication of human system risks. The risks of early onset osteoporosis, CO2 exposure, and intracranial hypertension in particular have all benefitted from the processes developed for human system risk management. Moreover, we are continuing to develop capabilities, particularly in the area of information architecture, which will also be described. We are working to create a system whereby all risks and associated actions can be tracked and related to one another electronically. Such a system will enhance the management and communication capabilities for the human system risks, thereby increasing the benefit to researchers and flight surgeons.

  10. Biometrics Enabling Capability Increment 1 (BEC Inc 1)

    DTIC Science & Technology

    2016-03-01

    2016 Major Automated Information System Annual Report: Biometrics Enabling Capability Increment 1 (BEC Inc 1), Defense Acquisition Management...Date Assigned: July 15, 2015. Program Name: Biometrics Enabling Capability Increment 1 (BEC Inc 1)...therefore, no Original Estimate has been established. Program Description: The Biometrics Enabling Capability (BEC

  11. Multimodal quantitative phase and fluorescence imaging of cell apoptosis

    NASA Astrophysics Data System (ADS)

    Fu, Xinye; Zuo, Chao; Yan, Hao

    2017-06-01

    Fluorescence microscopy, utilizing fluorescence labeling, has the capability to observe intercellular changes which transmitted and reflected light microscopy techniques cannot resolve. However, the parts without fluorescence labeling are not imaged, so processes that simultaneously happen in these parts cannot be revealed. Meanwhile, fluorescence imaging is 2D imaging, in which depth information is missing; the information in the labeled parts is therefore also incomplete. On the other hand, quantitative phase imaging is capable of imaging cells in 3D in real time through phase calculation. However, its resolution is limited by optical diffraction, and it cannot observe intercellular changes below 200 nanometers. In this work, fluorescence imaging and quantitative phase imaging are combined to build a multimodal imaging system. Such a system has the capability to simultaneously observe detailed intercellular phenomena and 3D cell morphology. In this study the proposed multimodal imaging system is used to observe cell behavior during apoptosis. The aim is to highlight the limitations of fluorescence microscopy and to point out the advantages of multimodal quantitative phase and fluorescence imaging. The proposed multimodal quantitative phase imaging could be further applied in cell-related biomedical research, such as tumor studies.

  12. DREAMS and IMAGE: A Model and Computer Implementation for Concurrent, Life-Cycle Design of Complex Systems

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.

    1995-01-01

    Computing architectures are being assembled that extend concurrent engineering practices by providing more efficient execution and collaboration on distributed, heterogeneous computing networks. Built on the successes of initial architectures, requirements for a next-generation design computing infrastructure can be developed. These requirements concentrate on those needed by a designer in decision-making processes from product conception to recycling and can be categorized in two areas: design process and design information management. A designer both designs and executes design processes throughout design time to achieve better product and process capabilities while expending fewer resources. In order to accomplish this, information, or more appropriately design knowledge, needs to be adequately managed during product and process decomposition as well as recomposition. A foundation has been laid that captures these requirements in a design architecture called DREAMS (Developing Robust Engineering Analysis Models and Specifications). In addition, a computing infrastructure, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment), is being developed that satisfies design requirements defined in DREAMS and incorporates enabling computational technologies.

  13. Harnessing Disordered-Ensemble Quantum Dynamics for Machine Learning

    NASA Astrophysics Data System (ADS)

    Fujii, Keisuke; Nakajima, Kohei

    2017-08-01

    The quantum computer has amazing potential for fast information processing. However, the realization of a digital quantum computer is still a challenging problem requiring highly accurate controls and key application strategies. Here we propose a platform, quantum reservoir computing, to solve these issues successfully by exploiting the natural quantum dynamics of ensemble systems, which are ubiquitous in laboratories nowadays, for machine learning. This framework enables ensemble quantum systems to universally emulate nonlinear dynamical systems, including classical chaos. A number of numerical experiments show that quantum systems consisting of 5-7 qubits possess computational capabilities comparable to conventional recurrent neural networks of 100-500 nodes. This discovery opens up a new paradigm for information processing with artificial intelligence powered by quantum physics.
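
    The readout-training idea behind reservoir computing can be illustrated classically. The sketch below is a standard echo-state network (random fixed reservoir, trained linear readout) on a short-term-memory task; the paper replaces the classical reservoir with quantum ensemble dynamics, which is not modeled here.

```python
# Classical echo-state-network sketch of the reservoir computing scheme:
# a fixed random dynamical system plus a trained linear readout.

import numpy as np

rng = np.random.default_rng(0)
n, T = 100, 1000
W = rng.normal(0, 1, (n, n))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1
w_in = rng.normal(0, 1, n)

u = rng.uniform(-1, 1, T)                          # input signal
target = np.roll(u, 5)                             # task: 5-step memory

x = np.zeros(n)
states = np.empty((T, n))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])               # fixed reservoir update
    states[t] = x

# Train only the linear readout (ridge regression), as in reservoir computing.
ridge = 1e-6
w_out = np.linalg.solve(states.T @ states + ridge * np.eye(n),
                        states.T @ target)
pred = states @ w_out
print(np.corrcoef(pred[50:], target[50:])[0, 1])   # near 1 => memory works
```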

  14. Modelling and Decision Support of Clinical Pathways

    NASA Astrophysics Data System (ADS)

    Gabriel, Roland; Lux, Thomas

    The German health care market is changing rapidly, forcing hospitals in particular to provide high-quality services at low cost. Appropriate measures for more effective and efficient service provision are process orientation and information-technology-based decision support along the clinical pathway of a patient. The essential requirements are adequate modelling of clinical pathways as well as the use of suitable systems that are capable of digitally supporting the complete path of a patient within a hospital, and preferably also outside of it. To fulfil these specifications, the authors present a concept that meets the challenges of well-structured clinical pathways as well as of rather poorly structured diagnostic and therapeutic decisions through the interplay of process-oriented and knowledge-based hospital information systems.

  15. Future Gamma-Ray Observations of Pulsars and their Environments

    NASA Technical Reports Server (NTRS)

    Thompson, David J.

    2006-01-01

    Pulsars and pulsar wind nebulae seen at gamma-ray energies offer insight into particle acceleration to very high energies under extreme conditions. Pulsed emission provides information about the geometry and interaction processes in the magnetospheres of these rotating neutron stars, while the pulsar wind nebulae yield information about high-energy particles interacting with their surroundings. During the next decade, a number of new and expanded gamma-ray facilities will become available for pulsar studies, including Astro-rivelatore Gamma a Immagini LEggero (AGILE) and Gamma-ray Large Area Space Telescope (GLAST) in space and a number of higher-energy ground-based systems. This review describes the capabilities of such observatories to answer some of the open questions about the highest-energy processes involving neutron stars.

  16. Polar Environmental Monitoring

    NASA Technical Reports Server (NTRS)

    Nagler, R. G.; Schulteis, A. C.

    1979-01-01

    The present and projected benefits of the polar regions were reviewed and then translated into information needs in order to support the array of polar activities anticipated. These needs included measurement sensitivities for polar environmental data (ice/snow, atmosphere, and ocean data for integrated support) and the processing and delivery requirements which determine the effectiveness of environmental services. An assessment was made of how well electromagnetic signals can be converted into polar environmental information. The array of sensor developments in process or proposed were also evaluated as to the spectral diversity, aperture sizes, and swathing capabilities available to provide these measurements from spacecraft, aircraft, or in situ platforms. Global coverage and local coverage densification options were studied in terms of alternative spacecraft trajectories and aircraft flight paths.

  17. Using Graphical Processing Units to Accelerate Orthorectification, Atmospheric Correction and Transformations for Big Data

    NASA Astrophysics Data System (ADS)

    O'Connor, A. S.; Justice, B.; Harris, A. T.

    2013-12-01

    Graphics Processing Units (GPUs) are high-performance multiple-core processors capable of very high computational speeds and large data throughput. Modern GPUs are inexpensive and widely available commercially. These are general-purpose parallel processors with support for a variety of programming interfaces, including industry-standard languages such as C. GPU implementations of algorithms that are well suited for parallel processing can often achieve speedups of several orders of magnitude over optimized CPU codes. Significant improvements in speed for imagery orthorectification, atmospheric correction, target detection and image transformations like Independent Component Analysis (ICA) have been achieved using GPU-based implementations. Additional optimizations, when factored in with GPU processing capabilities, can provide a 50x - 100x reduction in the time required to process large imagery. Exelis Visual Information Solutions (VIS) has implemented a CUDA-based GPU processing framework for accelerating ENVI and IDL processes that can best take advantage of parallelization. Testing Exelis VIS has performed shows that orthorectification of a WorldView-1 35,000 x 35,000 pixel image can take as long as two hours; with GPU orthorectification, the same process takes three minutes. By speeding up image processing, imagery can be used successfully by first responders and by scientists making rapid discoveries with near real time data, and it provides an operational component to data centers needing to quickly process and disseminate data.
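
    Orthorectification is GPU-friendly because each output pixel is computed independently, mostly by resampling the source image at mapped coordinates. The numpy sketch below shows that per-pixel bilinear kernel with a toy coordinate mapping; it is not Exelis VIS code.

```python
# Vectorized bilinear resampling, the per-pixel core of orthorectification.
# The coordinate mapping here is a toy half-pixel shift, not a sensor model.

import numpy as np

def bilinear_sample(img, xs, ys):
    """Sample img at float coordinates (xs, ys), arrays of equal shape."""
    x0, y0 = np.floor(xs).astype(int), np.floor(ys).astype(int)
    x0 = np.clip(x0, 0, img.shape[1] - 2)
    y0 = np.clip(y0, 0, img.shape[0] - 2)
    fx, fy = xs - x0, ys - y0
    return ((1 - fy) * (1 - fx) * img[y0, x0] +
            (1 - fy) * fx * img[y0, x0 + 1] +
            fy * (1 - fx) * img[y0 + 1, x0] +
            fy * fx * img[y0 + 1, x0 + 1])

src = np.arange(16.0).reshape(4, 4)
yy, xx = np.mgrid[0:3, 0:3].astype(float)
print(bilinear_sample(src, xx + 0.5, yy + 0.5))  # half-pixel shifted output
# Every output pixel is independent, which is why a GPU version of this
# loop nest can deliver the order-of-magnitude speedups described above.
```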

  18. Platform for Post-Processing Waveform-Based NDE

    NASA Technical Reports Server (NTRS)

    Roth, Don J.

    2010-01-01

    Signal- and image-processing methods are commonly needed to extract information from the waves, improve the resolution of, and highlight defects in an image. Since all waveform-based nondestructive evaluation (NDE) methods share some similarity, a common software platform containing multiple signal- and image-processing techniques to process the waveforms and images makes sense where multiple techniques, scientists, engineers, and organizations are involved. NDE Wave & Image Processor Version 2.0 software provides a single, integrated signal- and image-processing and analysis environment for total NDE data processing and analysis. It brings some of the most useful algorithms developed for NDE over the past 20 years into a commercial-grade product. The software can import signal/spectroscopic data, image data, and image series data. It offers the user hundreds of basic and advanced signal- and image-processing capabilities, including esoteric 1D and 2D wavelet-based de-noising, de-trending, and filtering. Batch processing is included for signal- and image-processing capability so that an optimized sequence of processing operations can be applied to entire folders of signals, spectra, and images. Additionally, an extensive interactive model-based curve-fitting facility has been included to allow fitting of spectroscopy data such as that from Raman spectroscopy. An extensive joint time-frequency module is included for analysis of non-stationary or transient data such as that from acoustic emission, vibration, or earthquake measurements.
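
    The product's own algorithms are not public; as a generic illustration of the kind of 1D wavelet-based de-noising such a platform offers, the sketch below applies a universal soft threshold to detail coefficients, assuming the PyWavelets package is available.

```python
# Generic 1D wavelet de-noising: decompose, soft-threshold the detail
# coefficients, reconstruct. Illustrative only, not the product's method.

import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1024)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.3 * rng.normal(size=t.size)

coeffs = pywt.wavedec(noisy, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate
thresh = sigma * np.sqrt(2 * np.log(noisy.size))        # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                        for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")

# Residual std drops relative to the raw noisy signal.
print(np.std(noisy - clean), np.std(denoised[:clean.size] - clean))
```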

  19. The evolving identity, capacity, and capability of the future surgeon.

    PubMed

    Himidan, Sharifa; Kim, Peter

    2015-06-01

    Technology has transformed surgery more within the last 30 years than the previous 2000 years of human history combined. These innovations have changed not only how the surgeon practices but have also altered the very essence of what it is to be a surgeon in the modern era. Beyond the industrial revolution, today's information revolution allows patients access to an abundance of easily accessible, unfiltered information which they can use to evaluate their surgical treatment, and truly participate in their personal care. We are entering yet another revolution specifically affecting surgeons, where the traditional surgical tools of our craft are becoming "smart." Intelligence in surgical tools and connectivity based on sensory data, processing, and analysis are enabling and enhancing a surgeon's capacity and capability. Given the tempo of change, within one generation the traditional role and identity of a surgeon will be fully transformed. In this article, the impact of the information revolution, technological advances combined with smart connectivity on the changing role of surgery will be considered. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Use of artificial intelligence in analytical systems for the clinical laboratory

    PubMed Central

    Truchaud, Alain; Ozawa, Kyoichi; Pardue, Harry; Schnipelsky, Paul

    1995-01-01

    The incorporation of information-processing technology into analytical systems in the form of standard computing software has recently been advanced by the introduction of artificial intelligence (AI), both as expert systems and as neural networks. This paper considers the role of software in system operation, control and automation, and attempts to define intelligence. AI is characterized by its ability to deal with incomplete and imprecise information and to accumulate knowledge. Expert systems, building on standard computing techniques, depend heavily on the domain experts and knowledge engineers that have programmed them to represent the real world. Neural networks are intended to emulate the pattern-recognition and parallel processing capabilities of the human brain and are taught rather than programmed. The future may lie in a combination of the recognition ability of the neural network and the rationalization capability of the expert system. In the second part of the paper, examples are given of applications of AI in stand-alone systems for knowledge engineering and medical diagnosis and in embedded systems for failure detection, image analysis, user interfacing, natural language processing, robotics and machine learning, as related to clinical laboratories. It is concluded that AI constitutes a collective form of intellectual property, and that there is a need for better documentation, evaluation and regulation of the systems already being used in clinical laboratories. PMID:18924784

  1. Information Quality Evaluation of C2 Systems at Architecture Level

    DTIC Science & Technology

    2014-06-01

    based on architecture models of C2 systems, which can help to identify key factors impacting information quality and improve the system capability at the stage of architecture design of C2 system....capability evaluation of C2 systems at architecture level becomes necessary and important for improving the system capability at the stage of architecture...design. This paper proposes a method for information quality evaluation of C2 system at architecture level. First, the information quality model is

  2. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain.

    PubMed

    Dominkovics, Pau; Granell, Carlos; Pérez-Navarro, Antoni; Casals, Martí; Orcau, Angels; Caylà, Joan A

    2011-11-29

    Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective to identify the most affected areas and their spatial impact. This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios.

  3. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain

    PubMed Central

    2011-01-01

    Background Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Methods Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. Results The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision-making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. Conclusions In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios. PMID:22126392

  4. Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation

    PubMed Central

    Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio

    2017-01-01

    Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies are focusing on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to locally process the tactile data and send structured information either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system which has the role of acquiring the tactile data, processing it, and extracting structured information. On the other hand, processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains: it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents the implementation of digital signal processing based on FPGAs for tactile data processing. It provides the implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. Results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities is targeted. PMID:28287448

  5. Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation.

    PubMed

    Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio

    2017-03-10

    Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies are focusing on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to locally process the tactile data and send structured information either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system which has the role of acquiring the tactile data, processing it, and extracting structured information. On the other hand, processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains: it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents the implementation of digital signal processing based on FPGAs for tactile data processing. It provides the implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. Results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities is targeted.

  6. Apply creative thinking of decision support in electrical nursing record.

    PubMed

    Hao, Angelica Te-Hui; Hsu, Chien-Yeh; Li-Fang, Huang; Jian, Wen-Shan; Wu, Li-Bin; Kao, Ching-Chiu; Lu, Mei-Show; Chang, Her-Kung

    2006-01-01

    The nursing process consists of five interrelated steps: assessment, diagnosis, planning, intervention, and evaluation. In the nursing process, the nurse collects a great deal of data and information. The amount of data and information may exceed the amount the nurse can process efficiently and correctly. Thus, the nurse needs assistance to become proficient in the planning of nursing care, due to the difficulty of simultaneously processing a large set of information. Computer systems are viewed as tools to expand the capabilities of the nurse's mind. Using computer technology to support clinicians' decision making may provide high-quality, patient-centered, and efficient healthcare. Although some existing nursing information systems aid in the nursing process, they only provide the most fundamental decision support--i.e., standard care plans associated with common nursing diagnoses. Such a computerized decision support system helps the nurse develop a care plan step-by-step. But it does not assist the nurse in the decision-making process. The decision process of how to generate nursing diagnoses from data and how to individualize the care plans still remains with the nurse. The purpose of this study is to develop a pilot structure for an electronic nursing record system, integrated with international nursing standards, for improving the proficiency and accuracy of the plan of care in the clinical pathway process. The proposed pilot system assists not only student nurses and nurses who are novices in nursing practice, but also experts who need to work in a practice area with which they are not familiar.

  7. Investigation into Cloud Computing for More Robust Automated Bulk Image Geoprocessing

    NASA Technical Reports Server (NTRS)

    Brown, Richard B.; Smoot, James C.; Underwood, Lauren; Armstrong, C. Duane

    2012-01-01

    Geospatial resource assessments frequently require timely geospatial data processing that involves large multivariate remote sensing data sets. In particular, for disasters, response requires rapid access to large data volumes, substantial storage space and high performance processing capability. The processing and distribution of this data into usable information products requires a processing pipeline that can efficiently manage the required storage, computing utilities, and data handling requirements. In recent years, with the availability of cloud computing technology, cloud processing platforms have made available a powerful new computing infrastructure resource that can meet this need. To assess the utility of this resource, this project investigates cloud computing platforms for bulk, automated geoprocessing capabilities with respect to data handling and application development requirements. This presentation describes work being conducted by the Applied Sciences Program Office at NASA-Stennis Space Center. A prototypical set of image manipulation and transformation processes that incorporate sample Unmanned Airborne System data was developed to create value-added products and tested for implementation on the "cloud". This project outlines the steps involved in creating and testing open-source process code on a local prototype platform, and then transitioning this code, with its associated environment requirements, onto an analogous but memory- and processor-enhanced cloud platform. A data processing cloud was used to store both standard digital camera panchromatic and multi-band image data, which were subsequently subjected to standard image processing functions such as NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), band stacking, reprojection, and other similar data processes. Cloud infrastructure service providers were evaluated by taking these locally tested processing functions and then applying them to a given cloud-enabled infrastructure to assess and compare environment setup options and enabled technologies. This project reviews findings that were observed when cloud platforms were evaluated for bulk geoprocessing capabilities based on data handling and application development requirements.
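
    The two indices named above follow standard band-math definitions: NDVI = (NIR - Red) / (NIR + Red) and NDMI = (NIR - SWIR) / (NIR + SWIR). A minimal NumPy sketch with illustrative reflectance values (not project data) is:

```python
import numpy as np

def ndvi(nir, red):
    # Guard against divide-by-zero on dark pixels.
    return (nir - red) / np.maximum(nir + red, 1e-9)

def ndmi(nir, swir):
    return (nir - swir) / np.maximum(nir + swir, 1e-9)

nir = np.array([[0.60, 0.55], [0.20, 0.30]])
red = np.array([[0.10, 0.12], [0.18, 0.25]])
print(ndvi(nir, red))   # high over vegetation, near zero elsewhere
```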

  8. E-Government Goes Semantic Web: How Administrations Can Transform Their Information Processes

    NASA Astrophysics Data System (ADS)

    Klischewski, Ralf; Ukena, Stefan

    E-government applications and services are built mainly on access to, retrieval of, integration of, and delivery of relevant information to citizens, businesses, and administrative users. In order to perform such information processing automatically through the Semantic Web, machine-readable enhancements of web resources are needed, based on the understanding of the content and context of the information in focus. While these enhancements are far from trivial to produce, administrations in their role of information and service providers so far find little guidance on how to migrate their web resources and enable a new quality of information processing; even research is still seeking best practices. Therefore, the underlying research question of this chapter is: what are the appropriate approaches which guide administrations in transforming their information processes toward the Semantic Web? In search for answers, this chapter analyzes the challenges and possible solutions from the perspective of administrations: (a) the reconstruction of the information processing in the e-government in terms of how semantic technologies must be employed to support information provision and consumption through the Semantic Web; (b) the required contribution to the transformation is compared to the capabilities and expectations of administrations; and (c) available experience with the steps of transformation are reviewed and discussed as to what extent they can be expected to successfully drive the e-government to the Semantic Web. This research builds on studying the case of Schleswig-Holstein, Germany, where semantic technologies have been used within the frame of the Access-eGov project in order to semantically enhance electronic service interfaces with the aim of providing a new way of accessing and combining e-government services.

  9. Automation technology using Geographic Information System (GIS)

    NASA Technical Reports Server (NTRS)

    Brooks, Cynthia L.

    1994-01-01

    The Airport Surface Movement Area is but one focus of the actions taken to increase the capacity and safety of existing airport facilities. The System Integration Branch (SIB) has designed an integrated system consisting of an electronic moving display in the cockpit that shows taxi routes and automatically presents warning information on the position of other traffic to controllers and pilots. Although this system has proven accurate and helpful in test simulations, the initial process of obtaining an airport layout of the taxi routes and designing each of them is very tedious and time-consuming. Other methods of preparing the display maps are being researched. One such method is the use of the Geographical Information System (GIS). GIS is an integrated system of computer hardware and software linking topographical, demographic and other resource data that is being referenced. The software can support many areas of work with virtually unlimited information compatibility due to the system's open architecture. GIS will allow us to work faster with increased efficiency and accuracy while providing decision-making capabilities. GIS is currently being used at the Langley Research Center with other applications and has been validated as an accurate system for that task. GIS usage for our task will involve digitizing aerial photographs of the topography for each taxi-runway and identifying each position according to its specific spatial coordinates. The information currently being used can be integrated with the GIS system, due to its ability to provide a wide variety of user interfaces. Much more research and data analysis will be needed before this technique is used; however, we are hopeful it will lead to better use of manpower and technological capabilities in the future.

  10. A multistage motion vector processing method for motion-compensated frame interpolation.

    PubMed

    Huang, Ai-Mei; Nguyen, Truong Q

    2008-05-01

    In this paper, a novel, low-complexity motion vector processing algorithm at the decoder is proposed for motion-compensated frame interpolation or frame rate up-conversion. We address the problems of having broken edges and deformed structures in an interpolated frame by hierarchically refining motion vectors on different block sizes. Our method explicitly considers the reliability of each received motion vector and has the capability of preserving the structure information. This is achieved by analyzing the distribution of residual energies and effectively merging blocks that have unreliable motion vectors. The motion vector reliability information is also used as prior knowledge in motion vector refinement, using a constrained vector median filter to avoid selecting an identical unreliable vector. We also propose using chrominance information in our method. Experimental results show that the proposed scheme has better visual quality and is also robust, even in video sequences with complex scenes and fast motion.
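
    A hedged sketch of the constrained vector-median idea described above: among the candidate motion vectors flagged as reliable, pick the one that minimises the total Euclidean distance to all vectors in the window. The reliability flags here are invented placeholders; in the paper they derive from residual-energy analysis.

```python
import numpy as np

def constrained_vector_median(vectors, reliable):
    vectors = np.asarray(vectors, dtype=float)   # shape (N, 2)
    candidates = np.flatnonzero(reliable)
    if candidates.size == 0:                     # fall back to a plain VMF
        candidates = np.arange(len(vectors))
    costs = [np.linalg.norm(vectors - vectors[i], axis=1).sum()
             for i in candidates]
    return vectors[candidates[int(np.argmin(costs))]]

window = [(2, 1), (2, 2), (9, -7), (3, 1), (2, 1)]
flags = [True, True, False, True, True]          # (9, -7) deemed unreliable
print(constrained_vector_median(window, flags))  # -> [2. 1.]
```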

  11. Cooperation, Coordination, and Trust in Virtual Teams: Insights from Virtual Games

    NASA Astrophysics Data System (ADS)

    Korsgaard, M. Audrey; Picot, Arnold; Wigand, Rolf T.; Welpe, Isabelle M.; Assmann, Jakob J.

    This chapter considers fundamental concepts of effective virtual teams, illustrated by research on Travian, a massively multiplayer online strategy game wherein players seek to build empires. Team inputs are the resources that enable individuals to work interdependently toward a common goal, including individual and collective capabilities, shared knowledge structures, and leadership style. Team processes, notably coordination and cooperation, transform team inputs to desired collective outcomes. Because the members of virtual teams are geographically dispersed, relying on information and communication technology, three theories are especially relevant for understanding how they can function effectively: social presence theory, media richness theory, and media synchronicity theory. Research in settings like Travian can inform our understanding of structures, processes, and performance of virtual teams. Such research could provide valuable insight into the emergence and persistence of trust and cooperation, as well as the impact of different communication media for coordination and information management in virtual organizations.

  12. Interoperability Matter: Levels of Data Sharing, Starting from a 3D Information Modelling

    NASA Astrophysics Data System (ADS)

    Tommasi, C.; Achille, C.

    2017-02-01

    Nowadays, adopting BIM processes in the AEC (Architecture, Engineering and Construction) industry means being oriented towards synergistic workflows, based on informative instruments capable of realizing the virtual model of the building. The aim of this article is to address the interoperability issue, approaching the subject through a theoretical part and a practical example, in order to show how these notions apply in real situations. In particular, the case study analysed belongs to the Cultural Heritage field, where it is possible to find some difficulties - both in the modelling and sharing phases - due to the complexity of shapes and elements. Focusing on the interoperability between different software, the questions are: What and how many kinds of information can I share? Given that this process also leads to a standardization of the modelled parts, is there a possibility of accuracy loss?

  13. Streamlining environmental product declarations: a stage model

    NASA Astrophysics Data System (ADS)

    Lefebvre, Elisabeth; Lefebvre, Louis A.; Talbot, Stephane; Le Hen, Gael

    2001-02-01

    General public environmental awareness and education are increasing, stimulating the demand for reliable, objective and comparable information about products' environmental performances. The recently published standard series ISO 14040 and ISO 14025 are normalizing the preparation of Environmental Product Declarations (EPDs) containing comprehensive information relevant to a product's environmental impact during its life cycle. So far, only a few environmentally leading manufacturing organizations (mostly from Europe) have experimented with the preparation of EPDs, demonstrating their great potential as a marketing weapon. However, the preparation of EPDs is a complex process, requiring collection and analysis of massive amounts of information coming from disparate sources (suppliers, sub-contractors, etc.). In the foreseeable future, streamlining the EPD preparation process will require product manufacturers to adapt their information systems (ERP, MES, SCADA) in order to make them capable of gathering and transmitting the appropriate environmental information. It also requires strong functional integration all along the product supply chain in order to ensure that all the information is made available in a standardized and timely manner. The goal of the present paper is twofold: first, to propose a transitional model towards green supply chain management and EPD preparation; second, to identify key technologies and methodologies for streamlining the EPD process and, subsequently, the transition toward sustainable product development.

  14. A trillion frames per second: the techniques and applications of light-in-flight photography.

    PubMed

    Faccio, Daniele; Velten, Andreas

    2018-06-14

    Cameras capable of capturing videos at a trillion frames per second make it possible to freeze light in motion, a very counterintuitive capability given our everyday experience, in which light appears to travel instantaneously. By combining this capability with computational imaging techniques, new imaging opportunities emerge, such as three-dimensional imaging of scenes hidden behind a corner, the study of relativistic distortion effects, imaging through diffusive media, and imaging of ultrafast optical processes such as laser ablation, supercontinuum and plasma generation. We provide an overview of the main techniques that have been developed for ultra-high speed photography, with a particular focus on 'light in flight' imaging, i.e., applications where the key element is the imaging of light itself at frame rates that make it possible to freeze its motion and therefore extract information that would otherwise be blurred out and lost.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lingerfelt, Eric J; Endeve, Eirik; Hui, Yawei

    Improvements in scientific instrumentation allow imaging at mesoscopic to atomic length scales, many spectroscopic modes, and now--with the rise of multimodal acquisition systems and the associated processing capability--the era of multidimensional, informationally dense data sets has arrived. Technical issues in these combinatorial scientific fields are exacerbated by computational challenges best summarized as a necessity for drastic improvement in the capability to transfer, store, and analyze large volumes of data. The Bellerophon Environment for Analysis of Materials (BEAM) platform provides material scientists the capability to directly leverage the integrated computational and analytical power of High Performance Computing (HPC) to perform scalable data analysis and simulation and to manage uploaded data files via an intuitive, cross-platform client user interface. This framework delivers authenticated, "push-button" execution of complex user workflows that deploy data analysis algorithms and computational simulations utilizing compute-and-data cloud infrastructures and HPC environments like Titan at the Oak Ridge Leadership Computing Facility (OLCF).

  16. 7th Annual CMMI Technology Conference and User Group - Investigation, Measures and Lessons Learned about the Relationship between CMMI Process Capability and Project or Program Performance. Volume 2. Wednesday Presentations

    DTIC Science & Technology

    2007-11-15

    Intelligence and Information Systems (IIS) Enterprise CMMI® ML3 SCAMPI(SM) SE/SW/IPPD/SS #5382. Raymond L. Kile, SEI Authorized Lead Appraiser; Kathryn Kirby, Raytheon IIS Process Assessments IPT Lead. "Picking a Representative Sample For CMMI® Enterprise Appraisals". Introductions: Ray Kile has thirty ... University of Missouri. Raymond L. Kile, Chief Engineer, Center for Systems Management, 1951 Kidwell Drive, Suite 750, Vienna, VA 22182, 303-601-8978, rkile@csm.com

  17. Periodic, On-Demand, and User-Specified Information Reconciliation

    NASA Technical Reports Server (NTRS)

    Kolano, Paul

    2007-01-01

    Automated sequence generation (autogen) signifies both a process and software used to automatically generate sequences of commands to operate various spacecraft. Autogen requires fewer workers than are needed for older manual sequence-generation processes and reduces sequence-generation times from weeks to minutes. The autogen software comprises the autogen script plus the Activity Plan Generator (APGEN) program. APGEN can be used for planning missions and command sequences. APGEN includes a graphical user interface that facilitates scheduling of activities on a time line and affords a capability to automatically expand, decompose, and schedule activities.

  18. High Resolution Imaging of the Sun with CORONAS-1

    NASA Technical Reports Server (NTRS)

    Karovska, Margarita

    1998-01-01

    We applied several image restoration and enhancement techniques to CORONAS-I images. We carried out the characterization of the Point Spread Function (PSF) using the unique capability of the Blind Iterative Deconvolution (BID) technique, which recovers the real PSF at a given location and time of observation when only limited a priori information is available on its characteristics. We also applied image enhancement techniques to extract the small-scale structure embedded in bright large-scale structures on the disk and on the limb. The results demonstrate the capability of image post-processing to substantially increase the yield from space observations by improving resolution and reducing noise in the images.

  19. Anomaly Detection in Power Quality at Data Centers

    NASA Technical Reports Server (NTRS)

    Grichine, Art; Solano, Wanda M.

    2015-01-01

    The goal during my internship at the National Center for Critical Information Processing and Storage (NCCIPS) is to implement an anomaly detection method through the StruxureWare SCADA Power Monitoring system. The benefit of the anomaly detection mechanism is to provide the capability to detect and anticipate equipment degradation by monitoring power quality prior to equipment failure. First, a study is conducted that examines the existing techniques of power quality management. Based on these findings, and the capabilities of the existing SCADA resources, recommendations are presented for implementing effective anomaly detection. Since voltage, current, and total harmonic distortion demonstrate Gaussian distributions, effective set-points are computed using this model, while maintaining a low false positive count.
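
    A minimal sketch of the Gaussian set-point idea, under the stated distributional assumption: learn the mean and standard deviation from historical readings, then flag samples outside mean +/- 3*sigma. The 3-sigma band and the simulated voltages are illustrative choices, not NCCIPS values.

```python
import numpy as np

rng = np.random.default_rng(0)
voltage = rng.normal(480.0, 2.0, 10_000)       # simulated healthy readings

mu, sigma = voltage.mean(), voltage.std()
lo, hi = mu - 3 * sigma, mu + 3 * sigma        # alarm set-points

def is_anomaly(sample):
    return sample < lo or sample > hi

print(round(lo, 1), round(hi, 1))              # roughly 474.0 486.0
print(is_anomaly(481.0), is_anomaly(492.5))    # False True
```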

  20. ISHM Implementation for Constellation Systems

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Holland, Randy; Schmalzel, John; Duncavage, Dan; Crocker, Alan; Alena, Rick

    2006-01-01

    Integrated System Health Management (ISHM) is a capability that focuses on determining the condition (health) of every element in a complex system (detecting anomalies, diagnosing causes, and predicting future anomalies) and on providing data, information, and knowledge (DIaK), not just data, to control systems for safe and effective operation. This capability is currently provided by large teams of people, primarily on the ground, but needs to be embedded in on-board systems to a higher degree to enable NASA's new Exploration Mission (long-term travel and stays in space), while increasing safety and decreasing life-cycle costs of systems (vehicles; platforms; bases or outposts; and ground test, launch, and processing operations). This viewgraph presentation reviews the use of ISHM for the Constellation system.

  1. Literature Review and Assessment of Nanotechnology for Sensing of Timber Transportation Structures Final Report

    Treesearch

    Terry Wipf; Brent M. Phares; Micheal Ritter

    2012-01-01

    Recently, efforts have been directed toward the development of civil structures with embedded sensors and on-board data processing capabilities, typically termed "smart structures." The fusion of these smart technologies into infrastructures is intended to give bridge owners/managers better and more timely information on how structures are behaving and when they need...

  2. Open Innovation and Technology Maturity Analysis

    DTIC Science & Technology

    2007-09-11

    Management Process: Develop a framework which incorporates the DoD Acquisition Management framework (e.g., TRLs) and DoD Business Transformation strategies ... Public Organizations (DoD): DoD Force Transformation: • Support the Joint Warfighting Capability of the DoD • Enable Rapid Access to Information for ... Survey - 2007. Defense Transformation: Clear Leadership, Accountability, and Management Tools Are Needed to Enhance DOD's Efforts to Transform Military ...

  3. An automated system for terrain database construction

    NASA Technical Reports Server (NTRS)

    Johnson, L. F.; Fretz, R. K.; Logan, T. L.; Bryant, N. A.

    1987-01-01

    An automated Terrain Database Preparation System (TDPS) for the construction and editing of terrain databases used in computerized wargaming simulation exercises has been developed. The TDPS system operates under the TAE executive, and it integrates VICAR/IBIS image processing and Geographic Information System software with CAD/CAM data capture and editing capabilities. The terrain database includes such features as roads, rivers, vegetation, and terrain roughness.

  4. 15 CFR Supplement No. 2 to Part 742 - Anti-Terrorism Controls: North Korea, Syria and Sudan Contract Sanctity Dates and Related Policies

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) Robots capable of employing feedback information in real time processing to generate or modify programs...-uses in Syria will be considered on a case-by-case basis. (A) Contract sanctity date for such robots... Supplement. (B) Contract sanctity date for all other such robots: August 28, 1991. (iii) Sudan. Applications...

  5. 15 CFR Supplement No. 2 to Part 742 - Anti-Terrorism Controls: North Korea, Syria and Sudan Contract Sanctity Dates and Related Policies

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) Robots capable of employing feedback information in real time processing to generate or modify programs...-uses in Syria will be considered on a case-by-case basis. (A) Contract sanctity date for such robots... Supplement. (B) Contract sanctity date for all other such robots: August 28, 1991. (iii) Sudan. Applications...

  6. 15 CFR Supplement No. 2 to Part 742 - Anti-Terrorism Controls: North Korea, Syria and Sudan Contract Sanctity Dates and Related Policies

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) Robots capable of employing feedback information in real time processing to generate or modify programs...-uses in Syria will be considered on a case-by-case basis. (A) Contract sanctity date for such robots... Supplement. (B) Contract sanctity date for all other such robots: August 28, 1991. (iii) Sudan. Applications...

  7. 15 CFR Supplement No. 2 to Part 742 - Anti-Terrorism Controls: North Korea, Syria and Sudan Contract Sanctity Dates and Related Policies

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...) Robots capable of employing feedback information in real time processing to generate or modify programs...-uses in Syria will be considered on a case-by-case basis. (A) Contract sanctity date for such robots... Supplement. (B) Contract sanctity date for all other such robots: August 28, 1991. (iii) Sudan. Applications...

  8. Assessment of Process Capability: the case of Soft Drinks Processing Unit

    NASA Astrophysics Data System (ADS)

    Sri Yogi, Kottala

    2018-03-01

    Process capability studies have a significant impact in investigating process variation, which is important in achieving product quality characteristics. Process capability indices measure the inherent variability of a process and thus help to improve process performance radically. The main objective of this paper is to understand whether the output of the soft drinks processing unit, one of the premier brands being marketed in India, stays within specification. A few selected critical parameters in soft drinks processing - concentration of gas volume, concentration of Brix, and torque of crock - were considered for this study. Relevant statistical parameters were assessed from a process capability indices perspective: short-term capability and long-term capability. For the assessment, we used real-time data from a soft drinks bottling company located in the state of Chhattisgarh, India. The analysis suggests reasons for the variations in the process, which were validated using ANOVA; a Taguchi cost function was also fitted, and the predicted waste was expressed in monetary terms so that the organization can use it to improve its process parameters. This research work has substantially benefitted the organization in understanding the variations of the selected critical parameters for achieving zero rejection.
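
    Using the standard short-term definitions Cp = (USL - LSL) / (6*sigma) and Cpk = min(USL - mu, mu - LSL) / (3*sigma), a worked sketch of the capability computation follows. The specification limits and Brix measurements are illustrative placeholders, not the bottling plant's data.

```python
import statistics

def cp_cpk(samples, lsl, usl):
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)         # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

brix = [10.9, 11.0, 11.1, 10.8, 11.2, 11.0, 10.9, 11.1]
print(cp_cpk(brix, lsl=10.5, usl=11.5))       # both around 1.3: capable
```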

  9. EEG-based time and spatial interpretation of activation areas for relaxation and words writing between poor and capable dyslexic children.

    PubMed

    Mohamad, N B; Lee, Khuan Y; Mansor, W; Mahmoodin, Z; Fadzal, C W N F C W; Amirin, S

    2015-01-01

    Symptoms of dyslexia, such as difficulties with accurate and/or fluent word recognition and/or poor spelling and decoding abilities, are easily misinterpreted as laziness and defiance amongst school children. Indeed, 37.9% of 699 school dropouts and failures are diagnosed as dyslexic. Currently, screening for dyslexia relies heavily on therapists, who are few in number and whose assessments are subjective; objective methods are still unavailable. EEG has long been a popular method to study cognitive processes in humans, such as language processing and motor activity. However, its interpretation has been limited to the time and frequency domains, without visual information, which would also be useful. Here, our research presents an EEG-based time and spatial interpretation of activated brain areas for the poor and capable dyslexic during the states of relaxation and word writing, being the first attempt ever reported. From the 2D distribution of the EEG spectra at the activation areas and its progress with time, it is observed that capable dyslexics are able to relax compared to poor dyslexics. During word writing, neural activities are found to be higher on the right hemisphere than on the left hemisphere of the capable dyslexics, which suggests a neurobiological compensation pathway in the right hemisphere during reading and writing that is not observed in the poor dyslexics.

  10. NOAA Climate Program Office Contributions to National ESPC

    NASA Astrophysics Data System (ADS)

    Higgins, W.; Huang, J.; Mariotti, A.; Archambault, H. M.; Barrie, D.; Lucas, S. E.; Mathis, J. T.; Legler, D. M.; Pulwarty, R. S.; Nierenberg, C.; Jones, H.; Cortinas, J. V., Jr.; Carman, J.

    2016-12-01

    NOAA is one of five federal agencies (DOD, DOE, NASA, NOAA, and NSF) that signed an updated charter in 2016 to partner on the National Earth System Prediction Capability (ESPC). Situated within NOAA's Office of Oceanic and Atmospheric Research (OAR), NOAA Climate Program Office (CPO) programs contribute significantly to the National ESPC goals and activities. This presentation will provide an overview of CPO contributions to National ESPC. First, we will discuss selected CPO research and transition activities that directly benefit the ESPC coupled model prediction capability, including The North American Multi-Model Ensemble (NMME) seasonal prediction system The Subseasonal Experiment (SubX) project to test real-time subseasonal ensemble prediction systems. Improvements to the NOAA operational Climate Forecast System (CFS), including software infrastructure and data assimilation. Next, we will show how CPO's foundational research activities are advancing future ESPC capabilities. Highlights will include: The Tropical Pacific Observing System (TPOS) to provide the basis for predicting climate on subseasonal to decadal timescales. Subseasonal-to-Seasonal (S2S) processes and predictability studies to improve understanding, modeling and prediction of the MJO. An Arctic Research Program to address urgent needs for advancing monitoring and prediction capabilities in this major area of concern. Advances towards building an experimental multi-decadal prediction system through studies on the Atlantic Meridional Overturning Circulation (AMOC). Finally, CPO has embraced Integrated Information Systems (IISs) that build on the innovation of programs such as the National Integrated Drought Information System (NIDIS) to develop and deliver end-to-end environmental information for key societal challenges (e.g., extreme heat and coastal flooding). These contributions will help the National ESPC better understand and address societal needs and decision support requirements.

  11. Automated sleep stage detection with a classical and a neural learning algorithm--methodological aspects.

    PubMed

    Schwaibold, M; Schöchlin, J; Bolz, A

    2002-01-01

    For classification tasks in biosignal processing, several strategies and algorithms can be used. Knowledge-based systems allow prior knowledge about the decision process to be integrated, both by the developer and by self-learning capabilities. For the classification stages in a sleep stage detection framework, three inference strategies were compared regarding their specific strengths: a classical signal processing approach, artificial neural networks and neuro-fuzzy systems. Methodological aspects were assessed to attain optimum performance and maximum transparency for the user. Due to their effective and robust learning behavior, artificial neural networks could be recommended for pattern recognition, while neuro-fuzzy systems performed best for the processing of contextual information.
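
    As a generic illustration of the neural learning route (not the authors' system), the sketch below trains a small multilayer perceptron to map per-epoch EEG features to sleep stages; the feature values, labels, and network size are synthetic placeholders.

```python
from sklearn.neural_network import MLPClassifier

# Features per 30 s epoch: [delta power, alpha power, EMG level]
X = [[0.9, 0.1, 0.1], [0.8, 0.2, 0.1],   # deep-sleep-like epochs
     [0.2, 0.7, 0.2], [0.3, 0.6, 0.3],   # relaxed-wake-like epochs
     [0.1, 0.2, 0.9], [0.2, 0.1, 0.8]]   # active-wake-like epochs
y = ["N3", "N3", "W-relaxed", "W-relaxed", "W-active", "W-active"]

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.predict([[0.85, 0.15, 0.1]]))  # expected: ['N3']
```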

  12. Spectral OCT with speckle contrast reduction for evaluation of the healing process after PRK and transepithelial PRK

    PubMed Central

    Kaluzny, Bartlomiej J.; Szkulmowski, Maciej; Bukowska, Danuta M.; Wojtkowski, Maciej

    2014-01-01

    We evaluate Spectral OCT (SOCT) with a speckle contrast reduction technique using a resonant scanner for the assessment of corneal surface changes after excimer laser photorefractive keratectomy (PRK), and we compare the healing process between conventional PRK and transepithelial PRK (TransPRK). The measurements were performed before and after the surgery. The obtained results show that SOCT with resonant-scanner speckle contrast reduction is capable of providing information regarding the healing process after PRK. The main difference between the healing processes of PRK and TransPRK, as assessed by SOCT, was the time to cover the stroma with epithelium, which was shorter in the TransPRK group. PMID:24761291

  13. Community of Practice: A Path to Strategic Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nancy M. Carlson

    2003-04-01

    To explore the concept of community of practice, the research initially concentrates on a strategic business process in a research and applied engineering laboratory, discovering essential communication tools and processes needed to cultivate a high-functioning cross-disciplinary team engaged in proposal preparation. Qualitative research in the human ecology of the proposal process blends topic-oriented ethnography and grounded theory and includes an innovative addition to qualitative interviewing, called meta-inquiry. Meta-inquiry uses an initial interview protocol with a homogeneous pool of informants to enhance the researcher's sensitivity to the unique cultures involved in the proposal process before developing a formal interview protocol. In this study the preanalysis process uses data from editors, graphic artists, text processors, and production coordinators to assess, modify, enhance, and focus the formal interview protocol with scientists, engineers, and technical managers, the heterogeneous informants. Thus this human ecology-based interview protocol values homogeneous and heterogeneous informant data and acquires data from which concepts, categories, properties, and both substantive and formal theory emerge. The research discovers the five essential processes of owning, visioning, reviewing, producing, and contributing for strategic learning to occur in a proposal community of practice. The apprenticeship, developmental, and nurturing perspectives of adult learning provide the proposal community of practice with cohesion, interdependence, and caring, while core and boundary practices provide insight into the tacit and explicit dimensions of the proposal process. By making these dimensions explicit, the necessary competencies, absorptive capacity, and capabilities needed for strategic learning are discovered. Substantive theory emerges and provides insight into the ability of the proposal community of practice to evolve, flourish, and adapt to the strategic advantage of the laboratory. The substantive theory explores the dimensions of owning, visioning, reviewing, producing, and contributing and their interrelationship to community learning dynamics. Through dialogue, creative tension, and imagination, the proposal community of practice focuses on actionable goals linked by proactively participating in practice, creating possibilities, evaluating and enhancing potential, producing a valued product, and confirming strategic value. Lastly, a formal theory emerges linking competency-capacity-capability, cohesion, interdependence, and caring as essential attributes of strategic learning communities.

  14. The role of 3-D interactive visualization in blind surveys of H I in galaxies

    NASA Astrophysics Data System (ADS)

    Punzo, D.; van der Hulst, J. M.; Roerdink, J. B. T. M.; Oosterloo, T. A.; Ramatsoku, M.; Verheijen, M. A. W.

    2015-09-01

    Upcoming H I surveys will deliver large datasets, and automated processing using the full 3-D information (two positional dimensions and one spectral dimension) to find and characterize H I objects is imperative. In this context, visualization is an essential tool for enabling qualitative and quantitative human control of an automated source finding and analysis pipeline. We discuss how Visual Analytics, the combination of automated data processing and human reasoning, creativity and intuition, supported by interactive visualization, enables flexible and fast interaction with the 3-D data, helping the astronomer to deal with the analysis of complex sources. 3-D visualization, coupled to modeling, provides additional capabilities helping the discovery and analysis of subtle structures in the 3-D domain. The requirements for a fully interactive visualization tool are: coupled 1-D/2-D/3-D visualization and quantitative and comparative capabilities, combined with supervised semi-automated analysis. Moreover, the source code must have the following characteristics for enabling collaborative work: open, modular, well documented, and well maintained. We review four state-of-the-art 3-D visualization packages, assessing their capabilities and feasibility for use in the case of 3-D astronomical data.

  15. Towards a systems approach to risk considerations for concurrent design

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Oberto, Robert E.

    2004-01-01

    This paper describes the new process used by the Project Design Center at NASA's Jet Propulsion Laboratory for the identification, assessment and communication of risk elements throughout the lifecycle of a mission design. This process includes a software tool, 'RAP', that collects and communicates risk information between the various designers and a 'risk expert' who mediates this process. The establishment of this process is an attempt towards the systematic consideration of risk in the design decision-making process. Using this process, we are able to better keep track of the risks associated with design decisions. Furthermore, it helps us develop better risk profiles for the studies under consideration. We aim to refine and expand the current process to enable more thorough risk analysis capabilities in the future.

  16. On-line soft sensing in upstream bioprocessing.

    PubMed

    Randek, Judit; Mandenius, Carl-Fredrik

    2018-02-01

    This review provides an overview and a critical discussion of novel possibilities for applying soft sensors to on-line monitoring and control of industrial bioprocesses. The focus is on bio-product formation in the upstream process, but integration with other parts of the process is also addressed. The term soft sensor is used for the combination of analytical hardware data (from sensors, analytical devices, instruments and actuators) with mathematical models that create new real-time information about the process. In particular, the review assesses these possibilities from an industrial perspective, including sensor performance, information value and production economy. The capabilities of existing analytical on-line techniques are scrutinized in view of their usefulness in soft sensor setups and in relation to typical needs in bioprocessing in general. The review concludes with specific recommendations for further development of soft sensors for the monitoring and control of upstream bioprocessing.
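
    A hedged sketch of the soft-sensor pattern the review describes: combine an on-line hardware measurement (here, oxygen uptake rate) with a simple mathematical model to estimate a quantity that has no direct sensor (biomass concentration). The yield coefficient and the OUR trace are invented, strain-specific values.

```python
Y_XO = 1.2   # g biomass per g O2 consumed (assumed yield coefficient)

def biomass_estimate(x0, our_trace, dt_h=0.5):
    """Integrate dX/dt = Y_XO * OUR(t) from initial biomass x0 [g/L]."""
    x, estimates = x0, []
    for our in our_trace:          # OUR in g O2 / (L * h)
        x += Y_XO * our * dt_h     # forward-Euler update
        estimates.append(round(x, 3))
    return estimates

our_readings = [0.05, 0.08, 0.12, 0.18, 0.25]   # rising oxygen uptake
print(biomass_estimate(0.5, our_readings))
```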

  17. Large-Scale Fluorescence Calcium-Imaging Methods for Studies of Long-Term Memory in Behaving Mammals

    PubMed Central

    Jercog, Pablo; Rogerson, Thomas; Schnitzer, Mark J.

    2016-01-01

    During long-term memory formation, cellular and molecular processes reshape how individual neurons respond to specific patterns of synaptic input. It remains poorly understood how such changes impact information processing across networks of mammalian neurons. To observe how networks encode, store, and retrieve information, neuroscientists must track the dynamics of large ensembles of individual cells in behaving animals, over timescales commensurate with long-term memory. Fluorescence Ca2+-imaging techniques can monitor hundreds of neurons in behaving mice, opening exciting avenues for studies of learning and memory at the network level. Genetically encoded Ca2+ indicators allow neurons to be targeted by genetic type or connectivity. Chronic animal preparations permit repeated imaging of neural Ca2+ dynamics over multiple weeks. Together, these capabilities should enable unprecedented analyses of how ensemble neural codes evolve throughout memory processing and provide new insights into how memories are organized in the brain. PMID:27048190

  18. Neuromorphic photonic networks using silicon photonic weight banks.

    PubMed

    Tait, Alexander N; de Lima, Thomas Ferreira; Zhou, Ellen; Wu, Allie X; Nahmias, Mitchell A; Shastri, Bhavin J; Prucnal, Paul R

    2017-08-07

    Photonic systems for high-performance information processing have attracted renewed interest. Neuromorphic silicon photonics has the potential to integrate processing functions that vastly exceed the capabilities of electronics. We report the first observations of a recurrent silicon photonic neural network, in which connections are configured by microring weight banks. A mathematical isomorphism between the silicon photonic circuit and a continuous neural network model is demonstrated through dynamical bifurcation analysis. Exploiting this isomorphism, a simulated 24-node silicon photonic neural network is programmed using a "neural compiler" to solve a differential system emulation task. A 294-fold acceleration against a conventional benchmark is predicted. We also propose and derive a power consumption analysis for modulator-class neurons that, as opposed to laser-class neurons, are compatible with silicon photonic platforms. At increased scale, neuromorphic silicon photonics could access new regimes of ultrafast information processing for radio, control, and scientific computing.
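
    A minimal sketch of the continuous neural network model class to which the photonic circuit is mapped, dx/dt = -x + W*tanh(x) + u, integrated with forward Euler. The weights and input are illustrative and bear no relation to the microring weight-bank values in the paper.

```python
import numpy as np

def simulate_ctrnn(W, u, x0, dt=0.01, steps=1000):
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-x + W @ np.tanh(x) + u)   # Euler step of the ODE
    return x

W = np.array([[0.0, -1.2], [1.2, 0.0]])   # cross-coupled pair of "neurons"
print(simulate_ctrnn(W, u=np.array([0.1, 0.0]), x0=[0.0, 0.0]))
```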

  19. A framework for the computer-aided planning and optimisation of manufacturing processes for components with functional graded properties

    NASA Astrophysics Data System (ADS)

    Biermann, D.; Gausemeier, J.; Heim, H.-P.; Hess, S.; Petersen, M.; Ries, A.; Wagner, T.

    2014-05-01

    In this contribution a framework for the computer-aided planning and optimisation of functionally graded components is presented. The framework is divided into three modules - the "Component Description", the "Expert System" for the synthesis of several process chains, and the "Modelling and Process Chain Optimisation". The Component Description module enhances a standard computer-aided design (CAD) model with a voxel-based representation of the graded properties. The Expert System synthesises process steps stored in the knowledge base to generate several alternative process chains. Each process chain is capable of producing components according to the enhanced CAD model and usually consists of a sequence of heating, cooling, and forming processes. The dependencies between the component and the applied manufacturing processes, as well as between the processes themselves, need to be considered. The Expert System utilises an ontology for that purpose. The ontology represents all dependencies in a structured way and connects the information in the knowledge base via relations. The third module performs the evaluation of the generated process chains. To accomplish this, the parameters of each process are optimised with respect to the component specification, and the result of the best parameterisation is used as the representative value. Finally, the process chain which is capable of manufacturing a functionally graded component in an optimal way with regard to the property distributions of the component description is presented by means of a dedicated specification technique.
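
    To make the per-chain optimisation step concrete, here is a hedged toy sketch: an invented process model maps two process parameters to a property profile over a one-dimensional row of voxels, and the parameters are tuned to match a target graded-property profile; the best fit would then serve as the chain's representative value. Model, target, and parameter names are all assumptions, not the framework's actual components.

```python
import numpy as np
from scipy.optimize import minimize

depth = np.linspace(0.0, 1.0, 20)               # voxel row through the part
target = 200.0 + 250.0 * depth                  # desired graded hardness

def process_model(params):
    base, gradient = params                     # toy heating/cooling knobs
    return base + gradient * depth

def mismatch(params):
    return float(np.mean((process_model(params) - target) ** 2))

best = minimize(mismatch, x0=[300.0, 100.0], method="Nelder-Mead")
print(best.x)   # converges near [200, 250]
```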

  20. Technical Potential Assessment for the Renewable Energy Zone (REZ) Process: A GIS-Based Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Nathan; Roberts, Billy J

    Geographic Information Systems (GIS)-based energy resource and technical potential assessments identify areas capable of supporting high levels of renewable energy (RE) development as part of a Renewable Energy Zone (REZ) Transmission Planning process. This document expands on the REZ Process to aid practitioners in conducting GIS-based RE resource and technical potential assessments. The REZ process is an approach to plan, approve, and build transmission infrastructure that connects REZs - geographic areas that have high-quality RE resources, suitable topography and land-use designations, and demonstrated developer interest - to the power system. The REZ process helps to increase the share of solar photovoltaic (PV), wind, and other resources while also maintaining reliability and economics.
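
    A minimal sketch of the raster-based screening such an assessment performs: overlay boolean exclusion masks on a resource grid and sum the remaining developable area. All thresholds, grids, and the cell size are illustrative assumptions, not REZ guidance values.

```python
import numpy as np

rng = np.random.default_rng(1)
ghi = rng.uniform(3.5, 6.5, (100, 100))       # solar resource, kWh/m2/day
slope = rng.uniform(0.0, 25.0, (100, 100))    # terrain slope, percent
protected = rng.random((100, 100)) < 0.15     # protected-land mask

suitable = (ghi >= 5.0) & (slope <= 5.0) & ~protected
cell_km2 = 1.0                                # assume 1 km x 1 km cells
print(f"developable area: {suitable.sum() * cell_km2:.0f} km2")
```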
