Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-23
...-02] RIN 0694-AE98 Simplified Network Application Processing System, On-Line Registration and Account...'') electronically via BIS's Simplified Network Application Processing (SNAP-R) system. Currently, parties must... Network Applications Processing System (SNAP-R) in October 2006. The SNAP-R system provides a Web based...
Teaching Case: MiHotel--Applicant Processing System Design Case
ERIC Educational Resources Information Center
Miller, Robert E.; Dunn, Paul
2018-01-01
This teaching case describes the functionality of an applicant processing system designed for a fictitious hotel chain. The system detailed in the case includes a webform where applicants complete and submit job applications. The system also includes a desktop application used by hotel managers and Human Resources to track applications and process…
The application of intelligent process control to space based systems
NASA Technical Reports Server (NTRS)
Wakefield, G. Steve
1990-01-01
The application of Artificial Intelligence to electronic and process control can help attain the autonomy and safety requirements of manned space systems. An overview of documented applications within various industries is presented. The development process is discussed along with associated issues for implementing an intelligence process control system.
DMD: a digital light processing application to projection displays
NASA Astrophysics Data System (ADS)
Feather, Gary A.
1989-01-01
Revolutionary technologies achieve rapid product and subsequent business diffusion only when the inventors focus on technology application, maturation, and proliferation. A revolutionary technology is emerging with micro-electromechanical systems (MEMS). MEMS are being developed by leveraging mature semiconductor processing coupled with mechanical systems into complete, integrated, useful systems. The digital micromirror device (DMD), a Texas Instruments invented MEMS, has focused on its application to projection displays. The DMD has demonstrated its application as a digital light processor, processing and producing compelling computer and video projection displays. This tutorial discusses requirements in the projection display market and the potential solutions offered by this digital light processing system. The seminar includes an evaluation of the market, system needs, design, fabrication, application, and performance results of a system using digital light processing solutions.
NASA Technical Reports Server (NTRS)
1972-01-01
The IDAPS (Image Data Processing System) is a user-oriented, computer-based language and control system, which provides a framework or standard for implementing image data processing applications, simplifies set-up of image processing runs so that the system may be used without a working knowledge of computer programming or operation, streamlines operation of the image processing facility, and allows multiple applications to be run in sequence without operator interaction. The control system loads the operators, interprets the input, constructs the necessary parameters for each application, and calls the application. The overlay feature of the IBSYS loader (IBLDR) provides the means of running multiple operators which would otherwise overflow core storage.
NASA Technical Reports Server (NTRS)
Huh, Oscar Karl; Leibowitz, Scott G.; Dirosa, Donald; Hill, John M.
1986-01-01
The use of NOAA Advanced Very High Resolution Radiometer/High Resolution Picture Transmission (AVHRR/HRPT) imagery for earth resource applications is described for the applications scientist working within the various Earth science, resource, and agricultural disciplines. A guide to processing NOAA AVHRR data using the hardware and software systems integrated for this NASA project is provided. The processing steps from raw data on computer compatible tapes (1B data format) through usable qualitative and quantitative products for applications are given. The manual is divided into two parts. The first section describes the NOAA satellite system, its sensors, and the theoretical basis for using these data for environmental applications. Part 2 is a hands-on description of how to use a specific image processing system, the International Imaging Systems, Inc. (I2S) Model 75 Array Processor and S575 software, to process these data.
47 CFR 22.959 - Rules governing processing of applications for initial systems.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 2 2010-10-01 2010-10-01 false Rules governing processing of applications for...) COMMON CARRIER SERVICES PUBLIC MOBILE SERVICES Cellular Radiotelephone Service § 22.959 Rules governing processing of applications for initial systems. Pending applications for authority to operate the first...
Code of Federal Regulations, 2014 CFR
2014-10-01
..., development or installation of a statewide automated application processing and information retrieval system.... (2) The system is compatible with the claims processing and information retrieval systems used in the... in the title IV-A (AFDC) Automated Application Processing and Information Retrieval System Guide...
Code of Federal Regulations, 2013 CFR
2013-10-01
..., development or installation of a statewide automated application processing and information retrieval system.... (2) The system is compatible with the claims processing and information retrieval systems used in the... in the title IV-A (AFDC) Automated Application Processing and Information Retrieval System Guide...
Code of Federal Regulations, 2012 CFR
2012-10-01
..., development or installation of a statewide automated application processing and information retrieval system.... (2) The system is compatible with the claims processing and information retrieval systems used in the... in the title IV-A (AFDC) Automated Application Processing and Information Retrieval System Guide...
Code of Federal Regulations, 2011 CFR
2011-10-01
..., development or installation of a statewide automated application processing and information retrieval system.... (2) The system is compatible with the claims processing and information retrieval systems used in the... in the title IV-A (AFDC) Automated Application Processing and Information Retrieval System Guide...
Code of Federal Regulations, 2010 CFR
2010-10-01
..., development or installation of a statewide automated application processing and information retrieval system.... (2) The system is compatible with the claims processing and information retrieval systems used in the... in the title IV-A (AFDC) Automated Application Processing and Information Retrieval System Guide...
Expert systems in the process industries
NASA Technical Reports Server (NTRS)
Stanley, G. M.
1992-01-01
This paper gives an overview of industrial applications of real-time knowledge based expert systems (KBES's) in the process industries. After a brief overview of the features of a KBES useful in process applications, the general roles of KBES's are covered. A particular focus is diagnostic applications, one of the major applications areas. Many applications are seen as an expansion of supervisory control. The lessons learned from numerous online applications are summarized.
76 FR 46774 - Privacy Act of 1974; System of Records-Federal Student Aid Application File
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-03
... Information and Regulatory Affairs in the Office of Management and Budget (OMB), on July 21, 2011. This... altered system of records to: Director, Application Processing Division, Program Management Systems...: Director, Application Processing Division, Program Management Systems, Federal Student Aid, U.S. Department...
NASA Astrophysics Data System (ADS)
Ariana, I. M.; Bagiada, I. M.
2018-01-01
Development of spreadsheet-based integrated transaction processing systems and financial reporting systems is intended to optimize the capabilities of spreadsheets in accounting data processing. The purposes of this study are: 1) to describe the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 2) to test their technical and operational feasibility. This is a research and development study. The main steps are: 1) needs analysis (needs assessment); 2) developing the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 3) testing the feasibility of the spreadsheet-based integrated transaction processing systems and financial reporting systems. Technical feasibility includes the ability of the hardware and operating systems to respond to the accounting application, simplicity, and ease of use. Operational feasibility includes the ability of users to use the accounting application, the ability of the accounting application to produce information, and the controls of the accounting application. The instrument used to assess the technical and operational feasibility of the systems is an expert perception questionnaire. The instrument uses a 4-point Likert scale, from 1 (strongly disagree) to 4 (strongly agree). Data were analyzed using percentage analysis, comparing the number of answers within one (1) item to the ideal number of answers within one (1) item. The spreadsheet-based integrated transaction processing systems and financial reporting systems integrate sales, purchases, and cash transaction processing systems to produce financial reports (statement of profit or loss and other comprehensive income, statement of changes in equity, statement of financial position, and statement of cash flows) and other reports. The spreadsheet-based integrated transaction processing systems and financial reporting systems are feasible from the technical (87.50%) and operational (84.17%) aspects.
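As an illustration of the percentage analysis described above, a minimal sketch in Python; the item scores below are hypothetical and only the scoring rule (sum of answers divided by the ideal score of 4 points per item) follows the abstract.

```python
# Minimal sketch of the percentage-analysis step described above.
# The questionnaire responses are hypothetical examples, not data from the study;
# only the scoring rule (sum of answers / ideal score, 4 points per item) is kept.

def feasibility_percentage(responses, max_scale=4):
    """Return the percentage of the ideal (maximum) score for a set of
    4-point Likert responses, one value per questionnaire item."""
    ideal = max_scale * len(responses)          # ideal answer for every item
    return 100.0 * sum(responses) / ideal

technical_items = [4, 3, 4, 3, 4, 3]            # hypothetical expert ratings
operational_items = [3, 4, 3, 4, 3, 3]

print(f"technical feasibility:   {feasibility_percentage(technical_items):.2f}%")
print(f"operational feasibility: {feasibility_percentage(operational_items):.2f}%")
```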
System on a chip with MPEG-4 capability
NASA Astrophysics Data System (ADS)
Yassa, Fathy; Schonfeld, Dan
2002-12-01
Current products supporting video communication applications rely on existing computer architectures. RISC processors have been used successfully in numerous applications over several decades. DSP processors have become ubiquitous in signal processing and communication applications. Real-time applications such as speech processing in cellular telephony rely extensively on the computational power of these processors. Video processors designed to implement the computationally intensive codec operations have also been used to address the high demands of video communication applications (e.g., cable set-top boxes and DVDs). This paper presents an overview of a system-on-chip (SOC) architecture used for real-time video in wireless communication applications. The SOC specifications respond to the system requirements imposed by the application environment. A CAM-based video processor is used to accelerate data-intensive video compression tasks such as motion estimation and filtering. Other components are dedicated to system-level data processing and audio processing. A rich set of I/Os allows the SOC to communicate with other system components such as baseband and memory subsystems.
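The motion estimation task that the CAM-based video processor accelerates can be illustrated in software; the sketch below is a plain full-search block-matching routine using the sum of absolute differences (SAD), not the SOC's actual hardware algorithm.

```python
import numpy as np

def full_search_block_match(ref, cur, block=8, search=7):
    """Toy full-search motion estimation with SAD, illustrating the kind of
    data-intensive kernel a dedicated video processor accelerates."""
    h, w = cur.shape
    vectors = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            target = cur[by:by + block, bx:bx + block].astype(np.int32)
            best, best_mv = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue
                    cand = ref[y:y + block, x:x + block].astype(np.int32)
                    sad = int(np.abs(target - cand).sum())
                    if best is None or sad < best:
                        best, best_mv = sad, (dy, dx)
            vectors[(by, bx)] = best_mv
    return vectors

# Tiny usage example on a synthetically shifted frame (illustration only).
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (32, 32), dtype=np.uint8)
cur = np.roll(ref, shift=(2, -1), axis=(0, 1))        # simulate a global shift
print(full_search_block_match(ref, cur)[(8, 8)])      # expect roughly (-2, 1)
```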
47 CFR 22.959 - Rules governing processing of applications for initial systems.
Code of Federal Regulations, 2011 CFR
2011-10-01
... initial systems. 22.959 Section 22.959 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES PUBLIC MOBILE SERVICES Cellular Radiotelephone Service § 22.959 Rules governing processing of applications for initial systems. Pending applications for authority to operate the first...
Laadan, Oren; Nieh, Jason; Phung, Dan
2012-10-02
Methods, media and systems for managing a distributed application running in a plurality of digital processing devices are provided. In some embodiments, a method includes running one or more processes associated with the distributed application in virtualized operating system environments on a plurality of digital processing devices, suspending the one or more processes, and saving network state information relating to network connections among the one or more processes. The method further includes storing process information relating to the one or more processes, recreating the network connections using the saved network state information, and restarting the one or more processes using the stored process information.
High Available COTS Based Computer for Space
NASA Astrophysics Data System (ADS)
Hartmann, J.; Magistrati, Giorgio
2015-09-01
The availability and reliability factors of a system are central requirements of a target application. From a simple fuel injection system used in cars up to the flight control system of an autonomously navigating spacecraft, each application defines its specific availability factor under the target application boundary conditions. Increasing quality requirements on data processing systems used in space flight applications call for new architectures that fulfill the availability and reliability demands as well as the increase in required data processing power. At the same time, simplification and the use of COTS components to decrease costs, while keeping interface compatibility with currently used system standards, are clear customer needs. Data processing system design is mostly dominated by strict fulfillment of customer requirements, and reuse of available computer systems has not always been possible because of obsolescence of EEE parts, insufficient I/O capabilities, or the fact that available data processing systems did not provide the required scalability and performance.
Parallel Signal Processing and System Simulation using aCe
NASA Technical Reports Server (NTRS)
Dorband, John E.; Aburdene, Maurice F.
2003-01-01
Recently, networked and cluster computation have become very popular for both signal processing and system simulation. The new aCe language is ideally suited for parallel signal processing applications and system simulation since it allows the programmer to explicitly express the computations that can be performed concurrently. In addition, this new C-based parallel language (aCe C) for architecture-adaptive programming allows programmers to implement algorithms and system simulation applications on parallel architectures by providing them with the assurance that future parallel architectures will be able to run their applications with a minimum of modification. In this paper, we will focus on some fundamental features of aCe C and present a signal processing application (FFT).
NASA Technical Reports Server (NTRS)
Spurlock, Paul; Spurlock, Jack M.; Evanich, Peggy L.
1991-01-01
An overview of recent developments in process-control technology which might have applications in future advanced life support systems for long-duration space operations is presented. Consideration is given to design criteria related to control system selection and optimization, and process-control interfacing methodology. Attention is also given to current life support system process control strategies, innovative sensors, instrumentation and control, and innovations in process supervision.
System and method for deriving a process-based specification
NASA Technical Reports Server (NTRS)
Hinchey, Michael Gerard (Inventor); Rouff, Christopher A. (Inventor); Rash, James Larry (Inventor)
2009-01-01
A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.
How to Take HRMS Process Management to the Next Level with Workflow Business Event System
NASA Technical Reports Server (NTRS)
Rajeshuni, Sarala; Yagubian, Aram; Kunamaneni, Krishna
2006-01-01
Oracle Workflow with the Business Event System offers a complete process management solution for enterprises to manage business processes cost-effectively. Using Workflow event messaging, event subscriptions, AQ Servlet and advanced queuing technologies, this presentation will demonstrate the step-by-step design and implementation of system solutions in order to integrate two dissimilar systems and establish communication remotely. As a case study, the presentation walks you through the process of propagating organization name changes in other applications that originated from the HRMS module without changing applications code. The solution can be applied to your particular business cases for streamlining or modifying business processes across Oracle and non-Oracle applications.
Distributed Processing with a Mainframe-Based Hospital Information System: A Generalized Solution
Kirby, J. David; Pickett, Michael P.; Boyarsky, M. William; Stead, William W.
1987-01-01
Over the last two years the Medical Center Information Systems Department at Duke University Medical Center has been developing a systematic approach to distributing the processing and data involved in computerized applications at DUMC. The resulting system has been named MAPS, the Micro-ADS Processing System. A key characteristic of MAPS is that it makes it easy to execute any existing mainframe ADS application with a request from a PC. This extends the functionality of the mainframe application set to the PC without compromising the maintainability of the PC or mainframe systems.
Fine grained event processing on HPCs with the ATLAS Yoda system
NASA Astrophysics Data System (ADS)
Calafiura, Paolo; De, Kaushik; Guan, Wen; Maeno, Tadashi; Nilsson, Paul; Oleynik, Danila; Panitkin, Sergey; Tsulaia, Vakhtang; Van Gemmeren, Peter; Wenaus, Torre
2015-12-01
High performance computing facilities present unique challenges and opportunities for HEP event processing. The massive scale of many HPC systems means that fractionally small utilization can yield large returns in processing throughput. Parallel applications which can dynamically and efficiently fill any scheduling opportunities the resource presents benefit both the facility (maximal utilization) and the (compute-limited) science. The ATLAS Yoda system provides this capability to HEP-like event processing applications by implementing event-level processing in an MPI-based master-client model that integrates seamlessly with the more broadly scoped ATLAS Event Service. Fine grained, event level work assignments are intelligently dispatched to parallel workers to sustain full utilization on all cores, with outputs streamed off to destination object stores in near real time with similarly fine granularity, such that processing can proceed until termination with full utilization. The system offers the efficiency and scheduling flexibility of preemption without requiring the application actually support or employ check-pointing. We will present the new Yoda system, its motivations, architecture, implementation, and applications in ATLAS data processing at several US HPC centers.
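A minimal master-client sketch of event-level work dispatch over MPI, in the spirit of (but far simpler than) the Yoda model described above; it assumes mpi4py and an MPI launcher are available, and the event IDs are placeholders rather than real ATLAS payloads.

```python
from mpi4py import MPI  # assumes an MPI environment and mpi4py are available

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
TAG_REQUEST, TAG_WORK, TAG_DONE = 1, 2, 3

if rank == 0:
    # Master: hand out event IDs one at a time so every worker stays busy.
    events = list(range(100))                 # placeholder event IDs
    active = comm.Get_size() - 1
    while active > 0:
        status = MPI.Status()
        comm.recv(source=MPI.ANY_SOURCE, tag=TAG_REQUEST, status=status)
        worker = status.Get_source()
        if events:
            comm.send(events.pop(), dest=worker, tag=TAG_WORK)
        else:
            comm.send(None, dest=worker, tag=TAG_DONE)
            active -= 1
else:
    # Worker: request work until told to stop, then exit.
    while True:
        comm.send(None, dest=0, tag=TAG_REQUEST)
        status = MPI.Status()
        event = comm.recv(source=0, tag=MPI.ANY_TAG, status=status)
        if status.Get_tag() == TAG_DONE:
            break
        # ... process the event and stream the output off-node ...
```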
Helium refrigeration system for hydrogen liquefaction applications
NASA Astrophysics Data System (ADS)
Nair, J. Kumar, Sr.; Menon, RS; Goyal, M.; Ansari, NA; Chakravarty, A.; Joemon, V.
2017-02-01
Liquid hydrogen around 20 K is used as cold moderator for generating “cold neutron beam” in nuclear research reactors. A cryogenic helium refrigeration system is the core upon which such hydrogen liquefaction applications are built. A thermodynamic process based on reversed Brayton cycle with two stage expansion using high speed cryogenic turboexpanders (TEX) along with a pair of compact high effectiveness process heat exchangers (HX), is well suited for such applications. An existing helium refrigeration system, which had earlier demonstrated a refrigeration capacity of 470 W at around 20 K, is modified based on past operational experiences and newer application requirements. Modifications include addition of a new heat exchanger to simulate cryogenic process load and two other heat exchangers for controlling the temperatures of helium streams leading out to the application system. To incorporate these changes, cryogenic piping inside the cold box is suitably modified. This paper presents process simulation, sizing of new heat exchangers as well as fabrication aspects of the modified cryogenic process piping.
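For orientation only, a back-of-the-envelope estimate of the helium mass flow needed to absorb a given refrigeration load; the specific heat and the assumed temperature rise are illustrative values, not design data for the plant described above.

```python
# Rough sensible-heat estimate Q = m_dot * cp * dT for a helium stream.
# cp and the temperature rise are illustrative assumptions, not plant data.

CP_HELIUM = 5193.0      # J/(kg K), ideal-gas value for helium

def mass_flow_for_load(q_watts, delta_t_kelvin, cp=CP_HELIUM):
    """Helium mass flow (kg/s) needed to absorb q_watts over a delta_t rise."""
    return q_watts / (cp * delta_t_kelvin)

# Example: 470 W of refrigeration absorbed over an assumed 2 K stream warm-up.
print(f"{mass_flow_for_load(470.0, 2.0) * 1000:.1f} g/s")
```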
NASA Astrophysics Data System (ADS)
Boelger, B.; Ferwerda, H. A.
Various papers on optics, optical systems, and their applications are presented. The general topics addressed include: laser systems; optical and electrooptical materials and devices; novel spectroscopic techniques and applications; inspection, remote sensing, velocimetry, and gauging; optical design and image formation; holography, image processing, and storage; and integrated and fiber optics. Also discussed are: nonlinear optics; nonlinear photorefractive materials; scattering and diffraction; applications in materials processing, deposition, and machining; medical and biological applications; and focus on industry.
The Need for V&V in Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1997-01-01
V&V is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.
1999-11-01
represents the linear time invariant (LTI) response of the combined analysis/synthesis system while the second represents the aliasing introduced into...effectively to implement voice scrambling systems based on time-frequency permutation. The most general form of such a system is shown in Fig. 22 where...92201 NEUILLY-SUR-SEINE CEDEX, FRANCE RTO LECTURE SERIES 216 Application of Mathematical Signal Processing Techniques to Mission Systems (1
Validation, Edits, and Application Processing System Report: Phase I.
ERIC Educational Resources Information Center
Gray, Susan; And Others
Findings of phase 1 of a study of the 1979-1980 Basic Educational Opportunity Grants validation, edits, and application processing system are presented. The study was designed to: assess the impact of the validation effort and processing system edits on the correct award of Basic Grants; and assess the characteristics of students most likely to…
NASA Astrophysics Data System (ADS)
Renken, Hartmut; Oelze, Holger W.; Rath, Hans J.
1998-04-01
The design and application of a digital high-speed image data capturing system with a downstream image processing system applied to the Bremer Hochschul Hyperschallkanal BHHK is the content of this presentation. It is also the result of the cooperation between the aerodynamics and image processing departments at the ZARM institute at the Drop Tower of Bremen. Similar systems are used by the combustion working group at ZARM and other external project partners. The BHHK, the camera and image storage system, as well as the personal computer based image processing software are described next. Some examples of images taken at the BHHK are shown to illustrate the application. The new and very user-friendly 32-bit Windows system is capable of capturing all camera data with a maximum pixel clock of 43 MHz and of processing complete sequences of images in one step using a single comfortable program.
The application of digital techniques to the analysis of metallurgical experiments
NASA Technical Reports Server (NTRS)
Rathz, T. J.
1977-01-01
The application of a specific digital computer system (known as the Image Data Processing System) to the analysis of three NASA-sponsored metallurgical experiments is discussed in some detail. The basic hardware and software components of the Image Data Processing System are presented. Many figures are presented in the discussion of each experimental analysis in an attempt to show the accuracy and speed that the Image Data Processing System affords in analyzing photographic images dealing with metallurgy, and in particular with material processing.
Effective Application of a Quality System in the Donation Process at Hospital Level.
Trujnara, M; Czerwiński, J; Osadzińska, J
2016-06-01
This article describes the application of a quality system at the hospital level at the Multidisciplinary Hospital in Warsaw-Międzylesie in Poland. A quality system of hospital procedures (in accordance with ISO 9001:2008) regarding the donation process, from the identification of a possible donor to the retrieval of organs, was applied there in 2014. Seven independent documents about hospital procedures were designed to cover the entire process of donation. The number of donors identified increased after the application of the quality system. The reason for this increase is, above all, the cooperation of the well-trained team of specialists who have been engaged in the process of donation for many years, but formal procedures certainly organize the process and make it easier.
A Framework for Performing Verification and Validation in Reuse Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1997-01-01
Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission- critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.
Workflow management systems in radiology
NASA Astrophysics Data System (ADS)
Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim
1998-07-01
In a situation of shrinking health care budgets, increasing cost pressure and growing demands to increase the efficiency and the quality of medical services, health care enterprises are forced to optimize or completely re-design their processes. Although information technology is agreed to potentially contribute to cost reduction and efficiency improvement, the real success factors are the re-definition and automation of processes: Business Process Re-engineering and Workflow Management. In this paper we discuss architectures for the use of workflow management systems in radiology. We propose to move forward from information systems in radiology (RIS, PACS) to Radiology Management Systems, in which workflow functionality (process definitions and process automation) is implemented through autonomous workflow management systems (WfMS). In a workflow oriented architecture, an autonomous workflow enactment service communicates with workflow client applications via standardized interfaces. In this paper, we discuss the need for and the benefits of such an approach, emphasizing the separation of workflow management systems and application systems, as well as the consequences that arise for the architecture of workflow-oriented information systems. This includes an appropriate workflow terminology, and the definition of standard interfaces for workflow-aware application systems. Workflow studies in various institutions have shown that most of the processes in radiology are well structured and suited for a workflow management approach. Numerous commercially available Workflow Management Systems (WfMS) were investigated, and some of them, which are process-oriented and application independent, appear suitable for use in radiology.
Expert Systems: A Conceptual Analysis and Prospects for Their Library Applications.
ERIC Educational Resources Information Center
Dubey, Yogendra P.
This paper begins with a discussion of the decision-making process. The application of operations research technologies to managerial decision-making is noted, and the development of management information systems in organizations and some limitations of these systems are discussed. An overview of the Human Information Processing System (HIPS)…
NASA Technical Reports Server (NTRS)
Bracken, P. A.; Dalton, J. T.; Quann, J. J.; Billingsley, J. B.
1978-01-01
The Atmospheric and Oceanographic Information Processing System (AOIPS) was developed to help applications investigators perform required interactive image data analysis rapidly and to eliminate the inefficiencies and problems associated with batch operation. This paper describes the configuration and processing capabilities of AOIPS and presents unique subsystems for displaying, analyzing, storing, and manipulating digital image data. Applications of AOIPS to research investigations in meteorology and earth resources are featured.
NASA Technical Reports Server (NTRS)
1978-01-01
The discipline programs of the Space and Terrestrial (S&T) Applications Program are described and examples of research areas of current interest are given. Application of space techniques to improve conditions on earth are summarized. Discipline programs discussed include: resource observations; environmental observations; communications; materials processing in space; and applications systems/information systems. Format information on submission of unsolicited proposals for research related to the S&T Applications Program are given.
Bent, John M.; Faibish, Sorin; Grider, Gary
2016-04-19
Cloud object storage is enabled for checkpoints of high performance computing applications using a middleware process. A plurality of files, such as checkpoint files, generated by a plurality of processes in a parallel computing system are stored by obtaining said plurality of files from said parallel computing system; converting said plurality of files to objects using a log structured file system middleware process; and providing said objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
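A schematic sketch of the files-to-objects step described above; the in-memory store and its put() call are stand-ins for a real cloud object storage client and are not part of PLFS or the patented middleware.

```python
import os

# Sketch of a middleware pass that turns checkpoint files into named objects
# for a cloud object store. The store below is a stand-in: any backend exposing
# a put(name, data) call would do; it is not the PLFS interface.

class InMemoryObjectStore:
    def __init__(self):
        self.objects = {}

    def put(self, name, data):
        self.objects[name] = data

def files_to_objects(checkpoint_dir, store, prefix="checkpoint"):
    """Read every file under checkpoint_dir and store each one as an object."""
    for root, _dirs, files in os.walk(checkpoint_dir):
        for fname in files:
            path = os.path.join(root, fname)
            object_name = f"{prefix}/{os.path.relpath(path, checkpoint_dir)}"
            with open(path, "rb") as fh:
                store.put(object_name, fh.read())

# Usage: files_to_objects("/scratch/ckpt/run42", InMemoryObjectStore())
```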
45 CFR 205.35 - Mechanized claims processing and information retrieval systems; definitions.
Code of Federal Regulations, 2012 CFR
2012-10-01
... claims processing and information retrieval systems; definitions. Section 205.35 through 205.38 contain...: (a) A mechanized claims processing and information retrieval system, hereafter referred to as an automated application processing and information retrieval system (APIRS), or the system, means a system of...
45 CFR 205.35 - Mechanized claims processing and information retrieval systems; definitions.
Code of Federal Regulations, 2013 CFR
2013-10-01
... claims processing and information retrieval systems; definitions. Section 205.35 through 205.38 contain...: (a) A mechanized claims processing and information retrieval system, hereafter referred to as an automated application processing and information retrieval system (APIRS), or the system, means a system of...
45 CFR 205.35 - Mechanized claims processing and information retrieval systems; definitions.
Code of Federal Regulations, 2014 CFR
2014-10-01
... claims processing and information retrieval systems; definitions. Section 205.35 through 205.38 contain...: (a) A mechanized claims processing and information retrieval system, hereafter referred to as an automated application processing and information retrieval system (APIRS), or the system, means a system of...
A characterization of workflow management systems for extreme-scale applications
Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia; ...
2017-02-16
We present that the automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed as extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.
NASA Technical Reports Server (NTRS)
Murray, R. W.
1973-01-01
A comprehensive study of advanced water recovery and solid waste processing techniques employed in both aerospace and domestic or commercial applications is reported. A systems approach was used to synthesize a prototype system design of an advanced water treatment/waste processing system. Household water use characteristics were studied and modified through the use of low water use devices and a limited amount of water reuse. This modified household system was then used as a baseline system for development of several water treatment waste processing systems employing advanced techniques. A hybrid of these systems was next developed and a preliminary design was generated to define system and hardware functions.
An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis
NASA Astrophysics Data System (ADS)
Kim, Yongmin; Alexander, Thomas
1986-06-01
In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.
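As a small illustration of the quantitative measurement mentioned above (cellular DNA content), the sketch below computes an integrated optical density over a thresholded region of a synthetic frame; the threshold, background level, and image are invented for illustration and are not the system's actual software.

```python
import numpy as np

def integrated_optical_density(image, background, threshold):
    """Toy version of a DNA-content style measurement: segment stained (dark)
    pixels and sum their optical density OD = log10(background / I)."""
    img = image.astype(np.float64)
    mask = img < threshold                        # crude segmentation of the nucleus
    od = np.log10(background / np.clip(img, 1.0, None))
    return float(od[mask].sum()), int(mask.sum())

# Synthetic 8-bit "nucleus" on a bright background, purely for illustration.
frame = np.full((64, 64), 200, dtype=np.uint8)
frame[24:40, 24:40] = 80                          # darker (stained) region
iod, area = integrated_optical_density(frame, background=200.0, threshold=120.0)
print(f"IOD = {iod:.1f} over {area} pixels")
```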
NASA Technical Reports Server (NTRS)
Friedrich, Craig R.; Warrington, Robert O.
1995-01-01
Micromechanical machining processes are those micro fabrication techniques which directly remove work piece material by either a physical cutting tool or an energy process. These processes are direct and therefore they can help reduce the cost and time for prototype development of micro mechanical components and systems. This is especially true for aerospace applications where size and weight are critical, and reliability and the operating environment are an integral part of the design and development process. The micromechanical machining processes are rapidly being recognized as a complementary set of tools to traditional lithographic processes (such as LIGA) for the fabrication of micromechanical components. Worldwide efforts in the U.S., Germany, and Japan are leading to results which sometimes rival lithography at a fraction of the time and cost. Efforts to develop processes and systems specific to aerospace applications are well underway.
NASA Technical Reports Server (NTRS)
Morrison, D. B. (Editor); Scherer, D. J.
1977-01-01
Papers are presented on a variety of techniques for the machine processing of remotely sensed data. Consideration is given to preprocessing methods such as the correction of Landsat data for the effects of haze, sun angle, and reflectance and to the maximum likelihood estimation of signature transformation algorithm. Several applications of machine processing to agriculture are identified. Various types of processing systems are discussed such as ground-data processing/support systems for sensor systems and the transfer of remotely sensed data to operational systems. The application of machine processing to hydrology, geology, and land-use mapping is outlined. Data analysis is considered with reference to several types of classification methods and systems.
NASA Technical Reports Server (NTRS)
Pokras, V. M.; Yevdokimov, V. P.; Maslov, V. D.
1978-01-01
The structure and potential of the information reference system OZhUR designed for the automated data processing systems of scientific space vehicles (SV) is considered. The system OZhUR ensures control of the extraction phase of processing with respect to a concrete SV and the exchange of data between phases. The practical application of the system OZhUR is exemplified in the construction of a data processing system for satellites of the Cosmos series. As a result of automating the operations of exchange and control, the volume of manual preparation of data is significantly reduced, and there is no longer any need for individual logs which fix the status of data processing. The system OZhUR is included in the automated data processing system Nauka, which is realized in the PL-1 language on a binary one-address system one-state (BOS OS) electronic computer.
Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin
2017-04-01
Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO Standard Business Process Model and Notation (BPMN) 2.X, a system-independent and interdisciplinary accepted graphical process control notation is provided, allowing process analysis, while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example, concerning the real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of the business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life science specialized LESs, the reduction of numerous different interfaces between BPMSs and subsystems, and the simplification of complex process modeling. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated by the presented BPMS-LES approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zitney, S.E.
This presentation will examine process systems engineering R&D needs for application to advanced fossil energy (FE) systems and highlight ongoing research activities at the National Energy Technology Laboratory (NETL) under the auspices of a recently launched Collaboratory for Process & Dynamic Systems Research. The three current technology focus areas include: 1) High-fidelity systems with NETL's award-winning Advanced Process Engineering Co-Simulator (APECS) technology for integrating process simulation with computational fluid dynamics (CFD) and virtual engineering concepts, 2) Dynamic systems with R&D on plant-wide IGCC dynamic simulation, control, and real-time training applications, and 3) Systems optimization including large-scale process optimization, stochastic simulation for risk/uncertainty analysis, and cost estimation. Continued R&D aimed at these and other key process systems engineering models, methods, and tools will accelerate the development of advanced gasification-based FE systems and produce increasingly valuable outcomes for DOE and the Nation.
NASA Astrophysics Data System (ADS)
Friberg, P. A.; Luis, R. S.; Quintiliani, M.; Lisowski, S.; Hunter, S.
2014-12-01
Recently, a novel set of modules has been included in the Open Source Earthworm seismic data processing system, supporting the use of web applications. These include the Mole sub-system, for storing relevant event data in a MySQL database (see M. Quintiliani and S. Pintore, SRL, 2013), and an embedded webserver, Moleserv, for serving such data to web clients in QuakeML format. These modules have enabled, for the first time using Earthworm, the use of web applications for seismic data processing. These can greatly simplify the operation and maintenance of seismic data processing centers by having one or more servers provide the relevant data, as well as the data processing applications themselves, to client machines running arbitrary operating systems. Web applications with secure online access allow operators to work anywhere, without the often cumbersome and bandwidth-hungry use of secure shell or virtual private networks. Furthermore, web applications can seamlessly access third-party data repositories to acquire additional information, such as maps. Finally, the use of HTML email has brought the possibility of specialized web applications to be used in email clients. This is the case for EWHTMLEmail, which produces event notification emails that are in fact simple web applications for plotting relevant seismic data. Providing web services as part of Earthworm has enabled a number of other tools as well. One is ISTI's EZ Earthworm, a web-based command and control system for an otherwise command-line driven system; another is a waveform web service. The waveform web service serves Earthworm data to additional web clients for plotting, picking, and other web-based processing tools. The current Earthworm waveform web service hosts an advanced plotting capability for providing views of event-based waveforms from a Mole database served by Moleserv. The current trend towards the use of cloud services supported by web applications is driving improvements in JavaScript, CSS and HTML, as well as faster and more efficient web browsers, including mobile ones. It is foreseeable that in the near future, web applications will be as powerful and efficient as native applications. Hence, the work described here is a first step towards bringing the Open Source Earthworm seismic data processing system to this new paradigm.
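A minimal client-side sketch of the kind of request a web application would make to an embedded event server such as Moleserv; the host, port, and query path below are placeholders, since the actual Moleserv URL scheme is not given here.

```python
# Sketch of a web client pulling event data in QuakeML from an embedded event
# server such as Moleserv. The URL and query parameters are placeholders;
# consult the actual server documentation for its real interface.
import urllib.request
import xml.etree.ElementTree as ET

def fetch_event_ids(base_url):
    with urllib.request.urlopen(base_url, timeout=10) as resp:
        quakeml = resp.read()
    root = ET.fromstring(quakeml)
    # QuakeML event elements carry a publicID attribute.
    return [ev.attrib.get("publicID")
            for ev in root.iter()
            if ev.tag.endswith("event")]

# Example (placeholder endpoint):
# print(fetch_event_ids("http://localhost:8080/quakeml?starttime=2014-01-01"))
```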
Advanced processing for high-bandwidth sensor systems
NASA Astrophysics Data System (ADS)
Szymanski, John J.; Blain, Phil C.; Bloch, Jeffrey J.; Brislawn, Christopher M.; Brumby, Steven P.; Cafferty, Maureen M.; Dunham, Mark E.; Frigo, Janette R.; Gokhale, Maya; Harvey, Neal R.; Kenyon, Garrett; Kim, Won-Ha; Layne, J.; Lavenier, Dominique D.; McCabe, Kevin P.; Mitchell, Melanie; Moore, Kurt R.; Perkins, Simon J.; Porter, Reid B.; Robinson, S.; Salazar, Alfonso; Theiler, James P.; Young, Aaron C.
2000-11-01
Compute performance and algorithm design are key problems of image processing and scientific computing in general. For example, imaging spectrometers are capable of producing data in hundreds of spectral bands with millions of pixels. These data sets show great promise for remote sensing applications, but require new and computationally intensive processing. The goal of the Deployable Adaptive Processing Systems (DAPS) project at Los Alamos National Laboratory is to develop advanced processing hardware and algorithms for high-bandwidth sensor applications. The project has produced electronics for processing multi- and hyper-spectral sensor data, as well as LIDAR data, while employing processing elements using a variety of technologies. The project team is currently working on reconfigurable computing technology and advanced feature extraction techniques, with an emphasis on their application to image and RF signal processing. This paper presents reconfigurable computing technology and advanced feature extraction algorithm work and their application to multi- and hyperspectral image processing. Related projects on genetic algorithms as applied to image processing will be introduced, as will the collaboration between the DAPS project and the DARPA Adaptive Computing Systems program. Further details are presented in other talks during this conference and in other conferences taking place during this symposium.
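As a small illustration of the per-pixel feature extraction such hardware accelerates, the sketch below computes a normalized band-difference index over a synthetic hyperspectral cube; the cube size and band indices are arbitrary and the computation is not taken from the DAPS project.

```python
import numpy as np

def normalized_band_index(cube, band_a, band_b, eps=1e-6):
    """Per-pixel normalized difference between two spectral bands of a
    (rows, cols, bands) hyperspectral cube, a simple stand-in for the
    feature-extraction kernels accelerated in hardware."""
    a = cube[..., band_a].astype(np.float64)
    b = cube[..., band_b].astype(np.float64)
    return (a - b) / (a + b + eps)

# Synthetic cube: 128 x 128 pixels, 64 bands of random data (illustrative only).
cube = np.random.rand(128, 128, 64)
index = normalized_band_index(cube, band_a=50, band_b=30)
print(index.shape, float(index.mean()))
```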
NASA Technical Reports Server (NTRS)
Chien, Steve A.
1996-01-01
A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must be able to compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems. This paper describes a planning application for automated image processing and our overall approach to knowledge acquisition for this application.
Suciu, George; Suciu, Victor; Martian, Alexandru; Craciunescu, Razvan; Vulpe, Alexandru; Marcu, Ioana; Halunga, Simona; Fratu, Octavian
2015-11-01
Big data storage and processing are considered as one of the main applications for cloud computing systems. Furthermore, the development of the Internet of Things (IoT) paradigm has advanced the research on Machine to Machine (M2M) communications and enabled novel tele-monitoring architectures for E-Health applications. However, there is a need for converging current decentralized cloud systems, general software for processing big data and IoT systems. The purpose of this paper is to analyze existing components and methods of securely integrating big data processing with cloud M2M systems based on Remote Telemetry Units (RTUs) and to propose a converged E-Health architecture built on Exalead CloudView, a search based application. Finally, we discuss the main findings of the proposed implementation and future directions.
Real-time optical fiber digital speckle pattern interferometry for industrial applications
NASA Astrophysics Data System (ADS)
Chan, Robert K.; Cheung, Y. M.; Lo, C. H.; Tam, T. K.
1997-03-01
There is current interest, especially in the industrial sector, in using the digital speckle pattern interferometry (DSPI) technique to measure surface stress. Indeed, the many publications on the subject are evidence of the growing interest in the field. However, bringing the technology to industrial use requires the integration of several emerging technologies, viz. optics, feedback control, electronics, image processing and digital signal processing. Due to the highly interdisciplinary nature of the technique, successful implementation and development require expertise in all of these fields. At Baptist University, under the funding of a major industrial grant, we are developing the technology for the industrial sector. Our system fully exploits optical fibers and diode lasers in the design to enable practical and rugged systems suited for industrial applications. Besides the development in optics, we have broken away from reliance on a microcomputer PC platform for both image capture and processing, and have developed a digital signal processing array system that can handle simultaneous and independent image capture/processing with feedback control. The system, named CASPA for 'cascadable architecture signal processing array,' is a third-generation development system that utilizes up to 7 digital signal processors and has proved to be a very powerful system. With our CASPA we are now in a better position to develop novel optical measurement systems for industrial applications that may require different measurement systems to operate concurrently and to exchange information between the systems. Applications such as simultaneous in-plane and out-of-plane DSPI image capture/processing, vibrational analysis with interactive DSPI, and phase-shifting control of optical systems are a few good examples of the potential.
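The phase-shifting processing mentioned above is commonly implemented with the standard four-step formula phi = atan2(I4 - I2, I1 - I3); the sketch below applies it to synthetic fringe frames and is not code from the CASPA system.

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Standard four-step phase-shifting formula: frames taken at 0, 90, 180
    and 270 degree reference shifts give phi = atan2(I4 - I2, I1 - I3)."""
    return np.arctan2(i4 - i2, i1 - i3)

# Synthetic fringe frames for illustration (A = bias, B = modulation).
phi_true = np.linspace(0, 4 * np.pi, 256).reshape(1, -1) * np.ones((256, 1))
A, B = 120.0, 80.0
frames = [A + B * np.cos(phi_true + k * np.pi / 2) for k in range(4)]
phi_wrapped = four_step_phase(*frames)          # wrapped to (-pi, pi]
print(phi_wrapped.shape)
```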
Parallel Processing with Digital Signal Processing Hardware and Software
NASA Technical Reports Server (NTRS)
Swenson, Cory V.
1995-01-01
The assembling and testing of a parallel processing system is described which will allow a user to move a Digital Signal Processing (DSP) application from the design stage to the execution/analysis stage through the use of several software tools and hardware devices. The system will be used to demonstrate the feasibility of the Algorithm To Architecture Mapping Model (ATAMM) dataflow paradigm for static multiprocessor solutions of DSP applications. The individual components comprising the system are described followed by the installation procedure, research topics, and initial program development.
NASA Technical Reports Server (NTRS)
Anderson, W. F.; Conway, J. R.; Keller, L. C.
1972-01-01
The characteristics of the application program were developed to verify and demonstrate the SEL 840MP Multi-Processing Control System - Version I (MPCS/1). The application program emphasizes the display support and task control capabilities. The application program is further intended to be used as an aid to familiarization with MPCS/1. It complements the information provided in the MPCS/1 Users Guide, Volumes I and II.
Potential use of advanced process control for safety purposes during attack of a process plant.
Whiteley, James R
2006-03-17
Many refineries and commodity chemical plants employ advanced process control (APC) systems to improve throughputs and yields. These APC systems utilize empirical process models for control purposes and enable operation closer to constraints than can be achieved with traditional PID regulatory feedback control. Substantial economic benefits are typically realized from the addition of APC systems. This paper considers leveraging the control capabilities of existing APC systems to minimize the potential impact of a terrorist attack on a process plant (e.g., petroleum refinery). Two potential uses of APC are described. The first is a conventional application of APC and involves automatically moving the process to a reduced operating rate when an attack first begins. The second is a non-conventional application and involves reconfiguring the APC system to optimize safety rather than economics. The underlying intent in both cases is to reduce the demands on the operator to allow focus on situation assessment and optimal response planning. An overview of APC is provided along with a brief description of the modifications required for the proposed new applications of the technology.
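A toy sketch of the reconfiguration idea: the same bounded optimizer is driven by an economic objective in normal operation and by a safety objective in attack-response mode. The variable, bounds, and cost terms are invented for illustration and do not model any real unit or commercial APC product.

```python
# Toy illustration of switching an APC-style optimizer from an economic
# objective to a safety objective. All numbers are invented for illustration.
from scipy.optimize import minimize_scalar

RATE_MIN, RATE_MAX = 40.0, 100.0      # allowed throughput range (arbitrary units)
SAFE_RATE = 50.0                      # hypothetical low-inventory, low-pressure rate

def economic_cost(rate):
    return -(3.0 * rate - 0.01 * rate**2)     # maximize margin -> minimize negative

def safety_cost(rate):
    return (rate - SAFE_RATE) ** 2            # drive the plant toward the safe rate

def choose_setpoint(mode):
    cost = economic_cost if mode == "normal" else safety_cost
    res = minimize_scalar(cost, bounds=(RATE_MIN, RATE_MAX), method="bounded")
    return res.x

print("normal operation :", round(choose_setpoint("normal"), 1))
print("attack response  :", round(choose_setpoint("attack"), 1))
```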
Automation of experimental research of waveguide paths induction soldering
NASA Astrophysics Data System (ADS)
Tynchenko, V. S.; Petrenko, V. E.; Kukartsev, V. V.; Tynchenko, V. V.; Antamoshkin, O. A.
2018-05-01
The article presents an automated system for experimental studies of the waveguide path induction soldering process. The system is part of additional software for a complex for automated control of the technological process of induction soldering of thin-walled waveguide paths made of aluminum alloys, expanding its capabilities. The structure of the software product, the general appearance of the controls, and the potential application possibilities are presented. The utility of the developed application was evaluated and confirmed through a series of field experiments. Applying the experimental research system makes it possible to improve the process under consideration, enabling fine-tuning of the control regulators as well as keeping statistics of the soldering process in a form convenient for analysis.
NASA Technical Reports Server (NTRS)
Eckhardt, D. E., Jr.
1979-01-01
A model of a central processor (CPU) which services background applications in the presence of time-critical activity is presented. The CPU is viewed as an M/M/1 queueing system subject to periodic interrupts by a deterministic, time-critical process. The Laplace transform of the distribution of service times for the background applications is developed. The use of state-of-the-art queueing models for studying the background processing capability of time-critical computer systems is discussed, and the results of a model validation study which support this application of queueing models are presented.
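For orientation, the standard M/M/1 relations that such a model starts from, together with a crude capacity-reduction view of the periodic interrupt (an approximation added here for illustration; it is not the Laplace-transform result derived in the paper):

```latex
% Standard M/M/1 baseline (not the paper's derived transform):
% background arrivals at rate \lambda, exponential service at rate \mu.
\[
  \rho = \frac{\lambda}{\mu}, \qquad
  W = \frac{1}{\mu - \lambda} \quad (\text{mean time in system},\ \rho < 1).
\]
% Crude capacity view of a deterministic interrupt of length d every period T:
\[
  \mu_{\mathrm{eff}} \approx \mu\left(1 - \frac{d}{T}\right), \qquad
  W_{\mathrm{eff}} \approx \frac{1}{\mu_{\mathrm{eff}} - \lambda}.
\]
```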
Modeling and Analysis of Power Processing Systems (MAPPS). Volume 1: Technical report
NASA Technical Reports Server (NTRS)
Lee, F. C.; Rahman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.
1980-01-01
Computer aided design and analysis techniques were applied to power processing equipment. Topics covered include: (1) discrete time domain analysis of switching regulators for performance analysis; (2) design optimization of power converters using augmented Lagrangian penalty function technique; (3) investigation of current-injected multiloop controlled switching regulators; and (4) application of optimization for Navy VSTOL energy power system. The generation of the mathematical models and the development and application of computer aided design techniques to solve the different mathematical models are discussed. Recommendations are made for future work that would enhance the application of the computer aided design techniques for power processing systems.
A Distributed Operating System for BMD Applications.
1982-01-01
Defense) applications executing on distributed hardware with local and shared memories. The objective was to develop real-time operating system functions...make the Basic Real-Time Operating System, and the set of new EPL language primitives that provide BMD application processes with efficient mechanisms
Systematic Review: Concept and tool development with application to the National Toxicology Program (NTP) and the Integrated Risk Information System (IRIS) Assessment Processes. There is growing interest within the environmental health community to incorporate systematic review m...
Multimedia content analysis and indexing: evaluation of a distributed and scalable architecture
NASA Astrophysics Data System (ADS)
Mandviwala, Hasnain; Blackwell, Scott; Weikart, Chris; Van Thong, Jean-Manuel
2003-11-01
Multimedia search engines facilitate the retrieval of documents from large media content archives now available via intranets and the Internet. Over the past several years, many research projects have focused on algorithms for analyzing and indexing media content efficiently. However, special system architectures are required to process large amounts of content from real-time feeds or existing archives. Possible solutions include dedicated distributed architectures for analyzing content rapidly and for making it searchable. The system architecture we propose implements such an approach: a highly distributed and reconfigurable batch media content analyzer that can process media streams and static media repositories. Our distributed media analysis application handles media acquisition, content processing, and document indexing. This collection of modules is orchestrated by a task flow management component, exploiting data and pipeline parallelism in the application. A scheduler manages load balancing and prioritizes the different tasks. Workers implement application-specific modules that can be deployed on an arbitrary number of nodes running different operating systems. Each application module is exposed as a web service, implemented with industry-standard interoperable middleware components such as Microsoft ASP.NET and Sun J2EE. Our system architecture is the next generation system for the multimedia indexing application demonstrated by www.speechbot.com. It can process large volumes of audio recordings with minimal support and maintenance, while running on low-cost commodity hardware. The system has been evaluated on a server farm running concurrent content analysis processes.
A Strategy for Improved System Assurance
2007-06-20
Quality (Measurements Life Cycle Safety, Security & Others) ISO/IEC 12207 * Software Life Cycle Processes ISO 9001 Quality Management System...14598 Software Product Evaluation Related ISO/IEC 90003 Guidelines for the Application of ISO 9001:2000 to Computer Software IEEE 12207 Industry...Implementation of International Standard ISO/IEC 12207 IEEE 1220 Standard for Application and Management of the System Engineering Process Use in
High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering
NASA Technical Reports Server (NTRS)
Maly, K.
1998-01-01
Monitoring is an essential process to observe and improve the reliability and the performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events is generated by the system components during execution or interaction with external objects (e.g. users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and may be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding end-point management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and to minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and the performance of the Interactive Remote Instruction (IRI) system, a large-scale distributed system for collaborative distance learning. The filtering mechanism is an intrinsic component of the monitoring architecture that reduces the volume of event traffic flow in the system and thereby reduces the intrusiveness of the monitoring process. We are developing an event filtering architecture to efficiently process the large volume of event traffic generated by LSD systems (such as distributed interactive applications). This filtering architecture is used to monitor a collaborative distance learning application to obtain debugging and feedback information. Our architecture supports the dynamic (re)configuration and optimization of event filters in large-scale distributed systems. Our work contributes by (1) surveying and evaluating existing event filtering mechanisms for monitoring LSD systems and (2) devising an integrated, scalable, high-performance event filtering architecture that spans several key application domains, presenting techniques to improve functionality, performance and scalability. This paper describes the primary characteristics and challenges of developing high-performance event filtering for monitoring LSD systems. We survey existing event filtering mechanisms and explain the key characteristics of each technique. In addition, we discuss limitations of existing event filtering mechanisms and outline how our architecture improves key aspects of event filtering.
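A minimal sketch of the subscription-based event filtering idea described above (illustrative only; the class names and predicate form are assumptions, not the monitoring architecture's actual API):

```python
class EventFilter:
    """Forwards only events matching a subscriber's predicate, reducing
    the event traffic that reaches monitoring end-points."""
    def __init__(self):
        self._subs = []              # list of (predicate, handler) pairs

    def subscribe(self, predicate, handler):
        self._subs.append((predicate, handler))

    def publish(self, event):
        for predicate, handler in self._subs:
            if predicate(event):     # filter: drop non-matching events
                handler(event)

# A debugging tool subscribes only to high-severity events from one component.
monitor = EventFilter()
monitor.subscribe(lambda e: e["component"] == "audio" and e["severity"] >= 3,
                  lambda e: print("ALERT:", e))
monitor.publish({"component": "audio", "severity": 4, "msg": "stream stalled"})
monitor.publish({"component": "video", "severity": 2, "msg": "frame drop"})  # filtered out
```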
48 CFR 301.607-76 - FAC-P/PM application process.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false FAC-P/PM application... 301.607-76 FAC-P/PM application process. The P/PM Handbook contains application procedures and forms...; recertification; and certification waiver. Applicants for HHS FAC-P/PM certification actions shall comply with the...
48 CFR 301.607-76 - FAC-P/PM application process.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false FAC-P/PM application... 301.607-76 FAC-P/PM application process. The P/PM Handbook contains application procedures and forms...; recertification; and certification waiver. Applicants for HHS FAC-P/PM certification actions shall comply with the...
48 CFR 301.607-76 - FAC-P/PM application process.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false FAC-P/PM application... 301.607-76 FAC-P/PM application process. The P/PM Handbook contains application procedures and forms...; recertification; and certification waiver. Applicants for HHS FAC-P/PM certification actions shall comply with the...
48 CFR 301.607-76 - FAC-P/PM application process.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false FAC-P/PM application... 301.607-76 FAC-P/PM application process. The P/PM Handbook contains application procedures and forms...; recertification; and certification waiver. Applicants for HHS FAC-P/PM certification actions shall comply with the...
48 CFR 301.607-76 - FAC-P/PM application process.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false FAC-P/PM application... 301.607-76 FAC-P/PM application process. The P/PM Handbook contains application procedures and forms...; recertification; and certification waiver. Applicants for HHS FAC-P/PM certification actions shall comply with the...
Task allocation in a distributed computing system
NASA Technical Reports Server (NTRS)
Seward, Walter D.
1987-01-01
A conceptual framework is examined for task allocation in distributed systems. Application and computing system parameters critical to task allocation decision processes are discussed. Task allocation techniques are addressed which focus on achieving a balance in the load distribution among the system's processors. Equalization of computing load among the processing elements is the goal. Examples of system performance are presented for specific applications. Both static and dynamic allocation of tasks are considered and system performance is evaluated using different task allocation methodologies.
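To make the load-balancing idea concrete, here is a small hypothetical sketch of static task allocation using a greedy largest-task-first heuristic; the task costs and processor count are invented for illustration and are not taken from the paper.

```python
import heapq

def allocate_tasks(task_costs, n_processors):
    """Greedy static allocation: assign each task (largest first) to the
    currently least-loaded processor, approximately equalizing the load."""
    heap = [(0.0, p) for p in range(n_processors)]   # (load, processor id)
    heapq.heapify(heap)
    assignment = {p: [] for p in range(n_processors)}
    for task, cost in sorted(task_costs.items(), key=lambda kv: -kv[1]):
        load, p = heapq.heappop(heap)
        assignment[p].append(task)
        heapq.heappush(heap, (load + cost, p))
    return assignment

# Hypothetical task costs (e.g. estimated execution times in ms)
costs = {"fft": 30, "filter": 20, "log": 5, "ctrl": 25, "io": 10, "nav": 15}
print(allocate_tasks(costs, n_processors=3))
```

Dynamic allocation would re-run a similar decision online as loads change, which is the static/dynamic distinction the abstract evaluates.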
Biomolecular logic systems: applications to biosensors and bioactuators
NASA Astrophysics Data System (ADS)
Katz, Evgeny
2014-05-01
The paper presents an overview of recent advances in biosensors and bioactuators based on the biocomputing concept. Novel biosensors digitally process multiple biochemical signals through Boolean logic networks of coupled biomolecular reactions and produce output in the form of a YES/NO response. Compared to traditional single-analyte sensing devices, the biocomputing approach enables high-fidelity multi-analyte biosensing, which is particularly beneficial for biomedical applications. Multi-signal digital biosensors thus promise advances in rapid diagnosis and treatment of diseases by processing complex patterns of physiological biomarkers. Specifically, they can provide timely detection of and alerts to medical emergencies, along with an immediate therapeutic intervention. Application of the biocomputing concept has been successfully demonstrated for systems performing logic analysis of biomarkers corresponding to different injuries, particularly exemplified for liver injury. Wide-ranging applications of multi-analyte digital biosensors in medicine, environmental monitoring and homeland security are anticipated. "Smart" bioactuators, for example for signal-triggered drug release, were designed by interfacing switchable electrodes and biocomputing systems. Integration of novel biosensing and bioactuating systems with biomolecular information processing systems holds promise for further scientific advances and numerous practical applications.
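A toy sketch of the Boolean logic idea: hypothetical biomarker concentrations are thresholded into logic 0/1 levels and combined through a small AND/OR network to give a YES/NO output. The thresholds and gate wiring below are invented, not taken from the paper.

```python
def digitize(concentration, threshold):
    """Map an analyte concentration to a logic level (1 if above threshold)."""
    return 1 if concentration >= threshold else 0

def injury_alert(marker_a, marker_b, marker_c):
    """Hypothetical gate network: alert if (A AND B) OR C evaluates to logic 1."""
    a = digitize(marker_a, threshold=0.8)   # e.g. enzyme activity, arbitrary units
    b = digitize(marker_b, threshold=1.5)
    c = digitize(marker_c, threshold=2.0)
    return "YES" if (a and b) or c else "NO"

print(injury_alert(0.9, 1.7, 0.1))  # -> YES
print(injury_alert(0.5, 1.7, 0.1))  # -> NO
```

In the biochemical realization, each gate is a coupled enzymatic or binding reaction rather than a line of code, but the resulting input-output logic is the same.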
Image processing techniques and applications to the Earth Resources Technology Satellite program
NASA Technical Reports Server (NTRS)
Polge, R. J.; Bhagavan, B. K.; Callas, L.
1973-01-01
The Earth Resources Technology Satellite system is studied, with emphasis on sensors, data processing requirements, and image data compression using the Fast Fourier and Hadamard transforms. The ERTS-A system and the fundamentals of remote sensing are discussed. Three user applications (forestry, crops, and rangelands) are selected and their spectral signatures are described. It is shown that additional sensors are needed for rangeland management. An on-board information processing system is recommended to reduce the amount of data transmitted.
A methodology for fostering commercialization of electric and hybrid vehicle propulsion systems
NASA Technical Reports Server (NTRS)
Thollot, P. A.; Musial, N. T.
1980-01-01
The rationale behind, and a proposed approach for, application of government assistance to accelerate the process of moving a new electric vehicle propulsion system product from technological readiness to profitable marketplace acceptance and utilization are described. Emphasis is on strategy, applicable incentives, and an implementation process.
Currie, Danielle J; Smith, Carl; Jagals, Paul
2018-03-27
Policy and decision-making processes are routinely challenged by the complex and dynamic nature of environmental health problems. System dynamics modelling has demonstrated considerable value across a number of different fields in helping decision-makers understand and predict the dynamic behaviour of complex systems in support of the development of effective policy actions. In this scoping review we investigate if, and in what contexts, system dynamics modelling is being used to inform policy or decision-making processes related to environmental health. Four electronic databases and the grey literature were systematically searched to identify studies that intersect the areas of environmental health, system dynamics modelling, and decision-making. Studies identified in the initial screening were further screened for their contextual, methodological and application-related relevancy. Studies deemed 'relevant' or 'highly relevant' according to all three criteria were included in this review. Key themes related to the rationale, impact and limitations of using system dynamics in the context of environmental health decision-making and policy were analysed. We identified a limited number of relevant studies (n = 15), two-thirds of which were conducted between 2011 and 2016. The majority of applications occurred in non-health related sectors (n = 9), including transportation, public utilities, water, housing, food, agriculture, and urban and regional planning. Applications were primarily targeted at micro-level (local, community or grassroots) decision-making processes (n = 9), with macro-level (national or international) decision-making addressed to a lesser degree. There was significant heterogeneity in the stated rationales for using system dynamics and in the intended impact of the system dynamics model on decision-making processes. A series of user-related, technical and application-related limitations and challenges were identified. None of the reported limitations or challenges appeared unique to the application of system dynamics within the context of environmental health problems, but rather to the use of system dynamics in general. This review reveals that while system dynamics modelling is increasingly being used to inform decision-making related to environmental health, applications are currently limited. Greater application of system dynamics within this context is needed before its benefits and limitations can be fully understood.
Worklist handling in workflow-enabled radiological application systems
NASA Astrophysics Data System (ADS)
Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim; von Berg, Jens
2000-05-01
For the next generation of integrated information systems for health care applications, more emphasis has to be put on systems which, by design, support the reduction of cost, the increase in efficiency and the improvement of the quality of services. A substantial contribution to this will be the modeling, optimization, automation and enactment of processes in health care institutions. One of the perceived key success factors for the system integration of processes will be the application of workflow management, with workflow management systems as key technology components. In this paper we address workflow management in radiology. We focus on an important aspect of workflow management, the generation and handling of worklists, which provide workflow participants automatically with work items that reflect tasks to be performed. The display of worklists and the functions associated with work items are the visible part for the end-users of an information system using a workflow management approach. Appropriate worklist design and implementation will influence the user friendliness of a system and will largely influence work efficiency. Technically, in current imaging department information system environments (modality-PACS-RIS installations), a data-driven approach has been taken: worklists, if present at all, are generated from filtered views on application databases. In a future workflow-based approach, worklists will be generated by autonomous workflow services based on explicit process models and organizational models. This process-oriented approach will provide us with an integral view of entire health care processes or sub-processes. The paper describes the basic mechanisms of this approach and summarizes its benefits.
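The contrast drawn above, worklists generated by a workflow service from an explicit process model rather than from filtered database views, can be sketched as follows; the process model, task names and roles are hypothetical and only illustrate the mechanism.

```python
# Hypothetical radiology process model: ordered tasks and the role that performs them.
PROCESS_MODEL = [
    {"task": "register patient", "role": "reception"},
    {"task": "acquire images",   "role": "technologist"},
    {"task": "read study",       "role": "radiologist"},
    {"task": "approve report",   "role": "radiologist"},
]

def next_work_item(case):
    """Tiny workflow service: the next work item is the first task in the
    process model that the case has not yet completed."""
    for step in PROCESS_MODEL:
        if step["task"] not in case["completed"]:
            return step
    return None

def worklist_for(role, cases):
    """Build a role-specific worklist from the process state of all cases."""
    items = []
    for case in cases:
        step = next_work_item(case)
        if step and step["role"] == role:
            items.append((case["id"], step["task"]))
    return items

cases = [{"id": "C1", "completed": {"register patient", "acquire images"}},
         {"id": "C2", "completed": {"register patient"}}]
print(worklist_for("radiologist", cases))   # [('C1', 'read study')]
print(worklist_for("technologist", cases))  # [('C2', 'acquire images')]
```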
Knowledge Interaction Design for Creative Knowledge Work
NASA Astrophysics Data System (ADS)
Nakakoji, Kumiyo; Yamamoto, Yasuhiro
This paper describes our approach for the development of application systems for creative knowledge work, particularly for early stages of information design tasks. Being a cognitive tool serving as a means of externalization, an application system affects how the user is engaged in the creative process through its visual interaction design. Knowledge interaction design described in this paper is a framework where a set of application systems for different information design domains are developed based on an interaction model, which is designed for a particular model of a thinking process. We have developed two sets of application systems using the knowledge interaction design framework: one includes systems for linear information design, such as writing, movie-editing, and video-analysis; the other includes systems for network information design, such as file-system navigation and hypertext authoring. Our experience shows that the resulting systems encourage users to follow a certain cognitive path through graceful user experience.
Krause, Paul; de Lusignan, Simon
2010-01-01
The allure of interoperable systems is that they should improve patient safety and make health services more efficient. The UK's National Programme for IT has made great strides in achieving interoperability; through linkage to a national electronic spine. However, there has been criticism of the usability of the applications in the clinical environment. Analysis of the procurement and assurance process to explore whether they predetermine usability. Processes separate developers from users, and test products against theoretical assurance models of use rather than simulate or pilot in a clinical environment. The current process appears to be effective for back office systems and high risk applications, but too inflexible for developing applications for the clinical setting. For clinical applications agile techniques are more appropriate. Usability testing should become an integrated part of the contractual process and be introduced earlier in the development process.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-12
... Systems, Inc.; Notice of Intent To File License Application, Filing of Pre-Application Document, and Approving Use of the Traditional Licensing Process a. Type of Filing: Notice of Intent to File License...: November 11, 2012. d. Submitted by: Aquenergy Systems, Inc., a fully owned subsidiary of Enel Green Power...
NASA Astrophysics Data System (ADS)
Abisset-Chavanne, Emmanuelle; Duval, Jean Louis; Cueto, Elias; Chinesta, Francisco
2018-05-01
Traditionally, Simulation-Based Engineering Sciences (SBES) has relied on the use of static data inputs (model parameters, initial or boundary conditions, … obtained from adequate experiments) to perform simulations. A new paradigm in the field of Applied Sciences and Engineering has emerged in the last decade. Dynamic Data-Driven Application Systems [9, 10, 11, 12, 22] allow the linkage of simulation tools with measurement devices for real-time control of simulations and applications, entailing the ability to dynamically incorporate additional data into an executing application, and in reverse, the ability of an application to dynamically steer the measurement process. It is in that context that traditional "digital twins" are giving rise to a new generation of goal-oriented data-driven application systems, also known as "hybrid twins", embracing models based on physics and models based exclusively on data adequately collected and assimilated to fill the gap between usual model predictions and measurements. Within this framework, new methodologies based on model learners, machine learning and kinetic goal-oriented design are defining a new paradigm in materials, processes and systems engineering.
1981-06-15
SACLANTCEN SR-50: A Resume of Stochastic, Time-Varying, Linear System Theory with...the order in which systems are concatenated is unimportant. These results are exactly analogous to the results of time-invariant linear system theory in...REFERENCES 1. MEIER, L. A resume of deterministic time-varying linear system theory with application to active sonar signal processing problems, SACLANTCEN
Advanced Information Processing System (AIPS)
NASA Technical Reports Server (NTRS)
Pitts, Felix L.
1993-01-01
Advanced Information Processing System (AIPS) is a computer systems philosophy, a set of validated hardware building blocks, and a set of validated services as embodied in system software. The goal of AIPS is to provide the knowledgebase which will allow achievement of validated fault-tolerant distributed computer system architectures, suitable for a broad range of applications, having failure probability requirements of 10E-9 at 10 hours. A background and description is given followed by program accomplishments, the current focus, applications, technology transfer, FY92 accomplishments, and funding.
Markert, Sven; Joeris, Klaus
2017-01-01
We developed an automated microtiter plate (MTP)-based system for suspension cell culture to meet the increased demands for miniaturized high throughput applications in biopharmaceutical process development. The generic system is based on off-the-shelf commercial laboratory automation equipment and is able to utilize MTPs of different configurations (6-24 wells per plate) in orbital shaken mode. The shaking conditions were optimized by Computational Fluid Dynamics simulations. The fully automated system handles plate transport, seeding and feeding of cells, daily sampling, and preparation of analytical assays. The integration of all required analytical instrumentation into the system enables hands-off operation, which prevents bottlenecks in sample processing. The modular set-up makes the system flexible and adaptable for a continuous extension of analytical parameters and add-on components. The system proved suitable as a screening tool for process development by verifying the comparability of results for the MTP-based system and bioreactors regarding profiles of viable cell density, lactate, and product concentration of CHO cell lines. These studies confirmed that 6 well MTPs as well as 24 deepwell MTPs were predictive for a scale up to a 1000 L stirred tank reactor (scale factor 1:200,000). Applying the established cell culture system for automated media blend screening in late stage development, a 22% increase in product yield was achieved in comparison to the reference process. The predicted product increase was subsequently confirmed in 2 L bioreactors. Thus, we demonstrated the feasibility of the automated MTP-based cell culture system for enhanced screening and optimization applications in process development and identified further application areas such as process robustness. The system offers great potential to accelerate time-to-market for new biopharmaceuticals. Biotechnol. Bioeng. 2017;114: 113-121. © 2016 Wiley Periodicals, Inc.
Dreuw, Andreas
2006-11-13
With the advent of modern computers and advances in the development of efficient quantum chemical computer codes, the meaningful computation of large molecular systems at a quantum mechanical level became feasible. Recent experimental effort to understand photoinitiated processes in biological systems, for instance photosynthesis or vision, at a molecular level also triggered theoretical investigations in this field. In this Minireview, standard quantum chemical methods are presented that are applicable and recently used for the calculation of excited states of photoinitiated processes in biological molecular systems. These methods comprise configuration interaction singles, the complete active space self-consistent field method, and time-dependent density functional theory and its variants. Semiempirical approaches are also covered. Their basic theoretical concepts and mathematical equations are briefly outlined, and their properties and limitations are discussed. Recent successful applications of the methods to photoinitiated processes in biological systems are described and theoretical tools for the analysis of excited states are presented.
77 FR 10621 - Changes to the In-Bond Process
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-22
... submit in-bond applications electronically using a CBP-approved electronic data interchange (EDI) system... electronically submit the in-bond application to CBP via a CBP-approved EDI system. \\6\\ Due to the unique... as the CBP-approved EDI system for submitting the in-bond application and other information that is...
Design of special purpose database for credit cooperation bank business processing network system
NASA Astrophysics Data System (ADS)
Yu, Yongling; Zong, Sisheng; Shi, Jinfa
2011-12-01
With the popularization of e-finance in cities, its construction is now extending to the vast rural market and developing rapidly in depth. Developing a business processing network system suited to rural credit cooperative banks makes business processing convenient and has good application prospects. In this paper, we analyse the necessity of adopting a special purpose distributed database in a credit cooperation bank system, give the corresponding distributed database system structure, and design the special purpose database and interface technology. The application in Tongbai Rural Credit Cooperatives has shown that the system has better performance and higher efficiency.
Intelligent systems/software engineering methodology - A process to manage cost and risk
NASA Technical Reports Server (NTRS)
Friedlander, Carl; Lehrer, Nancy
1991-01-01
A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.
The application of artificial intelligence techniques to large distributed networks
NASA Technical Reports Server (NTRS)
Dubyah, R.; Smith, T. R.; Star, J. L.
1985-01-01
Data accessibility and transfer of information, including the land resources information system pilot, are structured as large computer information networks. These pilot efforts aim to reduce the difficulty of finding and using data, reduce processing costs, and minimize incompatibility between data sources. Artificial Intelligence (AI) techniques were suggested to achieve these goals. The applicability of certain AI techniques is explored in the context of distributed problem solving systems and the pilot land data system (PLDS). The topics discussed include: PLDS and its data processing requirements, expert systems and PLDS, distributed problem solving systems, AI problem solving paradigms, query processing, and distributed data bases.
Applications of High-speed motion analysis system on Solid Rocket Motor (SRM)
NASA Astrophysics Data System (ADS)
Liu, Yang; He, Guo-qiang; Li, Jiang; Liu, Pei-jin; Chen, Jian
2007-01-01
The high-speed motion analysis system can record images at up to 12,000 fps and analyze them with the image processing system. The system stores data and images directly in electronic memory, which is convenient for management and analysis. The high-speed motion analysis system and the X-ray radiography system were combined to establish a high-speed real-time X-ray radiography system, which can diagnose and measure dynamic, high-speed processes inside opaque structures. Image processing software was developed to improve the quality of the original images and acquire more precise information. Typical applications of the high-speed motion analysis system on solid rocket motors (SRM) are introduced in the paper. Studies of anomalous combustion of solid propellant grains with defects, real-time measurement of insulator erosion, the explosive incision process of the motor, the structure and wave character of the plume during ignition and flameout, measurement of end burning of solid propellant, measurement of the flame front, and compatibility between airplane and missile during missile launch were carried out using the high-speed motion analysis system. Significant results were achieved through this research. For application of the high-speed motion analysis system to solid rocket motors, key problems that degraded image quality, such as motor vibration, power source instability, geometric distortion, and noise disturbance, were solved. The image processing software improved the capability of measuring image characteristics. The experimental results showed that the system is a powerful facility for studying instantaneous, high-speed processes in solid rocket motors. With the development of the image processing techniques, the capability of the high-speed motion analysis system was enhanced.
Classification of cognitive systems dedicated to data sharing
NASA Astrophysics Data System (ADS)
Ogiela, Lidia; Ogiela, Marek R.
2017-08-01
This paper presents a classification of new cognitive information systems dedicated to cryptographic data splitting and sharing processes. Cognitive processes of semantic data analysis and interpretation will be used to describe new classes of intelligent information and vision systems. In addition, cryptographic data splitting algorithms and cryptographic threshold schemes will be used to improve processes of secure and efficient information management with application of such cognitive systems. The utility of the proposed cognitive sharing procedures and distributed data sharing algorithms will also be presented. A few possible applications of cognitive approaches to visual information management and encryption will also be described.
Quantum state conversion in opto-electro-mechanical systems via shortcut to adiabaticity
NASA Astrophysics Data System (ADS)
Zhou, Xiao; Liu, Bao-Jie; Shao, L.-B.; Zhang, Xin-Ding; Xue, Zheng-Yuan
2017-09-01
Adiabatic processes have found many important applications in modern physics, the distinct merit of which is that accurate control over process timing is not required. However, such processes are slow, which limits their application in quantum computation, due to the limited coherent times of typical quantum systems. Here, we propose a scheme to implement quantum state conversion in opto-electro-mechanical systems via a shortcut to adiabaticity, where the process can be greatly speeded up while precise timing control is still not necessary. In our scheme, by modifying only the coupling strength, we can achieve fast quantum state conversion with high fidelity, where the adiabatic condition does not need to be met. In addition, the population of the unwanted intermediate state can be further suppressed. Therefore, our protocol presents an important step towards practical state conversion between optical and microwave photons, and thus may find many important applications in hybrid quantum information processing.
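For context, one widely used shortcut-to-adiabaticity recipe (transitionless, or counterdiabatic, driving) adds a correction term to the reference Hamiltonian so that the instantaneous eigenstates are followed exactly at finite speed. The expression below is the generic textbook form, shown only for orientation; it is not necessarily the specific coupling-strength modification proposed in the paper.

```latex
H(t) = H_0(t) + H_{\mathrm{CD}}(t), \qquad
H_{\mathrm{CD}}(t) = i\hbar \sum_n \Big( |\partial_t n(t)\rangle\langle n(t)|
      - \langle n(t)|\partial_t n(t)\rangle\, |n(t)\rangle\langle n(t)| \Big),
```

where the $|n(t)\rangle$ are the instantaneous eigenstates of the reference Hamiltonian $H_0(t)$.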
Thermodynamic and economic analysis of heat pumps for energy recovery in industrial processes
NASA Astrophysics Data System (ADS)
Urdaneta-B, A. H.; Schmidt, P. S.
1980-09-01
A computer code has been developed for analyzing the thermodynamic performance, cost and economic return for heat pump applications in industrial heat recovery. Starting with basic defining characteristics of the waste heat stream and the desired heat sink, the algorithm first evaluates the potential for conventional heat recovery with heat exchangers, and if applicable, sizes the exchanger. A heat pump system is then designed to process the residual heating and cooling requirements of the streams. In configuring the heat pump, the program searches a number of parameters, including condenser temperature, evaporator temperature, and condenser and evaporator approaches. All system components are sized for each set of parameters, and economic return is estimated and compared with system economics for conventional processing of the heated and cooled streams (i.e., with process heaters and coolers). Two case studies are evaluated, one in a food processing application and the other in an oil refinery unit.
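The screening logic described above can be illustrated with a much-simplified sketch: apply a heat pump to the residual duty, estimate its COP as a fraction of the Carnot limit, and compare a simple payback against conventional heating. All prices, the capital cost per kW and the Carnot fraction are assumptions for illustration, not values from the paper's code.

```python
def screen_heat_pump(residual_duty_kw, t_sink_c, t_source_c,
                     elec_price=0.10, fuel_price=0.04, hours=8000,
                     capital_per_kw=400.0, carnot_fraction=0.5):
    """Very simplified heat-pump screening: estimate the heating COP from the
    Carnot limit, then compute annual savings versus fuel heating and a simple
    payback on the heat pump capital cost. Energy prices in $/kWh."""
    t_sink, t_source = t_sink_c + 273.15, t_source_c + 273.15
    cop = carnot_fraction * t_sink / (t_sink - t_source)       # heating COP estimate
    annual_heat = residual_duty_kw * hours                     # kWh of heat delivered
    cost_hp = (annual_heat / cop) * elec_price                 # electricity to drive it
    cost_conv = annual_heat * fuel_price                       # conventional process heater
    savings = cost_conv - cost_hp
    payback = (capital_per_kw * residual_duty_kw) / savings if savings > 0 else float("inf")
    return {"COP": round(cop, 2), "annual_savings_$": round(savings),
            "simple_payback_yr": round(payback, 1)}

# Hypothetical stream: 500 kW residual duty, 80 C process sink, 40 C waste-heat source.
print(screen_heat_pump(500, t_sink_c=80, t_source_c=40))
```

The full code described in the abstract additionally sizes a conventional heat exchanger first and searches over condenser/evaporator temperatures and approaches; the sketch fixes those choices to keep the example short.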
Fuels processing for transportation fuel cell systems
NASA Astrophysics Data System (ADS)
Kumar, R.; Ahmed, S.
Fuel cells primarily use hydrogen as the fuel. This hydrogen must be produced from other fuels such as natural gas or methanol. The fuel processor requirements are affected by the fuel to be converted, the type of fuel cell to be supplied, and the fuel cell application. The conventional fuel processing technology has been reexamined to determine how it must be adapted for use in demanding applications such as transportation. The two major fuel conversion processes are steam reforming and partial oxidation reforming. The former is established practice for stationary applications; the latter offers certain advantages for mobile systems and is presently in various stages of development. This paper discusses these fuel processing technologies and the more recent developments for fuel cell systems used in transportation. The need for new materials in fuels processing, particularly in the area of reforming catalysis and hydrogen purification, is discussed.
A Review of Diagnostic Techniques for ISHM Applications
NASA Technical Reports Server (NTRS)
Patterson-Hine, Ann; Biswas, Gautam; Aaseng, Gordon; Narasimhan, Sriam; Pattipati, Krishna
2005-01-01
System diagnosis is an integral part of any Integrated System Health Management application. Diagnostic applications make use of system information from the design phase, such as safety and mission assurance analysis, failure modes and effects analysis, hazards analysis, functional models, fault propagation models, and testability analysis. In modern process control and equipment monitoring systems, topological and analytic models of the nominal system, derived from design documents, are also employed for fault isolation and identification. Depending on the complexity of the monitored signals from the physical system, diagnostic applications may involve straightforward trending and feature extraction techniques to retrieve the parameters of importance from the sensor streams. They may also involve very complex analysis routines, such as signal processing, learning or classification methods, to derive the parameters of importance to diagnosis. The process that is used to diagnose anomalous conditions from monitored system signals varies widely across the different approaches to system diagnosis. Rule-based expert systems, case-based reasoning systems, model-based reasoning systems, learning systems, and probabilistic reasoning systems are examples of the many diverse approaches to diagnostic reasoning. Many engineering disciplines have specific approaches to modeling, monitoring and diagnosing anomalous conditions. Therefore, there is no "one-size-fits-all" approach to building diagnostic and health monitoring capabilities for a system. For instance, the conventional approaches to diagnosing failures in rotorcraft applications are very different from those used in communications systems. Further, online and offline automated diagnostic applications are integrated into an operations framework with flight crews, flight controllers and maintenance teams. While the emphasis of this paper is automation of health management functions, striking the correct balance between automated and human-performed tasks is a vital concern.
System design package for the solar heating and cooling central data processing system
NASA Technical Reports Server (NTRS)
1978-01-01
The central data processing system provides the resources required to assess the performance of solar heating and cooling systems installed at remote sites. These sites consist of residential, commercial, government, and educational types of buildings, and the solar heating and cooling systems can be hot-water, space heating, cooling, and combinations of these. The instrumentation data associated with these systems will vary according to the application and must be collected, processed, and presented in a form which supports continuity of performance evaluation across all applications. Overall software system requirements were established for use in the central integration facility which transforms raw data collected at remote sites into performance evaluation information for assessing the performance of solar heating and cooling systems.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-09
... DEPARTMENT OF COMMERCE Bureau of Industry and Security 15 CFR Part 748 [Docket No. 100826397-1059-02] RIN 0694-AE98 Simplified Network Application Processing System, On-line Registration and Account Maintenance AGENCY: Bureau of Industry and Security, Commerce. ACTION: Final rule. SUMMARY: The Bureau of...
From photons to big-data applications: terminating terabits.
Zilberman, Noa; Moore, Andrew W; Crowcroft, Jon A
2016-03-06
Computer architectures have entered a watershed as the quantity of network data generated by user applications exceeds the data-processing capacity of any individual computer end-system. It will become impossible to scale existing computer systems while a gap grows between the quantity of networked data and the capacity for per system data processing. Despite this, the growth in demand in both task variety and task complexity continues unabated. Networked computer systems provide a fertile environment in which new applications develop. As networked computer systems become akin to infrastructure, any limitation upon the growth in capacity and capabilities becomes an important constraint of concern to all computer users. Considering a networked computer system capable of processing terabits per second, as a benchmark for scalability, we critique the state of the art in commodity computing, and propose a wholesale reconsideration in the design of computer architectures and their attendant ecosystem. Our proposal seeks to reduce costs, save power and increase performance in a multi-scale approach that has potential application from nanoscale to data-centre-scale computers. © 2016 The Authors.
Role of biomolecular logic systems in biosensors and bioactuators
NASA Astrophysics Data System (ADS)
Mailloux, Shay; Katz, Evgeny
2014-09-01
An overview of recent advances in biosensors and bioactuators based on biocomputing systems is presented. Biosensors digitally process multiple biochemical signals through Boolean logic networks of coupled biomolecular reactions and produce an output in the form of a YES/NO response. Compared to traditional single-analyte sensing devices, the biocomputing approach enables high-fidelity multianalyte biosensing, which is particularly beneficial for biomedical applications. Multisignal digital biosensors thus promise advances in rapid diagnosis and treatment of diseases by processing complex patterns of physiological biomarkers. Specifically, they can provide timely detection and alert medical personnel of medical emergencies together with immediate therapeutic intervention. Application of the biocomputing concept has been successfully demonstrated for systems performing logic analysis of biomarkers corresponding to different injuries, particularly as exemplified for liver injury. Wide-ranging applications of multianalyte digital biosensors in medicine, environmental monitoring, and homeland security are anticipated. "Smart" bioactuators, for signal-triggered drug release, for example, were designed by interfacing switchable electrodes with biocomputing systems. Integration of biosensing and bioactuating systems with biomolecular information processing systems advances the potential for further scientific innovations and various practical applications.
Reduced equations of motion for quantum systems driven by diffusive Markov processes.
Sarovar, Mohan; Grace, Matthew D
2012-09-28
The expansion of a stochastic Liouville equation for the coupled evolution of a quantum system and an Ornstein-Uhlenbeck process into a hierarchy of coupled differential equations is a useful technique that simplifies the simulation of stochastically driven quantum systems. We expand the applicability of this technique by completely characterizing the class of diffusive Markov processes for which a useful hierarchy of equations can be derived. The expansion of this technique enables the examination of quantum systems driven by non-Gaussian stochastic processes with bounded range. We present an application of this extended technique by simulating Stark-tuned Förster resonance transfer in Rydberg atoms with nonperturbative position fluctuations.
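For orientation, the brute-force alternative to such a hierarchy is direct Monte Carlo averaging over sampled noise trajectories, which is exactly what the hierarchy is designed to avoid. The sketch below estimates the dephasing of a two-level system whose detuning follows an Ornstein-Uhlenbeck process sampled by Euler-Maruyama; all parameters are illustrative and not taken from the paper.

```python
import numpy as np

def ou_dephasing(gamma=1.0, sigma=0.5, dt=0.01, steps=2000, trajectories=500, seed=0):
    """Monte Carlo estimate of the coherence |<exp(-i * integral of delta(t) dt)>|
    for a two-level system whose detuning delta(t) is an Ornstein-Uhlenbeck process."""
    rng = np.random.default_rng(seed)
    coherence = np.zeros(steps, dtype=complex)
    for _ in range(trajectories):
        delta, phase = 0.0, 0.0
        for k in range(steps):
            phase += delta * dt                        # accumulated random phase
            coherence[k] += np.exp(-1j * phase)
            # Euler-Maruyama step of the OU process
            delta += -gamma * delta * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return np.abs(coherence / trajectories)

decay = ou_dephasing()
print("coherence at t = 5, 10, 20:", decay[500], decay[1000], decay[1999])
```

A hierarchy of coupled equations reproduces this average without sampling individual trajectories, which is where the computational savings come from.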
The automated Army ROTC Questionnaire (ARQ)
NASA Technical Reports Server (NTRS)
Young, David L. H.
1991-01-01
The Reserve Officer Training Corps Cadet Command (ROTCCC) takes applications for its officer training program from college students and Army enlisted personnel worldwide. Each applicant is required to complete a set of application forms prior to acceptance into the ROTC program. These forms are covered by several regulations that govern the eligibility of potential applicants and guide the applicant through the application process. Eligibility criteria change as Army regulations are periodically revised. Outdated information results in a loss of applications attributable to frustration and error. ROTCCC asked for an inexpensive and reliable way of automating their application process. After reviewing the process, it was determined that an expert system with good end-user interface capabilities could be used to solve a large part of the problem. The system captures the knowledge contained within the regulations, enables the quick distribution and implementation of eligibility criteria changes, and distributes the expertise of the admissions personnel to the education centers and colleges. The expert system uses a modified version of CLIPS that was streamlined to make the most efficient use of its capabilities. A user interface with windowing capabilities provides the applicant with a simple and effective way to input his/her personal data.
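The rule-based eligibility idea can be sketched with a few invented rules in Python; the actual ARQ system used a modified CLIPS rule engine, and the criteria below are hypothetical, not ROTC regulations.

```python
# Hypothetical eligibility rules; each pairs a test with the reason reported on failure.
RULES = [
    ("minimum age", lambda a: a["age"] >= 17,  "applicant must be at least 17"),
    ("maximum age", lambda a: a["age"] <= 30,  "applicant exceeds maximum age"),
    ("citizenship", lambda a: a["citizen"],    "applicant must be a citizen"),
    ("minimum GPA", lambda a: a["gpa"] >= 2.5, "GPA below minimum"),
]

def screen(applicant):
    """Apply every rule; return (eligible, list of reasons for any failures)."""
    failures = [reason for _, test, reason in RULES if not test(applicant)]
    return (len(failures) == 0, failures)

print(screen({"age": 19, "citizen": True, "gpa": 3.1}))   # (True, [])
print(screen({"age": 16, "citizen": True, "gpa": 2.0}))   # (False, [reasons...])
```

Because the rules are data rather than hard-coded logic, a revised regulation becomes a small, redistributable change, which is the maintenance benefit the abstract describes.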
NASA Technical Reports Server (NTRS)
Gat, N.; Subramanian, S.; Barhen, J.; Toomarian, N.
1996-01-01
This paper reviews the activities at OKSI related to imaging spectroscopy, presenting current and future applications of the technology. The authors discuss the development of several systems including hardware, signal processing, data classification algorithms and benchmarking techniques to determine algorithm performance. Signal processing for each application is tailored by incorporating the phenomenology appropriate to the process into the algorithms. Pixel signatures are classified using techniques such as principal component analysis, generalized eigenvalue analysis and novel very fast neural network methods. The major hyperspectral imaging systems developed at OKSI include the Intelligent Missile Seeker (IMS) demonstration project for real-time target/decoy discrimination, and the Thermal InfraRed Imaging Spectrometer (TIRIS) for detection and tracking of toxic plumes and gases. In addition, systems for applications in medical photodiagnosis, manufacturing technology, and crop monitoring are also under development.
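As a generic illustration of principal-component-based pixel classification (not OKSI's actual algorithms), the sketch below projects hyperspectral pixel spectra onto their leading principal components and assigns each pixel to the nearest class mean in that reduced space; the data are synthetic.

```python
import numpy as np

def pca_project(spectra, n_components=3):
    """Project pixel spectra (pixels x bands) onto the top principal components."""
    mean = spectra.mean(axis=0)
    centered = spectra - mean
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)            # eigenvalues in ascending order
    basis = eigvecs[:, ::-1][:, :n_components]        # keep the leading components
    return centered @ basis, mean, basis

def classify(scores, class_means):
    """Nearest-class-mean classification in PCA space."""
    d = np.linalg.norm(scores[:, None, :] - class_means[None, :, :], axis=2)
    return d.argmin(axis=1)

# Synthetic example: 200 pixels, 64 spectral bands, two spectrally distinct classes.
rng = np.random.default_rng(0)
bands = np.linspace(0, 1, 64)
class_a = np.exp(-((bands - 0.3) ** 2) / 0.01)
class_b = np.exp(-((bands - 0.7) ** 2) / 0.01)
spectra = np.vstack([class_a + 0.05 * rng.standard_normal((100, 64)),
                     class_b + 0.05 * rng.standard_normal((100, 64))])
scores, _, _ = pca_project(spectra)
means = np.vstack([scores[:100].mean(axis=0), scores[100:].mean(axis=0)])
labels = classify(scores, means)
print("pixels assigned to class 0 / 1:", (labels == 0).sum(), (labels == 1).sum())
```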
Li, Jia; Zhou, Quan; Xu, Zhenming
2014-12-01
Although corona electrostatic separation is successfully used for recycling waste printed circuit boards in industrial applications, there are problems that cannot be resolved completely, such as nonmetal particle aggregation and spark discharge. Both of these problems damage the separation process and are not easy to identify during separation in industrial applications. This paper provides a systematic study of a real-time monitoring system. Weight monitoring systems were established to continuously monitor the separation process. A Virtual Instrumentation program written in LabVIEW was used to sample and analyse the mass increment of the middling product. It includes four modules: historical data storage, steady-state analysis, data computing and alarm. Three kinds of operating conditions were used to verify the applicability of the monitoring system. It was found that the system achieved the goal of monitoring the separation process and realized real-time analysis of the received data. The system also gave comprehensible feedback on material blockages in the feed inlet and high-voltage spark discharge. With the warning function of the alarm system, the whole monitoring system could save labour costs and help the new technology to be applied more easily in industry. © The Author(s) 2014.
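The monitoring logic described, sampling the mass of the middling product, checking for steady state and alarming when the accumulation rate drifts, can be sketched generically as follows. The window length, rate thresholds and fault interpretations are invented for illustration; this is not the LabVIEW program itself.

```python
from collections import deque

class MiddlingMonitor:
    """Watch the middling-product mass signal; raise an alarm when the
    accumulation rate leaves the expected band (e.g. blockage or discharge)."""
    def __init__(self, window=10, min_rate=0.5, max_rate=3.0):
        self.samples = deque(maxlen=window)   # (time_s, mass_g) pairs
        self.min_rate, self.max_rate = min_rate, max_rate

    def add_sample(self, t, mass):
        self.samples.append((t, mass))
        if len(self.samples) < self.samples.maxlen:
            return "collecting"
        (t0, m0), (t1, m1) = self.samples[0], self.samples[-1]
        rate = (m1 - m0) / (t1 - t0)          # g/s averaged over the window
        if rate < self.min_rate:
            return "ALARM: feed blockage suspected (rate %.2f g/s)" % rate
        if rate > self.max_rate:
            return "ALARM: abnormal accumulation, check for spark discharge (rate %.2f g/s)" % rate
        return "steady (rate %.2f g/s)" % rate

monitor = MiddlingMonitor()
for k in range(22):
    mass = 1.5 * k if k < 12 else 1.5 * 11    # simulated flow stops at k = 12
    print(monitor.add_sample(t=float(k), mass=mass))
```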
A study of compositional verification based IMA integration method
NASA Astrophysics Data System (ADS)
Huang, Hui; Zhang, Guoquan; Xu, Wanmeng
2018-03-01
The rapid development of avionics systems is driving the application of integrated modular avionics (IMA) systems. While IMA improves avionics system integration, it also increases the complexity of system test, so the IMA system test method needs to be simplified. An IMA system provides a modular platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, failures in an IMA system are more difficult to isolate. A critical problem in IMA system verification is therefore how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily test the whole system, but for a huge, integrated avionics system, complete testing is hard to achieve. This paper therefore applies compositional verification theory to IMA system test, reducing the test process and improving efficiency, and consequently lowering the cost of IMA system integration.
Aerospace Applications Conference, Steamboat Springs, CO, Feb. 1-8, 1986, Digest
NASA Astrophysics Data System (ADS)
The present conference considers topics concerning the projected NASA Space Station's systems, digital signal and data processing applications, and space science and microwave applications. Attention is given to Space Station video and audio subsystems design, clock error, jitter, phase error and differential time-of-arrival in satellite communications, automation and robotics in space applications, target insertion into synthetic background scenes, and a novel scheme for the computation of the discrete Fourier transform on a systolic processor. Also discussed are a novel signal parameter measurement system employing digital signal processing, EEPROMS for spacecraft applications, a unique concurrent processor architecture for high speed simulation of dynamic systems, a dual polarization flat plate antenna, Fresnel diffraction, and ultralinear TWTs for high efficiency satellite communications.
42 CFR 433.110 - Basis, purpose, and applicability.
Code of Federal Regulations, 2010 CFR
2010-10-01
... and Information Retrieval Systems § 433.110 Basis, purpose, and applicability. (a) This subpart... information retrieval systems and for the operation of certain systems. Additional HHS regulations and CMS... mechanized claims processing and information retrieval system or if the system fails to meet certain...
42 CFR 433.110 - Basis, purpose, and applicability.
Code of Federal Regulations, 2011 CFR
2011-10-01
... and Information Retrieval Systems § 433.110 Basis, purpose, and applicability. (a) This subpart... information retrieval systems and for the operation of certain systems. Additional HHS regulations and CMS... conditions on mechanized claims processing and information retrieval systems (including eligibility...
42 CFR 433.110 - Basis, purpose, and applicability.
Code of Federal Regulations, 2012 CFR
2012-10-01
... and Information Retrieval Systems § 433.110 Basis, purpose, and applicability. (a) This subpart... information retrieval systems and for the operation of certain systems. Additional HHS regulations and CMS... conditions on mechanized claims processing and information retrieval systems (including eligibility...
42 CFR 433.110 - Basis, purpose, and applicability.
Code of Federal Regulations, 2014 CFR
2014-10-01
... and Information Retrieval Systems § 433.110 Basis, purpose, and applicability. (a) This subpart... information retrieval systems and for the operation of certain systems. Additional HHS regulations and CMS... conditions on mechanized claims processing and information retrieval systems (including eligibility...
42 CFR 433.110 - Basis, purpose, and applicability.
Code of Federal Regulations, 2013 CFR
2013-10-01
... and Information Retrieval Systems § 433.110 Basis, purpose, and applicability. (a) This subpart... information retrieval systems and for the operation of certain systems. Additional HHS regulations and CMS... conditions on mechanized claims processing and information retrieval systems (including eligibility...
75 FR 68806 - Statement of Organization, Functions and Delegations of Authority
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-09
... Agency business applications architectures, the engineering of business processes, the building and... architecture, engineers technology for business processes, builds, deploys, maintains and manages enterprise systems and data collections efforts; (5) applies business applications architecture to process specific...
NASA Technical Reports Server (NTRS)
1979-01-01
The functions performed by the systems management (SM) application software are described along with the design employed to accomplish these functions. The operational sequences (OPS) control segments and the cyclic processes they control are defined. The SM specialist function control (SPEC) segments and the display controlled 'on-demand' processes that are invoked by either an OPS or SPEC control segment as a direct result of an item entry to a display are included. Each processing element in the SM application is described including an input/output table and a structured control flow diagram. The flow through the module and other information pertinent to that process and its interfaces to other processes are included.
Harbison, K; Kelly, J; Burnell, L; Silva, J
1995-01-01
The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.
NASA Technical Reports Server (NTRS)
Nagle, Gail; Masotto, Thomas; Alger, Linda
1990-01-01
The need to meet the stringent performance and reliability requirements of advanced avionics systems has frequently led to implementations which are tailored to a specific application and are therefore difficult to modify or extend. Furthermore, many integrated flight critical systems are input/output intensive. By using a design methodology which customizes the input/output mechanism for each new application, the cost of implementing new systems becomes prohibitively expensive. One solution to this dilemma is to design computer systems and input/output subsystems which are general purpose, but which can be easily configured to support the needs of a specific application. The Advanced Information Processing System (AIPS), currently under development, has these characteristics. The design and implementation of the prototype I/O communication system for AIPS is described. AIPS addresses reliability issues related to data communications by the use of reconfigurable I/O networks. When a fault or damage event occurs, communication is restored to functioning parts of the network and the failed or damaged components are isolated. Performance issues are addressed by using a parallelized computer architecture which decouples Input/Output (I/O) redundancy management and I/O processing from the computational stream of an application. The autonomous nature of the system derives from the highly automated and independent manner in which I/O transactions are conducted for the application, as well as from the fact that the hardware redundancy management is entirely transparent to the application.
Software Engineering and Its Application to Avionics
1988-01-01
"Automated Software Development Methodology (ASDM): An Architecture of a Knowledge-Based Expert System," Masters Thesis, Florida Atlantic University, Boca...operating system provides the control services and application services within the multiprocessor system. The processes that make up the application software...as a high-value target may no longer be occupied by the time the film is processed and analyzed. With the high mobility of today's enemy forces
Optimization of MLS receivers for multipath environments
NASA Technical Reports Server (NTRS)
Mcalpine, G. A.; Highfill, J. H., III
1976-01-01
The design of a microwave landing system (MLS) aircraft receiver, capable of optimal performance in the multipath environments found in air terminal areas, is reported. Special attention was given to the angle tracking problem of the receiver, including tracking system design considerations, the study and application of locally optimum estimation involving multipath-adaptive reception and envelope processing, and microcomputer system design. Results show that envelope processing is competitive, performance-wise, with i-f signal processing in this application, while being simpler and cheaper. A summary of the signal model is given.
NASA Technical Reports Server (NTRS)
1973-01-01
The applications are reported of new remote sensing techniques for earth resources surveys and environmental monitoring. Applications discussed include: vegetation systems, environmental monitoring, and plant protection. Data processing systems are described.
DOT National Transportation Integrated Search
2009-08-25
In cooperation with the California Department of Transportation, Montana State University's Western Transportation Institute has developed the WeatherShare Phase II system by applying Systems Engineering and Software Engineering processes. The system...
A single FPGA-based portable ultrasound imaging system for point-of-care applications.
Kim, Gi-Duck; Yoon, Changhan; Kye, Sang-Bum; Lee, Youngbae; Kang, Jeeun; Yoo, Yangmo; Song, Tai-kyong
2012-07-01
We present a cost-effective portable ultrasound system based on a single field-programmable gate array (FPGA) for point-of-care applications. In the portable ultrasound system developed, all the ultrasound signal and image processing modules, including an effective 32-channel receive beamformer with pseudo-dynamic focusing, are embedded in an FPGA chip. For overall system control, a mobile processor running Linux at 667 MHz is used. The scan-converted ultrasound image data from the FPGA are directly transferred to the system controller via external direct memory access without a video processing unit. The portable ultrasound system developed can provide real-time B-mode imaging at a maximum frame rate of 30 frames/s, and it has a battery life of approximately 1.5 h. These results indicate that the single FPGA-based portable ultrasound system developed is able to meet the processing requirements of medical ultrasound imaging while providing improved flexibility for adapting to emerging point-of-care applications.
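For readers unfamiliar with receive beamforming, the sketch below shows the basic delay-and-sum operation that a receive beamformer of this kind implements, here in floating-point Python rather than FPGA fixed-point logic. The array geometry, sampling rate, and synthetic data are illustrative assumptions and do not reproduce the system described in the abstract.

```python
import numpy as np

def delay_and_sum(rf, element_x, focus_x, focus_z, fs, c=1540.0):
    """Sum per-channel RF data after applying geometric receive delays.

    rf         : (n_channels, n_samples) array of received RF samples
    element_x  : (n_channels,) lateral element positions [m]
    focus_x/z  : focal point coordinates [m]
    fs         : sampling frequency [Hz]
    c          : speed of sound [m/s]
    """
    n_channels, n_samples = rf.shape
    # Receive path length from the focal point back to each element.
    dist = np.hypot(element_x - focus_x, focus_z)
    delays = (dist - dist.min()) / c            # relative delays [s]
    shifts = np.round(delays * fs).astype(int)  # delays in samples
    out = np.zeros(n_samples)
    for ch in range(n_channels):
        s = shifts[ch]
        out[: n_samples - s] += rf[ch, s:]      # align, then sum
    return out

# Illustrative 32-channel example with synthetic data.
fs, pitch = 40e6, 0.3e-3
elem_x = (np.arange(32) - 15.5) * pitch
rf = np.random.randn(32, 2048)
line = delay_and_sum(rf, elem_x, focus_x=0.0, focus_z=0.03, fs=fs)
```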
[Sustainable process improvement with application of 'lean philosophy'].
Rouppe van der Voort, Marc B V; van Merode, G G Frits; Veraart, Henricus G N
2013-01-01
Process improvement is increasingly being implemented, particularly with the aid of 'lean philosophy'. This management philosophy aims to improve quality by reducing 'wastage'. Local improvements can produce negative effects elsewhere due to the interdependence of processes; an 'integrated system approach' is required to prevent this. Some hospitals claim that this has been successful. Research into process improvement with the application of lean philosophy has reported many positive effects, defined as improved safety, quality and efficiency. Due to methodological shortcomings and a lack of rigorous evaluations, however, it is not yet possible to determine the impact of this approach. It is clear, though, that the investigated applications are fragmentary, with a dominant focus on the instrumental aspect of the philosophy, a lack of integration into a total system, and insufficient attention to human aspects. Process improvement is required to achieve better and more goal-oriented healthcare. To achieve this, hospitals must develop integrated system approaches that combine methods for process design with continuous improvement of processes and with personnel management. It is crucial that doctors take the initiative to guide and improve processes in an integrated manner.
The UNIX/XENIX Advantage: Applications in Libraries.
ERIC Educational Resources Information Center
Gordon, Kelly L.
1988-01-01
Discusses the application of the UNIX/XENIX operating system to support administrative office automation functions--word processing, spreadsheets, database management systems, electronic mail, and communications--at the Central Michigan University Libraries. Advantages and disadvantages of the XENIX operating system and system configuration are…
Spaceborne VHSIC multiprocessor system for AI applications
NASA Technical Reports Server (NTRS)
Lum, Henry, Jr.; Shrobe, Howard E.; Aspinall, John G.
1988-01-01
A multiprocessor system, under design for space-station applications, makes use of the latest generation symbolic processor and packaging technology. The result will be a compact, space-qualified system two to three orders of magnitude more powerful than present-day symbolic processing systems.
48 CFR 1615.170 - Applicability.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Applicability. 1615.170 Section 1615.170 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES... Source Selection Processes and Techniques. 1615.170 Applicability. FAR subpart 15.1 has no practical...
48 CFR 2115.170 - Applicability.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Applicability. 2115.170 Section 2115.170 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT, FEDERAL EMPLOYEES... BY NEGOTIATION Source Selection Processes and Techniques 2115.170 Applicability. FAR subpart 15.1 has...
48 CFR 1615.170 - Applicability.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Applicability. 1615.170 Section 1615.170 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES... Source Selection Processes and Techniques. 1615.170 Applicability. FAR subpart 15.1 has no practical...
48 CFR 1615.170 - Applicability.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 6 2014-10-01 2014-10-01 false Applicability. 1615.170 Section 1615.170 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES... Source Selection Processes and Techniques 1615.170 Applicability. FAR subpart 15.1 has no practical...
48 CFR 1615.170 - Applicability.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 6 2012-10-01 2012-10-01 false Applicability. 1615.170 Section 1615.170 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT FEDERAL EMPLOYEES... Source Selection Processes and Techniques 1615.170 Applicability. FAR subpart 15.1 has no practical...
48 CFR 2115.170 - Applicability.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 6 2012-10-01 2012-10-01 false Applicability. 2115.170 Section 2115.170 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT, FEDERAL EMPLOYEES... BY NEGOTIATION Source Selection Processes and Techniques 2115.170 Applicability. FAR subpart 15.1 has...
48 CFR 2115.170 - Applicability.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 6 2014-10-01 2014-10-01 false Applicability. 2115.170 Section 2115.170 Federal Acquisition Regulations System OFFICE OF PERSONNEL MANAGEMENT, FEDERAL EMPLOYEES... BY NEGOTIATION Source Selection Processes and Techniques 2115.170 Applicability. FAR subpart 15.1 has...
NASA Technical Reports Server (NTRS)
Watson, Michael D.; Kelley, Gary W.
2012-01-01
The Department of Defense (DoD) defined System Operational Effectiveness (SOE) model provides an exceptional framework for an affordable approach to the development and operation of space launch vehicles and their supporting infrastructure. The SOE model provides a focal point from which to direct and measure technical effectiveness and process efficiencies of space launch vehicles. The application of the SOE model to a space launch vehicle's development and operation effort leads to very specific approaches and measures that require consideration during the design phase. This paper provides a mapping of the SOE model to the development of space launch vehicles for human exploration by addressing the SOE model key points of measurement including System Performance, System Availability, Technical Effectiveness, Process Efficiency, System Effectiveness, Life Cycle Cost, and Affordable Operational Effectiveness. In addition, the application of the SOE model to the launch vehicle development process is defined providing the unique aspects of space launch vehicle production and operations in lieu of the traditional broader SOE context that examines large quantities of fielded systems. The tailoring and application of the SOE model to space launch vehicles provides some key insights into the operational design drivers, capability phasing, and operational support systems.
NASA Technical Reports Server (NTRS)
Consoli, Robert David; Sobieszczanski-Sobieski, Jaroslaw
1990-01-01
Advanced multidisciplinary analysis and optimization methods, namely system sensitivity analysis and non-hierarchical system decomposition, are applied to reduce the cost and improve the visibility of an automated vehicle design synthesis process. This process is inherently complex due to the large number of functional disciplines and associated interdisciplinary couplings. Recent developments in system sensitivity analysis as applied to complex non-hierarchic multidisciplinary design optimization problems enable the decomposition of these complex interactions into sub-processes that can be evaluated in parallel. The application of these techniques results in significant cost, accuracy, and visibility benefits for the entire design synthesis process.
The Network Configuration of an Object Relational Database Management System
NASA Technical Reports Server (NTRS)
Diaz, Philip; Harris, W. C.
2000-01-01
The networking and implementation of the Oracle Database Management System (ODBMS) requires developers to have knowledge of the UNIX operating system as well as all the features of the Oracle Server. The server is an object relational database management system (DBMS). By using distributed processing, processes are split up between the database server and client application programs. The DBMS handles all the responsibilities of the server. The workstations running the database application concentrate on the interpretation and display of data.
Fenna, D
1977-09-01
For nearly two decades, the development of computerized information systems has struggled for acceptable compromises between the unattainable "total system" and the unacceptable separate applications. Integration of related applications is essential if the computer is to be exploited fully, yet relative simplicity is necessary for systems to be implemented in a reasonable time-scale. This paper discusses a system being progressively developed from minimal beginnings but which, from the outset, had a highly flexible and fully integrated system basis. The system is for batch processing, but can accommodate on-line data input; it is similar in its approach to many transaction-processing real-time systems.
Applicability and Limitations of Reliability Allocation Methods
NASA Technical Reports Server (NTRS)
Cruz, Jose A.
2016-01-01
The reliability allocation process may be described as the process of assigning reliability requirements to individual components within a system to attain the specified system reliability. For large systems, the allocation is often performed at different stages of system design, typically beginning at the conceptual stage. As the system design develops and more information about components and the operating environment becomes available, different allocation methods can be considered. Reliability allocation methods are usually divided into two categories: weighting factors and optimal reliability allocation. When properly applied, these methods can produce reasonable approximations. Reliability allocation techniques have limitations and implied assumptions that need to be understood by system engineers; applying them without understanding those limitations and assumptions can produce unrealistic results. This report addresses weighting factors and optimal reliability allocation techniques, and identifies the applicability and limitations of each reliability allocation technique.
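As a concrete illustration of the weighting-factor category mentioned above, the sketch below apportions a system reliability target across series components, first equally and then with normalized weights (for example, weights derived from relative predicted failure rates). This is a generic textbook formulation offered for illustration, not a method prescribed by the report.

```python
def equal_apportionment(r_system, n):
    """Each of n series components gets the same reliability target."""
    return [r_system ** (1.0 / n)] * n

def weighted_apportionment(r_system, weights):
    """Series components receive targets R_i = R_sys ** (w_i / sum(w));
    a larger weight means a larger share of the allowed unreliability.
    The product of the allocated reliabilities recovers R_sys."""
    total = sum(weights)
    return [r_system ** (w / total) for w in weights]

# Example: allocate a 0.99 system reliability target to three components.
print(equal_apportionment(0.99, 3))              # ~[0.99666, 0.99666, 0.99666]
print(weighted_apportionment(0.99, [1, 2, 3]))   # harder targets for low weights
```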
NASA Astrophysics Data System (ADS)
Gill, Douglas M.; Rasras, Mahmoud; Tu, Kun-Yii; Chen, Young-Kai; White, Alice E.; Patel, Sanjay S.; Carothers, Daniel; Pomerene, Andrew; Kamocsai, Robert; Beattie, James; Kopa, Anthony; Apsel, Alyssa; Beals, Mark; Mitchel, Jurgen; Liu, Jifeng; Kimerling, Lionel C.
2008-02-01
Integrating electronic and photonic functions onto a single silicon-based chip using techniques compatible with mass-production CMOS electronics will enable new design paradigms for existing system architectures and open new opportunities for electro-optic applications with the potential to dramatically change the management, cost, footprint, weight, and power consumption of today's communication systems. While broadband analog system applications represent a smaller volume market than that for digital data transmission, there are significant deployments of analog electro-optic systems for commercial and military applications. Broadband linear modulation is a critical building block in optical analog signal processing and also could have significant applications in digital communication systems. Recently, broadband electro-optic modulators on a silicon platform have been demonstrated based on the plasma dispersion effect. The use of the plasma dispersion effect within a CMOS compatible waveguide creates new challenges and opportunities for analog signal processing since the index and propagation loss change within the waveguide during modulation. We will review the current status of silicon-based electrooptic modulators and also linearization techniques for optical modulation.
MIRIADS: miniature infrared imaging applications development system description and operation
NASA Astrophysics Data System (ADS)
Baxter, Christopher R.; Massie, Mark A.; McCarley, Paul L.; Couture, Michael E.
2001-10-01
A cooperative effort between the U.S. Air Force Research Laboratory, Nova Research, Inc., the Raytheon Infrared Operations (RIO) and Optics 1, Inc. has successfully produced a miniature infrared camera system that offers significant real-time signal and image processing capabilities by virtue of its modular design. This paper will present an operational overview of the system as well as results from initial testing of the 'Modular Infrared Imaging Applications Development System' (MIRIADS) configured as a missile early-warning detection system. The MIRIADS device can operate virtually any infrared focal plane array (FPA) that currently exists. Programmable on-board logic applies user-defined processing functions to the real-time digital image data for a variety of functions. Daughterboards may be plugged onto the system to expand the digital and analog processing capabilities of the system. A unique full hemispherical infrared fisheye optical system designed and produced by Optics 1, Inc. is utilized by the MIRIADS in a missile warning application to demonstrate the flexibility of the overall system to be applied to a variety of current and future AFRL missions.
Applications integration in a hybrid cloud computing environment: modelling and platform
NASA Astrophysics Data System (ADS)
Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang
2013-08-01
With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services, as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds and their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and intra-enterprise ISs. A run-time platform is developed, and a cross-computing-environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.
Object positioning in storages of robotized workcells using LabVIEW Vision
NASA Astrophysics Data System (ADS)
Hryniewicz, P.; Banaś, W.; Sękala, A.; Gwiazda, A.; Foit, K.; Kost, G.
2015-11-01
During the manufacturing process, each performed task is previously developed and adapted to the conditions and the possibilities of the manufacturing plant. The production process is supervised by a team of specialists because any downtime causes a great loss of time and hence financial loss. Sensors used in industry for tracking and supervising the various stages of a production process make it much easier to keep it continuous. One group of sensors used in industrial applications is non-contact sensors. This group includes light barriers, optical sensors, rangefinders, vision systems, and ultrasonic sensors. Thanks to the rapid development of electronics, vision systems have become the most flexible and widespread type of non-contact sensor. These systems consist of cameras, devices for data acquisition, devices for data analysis, and specialized software. Vision systems work well as sensors that control the production process itself as well as sensors that control product quality. The LabVIEW environment, together with LabVIEW Vision and LabVIEW Builder, makes it possible to program systems for process and product quality control. The paper presents an application elaborated for positioning elements in a robotized workcell. Based on the geometric parameters of a manipulated object, or on a previously developed graphical pattern, it is possible to determine the position of particular manipulated elements. This application can work in an automatic mode and in real time, cooperating with the robot control system, which makes the workcell functioning more autonomous.
Tanaka, Ryoma; Takahashi, Naoyuki; Nakamura, Yasuaki; Hattori, Yusuke; Ashizawa, Kazuhide; Otsuka, Makoto
2017-01-01
Resonant acoustic® mixing (RAM) technology is a system that performs high-speed mixing by vibration through the control of acceleration and frequency. In recent years, real-time process monitoring and prediction have become of increasing interest, and process analytical technology (PAT) systems will be increasingly introduced into actual manufacturing processes. This study examined the application of PAT with the combination of RAM, near-infrared spectroscopy, and chemometric technology as a set of PAT tools for introduction into actual pharmaceutical powder blending processes. Content uniformity was assessed using a robust partial least squares regression (PLSR) model constructed to manage the RAM configuration parameters and the changing concentrations of the components. As a result, real-time monitoring was shown to be possible, and in-line real-time prediction of active pharmaceutical ingredients and other additives using chemometric technology was successfully demonstrated. This system is expected to be applicable to the RAM method for the risk management of quality.
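A minimal sketch of the chemometric step (building a PLS regression model from NIR spectra and predicting API content in-line) is shown below using scikit-learn's PLSRegression. The spectra, concentrations, and number of latent variables are synthetic placeholders; the study's actual preprocessing and model parameters are not reproduced.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Synthetic stand-ins: 50 calibration spectra (500 wavelengths each)
# and their reference API concentrations (% w/w).
rng = np.random.default_rng(0)
spectra = rng.normal(size=(50, 500))
api_content = rng.uniform(5, 15, size=50)

# Fit a PLS regression model (the number of latent variables is illustrative).
pls = PLSRegression(n_components=5)
pls.fit(spectra, api_content)

# In-line use: predict API content for a newly acquired spectrum.
new_spectrum = rng.normal(size=(1, 500))
predicted = pls.predict(new_spectrum)
print(float(predicted.ravel()[0]))
```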
NASA Technical Reports Server (NTRS)
1987-01-01
Potential applications of robots for cost effective commercial microelectronic processes in space were studied and the associated robotic requirements were defined. Potential space application areas include advanced materials processing, bulk crystal growth, and epitaxial thin film growth and related processes. All possible automation of these processes was considered, along with energy and environmental requirements. Aspects of robot capabilities considered include system intelligence, ROM requirements, kinematic and dynamic specifications, sensor design and configuration, flexibility and maintainability. Support elements discussed included facilities, logistics, ground support, launch and recovery, and management systems.
Jha, Abhinav K; Barrett, Harrison H; Frey, Eric C; Clarkson, Eric; Caucci, Luca; Kupinski, Matthew A
2015-09-21
Recent advances in technology are enabling a new class of nuclear imaging systems consisting of detectors that use real-time maximum-likelihood (ML) methods to estimate the interaction position, deposited energy, and other attributes of each photon-interaction event and store these attributes in a list format. This class of systems, which we refer to as photon-processing (PP) nuclear imaging systems, can be described by a fundamentally different mathematical imaging operator that allows processing of the continuous-valued photon attributes on a per-photon basis. Unlike conventional photon-counting (PC) systems that bin the data into images, PP systems do not have any binning-related information loss. Mathematically, while PC systems have an infinite-dimensional null space due to dimensionality considerations, PP systems do not necessarily suffer from this issue. Therefore, PP systems have the potential to provide improved performance in comparison to PC systems. To study these advantages, we propose a framework to perform the singular-value decomposition (SVD) of the PP imaging operator. We use this framework to perform the SVD of operators that describe a general two-dimensional (2D) planar linear shift-invariant (LSIV) PP system and a hypothetical continuously rotating 2D single-photon emission computed tomography (SPECT) PP system. We then discuss two applications of the SVD framework. The first application is to decompose the object being imaged by the PP imaging system into measurement and null components. We compare these components to the measurement and null components obtained with PC systems. In the process, we also present a procedure to compute the null functions for a PC system. The second application is designing analytical reconstruction algorithms for PP systems. The proposed analytical approach exploits the fact that PP systems acquire data in a continuous domain to estimate a continuous object function. The approach is parallelizable and implemented for graphics processing units (GPUs). Further, this approach leverages another important advantage of PP systems, namely the possibility to perform photon-by-photon real-time reconstruction. We demonstrate the application of the approach to perform reconstruction in a simulated 2D SPECT system. The results help to validate and demonstrate the utility of the proposed method and show that PP systems can help overcome the aliasing artifacts that are otherwise intrinsically present in PC systems.
NASA Astrophysics Data System (ADS)
Jha, Abhinav K.; Barrett, Harrison H.; Frey, Eric C.; Clarkson, Eric; Caucci, Luca; Kupinski, Matthew A.
2015-09-01
Recent advances in technology are enabling a new class of nuclear imaging systems consisting of detectors that use real-time maximum-likelihood (ML) methods to estimate the interaction position, deposited energy, and other attributes of each photon-interaction event and store these attributes in a list format. This class of systems, which we refer to as photon-processing (PP) nuclear imaging systems, can be described by a fundamentally different mathematical imaging operator that allows processing of the continuous-valued photon attributes on a per-photon basis. Unlike conventional photon-counting (PC) systems that bin the data into images, PP systems do not have any binning-related information loss. Mathematically, while PC systems have an infinite-dimensional null space due to dimensionality considerations, PP systems do not necessarily suffer from this issue. Therefore, PP systems have the potential to provide improved performance in comparison to PC systems. To study these advantages, we propose a framework to perform the singular-value decomposition (SVD) of the PP imaging operator. We use this framework to perform the SVD of operators that describe a general two-dimensional (2D) planar linear shift-invariant (LSIV) PP system and a hypothetical continuously rotating 2D single-photon emission computed tomography (SPECT) PP system. We then discuss two applications of the SVD framework. The first application is to decompose the object being imaged by the PP imaging system into measurement and null components. We compare these components to the measurement and null components obtained with PC systems. In the process, we also present a procedure to compute the null functions for a PC system. The second application is designing analytical reconstruction algorithms for PP systems. The proposed analytical approach exploits the fact that PP systems acquire data in a continuous domain to estimate a continuous object function. The approach is parallelizable and implemented for graphics processing units (GPUs). Further, this approach leverages another important advantage of PP systems, namely the possibility to perform photon-by-photon real-time reconstruction. We demonstrate the application of the approach to perform reconstruction in a simulated 2D SPECT system. The results help to validate and demonstrate the utility of the proposed method and show that PP systems can help overcome the aliasing artifacts that are otherwise intrinsically present in PC systems.
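The decomposition of an object into measurement and null components described above can be illustrated, for a small discrete (PC-like) system matrix, with an ordinary NumPy SVD: the measurement component is the projection of the object onto the row space of the imaging operator, and the null component is the remainder, which the system cannot see. This toy sketch only conveys the idea; the paper's operators act on continuous functions and require the framework developed there.

```python
import numpy as np

def measurement_and_null(H, f, tol=1e-10):
    """Split object vector f into measurement and null components of H.

    The measurement component lies in the row space of H (spanned by the
    right singular vectors with non-negligible singular values); the null
    component is invisible to the system: H @ f_null is ~0.
    """
    U, s, Vt = np.linalg.svd(H, full_matrices=True)
    rank = int(np.sum(s > tol * s.max()))
    V_meas = Vt[:rank].T                  # basis of the measurement space
    f_meas = V_meas @ (V_meas.T @ f)      # projection onto the row space
    f_null = f - f_meas
    return f_meas, f_null

# Toy example: a 3x5 "system matrix" has at least a 2-dimensional null space.
rng = np.random.default_rng(1)
H = rng.normal(size=(3, 5))
f = rng.normal(size=5)
f_meas, f_null = measurement_and_null(H, f)
print(np.allclose(H @ f_null, 0.0, atol=1e-9))   # True: unseen by the system
```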
1985-11-01
User Interface that consists of a set of callable execution-time routines available to an application program for form processing. IISS Function Screen... provisions for test consist of the normal testing techniques that are accomplished during the construction process. They consist of design and code... application presents a form to the user which must be filled in with information for processing by that application. The application then
Marshall Space Flight Center's Virtual Reality Applications Program 1993
NASA Technical Reports Server (NTRS)
Hale, Joseph P., II
1993-01-01
A Virtual Reality (VR) applications program has been under development at the Marshall Space Flight Center (MSFC) since 1989. Other NASA Centers, most notably Ames Research Center (ARC), have contributed to the development of the VR enabling technologies and VR systems. This VR technology development has now reached a level of maturity where specific applications of VR as a tool can be considered. The objectives of the MSFC VR Applications Program are to develop, validate, and utilize VR as a Human Factors design and operations analysis tool and to assess and evaluate VR as a tool in other applications (e.g., training, operations development, mission support, teleoperations planning, etc.). The long-term goals of this technology program are to enable specialized Human Factors analyses earlier in the hardware and operations development process and to develop more effective training and mission support systems. The capability to perform specialized Human Factors analyses earlier in the hardware and operations development process is required to better refine and validate requirements during the requirements definition phase. This leads to a more efficient design process in which perturbations caused by late-occurring requirements changes are minimized. A validated set of VR analytical tools must be developed to enable a more efficient process for the design and development of space systems and operations. Similarly, training and mission support systems must exploit state-of-the-art computer-based technologies to maximize training effectiveness and enhance mission support. The approach of the VR Applications Program is to develop and validate appropriate virtual environments and associated object kinematic and behavior attributes for specific classes of applications. These application-specific environments and associated simulations will be validated, where possible, through empirical comparisons with existing, accepted tools and methodologies. These validated VR analytical tools will then be available for use in the design and development of space systems and operations and in training and mission support systems.
SDI-based business processes: A territorial analysis web information system in Spain
NASA Astrophysics Data System (ADS)
Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.
2012-09-01
Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to Spanish citizens' knowledge about their territory.
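To give a flavour of how a client invokes one of the chained geospatial operations, the sketch below issues a WPS Execute request using plain HTTP key-value-pair encoding. The endpoint URL, process identifier, and input are hypothetical placeholders rather than the identifiers of the IDEE services; a real client would first query GetCapabilities and DescribeProcess to discover them.

```python
import requests

# Hypothetical WPS endpoint and process; real identifiers would come from
# the service's GetCapabilities / DescribeProcess responses.
WPS_URL = "https://example.org/wps"

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "ElevationProfile",                        # hypothetical process
    "DataInputs": "line=LINESTRING(-1.0 41.6,-0.9 41.7)",    # hypothetical input
}

response = requests.get(WPS_URL, params=params, timeout=30)
print(response.status_code)
print(response.text[:500])   # WPS responses are XML documents
```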
41 CFR Appendix B to Part 60 - 741-Developing Reasonable Accommodation Procedures
Code of Federal Regulations, 2014 CFR
2014-07-01
... may initiate an interactive process with the accommodation requester. 3. Form of requests for... using the contractor's online or other electronic application system, are made aware of the contractor's... participate fully in the application process. All applicants should also be provided with contact information...
7 CFR 1776.7 - HWWS Grant application process.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 12 2011-01-01 2011-01-01 false HWWS Grant application process. 1776.7 Section 1776.7 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE (CONTINUED) HOUSEHOLD WATER WELL SYSTEM GRANT PROGRAM HWWS Grants § 1776.7 HWWS Grant application...
7 CFR 1776.7 - HWWS Grant application process.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 12 2014-01-01 2013-01-01 true HWWS Grant application process. 1776.7 Section 1776.7 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE (CONTINUED) HOUSEHOLD WATER WELL SYSTEM GRANT PROGRAM HWWS Grants § 1776.7 HWWS Grant application...
Experiences with a generator tool for building clinical application modules.
Kuhn, K A; Lenz, R; Elstner, T; Siegele, H; Moll, R
2003-01-01
To elaborate main system characteristics and relevant deployment experiences for the health information system (HIS) Orbis/OpenMed, which is in widespread use in Germany, Austria, and Switzerland. In a deployment phase of 3 years in a 1,200-bed university hospital, where the system underwent significant improvements, the system's functionality and its software design have been analyzed in detail. We focus on an integrated CASE tool for generating embedded clinical applications and for incremental system evolution. We present a participatory and iterative software engineering process developed for efficient utilization of such a tool. The system's functionality is comparable to that of other commercial products; its components are embedded in a vendor-specific application framework, and standard interfaces are being used for connecting subsystems. The integrated generator tool is a remarkable feature; it became a key factor of our project. Tool-generated applications are workflow enabled and embedded into the overall database schema. Rapid prototyping and iterative refinement are supported, so application modules can be adapted to the users' work practice. We consider tools supporting an iterative and participatory software engineering process highly relevant for health information system architects. The potential of a system to continuously evolve and to be effectively adapted to changing needs may be more important than sophisticated but hard-coded HIS functionality. More work will focus on HIS software design and on software engineering. Methods and tools are needed for quick and robust adaptation of systems to health care processes and changing requirements.
Adapting the SpaceCube v2.0 Data Processing System for Mission-Unique Application Requirements
NASA Technical Reports Server (NTRS)
Petrick, David; Gill, Nat; Hasouneh, Munther; Stone, Robert; Winternitz, Luke; Thomas, Luke; Davis, Milton; Sparacino, Pietro; Flatley, Thomas
2015-01-01
The SpaceCube™ v2.0 system is a superior high performance, reconfigurable, hybrid data processing system that can be used in a multitude of applications including those that require a radiation hardened and reliable solution. This paper provides an overview of the design architecture, flexibility, and the advantages of the modular SpaceCube v2.0 high performance data processing system for space applications. The current state of the proven SpaceCube technology is based on nine years of engineering and operations. Five systems have been successfully operated in space starting in 2008 with four more to be delivered for launch vehicle integration in 2015. The SpaceCube v2.0 system is also baselined as the avionics solution for five additional flight projects and is always a top consideration as the core avionics for new instruments or spacecraft control. This paper will highlight how this multipurpose system is currently being used to solve design challenges of three independent applications. The SpaceCube hardware adapts to new system requirements by allowing for application-unique interface cards that are utilized by reconfiguring the underlying programmable elements on the core processor card. We will show how this system is being used to improve on a heritage NASA GPS technology, enable a cutting-edge LiDAR instrument, and serve as a typical command and data handling (C&DH) computer for a space robotics technology demonstration.
Adapting the SpaceCube v2.0 Data Processing System for Mission-Unique Application Requirements
NASA Technical Reports Server (NTRS)
Petrick, David
2015-01-01
The SpaceCube™ v2.0 system is a superior high performance, reconfigurable, hybrid data processing system that can be used in a multitude of applications including those that require a radiation hardened and reliable solution. This paper provides an overview of the design architecture, flexibility, and the advantages of the modular SpaceCube v2.0 high performance data processing system for space applications. The current state of the proven SpaceCube technology is based on nine years of engineering and operations. Five systems have been successfully operated in space starting in 2008 with four more to be delivered for launch vehicle integration in 2015. The SpaceCube v2.0 system is also baselined as the avionics solution for five additional flight projects and is always a top consideration as the core avionics for new instruments or spacecraft control. This paper will highlight how this multipurpose system is currently being used to solve design challenges of three independent applications. The SpaceCube hardware adapts to new system requirements by allowing for application-unique interface cards that are utilized by reconfiguring the underlying programmable elements on the core processor card. We will show how this system is being used to improve on a heritage NASA GPS technology, enable a cutting-edge LiDAR instrument, and serve as a typical command and data handling (CDH) computer for a space robotics technology demonstration.
76 FR 58252 - Applications for New Awards; Statewide, Longitudinal Data Systems Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-20
... DEPARTMENT OF EDUCATION Applications for New Awards; Statewide, Longitudinal Data Systems Program... analysis and informed decision- making at all levels of the education system, increase the efficiency with... accountability systems, and simplify the processes used by SEAs to make education data transparent through...
Model-based design of experiments for cellular processes.
Chakrabarty, Ankush; Buzzard, Gregery T; Rundell, Ann E
2013-01-01
Model-based design of experiments (MBDOE) assists in the planning of highly effective and efficient experiments. Although the foundations of this field are well-established, the application of these techniques to understand cellular processes is a fertile and rapidly advancing area as the community seeks to understand ever more complex cellular processes and systems. This review discusses the MBDOE paradigm along with applications and challenges within the context of cellular processes and systems. It also provides a brief tutorial on Fisher information matrix (FIM)-based and Bayesian experiment design methods along with an overview of existing software packages and computational advances that support MBDOE application and adoption within the Systems Biology community. As cell-based products and biologics progress into the commercial sector, it is anticipated that MBDOE will become an essential practice for design, quality control, and production. Copyright © 2013 Wiley Periodicals, Inc.
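As a brief illustration of the FIM-based design methods mentioned in the tutorial portion, the sketch below approximates a Fisher information matrix from finite-difference parameter sensitivities and scores candidate sampling schedules by D-optimality (the determinant of the FIM). The exponential-decay model, noise level, and schedules are generic placeholders, not taken from the review.

```python
import numpy as np

def fisher_information(model, theta, t_points, sigma=0.1, eps=1e-6):
    """Approximate FIM = S^T S / sigma^2 for a model y(t; theta) with
    independent Gaussian measurement noise, using finite-difference
    sensitivities S[i, j] = d y(t_i) / d theta_j."""
    theta = np.asarray(theta, dtype=float)
    y0 = model(t_points, theta)
    S = np.empty((len(t_points), len(theta)))
    for j in range(len(theta)):
        th = theta.copy()
        th[j] += eps
        S[:, j] = (model(t_points, th) - y0) / eps
    return S.T @ S / sigma**2

def d_optimality(fim):
    """Larger det(FIM) means a tighter joint parameter confidence region."""
    return np.linalg.det(fim)

# Toy cellular decay model y = A * exp(-k * t) with parameters theta = (A, k).
model = lambda t, th: th[0] * np.exp(-th[1] * t)
theta = [2.0, 0.5]

# Compare two candidate sampling schedules.
early = np.linspace(0.1, 2.0, 6)
late = np.linspace(4.0, 10.0, 6)
print(d_optimality(fisher_information(model, theta, early)))
print(d_optimality(fisher_information(model, theta, late)))
```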
DOE Office of Scientific and Technical Information (OSTI.GOV)
2013-04-04
Spindle is software infrastructure that solves file system scalability problems associated with starting dynamically linked applications in HPC environments. When an HPC application starts up thousands of processes at once, and those processes simultaneously access a shared file system to look for shared libraries, it can cause significant performance problems for both the application and other users. Spindle scalably coordinates the distribution of shared libraries to an application to avoid hammering the shared file system.
NASA Astrophysics Data System (ADS)
Fasel, Markus
2016-10-01
High-Performance Computing Systems are powerful tools tailored to support large-scale applications that rely on low-latency inter-process communications to run efficiently. By design, these systems often impose constraints on application workflows, such as limited external network connectivity and whole node scheduling, that make more general-purpose computing tasks, such as those commonly found in high-energy nuclear physics applications, more difficult to carry out. In this work, we present a tool designed to simplify access to such complicated environments by handling the common tasks of job submission, software management, and local data management, in a framework that is easily adaptable to the specific requirements of various computing systems. The tool, initially constructed to process stand-alone ALICE simulations for detector and software development, was successfully deployed on the NERSC computing systems, Carver, Hopper and Edison, and is being configured to provide access to the next generation NERSC system, Cori. In this report, we describe the tool and discuss our experience running ALICE applications on NERSC HPC systems. The discussion will include our initial benchmarks of Cori compared to other systems and our attempts to leverage the new capabilities offered with Cori to support data-intensive applications, with a future goal of full integration of such systems into ALICE grid operations.
Optimum process design of packed bed type thermal storage systems and other applications
Bindra, Hitesh; Bueno, Pablo
2016-10-25
Methods and systems for optimizing the process of heat and/or mass transfer operations in packed beds and embodiments of applications of the methods are disclosed herein below. In one instance, the method results in the profile of the quantity representative of the heat and/or mass transfer operation having a propagating substantially sharp front.
ERIC Educational Resources Information Center
1980
This collection of 22 papers examines various word processing (WP) technologies, systems, and applications. The first five papers by C. Briggs, C. Taylor, G. McLean, D. Remsen, and C. Norris discuss WP applications in the Army, a WP system for an insurance firm, the organization of the International Word Processing Association, WP fundamentals,…
2010-06-01
EMERGING NEUROMORPHIC COMPUTING ARCHITECTURES AND ENABLING... (report period April 2009 to January 2010). The highly cross-disciplinary emerging field of neuromorphic computing architectures for cognitive information processing applications... belief systems, software, computer engineering, etc. In our effort to develop cognitive systems atop a neuromorphic computing architecture, we explored
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 4 2012-01-01 2012-01-01 false Application of the System and Technical Guidelines During the Siting Process III Appendix III to Part 960 Energy DEPARTMENT OF ENERGY GENERAL GUIDELINES FOR THE PRELIMINARY SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Pt. 960, App. III Appendix III to Part...
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 4 2014-01-01 2014-01-01 false Application of the System and Technical Guidelines During the Siting Process III Appendix III to Part 960 Energy DEPARTMENT OF ENERGY GENERAL GUIDELINES FOR THE PRELIMINARY SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Pt. 960, App. III Appendix III to Part...
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 4 2013-01-01 2013-01-01 false Application of the System and Technical Guidelines During the Siting Process III Appendix III to Part 960 Energy DEPARTMENT OF ENERGY GENERAL GUIDELINES FOR THE PRELIMINARY SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Pt. 960, App. III Appendix III to Part...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 4 2010-01-01 2010-01-01 false Application of the System and Technical Guidelines During the Siting Process III Appendix III to Part 960 Energy DEPARTMENT OF ENERGY GENERAL GUIDELINES FOR THE PRELIMINARY SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Pt. 960, App. III Appendix III to Part...
Kino-oka, Masahiro; Taya, Masahito
2009-10-01
Innovative techniques of cell and tissue processing, based on tissue engineering, have been developed for therapeutic applications. Cell expansion and tissue reconstruction through ex vivo cultures are core processes used to produce engineered tissues with sufficient structural integrity and functionality. In manufacturing, strict management against contamination and human error is required because of the direct use of non-sterilizable products and the laboriousness of culture operations, respectively. Therefore, the development of processing systems for cell and tissue cultures is one of the critical issues for ensuring a stable process and quality of therapeutic products. However, the siting criterion of culture systems to date has not been made clear. This review article classifies some of the known processing systems into 'sealed-chamber' and 'sealed-vessel' culture systems based on the difference in their aseptic spaces, and describes the potential advantages of these systems and the current state of culture systems, especially those established by Japanese companies. Moreover, on the basis of the guidelines for isolator systems used in aseptic processing for healthcare products, which are issued by the International Organization for Standardization, the siting criterion of the processing systems for cell and tissue cultures is discussed from the perspective of manufacturing therapeutic products in consideration of the regulations according to Good Manufacturing Practice.
Introduction to the Space Weather Monitoring System at KASI
NASA Astrophysics Data System (ADS)
Baek, J.; Choi, S.; Kim, Y.; Cho, K.; Bong, S.; Lee, J.; Kwak, Y.; Hwang, J.; Park, Y.; Hwang, E.
2014-05-01
We have developed the Space Weather Monitoring System (SWMS) at the Korea Astronomy and Space Science Institute (KASI). Since 2007, the system has continuously evolved into a better system. The SWMS consists of several subsystems: applications which acquire and process observational data, servers which run the applications, data storage, and display facilities which show the space weather information. The applications collect solar and space weather data from domestic and overseas sites. The collected data are converted to other formats and/or visualized in real time as graphs and illustrations. We manage 3 data acquisition and processing servers, a file service server, a web server, and 3 sets of storage systems. We have developed 30 applications for a variety of data, and the volume of data is about 5.5 GB per day. We provide our customers with space weather contents displayed at the Space Weather Monitoring Lab (SWML) using web services.
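A minimal sketch of the acquire-and-convert step that such applications perform is shown below: fetch a plain-text upstream product, convert it to structured records, and hand it on for storage or display. The source URL and file format are hypothetical placeholders; the real SWMS applications handle many different domestic and overseas feeds, each with its own format and cadence.

```python
import urllib.request
from datetime import datetime, timezone

def parse_product(text):
    """Convert a plain-text 'time value' product into records suitable for
    storage and real-time display."""
    records = []
    for line in text.splitlines():
        if not line.strip() or line.startswith("#"):
            continue
        timestamp, value = line.split()[:2]
        records.append({"time": timestamp, "kp": float(value),
                        "ingested": datetime.now(timezone.utc).isoformat()})
    return records

def fetch(url):
    """Fetch one upstream product over HTTP."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        return resp.read().decode("utf-8")

# Parsed from an inline sample here; a deployed application would call
# parse_product(fetch("https://example.org/space-weather/kp_latest.txt")).
sample = "# time  kp\n2014-05-01T00:00Z 2.3\n2014-05-01T03:00Z 3.0\n"
for record in parse_product(sample):
    print(record)
```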
NASA Astrophysics Data System (ADS)
Saldan, Yosyp R.; Pavlov, Sergii V.; Vovkotrub, Dina V.; Saldan, Yulia Y.; Vassilenko, Valentina B.; Mazur, Nadia I.; Nikolaichuk, Daria V.; Wójcik, Waldemar; Romaniuk, Ryszard; Suleimenov, Batyrbek; Bainazarov, Ulan
2017-08-01
The process of obtaining eye tomograms by means of optical coherence tomography is studied. Stages of idiopathic macular hole formation in the process of eye fundus diagnostics are considered, and the main stages of retinal pathology progression are determined. Fuzzy logic units for obtaining reliable conclusions regarding the result of diagnosis are developed. Based on the results of theoretical and practical research, a system and technique for analyzing the state of the retinal macular region of the eye are developed; application of the system, based on a fuzzy logic device, improves the efficiency of complex examination of the eye retina.
Precision and resolution in laser direct microstructuring with bursts of picosecond pulses
NASA Astrophysics Data System (ADS)
Mur, Jaka; Petkovšek, Rok
2018-01-01
Pulsed laser sources facilitate efficient material removal in a range of scientific and industrial applications. Commercially available laser systems in the field typically use a focused laser beam of 10-20 μm in diameter. In line with the ongoing trends of miniaturization, we have developed a picosecond fiber laser-based system combining fast beam deflection and tight focusing for material processing and optical applications. We have predicted and verified the system's precision, resolution, and minimum achievable feature size for material processing applications. The analysis of the laser's performance requirements for specific high-precision laser processing applications is an important aspect for further development of the technique. We have predicted and experimentally verified that the maximal edge roughness of single-micrometer-sized features was below 200 nm, accounting for the laser's energy and positioning stability, beam deflection, the effect of spot spacing, and efficient isolation of mechanical vibrations. We have demonstrated that a novel fiber laser operating regime in bursts of pulses increases the laser energy stability. The results of our research improve the potential of fiber laser sources for material processing applications and facilitate their use by enabling operation at lower pulse energies in bursts as opposed to single-pulse regimes.
BIO-Plex Information System Concept
NASA Technical Reports Server (NTRS)
Jones, Harry; Boulanger, Richard; Arnold, James O. (Technical Monitor)
1999-01-01
This paper describes a suggested design for an integrated information system for the proposed BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) at Johnson Space Center (JSC), including distributed control systems, central control, networks, database servers, personal computers and workstations, applications software, and external communications. The system will have an open commercial computing and networking architecture. The network will provide automatic real-time transfer of information to database server computers which perform data collection and validation. This information system will support integrated, data-sharing applications for everything from system alarms to management summaries. Most existing complex process control systems have information gaps between the different real-time subsystems, between these subsystems and the central controller, between the central controller and system-level planning and analysis application software, and between the system-level applications and management overview reporting. An integrated information system is vitally necessary as the basis for the integration of planning, scheduling, modeling, monitoring, and control, which will allow improved monitoring and control based on timely, accurate and complete data. Data describing the system configuration and the real-time processes can be collected, checked and reconciled, analyzed, and stored in database servers that can be accessed by all applications. The required technology is available. The only opportunity to design a distributed, nonredundant, integrated system is before it is built. Retrofit is extremely difficult and costly.
Application of advanced on-board processing concepts to future satellite communications systems
NASA Technical Reports Server (NTRS)
Katz, J. L.; Hoffman, M.; Kota, S. L.; Ruddy, J. M.; White, B. F.
1979-01-01
An initial definition of on-board processing requirements for an advanced satellite communications system to service domestic markets in the 1990's is presented. An exemplar system architecture with both RF on-board switching and demodulation/remodulation baseband processing was used to identify important issues related to system implementation, cost, and technology development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berry, K.R.; Hansen, F.R.; Napolitano, L.M.
1992-01-01
DART (DSP Array for Reconfigurable Tasks) is a parallel architecture of two high-performance DSP (digital signal processing) chips with the flexibility to handle a wide range of real-time applications. Each of the 32-bit floating-point DSP processors in DART is programmable in a high-level language ("C" or Ada). We have added extensions to the real-time operating system used by DART in order to support parallel processing. The combination of high-level language programmability, a real-time operating system, and parallel processing support significantly reduces the development cost of application software for signal processing and control applications. We have demonstrated this capability by using DART to reconstruct images in the prototype VIP (Video Imaging Projectile) groundstation.
Printed Carbon Nanotube Electronics and Sensor Systems.
Chen, Kevin; Gao, Wei; Emaminejad, Sam; Kiriya, Daisuke; Ota, Hiroki; Nyein, Hnin Yin Yin; Takei, Kuniharu; Javey, Ali
2016-06-01
Printing technologies offer large-area, high-throughput production capabilities for electronics and sensors on mechanically flexible substrates that can conformally cover different surfaces. These capabilities enable a wide range of new applications such as low-cost disposable electronics for health monitoring and wearables, extremely large format electronic displays, interactive wallpapers, and sensing arrays. Solution-processed carbon nanotubes have been shown to be a promising candidate for such printing processes, offering stable devices with high performance. Here, recent progress made in printed carbon nanotube electronics is discussed in terms of materials, processing, devices, and applications. Research challenges and opportunities moving forward from processing and system-level integration points of view are also discussed for enabling practical applications. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Solar energy for process heat: Design/cost studies of four industrial retrofit applications
NASA Technical Reports Server (NTRS)
French, R. L.; Bartera, R. E.
1978-01-01
Five specific California plants with potentially attractive solar applications were identified in a process heat survey. These five plants were visited, process requirements evaluated, and conceptual solar system designs were generated. Four DOE (ERDA) sponsored solar energy system demonstration projects were also reviewed and compared to the design/cost cases included in this report. In four of the five cases investigated, retrofit installations providing significant amounts of thermal energy were found to be feasible. The fifth was rejected because of the condition of the building involved, but the process (soap making) appears to be an attractive potential solar application. Costs, however, tend to be high. Several potential areas for cost reduction were identified including larger collector modules and higher duty cycles.
Druzinec, Damir; Salzig, Denise; Brix, Alexander; Kraume, Matthias; Vilcinskas, Andreas; Kollewe, Christian; Czermak, Peter
2013-01-01
Due to the increasing use of insect cell-based expression systems in research and industrial recombinant protein production, the development of efficient and reproducible production processes remains a challenging task. In this context, the application of online monitoring techniques is intended to ensure high and reproducible product quality already during the early phases of process development. In the following chapter, the most common transient and stable insect cell-based expression systems are briefly introduced. Novel applications of insect cell-based expression systems for the production of insect-derived antimicrobial peptides/proteins (AMPs) are discussed using the example of G. mellonella-derived gloverin. Suitable in situ sensor techniques for insect cell culture monitoring in disposable and common bioreactor systems are outlined with respect to optical and capacitive sensor concepts. Since scale-up of production processes is one of the most critical steps in process development, a concluding overview is given of scale-up aspects for industrial insect cell culture processes.
VHP - An environment for the remote visualization of heuristic processes
NASA Technical Reports Server (NTRS)
Crawford, Stuart L.; Leiner, Barry M.
1991-01-01
A software system called VHP is introduced which permits the visualization of heuristic algorithms on both resident and remote hardware platforms. The VHP is based on the DCF tool for interprocess communication and is applicable to remote algorithms which can be on different types of hardware and in languages other than VHP. The VHP system is of particular interest to systems in which the visualization of remote processes is required such as robotics for telescience applications.
The application of charge-coupled device processors in automatic-control systems
NASA Technical Reports Server (NTRS)
Mcvey, E. S.; Parrish, E. A., Jr.
1977-01-01
The application of charge-coupled device (CCD) processors to automatic-control systems is suggested. CCD processors are a new form of semiconductor component with the unique ability to process sampled signals on an analog basis. Specific implementations of controllers are suggested for linear time-invariant, time-varying, and nonlinear systems. Typical processing time should be only a few microseconds. This form of technology may become competitive with microprocessors and minicomputers in addition to supplementing them.
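A CCD processor of the kind described is essentially a sampled-data transversal (tapped delay line) filter operating directly on analog samples. The digital sketch below emulates that tap-weight-and-sum structure for a simple discrete compensator; the tap weights and input signal are illustrative assumptions, not values from the paper.

```python
import numpy as np

def transversal_filter(x, taps):
    """Emulate a CCD tapped delay line: each output sample is the weighted
    sum of the current and previous inputs, y[n] = sum_k h[k] * x[n-k]."""
    delay_line = np.zeros(len(taps))
    y = np.empty(len(x), dtype=float)
    for n, sample in enumerate(x):
        delay_line = np.roll(delay_line, 1)   # charge packets shift one stage
        delay_line[0] = sample                # new sample enters the line
        y[n] = float(np.dot(taps, delay_line))
    return y

# Illustrative 4-tap compensator acting on a unit step command;
# the output ramps to 1.0 as the delay line fills.
taps = np.array([0.5, 0.3, 0.15, 0.05])
step = np.ones(10)
print(transversal_filter(step, taps))
```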
NASA Technical Reports Server (NTRS)
1991-01-01
Induction heating technology, a magnetic non-deforming process, was developed by Langley researchers to join plastic and composite components in space. Under NASA license, Inductron Corporation uses the process to produce induction heating systems and equipment for numerous applications. The Torobonder, a portable system, comes with a number of interchangeable heads for aircraft repair. Other developments are the E Heating Head, the Toroid Joining Gun, and the Torobrazer. These products perform bonding applications more quickly, safely and efficiently than previous methods.
Space Shuttle Software Development and Certification
NASA Technical Reports Server (NTRS)
Orr, James K.; Henderson, Johnnie A
2000-01-01
Man-rated software, "software which is in control of systems and environments upon which human life is critically dependent," must be highly reliable. The Space Shuttle Primary Avionics Software System is an excellent example of such a software system. Lessons learned from more than 20 years of effort have identified basic elements that must be present to achieve this high degree of reliability. The elements include rigorous application of appropriate software development processes, use of trusted tools to support those processes, quantitative process management, and defect elimination and prevention. This presentation highlights methods used within the Space Shuttle project and raises questions that must be addressed to provide similar success in a cost-effective manner on future long-term projects where key application development tools are COTS rather than internally developed custom tools.
Multimedia Security System for Security and Medical Applications
ERIC Educational Resources Information Center
Zhou, Yicong
2010-01-01
This dissertation introduces a new multimedia security system for the performance of object recognition and multimedia encryption in security and medical applications. The system embeds an enhancement and multimedia encryption process into the traditional recognition system in order to improve the efficiency and accuracy of object detection and…
14 CFR 1300.16 - Application process.
Code of Federal Regulations, 2012 CFR
2012-01-01
... aviation system in the United States and that credit is not reasonably available at the time of the....16 Aeronautics and Space AIR TRANSPORTATION SYSTEM STABILIZATION OFFICE OF MANAGEMENT AND BUDGET... applications to the Board any time after October 12, 2001 through June 28, 2002. All applications must be...
14 CFR 1300.16 - Application process.
Code of Federal Regulations, 2013 CFR
2013-01-01
... aviation system in the United States and that credit is not reasonably available at the time of the....16 Aeronautics and Space AIR TRANSPORTATION SYSTEM STABILIZATION OFFICE OF MANAGEMENT AND BUDGET... applications to the Board any time after October 12, 2001 through June 28, 2002. All applications must be...
14 CFR 1300.16 - Application process.
Code of Federal Regulations, 2011 CFR
2011-01-01
... aviation system in the United States and that credit is not reasonably available at the time of the... Aeronautics and Space AIR TRANSPORTATION SYSTEM STABILIZATION OFFICE OF MANAGEMENT AND BUDGET AVIATION... applications to the Board any time after October 12, 2001 through June 28, 2002. All applications must be...
A service based adaptive U-learning system using UX.
Jeong, Hwa-Young; Yi, Gangman
2014-01-01
In recent years, traditional development techniques for e-learning systems have been changing to become more convenient and efficient. One new technology in the development of application systems includes both cloud and ubiquitous computing. Cloud computing can support learning system processes by using services, while ubiquitous computing can provide system operation and management via a high-performance technical process and network. In the cloud computing environment, a learning service application can provide a business module or process to the user via the internet. This research focuses on providing the learning material and processes of courses by learning units, using services in a ubiquitous computing environment. We also investigate functions that provide users with tailored materials according to their learning style. That is, we analyzed the users' data and their characteristics in accordance with their user experience. We subsequently adapted the learning process to fit their learning performance and preferences. Finally, we demonstrate that the proposed system delivers better learning effects to learners than existing techniques.
Working on the Boundaries: Philosophies and Practices of the Design Process
NASA Technical Reports Server (NTRS)
Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.
1996-01-01
While the systems engineering process is a formal, contractually binding program management technique, the design process is the informal practice of achieving the project's design requirements throughout all design phases of the systems engineering process. The design process and its organization are system- and component-dependent. Informal reviews include technical information meetings and concurrent engineering sessions, while formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles of the design process, identifies its role in the analysis and integration of interacting systems and disciplines, and illustrates the application of the process through experience with aerostructural designs.
ERIC Educational Resources Information Center
National Academy of Sciences - National Research Council, Washington, DC. Assembly of Engineering.
This report summarizes the findings of one of fourteen panels that studied progress in space science applications and defined user needs potentially capable of being met by space-system applications. The study was requested by the National Aeronautics and Space Administration (NASA) and was conducted by the Space Applications Board. The panels…
Application of automation and information systems to forensic genetic specimen processing.
Leclair, Benoît; Scholl, Tom
2005-03-01
During the last 10 years, the introduction of PCR-based DNA typing technologies in forensic applications has been highly successful. This technology has become pervasive throughout forensic laboratories and it continues to grow in prevalence. For many criminal cases, it provides the most probative evidence. Criminal genotype data banking and victim identification initiatives that follow mass-fatality incidents have benefited the most from the introduction of automation for sample processing and data analysis. Attributes of offender specimens including large numbers, high quality and identical collection and processing are ideal for the application of laboratory automation. The magnitude of kinship analysis required by mass-fatality incidents necessitates the application of computing solutions to automate the task. More recently, the development activities of many forensic laboratories are focused on leveraging experience from these two applications to casework sample processing. The trend toward increased prevalence of forensic genetic analysis will continue to drive additional innovations in high-throughput laboratory automation and information systems.
VLSI technology for smaller, cheaper, faster return link systems
NASA Technical Reports Server (NTRS)
Nanzetta, Kathy; Ghuman, Parminder; Bennett, Toby; Solomon, Jeff; Dowling, Jason; Welling, John
1994-01-01
Very Large Scale Integration (VLSI) Application-Specific Integrated Circuit (ASIC) technology has enabled substantially smaller, cheaper, and more capable telemetry data systems. However, the rapid growth in available ASIC fabrication densities has far outpaced the application of this technology to telemetry systems. Available densities have grown by well over an order of magnitude since NASA's Goddard Space Flight Center (GSFC) first began developing ASICs for ground telemetry systems in 1985. To take advantage of these higher integration levels, a new generation of ASICs for return link telemetry processing is under development. These new submicron devices are designed to further reduce the cost and size of NASA return link processing systems while improving performance. This paper describes these highly integrated processing components.
On-Board, Real-Time Preprocessing System for Optical Remote-Sensing Imagery
Qi, Baogui; Zhuang, Yin; Chen, He; Chen, Liang
2018-01-01
With the development of remote-sensing technology, optical remote-sensing imagery processing has played an important role in many application fields, such as geological exploration and natural disaster prevention. However, relative radiation correction and geometric correction are key steps in preprocessing because raw image data without preprocessing will cause poor performance during application. Traditionally, remote-sensing data are downlinked to the ground station, preprocessed, and distributed to users. This process generates long delays, which is a major bottleneck in real-time applications for remote-sensing data. Therefore, on-board, real-time image preprocessing is greatly desired. In this paper, a real-time processing architecture for on-board imagery preprocessing is proposed. First, a hierarchical optimization and mapping method is proposed to realize the preprocessing algorithm in a hardware structure, which can effectively reduce the computation burden of on-board processing. Second, a co-processing system using a field-programmable gate array (FPGA) and a digital signal processor (DSP; altogether, FPGA-DSP) based on optimization is designed to realize real-time preprocessing. The experimental results demonstrate the potential application of our system to an on-board processor, for which resources and power consumption are limited. PMID:29693585
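Of the two preprocessing steps named above, relative radiometric correction is the simpler to illustrate: each detector element is brought to a common response with a per-detector gain and offset. The sketch below uses NumPy with synthetic data and made-up calibration coefficients; it is not the FPGA-DSP implementation described in the paper.

```python
import numpy as np

# Synthetic raw pushbroom image: rows = scan lines, columns = detector elements
raw = np.random.randint(0, 1024, size=(512, 8)).astype(np.float32)

# Hypothetical per-detector calibration coefficients (one gain/offset per column)
gain = np.array([1.00, 0.98, 1.02, 1.01, 0.99, 1.03, 0.97, 1.00], dtype=np.float32)
offset = np.array([2.0, -1.5, 0.5, 0.0, 1.0, -0.5, 2.5, -2.0], dtype=np.float32)

# Relative radiometric correction: DN_corrected = gain * DN_raw + offset,
# applied column-wise so all detectors share a common response
corrected = gain[np.newaxis, :] * raw + offset[np.newaxis, :]
print(corrected.shape)
```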
The Modular Modeling System (MMS): User's Manual
Leavesley, G.H.; Restrepo, Pedro J.; Markstrom, S.L.; Dixon, M.; Stannard, L.G.
1996-01-01
The Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide the research and operational framework needed to support development, testing, and evaluation of physical-process algorithms and to facilitate integration of user-selected sets of algorithms into operational physical-process models. MMS uses a module library that contains modules for simulating a variety of water, energy, and biogeochemical processes. A model is created by selectively coupling the most appropriate modules from the library to create a 'suitable' model for the desired application. Where existing modules do not provide appropriate process algorithms, new modules can be developed. The MMS user's manual provides installation instructions and a detailed discussion of system concepts, module development, and model development and application using the MMS graphical user interface.
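To make the module-coupling idea concrete, the sketch below chains two toy process functions that update a shared state, one per time step. The function names and parameter values are invented for illustration; they do not reflect the actual MMS module library or its interfaces.

```python
# Toy illustration of coupling process modules into a model (not the MMS API).
def snowmelt(state):
    # Hypothetical degree-day melt factor of 0.2 mm per degree-day
    melt = min(max(0.0, 0.2 * state["temp_c"]), state["snowpack"])
    state["snowpack"] -= melt
    state["runoff"] += melt
    return state

def evaporation(state):
    # Hypothetical temperature-driven loss from available runoff
    state["runoff"] = max(0.0, state["runoff"] - 0.05 * state["temp_c"])
    return state

model = [snowmelt, evaporation]  # "couple" the selected modules into a model
state = {"temp_c": 5.0, "snowpack": 40.0, "runoff": 0.0}
for module in model:             # advance one simulated time step
    state = module(state)
print(state)
```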
NASA Technical Reports Server (NTRS)
Otaguro, W. S.; Kesler, L. O.; Land, K. C.; Rhoades, D. E.
1987-01-01
An intelligent tracker capable of robotic applications requiring guidance and control of platforms, robotic arms, and end effectors has been developed. This packaged system capable of supervised autonomous robotic functions is partitioned into a multiple processor/parallel processing configuration. The system currently interfaces to cameras but has the capability to also use three-dimensional inputs from scanning laser rangers. The inputs are fed into an image processing and tracking section where the camera inputs are conditioned for the multiple tracker algorithms. An executive section monitors the image processing and tracker outputs and performs all the control and decision processes. The present architecture of the system is presented with discussion of its evolutionary growth for space applications. An autonomous rendezvous demonstration of this system was performed last year. More realistic demonstrations in planning are discussed.
NASA Technical Reports Server (NTRS)
1984-01-01
Topics discussed at the symposium include hardware, geographic information system (GIS) implementation, processing remotely sensed data, spatial data structures, and NASA programs in remote sensing information systems. Attention is also given to GIS applications, advanced techniques, artificial intelligence, graphics, spatial navigation, and classification. Papers are included on the design of computer software for geographic image processing, concepts for a global resource information system, algorithm development for spatial operators, and an application of expert systems technology to remotely sensed image analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferrada, J.J.; Osborne-Lee, I.W.; Grizzaffi, P.A.
Expert systems are known to be useful in capturing expertise and applying knowledge to chemical engineering problems such as diagnosis, process control, process simulation, and process advisory. However, expert system applications are traditionally limited to knowledge domains that are heuristic and involve only simple mathematics. Neural networks, on the other hand, represent an emerging technology capable of rapid recognition of patterned behavior without regard to mathematical complexity. Although useful in problem identification, neural networks are not very efficient in providing in-depth solutions and typically do not promote full understanding of the problem or the reasoning behind its solutions. Hence, applications of neural networks have certain limitations. This paper explores the potential for expanding the scope of chemical engineering areas where neural networks might be utilized by incorporating expert systems and neural networks into the same application, a process called hybridization. In addition, hybrid applications are compared with those using more traditional approaches, the results of the different applications are analyzed, and the feasibility of converting the preliminary prototypes described herein into useful final products is evaluated. 12 refs., 8 figs.
Adapting Wave-front Algorithms to Efficiently Utilize Systems with Deep Communication Hierarchies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerbyson, Darren J.; Lang, Michael; Pakin, Scott
2011-09-30
Large-scale systems increasingly exhibit a differential between intra-chip and inter-chip communication performance, especially in hybrid systems using accelerators. Processor cores on the same socket are able to communicate at lower latencies, and with higher bandwidths, than cores on different sockets either within the same node or between nodes. A key challenge is to efficiently use this communication hierarchy and hence optimize performance. We consider here the class of applications that contains wavefront processing. In these applications data can only be processed after their upstream neighbors have been processed. Similar dependencies result between processors in which communication is required to pass boundary data downstream and whose cost is typically impacted by the slowest communication channel in use. In this work we develop a novel hierarchical wave-front approach that reduces the use of slower communications in the hierarchy but at the cost of additional steps in the parallel computation and higher use of on-chip communications. This tradeoff is explored using a performance model. An implementation using the Reverse-acceleration programming model on the petascale Roadrunner system demonstrates a 27% performance improvement at full system-scale on a kernel application. The approach is generally applicable to large-scale multi-core and accelerated systems where a differential in system communication performance exists.
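The defining property of wave-front applications is that cell (i, j) can be processed only after its upstream neighbors (i-1, j) and (i, j-1). A minimal serial sketch of that sweep is shown below; cells on the same anti-diagonal are mutually independent and are what a parallel (or hierarchical) implementation would distribute across cores. This illustrates only the dependency pattern, not the hierarchical communication scheme of the paper.

```python
import numpy as np

def wavefront_sweep(cost):
    """Process a 2D grid where cell (i, j) depends on (i-1, j) and (i, j-1)."""
    n, m = cost.shape
    out = np.zeros((n, m))
    # Cells with the same i + j lie on one anti-diagonal and are independent;
    # successive anti-diagonals form the advancing wave front.
    for d in range(n + m - 1):
        for i in range(max(0, d - m + 1), min(n, d + 1)):
            j = d - i
            up = out[i - 1, j] if i > 0 else 0.0
            left = out[i, j - 1] if j > 0 else 0.0
            out[i, j] = cost[i, j] + min(up, left)
    return out

print(wavefront_sweep(np.ones((4, 5))))
```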
Atmospheric and Oceanographic Information Processing System (AOIPS) system description
NASA Technical Reports Server (NTRS)
Bracken, P. A.; Dalton, J. T.; Billingsley, J. B.; Quann, J. J.
1977-01-01
The development of hardware and software for an interactive, minicomputer based processing and display system for atmospheric and oceanographic information extraction and image data analysis is described. The major applications of the system are discussed as well as enhancements planned for the future.
Liu, Tongzhu; Shen, Aizong; Hu, Xiaojian; Tong, Guixian; Gu, Wei
2017-06-01
We aimed to apply a collaborative business intelligence (BI) system to the hospital supply, processing and distribution (SPD) logistics management model. We searched the Engineering Village database, China National Knowledge Infrastructure (CNKI) and Google for articles (published from 2011 to 2016), books, Web pages, etc., to understand SPD and BI related theories and recent research status. For the application of collaborative BI technology in the hospital SPD logistics management model, we realized this by leveraging data mining techniques to discover knowledge from complex data and collaborative techniques to improve the theories of business process. For the application of the BI system, we: (i) proposed a layered structure of collaborative BI system for intelligent management in hospital logistics; (ii) built a data warehouse for the collaborative BI system; (iii) improved data mining techniques such as support vector machines (SVM) and the swarm intelligence firefly algorithm to solve key problems in the hospital logistics collaborative BI system; (iv) researched the collaborative techniques oriented to data and business process optimization to improve the business processes of hospital logistics management. Proper combination of the SPD model and BI system will improve the management of logistics in hospitals. The successful implementation of the study requires: (i) innovating and improving the traditional SPD model and making appropriate implementation plans and schedules for the application of the BI system according to the actual situations of hospitals; (ii) the collaborative participation of internal hospital departments including information, logistics, nursing, medical and financial; (iii) timely response of external suppliers.
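As one concrete illustration of the data-mining component mentioned above, the sketch below trains a standard scikit-learn support vector machine on synthetic supply-consumption features. The feature names, labels and data are invented; the paper's improved SVM and firefly-algorithm details are not reproduced here.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic features per supply item: weekly usage, lead time, unit cost (all fake)
X = rng.normal(size=(200, 3))
# Synthetic label: 1 = high-turnover item, 0 = low-turnover item
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X[:150], y[:150])
print("held-out accuracy:", model.score(X[150:], y[150:]))
```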
Interactive data-processing system for metallurgy
NASA Technical Reports Server (NTRS)
Rathz, T. J.
1978-01-01
The system can rapidly and accurately process metallurgical and materials-processing data for a wide range of applications. Advantages include an increase in contrast between areas on an image, the ability to analyze images via operator-written programs, and space available for storing images.
Application research of Ganglia in Hadoop monitoring and management
NASA Astrophysics Data System (ADS)
Li, Gang; Ding, Jing; Zhou, Lixia; Yang, Yi; Liu, Lei; Wang, Xiaolei
2017-03-01
Hadoop systems have many applications in the fields of big data and cloud computing. The storage and application test bench of the seismic network at the Earthquake Administration of Tianjin is built on a Hadoop system, which is operated and monitored using the open-source Ganglia software. This paper reviews the functions, installation and configuration process, and operational and monitoring results of Ganglia on this Hadoop system. It also briefly introduces the idea and effect of monitoring the Hadoop system with the Nagios software. This experience is valuable for monitoring systems on cloud computing platforms in industry.
SSME propellant path leak detection real-time
NASA Technical Reports Server (NTRS)
Crawford, R. A.; Smith, L. M.
1994-01-01
Included are four documents that outline the technical aspects of the research performed on NASA Grant NAG8-140: 'A System for Sequential Step Detection with Application to Video Image Processing'; 'Leak Detection from the SSME Using Sequential Image Processing'; 'Digital Image Processor Specifications for Real-Time SSME Leak Detection'; and 'A Color Change Detection System for Video Signals with Applications to Spectral Analysis of Rocket Engine Plumes'.
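The sequential step detection referred to in the first two documents can be illustrated with a one-sided CUSUM detector applied to a single pixel-intensity trace: evidence of an upward mean shift accumulates until it crosses a threshold. The drift and threshold values below are illustrative only and are not taken from the cited reports.

```python
import numpy as np

def cusum_step_detector(signal, target, drift=0.5, threshold=5.0):
    """Return the first sample index where a positive mean shift is declared, else None."""
    s = 0.0
    for n, x in enumerate(signal):
        s = max(0.0, s + (x - target - drift))  # accumulate evidence of an upward step
        if s > threshold:
            return n
    return None

# Synthetic pixel-intensity trace with a step change (e.g. a leak plume appearing)
rng = np.random.default_rng(1)
trace = np.concatenate([rng.normal(10, 1, 100), rng.normal(14, 1, 100)])
print("step declared at sample:", cusum_step_detector(trace, target=10.0))
```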
Artificial intelligence issues related to automated computing operations
NASA Technical Reports Server (NTRS)
Hornfeck, William A.
1989-01-01
Large data processing installations represent target systems for effective applications of artificial intelligence (AI) constructs. The system organization of a large data processing facility at the NASA Marshall Space Flight Center is presented. The methodology and the issues which are related to AI application to automated operations within a large-scale computing facility are described. Problems to be addressed and initial goals are outlined.
Automated Detection of a Crossing Contact Based on Its Doppler Shift
2009-03-01
... contacts in passive sonar systems. A common approach is the application of high-gain processing followed by successive classification criteria. ... The trade-off between the false alarm and detection probability is fundamental in radar and sonar (Chevalier, 2002). ...
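The Doppler signature that such a detector exploits follows the standard acoustic relation for a tonal from a moving source heard at a stationary receiver; at the closest point of approach of a crossing contact the radial speed, and hence the shift, passes through zero. The numbers below are illustrative and are not drawn from the report.

```python
# Standard acoustic Doppler relation for a moving source and stationary receiver.
C_WATER = 1500.0  # nominal sound speed in seawater, m/s

def received_frequency(f_source_hz, radial_speed_mps):
    """radial_speed_mps > 0 means the contact is closing (approaching the receiver)."""
    return f_source_hz * C_WATER / (C_WATER - radial_speed_mps)

f0 = 300.0  # hypothetical source tonal, Hz
for v in (-5.0, 0.0, 5.0):  # opening, at closest point of approach, closing
    print(f"{v:+.1f} m/s -> {received_frequency(f0, v):.3f} Hz")
```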
Manufacturing process applications team (MATeam)
NASA Technical Reports Server (NTRS)
Bangs, E. R.
1980-01-01
The objectives and activities of an aerospace technology transfer group are outlined, and programs in various stages of progress are described, including the orbital tube flaring device, an infrared proximity sensor for robot positioning, laser stripping of magnet wire, infrared imaging as a welding-process tracking system, carbide coating of cutting tools, nondestructive fracture toughness testing of titanium welds, a portable solar system for agricultural applications, and an anaerobic methane gas generator.
Jimenez-Molina, Angel; Gaete-Villegas, Jorge; Fuentes, Javier
2018-06-01
New advances in telemedicine, ubiquitous computing, and artificial intelligence have supported the emergence of more advanced applications and support systems for chronic patients. This trend addresses the important problem of chronic illnesses, highlighted by multiple international organizations as a core issue in future healthcare. Despite the myriad of exciting new developments, each application and system is designed and implemented for specific purposes and lacks the flexibility to support different healthcare concerns. Some of the known problems of such developments are the integration issues between applications and existing healthcare systems, the reusability of technical knowledge in the creation of new and more sophisticated systems, and the usage of data gathered from multiple sources in the generation of new knowledge. This paper proposes a framework for the development of chronic disease support systems and applications as an answer to these shortcomings. Through this framework we aim to create a common-ground methodology upon which new developments can be built and easily integrated to provide better support to chronic patients, medical staff and other relevant participants. General requirements for any support system are inferred from the primary care process of chronic patients, modeled with Business Process Management Notation. Numerous technical approaches are proposed to design a general architecture that considers the medical organizational requirements in the treatment of a patient. A framework is presented for any application in support of chronic patients and evaluated by a case study to test the applicability and pertinence of the solution. Copyright © 2018 Elsevier Inc. All rights reserved.
14 CFR § 1300.16 - Application process.
Code of Federal Regulations, 2014 CFR
2014-01-01
... aviation system in the United States and that credit is not reasonably available at the time of the...§ 1300.16 Aeronautics and Space AIR TRANSPORTATION SYSTEM STABILIZATION OFFICE OF MANAGEMENT AND BUDGET... applications to the Board any time after October 12, 2001 through June 28, 2002. All applications must be...
Selecting a Cable System Operator.
ERIC Educational Resources Information Center
Cable Television Information Center, Washington, DC.
Intended to assist franchising authorities with the process of selecting a cable television system operator from franchise applicants, this document provides a framework for analysis of individual applications. Section 1 deals with various methods which can be used to select an operator. The next section covers the application form, the vehicle a…
Ribesse, Nathalie; Bossyns, Paul; Marchal, Bruno; Karemere, Hermes; Burman, Christopher J; Macq, Jean
2017-03-01
In the field of development cooperation, interest in systems thinking and complex systems theories as a methodological approach is increasingly recognised. And so it is in health systems research, which informs health development aid interventions. However, practical applications remain scarce to date. The objective of this article is to contribute to the body of knowledge by presenting the tools inspired by systems thinking and complexity theories and methodological lessons learned from their application. These tools were used in a case study. Detailed results of this study are in process for publication in additional articles. Applying a complexity 'lens', the subject of the case study is the role of long-term international technical assistance in supporting health administration reform at the provincial level in the Democratic Republic of Congo. The Methods section presents the guiding principles of systems thinking and complex systems, their relevance and implication for the subject under study, and the existing tools associated with those theories which inspired us in the design of the data collection and analysis process. The tools and their application processes are presented in the results section, and followed in the discussion section by the critical analysis of their innovative potential and emergent challenges. The overall methodology provides a coherent whole, each tool bringing a different and complementary perspective on the system.
Commercial applications for optical data storage
NASA Astrophysics Data System (ADS)
Tas, Jeroen
1991-03-01
Optical data storage has spurred the market for document imaging systems. These systems are increasingly being used to electronically manage the processing, storage and retrieval of documents. Applications range from straightforward archives to sophisticated workflow management systems. The technology is developing rapidly and within a few years optical imaging facilities will be incorporated in most of the office information systems. This paper gives an overview of the status of the market, the applications and the trends of optical imaging systems.
ERIC Educational Resources Information Center
Mohammadi, Hadi
2014-01-01
Use of the Patch Vulnerability Management (PVM) process should be seriously considered for any networked computing system. The PVM process prevents the operating system (OS) and software applications from being attacked due to security vulnerabilities, which lead to system failures and critical data leakage. The purpose of this research is to…
Introducing new technologies into Space Station subsystems
NASA Technical Reports Server (NTRS)
Wiskerchen, Michael J.; Mollakarimi, Cindy L.
1989-01-01
A new systems engineering technology has been developed and applied to Shuttle processing. The new engineering approach emphasizes the identification, quantitative assessment, and management of system performance and risk related to the dynamic nature of requirements, technology, and operational concepts. The Space Shuttle Tile Automation System is described as an example of the first application of the new engineering technology. Lessons learned from the Shuttle processing experience are examined, and concepts are presented which are applicable to the design and development of the Space Station Freedom.
Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang
2016-10-01
Currently, near-infrared spectroscopy (NIRS) is considered an efficient tool for achieving process analytical technology (PAT) in the manufacture of traditional Chinese medicine (TCM) products. In this article, the NIRS-based process analytical system for the production of salvianolic acid for injection is introduced. The design of the process analytical system is described in detail, including the selection of monitored processes and testing mode, and potential risks that should be avoided. Moreover, the development of related technologies is also presented, including the establishment of monitoring methods for the elution steps of the polyamide resin and macroporous resin chromatography processes, as well as the rapid analysis method for finished products. Based on the authors' experience of research and work, several issues in the application of NIRS to process monitoring and control in TCM production are then raised, and some potential solutions are also discussed. The issues include building the technical team for the process analytical system, the design of the process analytical system in the manufacture of TCM products, standardization of the NIRS-based analytical methods, and improving the management of the process analytical system. Finally, the prospects for the application of NIRS in the TCM industry are put forward. Copyright© by the Chinese Pharmaceutical Association.
NASA Astrophysics Data System (ADS)
Jamaludin, A. S.; Hosokawa, A.; Furumoto, T.; Koyano, T.; Hashimoto, Y.
2018-03-01
The cutting of difficult-to-cut materials such as stainless steel generates excessive heat, which is one of the major causes of shortened tool life and lower surface finish quality. Applying cutting fluid during the cutting of difficult-to-cut materials is proven to improve cutting performance, but excessive application of cutting fluid leads to other problems, such as increased processing cost and hazardous environmental pollution of the workplace. In this study, an Extreme Cold Mist system is designed and tested, along with various Minimum Quantity Lubrication (MQL) systems, in the turning of stainless steel AISI 316. The Extreme Cold Mist system is found to reduce the cutting force by up to 60 N and to significantly improve the surface roughness of the machined surface.
NASA Technical Reports Server (NTRS)
Hyde, Patricia R.; Loftin, R. Bowen
1993-01-01
The volume 2 proceedings from the 1993 Conference on Intelligent Computer-Aided Training and Virtual Environment Technology are presented. Topics discussed include intelligent computer assisted training (ICAT) systems architectures, ICAT educational and medical applications, virtual environment (VE) training and assessment, human factors engineering and VE, ICAT theory and natural language processing, ICAT military applications, VE engineering applications, ICAT knowledge acquisition processes and applications, and ICAT aerospace applications.
Candidate thermal energy storage technologies for solar industrial process heat applications
NASA Technical Reports Server (NTRS)
Furman, E. R.
1979-01-01
A number of candidate thermal energy storage system elements were identified as having the potential for successful application to solar industrial process heat. These elements, which include storage media, containment, and heat exchange, are presented.
Fundamentals and applications of solar energy. Part 2
NASA Astrophysics Data System (ADS)
Faraq, I. H.; Melsheimer, S. S.
Applications of techniques of chemical engineering to the development of materials, production methods, and performance optimization and evaluation of solar energy systems are discussed. Solar thermal storage systems using phase change materials, liquid phase Diels-Alder reactions, aquifers, and hydrocarbon oil were examined. Solar electric systems were explored in terms of a chlorophyll solar cell, the nonequilibrium electric field effects developed at photoelectrode/electrolyte interfaces, and designs for commercial scale processing of solar cells using continuous thin-film coating production methods. Solar coal gasification processes were considered, along with multilayer absorber coatings for solar concentrator receivers, solar thermal industrial applications, the kinetics of anaerobic digestion of crop residues to produce methane, and a procedure for developing a computer simulation of a solar cooling system.
NASA Astrophysics Data System (ADS)
Cheng, D. L. C.; Quinn, J. D.; Larour, E. Y.; Halkides, D. J.
2017-12-01
The Virtual Earth System Laboratory (VESL) is a Web application, under continued development at the Jet Propulsion Laboratory and UC Irvine, for the visualization of Earth System data and process simulations. As with any project of its size, we have encountered both successes and challenges during the course of development. Our principal point of success is the fact that VESL users can interact seamlessly with our earth science simulations within their own Web browser. Some of the challenges we have faced include retrofitting the VESL Web application to respond to touch gestures, reducing page load time (especially as the application has grown), and accounting for the differences between the various Web browsers and computing platforms.
The embedded operating system project
NASA Technical Reports Server (NTRS)
Campbell, R. H.
1985-01-01
The design and construction of embedded operating systems for real-time advanced aerospace applications were investigated. The applications require reliable operating system support that must accommodate computer networks. Problems that arise in the construction of such operating systems, reconfiguration, consistency and recovery in a distributed system, and the issues of real-time processing are reported. A thesis that provides theoretical foundations for the use of atomic actions to support fault tolerance and data consistency in real-time object-based systems is included. The following items are addressed: (1) atomic actions and fault-tolerance issues; (2) operating system structure; (3) program development; (4) a reliable compiler for Path Pascal; and (5) mediators, a mechanism for scheduling distributed system processes.
ERIC Educational Resources Information Center
McDonald, Joseph
1986-01-01
Focusing on management decisions in academic libraries, this article compares management information systems (MIS) with decision support systems (DSS) and discusses the decision-making process, information needs of library managers, sources of data, reasons for choosing microcomputer, preprogrammed application software, prototyping a system, and…
Design and development of a medical big data processing system based on Hadoop.
Yao, Qin; Tian, Yu; Li, Peng-Fei; Tian, Li-Li; Qian, Yang-Ming; Li, Jing-Song
2015-03-01
Secondary use of medical big data is increasingly popular in healthcare services and clinical research. Understanding the logic behind medical big data demonstrates tendencies in hospital information technology and shows great significance for hospital information systems that are designing and expanding services. Big data has four characteristics--Volume, Variety, Velocity and Value (the 4 Vs)--that make traditional systems incapable of processing these data using standalones. Apache Hadoop MapReduce is a promising software framework for developing applications that process vast amounts of data in parallel with large clusters of commodity hardware in a reliable, fault-tolerant manner. With the Hadoop framework and MapReduce application program interface (API), we can more easily develop our own MapReduce applications to run on a Hadoop framework that can scale up from a single node to thousands of machines. This paper investigates a practical case of a Hadoop-based medical big data processing system. We developed this system to intelligently process medical big data and uncover some features of hospital information system user behaviors. This paper studies user behaviors regarding various data produced by different hospital information systems for daily work. In this paper, we also built a five-node Hadoop cluster to execute distributed MapReduce algorithms. Our distributed algorithms show promise in facilitating efficient data processing with medical big data in healthcare services and clinical research compared with single nodes. Additionally, with medical big data analytics, we can design our hospital information systems to be much more intelligent and easier to use by making personalized recommendations.
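The kind of distributed MapReduce job the paper describes can be sketched in a Hadoop Streaming style, where a mapper emits key-value pairs and a reducer aggregates them; here the keys are hypothetical hospital-system user names counted from synthetic log lines. In practice the two functions would live in separate scripts passed to the Hadoop Streaming jar rather than being driven in memory as below.

```python
from itertools import groupby

def mapper(lines):
    # Hypothetical log format: "timestamp,user,action"
    for line in lines:
        parts = line.strip().split(",")
        if len(parts) >= 2:
            yield parts[1], 1          # emit (user, 1) for every event

def reducer(pairs):
    # Hadoop sorts by key between the map and reduce phases; sorted() stands in here
    for user, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield user, sum(count for _, count in group)

logs = [
    "2015-01-01T08:00,alice,login",
    "2015-01-01T08:05,bob,query",
    "2015-01-01T08:07,alice,query",
]
print(dict(reducer(mapper(logs))))     # {'alice': 2, 'bob': 1}
```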
Information processing of earth resources data
NASA Technical Reports Server (NTRS)
Zobrist, A. L.; Bryant, N. A.
1982-01-01
Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.
Empirical modeling for intelligent, real-time manufacture control
NASA Technical Reports Server (NTRS)
Xu, Xiaoshu
1994-01-01
Artificial neural systems (ANS), also known as neural networks, are an attempt to develop computer systems that emulate the neural reasoning behavior of biological neural systems (e.g. the human brain). As such, they are loosely based on biological neural networks. The ANS consists of a series of nodes (neurons) and weighted connections (axons) that, when presented with a specific input pattern, can associate specific output patterns. It is essentially a highly complex, nonlinear, mathematical relationship or transform. These constructs have two significant properties that have proven useful to the authors in signal processing and process modeling: noise tolerance and complex pattern recognition. Specifically, the authors have developed a new network learning algorithm that has resulted in the successful application of ANS's to high speed signal processing and to developing models of highly complex processes. Two of the applications, the Weld Bead Geometry Control System and the Welding Penetration Monitoring System, are discussed in the body of this paper.
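As a generic illustration of the kind of process model an ANS provides, the sketch below trains a one-hidden-layer network with plain gradient descent on synthetic input-output data (e.g. two weld parameters mapped to a bead-geometry value). It uses ordinary backpropagation, not the new learning algorithm the authors developed, and all data and layer sizes are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))                  # synthetic process inputs
y = (np.sin(X[:, 0]) + 0.5 * X[:, 1]).reshape(-1, 1)   # synthetic process output

W1, b1 = rng.normal(0, 0.5, (2, 8)), np.zeros(8)       # hidden-layer weights
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)       # output-layer weights
lr = 0.05
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)          # hidden activations
    pred = h @ W2 + b2                # network output
    err = pred - y                    # prediction error
    # Backpropagate the mean-squared-error loss
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
print("final MSE:", float((err ** 2).mean()))
```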
Graphical Environment Tools for Application to Gamma-Ray Energy Tracking Arrays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Todd, Richard A.; Radford, David C.
2013-12-30
Highly segmented, position-sensitive germanium detector systems are being developed for nuclear physics research where traditional electronic signal processing with mixed analog and digital function blocks would be enormously complex and costly. Future systems will be constructed using pipelined processing of high-speed digitized signals as is done in the telecommunications industry. Techniques which provide rapid algorithm and system development for future systems are desirable. This project has used digital signal processing concepts and existing graphical system design tools to develop a set of re-usable modular functions and libraries targeted for the nuclear physics community. Researchers working with complex nuclear detector arrays such as the Gamma-Ray Energy Tracking Array (GRETA) have been able to construct advanced data processing algorithms for implementation in field programmable gate arrays (FPGAs) through application of these library functions using intuitive graphical interfaces.
Potential medical applications of TAE
NASA Technical Reports Server (NTRS)
Fahy, J. Ben; Kaucic, Robert; Kim, Yongmin
1986-01-01
In cooperation with scientists in the University of Washington Medical School, a microcomputer-based image processing system for quantitative microscopy, called DMD1 (Digital Microdensitometer 1) was constructed. In order to make DMD1 transportable to different hosts and image processors, we have been investigating the possibility of rewriting the lower level portions of DMD1 software using Transportable Applications Executive (TAE) libraries and subsystems. If successful, we hope to produce a newer version of DMD1, called DMD2, running on an IBM PC/AT under the SCO XENIX System 5 operating system, using any of seven target image processors available in our laboratory. Following this implementation, copies of the system will be transferred to other laboratories with biomedical imaging applications. By integrating those applications into DMD2, we hope to eventually expand our system into a low-cost general purpose biomedical imaging workstation. This workstation will be useful not only as a self-contained instrument for clinical or research applications, but also as part of a large scale Digital Imaging Network and Picture Archiving and Communication System, (DIN/PACS). Widespread application of these TAE-based image processing and analysis systems should facilitate software exchange and scientific cooperation not only within the medical community, but between the medical and remote sensing communities as well.
NASA Astrophysics Data System (ADS)
Aiken, John Charles
The development of a colour Spatial Light Modulator (SLM) and its application to optical information processing is described. Whilst monochrome technology has been established for many years, this is not the case for colour, where commercial systems are unavailable. A main aspect of this study is therefore how the use of colour can add an additional dimension to optical information processing. A well-established route to monochrome system development has been the use of (black and white) liquid crystal televisions (LCTV) as SLMs, providing useful performance at low cost. This study is based on the unique use of a colour display removed from a LCTV and operated as a colour SLM. A significant development has been the replacement of the original TV electronics operating the display with enhanced drive electronics specially developed for this application. Through a computer interface, colour images from a drawing package or video camera can now be readily displayed on the LCD as input to an optical system. A detailed evaluation of the colour LCD's optical properties indicates that the new drive electronics have considerably improved the operation of the display for use as a colour SLM. Applications are described employing the use of colour in Fourier plane filtering, image correlation and speckle metrology. The SLM (and optical system) developed demonstrates how the addition of colour has greatly enhanced its capabilities to implement principles of optical data processing conventionally performed monochromatically. The hybrid combination employed, combining colour optical data processing with electronic techniques, has resulted in a capable development system. Further development of the system using current colour LCDs, and the move towards a portable system, is considered in the study conclusion.
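The Fourier plane filtering mentioned above has a direct numerical analogue: transform the input scene, multiply by an aperture (the filter placed in the focal plane of a 4f processor), and transform back. The sketch below low-pass filters a synthetic monochrome test pattern; the colour case would simply repeat this per channel. All sizes and the cutoff radius are illustrative.

```python
import numpy as np

N = 256
y, x = np.mgrid[0:N, 0:N]
image = ((x // 16 + y // 16) % 2).astype(float)   # synthetic checkerboard input scene

F = np.fft.fftshift(np.fft.fft2(image))           # field in the Fourier (filter) plane
u, v = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
aperture = (u ** 2 + v ** 2) <= 20 ** 2           # circular low-pass filter, radius 20
output = np.abs(np.fft.ifft2(np.fft.ifftshift(F * aperture)))  # intensity at output plane
print(output.shape, round(output.min(), 3), round(output.max(), 3))
```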
Optical Information Processing for Aerospace Applications
NASA Technical Reports Server (NTRS)
1981-01-01
Current research in optical processing is reviewed. Its role in future aerospace systems is determined. The development of optical devices and components demonstrates that system concepts can be implemented in practical aerospace configurations.
Membrane separation for non-aqueous solution
NASA Astrophysics Data System (ADS)
Widodo, S.; Khoiruddin; Ariono, D.; Subagjo; Wenten, I. G.
2018-01-01
Membrane technology has been widely used in a number of applications, competing with conventional technologies in various ways. Despite these numerous applications, membranes are mainly used for aqueous systems. The use of membrane-based processes in non-aqueous systems is an emerging area, because the membranes developed so far are largely limited to separations involving aqueous solutions and show several drawbacks when implemented in non-aqueous systems. The purpose of this paper is to provide a review of current applications of membrane processes in non-aqueous solutions, such as mineral oil treatment, vegetable oil processing, and organic solvent recovery. Developments of advanced membrane materials for non-aqueous solutions, such as super-hydrophobic and organic-solvent-resistant membranes, are reviewed. In addition, challenges and the future outlook of membrane separation for non-aqueous solutions are discussed.
The Application of V&V within Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward
1996-01-01
Verification and Validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors as early as possible during the development process. Early discovery is important in order to minimize the cost and other impacts of correcting these errors. In reuse-based software engineering, decisions on the requirements, design and even implementation of domain assets can be made prior to beginning development of a specific system. In order to bring the effectiveness of V&V to bear within reuse-based software engineering, V&V must be incorporated within the domain engineering process.
Material Processing with High Power CO2-Lasers
NASA Astrophysics Data System (ADS)
Bakowsky, Lothar
1986-10-01
After a period of research and development, laser technology is now regarded as an important instrument for flexible, economic and fully automatic manufacturing. Cutting of flat metal sheets with high-power CO2 lasers and CNC-controlled two- or three-axis handling systems is an especially widespread application. Three-dimensional laser cutting, laser welding and laser heat treatment are just at the beginning of industrial use in production lines. The main advantages of laser technology are high accuracy, high processing velocity, low thermal distortion, and no tool abrasion. The market for laser material-processing systems had a volume of 300 million dollars in 1985, with growth rates between 20% and 30%. The topic of this lecture is high-power CO2 lasers. Besides these systems, two others are used as machining tools: Nd:YAG and excimer lasers. All applications of high-power CO2 lasers to industrial material processing show that high processing velocity and quality are only guaranteed in the case of a stable intensity profile on the workpiece. This is only achieved by laser systems without any power and mode fluctuations and by handling systems of high accuracy. Two applications in the automotive industry are described below as examples of laser cutting and laser welding of special cylindrical motor parts.
Military applications of automatic speech recognition and future requirements
NASA Technical Reports Server (NTRS)
Beek, Bruno; Cupples, Edward J.
1977-01-01
An updated summary of the state-of-the-art of automatic speech recognition and its relevance to military applications is provided. A number of potential systems for military applications are under development. These include: (1) digital narrowband communication systems; (2) automatic speech verification; (3) on-line cartographic processing unit; (4) word recognition for militarized tactical data system; and (5) voice recognition and synthesis for aircraft cockpit.
Paperless Procurement: The Impact of Advanced Automation
1992-09-01
... POPS = Paperless Order Processing System; RADMIS = Research and Development Management Information System; SAACONS = Standard Army Automated... order processing system, which then updates the contractor's production (or delivery) scheduling and contract accounting applications. In return, the... used by the DLA's POPS... into an EDI delivery order and pass it directly to the distributor's or manufacturer's order processing system.
Lunar Applications in Reconfigurable Computing
NASA Technical Reports Server (NTRS)
Somervill, Kevin
2008-01-01
NASA's Constellation Program is developing a lunar surface outpost in which reconfigurable computing will play a significant role. Reconfigurable systems provide a number of benefits over conventional software-based implementations, including performance and power efficiency, while the use of standardized reconfigurable hardware provides opportunities to reduce logistical overhead. The current vision for the lunar surface architecture includes habitation, mobility, and communications systems, each of which greatly benefits from reconfigurable hardware in applications including video processing, natural feature recognition, data formatting, IP offload processing, and embedded control systems. In deploying reprogrammable hardware, considerations similar to those of software systems must be managed. There needs to be a mechanism for discovery enabling applications to locate and utilize the available resources. Also, application interfaces are needed to provide for both configuring the resources as well as transferring data between the application and the reconfigurable hardware. Each of these topics is explored in the context of deploying reconfigurable resources as an integral aspect of the lunar exploration architecture.
NASA Astrophysics Data System (ADS)
McLaughlin, B. D.; Pawloski, A. W.
2015-12-01
Modern development practices require the ability to quickly and easily host an application. Small projects cannot afford to maintain a large staff for infrastructure maintenance. Rapid prototyping fosters innovation. However, maintaining the integrity of data and systems demands care, particularly in a government context. The extensive data holdings that make up much of the value of NASA's EOSDIS (Earth Observing System Data and Information System) are stored in a number of locations, across a wide variety of applications, ranging from small prototypes to large computationally-intensive operational processes. However, it is increasingly difficult for an application to implement the required security controls, perform required registrations and inventory entries, ensure logging, monitoring, patching, and then ensure that all these activities continue for the life of that application, let alone five, or ten, or fifty applications. This process often takes weeks or months to complete and requires expertise in a variety of different domains such as security, systems administration, development, etc. NGAP, the Next Generation Application Platform, is tackling this problem by investigating, automating, and resolving many of the repeatable policy hurdles that a typical application must overcome. This platform provides a relatively simple and straightforward process by which applications can commit source code to a repository and then deploy that source code to a cloud-based infrastructure, all while meeting NASA's policies for security, governance, inventory, reliability, and availability. While there is still work for the application owner for any application hosting, NGAP handles a significant portion of that work. This talk will discuss areas where we have made significant progress, areas that are complex or must remain human-intensive, and areas where we are still striving to improve this application deployment and hosting pipeline.
DIY soundcard based temperature logging system. Part II: applications
NASA Astrophysics Data System (ADS)
Nunn, John
2016-11-01
This paper demonstrates some simple applications of how temperature logging systems may be used to monitor simple heat experiments, and how the data obtained can be analysed to get some additional insight into the physical processes.
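One analysis the logged data naturally supports is fitting Newton's law of cooling, T(t) = T_env + (T0 - T_env) e^(-kt), to a cooling curve. The sketch below fits synthetic data standing in for a soundcard temperature record; the parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def cooling(t, T_env, T0, k):
    """Newton's law of cooling: T(t) = T_env + (T0 - T_env) * exp(-k * t)."""
    return T_env + (T0 - T_env) * np.exp(-k * t)

# Synthetic "logged" cooling curve with a little measurement noise
t = np.linspace(0, 600, 61)                              # seconds
T = cooling(t, 21.0, 80.0, 0.008) + np.random.normal(0, 0.2, t.size)

params, _ = curve_fit(cooling, t, T, p0=(20.0, 75.0, 0.01))
print("fitted T_env, T0, k:", np.round(params, 3))
```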
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garling, W.S.; Harper, M.R.; Merchant-Geuder, L.
1980-03-01
Potential applications of wind energy include not only large central turbines that can be utilized by utilities, but also dispersed systems for farms and other applications. The US Departments of Energy (DOE) and Agriculture (USDA) currently are establishing the feasibility of wind energy use in applications where the energy can be used as available, or stored in a simple form. These applications include production of hot water for rural sanitation, heating and cooling of rural structures and products, drying agricultural products, and irrigation. This study, funded by USDA, analyzed the economic feasibility of wind power in refrigeration cooling and water heating systems in food processing plants. Types of plants included were meat and poultry, dairy, fruit and vegetable, and aquaculture.
An open system approach to process reengineering in a healthcare operational environment.
Czuchry, A J; Yasin, M M; Norris, J
2000-01-01
The objective of this study is to examine the applicability of process reengineering in a healthcare operational environment. The intake process of a mental healthcare service delivery system is analyzed systematically to identify process-related problems. A methodology which utilizes an open system orientation coupled with process reengineering is utilized to overcome operational and patient related problems associated with the pre-reengineered intake process. The systematic redesign of the intake process resulted in performance improvements in terms of cost, quality, service and timing.
A coherent optical feedback system for optical information processing
NASA Technical Reports Server (NTRS)
Jablonowski, D. P.; Lee, S. H.
1975-01-01
A unique optical feedback system for coherent optical data processing is described. With the introduction of feedback, the well-known transfer function for feedback systems is obtained in two dimensions. Operational details of the optical feedback system are given. Experimental results of system applications in image restoration, contrast control and analog computation are presented.
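In the usual notation for such a system, with forward-path transfer function H(u,v) and feedback-path transfer function G(u,v) defined over the spatial-frequency coordinates (u,v), the two-dimensional closed-loop relation referred to above takes the standard form (the sign in the denominator depends on the feedback convention adopted):

```latex
\[
  \frac{O(u,v)}{I(u,v)} \;=\; \frac{H(u,v)}{1 + H(u,v)\,G(u,v)}
\]
```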
Administrator Training and Development: Conceptual Model.
ERIC Educational Resources Information Center
Boardman, Gerald R.
A conceptual model for an individualized training program for school administrators integrates processes, characteristics, and tasks through theory training and application. Based on an application of contingency theory, it provides a system matching up administrative candidates' needs in three areas (administrative process, administrative…
ERIC Educational Resources Information Center
Stevenson, R. D.
These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. This report describes concepts presented in another module called "The First Law of…
40 CFR 65.140 - Applicability.
Code of Federal Regulations, 2010 CFR
2010-07-01
... FEDERAL AIR RULE Closed Vent Systems, Control Devices, and Routing to a Fuel Gas System or a Process § 65..., shutdown, and malfunction provisions in § 65.6) apply to routing emissions to processes, fuel gas systems, closed vent systems, control devices, and recovery devices where another subpart expressly references the...
7 CFR 277.18 - Establishment of an Automated Data Processing (ADP) and Information Retrieval System.
Code of Federal Regulations, 2012 CFR
2012-01-01
...) and Information Retrieval System. 277.18 Section 277.18 Agriculture Regulations of the Department of... Data Processing (ADP) and Information Retrieval System. (a) Scope and application. This section... costs of planning, design, development or installation of ADP and information retrieval systems if the...
7 CFR 277.18 - Establishment of an Automated Data Processing (ADP) and Information Retrieval System.
Code of Federal Regulations, 2014 CFR
2014-01-01
...) and Information Retrieval System. 277.18 Section 277.18 Agriculture Regulations of the Department of... Data Processing (ADP) and Information Retrieval System. (a) Scope and application. This section... costs of planning, design, development or installation of ADP and information retrieval systems if the...
7 CFR 277.18 - Establishment of an Automated Data Processing (ADP) and Information Retrieval System.
Code of Federal Regulations, 2011 CFR
2011-01-01
...) and Information Retrieval System. 277.18 Section 277.18 Agriculture Regulations of the Department of... Data Processing (ADP) and Information Retrieval System. (a) Scope and application. This section... costs of planning, design, development or installation of ADP and information retrieval systems if the...
7 CFR 277.18 - Establishment of an Automated Data Processing (ADP) and Information Retrieval System.
Code of Federal Regulations, 2013 CFR
2013-01-01
...) and Information Retrieval System. 277.18 Section 277.18 Agriculture Regulations of the Department of... Data Processing (ADP) and Information Retrieval System. (a) Scope and application. This section... costs of planning, design, development or installation of ADP and information retrieval systems if the...
Adapting wave-front algorithms to efficiently utilize systems with deep communication hierarchies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerbyson, Darren J; Lang, Michael; Pakin, Scott
2009-01-01
Large-scale systems increasingly exhibit a differential between intra-chip and inter-chip communication performance. Processor-cores on the same socket are able to communicate at lower latencies, and with higher bandwidths, than cores on different sockets either within the same node or between nodes. A key challenge is to efficiently use this communication hierarchy and hence optimize performance. We consider here the class of applications that contain wave-front processing. In these applications data can only be processed after their upstream neighbors have been processed. Similar dependencies result between processors in which communication is required to pass boundary data downstream and whose cost is typically impacted by the slowest communication channel in use. In this work we develop a novel hierarchical wave-front approach that reduces the use of slower communications in the hierarchy but at the cost of additional computation and higher use of on-chip communications. This tradeoff is explored using a performance model and an implementation on the Petascale Roadrunner system demonstrates a 27% performance improvement at full system-scale on a kernel application. The approach is generally applicable to large-scale multi-core and accelerated systems where a differential in system communication performance exists.
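The dependency pattern described above can be illustrated with a serial 2-D wave-front sweep, where cell (i, j) can only be processed after its north and west neighbours. This is a generic illustration of the dependency order, not the hierarchical Roadrunner implementation.

```python
# Minimal sketch of wave-front processing on a 2-D grid: each cell depends on its
# upstream (north and west) neighbours, so cells are processed by anti-diagonal.
import numpy as np

def wavefront_sweep(nx, ny):
    grid = np.zeros((nx, ny))
    for d in range(nx + ny - 1):                 # anti-diagonals, in dependency order
        for i in range(max(0, d - ny + 1), min(nx, d + 1)):
            j = d - i
            north = grid[i - 1, j] if i > 0 else 0.0
            west = grid[i, j - 1] if j > 0 else 0.0
            grid[i, j] = north + west + 1.0      # stand-in for the real kernel
    return grid

print(wavefront_sweep(4, 4))
```

In a parallel setting the cells along each anti-diagonal are independent, and the cost of passing boundary data downstream is what the hierarchical approach in the paper tries to keep on the faster, on-chip links.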
A CMMI-based approach for medical software project life cycle study.
Chen, Jui-Jen; Su, Wu-Chen; Wang, Pei-Wen; Yen, Hung-Chi
2013-01-01
In terms of medical techniques, Taiwan has gained international recognition in recent years. However, the medical information system industry in Taiwan is still at a developing stage compared with the software industries in other nations. In addition, systematic development processes are indispensable elements of software development. They can help developers increase their productivity and efficiency and also avoid unnecessary risks arising during the development process. Thus, this paper presents an application of Light-Weight Capability Maturity Model Integration (LW-CMMI) to the Chang Gung Medical Research Project (CMRP) in the nuclear medicine field. This application was intended to integrate the user requirements, system design and testing of software development processes into a three-layer (Domain, Concept and Instance) model. The model is then expressed in structured Systems Modeling Language (SysML) diagrams, converting part of the manual effort necessary for project management maintenance into computational effort, for example (semi-)automatic delivery of traceability management. In this application, it supports establishing the artifacts of "requirement specification document", "project execution plan document", "system design document" and "system test document", and can deliver a prototype of a lightweight project management tool for the Nuclear Medicine software project. The results of this application can serve as a reference for other medical institutions in developing medical information systems and supporting project management to achieve the aim of patient safety.
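The (semi-)automatic traceability mentioned above can be pictured as directed links between requirement, design and test artifacts. The sketch below is a generic illustration with hypothetical artifact identifiers; it is not the CMRP tooling.

```python
# Minimal sketch of traceability links between requirements, design elements and tests.
# Artifact identifiers are hypothetical placeholders.
links = [
    ("REQ-01", "DES-07"), ("REQ-01", "DES-09"),   # requirement -> design
    ("DES-07", "TST-21"),                          # design -> test
]

def trace_forward(artifact, links):
    """Return everything reachable downstream of an artifact."""
    reached, frontier = set(), {artifact}
    while frontier:
        nxt = {dst for src, dst in links if src in frontier} - reached
        reached |= nxt
        frontier = nxt
    return reached

print(trace_forward("REQ-01", links))   # -> DES-07, DES-09, TST-21 (set order may vary)
```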
Verma, Arjun; Fratto, Brian E.; Privman, Vladimir; Katz, Evgeny
2016-01-01
We consider flow systems that have been utilized for small-scale biomolecular computing and digital signal processing in binary-operating biosensors. Signal measurement is optimized by designing a flow-reversal cuvette and analyzing the experimental data to theoretically extract the pulse shape, as well as reveal the level of noise it possesses. Noise reduction is then carried out numerically. We conclude that this can be accomplished physically via the addition of properly designed well-mixing flow-reversal cell(s) as an integral part of the flow system. This approach should enable improved networking capabilities and potentially not only digital but analog signal-processing in such systems. Possible applications in complex biocomputing networks and various sense-and-act systems are discussed. PMID:27399702
NASA Astrophysics Data System (ADS)
Volkov, L. V.; Larkin, A. I.
1994-04-01
Theoretical and experimental investigations are reported of the potential applications of quasi-cw partially coherent radiation in optical systems based on diffraction—interference principles. It is shown that the spectral characteristics of quasi-cw radiation influence the data-handling capabilities of a holographic correlator and of a partially coherent holographic system for data acquisition. Relevant experimental results are reported.
1990-09-01
Application of a Micro Computer-Based Management Information System to Improve the USAF Service Reporting Process
A Process-Centered Tool for Evaluating Patient Safety Performance and Guiding Strategic Improvement
2005-01-01
next patient safety steps in individual health care organizations. The low priority given to Category 3 (Focus on patients, other customers, and... presents a patient safety applicator tool for implementing and assessing patient safety systems in health care institutions. The applicator tool consists... the survey rounds. The study addressed three research questions: 1. What critical processes should be included in health care patient safety systems
Image processing for flight crew enhanced situation awareness
NASA Technical Reports Server (NTRS)
Roberts, Barry
1993-01-01
This presentation describes the image processing work that is being performed for the Enhanced Situational Awareness System (ESAS) application. Specifically, the presented work supports the Enhanced Vision System (EVS) component of ESAS.
A proven knowledge-based approach to prioritizing process information
NASA Technical Reports Server (NTRS)
Corsberg, Daniel R.
1991-01-01
Many space-related processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect is rapid analysis of the changing process information. During a disturbance, this task can overwhelm humans as well as computers. Humans deal with this by applying heuristics in determining significant information. A simple, knowledge-based approach to prioritizing information is described. The approach models those heuristics that humans would use in similar circumstances. The approach described has received two patents and was implemented in the Alarm Filtering System (AFS) at the Idaho National Engineering Laboratory (INEL). AFS was first developed for application in a nuclear reactor control room. It has since been used in chemical processing applications, where it has had a significant impact on control room environments. The approach uses knowledge-based heuristics to analyze data from process instrumentation and respond to that data according to knowledge encapsulated in objects and rules. While AFS cannot perform the complete diagnosis and control task, it has proven to be extremely effective at filtering and prioritizing information. AFS was used for over two years as a first level of analysis for human diagnosticians. Given the approach's proven track record in a wide variety of practical applications, it should be useful in both ground- and space-based systems.
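The heuristic filtering idea can be sketched as a set of rules that suppress or escalate alarms based on process context. The rules and alarm fields below are hypothetical and only illustrate the knowledge-based prioritization style described; they are not the AFS rule base.

```python
# Minimal sketch of rule-based alarm prioritization: each rule inspects an alarm and
# the current process context and may raise or lower its priority.
def rule_consequence(alarm, context, priority):
    # Heuristic: an alarm that is a known consequence of an active upstream alarm
    # carries less new information, so lower its priority.
    if alarm["tag"] in context.get("expected_consequences", set()):
        return priority - 2
    return priority

def rule_safety(alarm, context, priority):
    # Heuristic: alarms on safety-critical equipment are always escalated.
    if alarm["tag"] in context.get("safety_critical", set()):
        return priority + 3
    return priority

RULES = [rule_consequence, rule_safety]

def prioritize(alarms, context):
    scored = []
    for alarm in alarms:
        priority = alarm.get("base_priority", 0)
        for rule in RULES:
            priority = rule(alarm, context, priority)
        scored.append((priority, alarm["tag"]))
    return sorted(scored, reverse=True)

context = {"expected_consequences": {"FLOW-LO-102"}, "safety_critical": {"TEMP-HI-001"}}
alarms = [{"tag": "TEMP-HI-001", "base_priority": 2},
          {"tag": "FLOW-LO-102", "base_priority": 2}]
print(prioritize(alarms, context))
```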
Design of an FMCW radar baseband signal processing system for automotive application.
Lin, Jau-Jr; Li, Yuan-Ping; Hsu, Wei-Chiang; Lee, Ta-Sung
2016-01-01
For a typical FMCW automotive radar system, a new design of baseband signal processing architecture and algorithms is proposed to overcome the ghost targets and overlapping problems in the multi-target detection scenario. To satisfy the short measurement time constraint without increasing the RF front-end loading, a three-segment waveform with different slopes is utilized. By introducing a new pairing mechanism and a spatial filter design algorithm, the proposed detection architecture not only provides high accuracy and reliability, but also requires low pairing time and computational loading. This proposed baseband signal processing architecture and algorithms balance the performance and complexity, and are suitable to be implemented in a real automotive radar system. Field measurement results demonstrate that the proposed automotive radar signal processing system can perform well in a realistic application scenario.
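For a classic two-slope (triangular) FMCW waveform, range and radial velocity follow from the pair of beat frequencies, which is the kind of relation the pairing mechanism exploits. The sketch below shows only that textbook relation; the paper's three-segment waveform, pairing logic and spatial filter are not reproduced, and the sign convention is an assumption.

```python
# Minimal sketch: range and radial velocity from the up- and down-chirp beat frequencies
# of a triangular FMCW waveform (sign convention assumed).
C = 3.0e8          # m/s

def range_velocity(f_up, f_down, slope_hz_per_s, carrier_hz):
    f_range = 0.5 * (f_up + f_down)              # 2*S*R/c
    f_doppler = 0.5 * (f_down - f_up)            # 2*v*fc/c
    R = C * f_range / (2.0 * slope_hz_per_s)
    v = C * f_doppler / (2.0 * carrier_hz)
    return R, v

# Example: 77 GHz radar, 150 MHz swept in 1 ms -> slope = 1.5e11 Hz/s
print(range_velocity(f_up=34_600.0, f_down=65_400.0,
                     slope_hz_per_s=1.5e11, carrier_hz=77e9))   # ~ (50 m, 30 m/s)
```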
Applications of Multi-Agent Technology to Power Systems
NASA Astrophysics Data System (ADS)
Nagata, Takeshi
Currently, agents are the focus of intense interest in many sub-fields of computer science and artificial intelligence. Agents are being used in an increasingly wide variety of applications. Many important computing applications such as planning, process control, communication networks and concurrent systems will benefit from using a multi-agent system approach. A multi-agent system is a structure given by an environment together with a set of artificial agents capable of acting on this environment. Multi-agent models are oriented towards interactions, collaborative phenomena, and autonomy. This article presents the applications of multi-agent technology to power systems.
ERIC Educational Resources Information Center
Abildinova, Gulmira M.; Alzhanov, Aitugan K.; Ospanova, Nazira N.; Taybaldieva, Zhymatay; Baigojanova, Dametken S.; Pashovkin, Nikita O.
2016-01-01
Nowadays, when there is a need to introduce various innovations into the educational process, most efforts are aimed at simplifying the learning process. To that end, electronic textbooks, testing systems and other software is being developed. Most of them are intended to run on personal computers with limited mobility. Smart education is…
Knowledge-based control of an adaptive interface
NASA Technical Reports Server (NTRS)
Lachman, Roy
1989-01-01
The analysis, development strategy, and preliminary design for an intelligent, adaptive interface is reported. The design philosophy couples knowledge-based system technology with standard human factors approaches to interface development for computer workstations. An expert system has been designed to drive the interface for application software. The intelligent interface will be linked to application packages, one at a time, that are planned for multiple-application workstations aboard Space Station Freedom. Current requirements call for most Space Station activities to be conducted at the workstation consoles. One set of activities will consist of standard data management services (DMS). DMS software includes text processing, spreadsheets, data base management, etc. Text processing was selected for the first intelligent interface prototype because text-processing software can be developed initially as fully functional but limited with a small set of commands. The program's complexity then can be increased incrementally. The rule base includes a model of the operator's behavior and three types of instructions to the underlying application software. A conventional expert-system inference engine searches the data base for antecedents to rules and sends the consequents of fired rules as commands to the underlying software. Plans for putting the expert system on top of a second application, a database management system, will be carried out following behavioral research on the first application. The intelligent interface design is suitable for use with ground-based workstations now common in government, industrial, and educational organizations.
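The "search antecedents, fire consequents" cycle described here is classic forward chaining. A minimal generic sketch follows, with hypothetical facts and commands rather than the actual DMS rule base.

```python
# Minimal forward-chaining sketch: a rule fires when all of its antecedents are present
# in the fact base, and its consequent is sent as a command to the application.
rules = [
    ({"user_idle", "document_open"}, "autosave_document"),
    ({"novice_user", "command_error"}, "show_expanded_help"),
]

def infer(facts, rules):
    commands, fired = [], set()
    changed = True
    while changed:
        changed = False
        for i, (antecedents, consequent) in enumerate(rules):
            if i not in fired and antecedents <= facts:
                commands.append(consequent)       # would be sent to the application
                facts = facts | {consequent}      # consequents may enable later rules
                fired.add(i)
                changed = True
    return commands

print(infer({"novice_user", "command_error"}, rules))   # ['show_expanded_help']
```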
DIY Soundcard Based Temperature Logging System. Part II: Applications
ERIC Educational Resources Information Center
Nunn, John
2016-01-01
This paper demonstrates some simple applications of how temperature logging systems may be used to monitor simple heat experiments, and how the data obtained can be analysed to get some additional insight into the physical processes. [For "DIY Soundcard Based Temperature Logging System. Part I: Design," see EJ1114124.]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strait, R.S.; Wagner, E.E.
1994-07-01
The US Department of Energy (DOE) Office of Safeguards and Security initiated the DOE Integrated Security System / Electronic Transfer (DISS/ET) for the purpose of reducing the time required to process security clearance requests. DISS/ET will be an integrated system using electronic commerce technologies for the collection and processing of personnel security clearance data, and its transfer between DOE local security clearance offices, DOE Operations Offices, and the Office of Personnel Management. The system will use electronic forms to collect clearance applicant data. The forms data will be combined with electronic fingerprint images and packaged in a secure encrypted electronic mail envelope for transmission across the Internet. Information provided by the applicant will be authenticated using digital signatures. All processing will be done electronically.
NASA Technical Reports Server (NTRS)
Bonanne, Kevin H.
2011-01-01
Model-based Systems Engineering (MBSE) is an emerging methodology that can be leveraged to enhance many system development processes. MBSE allows for the centralization of an architecture description that would otherwise be stored in various locations and formats, thus simplifying communication among the project stakeholders, inducing commonality in representation, and expediting report generation. This paper outlines the MBSE approach taken to capture the processes of two different, but related, architectures by employing the Systems Modeling Language (SysML) as a standard for architecture description and the modeling tool MagicDraw. The overarching goal of this study was to demonstrate the effectiveness of MBSE as a means of capturing and designing a mission systems architecture. The first portion of the project focused on capturing the necessary system engineering activities that occur when designing, developing, and deploying a mission systems architecture for a space mission. The second part applies activities from the first to an application problem - the system engineering of the Orion Flight Test 1 (OFT-1) End-to-End Information System (EEIS). By modeling the activities required to create a space mission architecture and then implementing those activities in an application problem, the utility of MBSE as an approach to systems engineering can be demonstrated.
Industrial application of thermal image processing and thermal control
NASA Astrophysics Data System (ADS)
Kong, Lingxue
2001-09-01
Industrial application of infrared thermography is virtually boundless as it can be used in any situation where there are temperature differences. This technology has been particularly widely used in the automotive industry for process evaluation and system design. In this work, a thermal image processing technique will be introduced to quantitatively calculate the heat stored in a warm/hot object and, consequently, a thermal control system will be proposed to accurately and actively manage the thermal distribution within the object in accordance with the heat calculated from the thermal images.
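Estimating stored heat from a thermal image amounts to summing m·c·ΔT over the imaged region. The sketch below assumes an emissivity-corrected temperature map and uniform thickness, density and specific heat; these assumptions are mine, not the paper's.

```python
# Minimal sketch: heat stored in a warm object relative to ambient, estimated from a
# per-pixel temperature map as Q = sum(rho * c_p * V_pixel * (T - T_ambient)).
import numpy as np

def stored_heat(temp_map_c, t_ambient_c, pixel_area_m2, thickness_m, rho, c_p):
    delta_t = np.clip(temp_map_c - t_ambient_c, 0.0, None)    # only count warm pixels
    pixel_volume = pixel_area_m2 * thickness_m
    return float(np.sum(rho * c_p * pixel_volume * delta_t))  # joules

temp_map = 25.0 + 30.0 * np.random.rand(240, 320)   # stand-in for a thermal image (deg C)
q = stored_heat(temp_map, t_ambient_c=25.0, pixel_area_m2=1e-6, thickness_m=0.005,
                rho=2700.0, c_p=900.0)              # aluminium-like material properties
print(f"estimated stored heat: {q:.1f} J")
```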
Data exchange technology based on handshake protocol for industrial automation system
NASA Astrophysics Data System (ADS)
Astafiev, A. V.; Shardin, T. O.
2018-05-01
In this article, data exchange technology based on a handshake protocol for an industrial automation system is considered. The methods of organizing this technology in client-server applications are analyzed. The main threats to client-server applications that arise during information interaction between users are indicated. A comparative analysis of analogous systems was also carried out, as a result of which the most suitable option was chosen for further use. The basic schemes for the operation of the handshake protocol are shown, as well as the general scheme of the implemented application, which describes the entire process of interaction between the client and the server.
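An application-level handshake of the kind described can be sketched with standard TCP sockets: the client sends a HELLO with an identifier, the server acknowledges, and data exchange starts only after both sides confirm. The message format below is hypothetical and far simpler than the scheme in the article.

```python
# Minimal sketch of an application-level handshake over TCP (hypothetical message format).
import socket
import threading
import time

def server(port=5050):
    srv = socket.create_server(("127.0.0.1", port))
    conn, _ = srv.accept()
    with conn:
        if conn.recv(64) == b"HELLO client-01":
            conn.sendall(b"ACK")                    # handshake accepted
            print("server got payload:", conn.recv(1024))
    srv.close()

def client(port=5050):
    with socket.create_connection(("127.0.0.1", port)) as sock:
        sock.sendall(b"HELLO client-01")
        if sock.recv(16) == b"ACK":                 # only send data after the handshake
            sock.sendall(b"sensor=42")

t = threading.Thread(target=server)
t.start()
time.sleep(0.2)       # give the server a moment to start listening
client()
t.join()
```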
A USNRC perspective on the use of commercial off-the-shelf software (COTS) in advanced reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, J.C.
1997-12-01
The use of commercially available digital computer systems and components in safety-critical systems (nuclear power plant, military, and commercial applications) is increasing rapidly. While this paper focuses on the software aspects of the application, most of these comments are applicable to the hardware aspects as well. Commercial dedication (the process of assuring that a commercial grade item will perform its intended safety function) has demonstrated benefits in cost savings and a wide base of user experience; however, care must be taken to avoid difficulties with some aspects of the dedication process such as access to vendor development information, configuration management, long-term support, and system integration.
Fast Computation and Assessment Methods in Power System Analysis
NASA Astrophysics Data System (ADS)
Nagata, Masaki
Power system analysis is essential for efficient and reliable power system operation and control. Recently, online security assessment systems have become important, as more efficient use of power networks is increasingly required. In this article, fast power system analysis techniques such as contingency screening, parallel processing and intelligent systems application are briefly surveyed from the viewpoint of their application to online dynamic security assessment.
Signal processing system for electrotherapy applications
NASA Astrophysics Data System (ADS)
Płaza, Mirosław; Szcześniak, Zbigniew
2017-08-01
A signal processing system for electrotherapy applications is proposed in this paper. The system makes it possible to model the curve of threshold human sensitivity to current (Dalziel's curve) over the full medium frequency range (1 kHz to 100 kHz). Tests based on the proposed solution were conducted and their results were compared with those obtained according to the assumptions of the High Tone Power Therapy method and referred to optimum values. The proposed system has high dynamics and precision in mapping the curve of threshold human sensitivity to current and can be used in all methods where threshold curves are modelled.
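One simple way to model a threshold-versus-frequency curve over 1 kHz to 100 kHz is interpolation between tabulated points on logarithmic axes. The sample points below are purely hypothetical placeholders, not Dalziel's published values.

```python
# Minimal sketch: interpolate a current-threshold curve on log-log axes.
# The tabulated points are hypothetical placeholders, not Dalziel's data.
import numpy as np

freq_hz = np.array([1e3, 3e3, 1e4, 3e4, 1e5])
thresh_ma = np.array([1.0, 2.5, 7.0, 20.0, 60.0])

def threshold(f_hz):
    return 10 ** np.interp(np.log10(f_hz), np.log10(freq_hz), np.log10(thresh_ma))

print(threshold(5e3), threshold(5e4))   # interpolated thresholds in mA
```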
Implementation and evaluation of LMS mobile application: scele mobile based on user-centered design
NASA Astrophysics Data System (ADS)
Banimahendra, R. D.; Santoso, H. B.
2018-03-01
Mobile technology is now developing rapidly, creating demand for all activities, including learning, to be available on mobile devices; this motivates the implementation of a mobile application as a learning medium. This study describes the process of developing and evaluating the Moodle-based mobile Learning Management System (LMS) application called Student Centered e-Learning Environment (SCeLE). The study discusses the process of defining features, implementing those features in the application, and evaluating the application. We defined the features using user research and a literature study, implemented the application on a user-centered design basis, and in the last phase evaluated the application using usability testing and the System Usability Scale (SUS). The purpose of this study is to determine the extent to which the application helps users do their tasks, and to provide recommendations for further research and development.
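The SUS evaluation mentioned above follows the standard scoring rule: odd-numbered item scores contribute (score - 1), even-numbered items contribute (5 - score), and the sum is multiplied by 2.5 to give a 0-100 score. A short sketch with made-up responses:

```python
# Minimal sketch of System Usability Scale (SUS) scoring for one respondent.
# Responses are on a 1-5 Likert scale for the 10 standard SUS items.
def sus_score(responses):
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)   # odd items positive, even negative
    return total * 2.5

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))   # made-up responses -> 85.0
```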
NASA Astrophysics Data System (ADS)
Levchenko, N. G.; Glushkov, S. V.; Sobolevskaya, E. Yu; Orlov, A. P.
2018-05-01
A method of modeling the transport and logistics process using fuzzy neural network technologies is considered. Analysis of the implemented fuzzy neural network model of the information management system for transnational multimodal transportation showed the expediency of applying this method to the management of transport and logistics processes in Arctic and Subarctic conditions. The modular architecture of this model can be expanded by incorporating additional modules, since working conditions in the Arctic and Subarctic will continue to present increasingly realistic tasks. The architecture allows the information management system to be extended without affecting the existing system or the method itself. The model has a wide range of application possibilities, including: analysis of the situation and behavior of interacting elements; dynamic monitoring and diagnostics of management processes; simulation of real events and processes; prediction and prevention of critical situations.
Third Conference on Artificial Intelligence for Space Applications, part 1
NASA Technical Reports Server (NTRS)
Denton, Judith S. (Compiler); Freeman, Michael S. (Compiler); Vereen, Mary (Compiler)
1987-01-01
The application of artificial intelligence to spacecraft and aerospace systems is discussed. Expert systems, robotics, space station automation, fault diagnostics, parallel processing, knowledge representation, scheduling, man-machine interfaces and neural nets are among the topics discussed.
Framework for Development of Object-Oriented Software
NASA Technical Reports Server (NTRS)
Perez-Poveda, Gus; Ciavarella, Tony; Nieten, Dan
2004-01-01
The Real-Time Control (RTC) Application Framework is a high-level software framework written in C++ that supports the rapid design and implementation of object-oriented application programs. This framework provides built-in functionality that solves common software development problems within distributed client-server, multi-threaded, and embedded programming environments. When using the RTC Framework to develop software for a specific domain, designers and implementers can focus entirely on the details of the domain-specific software rather than on creating custom solutions, utilities, and frameworks for the complexities of the programming environment. The RTC Framework was originally developed as part of a Space Shuttle Launch Processing System (LPS) replacement project called Checkout and Launch Control System (CLCS). As a result of the framework's development, CLCS software development time was reduced by 66 percent. The framework is generic enough for developing applications outside of the launch-processing system domain. Other applicable high-level domains include command and control systems and simulation/training systems.
Micro Thermal and Chemical Systems for In Situ Resource Utilization on Mars
NASA Technical Reports Server (NTRS)
Wegeng, Robert S.; Sanders, Gerald
2000-01-01
Robotic sample return missions and postulated human missions to Mars can be greatly aided through the development and utilization of compact chemical processing systems that process atmospheric gases and other indigenous resources to produce hydrocarbon propellants/fuels, oxygen, and other needed chemicals. When used to reduce earth launch mass, substantial cost savings can result. Process Intensification and Process Miniaturization can simultaneously be achieved through the application of microfabricated chemical process systems, based on the rapid heat and mass transport in engineered microchannels. Researchers at NASA's Johnson Space Center (JSC) and the Department of Energy's Pacific Northwest National Laboratory (PNNL) are collaboratively developing micro thermal and chemical systems for NASA's Mission to Mars program. Preliminary results show that many standard chemical process components (e.g., heat exchangers, chemical reactors and chemical separations units) can be reduced in hardware volume without a corresponding reduction in chemical production rates. Low pressure drops are also achievable when appropriate scaling rules are applied. This paper will discuss current progress in the development of engineered microchemical systems for space and terrestrial applications, including fabrication methods, expected operating characteristics, and specific experimental results.
Food metabolomics: from farm to human.
Kim, Sooah; Kim, Jungyeon; Yun, Eun Ju; Kim, Kyoung Heon
2016-02-01
Metabolomics, one of the latest components in the suite of systems biology, has been used to understand the metabolism and physiology of living systems, including microorganisms, plants, animals and humans. Food metabolomics can be defined as the application of metabolomics in food systems, including food resources, food processing and diet for humans. The study of food metabolomics has increased gradually in the recent years, because food systems are directly related to nutrition and human health. This review describes the recent trends and applications of metabolomics to food systems, from farm to human, including food resource production, industrial food processing and food intake by humans. Copyright © 2015 Elsevier Ltd. All rights reserved.
77 FR 42363 - Notice of Delays in Processing of Special Permits Applications
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-18
... publishing the following list of special permit applications that have been in process for 180 days or more, including an application from Austin Powder Company (Cleveland, OH) and application 13548-P from Interstate Battery System of The ...
Real-time hyperspectral imaging for food safety applications
USDA-ARS?s Scientific Manuscript database
Multispectral imaging systems with selected bands can commonly be used for real-time applications of food processing. Recent research has demonstrated that several image processing methods, including binning, a noise removal filter, and appropriate morphological analysis in real-time mode, can remove most fa...
Kumwenda, Ben; Dowell, Jon; Husbands, Adrian
2013-07-01
The assessment of non-academic achievements through the personal statement remains part of the selection process at most UK medical and dental schools. Such statements offer applicants an opportunity to highlight their non-academic achievements, but the highly competitive nature of the process may tempt them to exaggerate their accomplishments. The challenge is that selectors cannot discern applicants' exaggerated claims from genuine accounts and the system risks preferentially selecting dishonest applicants. The aims were to explore the level and perception of deception on UCAS personal statements among applicants to medical and dental schools, and to investigate the association between attitudes towards deception and various other demographic variables and cognitive ability via the UKCAT. An online survey was completed with first year students from six UK medical schools and one dental school. Questionnaire items were classified into three categories involving individual acts, how respondents suspect their peers behave, and overall perceptions of personal statements as an influence on the selection process. Descriptive statistics were used to investigate responses to questionnaire items. t-Tests were used to investigate the relationship between items, demographic variables and cognitive ability. Candidates recognized that putting fraudulent information or exaggerating one's experience on a UCAS personal statement was dishonest; however, there is a widespread belief that their peers do it. Female respondents and those with a higher UKCAT score were more likely to condemn deceptive practices. The existing selection process is open to abuse and may benefit dishonest applicants. Admission systems should consider investing in systems that can pursue traceable information that applicants provide, and nullify the application should it contain fraudulent information.
ERIC Educational Resources Information Center
Silver, Wayne
A description of the communication behaviors in high innovation societies depends on the application of selected principles from modern systems theory. The first is the principle of equifinality which explains the activities of open systems. If the researcher views society as an open system, he frees himself from the client approach since society…
Electronic processing and control system with programmable hardware
NASA Technical Reports Server (NTRS)
Alkalaj, Leon (Inventor); Fang, Wai-Chi (Inventor); Newell, Michael A. (Inventor)
1998-01-01
A computer system with reprogrammable hardware allows dynamic allocation of hardware resources for different functions and adaptability to different processors and different operating platforms. All hardware resources are physically partitioned into system-user hardware and application-user hardware depending on the specific operation requirements. A reprogrammable interface preferably interconnects the system-user hardware and application-user hardware.
NASA Technical Reports Server (NTRS)
Aaronson, A. C.; Buelow, K.; David, F. C.; Packard, R. L.; Ravet, F. W. (Principal Investigator)
1979-01-01
The latest satellite and computer processing and analysis technologies were tested and evaluated in terms of their application feasibility. Technologies evaluated include those developed, tested, and evaluated by the LACIE, as well as candidate technologies developed by the research community and private industry. The implementation of the applications test system and the technology transfer experience between the LACIE and the applications test system is discussed highlighting the approach, the achievements, and the shortcomings.
LIU, Tongzhu; SHEN, Aizong; HU, Xiaojian; TONG, Guixian; GU, Wei
2017-01-01
Background: We aimed to apply a collaborative business intelligence (BI) system to the hospital supply, processing and distribution (SPD) logistics management model. Methods: We searched the Engineering Village database, China National Knowledge Infrastructure (CNKI) and Google for articles (published from 2011 to 2016), books, Web pages, etc., to understand SPD and BI related theories and recent research status. For the application of collaborative BI technology in the hospital SPD logistics management model, we leveraged data mining techniques to discover knowledge from complex data and collaborative techniques to improve the theories of business process. Results: For the application of the BI system, we: (i) proposed a layered structure of a collaborative BI system for intelligent management in hospital logistics; (ii) built a data warehouse for the collaborative BI system; (iii) improved data mining techniques such as support vector machines (SVM) and the swarm intelligence firefly algorithm to solve key problems in the hospital logistics collaborative BI system; (iv) researched collaborative techniques oriented to data and business process optimization to improve the business processes of hospital logistics management. Conclusion: Proper combination of the SPD model and the BI system will improve the management of logistics in hospitals. The successful implementation of the study requires: (i) innovating and improving the traditional SPD model and making appropriate implementation plans and schedules for the application of the BI system according to the actual situations of hospitals; (ii) the collaborative participation of internal hospital departments, including information, logistics, nursing, medical and financial departments; (iii) timely response of external suppliers. PMID:28828316
Camera systems in human motion analysis for biomedical applications
NASA Astrophysics Data System (ADS)
Chin, Lim Chee; Basah, Shafriza Nisha; Yaacob, Sazali; Juan, Yeap Ewe; Kadir, Aida Khairunnisaa Ab.
2015-05-01
Human Motion Analysis (HMA) system has been one of the major interests among researchers in the field of computer vision, artificial intelligence and biomedical engineering and sciences. This is due to its wide and promising biomedical applications, namely, bio-instrumentation for human computer interfacing and surveillance system for monitoring human behaviour as well as analysis of biomedical signal and image processing for diagnosis and rehabilitation applications. This paper provides an extensive review of the camera system of HMA, its taxonomy, including camera types, camera calibration and camera configuration. The review focused on evaluating the camera system consideration of the HMA system specifically for biomedical applications. This review is important as it provides guidelines and recommendation for researchers and practitioners in selecting a camera system of the HMA system for biomedical applications.
Network acceleration techniques
NASA Technical Reports Server (NTRS)
Crowley, Patricia (Inventor); Maccabe, Arthur Barney (Inventor); Awrach, James Michael (Inventor)
2012-01-01
Splintered offloading techniques with receive batch processing are described for network acceleration. Such techniques offload specific functionality to a NIC while maintaining the bulk of the protocol processing in the host operating system ("OS"). The resulting protocol implementation allows the application to bypass the protocol processing of the received data. This can be accomplished by moving data from the NIC directly to the application through direct memory access ("DMA") and batch processing the receive headers in the host OS when the host OS is interrupted to perform other work. Batch processing receive headers allows the data path to be separated from the control path. Unlike operating system bypass, however, the operating system still fully manages the network resource and has relevant feedback about traffic and flows. Embodiments of the present disclosure can therefore address the challenges of networks with extreme bandwidth delay products (BWDP).
Frenkel, Val S; Cummings, Gregg A; Maillacheruvu, K Y; Tang, Walter Z
2015-10-01
Literature published in 2014 and early 2015 related to food processing wastes treatment for industrial applications is reviewed. This review is a subsection of the Treatment Systems section of the annual Water Environment Federation literature review and covers the following food processing industries and applications: general, meat and poultry, fruits and vegetables, dairy and beverage, and miscellaneous treatment of food wastes.
Frenkel, Val S; Cummings, Gregg A; Maillacheruvu, K Y; Tang, Walter Z
2017-10-01
Literature published in 2016 and early 2017 related to food processing wastes treatment for industrial applications is reviewed. This review is a subsection of the Treatment Systems section of the annual Water Environment Federation literature review and covers the following food processing industries and applications: general, meat and poultry, fruits and vegetables, dairy and beverage, and miscellaneous treatment of food wastes.
Frenkel, Val S; Cummings, Gregg A; Maillacheruvu, K Y; Tang, Walter Z
2016-10-01
Literature published in 2015 and early 2016 related to food processing wastes treatment for industrial applications is reviewed. This review is a subsection of the Treatment Systems section of the annual Water Environment Federation literature review and covers the following food processing industries and applications: general, meat and poultry, fruits and vegetables, dairy and beverage, and miscellaneous treatment of food wastes.
NASA Technical Reports Server (NTRS)
Tompkins, F. G.
1984-01-01
The Office of Management and Budget (OMB) Circular A-71, transmittal Memorandum No. 1, requires that each agency establish a management control process to assure that appropriate administrative, physical and technical safeguards are incorporated into all new computer applications. In addition to security specifications, the management control process should assure that the safeguards are adequate for the application. The security activities that should be integral to the system development process are examined. The software quality assurance process to assure that adequate and appropriate controls are incorporated into sensitive applications is also examined. Security for software packages is also discussed.
NASA Astrophysics Data System (ADS)
Evtushenko, V. F.; Myshlyaev, L. P.; Makarov, G. V.; Ivushkin, K. A.; Burkova, E. V.
2016-10-01
The structure of multi-variant physical and mathematical models of a control system is presented, together with its application to the adjustment of the automatic control system (ACS) of production facilities, using a coal processing plant as an example.
A systematic collaborative process for assessing launch vehicle propulsion technologies
NASA Astrophysics Data System (ADS)
Odom, Pat R.
1999-01-01
A systematic, collaborative process for prioritizing candidate investments in space transportation systems technologies has been developed for the NASA Space Transportation Programs Office. The purpose of the process is to provide a repeatable and auditable basis for selecting technology investments to enable achievement of NASA's strategic space transportation objectives. The paper describes the current multilevel process and supporting software tool that has been developed. Technologies are prioritized across system applications to produce integrated portfolios for recommended funding. An example application of the process to the assessment of launch vehicle propulsion technologies is described and illustrated. The methodologies discussed in the paper are expected to help NASA and industry ensure maximum returns from technology investments under constrained budgets.
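Prioritizing technologies across system applications typically reduces to a weighted scoring of each candidate against strategic criteria. The criteria, weights and candidates below are hypothetical and only illustrate the portfolio-ranking idea, not the NASA tool itself.

```python
# Minimal sketch of weighted scoring for technology investment candidates.
# Criteria, weights and scores are hypothetical.
weights = {"performance_payoff": 0.4, "applicability_breadth": 0.3, "maturity_risk": 0.3}

candidates = {
    "advanced turbopump": {"performance_payoff": 8, "applicability_breadth": 6, "maturity_risk": 5},
    "composite tankage":  {"performance_payoff": 6, "applicability_breadth": 8, "maturity_risk": 7},
    "aerospike nozzle":   {"performance_payoff": 9, "applicability_breadth": 4, "maturity_risk": 3},
}

def rank(candidates, weights):
    scored = {name: sum(weights[c] * s for c, s in scores.items())
              for name, scores in candidates.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

for name, score in rank(candidates, weights):
    print(f"{score:5.2f}  {name}")
```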
Wang, Monan; Zhang, Kai; Yang, Ning
2018-04-09
To help doctors decide on treatment from the standpoint of mechanical analysis, this work built a computer-assisted optimization system for the treatment of femoral neck fracture, oriented to clinical application. The whole system encompassed three parts: the preprocessing module, the finite element mechanical analysis module, and the post-processing module. The preprocessing module included parametric modeling of bone, parametric modeling of the fracture face, parametric modeling of the fixation screws and fixation positions, and input and transmission of model parameters. The finite element mechanical analysis module included grid division, element type setting, material property setting, contact setting, constraint and load setting, analysis method setting and batch processing operation. The post-processing module included extraction and display of batch processing results, image generation for batch processing, optimization program operation and display of the optimization results. The system implemented the whole workflow from input of fracture parameters to output of the optimal fixation plan according to a specific patient's real fracture parameters and the optimization rules, which demonstrated the effectiveness of the system. Meanwhile, the system had a friendly interface and simple operation, and its functionality could be improved quickly by modifying a single module.
Advanced Life Support Systems: Opportunities for Technology Transfer
NASA Technical Reports Server (NTRS)
Fields, B.; Henninger, D.; Ming, D.; Verostko, C. E.
1994-01-01
NASA's future missions to explore the solar system will be of long-duration possibly lasting years at a time. Human life support systems will have to operate with very high reliability for these long periods with essentially no resupply from Earth. Such life support systems will make extensive use of higher plants, microorganisms, and physicochemical processes for recycling air and water, processing wastes, and producing food. Development of regenerative life support systems will be a pivotal capability for NASA's future human missions. A fully functional closed loop human life support system currently does not exist and thus represents a major technical challenge for space exploration. Technologies where all life support consumables are recycled have many potential terrestrial applications as well. Potential applications include providing human habitation in hostile environments such as the polar regions or the desert in such a way as to minimize energy expenditures and to minimize negative impacts on those often ecologically-sensitive areas. Other potential applications include production of food and ornamental crops without damaging the environment from fertilizers that contaminate water supplies; removal of trace gas contaminants from tightly sealed, energy-efficient buildings (the so-called sick building syndrome); and even the potential of gaining insight into the dynamics of the Earth's biosphere such that we can better manage our global environment. Two specific advanced life support technologies being developed by NASA, with potential terrestrial application, are the zeoponic plant growth system and the Hybrid Regenerative Water Recovery System (HRWRS). The potential applications for these candidate dual use technologies are quite different as are the mechanisms for transfer. In the case of zeoponics, a variety of commercial applications has been suggested which represent potentially lucrative markets. Also, the patented nature of this product offers opportunities for licensing to commercial entities. In the case of the HRWRS, commercial markets with broad applications have not been identified but some terrestrial applications are being explored where this approach has advantages over other methods of waste water processing. Although these potential applications do not appear to have the same broad attraction from the standpoint of rapid commercialization, they represent niches where commercialization possibilities as well as social benefits could be realized.
Use of Precious Metal-Modified Nickel-Base Superalloys for Thin Gage Applications (Preprint)
2011-04-01
... superalloys are being investigated for use in thin gage applications, such as thermal protection systems or heat exchangers, due to their strength and inherent oxidation resistance ... (atomic % total) in place of the platinum and iridium. Subject terms: thermal protection systems, nickel, superalloy, thermomechanical processing.
Scalable Photogrammetric Motion Capture System "mosca": Development and Application
NASA Astrophysics Data System (ADS)
Knyaz, V. A.
2015-05-01
A wide variety of applications (from industrial to entertainment) needs reliable and accurate 3D information about the motion of an object and its parts. Very often the movement is rather fast, as in cases of vehicle movement, sports biomechanics, or animation of cartoon characters. Motion capture systems based on different physical principles are used for these purposes. Vision-based systems have great potential for obtaining high accuracy and a high degree of automation due to progress in image processing and analysis. A scalable, inexpensive motion capture system has been developed as a convenient and flexible tool for solving various tasks requiring 3D motion analysis. It is based on photogrammetric techniques of 3D measurement and provides high-speed image acquisition, high accuracy of 3D measurements and highly automated processing of captured data. Depending on the application, the system can easily be modified for different working areas from 100 mm to 10 m. The developed motion capture system uses from 2 to 4 machine vision cameras to acquire video sequences of object motion. All cameras work in synchronized mode at frame rates up to 100 frames per second under the control of a personal computer, providing the possibility of accurate calculation of the 3D coordinates of interest points. The system has been used in a number of application fields and has demonstrated high accuracy and a high level of automation.
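The 3D coordinates of interest points are obtained from synchronized views by triangulation. A minimal linear (DLT) triangulation sketch for two calibrated cameras is shown below, with made-up projection matrices; the actual mosca processing chain is of course more elaborate.

```python
# Minimal sketch of linear (DLT) triangulation of one point from two calibrated cameras.
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """P1, P2: 3x4 projection matrices; uv1, uv2: pixel coordinates of the same point."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]                      # homogeneous -> Euclidean

# Made-up cameras: identity camera and one translated 1 unit along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.2, 0.1, 4.0, 1.0])
uv1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
uv2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate(P1, P2, uv1, uv2))         # ~[0.2, 0.1, 4.0]
```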
Janakiraman, Vijay; Kwiatkowski, Chris; Kshirsagar, Rashmi; Ryll, Thomas; Huang, Yao-Ming
2015-01-01
High-throughput systems and processes have typically been targeted for process development and optimization in the bioprocessing industry. For process characterization, bench scale bioreactors have been the system of choice. Due to the need for performing different process conditions for multiple process parameters, the process characterization studies typically span several months and are considered time and resource intensive. In this study, we have shown the application of a high-throughput mini-bioreactor system viz. the Advanced Microscale Bioreactor (ambr15(TM) ), to perform process characterization in less than a month and develop an input control strategy. As a pre-requisite to process characterization, a scale-down model was first developed in the ambr system (15 mL) using statistical multivariate analysis techniques that showed comparability with both manufacturing scale (15,000 L) and bench scale (5 L). Volumetric sparge rates were matched between ambr and manufacturing scale, and the ambr process matched the pCO2 profiles as well as several other process and product quality parameters. The scale-down model was used to perform the process characterization DoE study and product quality results were generated. Upon comparison with DoE data from the bench scale bioreactors, similar effects of process parameters on process yield and product quality were identified between the two systems. We used the ambr data for setting action limits for the critical controlled parameters (CCPs), which were comparable to those from bench scale bioreactor data. In other words, the current work shows that the ambr15(TM) system is capable of replacing the bench scale bioreactor system for routine process development and process characterization. © 2015 American Institute of Chemical Engineers.
Flexible control techniques for a lunar base
NASA Technical Reports Server (NTRS)
Kraus, Thomas W.
1992-01-01
The fundamental elements found in every terrestrial control system can be employed in all lunar applications. These elements include sensors which measure physical properties, controllers which acquire sensor data and calculate a control response, and actuators which apply the control output to the process. The unique characteristics of the lunar environment will certainly require the development of new control system technology. However, weightlessness, harsh atmospheric conditions, temperature extremes, and radiation hazards will most significantly impact the design of sensors and actuators. The controller and associated control algorithms, which are the most complex element of any control system, can be derived in their entirety from existing technology. Lunar process control applications -- ranging from small-scale research projects to full-scale processing plants -- will benefit greatly from the controller advances being developed today. In particular, new software technology aimed at commercial process monitoring and control applications will almost completely eliminate the need for custom programs and the lengthy development and testing cycle they require. The applicability of existing industrial software to lunar applications has other significant advantages in addition to cost and quality. This software is designed to run on standard hardware platforms and takes advantage of existing LAN and telecommunications technology. Further, in order to exploit the existing commercial market, the software is being designed to be implemented by users of all skill levels -- typically users who are familiar with their process, but not necessarily with software or control theory. This means that specialized technical support personnel will not need to be on-hand, and the associated costs are eliminated. Finally, the latest industrial software designed for the commercial market is extremely flexible, in order to fit the requirements of many types of processing applications with little or no customization. This means that lunar process control projects will not be delayed by unforeseen problems or last minute process modifications. The software will include all of the tools needed to adapt to virtually any changes. In contrast to other space programs which required the development of tremendous amounts of custom software, lunar-based processing facilities will benefit from the use of existing software technology which is being proven in commercial applications on Earth.
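The sensor-controller-actuator loop described above can be sketched as a standard discrete control iteration; the simple proportional-integral law and first-order process below are illustrative only and are unrelated to any specific lunar design.

```python
# Minimal sketch of a sensor -> controller -> actuator loop with a PI control law
# acting on a simple first-order process model (illustrative only).
def run_loop(setpoint=100.0, steps=50, dt=1.0, kp=0.5, ki=0.05):
    temperature, integral = 20.0, 0.0          # process state (e.g., deg C)
    for _ in range(steps):
        measurement = temperature               # sensor reading
        error = setpoint - measurement
        integral += error * dt
        heater_power = kp * error + ki * integral          # controller output
        heater_power = max(0.0, min(heater_power, 100.0))  # actuator limits
        # first-order process response to the actuator
        temperature += dt * (0.8 * heater_power - 0.1 * (temperature - 20.0))
    return temperature

print(f"final temperature: {run_loop():.1f}")
```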
Natural Inspired Intelligent Visual Computing and Its Application to Viticulture.
Ang, Li Minn; Seng, Kah Phooi; Ge, Feng Lu
2017-05-23
This paper presents an investigation of natural inspired intelligent computing and its corresponding application towards visual information processing systems for viticulture. The paper has three contributions: (1) a review of visual information processing applications for viticulture; (2) the development of natural inspired computing algorithms based on artificial immune system (AIS) techniques for grape berry detection; and (3) the application of the developed algorithms towards real-world grape berry images captured in natural conditions from vineyards in Australia. The AIS algorithms in (2) were developed based on a nature-inspired clonal selection algorithm (CSA) which is able to detect the arcs in the berry images with precision, based on a fitness model. The arcs detected are then extended to perform the multiple arcs and ring detectors information processing for the berry detection application. The performance of the developed algorithms was compared with traditional image processing algorithms like the circular Hough transform (CHT) and other well-known circle detection methods. The proposed AIS approach gave an F-score of 0.71 compared with F-scores of 0.28 and 0.30 for the CHT and a parameter-free circle detection technique (RPCD) respectively.
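The clonal selection principle used here can be illustrated in a few lines: better antibodies are cloned more and mutated less, and the best survivors form the next repertoire. The toy fitness function below (maximizing a 1-D function) stands in for the paper's arc-fitness model.

```python
# Minimal sketch of a clonal selection algorithm (CLONALG-style) on a toy 1-D problem;
# a real application would replace `fitness` with an arc-fitness model over the image.
import random

def fitness(x):
    return -(x - 3.0) ** 2            # toy objective, maximum at x = 3

def clonal_selection(pop_size=20, generations=50, clones_per_best=5):
    population = [random.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        clones = []
        for rank, antibody in enumerate(ranked[:pop_size // 2]):
            mutation_scale = 0.1 * (rank + 1)          # worse rank -> larger mutation
            for _ in range(clones_per_best):
                clones.append(antibody + random.gauss(0.0, mutation_scale))
        # keep the best of parents and clones, then inject a few random newcomers
        population = sorted(population + clones, key=fitness, reverse=True)[:pop_size - 2]
        population += [random.uniform(-10, 10) for _ in range(2)]
    return max(population, key=fitness)

print(clonal_selection())   # converges close to 3.0
```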
Application of grounded theory to content definition: a case study.
Audiss, D; Roth, T
1999-02-01
Successful implementation of a clinical information system requires clinician involvement throughout the process of content definition and system development to ensure acceptance of the automated care process. In these times of downsizing, however, clinicians are not always able to participate fully in the content definition phase of system development and often become frustrated with their inability to obtain the patient information they need from the system. The qualitative research principles of grounded theory afford clinicians the opportunity to participate in content definition for information systems. This article presents a case study of the application of grounded theory to develop systematically the content definition for a clinical information system in preparation for implementation on four medical-surgical units.
ARIADNE: a Tracking System for Relationships in LHCb Metadata
NASA Astrophysics Data System (ADS)
Shapoval, I.; Clemencic, M.; Cattaneo, M.
2014-06-01
The data processing model of the LHCb experiment implies handling of an evolving set of heterogeneous metadata entities and relationships between them. The entities range from software and databases states to architecture specificators and software/data deployment locations. For instance, there is an important relationship between the LHCb Conditions Database (CondDB), which provides versioned, time dependent geometry and conditions data, and the LHCb software, which is the data processing applications (used for simulation, high level triggering, reconstruction and analysis of physics data). The evolution of CondDB and of the LHCb applications is a weakly-homomorphic process. It means that relationships between a CondDB state and LHCb application state may not be preserved across different database and application generations. These issues may lead to various kinds of problems in the LHCb production, varying from unexpected application crashes to incorrect data processing results. In this paper we present Ariadne - a generic metadata relationships tracking system based on the novel NoSQL Neo4j graph database. Its aim is to track and analyze many thousands of evolving relationships for cases such as the one described above, and several others, which would otherwise remain unmanaged and potentially harmful. The highlights of the paper include the system's implementation and management details, infrastructure needed for running it, security issues, first experience of usage in the LHCb production and potential of the system to be applied to a wider set of LHCb tasks.
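The kind of relationship tracking described (for example, which application states are compatible with which CondDB states) can be illustrated with a tiny in-memory graph; the node names and the COMPATIBLE_WITH relationship below are hypothetical placeholders, and the real system stores such relationships in Neo4j.

```python
# Minimal sketch of tracking typed relationships between metadata entities in memory;
# node names and the relationship type are hypothetical placeholders.
edges = [
    ("App v42", "COMPATIBLE_WITH", "CondDB tag-2013a"),
    ("App v43", "COMPATIBLE_WITH", "CondDB tag-2013a"),
    ("App v43", "COMPATIBLE_WITH", "CondDB tag-2014a"),
]

def related(node, rel_type, edges):
    """Entities reachable from `node` via relationships of the given type."""
    return [dst for src, rel, dst in edges if src == node and rel == rel_type]

def reverse_related(node, rel_type, edges):
    """Entities pointing at `node` via relationships of the given type."""
    return [src for src, rel, dst in edges if dst == node and rel == rel_type]

print(related("App v43", "COMPATIBLE_WITH", edges))
print(reverse_related("CondDB tag-2013a", "COMPATIBLE_WITH", edges))
```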
A modular, programmable measurement system for physiological and spaceflight applications
NASA Technical Reports Server (NTRS)
Hines, John W.; Ricks, Robert D.; Miles, Christopher J.
1993-01-01
The NASA-Ames Sensors 2000! Program has developed a small, compact, modular, programmable, sensor signal conditioning and measurement system, initially targeted for Life Sciences Spaceflight Programs. The system consists of a twelve-slot, multi-layer, distributed function backplane, a digital microcontroller/memory subsystem, conditioned and isolated power supplies, and six application-specific, physiological signal conditioners. Each signal condition is capable of being programmed for gains, offsets, calibration and operate modes, and, in some cases, selectable outputs and functional modes. Presently, the system has the capability for measuring ECG, EMG, EEG, Temperature, Respiration, Pressure, Force, and Acceleration parameters, in physiological ranges. The measurement system makes heavy use of surface-mount packaging technology, resulting in plug in modules sized 125x55 mm. The complete 12-slot system is contained within a volume of 220x150x70mm. The system's capabilities extend well beyond the specific objectives of NASA programs. Indeed, the potential commercial uses of the technology are virtually limitless. In addition to applications in medical and biomedical sensing, the system might also be used in process control situations, in clinical or research environments, in general instrumentation systems, factory processing, or any other applications where high quality measurements are required.
A modular, programmable measurement system for physiological and spaceflight applications
NASA Astrophysics Data System (ADS)
Hines, John W.; Ricks, Robert D.; Miles, Christopher J.
1993-02-01
The NASA-Ames Sensors 2000! Program has developed a small, compact, modular, programmable, sensor signal conditioning and measurement system, initially targeted for Life Sciences Spaceflight Programs. The system consists of a twelve-slot, multi-layer, distributed function backplane, a digital microcontroller/memory subsystem, conditioned and isolated power supplies, and six application-specific, physiological signal conditioners. Each signal condition is capable of being programmed for gains, offsets, calibration and operate modes, and, in some cases, selectable outputs and functional modes. Presently, the system has the capability for measuring ECG, EMG, EEG, Temperature, Respiration, Pressure, Force, and Acceleration parameters, in physiological ranges. The measurement system makes heavy use of surface-mount packaging technology, resulting in plug in modules sized 125x55 mm. The complete 12-slot system is contained within a volume of 220x150x70mm. The system's capabilities extend well beyond the specific objectives of NASA programs. Indeed, the potential commercial uses of the technology are virtually limitless. In addition to applications in medical and biomedical sensing, the system might also be used in process control situations, in clinical or research environments, in general instrumentation systems, factory processing, or any other applications where high quality measurements are required.
Decision support systems and the healthcare strategic planning process: a case study.
Lundquist, D L; Norris, R M
1991-01-01
The repertoire of applications that comprises health-care decision support systems (DSS) includes analyses of clinical, financial, and operational activities. As a whole, these applications facilitate developing comprehensive and interrelated business and medical models that support the complex decisions required to successfully manage today's health-care organizations. Kennestone Regional Health Care System's use of DSS to facilitate strategic planning has precipitated marked changes in the organization's method of determining capital allocations. This case study discusses Kennestone's use of DSS in the strategic planning process, including profiles of key DSS modeling components.
NASA Astrophysics Data System (ADS)
Lamarque, J. F.
2016-12-01
In this talk, we will discuss the upcoming release of CESM2 and the computational and scientific challenges encountered in the process. We will then discuss upcoming new opportunities in development and applications of Earth System Models; in particular, we will discuss additional ways in which the university community can contribute to CESM.
System software for the finite element machine
NASA Technical Reports Server (NTRS)
Crockett, T. W.; Knott, J. D.
1985-01-01
The Finite Element Machine is an experimental parallel computer developed at Langley Research Center to investigate the application of concurrent processing to structural engineering analysis. This report describes system-level software which has been developed to facilitate use of the machine by applications researchers. The overall software design is outlined, and several important parallel processing issues are discussed in detail, including processor management, communication, synchronization, and input/output. Based on experience using the system, the hardware architecture and software design are critiqued, and areas for further work are suggested.
Stack-and-Draw Manufacture Process of a Seven-Core Optical Fiber for Fluorescence Measurements
NASA Astrophysics Data System (ADS)
Samir, Ahmed; Batagelj, Bostjan
2018-01-01
Multi-core optical-fiber technology is expected to be used in telecommunications and sensor systems in a relatively short amount of time. However, a successful transition from research laboratories to industry applications will only be possible with an optimized design and manufacturing process. The fabrication process is an important aspect in designing and developing new multi-applicable, multi-core fibers, for which the best candidate is a seven-core fiber. Here, the basics of designing and manufacturing a single-mode, seven-core fiber using the stack-and-draw process are described for the example of a fluorescence sensor system.
Methods for design and evaluation of parallel computing systems (The PISCES project)
NASA Technical Reports Server (NTRS)
Pratt, Terrence W.; Wise, Robert; Haught, Mary JO
1989-01-01
The PISCES project started in 1984 under the sponsorship of the NASA Computational Structural Mechanics (CSM) program. A PISCES 1 programming environment and parallel FORTRAN were implemented in 1984 for the DEC VAX (using UNIX processes to simulate parallel processes). This system was used for experimentation with parallel programs for scientific applications and AI (dynamic scene analysis) applications. PISCES 1 was ported to a network of Apollo workstations by N. Fitzgerald.
NASA Technical Reports Server (NTRS)
Gupta, Pramod; Loparo, Kenneth; Mackall, Dale; Schumann, Johann; Soares, Fola
2004-01-01
Recent research has shown that adaptive neural-network-based control systems are very effective in restoring stability and control of an aircraft in the presence of damage or failures. The application of an adaptive neural network within a flight-critical control system requires a thorough and proven process to ensure safe and proper flight operation. Unique testing tools have been developed as part of a process to perform verification and validation (V&V) of the real-time adaptive neural networks used in recent adaptive flight control systems and to evaluate the performance of the online-trained neural networks. The tools will help in obtaining certification from the FAA and in the successful deployment of neural-network-based adaptive controllers in safety-critical applications. The verification and validation process is evaluated against a typical neural adaptive controller and the results are discussed.
Code of Federal Regulations, 2010 CFR
2010-10-01
... (CARS). All services authorized under part 78 of this title. (e) Filings. Any application, notification... conveyed by operation of rule upon filing notification of aeronautical frequency usage by MVPDs or... database, application filing system, and processing system for Multichannel Video and Cable Television...
Design and Verification of Remote Sensing Image Data Center Storage Architecture Based on Hadoop
NASA Astrophysics Data System (ADS)
Tang, D.; Zhou, X.; Jing, Y.; Cong, W.; Li, C.
2018-04-01
The data center is a new concept of data processing and application proposed in recent years. It is a new approach to processing technology that is based on data, parallel computing, and compatibility with different hardware clusters. While optimizing the data storage management structure, it fully utilizes the computing nodes of cluster resources and improves the efficiency of parallel data applications. This paper used mature Hadoop technology to build a large-scale distributed image management architecture for remote sensing imagery. Using MapReduce parallel processing technology, it calls many computing nodes to process image storage blocks and pyramids in the background, improving the efficiency of image reading and application and meeting the need for concurrent multi-user high-speed access to remotely sensed data. The rationality, reliability and superiority of the system design were verified by building an actual Hadoop service system, testing the storage efficiency of different image data and multiple users, and analyzing how the distributed storage architecture improves the application efficiency of remote sensing images.
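As a rough illustration of the MapReduce pattern described above, the sketch below groups image tiles by scene and pyramid level so each level can be assembled in parallel. It is plain Python standing in for actual Hadoop jobs; the record layout, names, and sample data are hypothetical and are not taken from the paper's implementation.

```python
# Conceptual MapReduce-style sketch: map each image tile to a
# (scene, pyramid_level) key, then reduce by grouping the tiles for that
# level. A real deployment would run as Hadoop jobs against HDFS blocks.
from collections import defaultdict

def map_tile(tile_record):
    # tile_record: (scene_id, level, row, col, tile_bytes)
    scene_id, level, row, col, tile_bytes = tile_record
    return ((scene_id, level), (row, col, tile_bytes))

def reduce_level(key, tile_values):
    scene_id, level = key
    # A real reducer would assemble and write the pyramid level back to HDFS;
    # here we only report how many tiles the level contains.
    return {"scene": scene_id, "level": level, "tiles": len(tile_values)}

def run_job(tile_records):
    grouped = defaultdict(list)
    for record in tile_records:
        key, value = map_tile(record)
        grouped[key].append(value)
    return [reduce_level(key, values) for key, values in grouped.items()]

if __name__ == "__main__":
    sample = [("scene-001", 0, r, c, b"...") for r in range(2) for c in range(2)]
    sample += [("scene-001", 1, 0, 0, b"...")]
    for summary in run_job(sample):
        print(summary)
```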
Display management subsystem, version 1: A user's eye view
NASA Technical Reports Server (NTRS)
Parker, Dolores
1986-01-01
The structure and application functions of the Display Management Subsystem (DMS) are described. The DMS, a subsystem of the Transportable Applications Executive (TAE), was designed to provide a device-independent interface for an image processing and display environment. The system is callable by C and FORTRAN applications, portable to accommodate different image analysis terminals, and easily expandable to meet local needs. Generic applications are also available for performing many image processing tasks.
An advanced telerobotic system for shuttle payload changeout room processing applications
NASA Technical Reports Server (NTRS)
Sklar, M.; Wegerif, D.
1989-01-01
To potentially alleviate the inherent difficulties in the ground processing of the Space Shuttle and its associated payloads, a teleoperated, semi-autonomous robotic processing system for the Payload Changeout Room (PCR) is now in the conceptual stages. The complete PCR robotic system as currently conceived is described and critical design issues and the required technologies are discussed.
Material Processing Laser Systems In Production
NASA Astrophysics Data System (ADS)
Taeusch, David R.
1988-11-01
The laser processing system is now a respected, productive machine tool in the manufacturing industries. Systems in use today are proving their cost effectiveness and capabilities of processing quality parts. Several types of industrial lasers are described and their applications are discussed, with emphasis being placed on the production environment and methods of protection required for optical equipment against this normally hostile environment.
Distributed semantic networks and CLIPS
NASA Technical Reports Server (NTRS)
Snyder, James; Rodriguez, Tony
1991-01-01
Semantic networks of frames are commonly used as a method of reasoning in many problems. In most of these applications the semantic network exists as a single entity in a single-process environment. Advances in workstation hardware provide support for more sophisticated applications involving multiple processes interacting in a distributed environment. In these applications the semantic network may well be distributed over several concurrently executing tasks. This paper describes the design and implementation of a frame-based, distributed semantic network in which frames are accessed both through C Language Integrated Production System (CLIPS) expert systems and procedural C++ language programs. The application area is a knowledge-based, cooperative decision-making model utilizing both rule-based and procedural experts.
NASA Astrophysics Data System (ADS)
Willsch, Reinhardt; Ecke, Wolfgang; Schwotzer, Gunter
2005-09-01
Different types of advanced optical fibre sensor systems, using similar spectral interrogation principles and potentially low-cost polychromator-based optoelectronic signal processing instrumentation, are presented, and examples of their industrial application are demonstrated. These include multimode-fibre humidity, temperature, and pressure sensors with extrinsic micro-optical Fabry-Perot transducers for process control in the gas industry; UV-absorption evanescent-field sensors for monitoring organic pollution in groundwater; and single-mode fibre Bragg grating (FBG) multiplexed strain, vibration, and temperature sensor networks for structural health monitoring applications in electric power facilities, aerospace, railways, and geotechnical and civil engineering. Recent results of current investigations applying FBGs and microstructured fibres to chemical sensing are discussed.
NASA Astrophysics Data System (ADS)
Longmore, S. P.; Bikos, D.; Szoke, E.; Miller, S. D.; Brummer, R.; Lindsey, D. T.; Hillger, D.
2014-12-01
The increasing use of mobile phones equipped with digital cameras and the ability to post images and information to the Internet in real time has significantly improved the ability to report events almost instantaneously. In the context of severe weather reports, a representative digital image conveys significantly more information than a simple text or phone-relayed report to a weather forecaster issuing severe weather warnings. It also allows the forecaster to reasonably discern the validity and quality of a storm report. Posting geo-located, time-stamped storm report photographs to NWS weather forecast office social media pages via a mobile phone application has generated recent positive feedback from forecasters. Building upon this feedback, this discussion advances the concept, development, and implementation of a formalized Photo Storm Report (PSR) mobile application, processing and distribution system, and Advanced Weather Interactive Processing System II (AWIPS-II) plug-in display software. The PSR system would be composed of three core components: i) a mobile phone application, ii) a processing and distribution software and hardware system, and iii) AWIPS-II data, exchange and visualization plug-in software. i) The mobile phone application would allow web-registered users to send geo-located, view-direction and time-stamped PSRs, along with the severe weather type and comments, to the processing and distribution servers. ii) The servers would receive PSRs, convert the images and information to sizes manageable on the NWS network bandwidth in an AWIPS-II data format, distribute them on the NWS data communications network, and archive the original PSRs for possible future research datasets. iii) The AWIPS-II data and exchange plug-ins would archive PSRs, and the visualization plug-in would display PSR locations, times and directions by hour, similar to surface observations. Hovering on individual PSRs would reveal photo thumbnails and clicking on them would display the full-resolution photograph. Here, we present initial NWS forecaster feedback received from social-media-posted PSRs motivating the possible advantages of PSRs within AWIPS-II, the details of developing and implementing a PSR system, and possible future applications beyond severe weather reports and AWIPS-II.
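The sketch below illustrates what a geo-located, time-stamped PSR payload might look like when submitted by the mobile application, before server-side resizing and AWIPS-II formatting. The field names and serialization are illustrative assumptions, not the format proposed in the talk.

```python
# Hypothetical Photo Storm Report (PSR) payload: geo-location, view direction,
# timestamp, severe weather type, and free-text comments.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class PhotoStormReport:
    user_id: str
    latitude: float
    longitude: float
    view_direction_deg: float   # compass bearing the camera was facing
    observed_at_utc: str        # ISO-8601 timestamp
    weather_type: str           # e.g. "hail", "tornado", "flooding"
    comment: str
    photo_filename: str

def to_message(report: PhotoStormReport) -> bytes:
    """Serialize a report for transmission to the processing servers."""
    return json.dumps(asdict(report)).encode("utf-8")

if __name__ == "__main__":
    report = PhotoStormReport(
        user_id="registered-user-42",
        latitude=40.58, longitude=-105.08,
        view_direction_deg=270.0,
        observed_at_utc=datetime.now(timezone.utc).isoformat(),
        weather_type="hail",
        comment="Quarter-size hail on the west side of town",
        photo_filename="psr_0001.jpg",
    )
    print(to_message(report))
```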
Image processing applications: From particle physics to society
NASA Astrophysics Data System (ADS)
Sotiropoulou, C.-L.; Luciano, P.; Gkaitatzis, S.; Citraro, S.; Giannetti, P.; Dell'Orso, M.
2017-01-01
We present an embedded system for extremely efficient real-time pattern recognition execution, enabling technological advancements with both scientific and social impact. It is a compact, fast, low consumption processing unit (PU) based on a combination of Field Programmable Gate Arrays (FPGAs) and the full custom associative memory chip. The PU has been developed for real time tracking in particle physics experiments, but delivers flexible features for potential application in a wide range of fields. It has been proposed to be used in accelerated pattern matching execution for Magnetic Resonance Fingerprinting (biomedical applications), in real time detection of space debris trails in astronomical images (space applications) and in brain emulation for image processing (cognitive image processing). We illustrate the potentiality of the PU for the new applications.
NASA-HBCU Space Science and Engineering Research Forum Proceedings
NASA Technical Reports Server (NTRS)
Sanders, Yvonne D. (Editor); Freeman, Yvonne B. (Editor); George, M. C. (Editor)
1989-01-01
The proceedings of the Historically Black Colleges and Universities (HBCU) forum are presented. A wide range of research topics from plant science to space science and related academic areas was covered. The sessions were divided into the following subject areas: Life science; Mathematical modeling, image processing, pattern recognition, and algorithms; Microgravity processing, space utilization and application; Physical science and chemistry; Research and training programs; Space science (astronomy, planetary science, asteroids, moon); Space technology (engineering, structures and systems for application in space); Space technology (physics of materials and systems for space applications); and Technology (materials, techniques, measurements).
ICE: A Scalable, Low-Cost FPGA-Based Telescope Signal Processing and Networking System
NASA Astrophysics Data System (ADS)
Bandura, K.; Bender, A. N.; Cliche, J. F.; de Haan, T.; Dobbs, M. A.; Gilbert, A. J.; Griffin, S.; Hsyu, G.; Ittah, D.; Parra, J. Mena; Montgomery, J.; Pinsonneault-Marotte, T.; Siegel, S.; Smecher, G.; Tang, Q. Y.; Vanderlinde, K.; Whitehorn, N.
2016-03-01
We present an overview of the ‘ICE’ hardware and software framework that implements large arrays of interconnected field-programmable gate array (FPGA)-based data acquisition, signal processing and networking nodes economically. The system was conceived for application to radio, millimeter and sub-millimeter telescope readout systems that have requirements beyond typical off-the-shelf processing systems, such as careful control of interference signals produced by the digital electronics, and clocking of all elements in the system from a single precise observatory-derived oscillator. A new generation of telescopes operating at these frequency bands and designed with a vastly increased emphasis on digital signal processing to support their detector multiplexing technology or high-bandwidth correlators — data rates exceeding a terabyte per second — are becoming common. The ICE system is built around a custom FPGA motherboard that makes use of an Xilinx Kintex-7 FPGA and ARM-based co-processor. The system is specialized for specific applications through software, firmware and custom mezzanine daughter boards that interface to the FPGA through the industry-standard FPGA mezzanine card (FMC) specifications. For high density applications, the motherboards are packaged in 16-slot crates with ICE backplanes that implement a low-cost passive full-mesh network between the motherboards in a crate, allow high bandwidth interconnection between crates and enable data offload to a computer cluster. A Python-based control software library automatically detects and operates the hardware in the array. Examples of specific telescope applications of the ICE framework are presented, namely the frequency-multiplexed bolometer readout systems used for the South Pole Telescope (SPT) and Simons Array and the digitizer, F-engine, and networking engine for the Canadian Hydrogen Intensity Mapping Experiment (CHIME) and Hydrogen Intensity and Real-time Analysis eXperiment (HIRAX) radio interferometers.
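The abstract notes that a Python-based control library detects and operates the hardware in the array. The sketch below shows, under stated assumptions, what enumerating crates of motherboards and pushing a configuration to every node could look like; the class names, methods, firmware file, and sample rate are hypothetical and do not correspond to the actual ICE software.

```python
# Hedged sketch of an array-control layer: enumerate motherboards in each
# 16-slot crate and push a firmware/clock configuration to every node.
class IceMotherboard:
    def __init__(self, crate_id: int, slot: int):
        self.crate_id = crate_id
        self.slot = slot

    def configure(self, firmware: str, sample_rate_hz: float) -> None:
        # Real hardware access (FPGA programming, clock distribution) omitted.
        print(f"crate {self.crate_id} slot {self.slot}: "
              f"loading {firmware}, sampling at {sample_rate_hz/1e6:.0f} MHz")

class IceArray:
    SLOTS_PER_CRATE = 16

    def __init__(self, crate_ids):
        self.boards = [IceMotherboard(c, s)
                       for c in crate_ids
                       for s in range(self.SLOTS_PER_CRATE)]

    def configure_all(self, firmware: str, sample_rate_hz: float) -> None:
        for board in self.boards:
            board.configure(firmware, sample_rate_hz)

if __name__ == "__main__":
    array = IceArray(crate_ids=[0, 1])
    array.configure_all("fengine.bit", 800e6)
```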
NASA Astrophysics Data System (ADS)
Tang, Hongliang; Kang, Chengxu; Tian, Youping
2018-01-01
Moving the handling of administrative approvals in the earthquake industry online is an important measure for improving work efficiency and public convenience. Based on an analysis of the characteristics and processes of administrative licensing in the earthquake industry, this paper proposes an online processing model based on ASP technology and an online processing system based on the B/S architecture, and presents the design and implementation methods. The application of the system shows that it is simple in design and complete in function, can be used on computers and on mobile platforms such as mobile phones, and is practical and forward-looking.
NASA Technical Reports Server (NTRS)
Sagerman, G. D.; Barna, G. J.; Burns, R. K.
1979-01-01
The Cogeneration Technology Alternatives Study (CTAS), a program undertaken to identify the most attractive advanced energy conversion systems for industrial cogeneration applications in the 1985-2000 time period, is described, and preliminary results are presented. Two cogeneration options are included in the analysis: a topping application, in which fuel is input to the energy conversion system which generates electricity and waste heat from the conversion system is used to provide heat to the process, and a bottoming application, in which fuel is burned to provide high temperature process heat and waste heat from the process is used as thermal input to the energy conversion system which generates energy. Steam turbines, open and closed cycle gas turbines, combined cycles, diesel engines, Stirling engines, phosphoric acid and molten carbonate fuel cells and thermionics are examined. Expected plant level energy savings, annual energy cost savings, and other results of the economic analysis are given, and the sensitivity of these results to the assumptions concerning fuel prices, price of purchased electricity and the potential effects of regional energy use characteristics is discussed.
The Multimission Image Processing Laboratory's virtual frame buffer interface
NASA Technical Reports Server (NTRS)
Wolfe, T.
1984-01-01
Large image processing systems use multiple frame buffers with differing architectures and vendor-supplied interfaces. This variety of architectures and interfaces creates software development, maintenance and portability problems for application programs. Several machine-dependent graphics standards such as ANSI Core and GKS are available, but none of them are adequate for image processing. Therefore, the Multimission Image Processing Laboratory project has implemented a programmer-level virtual frame buffer interface. This interface makes all frame buffers appear as a generic frame buffer with a specified set of characteristics. This document defines the virtual frame buffer interface and provides information such as FORTRAN subroutine definitions, frame buffer characteristics, sample programs, etc. It is intended to be used by application programmers and system programmers who are adding new frame buffers to a system.
North American Fuzzy Logic Processing Society (NAFIPS 1992), volume 1
NASA Technical Reports Server (NTRS)
Villarreal, James A. (Compiler)
1992-01-01
This document contains papers presented at the NAFIPS '92 North American Fuzzy Information Processing Society Conference. More than 75 papers were presented at this Conference, which was sponsored by NAFIPS in cooperation with NASA, the Instituto Tecnologico de Morelia, the Indian Society for Fuzzy Mathematics and Information Processing (ISFUMIP), the Instituto Tecnologico de Estudios Superiores de Monterrey (ITESM), the International Fuzzy Systems Association (IFSA), the Japan Society for Fuzzy Theory and Systems, and the Microelectronics and Computer Technology Corporation (MCC). The fuzzy set theory has led to a large number of diverse applications. Recently, interesting applications have been developed which involve the integration of fuzzy systems with adaptive processes such as neural networks and genetic algorithms. NAFIPS '92 was directed toward the advancement, commercialization, and engineering development of these technologies.
North American Fuzzy Logic Processing Society (NAFIPS 1992), volume 2
NASA Technical Reports Server (NTRS)
Villarreal, James A. (Compiler)
1992-01-01
This document contains papers presented at the NAFIPS '92 North American Fuzzy Information Processing Society Conference. More than 75 papers were presented at this Conference, which was sponsored by NAFIPS in cooperation with NASA, the Instituto Tecnologico de Morelia, the Indian Society for Fuzzy Mathematics and Information Processing (ISFUMIP), the Instituto Tecnologico de Estudios Superiores de Monterrey (ITESM), the International Fuzzy Systems Association (IFSA), the Japan Society for Fuzzy Theory and Systems, and the Microelectronics and Computer Technology Corporation (MCC). The fuzzy set theory has led to a large number of diverse applications. Recently, interesting applications have been developed which involve the integration of fuzzy systems with adaptive processes such as neural networks and genetic algorithms. NAFIPS '92 was directed toward the advancement, commercialization, and engineering development of these technologies.
Neutron radiographic viewing system
NASA Technical Reports Server (NTRS)
1972-01-01
The design, development and application of a neutron radiographic viewing system for use in nondestructive testing applications is considered. The system consists of a SEC vidicon camera, neutron image intensifier system, disc recorder, and TV readout. Neutron bombardment of the subject is recorded by an image converter and passed through an optical system into the SEC vidicon. The vidicon output may be stored, or processed for visual readout.
Maine Facility Research Summary : Dynamic Sign Systems for Narrow Bridges
DOT National Transportation Integrated Search
1997-09-01
This report describes the development of operational surveillance data processing algorithms and software for application to urban freeway systems, conforming to a framework in which data processing is performed in stages: sensor malfunction detectio...
Advanced Manufacturing Systems in Food Processing and Packaging Industry
NASA Astrophysics Data System (ADS)
Shafie Sani, Mohd; Aziz, Faieza Abdul
2013-06-01
In this paper, several advanced manufacturing systems in the food processing and packaging industry are reviewed, including biodegradable smart packaging and nanocomposites, and advanced automation control systems consisting of fieldbus technology, distributed control systems and food safety inspection features. The main purpose of current technology in the food processing and packaging industry is discussed with respect to the major concerns of plant process efficiency, productivity, quality, and safety. These applications were chosen because they are robust, flexible, reconfigurable, preserve the quality of the food, and are efficient.
Dual-Use Space Technology Transfer Conference and Exhibition. Volume 2
NASA Technical Reports Server (NTRS)
Krishen, Kumar (Compiler)
1994-01-01
This is the second volume of papers presented at the Dual-Use Space Technology Transfer Conference and Exhibition held at the Johnson Space Center February 1-3, 1994. Possible technology transfers covered during the conference were in the areas of information access; innovative microwave and optical applications; materials and structures; marketing and barriers; intelligent systems; human factors and habitation; communications and data systems; business process and technology transfer; software engineering; biotechnology and advanced bioinstrumentation; communications signal processing and analysis; medical care; applications derived from control center data systems; human performance evaluation; technology transfer methods; mathematics, modeling, and simulation; propulsion; software analysis and decision tools; systems/processes in human support technology; networks, control centers, and distributed systems; power; rapid development; perception and vision technologies; integrated vehicle health management; automation technologies; advanced avionics; and robotics technologies.
The development of data acquisition and processing application system for RF ion source
NASA Astrophysics Data System (ADS)
Zhang, Xiaodan; Wang, Xiaoying; Hu, Chundong; Jiang, Caichao; Xie, Yahong; Zhao, Yuanzhe
2017-07-01
As the key ion source component of nuclear fusion auxiliary heating devices, the radio frequency (RF) ion source is being developed and gradually applied to provide a source plasma with the advantages of ease of control and high reliability. In addition, it easily achieves long-pulse steady-state operation. During the development and testing of the RF ion source, a large amount of raw experimental data is generated. Therefore, it is necessary to develop a stable and reliable computer data acquisition and processing application system that realizes the functions of data acquisition, storage, access, and real-time monitoring. In this paper, the development of a data acquisition and processing application system for the RF ion source is presented. The hardware platform is based on the PXI system and the software is programmed in the LabVIEW development environment. The key technologies used in the implementation mainly include long-pulse data acquisition, multi-threaded processing, the Transmission Control Protocol for communication, and the Lempel-Ziv-Oberhumer data compression algorithm. The design has been tested and applied on the RF ion source, and the test results show that it works reliably and steadily. With the help of this design, stable plasma discharge data from the RF ion source are collected, stored, accessed, and monitored in real time. The system has proven to be of practical significance for RF ion source experiments.
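The actual system is implemented in LabVIEW; purely to illustrate the pipeline shape it describes (one thread acquiring long-pulse frames, another compressing and shipping them over TCP), here is a minimal Python sketch. zlib stands in for the LZO codec used in the real system, and the host, port, and frame layout are hypothetical.

```python
# Sketch of the acquisition/transfer pattern: an acquisition thread fills a
# queue with frames, a shipping thread compresses them and sends them over
# TCP, and a receiver thread (stand-in for the storage server) reads them back.
import queue
import socket
import struct
import threading
import zlib

frames = queue.Queue()

def acquire(num_frames: int, samples_per_frame: int) -> None:
    for i in range(num_frames):
        # A real system would read the digitizer here; we fabricate a ramp.
        payload = struct.pack(f"<{samples_per_frame}H",
                              *[(i + s) % 4096 for s in range(samples_per_frame)])
        frames.put(payload)
    frames.put(None)  # sentinel: acquisition finished

def ship(host: str, port: int) -> None:
    with socket.create_connection((host, port)) as sock:
        while True:
            payload = frames.get()
            if payload is None:
                break
            compressed = zlib.compress(payload)  # LZO in the described system
            sock.sendall(struct.pack("<I", len(compressed)) + compressed)

def receive(server: socket.socket) -> None:
    conn, _ = server.accept()
    with conn:
        while True:
            header = conn.recv(4)
            if not header:
                break
            size = struct.unpack("<I", header)[0]
            data = b""
            while len(data) < size:
                data += conn.recv(size - len(data))
            print("received frame of", len(zlib.decompress(data)), "bytes")

if __name__ == "__main__":
    server = socket.create_server(("127.0.0.1", 9000))
    threads = [threading.Thread(target=receive, args=(server,)),
               threading.Thread(target=acquire, args=(10, 1024)),
               threading.Thread(target=ship, args=("127.0.0.1", 9000))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    server.close()
```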
Time-critical multirate scheduling using contemporary real-time operating system services
NASA Technical Reports Server (NTRS)
Eckhardt, D. E., Jr.
1983-01-01
Although real-time operating systems provide many of the task control services necessary to process time-critical applications (i.e., applications with fixed, invariant deadlines), it may still be necessary to provide a scheduling algorithm at a level above the operating system in order to coordinate a set of synchronized, time-critical tasks executing at different cyclic rates. This paper examines the scheduling requirements for such applications and develops scheduling algorithms using services provided by contemporary real-time operating systems.
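As a minimal sketch of that coordinating layer, the code below builds a major-frame/minor-frame schedule for tasks running at different cyclic rates and dispatches it using ordinary OS timing services. The task names, periods, and frame sizes are illustrative assumptions, not the algorithms developed in the report.

```python
# Cyclic-executive sketch: tasks with different cyclic rates are dispatched
# from one scheduling layer built on top of generic timing services.
import time

def make_schedule(tasks, minor_frame_s, major_frame_s):
    """Return, for each minor frame, the names of the tasks due to run in it."""
    frames = int(round(major_frame_s / minor_frame_s))
    schedule = []
    for frame in range(frames):
        t = frame * minor_frame_s
        due = [name for name, (period_s, _) in tasks.items()
               if abs(t / period_s - round(t / period_s)) < 1e-9]
        schedule.append(due)
    return schedule

def run_major_frame(tasks, minor_frame_s, schedule):
    for due in schedule:
        start = time.monotonic()
        for name in due:
            tasks[name][1]()                      # invoke the task body
        slack = minor_frame_s - (time.monotonic() - start)
        if slack > 0:
            time.sleep(slack)                     # hold the frame boundary

if __name__ == "__main__":
    tasks = {
        "attitude_control": (0.05, lambda: print("50 ms task")),
        "navigation":       (0.10, lambda: print("100 ms task")),
        "telemetry":        (0.20, lambda: print("200 ms task")),
    }
    schedule = make_schedule(tasks, minor_frame_s=0.05, major_frame_s=0.20)
    run_major_frame(tasks, 0.05, schedule)
```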
NASA Technical Reports Server (NTRS)
Kayton, M.; Smith, A. G.
1974-01-01
The services provided by the Spacelab Information Management System are discussed. The majority of the services are provided by the common-support subsystems in the Support Module furnished by the Spacelab manufacturer. The information processing requirements for the space processing applications (SPA) are identified. The requirements and capabilities for electric power, display and control panels, recording and telemetry, intercom, and closed circuit television are analyzed.
76 FR 11199 - Application(s) for Duty-Free Entry of Scientific Instruments
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-01
... of the central nervous systems of freshwater prawns. Justification for Duty-Free Entry: There are no... 120 kV accelerating voltage, and an electron gun assembly with Cool Beam Illumination System--LaB6..., flexibility of software for signal acquisition and image processing, overall system stability, and ease of use...
[Introduction of long-term care insurance: changes in service usage].
Matsuda, Tomoyuki; Tamiya, Nanako; Kashiwagi, Masayo; Moriyama, Yoko
2013-09-01
With the aging of the population, Japan's long-term care system has shifted from a welfare-placement system to a social-insurance system, a precedent-setting policy for the elderly. We examined how individuals who used care services before the implementation of long-term care insurance (LTCI) (previous service users) currently use LTCI services, with a focus on the processes of service use. Panel data were obtained from the Nihon University Japanese Longitudinal Study of Aging database. These data were collected in interviews conducted before (November 1999 and March 2000) and after (November 2001 and December 2001) the establishment of LTCI. Among the 3992 individuals who participated in these interviews, 416 previous service users aged ≥65 years were sampled. The outcome measures were the processes of using LTCI services (application for LTCI, certification of long-term care need, and contract with LTCI service providers). Logistic regression analysis was performed to identify individual factors associated with the process of application for LTCI. There were 133 LTCI users among the 416 previous service users (32.0%). Of the service processes used, 45.5% of previous service users were applicants, 85.7% of the applicants were certified, and 88.7% of those certified used services under service contracts. Application was significantly more likely for individuals with disease (odds ratio [OR] 8.34; 95% confidence interval [CI], 1.86-37.46), those dependent in their instrumental activities of daily living (IADL) (OR 11.21; 95% CI, 5.22-24.07), those with an equivalent income of <1.25 million yen (OR 2.72; 95% CI, 1.30-5.69), and those who had previously used respite care (OR 3.29; 95% CI, 1.16-9.35). In contrast, application was significantly less likely for community rehabilitation users (OR 0.38; 95% CI, 0.17-0.82). Only half of the previous service users were applicants, and they had severe diseases or were more dependent in their IADL. Our findings suggest that many individuals who were functionally independent were covered under the welfare-placement system. Additionally, low-income individuals did not refrain from applying.
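The reported odds ratios and confidence intervals come from a logistic regression of application status on individual factors; the sketch below reproduces that style of analysis on synthetic data with statsmodels. The variable names and effect sizes are illustrative only and are not the study's dataset.

```python
# Logistic regression of a binary application outcome on binary predictors,
# reporting odds ratios (exp of coefficients) and 95% confidence intervals.
# Data are synthetic stand-ins, not the study's panel data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 416
df = pd.DataFrame({
    "has_disease":    rng.integers(0, 2, n),
    "iadl_dependent": rng.integers(0, 2, n),
    "low_income":     rng.integers(0, 2, n),
})
logit = 0.5 * df["has_disease"] + 1.2 * df["iadl_dependent"] + 0.4 * df["low_income"] - 1.0
df["applied"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["has_disease", "iadl_dependent", "low_income"]])
fit = sm.Logit(df["applied"], X).fit(disp=False)

odds_ratios = np.exp(fit.params)
conf_int = np.exp(fit.conf_int())   # 95% CI on the odds-ratio scale
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```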
Using SAHRIS a web-based application for creating heritage cases and permit applications
NASA Astrophysics Data System (ADS)
Mlungwana, N.
2015-08-01
Since the inception of the South African Heritage Resources Information System (SAHRIS) in 2012, creating heritage cases and permit applications has been streamlined, and interaction with South African Heritage Authorities has been simplified. SAHRIS facilitates applications for development cases and mining applications that trigger the South African National Heritage Resources Act (Act 25 of 1999) and is able to differentiate between cases that require comment only, where the heritage process is subsidiary to environmental or mining law (Section 38(8)), and those where the heritage authority is the deciding authority (Section 38(1)). The system further facilitates cases related to site and object management, as well as permit applications for excavation, invasive research techniques and export of materials for research abroad in the case of archaeological or palaeontological specimens, or for sale or exhibition in the case of heritage objects. The integrated, easy-to-use, online system has removed the need for applicants to print out forms, take documents from one government department to the next for approval, and go through other time-consuming processes that accompany paper-based systems. SAHRIS is a user-friendly application that makes it easy for applicants to make their submissions, but also allows applicants to track the progress of their cases with the relevant heritage authority, which allows for better response rates and turnaround times from the authorities, while also ensuring transparency and good governance practice.
2011-10-01
Systems engineering knowledge has also been documented through the standards bodies, most notably: • ISO/IEC/IEEE 15288, Systems Engineering...System Life Cycle Processes, 2008 (see [10]). • ANSI/EIA 632, Processes for Engineering a System, (1998) • IEEE 1220, ISO/IEC 26702 Application...tion • United States Defense Acquisition Guidebook, Chapter 4, June 27, 2011 • IEEE/EIA 12207, Software Life Cycle Processes, 2008 • United
Integrating CLIPS applications into heterogeneous distributed systems
NASA Technical Reports Server (NTRS)
Adler, Richard M.
1991-01-01
SOCIAL is an advanced, object-oriented development tool for integrating intelligent and conventional applications across heterogeneous hardware and software platforms. SOCIAL defines a family of 'wrapper' objects called agents, which incorporate predefined capabilities for distributed communication and control. Developers embed applications within agents and establish interactions between distributed agents via non-intrusive message-based interfaces. This paper describes a predefined SOCIAL agent that is specialized for integrating C Language Integrated Production System (CLIPS)-based applications. The agent's high-level Application Programming Interface supports bidirectional flow of data, knowledge, and commands to other agents, enabling CLIPS applications to initiate interactions autonomously, and respond to requests and results from heterogeneous remote systems. The design and operation of CLIPS agents are illustrated with two distributed applications that integrate CLIPS-based expert systems with other intelligent systems for isolating and mapping problems in the Space Shuttle Launch Processing System at the NASA Kennedy Space Center.
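To illustrate the "wrapper" agent idea described above (an agent hiding an embedded application behind a message-based interface so distributed peers exchange data and commands without knowing its internals), here is a small hedged sketch. The class, message layout, and example handlers are hypothetical and are not SOCIAL's actual API.

```python
# Minimal agent-wrapper sketch: each agent wraps an application handler and
# communicates with peers only through queued messages.
import queue

class Agent:
    def __init__(self, name, handler):
        self.name = name
        self.handler = handler          # the embedded application logic
        self.inbox = queue.Queue()
        self.peers = {}

    def connect(self, other: "Agent") -> None:
        self.peers[other.name] = other
        other.peers[self.name] = self

    def send(self, to: str, kind: str, payload) -> None:
        self.peers[to].inbox.put({"from": self.name, "kind": kind, "payload": payload})

    def step(self) -> None:
        while not self.inbox.empty():
            message = self.inbox.get()
            reply = self.handler(message)
            if reply is not None:
                self.send(message["from"], "result", reply)

if __name__ == "__main__":
    # A stand-in for a CLIPS-based expert system: classify a reported fault.
    expert = Agent("diagnoser", lambda m: f"fault hypothesis for: {m['payload']}")
    monitor = Agent("monitor", lambda m: print("monitor received:", m["payload"]))
    expert.connect(monitor)
    monitor.send("diagnoser", "request", "valve position disagrees with command")
    expert.step()    # diagnoser processes the request and replies
    monitor.step()   # monitor prints the diagnosis
```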
The potential of cloud point system as a novel two-phase partitioning system for biotransformation.
Wang, Zhilong
2007-05-01
Although extractive biotransformation in two-phase partitioning systems has been studied extensively, for example in water-organic solvent two-phase systems, aqueous two-phase systems, reverse micelle systems, and room-temperature ionic liquids, this has not yet resulted in widespread industrial application. Based on a discussion of the main obstacles, the exploitation of the cloud point system, which has already been applied in the separation field as cloud point extraction, as a novel two-phase partitioning system for biotransformation is reviewed through the analysis of some topical examples. At the end of the review, process control and downstream processing in the application of this novel two-phase partitioning system for biotransformation are also briefly discussed.
Managing Analysis Models in the Design Process
NASA Technical Reports Server (NTRS)
Briggs, Clark
2006-01-01
Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.
An information adaptive system study report and development plan
NASA Technical Reports Server (NTRS)
Ataras, W. S.; Eng, K.; Morone, J. J.; Beaudet, P. R.; Chin, R.
1980-01-01
The purpose of the information adaptive system (IAS) study was to determine how some selected Earth resource applications may be processed onboard a spacecraft and to provide a detailed preliminary IAS design for these applications. Detailed investigations of a number of applications were conducted with regard to IAS and three were selected for further analysis. Areas of future research and development include algorithmic specifications, system design specifications, and IAS recommended time lines.
Image-Processing Software For A Hypercube Computer
NASA Technical Reports Server (NTRS)
Lee, Meemong; Mazer, Alan S.; Groom, Steven L.; Williams, Winifred I.
1992-01-01
Concurrent Image Processing Executive (CIPE) is a software system intended for developing and using image-processing application programs in a concurrent computing environment. Designed to shield the programmer from the complexities of concurrent-system architecture, it provides an interactive image-processing environment for the end user. CIPE utilizes the architectural characteristics of a particular concurrent system to maximize efficiency while preserving architectural independence from the user and programmer. CIPE runs on a Mark-IIIfp 8-node hypercube computer and an associated SUN-4 host computer.
Decision Support System for Determining Scholarship Selection using an Analytical Hierarchy Process
NASA Astrophysics Data System (ADS)
Puspitasari, T. D.; Sari, E. O.; Destarianto, P.; Riskiawan, H. Y.
2018-01-01
A decision support system is a computer program application that analyzes data and presents it so that users can make decisions more easily. Determining scholarship selection for the senior high school case study in East Java was not easy. An application was needed to solve the problem, to improve the accuracy of targeting prospective beneficiaries among poor students, and to speed up the screening process. This research builds a system using the Analytical Hierarchy Process (AHP), a method that decomposes a complex and unstructured problem into groups, organizes the groups into a hierarchical order, assigns numerical values to human judgments of relative importance, and ultimately synthesizes them to determine which elements have the highest priority. The accuracy of the system in this research is 90%.
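The core AHP step, reducing a pairwise comparison matrix to a priority vector and checking its consistency, can be sketched in a few lines. The criteria names and Saaty-scale judgments below are illustrative assumptions, not the study's actual data.

```python
# AHP sketch: pairwise comparison matrix -> principal eigenvector (weights)
# -> consistency ratio. Criteria and judgments are illustrative only.
import numpy as np

criteria = ["parent income", "grade average", "number of dependents"]
# Saaty-scale judgments: A[i, j] = importance of criterion i relative to j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

eigenvalues, eigenvectors = np.linalg.eig(A)
principal = np.argmax(eigenvalues.real)
weights = np.abs(eigenvectors[:, principal].real)
weights /= weights.sum()

n = A.shape[0]
consistency_index = (eigenvalues.real[principal] - n) / (n - 1)
random_index = 0.58                      # Saaty's RI for a 3x3 matrix
consistency_ratio = consistency_index / random_index

for name, w in zip(criteria, weights):
    print(f"{name}: weight {w:.3f}")
print(f"consistency ratio: {consistency_ratio:.3f} (acceptable if < 0.10)")
```

Candidate scores are then obtained by weighting each applicant's criterion scores with this priority vector and ranking the results.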
Natural Language Processing: Toward Large-Scale, Robust Systems.
ERIC Educational Resources Information Center
Haas, Stephanie W.
1996-01-01
Natural language processing (NLP) is concerned with getting computers to do useful things with natural language. Major applications include machine translation, text generation, information retrieval, and natural language interfaces. Reviews important developments since 1987 that have led to advances in NLP; current NLP applications; and problems…
78 FR 4138 - Notice of Intent To Grant Co-Exclusive Licenses
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-18
... brine solution.//Patent Application Serial No. 13/426294: Process and apparatus for the selective dimerization of terpenes and alpha-olefin oligomers with a single-stage reactor and a single-stage fractionation system.//Patent Application Serial No. 13/426347: Process and apparatus for the selective...
Cybernetic Basis and System Practice of Remote Sensing and Spatial Information Science
NASA Astrophysics Data System (ADS)
Tan, X.; Jing, X.; Chen, R.; Ming, Z.; He, L.; Sun, Y.; Sun, X.; Yan, L.
2017-09-01
Cybernetics provides a new set of ideas and methods for the study of modern science, and it has been fully applied in many areas. However, few studies have introduced cybernetics into the field of remote sensing. Based on the imaging process of the remote sensing system, this paper introduces cybernetics into the field of remote sensing and establishes a space-time closed-loop control theory for the actual operation of remote sensing. This makes the spatial information process coherent and improves the overall efficiency of spatial information from acquisition and processing through transformation to application. We describe the application of cybernetics not only to remote sensing platform control, sensor control, and data processing control, but also to control of the whole remote sensing imaging process. Output information is fed back to the input to control the efficient operation of the entire system. This combination of cybernetics and remote sensing science will raise remote sensing to a higher level.
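As a toy illustration of the closed-loop idea (a quality measure of the output fed back to adjust an acquisition parameter), the sketch below adjusts a sensor gain until the mean brightness of the acquired image reaches a target. The plant model, gain, and numbers are entirely illustrative and are not the control theory developed in the paper.

```python
# Minimal feedback-loop sketch: output brightness is fed back to adjust
# sensor gain so the acquisition chain converges on a target level.
def acquire_image_mean(gain: float, scene_radiance: float = 40.0) -> float:
    """Toy imaging model: recorded brightness grows with sensor gain."""
    return min(255.0, gain * scene_radiance)

def closed_loop_acquisition(target_mean: float = 120.0,
                            k_p: float = 0.01,
                            steps: int = 20) -> float:
    gain = 1.0
    for step in range(steps):
        mean = acquire_image_mean(gain)
        error = target_mean - mean            # feedback from output to input
        gain += k_p * error                   # proportional correction
        print(f"step {step:2d}: gain={gain:.3f} mean={mean:.1f}")
    return gain

if __name__ == "__main__":
    closed_loop_acquisition()
```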
MarsSI: Martian surface data processing information system
NASA Astrophysics Data System (ADS)
Quantin-Nataf, C.; Lozac'h, L.; Thollot, P.; Loizeau, D.; Bultel, B.; Fernando, J.; Allemand, P.; Dubuffet, F.; Poulet, F.; Ody, A.; Clenet, H.; Leyrat, C.; Harrisson, S.
2018-01-01
MarsSI (acronym for Mars System of Information, https://emars.univ-lyon1.fr/MarsSI/) is a web Geographic Information System application which helps manage and process martian orbital data. The MarsSI facility is part of the web portal called PSUP (Planetary SUrface Portal) developed by the Observatories of Paris Sud (OSUPS) and Lyon (OSUL) to provide users with efficient and easy access to data products dedicated to the martian surface. The portal offers 1) the management and processing of data through MarsSI and 2) the visualization and merging of high-level (imagery, spectral, and topographic) products and catalogs via a web-based user interface (MarsVisu). The PSUP portal and the MarsVisu facility are detailed in a companion paper (Poulet et al., 2018). The purpose of this paper is to describe the MarsSI facility. From this application, users are able to easily and rapidly select observations, process raw data via automatic pipelines, and retrieve final products which can be visualized in Geographic Information Systems. Moreover, MarsSI also contains an automatic stereo-restitution pipeline in order to produce Digital Terrain Models (DTM) on demand from HiRISE (High Resolution Imaging Science Experiment) or CTX (Context Camera) image pairs. This application is funded by the European Union's Seventh Framework Programme (FP7/2007-2013) (ERC project eMars, No. 280168) and has been developed in the scope of Mars, but the design is applicable to any other planetary body of the solar system.
[Computerized system validation of clinical researches].
Yan, Charles; Chen, Feng; Xia, Jia-lai; Zheng, Qing-shan; Liu, Daniel
2015-11-01
Validation is a documented process that provides a high degree of assurance that a computer system does exactly and consistently what it is designed to do, in a controlled manner, throughout its life cycle. The validation process begins with the system proposal/requirements definition and continues through application and maintenance until system retirement and the retention of e-records in accordance with regulatory rules. The objective is to clearly establish that each application of information technology fulfills its purpose. Computer system validation (CSV) is essential in clinical studies according to the GCP standard, ensuring that a product meets its predetermined attributes of specification, quality, safety and traceability. This paper describes how to perform the validation process and determine the relevant stakeholders within an organization in light of validation SOPs. Although specific accountabilities in the implementation of the validation process might be outsourced, the ultimate responsibility for CSV remains with the business process owner, the sponsor. In order to show that compliance in system validation has been properly attained, it is essential to set up comprehensive validation procedures and maintain adequate documentation as well as training records. The quality of system validation should be controlled using both QC and QA measures.
An Integrated Computerized Triage System in the Emergency Department
Aronsky, Dominik; Jones, Ian; Raines, Bill; Hemphill, Robin; Mayberry, Scott R; Luther, Melissa A; Slusser, Ted
2008-01-01
Emergency department (ED) triage is a fast-paced process that prioritizes the allocation of limited health care resources to patients in greatest need. This paper describes experiences with an integrated, computerized triage application. The system exchanges information with other information systems, including the ED patient tracking board, the longitudinal electronic medical record, the computerized provider order entry system, and the medication reconciliation application. The application includes decision support capabilities such as assessing the patient’s acuity level, age-dependent alerts for vital signs, and clinical reminders. The browser-based system utilizes the institution’s controlled vocabulary; improves data completeness and quality, such as compliance with capturing required data elements and screening questions; initiates clinical processes, such as pneumococcal vaccination ordering and reminders to start clinical pathways; issues alerts for clinical trial eligibility; and facilitates various reporting needs. The system has supported the triage documentation of >290,000 pediatric and adult patients. PMID:18999190
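The age-dependent vital-sign alerting mentioned above can be sketched as a simple rule lookup: acceptable ranges vary by age band, and out-of-range values raise a triage alert. The bands and thresholds below are illustrative placeholders, not clinical rules from the described system.

```python
# Illustrative age-dependent heart-rate alert check.
from typing import Optional

AGE_BANDS = [
    # (max_age_years, low_heart_rate, high_heart_rate) -- placeholder values
    (1,    100, 160),
    (5,     80, 140),
    (12,    70, 120),
    (200,   50, 110),
]

def heart_rate_alert(age_years: float, heart_rate: int) -> Optional[str]:
    for max_age, low, high in AGE_BANDS:
        if age_years <= max_age:
            if heart_rate < low:
                return f"LOW heart rate for age {age_years}: {heart_rate} (< {low})"
            if heart_rate > high:
                return f"HIGH heart rate for age {age_years}: {heart_rate} (> {high})"
            return None
    return None

if __name__ == "__main__":
    for age, hr in [(0.5, 170), (8, 95), (35, 120)]:
        alert = heart_rate_alert(age, hr)
        print(f"age {age}, HR {hr}:", alert or "within expected range")
```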
Long-pulse-width narrow-bandwidth solid state laser
Dane, C. Brent; Hackel, Lloyd A.
1997-01-01
A long pulse laser system emits 500-1000 ns quasi-rectangular pulses at 527 nm with near diffraction-limited divergence and near transform-limited bandwidth. The system consists of one or more flashlamp-pumped Nd:glass zig-zag amplifiers, a very low threshold stimulated-Brillouin-scattering (SBS) phase conjugator system, and a free-running single frequency Nd:YLF master oscillator. Completely passive polarization switching provides eight amplifier gain passes. Multiple frequency output can be generated by using SBS cells having different pressures of a gaseous SBS medium or different SBS materials. This long pulse, low divergence, narrow-bandwidth, multi-frequency output laser system is ideally suited for use as an illuminator for long range speckle imaging applications. Because of its high average power and high beam quality, this system has application in any process which would benefit from a long pulse format, including material processing and medical applications.
Long-pulse-width narrow-bandwidth solid state laser
Dane, C.B.; Hackel, L.A.
1997-11-18
A long pulse laser system emits 500-1000 ns quasi-rectangular pulses at 527 nm with near diffraction-limited divergence and near transform-limited bandwidth. The system consists of one or more flashlamp-pumped Nd:glass zig-zag amplifiers, a very low threshold stimulated-Brillouin-scattering (SBS) phase conjugator system, and a free-running single frequency Nd:YLF master oscillator. Completely passive polarization switching provides eight amplifier gain passes. Multiple frequency output can be generated by using SBS cells having different pressures of a gaseous SBS medium or different SBS materials. This long pulse, low divergence, narrow-bandwidth, multi-frequency output laser system is ideally suited for use as an illuminator for long range speckle imaging applications. Because of its high average power and high beam quality, this system has application in any process which would benefit from a long pulse format, including material processing and medical applications. 5 figs.
Big Data Analysis of Manufacturing Processes
NASA Astrophysics Data System (ADS)
Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert
2015-11-01
The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.
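One simple way such a self-learning assistance system can flag anomalies is to learn a running baseline of a process signal during normal operation and report samples that deviate by more than a threshold. The z-score detector below illustrates only that idea; it is a hedged sketch, not the algorithms developed in the paper, and the signal and threshold are made up.

```python
# Streaming baseline anomaly detector (Welford running mean/variance + z-score).
import math

class RunningAnomalyDetector:
    def __init__(self, threshold_sigmas: float = 3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0            # sum of squared deviations (Welford)
        self.threshold = threshold_sigmas

    def update(self, value: float) -> bool:
        """Return True if `value` is anomalous relative to what was seen so far."""
        anomalous = False
        if self.n >= 30:         # only judge once a baseline has been learned
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(value - self.mean) > self.threshold * std:
                anomalous = True
        self.n += 1
        delta = value - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (value - self.mean)
        return anomalous

if __name__ == "__main__":
    detector = RunningAnomalyDetector()
    signal = [20.0 + 0.1 * (i % 5) for i in range(100)] + [35.0]  # spike at the end
    flags = [i for i, v in enumerate(signal) if detector.update(v)]
    print("anomalous sample indices:", flags)
```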
Monolithic silicon-photonic platforms in state-of-the-art CMOS SOI processes [Invited].
Stojanović, Vladimir; Ram, Rajeev J; Popović, Milos; Lin, Sen; Moazeni, Sajjad; Wade, Mark; Sun, Chen; Alloatti, Luca; Atabaki, Amir; Pavanello, Fabio; Mehta, Nandish; Bhargava, Pavan
2018-05-14
Integrating photonics with advanced electronics leverages transistor performance, process fidelity and package integration, to enable a new class of systems-on-a-chip for a variety of applications ranging from computing and communications to sensing and imaging. Monolithic silicon photonics is a promising solution to meet the energy efficiency, sensitivity, and cost requirements of these applications. In this review paper, we take a comprehensive view of the performance of the silicon-photonic technologies developed to date for photonic interconnect applications. We also present the latest performance and results of our "zero-change" silicon photonics platforms in 45 nm and 32 nm SOI CMOS. The results indicate that the 45 nm and 32 nm processes provide a "sweet-spot" for adding photonic capability and enhancing integrated system applications beyond the Moore-scaling, while being able to offload major communication tasks from more deeply-scaled compute and memory chips without complicated 3D integration approaches.
Otolaryngology residency selection process. Medical student perspective.
Stringer, S P; Cassisi, N J; Slattery, W H
1992-04-01
In an effort to improve the otolaryngology matching process at the University of Florida, Gainesville, we sought to obtain the medical student's perspective of the current system. All students who interviewed here over a 3-year period were surveyed regarding the application, interview, and ranking process. In addition, suggestions for improving the system were sought from the students. The application and interviewing patterns of the students surveyed were found to be similar to those of the entire otolaryngology residency applicant pool. We were unable to identify any factors that influence a student's rank list that could be prospectively used to help select applicants for interview. A variety of suggestions for improvements in the match were received, several of which could easily be instituted. A uniform interview invitation date as requested by the students could be rapidly implemented and would provide benefits for both the students and the residency programs.
Geochemistry and the understanding of ground-water systems
Glynn, Pierre D.; Plummer, Niel
2005-01-01
Geochemistry has contributed significantly to the understanding of ground-water systems over the last 50 years. Historic advances include development of the hydrochemical facies concept, application of equilibrium theory, investigation of redox processes, and radiocarbon dating. Other hydrochemical concepts, tools, and techniques have helped elucidate mechanisms of flow and transport in ground-water systems, and have helped unlock an archive of paleoenvironmental information. Hydrochemical and isotopic information can be used to interpret the origin and mode of ground-water recharge, refine estimates of time scales of recharge and ground-water flow, decipher reactive processes, provide paleohydrological information, and calibrate ground-water flow models. Progress needs to be made in obtaining representative samples. Improvements are needed in the interpretation of the information obtained, and in the construction and interpretation of numerical models utilizing hydrochemical data. The best approach will ensure an optimized iterative process between field data collection and analysis, interpretation, and the application of forward, inverse, and statistical modeling tools. Advances are anticipated from microbiological investigations, the characterization of natural organics, isotopic fingerprinting, applications of dissolved gas measurements, and the fields of reaction kinetics and coupled processes. A thermodynamic perspective is offered that could facilitate the comparison and understanding of the multiple physical, chemical, and biological processes affecting ground-water systems.
Heat pipes for low-humidity applications
NASA Technical Reports Server (NTRS)
Khattar, Mukesh K.
1989-01-01
A novel application of an air-to-air heat pipe heat exchanger (HPHX) in a cooling and dehumidification process of an air-conditioning system is described which provides significant energy savings in applications requiring reheat of cold supply air to maintain low humidity. The efficiency of the system has been demonstrated in an application requiring a humidity of 40 percent. The use of the HPHX and fine tuning of the air-conditioning system and controls has resulted in significant energy savings. The technology can be advantageously used in many low-humidity applications commonly encountered in high-tech and aerospace facilities.
42 CFR 432.50 - FFP: Staffing and training costs.
Code of Federal Regulations, 2014 CFR
2014-10-01
... directly in the operation of mechanized claims processing and information retrieval systems, the rate is 75... processing and information retrieval systems, the rate is 50 percent for training and 90 percent for all... information retrieval systems (paragraphs (b)(2) and (3) of this section) are applicable only if the design...
42 CFR 432.50 - FFP: Staffing and training costs.
Code of Federal Regulations, 2013 CFR
2013-10-01
... directly in the operation of mechanized claims processing and information retrieval systems, the rate is 75... processing and information retrieval systems, the rate is 50 percent for training and 90 percent for all... information retrieval systems (paragraphs (b)(2) and (3) of this section) are applicable only if the design...
42 CFR 432.50 - FFP: Staffing and training costs.
Code of Federal Regulations, 2010 CFR
2010-10-01
... directly in the operation of mechanized claims processing and information retrieval systems, the rate is 75... processing and information retrieval systems, the rate is 50 percent for training and 90 percent for all... information retrieval systems (paragraphs (b)(2) and (3) of this section) are applicable only if the design...
42 CFR 432.50 - FFP: Staffing and training costs.
Code of Federal Regulations, 2011 CFR
2011-10-01
... directly in the operation of mechanized claims processing and information retrieval systems, the rate is 75... processing and information retrieval systems, the rate is 50 percent for training and 90 percent for all... information retrieval systems (paragraphs (b)(2) and (3) of this section) are applicable only if the design...
42 CFR 432.50 - FFP: Staffing and training costs.
Code of Federal Regulations, 2012 CFR
2012-10-01
... directly in the operation of mechanized claims processing and information retrieval systems, the rate is 75... processing and information retrieval systems, the rate is 50 percent for training and 90 percent for all... information retrieval systems (paragraphs (b)(2) and (3) of this section) are applicable only if the design...
EDExpress Pell Training, 2000-2001.
ERIC Educational Resources Information Center
Department of Education, Washington, DC. Student Financial Assistance.
This training manual is intended for higher education institutions that process Federal Pell Grants under a new system called the recipient financial management system (RFMS). The RFMS system is part of the electronic data exchange process which allows schools to send and receive Title IV student financial aid application data to and from the…
PCIPS 2.0: Powerful multiprofile image processing implemented on PCs
NASA Technical Reports Server (NTRS)
Smirnov, O. M.; Piskunov, N. E.
1992-01-01
Over the years, the processing power of personal computers has steadily increased. Now, 386- and 486-based PCs are fast enough for many image processing applications, and inexpensive enough even for amateur astronomers. PCIPS is an image processing system based on these platforms that was designed to satisfy a broad range of data analysis needs, while requiring minimum hardware and providing maximum expandability. It will run (albeit at a slow pace) even on an 80286 with 640K of memory, but will take full advantage of larger memory and faster CPUs. Because the actual image processing is performed by external modules, the system can be easily upgraded by the user for all sorts of scientific data analysis. PCIPS supports large-format 1D and 2D images in any numeric type from 8-bit integer to 64-bit floating point. The images can be displayed, overlaid, and printed, and any part of the data examined via an intuitive graphical user interface that employs buttons, pop-up menus, and a mouse. PCIPS automatically converts images between different types and sizes to satisfy the requirements of various applications. PCIPS features an API that lets users develop custom applications in C or FORTRAN. While doing so, a programmer can concentrate on the actual data processing, because PCIPS assumes responsibility for accessing images and interacting with the user. This also ensures that all applications, even custom ones, have a consistent and user-friendly interface. The API is compatible with factory programming, a metaphor for constructing image processing procedures that will be implemented in future versions of the system. Several application packages have been created under PCIPS. The basic package includes elementary arithmetic and statistics, geometric transformations, and import/export in various formats (FITS, binary, ASCII, and GIF). The CCD processing package and the spectral analysis package were successfully used to reduce spectra from the Nordic Telescope at La Palma. A photometry package is also available, and other packages are being developed. A multitasking version of PCIPS that utilizes the factory programming concept is currently under development. This version will remain compatible (at the source code level) with existing application packages and custom applications.
Mission Use of the SpaceCube Hybrid Data Processing System
NASA Technical Reports Server (NTRS)
Petrick, Dave
2017-01-01
The award-winning SpaceCube v2.0 system is a high performance, reconfigurable, hybrid data processing system that can be used in a multitude of applications including those that require a radiation hardened and reliable solution. This presentation provides an overview of the design architecture, flexibility, and the advantages of the modular SpaceCube v2.0 high performance data processing system for space applications. The current state of the proven SpaceCube technology is based on 11 years of engineering and operations. Eight systems have been successfully operated in space starting in 2008 with eight more to be delivered for payload integration in 2018 in support of various missions. This presentation will highlight how this multipurpose system is currently being used to solve design challenges of a variety of independent applications. The SpaceCube hardware adapts to new system requirements by allowing for application-unique interface cards that are utilized by reconfiguring the underlying programmable elements on the core processor card. We will show how this system is being used to improve on a heritage NASA GPS technology, enable a cutting-edge LiDAR instrument, and serve as a typical command and data handling (CDH) computer for a space robotics technology demonstration. Finally, this presentation will highlight the use of the SpaceCube v2.0 system on the Restore-L robotic satellite servicing mission. SpaceCube v2.0 is the central avionics responsible for the real-time vision system and autonomous robotic control necessary to find, capture, and service a national asset weather satellite.
A Methodology and a Web Platform for the Collaborative Development of Context-Aware Systems
Martín, David; López-de-Ipiña, Diego; Alzua-Sorzabal, Aurkene; Lamsfus, Carlos; Torres-Manzanera, Emilio
2013-01-01
Information and services personalization is essential for an optimal user experience. Systems have to be able to acquire data about the user's context, process them in order to identify the user's situation and finally, adapt the functionality of the system to that situation, but the development of context-aware systems is complex. Data coming from distributed and heterogeneous sources have to be acquired, processed and managed. Several programming frameworks have been proposed in order to simplify the development of context-aware systems. These frameworks offer high-level application programming interfaces for programmers that complicate the involvement of domain experts in the development life-cycle. The participation of users that do not have programming skills but are experts in the application domain can speed up and improve the development process of these kinds of systems. Apart from that, there is a lack of methodologies to guide the development process. This article presents as main contributions, the implementation and evaluation of a web platform and a methodology to collaboratively develop context-aware systems by programmers and domain experts. PMID:23666131
Arc spray process for the aircraft and stationary gas turbine industry
NASA Astrophysics Data System (ADS)
Sampson, E. R.; Zwetsloot, M. P.
1997-06-01
Technological advances in arc spray have produced a system that competes favorably with other thermal spray processes. In the past, arc spray was thought of as a process for very large parts that need thick buildups. However, an attachment device known as the arc jet system has been developed that focuses the pattern and accelerates the particles. This attachment device, coupled with the introduction of metal-cored wires that provide the same chemistries as plasma-sprayed powders, provides application engineers with a viable economic alternative to existing spray methods. A comparative evaluation of a standard production plasma spray system was conducted with the arc spray process using the attachment device. This evaluation was conducted by an airline company on four major parts coated with nickel-aluminum. Results show that, for these applications, the arc spray process offers several benefits.
Low cost open data acquisition system for biomedical applications
NASA Astrophysics Data System (ADS)
Zabolotny, Wojciech M.; Laniewski-Wollk, Przemyslaw; Zaworski, Wojciech
2005-09-01
In biomedical applications it is often necessary to collect measurement data from different devices. This is relatively easy if the devices are equipped with a MIB or Ethernet interface; however, they often feature only an asynchronous serial link, and sometimes the measured values are available only as analog signals. The system presented in the paper is a low cost alternative to commercially available data acquisition systems. The hardware and software architecture of the system is fully open, so it is possible to customize it for particular needs. The presented system offers various ways to connect it to a computer-based data processing unit, e.g. via USB or Ethernet ports. Both interfaces also allow many such systems to be used in parallel to increase the number of serial and analog inputs. The open source software used in the system makes it possible to process the acquired data with standard tools such as MATLAB, Scilab or Octave, or with a dedicated, user-supplied application.
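The host-side acquisition loop implied by the abstract might look like the following Python sketch, which reads ASCII samples from a serial-attached front end and hands them to NumPy for analysis. The port name, baud rate, line format and the use of the pyserial package are assumptions, not details from the paper.

```python
# Minimal sketch of host-side acquisition from a serial front end. The port
# name, baud rate and "one numeric value per line" format are assumptions.
import numpy as np
import serial  # pyserial

def acquire(port="/dev/ttyUSB0", baud=115200, n_samples=1000):
    samples = []
    with serial.Serial(port, baudrate=baud, timeout=1.0) as link:
        while len(samples) < n_samples:
            line = link.readline().decode("ascii", errors="ignore").strip()
            if line:
                samples.append(float(line))   # assumed line format
    return np.asarray(samples)

if __name__ == "__main__":
    data = acquire()
    print("mean:", data.mean(), "std:", data.std())
```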
Convergent spray process for environmentally friendly coatings
NASA Technical Reports Server (NTRS)
Scarpa, Jack
1995-01-01
Conventional spray application processes have poor transfer efficiencies, resulting in an exorbitant loss in materials, solvents, and time. Also, with ever-tightening Environmental Protection Agency (EPA) regulations and Occupational Safety and Health Administration requirements, the low transfer efficiencies have a significant impact on the quantities of materials and solvents that are released into the environment. High solids spray processes are also limited by material viscosities, thus requiring many passes over the surface to achieve a thickness in the 0.125-inch range. This results in high application costs and a negative impact on the environment. Until recently, requirements for a 100 percent solid sprayable, environmentally friendly, lightweight thermal protection system that can be applied in a thick (greater than 0.125 inch) single-pass operation exceeded the capability of existing systems. Such coatings must be applied by hand lay-up techniques, especially for thermal and/or fire protection systems. The current formulation of these coatings has presented many problems such as worker safety, environmental hazards, waste, high cost, and application constraints. A system which can apply coatings without using hazardous materials would alleviate many of these problems. Potential applications include aerospace thermal protective specialty coatings and the chemical and petroleum industries, which require fire-protection coatings that resist impact, chemicals, and weather. These markets can be penetrated by offering customized coatings applied by automated processes that are environmentally friendly.
Kleidon, A.
2010-01-01
The Earth system is remarkably different from its planetary neighbours in that it shows pronounced, strong global cycling of matter. These global cycles result in the maintenance of a unique thermodynamic state of the Earth's atmosphere which is far from thermodynamic equilibrium (TE). Here, I provide a simple introduction of the thermodynamic basis to understand why Earth system processes operate so far away from TE. I use a simple toy model to illustrate the application of non-equilibrium thermodynamics and to classify applications of the proposed principle of maximum entropy production (MEP) to such processes into three different cases of contrasting flexibility in the boundary conditions. I then provide a brief overview of the different processes within the Earth system that produce entropy, review actual examples of MEP in environmental and ecological systems, and discuss the role of interactions among dissipative processes in making boundary conditions more flexible. I close with a brief summary and conclusion. PMID:20368248
Kleidon, A
2010-05-12
The Earth system is remarkably different from its planetary neighbours in that it shows pronounced, strong global cycling of matter. These global cycles result in the maintenance of a unique thermodynamic state of the Earth's atmosphere which is far from thermodynamic equilibrium (TE). Here, I provide a simple introduction of the thermodynamic basis to understand why Earth system processes operate so far away from TE. I use a simple toy model to illustrate the application of non-equilibrium thermodynamics and to classify applications of the proposed principle of maximum entropy production (MEP) to such processes into three different cases of contrasting flexibility in the boundary conditions. I then provide a brief overview of the different processes within the Earth system that produce entropy, review actual examples of MEP in environmental and ecological systems, and discuss the role of interactions among dissipative processes in making boundary conditions more flexible. I close with a brief summary and conclusion.
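For readers unfamiliar with the MEP idea, a generic two-box heat-transport illustration (not necessarily the toy model used in the paper) can be sketched numerically: a flux F carries energy from a warm to a cold reservoir whose temperatures respond linearly to F, and the associated entropy production peaks at an intermediate flux. All parameter values below are assumptions chosen for illustration.

```python
# Generic two-box illustration of maximum entropy production (MEP).
import numpy as np

T_HOT0, T_COLD0 = 300.0, 250.0   # boundary temperatures at F = 0 (assumed)
K = 0.5                          # linear temperature response to the flux (assumed)

def entropy_production(F):
    t_hot = T_HOT0 - K * F       # transport cools the warm box ...
    t_cold = T_COLD0 + K * F     # ... and warms the cold box
    return F * (1.0 / t_cold - 1.0 / t_hot)

F = np.linspace(0.0, 49.0, 500)
sigma = entropy_production(F)
best = F[np.argmax(sigma)]
print(f"MEP flux ~ {best:.1f} W m^-2, entropy production {sigma.max():.4f} W m^-2 K^-1")
```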
NASA Technical Reports Server (NTRS)
Choi, H. J.; Su, Y. T.
1986-01-01
The User Constraint Measurement System (UCMS) is a hardware/software package developed by NASA Goddard to measure the signal parameter constraints of the user transponder in the TDRSS environment by means of an all-digital signal sampling technique. An account is presently given of the features of UCMS design and of its performance capabilities and applications; attention is given to such important aspects of the system as RF interface parameter definitions, hardware minimization, the emphasis on offline software signal processing, and end-to-end link performance. Applications to the measurement of other signal parameters are also discussed.
Graphics Processing Units for HEP trigger systems
NASA Astrophysics Data System (ADS)
Ammendola, R.; Bauce, M.; Biagioni, A.; Chiozzi, S.; Cotta Ramusino, A.; Fantechi, R.; Fiorini, M.; Giagu, S.; Gianoli, A.; Lamanna, G.; Lonardo, A.; Messina, A.; Neri, I.; Paolucci, P. S.; Piandani, R.; Pontisso, L.; Rescigno, M.; Simula, F.; Sozzi, M.; Vicini, P.
2016-07-01
General-purpose computing on GPUs (Graphics Processing Units) is emerging as a new paradigm in several fields of science, although so far applications have been tailored to the specific strengths of such devices as accelerators in offline computation. With the steady reduction of GPU latencies, and the increase in link and memory throughput, the use of such devices for real-time applications in high-energy physics data acquisition and trigger systems is becoming ripe. We will discuss the use of online parallel computing on GPUs for a synchronous low-level trigger, focusing on the CERN NA62 experiment trigger system. The use of GPUs in higher-level trigger systems is also briefly considered.
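The kind of data-parallel selection that maps well onto GPUs can be illustrated with a schematic low-level trigger in Python/NumPy: every event in a batch is accepted if its summed detector response exceeds a threshold. This is only a sketch of the programming pattern, not the NA62 trigger algorithm, and the array shapes and threshold are assumptions.

```python
# Schematic, vectorized level-0 trigger decision over a batch of events.
import numpy as np

def level0_trigger(hits, threshold=50.0):
    """hits: (n_events, n_channels) array of per-channel amplitudes."""
    energy = hits.sum(axis=1)          # one reduction per event, all in parallel
    return energy > threshold          # boolean accept mask

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    batch = rng.exponential(scale=2.0, size=(10_000, 32))
    accepted = level0_trigger(batch)
    print(f"accepted {accepted.sum()} of {len(batch)} events")
```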
NASA Technical Reports Server (NTRS)
Stackpoole, Margaret; Gusman, M.; Ellerby, D.; Johnson, S. M.; Arnold, Jim (Technical Monitor)
2001-01-01
The Thermal Protection Materials and Systems Branch at NASA Ames Research Center is involved in the development of a class of refractory oxidation-resistant diboride composites termed Ultra High Temperature Ceramics or UHTCs. These composites have good high temperature properties making them candidate materials for thermal protection system (TPS) applications. The current research focuses on improving processing methods to develop more reliable composites with enhanced thermal and mechanical properties. This presentation will concentrate on the processing of ZrB2/SiC composites. Some preliminary mechanical properties and oxidation data will also be presented.
Remote hardware-reconfigurable robotic camera
NASA Astrophysics Data System (ADS)
Arias-Estrada, Miguel; Torres-Huitzil, Cesar; Maya-Rueda, Selene E.
2001-10-01
In this work, a camera with integrated image processing capabilities is discussed. The camera is based on an imager coupled to an FPGA device (Field Programmable Gate Array) which contains an architecture for real-time computer vision low-level processing. The architecture can be reprogrammed remotely for application specific purposes. The system is intended for rapid modification and adaptation for inspection and recognition applications, with the flexibility of hardware and software reprogrammability. FPGA reconfiguration allows the same ease of upgrade in hardware as a software upgrade process. The camera is composed of a digital imager coupled to an FPGA device, two memory banks, and a microcontroller. The microcontroller is used for communication tasks and FPGA programming. The system implements a software architecture to handle multiple FPGA architectures in the device, and the possibility to download a software/hardware object from the host computer into its internal context memory. System advantages are: small size, low power consumption, and a library of hardware/software functionalities that can be exchanged during run time. The system has been validated with an edge detection and a motion processing architecture, which will be presented in the paper. Applications targeted are in robotics, mobile robotics, and vision based quality control.
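A software analogue of the low-level vision stage that the FPGA architecture implements in hardware is sketched below: a Sobel gradient-magnitude edge detector applied to one frame. The FPGA would stream pixels through fixed logic; here NumPy/SciPy stand in for that datapath, and the threshold value is an assumption.

```python
# Sobel gradient-magnitude edge detection as a stand-in for the FPGA datapath.
import numpy as np
from scipy import ndimage

def edge_map(frame, threshold=0.25):
    gx = ndimage.sobel(frame.astype(float), axis=1)
    gy = ndimage.sobel(frame.astype(float), axis=0)
    mag = np.hypot(gx, gy)
    mag /= mag.max() + 1e-12          # normalise to [0, 1]
    return mag > threshold            # binary edge map

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    frame = rng.random((64, 64))
    frame[20:40, 20:40] += 2.0        # a bright square to produce edges
    print("edge pixels:", int(edge_map(frame).sum()))
```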
Mini-Ckpts: Surviving OS Failures in Persistent Memory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fiala, David; Mueller, Frank; Ferreira, Kurt Brian
Concern is growing in the high-performance computing (HPC) community over the reliability of future extreme-scale systems. Current efforts have focused on application fault-tolerance rather than the operating system (OS), despite the fact that recent studies have suggested that failures in OS memory are more likely. The OS is critical to the correct and efficient operation of the node and the processes it governs -- and in HPC also to any other nodes a parallelized application runs on and communicates with: any single node failure generally forces all processes of this application to terminate due to tight communication in HPC. Therefore, the OS itself must be capable of tolerating failures. In this work, we introduce mini-ckpts, a framework which enables application survival despite the occurrence of a fatal OS failure or crash. Mini-ckpts achieves this tolerance by ensuring that the critical data describing a process is preserved in persistent memory prior to the failure. Following the failure, the OS is rejuvenated via a warm reboot and the application continues execution, effectively making the failure and restart transparent. The mini-ckpts rejuvenation and recovery process is measured to take between three and six seconds and has a failure-free overhead of 3-5% for a number of key HPC workloads. In contrast to current fault-tolerance methods, this work ensures that the operating and runtime system can continue in the presence of faults. This is a much finer-grained and dynamic method of fault-tolerance than the current, coarse-grained, application-centric methods. Handling faults at this level has the potential to greatly reduce overheads and enables mitigation of additional fault scenarios.
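The checkpoint/restore pattern behind mini-ckpts can be illustrated at user level with the following Python sketch, which persists the critical state of a computation and resumes from it after a restart. Mini-ckpts itself protects kernel process state in persistent memory; this file-based version, with its invented paths and state layout, only conveys the idea.

```python
# User-level checkpoint/restore sketch; paths and state layout are assumptions.
import os
import pickle

CKPT = "app_state.ckpt"

def save_state(state):
    tmp = CKPT + ".tmp"
    with open(tmp, "wb") as fh:
        pickle.dump(state, fh)
    os.replace(tmp, CKPT)             # atomic swap so a crash never leaves half a file

def load_state():
    if os.path.exists(CKPT):
        with open(CKPT, "rb") as fh:
            return pickle.load(fh)
    return {"iteration": 0, "accumulator": 0.0}

if __name__ == "__main__":
    state = load_state()              # transparent restart if a checkpoint exists
    for i in range(state["iteration"], 1000):
        state["accumulator"] += i * 0.001
        state["iteration"] = i + 1
        if i % 100 == 0:
            save_state(state)
    print("done:", state["accumulator"])
```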
Chandler, Jacqueline; Rycroft-Malone, Jo; Hawkes, Claire; Noyes, Jane
2016-02-01
To examine the application of core concepts from Complexity Theory to explain the findings from a process evaluation undertaken in a trial evaluating implementation strategies for recommendations about reducing surgical fasting times. The proliferation of evidence-based guidance requires a greater focus on its implementation. Theory is required to explain the complex processes across the multiple healthcare organizational levels. This social healthcare context involves the interaction between professionals, patients and the organizational systems in care delivery. Complexity Theory may provide an explanatory framework to explain the complexities inherent in implementation in social healthcare contexts. A secondary thematic analysis of qualitative process evaluation data informed by Complexity Theory. Seminal texts applying Complexity Theory to the social context were annotated, key concepts extracted and core Complexity Theory concepts identified. These core concepts were applied as a theoretical lens to provide an explanation of themes from a process evaluation of a trial evaluating the implementation of strategies to reduce surgical fasting times. Sampled substantive texts provided a representative spread of theoretical development and application of Complexity Theory from the late 1990s to 2013 in social science, healthcare, management and philosophy. Five Complexity Theory core concepts extracted were 'self-organization', 'interaction', 'emergence', 'system history' and 'temporality'. Application of these concepts suggests routine surgical fasting practice is habituated in the social healthcare system and therefore it cannot easily be reversed. A reduction in fasting times requires an incentivised new approach to emerge in the surgical system's priority of completing the operating list. The application of Complexity Theory provides a useful explanation for resistance to changing fasting practice. Its utility in implementation research warrants further attention and evaluation. © 2015 John Wiley & Sons Ltd.
Image processing system performance prediction and product quality evaluation
NASA Technical Reports Server (NTRS)
Stein, E. K.; Hammill, H. B. (Principal Investigator)
1976-01-01
The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.
Smart Sensor Systems for Aerospace Applications: From Sensor Development to Application Testing
NASA Technical Reports Server (NTRS)
Hunter, G. W.; Xu, J. C.; Dungan, L. K.; Ward, B. J.; Rowe, S.; Williams, J.; Makel, D. B.; Liu, C. C.; Chang, C. W.
2008-01-01
The application of Smart Sensor Systems for aerospace applications is a multidisciplinary process consisting of sensor element development, element integration into Smart Sensor hardware, and testing of the resulting sensor systems in application environments. This paper provides a cross-section of these activities for multiple aerospace applications illustrating the technology challenges involved. The development and application testing topics discussed are: 1) The broadening of sensitivity and operational range of silicon carbide (SiC) Schottky gas sensor elements; 2) Integration of fire detection sensor technology into a "Lick and Stick" Smart Sensor hardware platform for Crew Exploration Vehicle applications; 3) Extended testing for zirconia based oxygen sensors in the basic "Lick and Stick" platform for environmental monitoring applications. It is concluded that both core sensor platform technology and a basic hardware platform can enhance the viability of implementing smart sensor systems in aerospace applications.
AOIPS water resources data management system
NASA Technical Reports Server (NTRS)
Merritt, E. S.; Shotwell, R. L.; Place, M. C.; Belknap, N. J.
1976-01-01
A geocoded data management system applicable for hydrological applications was designed to demonstrate the utility of the Atmospheric and Oceanographic Information Processing System (AOIPS) for hydrological applications. Within that context, the geocoded hydrology data management system was designed to take advantage of the interactive capability of the AOIPS hardware. Portions of the Water Resource Data Management System which best demonstrate the interactive nature of the hydrology data management system were implemented on the AOIPS. A hydrological case study was prepared using all data supplied for the Bear River watershed located in northwest Utah, southeast Idaho, and western Wyoming.
NASA Technical Reports Server (NTRS)
Harper, Richard
1989-01-01
In a fault-tolerant parallel computer, a functional programming model can facilitate distributed checkpointing, error recovery, load balancing, and graceful degradation. Such a model has been implemented on the Draper Fault-Tolerant Parallel Processor (FTPP). When used in conjunction with the FTPP's fault detection and masking capabilities, this implementation results in a graceful degradation of system performance after faults. Three graceful degradation algorithms have been implemented and are presented. A user interface has been implemented which requires minimal cognitive overhead by the application programmer, masking such complexities as the system's redundancy, distributed nature, variable complement of processing resources, load balancing, fault occurrence and recovery. This user interface is described and its use demonstrated. The applicability of the functional programming style to the Activation Framework, a paradigm for intelligent systems, is then briefly described.
Fuzzy logic applications to expert systems and control
NASA Technical Reports Server (NTRS)
Lea, Robert N.; Jani, Yashvant
1991-01-01
A considerable amount of work on the development of fuzzy logic algorithms and application to space related control problems has been done at the Johnson Space Center (JSC) over the past few years. Particularly, guidance control systems for space vehicles during proximity operations, learning systems utilizing neural networks, control of data processing during rendezvous navigation, collision avoidance algorithms, camera tracking controllers, and tether controllers have been developed utilizing fuzzy logic technology. Several other areas in which fuzzy sets and related concepts are being considered at JSC are diagnostic systems, control of robot arms, pattern recognition, and image processing. It has become evident, based on the commercial applications of fuzzy technology in Japan and China during the last few years, that this technology should be exploited by the government as well as private industry for energy savings.
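A toy illustration of the fuzzy-control ideas surveyed above is given below in Python: triangular membership functions fuzzify a tracking error, and a weighted (centroid-style) combination of rule outputs produces a crisp command. The rule base is invented for illustration and is far simpler than the JSC applications described.

```python
# Toy fuzzy controller: fuzzify an error, apply three rules, defuzzify.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def fuzzy_command(error):
    # degrees of membership of the error in three linguistic sets
    neg, zero, pos = tri(error, -2, -1, 0), tri(error, -1, 0, 1), tri(error, 0, 1, 2)
    # each rule maps a linguistic error to a crisp command suggestion
    commands = np.array([+1.0, 0.0, -1.0])          # push right, hold, push left
    weights = np.array([neg, zero, pos])
    return float((weights * commands).sum() / (weights.sum() + 1e-12))

if __name__ == "__main__":
    for e in (-1.5, -0.3, 0.0, 0.8):
        print(f"error {e:+.1f} -> command {fuzzy_command(e):+.2f}")
```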
Prevention Guidance for Isocyanate-Induced Asthma Using Occupational Surveillance Data
Reeb-Whitaker, Carolyn; Anderson, Naomi J.; Bonauto, David K.
2013-01-01
Data from Washington State's work-related asthma surveillance system were used to characterize isocyanate-induced asthma cases occurring from 1999 through 2010. Injured worker interviews and medical records were used to describe the industry, job title, work process, workers’ compensation cost, and exposure trends associated with 27 cases of isocyanate-induced asthma. The majority (81%) of cases were classified within the surveillance system as new-onset asthma while 19% were classified as work-aggravated asthma. The workers’ compensation cost for isocyanate-induced asthma cases was $1.7 million; this was 14% of the total claims cost for all claims in the asthma surveillance system. The majority of cases (48%) occurred from paint processes, followed by foam application or foam manufacturing (22%). Nine of the asthma cases associated with spray application occurred during application to large or awkward-shaped objects. Six workers who did not directly handle isocyanates (indirect exposure) developed new-onset asthma. Two cases suggest that skin contact and processes secondary to the isocyanate spray application, such as cleanup, contributed to immune sensitization. Surveillance data provide insight for the prevention of isocyanate-induced respiratory disease. Key observations are made regarding the development of work-related asthma in association with a) paint application on large objects difficult to ventilate, b) indirect exposure to isocyanates, c) exposure during secondary or cleanup processes, and d) reports of dermal exposure. PMID:24116665
Prevention guidance for isocyanate-induced asthma using occupational surveillance data.
Reeb-Whitaker, Carolyn; Anderson, Naomi J; Bonauto, David K
2013-01-01
Data from Washington State's work-related asthma surveillance system were used to characterize isocyanate-induced asthma cases occurring from 1999 through 2010. Injured worker interviews and medical records were used to describe the industry, job title, work process, workers' compensation cost, and exposure trends associated with 27 cases of isocyanate-induced asthma. The majority (81%) of cases were classified within the surveillance system as new-onset asthma while 19% were classified as work-aggravated asthma. The workers' compensation cost for isocyanate-induced asthma cases was $1.7 million; this was 14% of the total claims cost for all claims in the asthma surveillance system. The majority of cases (48%) occurred from paint processes, followed by foam application or foam manufacturing (22%). Nine of the asthma cases associated with spray application occurred during application to large or awkward-shaped objects. Six workers who did not directly handle isocyanates (indirect exposure) developed new-onset asthma. Two cases suggest that skin contact and processes secondary to the isocyanate spray application, such as cleanup, contributed to immune sensitization. Surveillance data provide insight for the prevention of isocyanate-induced respiratory disease. Key observations are made regarding the development of work-related asthma in association with a) paint application on large objects difficult to ventilate, b) indirect exposure to isocyanates, c) exposure during secondary or cleanup processes, and d) reports of dermal exposure.
Code of Federal Regulations, 2012 CFR
2012-01-01
... the safety element for which the safety approval is sought. (ii) Engineering design and analyses that... TRANSPORTATION LICENSING SAFETY APPROVALS Application Procedures § 414.11 Application. (a) The application must...) Safety element (i.e., launch vehicle, reentry vehicle, safety system, process, service, or any identified...
Code of Federal Regulations, 2013 CFR
2013-01-01
... the safety element for which the safety approval is sought. (ii) Engineering design and analyses that... TRANSPORTATION LICENSING SAFETY APPROVALS Application Procedures § 414.11 Application. (a) The application must...) Safety element (i.e., launch vehicle, reentry vehicle, safety system, process, service, or any identified...
Code of Federal Regulations, 2014 CFR
2014-01-01
... the safety element for which the safety approval is sought. (ii) Engineering design and analyses that... TRANSPORTATION LICENSING SAFETY APPROVALS Application Procedures § 414.11 Application. (a) The application must...) Safety element (i.e., launch vehicle, reentry vehicle, safety system, process, service, or any identified...
Attractive and repulsive magnetic suspension systems overview
NASA Technical Reports Server (NTRS)
Cope, David B.; Fontana, Richard R.
1992-01-01
Magnetic suspension systems can be used in a wide variety of applications. The decision of whether to use an attractive or repulsive suspension system for a particular application is a fundamental one which must be made during the design process. As an aid to the designer, we compare and contrast attractive and repulsive magnetic suspension systems and indicate whether and under what conditions one or the other system is preferred.
Application Processing, 2003-2004. EDExpress Training. Participant Guide.
ERIC Educational Resources Information Center
Office of Student Financial Assistance (ED), Washington, DC.
This participant guide contains training materials for processing applications for student financial aid under the EDExpress system. Representatives of institutions of higher education receive training in the use of EDExpress software that allows the school to manage student financial aid records. The guide contains these sessions: (1) welcome and…
2016-01-07
news. Both of these resemble typical activities of intelligence analysts in OSINT processing and production applications. We assessed two task...intelligence analysts in a number of OSINT processing and production applications. (5) Summary of the most important results In both settings
7 CFR 1780.35 - Processing office review.
Code of Federal Regulations, 2012 CFR
2012-01-01
... assistance when the debt service portion of the average annual EDU cost, for users in the applicant's service... AGRICULTURE (CONTINUED) WATER AND WASTE LOANS AND GRANTS Loan and Grant Application Processing § 1780.35... office in accordance with the following provisions and will not result in EDU costs below similar system...
7 CFR 1780.35 - Processing office review.
Code of Federal Regulations, 2011 CFR
2011-01-01
... assistance when the debt service portion of the average annual EDU cost, for users in the applicant's service... AGRICULTURE (CONTINUED) WATER AND WASTE LOANS AND GRANTS Loan and Grant Application Processing § 1780.35... office in accordance with the following provisions and will not result in EDU costs below similar system...
7 CFR 1780.35 - Processing office review.
Code of Federal Regulations, 2013 CFR
2013-01-01
... assistance when the debt service portion of the average annual EDU cost, for users in the applicant's service... AGRICULTURE (CONTINUED) WATER AND WASTE LOANS AND GRANTS Loan and Grant Application Processing § 1780.35... office in accordance with the following provisions and will not result in EDU costs below similar system...
7 CFR 1780.35 - Processing office review.
Code of Federal Regulations, 2010 CFR
2010-01-01
... assistance when the debt service portion of the average annual EDU cost, for users in the applicant's service... AGRICULTURE (CONTINUED) WATER AND WASTE LOANS AND GRANTS Loan and Grant Application Processing § 1780.35... office in accordance with the following provisions and will not result in EDU costs below similar system...
7 CFR 1780.35 - Processing office review.
Code of Federal Regulations, 2014 CFR
2014-01-01
... assistance when the debt service portion of the average annual EDU cost, for users in the applicant's service... AGRICULTURE (CONTINUED) WATER AND WASTE LOANS AND GRANTS Loan and Grant Application Processing § 1780.35... office in accordance with the following provisions and will not result in EDU costs below similar system...
Seismic signal processing on heterogeneous supercomputers
NASA Astrophysics Data System (ADS)
Gokhberg, Alexey; Ermert, Laura; Fichtner, Andreas
2015-04-01
The processing of seismic signals - including the correlation of massive ambient noise data sets - represents an important part of a wide range of seismological applications. It is characterized by large data volumes as well as high computational input/output intensity. Development of efficient approaches towards seismic signal processing on emerging high performance computing systems is therefore essential. Heterogeneous supercomputing systems introduced in the recent years provide numerous computing nodes interconnected via high throughput networks, every node containing a mix of processing elements of different architectures, like several sequential processor cores and one or a few graphical processing units (GPU) serving as accelerators. A typical representative of such computing systems is "Piz Daint", a supercomputer of the Cray XC 30 family operated by the Swiss National Supercomputing Center (CSCS), which we used in this research. Heterogeneous supercomputers provide an opportunity for manifold application performance increase and are more energy-efficient, however they have much higher hardware complexity and are therefore much more difficult to program. The programming effort may be substantially reduced by the introduction of modular libraries of software components that can be reused for a wide class of seismology applications. The ultimate goal of this research is design of a prototype for such library suitable for implementing various seismic signal processing applications on heterogeneous systems. As a representative use case we have chosen an ambient noise correlation application. Ambient noise interferometry has developed into one of the most powerful tools to image and monitor the Earth's interior. Future applications will require the extraction of increasingly small details from noise recordings. To meet this demand, more advanced correlation techniques combined with very large data volumes are needed. This poses new computational problems that require dedicated HPC solutions. The chosen application is using a wide range of common signal processing methods, which include various IIR filter designs, amplitude and phase correlation, computing the analytic signal, and discrete Fourier transforms. Furthermore, various processing methods specific for seismology, like rotation of seismic traces, are used. Efficient implementation of all these methods on the GPU-accelerated systems represents several challenges. In particular, it requires a careful distribution of work between the sequential processors and accelerators. Furthermore, since the application is designed to process very large volumes of data, special attention had to be paid to the efficient use of the available memory and networking hardware resources in order to reduce intensity of data input and output. In our contribution we will explain the software architecture as well as principal engineering decisions used to address these challenges. We will also describe the programming model based on C++ and CUDA that we used to develop the software. Finally, we will demonstrate performance improvements achieved by using the heterogeneous computing architecture. This work was supported by a grant from the Swiss National Supercomputing Centre (CSCS) under project ID d26.
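One ambient-noise correlation step of the kind described above can be sketched compactly in Python/SciPy: band-pass filter two station recordings and cross-correlate them. On the heterogeneous system these kernels would be distributed across CPU cores and GPUs; the sampling rate, frequency band and trace length below are assumptions.

```python
# Band-pass filtering and cross-correlation of two synthetic noise traces.
import numpy as np
from scipy import signal

FS = 20.0                                       # samples per second (assumed)

def noise_correlation(trace_a, trace_b, band=(0.1, 1.0)):
    b, a = signal.butter(4, band, btype="bandpass", fs=FS)
    fa = signal.filtfilt(b, a, trace_a)
    fb = signal.filtfilt(b, a, trace_b)
    cc = signal.correlate(fa, fb, mode="full")
    lags = signal.correlation_lags(len(fa), len(fb), mode="full") / FS
    return lags, cc

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    common = rng.standard_normal(4096)           # shared noise source
    a = common + 0.5 * rng.standard_normal(4096)
    b = np.roll(common, 40) + 0.5 * rng.standard_normal(4096)   # delayed copy
    lags, cc = noise_correlation(a, b)
    print("peak correlation at lag (s):", lags[np.argmax(cc)])
```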
48 CFR 9904.414-50 - Techniques for application.
Code of Federal Regulations, 2010 CFR
2010-10-01
... the case of process cost accounting systems, the contracting parties may agree to substitute an.... 9904.414-50 Section 9904.414-50 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.414-50 Techniques for application. (a) The investment...
Applications of active microwave imagery
NASA Technical Reports Server (NTRS)
Weber, F. P.; Childs, L. F.; Gilbert, R.; Harlan, J. C.; Hoffer, R. M.; Miller, J. M.; Parsons, J.; Polcyn, F.; Schardt, B. B.; Smith, J. L.
1978-01-01
The following topics were discussed in reference to active microwave applications: (1) Use of imaging radar to improve the data collection/analysis process; (2) Data collection tasks for radar that other systems will not perform; (3) Data reduction concepts; and (4) System and vehicle parameters: aircraft and spacecraft.
Herrera-May, Agustín Leobardo; Soler-Balcazar, Juan Carlos; Vázquez-Leal, Héctor; Martínez-Castillo, Jaime; Vigueras-Zuñiga, Marco Osvaldo; Aguilera-Cortés, Luz Antonio
2016-08-24
Microelectromechanical systems (MEMS) resonators have allowed the development of magnetic field sensors with potential applications such as biomedicine, the automotive industry, navigation systems, space satellites, telecommunications and non-destructive testing. We present a review of recent magnetic field sensors based on MEMS resonators, which operate with Lorentz force. These sensors have a compact structure, wide measurement range, low energy consumption, high sensitivity and suitable performance. The design methodology, simulation tools, damping sources, sensing techniques and future applications of magnetic field sensors are discussed. The design process is fundamental in achieving correct selection of the operation principle, sensing technique, materials, fabrication process and readout systems of the sensors. In addition, the main sensing systems and challenges of the MEMS sensors are discussed. To develop the best devices, research on their mechanical reliability, vacuum packaging, design optimization and temperature compensation circuits is needed. Future applications will require multifunctional sensors for monitoring several physical parameters (e.g., magnetic field, acceleration, angular ratio, humidity, temperature and gases).
Herrera-May, Agustín Leobardo; Soler-Balcazar, Juan Carlos; Vázquez-Leal, Héctor; Martínez-Castillo, Jaime; Vigueras-Zuñiga, Marco Osvaldo; Aguilera-Cortés, Luz Antonio
2016-01-01
Microelectromechanical systems (MEMS) resonators have allowed the development of magnetic field sensors with potential applications such as biomedicine, the automotive industry, navigation systems, space satellites, telecommunications and non-destructive testing. We present a review of recent magnetic field sensors based on MEMS resonators, which operate with Lorentz force. These sensors have a compact structure, wide measurement range, low energy consumption, high sensitivity and suitable performance. The design methodology, simulation tools, damping sources, sensing techniques and future applications of magnetic field sensors are discussed. The design process is fundamental in achieving correct selection of the operation principle, sensing technique, materials, fabrication process and readout systems of the sensors. In addition, the main sensing systems and challenges of the MEMS sensors are discussed. To develop the best devices, research on their mechanical reliability, vacuum packaging, design optimization and temperature compensation circuits is needed. Future applications will require multifunctional sensors for monitoring several physical parameters (e.g., magnetic field, acceleration, angular ratio, humidity, temperature and gases). PMID:27563912
Warehouses information system design and development
NASA Astrophysics Data System (ADS)
Darajatun, R. A.; Sukanta
2017-12-01
The materials/goods handling function is fundamental for companies to ensure the smooth running of their warehouses, and efficiency and organization within every aspect of the business are essential in order to gain a competitive advantage. The purpose of this research is the design and development of a Kanban-based inventory storage and delivery system. The application aims to make inventory stock checks more efficient and effective. Users can easily record finished goods from the production department, warehouse, customers, and suppliers. The master data are designed to be as complete as possible so that the application can be used across a variety of warehouse logistics processes. The author uses the Java programming language to develop the application as a Java Web application, with MySQL as the database. The system development methodology used is the Waterfall methodology, which comprises the stages of Analysis, System Design, Implementation, Integration, and Operation and Maintenance. Data were collected through observation, interviews, and a literature review.
On a production system using default reasoning for pattern classification
NASA Technical Reports Server (NTRS)
Barry, Matthew R.; Lowe, Carlyle M.
1990-01-01
This paper addresses an unconventional application of a production system to a problem involving belief specialization. The production system reduces a large quantity of low-level descriptions into just a few higher-level descriptions that encompass the problem space in a more tractable fashion. This classification process utilizes a set of descriptions generated by combining the component hierarchy of a physical system with the semantics of the terminology employed in its operation. The paper describes an application of this process in a program, constructed in C and CLIPS, that classifies signatures of electromechanical system configurations. The program compares two independent classifications, describing the actual and expected system configurations, in order to generate a set of contradictions between the two.
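A toy forward-chaining sketch of the reduction described above is shown below: many low-level descriptions of an electromechanical configuration are collapsed into a few higher-level ones by production rules. The facts and rules are invented for illustration; the original system was built in C and CLIPS.

```python
# Forward-chaining reduction of low-level facts to higher-level descriptions.
LOW_LEVEL = {"valve_a_open", "valve_b_open", "pump_1_on", "pressure_nominal"}

RULES = [
    ({"valve_a_open", "valve_b_open"}, "flow_path_established"),
    ({"flow_path_established", "pump_1_on"}, "coolant_loop_active"),
    ({"coolant_loop_active", "pressure_nominal"}, "system_configuration_nominal"),
]

def classify(facts):
    facts = set(facts)
    changed = True
    while changed:                       # keep firing rules until quiescence
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

if __name__ == "__main__":
    print(sorted(classify(LOW_LEVEL)))
```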
NASA Technical Reports Server (NTRS)
Clayton, C.; Raley, R.; Zook, L.
2001-01-01
The solid rocket booster (SRB) has historically used a chromate conversion coating prior to protective finish application. After conversion coating, an organic paint system consisting of a chromated epoxy primer and polyurethane topcoat is applied. An overall systems approach was selected to reduce waste generation from the coatings application and removal processes. While the most obvious waste reduction opportunity involved elimination of the chromate conversion coating, several other coating system configurations were explored in an attempt to reduce the total waste. This paper will briefly discuss the use of a systems view to reduce waste generation from the coating process and present the results of the qualification testing of nonchromated aluminum pretreatments and alternate coating systems configurations.
Pattern recognition and expert image analysis systems in biomedical image processing (Invited Paper)
NASA Astrophysics Data System (ADS)
Oosterlinck, A.; Suetens, P.; Wu, Q.; Baird, M.; F. M., C.
1987-09-01
This paper gives an overview of pattern recognition (P.R.) techniques used in biomedical image processing and problems related to the different P.R. solutions. The use of knowledge-based systems to overcome P.R. difficulties is also described. This is illustrated by a common example of a biomedical image processing application.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaskell, D.R.; Hager, J.P.; Hoffmann, J.E.
1987-01-01
This book contains papers that cover the following topics: high intensity smelting, novel aspects of gold recovery, resin membrane applications in hydrometallurgy, process analysis and characterization, fundamental studies in pyrometallurgical systems, advances in electroextraction, new process chemistry, process engineering in pyrometallurgical systems, and developments in hydrometallurgy.
Knowledge Reasoning with Semantic Data for Real-Time Data Processing in Smart Factory
Wang, Shiyong; Li, Di; Liu, Chengliang
2018-01-01
The application of high-bandwidth networks and cloud computing in manufacturing systems will be followed by mass data. Industrial data analysis plays important roles in condition monitoring, performance optimization, flexibility, and transparency of the manufacturing system. However, the currently existing architectures are mainly for offline data analysis, not suitable for real-time data processing. In this paper, we first define the smart factory as a cloud-assisted and self-organized manufacturing system in which physical entities such as machines, conveyors, and products organize production through intelligent negotiation and the cloud supervises this self-organized process for fault detection and troubleshooting based on data analysis. Then, we propose a scheme to integrate knowledge reasoning and semantic data where the reasoning engine processes the ontology model with real time semantic data coming from the production process. Based on these ideas, we build a benchmarking system for smart candy packing application that supports direct consumer customization and flexible hybrid production, and the data are collected and processed in real time for fault diagnosis and statistical analysis. PMID:29415444
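The integration of an ontology-style model with streaming production data can be sketched as follows: static triples describe the machines and their limits, and a simple rule checks live readings against those limits to flag faults. All device names, properties and values are invented for illustration and are not from the benchmarking system described.

```python
# Minimal sketch: ontology-style triples plus a rule over live readings.
ONTOLOGY = [
    ("conveyor_1", "isA", "Conveyor"),
    ("conveyor_1", "maxTemperatureC", 70.0),
    ("packer_1", "isA", "PackingMachine"),
    ("packer_1", "maxTemperatureC", 55.0),
]

def limit_of(device, prop):
    for s, p, o in ONTOLOGY:
        if s == device and p == prop:
            return o
    return None

def detect_faults(readings):
    """readings: iterable of (device, temperature) pairs from the shop floor."""
    faults = []
    for device, temp in readings:
        limit = limit_of(device, "maxTemperatureC")
        if limit is not None and temp > limit:
            faults.append((device, temp, limit))
    return faults

if __name__ == "__main__":
    live = [("conveyor_1", 64.0), ("packer_1", 58.5)]
    for device, temp, limit in detect_faults(live):
        print(f"fault: {device} at {temp} C exceeds limit {limit} C")
```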
Process control systems at Homer City coal preparation plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shell, W.P.
1983-03-01
An important part of process control engineering is the implementation of the basic control system design through commissioning to routine operation. This is a period when basic concepts can be reviewed and improvements either implemented or recorded for application in future systems. The experience of commissioning the process control systems in the Homer City coal cleaning plant are described and discussed. The current level of operating control performance in individual sections and the overall system are also reported and discussed.
Developments in metallic materials for aerospace applications
NASA Astrophysics Data System (ADS)
Wadsworth, J.; Froes, F. H.
1989-05-01
High-performance aerospace systems are creating a demand for new materials, not only for airframe and engine applications, but for missile and space systems as well. Recently, advances have been made in metallic materials systems based on magnesium, aluminum, titanium and niobium using a variety of processing methods, including ingot casting, powder metallurgy, rapid solidification and composite technology.
Evaluation of the Intel iWarp parallel processor for space flight applications
NASA Technical Reports Server (NTRS)
Hine, Butler P., III; Fong, Terrence W.
1993-01-01
The potential of a DARPA-sponsored advanced processor, the Intel iWarp, for use in future SSF Data Management Systems (DMS) upgrades is evaluated through integration into the Ames DMS testbed and applications testing. The iWarp is a distributed, parallel computing system well suited for high performance computing applications such as matrix operations and image processing. The system architecture is modular, supports systolic and message-based computation, and is capable of providing massive computational power in a low-cost, low-power package. As a consequence, the iWarp offers significant potential for advanced space-based computing. This research seeks to determine the iWarp's suitability as a processing device for space missions. In particular, the project focuses on evaluating the ease of integrating the iWarp into the SSF DMS baseline architecture and the iWarp's ability to support computationally stressing applications representative of SSF tasks.
OpenDA Open Source Generic Data Assimilation Environment and its Application in Process Models
NASA Astrophysics Data System (ADS)
El Serafy, Ghada; Verlaan, Martin; Hummel, Stef; Weerts, Albrecht; Dhondia, Juzer
2010-05-01
Data Assimilation techniques are essential elements in state-of-the-art development of models and their optimization with data in the field of groundwater, surface water and soil systems. They are essential tools in calibration of complex modelling systems and improvement of model forecasts. OpenDA is a new and generic open source data assimilation environment for application to a choice of physical process models, applied to case dependent domains. OpenDA was introduced recently when the developers of Costa, an open-source TU Delft project [http://www.costapse.org; Van Velzen and Verlaan; 2007] and those of the DATools from the former WL|Delft Hydraulics [El Serafy et al 2007; Weerts et al. 2009] decided to join forces. OpenDA makes use of a set of interfaces that describe the interaction between models, observations and data assimilation algorithms. It focuses on flexible applications in portable systems for modelling geophysical processes. It provides a generic interfacing protocol that allows combination of the implemented data assimilation techniques with, in principle, any time-stepping model describing a process (atmospheric processes, 3D circulation, 2D water level, sea surface temperature, soil systems, groundwater etc.). Presently, OpenDA features filtering techniques and calibration techniques. The presentation will give an overview of OpenDA and the results of some of its practical applications. Application of data assimilation in portable operational forecasting systems—the DATools assimilation environment, El Serafy G.Y., H. Gerritsen, S. Hummel, A. H. Weerts, A.E. Mynett and M. Tanaka (2007), Journal of Ocean Dynamics, DOI 10.1007/s10236-007-0124-3, pp. 485-499. COSTA a problem solving environment for data assimilation applied for hydrodynamical modelling, Van Velzen and Verlaan (2007), Meteorologische Zeitschrift, Volume 16, Number 6, December 2007, pp. 777-793(17). Application of generic data assimilation tools (DATools) for flood forecasting purposes, A.H. Weerts, G.Y.H. El Serafy, S. Hummel, J. Dhondia, and H. Gerritsen (2009), accepted by Geoscience & Computers.
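A minimal scalar Kalman-filter update, the kind of filtering building block a generic assimilation environment such as OpenDA wires to a time-stepping model, is sketched below. The linear model, noise levels and synthetic observations are invented for illustration; this is not OpenDA code.

```python
# Scalar Kalman filter: forecast with a linear model, then assimilate an observation.
import numpy as np

A = 0.9                                   # linear model coefficient (assumed)

def kalman_step(x_prev, P_prev, obs, Q=0.01, R=0.04):
    # forecast step
    x_f = A * x_prev
    P_f = A * A * P_prev + Q
    # analysis step (observation operator H = 1)
    K = P_f / (P_f + R)
    x_a = x_f + K * (obs - x_f)
    P_a = (1.0 - K) * P_f
    return x_a, P_a

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    truth, x, P = 1.0, 0.0, 1.0
    for _ in range(20):
        truth = A * truth + rng.normal(0, 0.1)     # synthetic "true" process
        y = truth + rng.normal(0, 0.2)             # noisy observation
        x, P = kalman_step(x, P, y)
    print(f"final estimate {x:.3f} vs truth {truth:.3f}")
```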
Porting the AVS/Express scientific visualization software to Cray XT4.
Leaver, George W; Turner, Martin J; Perrin, James S; Mummery, Paul M; Withers, Philip J
2011-08-28
Remote scientific visualization, where rendering services are provided by larger scale systems than are available on the desktop, is becoming increasingly important as dataset sizes increase beyond the capabilities of desktop workstations. Uptake of such services relies on access to suitable visualization applications and the ability to view the resulting visualization in a convenient form. We consider five rules from the e-Science community to meet these goals with the porting of a commercial visualization package to a large-scale system. The application uses message-passing interface (MPI) to distribute data among data processing and rendering processes. The use of MPI in such an interactive application is not compatible with restrictions imposed by the Cray system being considered. We present details, and performance analysis, of a new MPI proxy method that allows the application to run within the Cray environment yet still support MPI communication required by the application. Example use cases from materials science are considered.
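The data-distribution pattern mentioned above (data scattered to processing/rendering processes and partial results gathered back) can be sketched with mpi4py as follows. This is neither AVS/Express code nor the paper's MPI-proxy implementation; it only illustrates the underlying scatter/gather pattern and would be launched with, e.g., mpirun.

```python
# Generic scatter/gather sketch of distributing data among render/processing ranks.
# Run with e.g.: mpirun -n 4 python this_file.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    data = np.arange(size * 4, dtype=float).reshape(size, 4)   # one chunk per rank
    chunks = [row for row in data]
else:
    chunks = None

chunk = comm.scatter(chunks, root=0)      # distribute data volumes
partial = float(chunk.sum())              # stand-in for per-node rendering work
results = comm.gather(partial, root=0)    # collect partial results

if rank == 0:
    print("combined result:", sum(results))
```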
System Considerations and Challenges in 3D Mapping and Modeling Using Low-Cost UAV Systems
NASA Astrophysics Data System (ADS)
Lari, Z.; El-Sheimy, N.
2015-08-01
In the last few years, low-cost UAV systems have been acknowledged as an affordable technology for geospatial data acquisition that can meet the needs of a variety of traditional and non-traditional mapping applications. In spite of its proven potential, UAV-based mapping is still lacking in terms of what is needed for it to become an acceptable mapping tool. In other words, a well-designed system architecture that considers payload restrictions as well as the specifications of the utilized direct geo-referencing component and the imaging systems in light of the required mapping accuracy and intended application is still required. Moreover, efficient data processing workflows, which are capable of delivering the mapping products with the specified quality while considering the synergistic characteristics of the sensors onboard, the wide range of potential users who might lack deep knowledge in mapping activities, and the time constraints of emerging applications, still need to be adopted. Therefore, the challenges introduced by having low-cost imaging and georeferencing sensors onboard UAVs with limited payload capability, the necessity of efficient data processing techniques for delivering the required products for the intended applications, and the diversity of potential users with insufficient mapping-related expertise need to be fully investigated and addressed by UAV-based mapping research efforts. This paper addresses these challenges and reviews system considerations, adaptive processing techniques, and quality assurance/quality control procedures for achieving accurate mapping products from these systems.
Schwessinger, Benjamin; Li, Xiang; Ellinghaus, Thomas L; Chan, Leanne Jade G; Wei, Tong; Joe, Anna; Thomas, Nicholas; Pruitt, Rory; Adams, Paul D; Chern, Maw Sheng; Petzold, Christopher J; Liu, Chang C; Ronald, Pamela C
2016-04-18
Posttranslational modification (PTM) of proteins and peptides is important for diverse biological processes in plants and animals. The paucity of heterologous expression systems for PTMs and the technical challenges associated with chemical synthesis of these modified proteins has limited detailed molecular characterization and therapeutic applications. Here we describe an optimized system for expression of tyrosine-sulfated proteins in Escherichia coli and its application in a bio-based crop protection strategy in rice.
Schwessinger, Benjamin; Li, Xiang; Ellinghaus, Thomas L.; ...
2015-11-27
Posttranslational modification (PTM) of proteins and peptides is important for diverse biological processes in plants and animals. The paucity of heterologous expression systems for PTMs and the technical challenges associated with chemical synthesis of these modified proteins has limited detailed molecular characterization and therapeutic applications. Here we describe an optimized system for expression of tyrosine-sulfated proteins in Escherichia coli and its application in a bio-based crop protection strategy in rice.
NASA Technical Reports Server (NTRS)
Gevarter, W. B.
1983-01-01
Artificial Intelligence (AI) is an emerging technology that has recently attracted considerable attention. Many applications are now under development. This report, Part B of a three part report on AI, presents overviews of the key application areas: Expert Systems, Computer Vision, Natural Language Processing, Speech Interfaces, and Problem Solving and Planning. The basic approaches to such systems, the state-of-the-art, existing systems and future trends and expectations are covered.
Software/hardware distributed processing network supporting the Ada environment
NASA Astrophysics Data System (ADS)
Wood, Richard J.; Pryk, Zen
1993-09-01
A high-performance, fault-tolerant, distributed network has been developed, tested, and demonstrated. The network is based on the MIPS Computer Systems, Inc. R3000 RISC for processing, VHSIC ASICs for high speed, reliable, inter-node communications, and compatible commercial memory and I/O boards. The network is an evolution of the Advanced Onboard Signal Processor (AOSP) architecture. It supports Ada application software with an Ada-implemented operating system. A six-node implementation (capable of expansion up to 256 nodes) of the RISC multiprocessor architecture provides 120 MIPS of scalar throughput, 96 Mbytes of RAM and 24 Mbytes of non-volatile memory. The network provides for all ground processing applications, has merit as a space-qualified RISC-based network, and interfaces to advanced Computer Aided Software Engineering (CASE) tools for application software development.
An Intelligent Pictorial Information System
NASA Astrophysics Data System (ADS)
Lee, Edward T.; Chang, B.
1987-05-01
In examining the history of computer application, we discover that early computer systems were developed primarily for applications related to scientific computation, as in weather prediction, aerospace applications, and nuclear physics applications. At this stage, the computer system served as a big calculator to perform, in the main, manipulation of numbers. Then it was found that computer systems could also be used for business applications, information storage and retrieval, word processing, and report generation. The history of computer application is summarized in Table I. The complexity of pictures makes picture processing much more difficult than number and alphanumerical processing. Therefore, new techniques, new algorithms, and above all, new pictorial knowledge, [1] are needed to overcome the limitations of existing computer systems. New frontiers in designing computer systems are the ways to handle the representation, [2,3] classification, manipulation, processing, storage, and retrieval of pictures. Especially, the ways to deal with similarity measures and the meaning of the word "approximate" and the phrase "approximate reasoning" are an important and indispensable part of an intelligent pictorial information system. [4,5] The main objective of this paper is to investigate the mathematical foundation for the effective organization and efficient retrieval of pictures in similarity-directed pictorial databases, [6] based on similarity retrieval techniques [7] and fuzzy languages [8]. The main advantage of this approach is that similar pictures are stored logically close to each other by using quantitative similarity measures. Thus, for answering queries, the amount of picture data that needs to be searched can be reduced and the retrieval time can be improved. In addition, in a pictorial database, very often it is desired to find pictures (or feature vectors, histograms, etc.) that are most similar to or most dissimilar [9] to a test picture (or feature vector). Using similarity measures, one can not only store similar pictures logically or physically close to each other in order to improve retrieval or updating efficiency, one can also use such similarity measures to answer fuzzy queries involving nonexact retrieval conditions. In this paper, similarity-directed pictorial databases involving geometric figures, chromosome images, [10] leukocyte images, cardiomyopathy images, and satellite images [11] are presented as illustrative examples.
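Similarity-directed retrieval as described above can be sketched with grey-level histograms and a quantitative similarity measure; histogram intersection is used below as one reasonable choice and is not necessarily the measure used in the paper.

```python
# Histogram-based similarity retrieval over a toy picture database.
import numpy as np

def histogram(image, bins=16):
    h, _ = np.histogram(image, bins=bins, range=(0.0, 1.0), density=False)
    return h / h.sum()

def similarity(h1, h2):
    return np.minimum(h1, h2).sum()       # histogram intersection in [0, 1]

def most_similar(query, database, k=3):
    hq = histogram(query)
    scored = [(name, similarity(hq, histogram(img))) for name, img in database.items()]
    return sorted(scored, key=lambda t: t[1], reverse=True)[:k]

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    db = {f"pic_{i}": rng.random((32, 32)) ** (1 + i) for i in range(6)}
    query = rng.random((32, 32)) ** 2
    for name, score in most_similar(query, db):
        print(name, round(score, 3))
```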
Integration of image capture and processing: beyond single-chip digital camera
NASA Astrophysics Data System (ADS)
Lim, SukHwan; El Gamal, Abbas
2001-05-01
An important trend in the design of digital cameras is the integration of capture and processing onto a single CMOS chip. Although integrating the components of a digital camera system onto a single chip significantly reduces system size and power, it does not fully exploit the potential advantages of integration. We argue that a key advantage of integration is the ability to exploit the high-speed imaging capability of the CMOS image sensor to enable new applications such as multiple capture for enhancing dynamic range and to improve the performance of existing applications such as optical flow estimation. Conventional digital cameras operate at low frame rates and it would be too costly, if not infeasible, to operate their chips at high frame rates. Integration solves this problem. The idea is to capture images at much higher frame rates than the standard frame rate, process the high frame rate data on chip, and output the video sequence and the application specific data at standard frame rate. This idea is applied to optical flow estimation, where significant performance improvements are demonstrated over methods using standard frame rate sequences. We then investigate the constraints on memory size and processing power that can be integrated with a CMOS image sensor in a 0.18 micrometer process and below. We show that enough memory and processing power can be integrated to be able to not only perform the functions of a conventional camera system but also to perform applications such as real-time optical flow estimation.
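The multiple-capture idea can be illustrated numerically: several short exposures taken at a high frame rate are combined into one frame with extended dynamic range. The weighting scheme below (use only non-saturated captures, weighted by exposure time) is a common textbook approach and an assumption, not the paper's specific algorithm.

```python
# Combine several saturating short exposures into one wide-dynamic-range estimate.
import numpy as np

FULL_WELL = 1.0                       # saturation level of a single capture (assumed)

def combine_captures(scene, exposure_times):
    estimates, weights = [], []
    for t in exposure_times:
        frame = np.clip(scene * t, 0.0, FULL_WELL)        # saturating capture
        valid = frame < 0.95 * FULL_WELL                   # ignore saturated pixels
        estimates.append(np.where(valid, frame / t, 0.0))  # back to scene units
        weights.append(valid * t)                          # longer exposure = less noise
    estimates, weights = np.array(estimates), np.array(weights)
    return (estimates * weights).sum(axis=0) / np.maximum(weights.sum(axis=0), 1e-12)

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    scene = rng.uniform(0.01, 10.0, size=(8, 8))           # wide-dynamic-range scene
    hdr = combine_captures(scene, exposure_times=[1.0, 0.25, 0.0625])
    print("max relative error:", float(np.max(np.abs(hdr - scene) / scene)))
```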
Application of rumen microorganisms for anaerobic bioconversion of lignocellulosic biomass.
Yue, Zheng-Bo; Li, Wen-Wei; Yu, Han-Qing
2013-01-01
Rumen in the mammalian animals is a natural cellulose-degrading system and the microorganisms inside have been found to be able to effectively digest lignocellulosic biomass. Furthermore, methane or volatile fatty acids, which could be further converted to other biofuels, are the two major products in such a system. This paper offers an overview of recent development in the application of rumen microorganisms for lignocellulosic biomass conversion. Application of recent molecular tools in the analysis of rumen microbial community, progress in the development of artificial rumen reactors, the latest research results about characterizing rumen-dominated anaerobic digestion process and energy products are summarized. Also, the potential application of such a rumen-dominated process is discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.
Dual-Use Space Technology Transfer Conference and Exhibition. Volume 1
NASA Technical Reports Server (NTRS)
Krishen, Kumar (Compiler)
1994-01-01
This document contains papers presented at the Dual-Use Space Technology Transfer Conference and Exhibition held at the Johnson Space Center February 1-3, 1994. Possible technology transfers covered during the conference were in the areas of information access; innovative microwave and optical applications; materials and structures; marketing and barriers; intelligent systems; human factors and habitation; communications and data systems; business process and technology transfer; software engineering; biotechnology and advanced bioinstrumentation; communications signal processing and analysis; new ways of doing business; medical care; applications derived from control center data systems; human performance evaluation; technology transfer methods; mathematics, modeling, and simulation; propulsion; software analysis and decision tools; systems/processes in human support technology; networks, control centers, and distributed systems; power; rapid development perception and vision technologies; integrated vehicle health management; automation technologies; advanced avionics; and robotics technologies. More than 77 papers, 20 presentations, and 20 exhibits covering various disciplines were presented by experts from NASA, universities, and industry.
Near infrared (NIR) spectroscopy for in-line monitoring of polymer extrusion processes.
Rohe, T; Becker, W; Kölle, S; Eisenreich, N; Eyerer, P
1999-09-13
In recent years, near infrared (NIR) spectroscopy has become an analytical tool frequently used in many chemical production processes. In particular, on-line measurements are of interest to increase process stability and to document constant product quality. Application to polymer processing, e.g. polymer extrusion, could further increase product quality. Interesting parameters are the composition of the processed polymer, moisture, or reaction status in reactive extrusion. For this purpose, a transmission sensor was developed to apply NIR spectroscopy to extrusion processes. This sensor includes fibre optic probes and a measuring cell that can be adapted to various extruders for in-line measurements. In contrast to infrared sensors, it only uses optical quartz components. Extrusion processes at temperatures up to 300 degrees C and pressures up to 37 MPa have been investigated. Application of multivariate data analysis (e.g. partial least squares, PLS) demonstrated the performance of the system with respect to process monitoring: in the case of polymer blending, deviations between predicted and actual polymer composition were quite low (in the range of +/-0.25%). The complete system is thus suitable for harsh industrial environments and could lead to improved polymer extrusion processes.
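The following hedged Python sketch shows how a PLS calibration of the kind described here is typically built and applied; the synthetic spectra, wavelength grid, and component count are illustrative assumptions, not the authors' calibration data.

```python
# A minimal sketch (not the authors' calibration) of using partial least squares (PLS)
# regression to predict polymer composition from NIR spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
# Hypothetical training data: 40 NIR spectra (200 wavelength channels each)
# with known blend composition in percent.
composition = rng.uniform(0, 100, size=40)
spectra = np.outer(composition, np.linspace(0.5, 1.5, 200)) + rng.normal(0, 0.5, (40, 200))

pls = PLSRegression(n_components=3)
pls.fit(spectra, composition)

# Predict the composition of a new in-line spectrum and report the deviation.
new_true = 42.0
new_spectrum = new_true * np.linspace(0.5, 1.5, 200) + rng.normal(0, 0.5, 200)
predicted = float(pls.predict(new_spectrum.reshape(1, -1)).ravel()[0])
print(f"predicted {predicted:.2f}%, deviation {predicted - new_true:+.2f}%")
```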
Research into software executives for space operations support
NASA Technical Reports Server (NTRS)
Collier, Mark D.
1990-01-01
Research concepts pertaining to a software (workstation) executive which will support a distributed processing command and control system characterized by high-performance graphics workstations used as computing nodes are presented. Although a workstation-based distributed processing environment offers many advantages, it also introduces a number of new concerns. In order to solve these problems, allow the environment to function as an integrated system, and present a functional development environment to application programmers, it is necessary to develop an additional layer of software. This 'executive' software integrates the system, provides real-time capabilities, and provides the tools necessary to support the application requirements.
Application of ultra high pressure (UHP) in starch chemistry.
Kim, Hyun-Seok; Kim, Byung-Yong; Baik, Moo-Yeol
2012-01-01
Ultra high pressure (UHP) processing is an attractive non-thermal technique for food treatment and preservation at room temperature, with the potential to achieve interesting functional effects. The majority of UHP process applications in food systems have focused on shelf-life extension associated with non-thermal sterilization and a reduction or increase in enzymatic activity. Only a few studies have investigated modifications of structural characteristics and/or protein functionalities. Despite the rapid expansion of UHP applications in food systems, limited information is available on the effects of UHP on the structural and physicochemical properties of starch and/or its chemical derivatives included in most processed foods as major ingredients or minor additives. Starch and its chemical derivatives are responsible for textural and physical properties of food systems, impacting their end-use quality and/or shelf-life. This article reviews UHP processes for native (unmodified) starch granules and their effects on the physicochemical properties of UHP-treated starch. Furthermore, functional roles of UHP in acid-hydrolysis, hydroxypropylation, acetylation, and cross-linking reactions of starch granules, as well as the physicochemical properties of UHP-assisted starch chemical derivatives, are discussed.
A DNA network as an information processing system.
Santini, Cristina Costa; Bath, Jonathan; Turberfield, Andrew J; Tyrrell, Andy M
2012-01-01
Biomolecular systems that can process information are sought for computational applications, because of their potential for parallelism and miniaturization and because their biocompatibility also makes them suitable for future biomedical applications. DNA has been used to design machines, motors, finite automata, logic gates, reaction networks and logic programs, amongst many other structures and dynamic behaviours. Here we design and program a synthetic DNA network to implement computational paradigms abstracted from cellular regulatory networks. These show information processing properties that are desirable in artificial, engineered molecular systems, including robustness of the output in relation to different sources of variation. We show the results of numerical simulations of the dynamic behaviour of the network and preliminary experimental analysis of its main components.
NASA Astrophysics Data System (ADS)
Kapitan, Loginn
This research created a new model which provides an integrated approach to planning the effective selection and employment of airborne sensor systems in response to accidental or intentional chemical vapor releases. The approach taken was to use systems engineering and decision analysis methods to construct a model architecture which produced a modular structure for integrating both new and existing components into a logical procedure to assess the application of airborne sensor systems to address chemical vapor hazards. The resulting integrated process model includes an internal aggregation model which allowed differentiation among alternative airborne sensor systems. Both models were developed and validated by experts and demonstrated using appropriate hazardous chemical release scenarios. The resultant prototype integrated process model or system fills a current gap in capability allowing improved planning, training and exercise for HAZMAT teams and first responders when considering the selection and employment of airborne sensor systems. Through the research process, insights into the current response structure and how current airborne capability may be most effectively used were generated. Furthermore, the resultant prototype system is tailorable for local, state, and federal application, and can potentially be modified to help evaluate investments in new airborne sensor technology and systems. Better planning, training and preparedness exercising holds the prospect for the effective application of airborne assets for improved response to large scale chemical release incidents. Improved response will result in fewer casualties and lives lost, reduced economic impact, and increased protection of critical infrastructure when faced with accidental and intentional terrorist release of hazardous industrial chemicals. With the prospect of more airborne sensor systems becoming available, this prototype system integrates existing and new tools into an effective process for the selection and employment of airborne sensors to better plan, train and exercise ahead of potential chemical release events.
Innovative IT system for material management in warehouses
NASA Astrophysics Data System (ADS)
Papoutsidakis, Michael; Sigala, Maria; Simeonaki, Eleni; Tseles, Dimitrios
2017-09-01
Nowadays, through the rapid development of technology in all areas, there is a constant effort to introduce technological solutions into everyday life, with emphasis on materials management information systems (Enterprise Resource Planning). During the last few years the variety of these systems has increased for small businesses and SMEs as well as for larger companies and industries. In the field of material management and main management operations with automated processes, ERP applications have only recently begun to make their appearance. This paper presents the development of a system for an automated material storage process, built around specific roles, that manages materials using an integrated barcode scanner. In addition, we analyse and describe the operation and modules of other systems that have been created for the same purpose. The aim of this project is to create an innovative, flexible, low-cost, and user-friendly prototype application that allows quick and proper materials management for storage. The expected result is that the application can be used on Android smart devices and computers without an external barcode scanner, making the application accessible to the buyer at low cost.
Hybrid Method for Mobile learning Cooperative: Study of Timor Leste
NASA Astrophysics Data System (ADS)
da Costa Tavares, Ofelia Cizela; Suyoto; Pranowo
2018-02-01
In the modern world today, decision support systems are very useful for helping to solve problems, so this study discusses the learning process of savings and loan cooperatives in Timor Leste. The purpose of the observation is that the people of Timor Leste are still in the process of learning to use a DSS for sound savings and loan cooperative processes. Based on existing research on credit cooperatives in the Timor Leste community, a mobile application will be built to help the cooperative learning process in East Timorese society. The methods used for decision making are AHP (Analytical Hierarchy Process) and SAW (Simple Additive Weighting), applied to evaluate each criterion and its weight. The result of this research is a mobile learning cooperative decision support system using the SAW and AHP methods. Originality/Value: the AHP and SAW methods are combined in mobile application development to support the decision-making process of a savings and credit cooperative in Timor Leste.
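A minimal Python sketch of the SAW ranking step is given below; the criteria, weights, and scores are invented for illustration and are not the study's data, with the weights standing in for values that AHP pairwise comparisons would supply.

```python
# Hedged sketch of the Simple Additive Weighting (SAW) step: normalize each benefit
# criterion, multiply by its weight, and rank the alternatives.
import numpy as np

alternatives = ["Cooperative A", "Cooperative B", "Cooperative C"]
# Rows: alternatives; columns: benefit criteria (illustrative: rate, service, trust).
scores = np.array([[70., 80., 60.],
                   [90., 60., 75.],
                   [65., 85., 80.]])
weights = np.array([0.5, 0.3, 0.2])   # e.g. weights supplied by an AHP comparison

normalized = scores / scores.max(axis=0)   # benefit criteria: divide by column maximum
saw_scores = normalized @ weights          # weighted sum per alternative

for name, s in sorted(zip(alternatives, saw_scores), key=lambda x: -x[1]):
    print(f"{name}: {s:.3f}")
```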
NASA Technical Reports Server (NTRS)
1981-01-01
The Space Transportation System (STS) is discussed, including the launch processing system, the thermal protection subsystem, meteorological research, sound suppression water system, rotating service structure, improved hypergol or removal systems, fiber optics research, precision positioning, remote controlled solid rocket booster nozzle plugs, ground operations for Centaur orbital transfer vehicle, parachute drying, STS hazardous waste disposal and recycle, toxic waste technology and control concepts, fast analytical densitometry study, shuttle inventory management system, operational intercommunications system improvement, and protective garment ensemble. Terrestrial applications are also covered, including LANDSAT applications to water resources, satellite freeze forecast system, application of ground penetrating radar to soil survey, turtle tracking, evaluating computer drawn ground cover maps, sparkless load pulsar, and coupling a microcomputer and computing integrator with a gas chromatograph.
NASA Astrophysics Data System (ADS)
Zaharov, A. A.; Nissenbaum, O. V.; Ponomaryov, K. Y.; Nesgovorov, E. S.
2018-01-01
In this paper we study the application of the Internet of Things concept and devices to securing automated process control systems. We review different approaches to IoT (Internet of Things) architecture and design and propose them for several applications in the security of automated process control systems. We consider attribute-based encryption in the context of implementing an access control mechanism and propose a secret key distribution scheme between attribute authorities and end devices.
1979-12-01
The Marine Corps Tactical Command and Control System (MTACCS) is expected to provide increased decision making speed and power through automated ... processing and display of data which previously was processed manually. The Landing Force Organizational Systems Study (LFOSS) has challenged Marines to
NASA Technical Reports Server (NTRS)
Carr, J. H.; Hurley, P. J.; Martin, P. J.
1978-01-01
Applications of Thermal Energy Storage (TES) in a paper and pulp mill power house were studied as one approach to the transfer of steam production from fossil fuel boilers to waste fuel (hog fuel) boilers. Data from specific mills were analyzed, and various TES concepts were evaluated for application in the process steam supply system. Constant pressure and variable pressure steam accumulators were found to be the most attractive storage concepts for this application.
NASA Technical Reports Server (NTRS)
Wray, Richard B.; Stovall, John R.
1993-01-01
This paper presents an overview of the application of the Space Generic Open Avionics Architecture (SGOAA) to the Space Shuttle Data Processing System (DPS) architecture design. This application has been performed to validate the SGOAA and its potential use in flight critical systems. The paper summarizes key elements of the Space Shuttle avionics architecture, data processing system requirements and software architecture as currently implemented. It then summarizes the SGOAA architecture and describes a tailoring of the SGOAA to the Space Shuttle. The SGOAA consists of a generic system architecture for the entities in spacecraft avionics, a generic processing external and internal hardware architecture, a six-class model of interfaces, and functional subsystem architectures for data services and operations control capabilities. It has been proposed as an avionics architecture standard to the National Aeronautics and Space Administration (NASA), through its Strategic Avionics Technology Working Group, and is being considered by the Society of Automotive Engineers (SAE) as an SAE Avionics Standard. This architecture was developed for the Flight Data Systems Division of JSC by the Lockheed Engineering and Sciences Company, Houston, Texas.
Arvanitoyannis, Ioannis S; Varzakas, Theodoros H
2008-05-01
The Failure Mode and Effect Analysis (FMEA) model was applied for risk assessment of salmon manufacturing. A tentative approach of FMEA application to the salmon industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (salmon processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes, upon which the system depends. Critical Control Points were identified and implemented in the cause and effect diagram (also known as Ishikawa, tree diagram and fishbone diagram). In this work, a comparison of ISO 22000 analysis with HACCP is carried out over salmon processing and packaging. However, the main emphasis was put on the quantification of risk assessment by determining the RPN per identified processing hazard. Fish receiving, casing/marking, blood removal, evisceration, filet-making, cooling/freezing, and distribution were the processes identified as the ones with the highest RPN (252, 240, 210, 210, 210, 210, 200, respectively) and corrective actions were undertaken. After the application of corrective actions, a second calculation of RPN values was carried out resulting in substantially lower values (below the upper acceptable limit of 130). It is noteworthy that the application of Ishikawa (Cause and Effect or Tree diagram) led to converging results thus corroborating the validity of conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO 22000 system of a salmon processing industry is anticipated to prove advantageous to industrialists, state food inspectors, and consumers.
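The RPN bookkeeping behind these figures can be sketched in a few lines of Python using the standard FMEA formula RPN = severity x occurrence x detection; the individual severity, occurrence, and detection ratings below are assumptions for illustration, with only the RPN totals and the 130 threshold taken from the text.

```python
# Minimal sketch of FMEA bookkeeping: RPN = severity * occurrence * detection,
# flagged against the paper's acceptability threshold of 130.
THRESHOLD = 130

hazards = {
    # process step: (severity, occurrence, detection), each rated 1-10 (assumed ratings)
    "fish receiving": (7, 6, 6),
    "blood removal": (7, 6, 5),
    "distribution": (5, 5, 8),
}

for step, (sev, occ, det) in hazards.items():
    rpn = sev * occ * det
    action = "corrective action required" if rpn > THRESHOLD else "acceptable"
    print(f"{step}: RPN = {rpn} -> {action}")
```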
Arvanitoyannis, Ioannis S; Varzakas, Theodoros H
2009-08-01
Failure Mode and Effect Analysis (FMEA) has been applied for the risk assessment of snails manufacturing. A tentative approach of FMEA application to the snails industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (snails processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes, upon which the system depends. Critical Control points have been identified and implemented in the cause and effect diagram (also known as Ishikawa, tree diagram, and fishbone diagram). In this work a comparison of ISO22000 analysis with HACCP is carried out over snails processing and packaging. However, the main emphasis was put on the quantification of risk assessment by determining the RPN per identified processing hazard. Sterilization of tins, bioaccumulation of heavy metals, packaging of shells and poisonous mushrooms, were the processes identified as the ones with the highest RPN (280, 240, 147, 144, respectively) and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of Ishikawa (Cause and Effect or Tree diagram) led to converging results thus corroborating the validity of conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO22000 system of a snails processing industry is considered imperative.
Transplant Image Processing Technology under Windows into the Platform Based on MiniGUI
NASA Astrophysics Data System (ADS)
Gan, Lan; Zhang, Xu; Lv, Wenya; Yu, Jia
MFC provides a large number of digital image processing-related API functions, together with object-oriented and class mechanisms, which give image processing technology strong support in Windows. In embedded systems, however, image processing technology does not have an MFC-like environment due to hardware and software restrictions. Therefore, this paper draws on the experience of image processing technology under Windows and transplants it into MiniGUI embedded systems. The results show that MiniGUI/Embedded graphical user interface applications for image processing can achieve good results when used in an embedded image processing system.
A web-based computer aided system for liver surgery planning: initial implementation on RayPlus
NASA Astrophysics Data System (ADS)
Luo, Ming; Yuan, Rong; Sun, Zhi; Li, Tianhong; Xie, Qingguo
2016-03-01
At present, computer aided systems for liver surgery design and risk evaluation are widely used in clinical practice all over the world. However, most systems are local applications that run on high-performance workstations, and the images have to be processed offline. Compared with local applications, a web-based system is accessible anywhere and on a range of devices, regardless of their processing power or operating system. RayPlus (http://rayplus.life.hust.edu.cn), a B/S platform for medical image processing, was developed to give a jump start on web-based medical image processing. In this paper, we implement a computer aided system for liver surgery planning on the architecture of RayPlus. The system consists of a series of processing steps applied to CT images, including filtering, segmentation, visualization and analysis. Each processing step is packaged into an executable program and runs on the server side. CT images in DICOM format are processed step by step into an interactive model in the browser, with zero installation and server-side computing. The system supports users in semi-automatically segmenting the liver, intrahepatic vessels and tumors from the pre-processed images. Then, surface and volume models are built to analyze the vessel structure and the relative position between adjacent organs. The results show that the initial implementation meets its first-order objectives satisfactorily and provides an accurate 3D delineation of the liver anatomy. Vessel labeling and resection simulation are planned as future additions. The system is available on the Internet at the link mentioned above, and an open username for testing is offered.
Jeong, Seung Tak; Kim, Gil Won; Hwang, Hyun Young; Kim, Pil Joo; Kim, Sang Yoon
2018-02-01
Livestock manure application can stimulate greenhouse gas (GHG) emissions, especially methane (CH4), in rice paddy. The stabilized organic matter (OM) is recommended to suppress CH4 emission without counting the additional GHG emission during the composting process. To evaluate the effect of compost utilization on the net global warming potential (GWP) of a rice cropping system, the fluxes of GHGs from composting to land application were calculated by a life cycle assessment (LCA) method. The model framework was composed of GHG fluxes from industrial activities and biogenic GHG fluxes from the composting and rice cultivation processes. Fresh manure emitted 30 Mg CO2-eq. per hectare, 90% and 10% of which were contributed by CH4 and nitrous oxide (N2O) fluxes, respectively, during rice cultivation. Compost utilization decreased net GWP by 25% over that of the fresh manure during the whole process. The composting process increased the GWP of the industrial processes by 35%, but the 60% reduction in CH4 emissions from the rice paddy mainly influenced the reduction of GWP during the overall process. Therefore, compost application could be a good management strategy to reduce GHG emissions from rice paddy systems. Copyright © 2017 Elsevier B.V. All rights reserved.
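A hedged Python sketch of the CO2-equivalent aggregation used in such an LCA follows; the GWP factors (25 for CH4 and 298 for N2O, IPCC AR4 100-year values) and the flux figures are assumptions chosen only to be roughly consistent with the abstract, not the study's inputs.

```python
# Hedged sketch of converting CH4 and N2O fluxes to CO2-equivalents for a net GWP.
GWP_CH4, GWP_N2O = 25, 298   # assumed 100-year GWP factors (IPCC AR4)

def net_gwp_mg_per_ha(ch4_kg, n2o_kg, industrial_co2_kg):
    """Net GWP in Mg CO2-equivalent per hectare."""
    return (ch4_kg * GWP_CH4 + n2o_kg * GWP_N2O + industrial_co2_kg) / 1000.0

# Roughly consistent with the abstract's fresh-manure figure of ~30 Mg CO2-eq per ha
# during cultivation (about 90% CH4, 10% N2O), plus an assumed industrial share.
print(net_gwp_mg_per_ha(ch4_kg=1080.0, n2o_kg=10.0, industrial_co2_kg=1500.0))
```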
Hybrid photonic signal processing
NASA Astrophysics Data System (ADS)
Ghauri, Farzan Naseer
This thesis proposes research on novel hybrid photonic signal processing systems in the areas of optical communications, test and measurement, RF signal processing and extreme environment optical sensors. It will be shown that the use of innovative hybrid techniques allows the design of photonic signal processing systems with superior performance parameters and enhanced capabilities. These applications can be divided into the domains of analog-digital hybrid signal processing applications and free-space/fiber-coupled hybrid optical sensors. The analog-digital hybrid signal processing applications include a high-performance analog-digital hybrid MEMS variable optical attenuator that can simultaneously provide high dynamic range as well as high resolution attenuation controls; an analog-digital hybrid MEMS beam profiler that allows high-power watt-level laser beam profiling and also provides both submicron-level high resolution and wide area profiling coverage; and all-optical transversal RF filters that operate on the principle of broadband optical spectral control using MEMS and/or Acousto-Optic Tunable Filter (AOTF) devices, which can provide continuous, digital or hybrid signal time delay and weight selection. The hybrid optical sensors presented in the thesis are extreme environment pressure sensors and dual temperature-pressure sensors. The sensors employ hybrid free-space and fiber-coupled techniques for remotely monitoring a system under simultaneous extremely high temperatures and pressures.
Food drying process by power ultrasound.
de la Fuente-Blanco, S; Riera-Franco de Sarabia, E; Acosta-Aparicio, V M; Blanco-Blanco, A; Gallego-Juárez, J A
2006-12-22
Drying processes, which have a great significance in the food industry, are frequently based on the use of thermal energy. Nevertheless, such methods may produce structural changes in the products. Consequently, a great emphasis is presently given to novel treatments where the quality will be preserved. Such is the case of the application of high-power ultrasound which represents an emergent and promising technology. During the last few years, we have been involved in the development of an ultrasonic dehydration process, based on the application of the ultrasonic vibration in direct contact with the product. Such a process has been the object of a detailed study at laboratory stage on the influence of the different parameters involved. This paper deals with the development and testing of a prototype system for the application and evaluation of the process at a pre-industrial stage. Such prototype is based on a high-power rectangular plate transducer, working at a frequency of 20 kHz, with a power capacity of about 100 W. In order to study mechanical and thermal effects, the system is provided with a series of sensors which permit monitoring the parameters of the process. Specific software has also been developed to facilitate data collection and analysis. The system has been tested with vegetable samples.
System of Systems Engineering and Integration Process for Network Transport Assessment
2016-09-01
SOSE&I CONCEPTS The DOD-sourced “Systems Engineering Guide for Systems of Systems” provides an overview of the SoS environment and SE considerations...usage as a guide in application of systems engineering processes. They are listed verbatim below as defined in the DOD SE guide (ODUSD[A&T]SSE 2008...Technology (A&T), Systems and Software Engineering (SSE). 2008. Systems Engineering Guide for Systems of Systems. Washington, DC: ODUSD(A&T)SSE
A knowledge based expert system for propellant system monitoring at the Kennedy Space Center
NASA Technical Reports Server (NTRS)
Jamieson, J. R.; Delaune, C.; Scarl, E.
1985-01-01
The Lox Expert System (LES) is the first attempt to build a real-time expert system capable of simulating the thought processes of NASA system engineers with regard to fluids systems analysis and troubleshooting. An overview of the hardware and software describes the techniques used and possible applications to other process control systems. LES is now in the advanced development stage, with a full implementation planned for late 1985.
Pattern recognition for Space Applications Center director's discretionary fund
NASA Technical Reports Server (NTRS)
Singley, M. E.
1984-01-01
Results and conclusions are presented on the application of recent developments in pattern recognition to spacecraft star mapping systems. Sensor data for two representative starfields are processed by an adaptive shape-seeking version of the Fc-V algorithm with good results. Cluster validity measures are evaluated, but not found especially useful to this application. Recommendations are given for two system configurations worthy of additional study.
Application programs written by using customizing tools of a computer-aided design system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, X.; Huang, R.; Juricic, D.
1995-12-31
Customizing tools of Computer-Aided Design systems have been developed to such a degree as to become equivalent to powerful higher-level programming languages that are especially suitable for graphics applications. Two examples of application programs written using AutoCAD's customizing tools are given in some detail to illustrate their power. One uses the AutoLISP list-processing language to develop an application program that produces four views of a given solid model. The other uses the AutoCAD Development System, based on program modules written in C, to produce an application program that renders a freehand sketch from a given CAD drawing.
USAF solar thermal applications overview
NASA Technical Reports Server (NTRS)
Hauger, J. S.; Simpson, J. A.
1981-01-01
Process heat applications were compared to solar thermal technologies. The generic process heat applications were analyzed for solar thermal technology utilization, using SERI's PROSYS/ECONOMAT model in an end-use matching analysis, and a separate analysis was made for solar ponds. Solar technologies appear attractive in a large number of applications. Low temperature applications at sites with high insolation and high fuel costs were found to be most attractive. No one solar thermal technology emerges as a clearly universal or preferred technology; however, solar ponds offer a potential high payoff in a few selected applications. It was shown that troughs and flat plate systems are cost effective in a large number of applications.
Integrating UIMA annotators in a web-based text processing framework.
Chen, Xiang; Arnold, Corey W
2013-01-01
The Unstructured Information Management Architecture (UIMA) [1] framework is a growing platform for natural language processing (NLP) applications. However, such applications may be difficult for non-technical users to deploy. This project presents a web-based framework that wraps UIMA-based annotator systems into a graphical user interface for researchers and clinicians, and a web service for developers. An annotator that extracts data elements from lung cancer radiology reports is presented to illustrate the use of the system. Annotation results from the web system can be exported to multiple formats for users to utilize in other aspects of their research and workflow. This project demonstrates the benefits of a lay-user interface for complex NLP applications. Efforts such as this can lead to increased interest and support for NLP work in the clinical domain.
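A minimal Flask sketch of the wrapping idea is shown below; `run_annotator` is a hypothetical placeholder for invoking the deployed UIMA pipeline, and the route name and payload format are assumptions, not the project's actual API.

```python
# A minimal sketch of exposing an annotator behind a web service so non-technical
# users never touch the underlying NLP pipeline directly.
from flask import Flask, request, jsonify

app = Flask(__name__)

def run_annotator(report_text):
    """Hypothetical placeholder: hand the text to the annotator pipeline and
    return the extracted data elements."""
    return [{"element": "nodule size", "value": "1.2 cm"}] if "nodule" in report_text else []

@app.route("/annotate", methods=["POST"])
def annotate():
    text = request.get_json(force=True).get("text", "")
    return jsonify({"annotations": run_annotator(text)})

if __name__ == "__main__":
    app.run(port=8080)
```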
7 CFR 1822.271 - Processing applications.
Code of Federal Regulations, 2010 CFR
2010-01-01
... specific provisions of State law under which the applicant is organized; a copy of the applicant's articles... services such as doctors, dentists, and hospitals. (9) If facilities such as water and sewage systems... or recommendation. County committees will not be used to review RHS loan applications. (e) Assembly...
Supporting large scale applications on networks of workstations
NASA Technical Reports Server (NTRS)
Cooper, Robert; Birman, Kenneth P.
1989-01-01
Distributed applications on networks of workstations are an increasingly common way to satisfy computing needs. However, existing mechanisms for distributed programming exhibit poor performance and reliability as application size increases. Extension of the ISIS distributed programming system to support large scale distributed applications by providing hierarchical process groups is discussed. Incorporation of hierarchy in the program structure and exploitation of this to limit the communication and storage required in any one component of the distributed system is examined.
Autonomous control systems: applications to remote sensing and image processing
NASA Astrophysics Data System (ADS)
Jamshidi, Mohammad
2001-11-01
One of the main challenges of any control (or image processing) paradigm is being able to handle complex systems under unforeseen uncertainties. A system may be called complex here if its dimension (order) is too high and its model (if available) is nonlinear and interconnected, and information on the system is so uncertain that classical techniques cannot easily handle the problem. Examples of complex systems are power networks, space robotic colonies, the national air traffic control system, an integrated manufacturing plant, the Hubble Telescope, the International Space Station, etc. Soft computing, a consortium of methodologies such as fuzzy logic, neuro-computing, genetic algorithms and genetic programming, has proven to be a powerful tool for adding autonomy and semi-autonomy to many complex systems. For such systems the size of the soft computing control architecture will be nearly infinite. In this paper new paradigms using soft computing approaches are utilized to design autonomous controllers and image enhancers for a number of application areas. These applications are satellite array formations for synthetic aperture radar interferometry (InSAR) and the enhancement of analog and digital images.
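As a small illustration of the soft-computing building blocks mentioned here, the Python sketch below implements a three-rule fuzzy correction with triangular membership functions and weighted-average defuzzification; the breakpoints, rules, and output values are assumptions, not the paper's controllers.

```python
# Illustrative fuzzy-logic sketch: triangular memberships, three rules,
# weighted-average defuzzification. All constants are assumed for illustration.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    return max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def fuzzy_correction(error):
    """Map a normalized tracking error in [-1, 1] to a control correction."""
    # Rule firing strengths: negative / zero / positive error.
    mu = {"neg": tri(error, -1.5, -1.0, 0.0),
          "zero": tri(error, -1.0, 0.0, 1.0),
          "pos": tri(error, 0.0, 1.0, 1.5)}
    # Crisp singleton output for each rule consequent.
    out = {"neg": -0.8, "zero": 0.0, "pos": 0.8}
    total = sum(mu.values())
    return sum(mu[k] * out[k] for k in mu) / total if total > 0 else 0.0

for e in (-0.7, 0.0, 0.4):
    print(e, round(fuzzy_correction(e), 3))
```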
NASA Astrophysics Data System (ADS)
Ćwikła, G.; Gwiazda, A.; Banaś, W.; Monica, Z.; Foit, K.
2017-08-01
The article presents a study of the possible application of selected methods of complex description that can support the Manufacturing Information Acquisition System (MIAS) methodology, which describes how to design a data acquisition system for collecting and processing real-time data on the functioning of a production system, as needed for company management. MIAS can enable conversion into a Cyber-Physical Production System. MIAS gathers and pre-processes data on the state of the production system, including e.g. the realisation of production orders and the state of machines, materials and human resources. A systematised approach and model-based development are proposed to improve the quality of the design of MIAS methodology-based complex systems supporting data acquisition in various types of companies. Graphical specification can be the baseline for any model-based development in the specified areas. The possibility of applying SysML and BPMN, both UML-based languages representing different approaches to modelling the requirements, architecture and implementation of the data acquisition system, as tools supporting the description of the required MIAS features, is considered.
NASA Technical Reports Server (NTRS)
Liberman, Eugene M.; Manner, David B.; Dolce, James L.; Mellor, Pamela A.
1993-01-01
Expert systems are widely used in health monitoring and fault detection applications. One of the key features of an expert system is that it possesses a large body of knowledge about the application for which it was designed. When the user consults this knowledge base, it is essential that the expert system's reasoning process and its conclusions be as concise as possible. If, in addition, an expert system is part of a process monitoring system, the expert system's conclusions must be combined with current events of the process. Under these circumstances, it is difficult for a user to absorb and respond to all the available information. For example, a user can become distracted and confused if two or more unrelated devices in different parts of the system require attention. A human interface designed to integrate expert system diagnoses with process data and to focus the user's attention to the important matters provides a solution to the 'information overload' problem. This paper will discuss a user interface to the power distribution expert system for Space Station Freedom. The importance of features which simplify assessing system status and which minimize navigating through layers of information will be discussed. Design rationale and implementation choices will also be presented.
47 CFR 25.157 - Consideration of NGSO-like satellite applications.
Code of Federal Regulations, 2012 CFR
2012-10-01
... satellite systems, in which the satellites are designed to communicate with earth stations with omni... response to a public notice initiating a processing round, or a “lead application,” i.e., all other NGSO... public notice. This public notice will initiate a processing round, establish a cut-off date for...
47 CFR 25.157 - Consideration of NGSO-like satellite applications.
Code of Federal Regulations, 2013 CFR
2013-10-01
... satellite systems, in which the satellites are designed to communicate with earth stations with omni... response to a public notice initiating a processing round, or a “lead application,” i.e., all other NGSO... public notice. This public notice will initiate a processing round, establish a cut-off date for...
47 CFR 25.157 - Consideration of NGSO-like satellite applications.
Code of Federal Regulations, 2011 CFR
2011-10-01
... satellite systems, in which the satellites are designed to communicate with earth stations with omni... response to a public notice initiating a processing round, or a “lead application,” i.e., all other NGSO... public notice. This public notice will initiate a processing round, establish a cut-off date for...
47 CFR 25.157 - Consideration of NGSO-like satellite applications.
Code of Federal Regulations, 2014 CFR
2014-10-01
... satellite systems, in which the satellites are designed to communicate with earth stations with omni... response to a public notice initiating a processing round, or a “lead application,” i.e., all other NGSO... public notice. This public notice will initiate a processing round, establish a cut-off date for...
Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic
NASA Technical Reports Server (NTRS)
Leucht, Kurt W.; Semmel, Glenn S.
2008-01-01
The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.
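The translation idea can be illustrated with a toy Python sketch that turns a simple boolean requirement into a textual ladder-style rung; the input format and rung rendering are assumptions for illustration and do not reflect KSC's actual tool or specification language.

```python
# Toy sketch of spec-to-ladder translation: render 'IF A AND B THEN C' as a rung
# with series normally-open contacts (logical AND) driving an output coil.
def spec_to_rung(spec):
    body = spec.strip()
    if body.upper().startswith("IF "):
        body = body[3:]
    condition, output = body.split(" THEN ")
    inputs = [name.strip() for name in condition.split(" AND ")]
    contacts = "----".join(f"[ {name} ]" for name in inputs)   # series contacts = AND
    return f"|--{contacts}----( {output.strip()} )--|"

print(spec_to_rung("IF ValveOpenCmd AND PressureOK THEN OpenValve"))
```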
Precision laser processing for micro electronics and fiber optic manufacturing
NASA Astrophysics Data System (ADS)
Webb, Andrew; Osborne, Mike; Foster-Turner, Gideon; Dinkel, Duane W.
2008-02-01
The application of laser based materials processing for precision micro scale manufacturing in the electronics and fiber optic industry is becoming increasingly widespread and accepted. This presentation will review the latest laser technologies available and discuss the issues to be considered in choosing the most appropriate laser and processing parameters. High repetition rate, short duration pulsed lasers have improved rapidly in recent years in terms of both performance and reliability, enabling flexible, cost effective processing of many material types including metal, silicon, plastic, ceramic and glass. Demonstrating the relevance of laser micromachining, application examples where laser processing is in use for production will be presented, including miniaturization of surface mount capacitors by applying a laser technique for demetalization of tracks in the capacitor manufacturing process, and high quality laser machining of fiber optics including stripping, cleaving and lensing, resulting in optical quality finishes without the need for traditional polishing. Applications include telecoms, biomedical and sensing. OpTek Systems was formed in 2000 and provides fully integrated systems and sub-contract services for laser processes. The company is headquartered in the UK and is establishing a presence in North America through a laser processing facility in South Carolina and a sales office in the North East.
Using task analysis to improve the requirements elicitation in health information system.
Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa
2007-01-01
This paper describes the application of task analysis within the design process of a Web-based information system for managing clinical information in hemophilia care, in order to improve the requirements elicitation and, consequently, to validate the domain model obtained in a previous phase of the design process (system analysis). The use of task analysis in this case proved to be a practical and efficient way to improve the requirements engineering process by involving users in the design process.
Space Debris Detection on the HPDP, a Coarse-Grained Reconfigurable Array Architecture for Space
NASA Astrophysics Data System (ADS)
Suarez, Diego Andres; Bretz, Daniel; Helfers, Tim; Weidendorfer, Josef; Utzmann, Jens
2016-08-01
Stream processing, widely used in communications and digital signal processing applications, requires high-throughput data processing that is achieved in most cases using Application-Specific Integrated Circuit (ASIC) designs. Lack of programmability is an issue especially in space applications, which use on-board components with long life-cycles requiring application updates. To this end, the High Performance Data Processor (HPDP) architecture integrates an array of coarse-grained reconfigurable elements to provide both flexible and efficient computational power suitable for stream-based data processing applications in space. In this work the capabilities of the HPDP architecture are demonstrated with the implementation of a real-time image processing algorithm for space debris detection in a space-based space surveillance system. The implementation challenges and alternatives are described, making trade-offs to improve performance at the expense of negligible degradation of detection accuracy. The proposed implementation uses over 99% of the available computational resources. Performance estimations based on simulations show that the HPDP can amply match the application requirements.
Robotic Attention Processing And Its Application To Visual Guidance
NASA Astrophysics Data System (ADS)
Barth, Matthew; Inoue, Hirochika
1988-03-01
This paper describes a method of real-time visual attention processing for robots performing visual guidance. This robot attention processing is based on a novel vision processor, the multi-window vision system, that was developed at the University of Tokyo. The multi-window vision system is unique in that it only processes visual information inside local area windows. These local area windows are quite flexible in their ability to move anywhere on the visual screen, change their size and shape, and alter their pixel sampling rate. By using these windows for specific attention tasks, it is possible to perform high speed attention processing. The primary attention skills of detecting motion, tracking an object, and interpreting an image are all performed at high speed on the multi-window vision system. A basic robotic attention scheme using these attention skills was developed. The attention skills involved detection and tracking of salient visual features. The tracking and motion information thus obtained was utilized in producing the response to the visual stimulus. The response of the attention scheme was quick enough to be applicable to the real-time vision processing tasks of playing a video 'pong' game, and later using an automobile driving simulator. By detecting the motion of a 'ball' on a video screen and then tracking the movement, the attention scheme was able to control a 'paddle' in order to keep the ball in play. The response was faster than that of a human, allowing the attention scheme to play the video game at higher speeds. Further, in the application to the driving simulator, the attention scheme was able to control both the direction and velocity of a simulated vehicle following a lead car. These two applications show the potential of local visual processing in its use for robotic attention processing.
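A rough Python sketch of the local-window principle follows: process only a small attention window, detect frame-to-frame change inside it, and re-center the window on the moving target. The window size, threshold, and synthetic frames are assumptions, not the multi-window hardware's parameters.

```python
# Rough sketch of windowed attention: look only inside a local window, find where
# the frame-to-frame difference is large, and move the window toward that region.
import numpy as np

def track_in_window(prev_frame, frame, center, half=8, threshold=0.2):
    """Return an updated window center by following the intensity-difference centroid."""
    r, c = center
    prev_win = prev_frame[r - half:r + half, c - half:c + half]
    win = frame[r - half:r + half, c - half:c + half]
    diff = np.abs(win - prev_win)
    mask = diff > threshold
    if not mask.any():
        return center                      # no motion detected inside the window
    rows, cols = np.nonzero(mask)
    return (r - half + int(rows.mean()), c - half + int(cols.mean()))

# A bright 'ball' moves two pixels to the right between frames.
f0 = np.zeros((64, 64)); f0[30:34, 30:34] = 1.0
f1 = np.roll(f0, 2, axis=1)
print(track_in_window(f0, f1, center=(32, 32)))
```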
Varzakas, Theodoros H
2011-09-01
The Failure Mode and Effect Analysis (FMEA) model has been applied for the risk assessment of pastry processing. A tentative approach of FMEA application to the pastry industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (pastry processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes, upon which the system depends. Critical Control Points have been identified and implemented in the cause and effect diagram (also known as Ishikawa, tree diagram, and fishbone diagram). In this work a comparison of ISO 22000 analysis with HACCP is carried out over pastry processing and packaging. However, the main emphasis was put on the quantification of risk assessment by determining the Risk Priority Number (RPN) per identified processing hazard. Storage of raw materials, storage of final products at -18°C, and freezing were the processes identified as the ones with the highest RPN (225, 225, and 144, respectively) and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of Ishikawa (Cause and Effect or Tree diagram) led to converging results thus corroborating the validity of conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO 22000 system of a pastry processing industry is considered imperative.
Naval Systems Engineering Guide
2004-10-01
Decision Critical Design Review System Integration Activities IOC FRP Decision Review Production & Deployment Sustainment IOT & FOC Sustainmen...reentered when things change significantly, such as funding, requirements, or schedule. This process must start at the very beginning of a Major...outputs through sub-processes will reveal a number of things: a. Determine the level of process applicability and tailoring required. b. Additional
High Performance Input/Output for Parallel Computer Systems
NASA Technical Reports Server (NTRS)
Ligon, W. B.
1996-01-01
The goal of our project is to study the I/O characteristics of parallel applications used in Earth Science data processing systems such as Regional Data Centers (RDCs) or EOSDIS. Our approach is to study the runtime behavior of typical programs and the effect of key parameters of the I/O subsystem both under simulation and with direct experimentation on parallel systems. Our three-year activity has focused on two items: developing a test bed that facilitates experimentation with parallel I/O, and studying representative programs from the Earth science data processing application domain. The Parallel Virtual File System (PVFS) has been developed for use on a number of platforms including the Tiger Parallel Architecture Workbench (TPAW) simulator, the Intel Paragon, a cluster of DEC Alpha workstations, and the Beowulf system (at CESDIS). PVFS provides considerable flexibility in configuring I/O in a UNIX-like environment. Access to key performance parameters facilitates experimentation. We have studied several key applications from levels 1, 2 and 3 of the typical RDC processing scenario including instrument calibration and navigation, image classification, and numerical modeling codes. We have also considered large-scale scientific database codes used to organize image data.
A Framework for Performing V&V within Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1996-01-01
Verification and validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. Early discovery is important in order to minimize the cost and other impacts of correcting these errors. In order to provide early detection of errors, V&V is conducted in parallel with system development, often beginning with the concept phase. In reuse-based software engineering, however, decisions on the requirements, design and even implementation of domain assets can be made prior to beginning development of a specific system. In this case, V&V must be performed during domain engineering in order to have an impact on system development. This paper describes a framework for performing V&V within architecture-centric, reuse-based software engineering. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.
1979-11-01
a generalized cooccurrence matrix. Describing image texture is an important problem in the design of image understanding systems. Applications as...display system design optimization and video signal processing. Based on a study by Southern Research Institute, a number of options were identified...Specification for Target Acquisition Designation System (U), RFP # AMC-DP-AAH-H4020, 12 Apr 77. 4. Terminal Homing Applications of Solid State Image
Client-Side Data Processing and Training for Multispectral Imagery Applications in the GOES-R Era
NASA Technical Reports Server (NTRS)
Fuell, Kevin; Gravelle, Chad; Burks, Jason; Berndt, Emily; Schultz, Lori; Molthan, Andrew; Leroy, Anita
2016-01-01
RGB imagery can be created locally (i.e. client-side) from single-band imagery already on the system with little impact, given a recommended change to the texture cache in AWIPS II. Training and reference material accessible to forecasters within their operational display system improves RGB interpretation and application, as demonstrated at the OPG. Application examples from experienced forecasters are needed to support broader community use of RGB imagery, and these can be integrated into the user's display system.
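The client-side compositing step can be sketched in a few lines of Python: scale three single-band arrays and stack them into one false-color image. The band choices, scaling ranges, and gamma values below are illustrative assumptions, not an official GOES-R RGB recipe.

```python
# Hedged sketch of client-side RGB compositing from three single-band images.
import numpy as np

def scale(band, vmin, vmax, gamma=1.0):
    """Linearly rescale a band to [0, 1] over [vmin, vmax], then apply a gamma stretch."""
    clipped = np.clip((band - vmin) / (vmax - vmin), 0.0, 1.0)
    return clipped ** (1.0 / gamma)

def make_rgb(red_band, green_band, blue_band, ranges):
    """Stack three scaled bands into an (H, W, 3) composite."""
    return np.dstack([scale(b, *r) for b, r in zip((red_band, green_band, blue_band), ranges)])

# Synthetic brightness-temperature bands standing in for satellite imagery.
h = w = 128
b1 = np.random.uniform(200, 300, (h, w))
b2 = np.random.uniform(200, 300, (h, w))
b3 = np.random.uniform(200, 300, (h, w))
rgb = make_rgb(b1, b2, b3, ranges=[(200, 300, 1.0), (200, 300, 2.0), (200, 300, 1.0)])
print(rgb.shape, rgb.min(), rgb.max())
```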
Optical fiber sensors for life support applications
NASA Technical Reports Server (NTRS)
Lieberman, R. A.; Schmidlin, E. M.; Ferrell, D. J.; Syracuse, S. J.
1992-01-01
Preliminary experimental results on systems designed to demonstrate sensor operation in regenerative food production and crew air supply applications are presented. The systems use conventional fibers and sources in conjunction with custom wavelength division multiplexers in their optical signal processing sections, and nonstandard porous optical fibers in the optical sensing elements. It is considered possible to create practical sensors for life-support system applications, particularly in regenerative food production environments, based on reversible sensors for oxygen, carbon monoxide, and humidity.
Automated design of spacecraft systems power subsystems
NASA Technical Reports Server (NTRS)
Terrile, Richard J.; Kordon, Mark; Mandutianu, Dan; Salcedo, Jose; Wood, Eric; Hashemi, Mona
2006-01-01
This paper discusses the application of evolutionary computing to a dynamic space vehicle power subsystem resource and performance simulation in a parallel processing environment. Our objective is to demonstrate the feasibility, application and advantage of using evolutionary computation techniques for the early design search and optimization of space systems.
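As a hedged illustration of the evolutionary design search described here, the Python sketch below runs a simple genetic-style loop over a toy power-subsystem sizing model; the objective function, constraints, and constants are invented for illustration, not the paper's simulation.

```python
# Illustrative evolutionary search over two sizing parameters (solar-array area,
# battery capacity) against a toy mass-plus-penalty objective. All numbers are assumed.
import random

def fitness(design):
    array_m2, battery_kwh = design
    mass = 4.0 * array_m2 + 12.0 * battery_kwh      # toy mass model
    penalty = 0.0
    if array_m2 * 0.3 < 2.0:    # assumed ~0.3 kW generated per m^2, 2 kW load required
        penalty += 1e3
    if battery_kwh < 1.5:       # assumed eclipse energy requirement
        penalty += 1e3
    return mass + penalty

def evolve(pop_size=30, generations=60):
    pop = [(random.uniform(1, 20), random.uniform(0.5, 10)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]              # keep the better half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            # Blend crossover plus small Gaussian mutation.
            children.append(tuple((x + y) / 2 + random.gauss(0, 0.3) for x, y in zip(a, b)))
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print("best design (array m^2, battery kWh):", [round(x, 2) for x in best])
```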
Processes for metal extraction
NASA Technical Reports Server (NTRS)
Bowersox, David F.
1992-01-01
This report describes the processing of plutonium at Los Alamos National Laboratory (LANL), an operation illustrating concepts that may be applicable to the processing of lunar materials. The toxic nature of plutonium requires a highly closed system for processing lunar surface materials.
ERIC Educational Resources Information Center
Chowdhury, Gobinda G.
2003-01-01
Discusses issues related to natural language processing, including theoretical developments; natural language understanding; tools and techniques; natural language text processing systems; abstracting; information extraction; information retrieval; interfaces; software; Internet, Web, and digital library applications; machine translation for…
NASA Technical Reports Server (NTRS)
Lucero, John
2016-01-01
The presentation will provide an overview of the fundamentals and principles of Systems Engineering (SE). This includes understanding the processes that are used to assist the engineer in a successful design, build and implementation of solutions. The context of this presentation will be to describe the involvement of SE throughout the life-cycle of a project, from cradle to grave. Due to the ever-growing number of complex technical problems facing our world, a Systems Engineering approach is desirable for many reasons. The interdisciplinary technical structure of current systems and the technical processes representing System Design, Technical Management and Product Realization are instrumental in the development and integration of new technologies into mainstream applications. This tutorial will demonstrate the application of SE tools to these types of problems.
Local, regional and national interoperability in hospital-level systems architecture.
Mykkänen, J; Korpela, M; Ripatti, S; Rannanheimo, J; Sorri, J
2007-01-01
Interoperability of applications in health care must meet various needs of patients, health professionals, organizations and policy makers. A combination of existing and new applications is a necessity. Hospitals are in a position to drive many integration solutions, but need approaches which combine local, regional and national requirements and initiatives with open standards to support flexible processes and applications at the local hospital level. We discuss the systems architecture of hospitals in relation to various processes and applications, and highlight current challenges and prospects using a service-oriented architecture approach. We also illustrate these aspects with examples from Finnish hospitals. A set of main services and elements of service-oriented architectures for health care facilities is identified, with a medium-term focus which acknowledges existing systems as a core part of service-oriented solutions. The services and elements are grouped according to functional and interoperability cohesion. A transition towards service-oriented architecture in health care must acknowledge existing health information systems and promote the specification of central processes and software services locally and across organizations. Software industry best practices such as SOA must be combined with health care knowledge to respond to central challenges such as continuous change in health care. A service-oriented approach cannot entirely rely on common standards and frameworks but must be locally adapted and complemented.
PILOT: An intelligent distributed operations support system
NASA Technical Reports Server (NTRS)
Rasmussen, Arthur N.
1993-01-01
The Real-Time Data System (RTDS) project is exploring the application of advanced technologies to the real-time flight operations environment of the Mission Control Centers at NASA's Johnson Space Center. The system, based on a network of engineering workstations, provides services such as delivery of real time telemetry data to flight control applications. To automate the operation of this complex distributed environment, a facility called PILOT (Process Integrity Level and Operation Tracker) is being developed. PILOT comprises a set of distributed agents cooperating with a rule-based expert system; together they monitor process operation and data flows throughout the RTDS network. The goal of PILOT is to provide unattended management and automated operation under user control.
NASA Technical Reports Server (NTRS)
Cole, Richard
1991-01-01
The major goals of this effort are as follows: (1) to examine technology insertion options to optimize Advanced Information Processing System (AIPS) performance in the Advanced Launch System (ALS) environment; (2) to examine the AIPS concepts to ensure that valuable new technologies are not excluded from the AIPS/ALS implementations; (3) to examine advanced microprocessors applicable to AIPS/ALS; (4) to examine radiation hardening technologies applicable to AIPS/ALS; (5) to reach conclusions on AIPS hardware building block implementation technologies; and (6) to reach conclusions on appropriate architectural improvements. The hardware building blocks are the Fault-Tolerant Processor, the Input/Output Sequencers (IOS), and the Intercomputer Interface Sequencers (ICIS).
Modular, bluetooth enabled, wireless electroencephalograph (EEG) platform.
Lovelace, Joseph A; Witt, Tyler S; Beyette, Fred R
2013-01-01
A design for a modular, compact, and accurate wireless electroencephalograph (EEG) system is proposed. EEG is the only non-invasive measure of the neuronal function of the brain. Using a number of digital signal processing (DSP) techniques, this neuronal function can be acquired and processed into meaningful representations of brain activity. The system described here uses Bluetooth to wirelessly transmit the digitized brain signal for use by an end application. In this way, the system is portable and modular in terms of the device to which it can interface. Brain Computer Interface (BCI) has become a popular extension of EEG systems in modern research, and this design serves as a platform for applications using BCI capability.
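As a concrete illustration of the kind of DSP step mentioned in this abstract, the short Python sketch below band-pass filters a synthetic EEG channel and estimates alpha-band power; the sampling rate, filter order and signal are illustrative assumptions, not details of the platform itself.

    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 250.0                                      # assumed sampling rate in Hz
    t = np.arange(0, 4.0, 1.0 / fs)
    # synthetic channel: a 10 Hz "alpha" component plus noise (placeholder data)
    eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(t.size)

    # 4th-order Butterworth band-pass over the alpha band (8-12 Hz)
    b, a = butter(4, [8.0, 12.0], btype="bandpass", fs=fs)
    alpha = filtfilt(b, a, eeg)
    alpha_power = np.mean(alpha ** 2)
    print(f"alpha-band power: {alpha_power:.3e} V^2")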
The Specific Features of design and process engineering in branch of industrial enterprise
NASA Astrophysics Data System (ADS)
Sosedko, V. V.; Yanishevskaya, A. G.
2017-06-01
Production output at an industrial enterprise is organized through well-established working mechanisms at each stage of the product life cycle, from the initial design documentation through manufacture to disposal. The topic of the article is a mathematical model of the combined design and process engineering system in a branch of an industrial enterprise, the statistical processing of the results of implementing the developed model in that branch, and a demonstration of the advantages of applying it at the enterprise. During construction of the model, the flows of information, orders, parts and assemblies among the branch's groups of divisions were classified. Based on an analysis of the divisions' activities and of the flows of data, parts and documents, a state graph of the design and process engineering system was constructed, its transitions were described, and coefficients were assigned. For each state of the constructed graph, the corresponding limiting state probabilities were defined and the Kolmogorov equations were written out. Integrating the Kolmogorov system of equations yields the probability that the specified divisions and production are active as a function of time at each instant. On the basis of the developed mathematical model of a unified system of design, process engineering and manufacture, and of the state graph, the authors carried out statistical processing of the results of applying the model and demonstrated the advantages of its application at the enterprise. Studies were conducted on the loading probability of the branch's services and of third-party contractors (orders received from the branch within a month). The developed mathematical model can be applied to determine the probability that divisions and production are active as a function of time at each instant, which allows the workload in the branches of the enterprise to be tracked.
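To make the modelling approach concrete, the sketch below integrates the Kolmogorov forward equations for a small, hypothetical three-state graph and compares the result with the limiting state probabilities; the generator matrix Q and the state labels are illustrative assumptions, not values from the article.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Hypothetical generator matrix Q for a three-state graph
    # (e.g. design, process engineering, manufacture); rates are illustrative only.
    Q = np.array([[-0.6,  0.4,  0.2],
                  [ 0.3, -0.5,  0.2],
                  [ 0.1,  0.4, -0.5]])

    def kolmogorov_forward(t, p):
        # Kolmogorov forward equations: dp/dt = p @ Q
        return p @ Q

    p0 = np.array([1.0, 0.0, 0.0])          # system starts in the first state
    sol = solve_ivp(kolmogorov_forward, (0.0, 50.0), p0,
                    t_eval=np.linspace(0.0, 50.0, 11))
    print(sol.y[:, -1])                      # approaches the limiting state probabilities

    # Limiting probabilities solve pi @ Q = 0 with sum(pi) = 1
    A = np.vstack([Q.T, np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(pi)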
Friction Stir Welding of Large Scale Cryogenic Tanks for Aerospace Applications
NASA Technical Reports Server (NTRS)
Russell, Carolyn; Ding, R. Jeffrey
1998-01-01
The Marshall Space Flight Center (MSFC) has established a facility for the joining of large-scale aluminum cryogenic propellant tanks using the friction stir welding process. Longitudinal welds, approximately five meters in length, have been made by retrofitting an existing vertical fusion weld system, designed to fabricate tank barrel sections ranging from two to ten meters in diameter. The structural design requirements of the tooling, clamping and travel system will be described in this presentation along with process controls and real-time data acquisition developed for this application. The approach to retrofitting other large welding tools at MSFC with the friction stir welding process will also be discussed.
Synthetic Aperture Radar (SAR) data processing
NASA Technical Reports Server (NTRS)
Beckner, F. L.; Ahr, H. A.; Ausherman, D. A.; Cutrona, L. J.; Francisco, S.; Harrison, R. E.; Heuser, J. S.; Jordan, R. L.; Justus, J.; Manning, B.
1978-01-01
The available and optimal methods for generating SAR imagery for NASA applications were identified. The SAR image quality and data processing requirements associated with these applications were studied. Mathematical operations and algorithms required to process sensor data into SAR imagery were defined. The architecture of SAR image formation processors was discussed, and technology necessary to implement the SAR data processors used in both general purpose and dedicated imaging systems was addressed.
Yusof, Maryati Mohd; Khodambashi, Soudabeh; Mokhtar, Ariffin Marzuki
2012-12-21
Background: There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes with regard to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems that were identified during the study. Methods: We conducted a case study under actual clinical settings to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools, and used eVSM software to plot the Value Stream Map and A3 reports. Results: We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. Conclusions: The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy. PMID:23259846
Review of optimization techniques of polygeneration systems for building applications
NASA Astrophysics Data System (ADS)
Rong, A. Y.; Su, Y.; Lahdelma, R.
2016-08-01
Polygeneration means the simultaneous production of two or more energy products in a single integrated process. Polygeneration is an energy-efficient technology and plays an important role in the transition to future low-carbon energy systems. It can find wide application in utilities, different types of industrial sectors and building sectors. This paper mainly focuses on polygeneration applications in the building sector. The scales of polygeneration systems for buildings range from the micro level for a single home to the large level for residential districts. The development of polygeneration microgrids is also related to building applications. The paper aims at giving a comprehensive review of optimization techniques for designing, synthesizing and operating different types of polygeneration systems for building applications.
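As a minimal illustration of the class of optimization problems reviewed here, the sketch below solves a toy one-hour dispatch of a CHP unit, a boiler and grid imports as a linear program; all efficiencies, prices and demands are invented placeholders, not data from the review.

    import numpy as np
    from scipy.optimize import linprog

    # decision variables: x = [chp_fuel_kWh, boiler_fuel_kWh, grid_import_kWh]
    cost = np.array([0.04, 0.04, 0.15])             # EUR per kWh of fuel / grid electricity
    eta_chp_el, eta_chp_th, eta_boiler = 0.35, 0.45, 0.90
    power_demand, heat_demand = 50.0, 80.0          # kWh to be covered this hour

    # meet electricity and heat demand (written as <= constraints for linprog)
    A_ub = np.array([[-eta_chp_el, 0.0,         -1.0],   # electricity balance
                     [-eta_chp_th, -eta_boiler,  0.0]])  # heat balance
    b_ub = np.array([-power_demand, -heat_demand])

    res = linprog(cost, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * 3, method="highs")
    print("dispatch:", res.x, "cost:", res.fun)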
NASA Astrophysics Data System (ADS)
Ji, Yunguang; Xu, Yangyang; Li, Hongtao; Oklejas, Michael; Xue, Shuqi
2018-01-01
A new type of hydraulic turbocharger energy recovery system was designed and applied, for the first time in China, to the propylene carbonate decarbonisation process of a 100,000-ton ammonia synthesis system. Compared with existing energy recovery devices, the hydraulic turbocharger energy recovery system runs more smoothly and has a lower failure rate, a longer service life and greater overall benefits, owing to its unique structure, simpler adjustment process and better adaptability to fluid fluctuation.
45 CFR 205.37 - Responsibilities of the Administration for Children and Families (ACF).
Code of Federal Regulations, 2012 CFR
2012-10-01
... Application Processing and Information Retrieval System Guide. The initial advance automatic data processing... description of the proposed statewide management system, including the description of information flows, input..., review, assess, and inspect the planning, design, and operation of, statewide management information...
45 CFR 205.37 - Responsibilities of the Administration for Children and Families (ACF).
Code of Federal Regulations, 2011 CFR
2011-10-01
... Application Processing and Information Retrieval System Guide. The initial advance automatic data processing... description of the proposed statewide management system, including the description of information flows, input..., review, assess, and inspect the planning, design, and operation of, statewide management information...
The methods used for simulating aerosol physical and chemical processes in a new air pollution modeling system are discussed and analyzed. Such processes include emissions, nucleation, coagulation, reversible chemistry, condensation, dissolution, evaporation, irreversible chem...
ERIC Educational Resources Information Center
Haapaniemi, Peter
1990-01-01
Describes imaging technology, which allows huge numbers of words and illustrations to be reduced to a tiny fraction of the space required by the originals, and discusses current applications. Highlights include the image processing system at the National Archives; use by banks for high-speed check processing; engineering document management systems (EDMS); folder…
Second Order Boltzmann-Gibbs Principle for Polynomial Functions and Applications
NASA Astrophysics Data System (ADS)
Gonçalves, Patrícia; Jara, Milton; Simon, Marielle
2017-01-01
In this paper we give a new proof of the second order Boltzmann-Gibbs principle introduced in Gonçalves and Jara (Arch Ration Mech Anal 212(2):597-644, 2014). The proof does not require knowledge of the spectral gap inequality for the underlying model; instead it relies on a proper decomposition of the antisymmetric part of the current of the system in terms of polynomial functions. In addition, we fully derive the convergence of the equilibrium fluctuations towards (1) a trivial process in the case of super-diffusive systems, and (2) an Ornstein-Uhlenbeck process or the unique energy solution of the stochastic Burgers equation, as defined in Gubinelli and Jara (SPDEs Anal Comput (1):325-350, 2013) and Gubinelli and Perkowski (arXiv:1508.07764, 2015), in the case of weakly asymmetric diffusive systems. Examples and applications are presented for weakly and partially asymmetric exclusion processes, weakly asymmetric speed-change exclusion processes and Hamiltonian systems with exponential interactions.
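For orientation, the two limiting objects named in this abstract are commonly written, in schematic form with model-dependent constants A, B and D that the abstract does not specify, as

    \partial_t \mathcal{Y}_t = A\,\Delta \mathcal{Y}_t + \sqrt{D}\,\nabla \dot{\mathcal{W}}_t
        \qquad \text{(Ornstein-Uhlenbeck limit)},

    \partial_t \mathcal{Y}_t = A\,\Delta \mathcal{Y}_t + B\,\nabla\big(\mathcal{Y}_t^{\,2}\big)
        + \sqrt{D}\,\nabla \dot{\mathcal{W}}_t
        \qquad \text{(stochastic Burgers equation)},

where \dot{\mathcal{W}} denotes a space-time white noise and the square in the Burgers nonlinearity is understood in the renormalised sense of energy solutions.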
Neural network face recognition using wavelets
NASA Astrophysics Data System (ADS)
Karunaratne, Passant V.; Jouny, Ismail I.
1997-04-01
The recognition of human faces is a phenomenon that has been mastered by the human visual system and that has been researched extensively in the domain of computer neural networks and image processing. This research involves the study of neural networks and wavelet image processing techniques applied to human face recognition. The objective of the system is to acquire a digitized still image of a human face, carry out pre-processing on the image as required, and then, given a prior database of images of possible individuals, recognize the individual in the image. The pre-processing segment of the system includes several procedures, namely image compression, denoising, and feature extraction. The image processing is carried out using Daubechies wavelets. Once the images have been passed through the wavelet-based image processor they can be efficiently analyzed by means of a neural network. A back-propagation neural network is used for the recognition segment of the system. The main constraints on the system concern the characteristics of the images being processed: the system should be able to carry out effective recognition of human faces irrespective of the individual's facial expression, the presence of extraneous objects such as head-gear or spectacles, and face/head orientation. A potential application of this face recognition system would be as a secondary verification method in an automated teller machine.
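A minimal sketch of the pipeline described above (wavelet compression of the face image followed by a back-propagation classifier) is given below, assuming PyWavelets and scikit-learn as stand-ins for the original implementation; the face arrays and labels are random placeholders.

    import numpy as np
    import pywt                                   # PyWavelets, assumed available
    from sklearn.neural_network import MLPClassifier

    def wavelet_features(face, wavelet="db4", level=2):
        # Keep only the low-frequency approximation: compression and denoising in one step.
        coeffs = pywt.wavedec2(face, wavelet, level=level)
        return coeffs[0].ravel()

    # Placeholder data: 20 grayscale "faces" of 64x64 pixels for 4 individuals.
    rng = np.random.default_rng(0)
    faces = rng.random((20, 64, 64))
    labels = np.repeat(np.arange(4), 5)

    X = np.array([wavelet_features(f) for f in faces])
    clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0)
    clf.fit(X, labels)                            # back-propagation training
    print(clf.predict(X[:3]))                     # recognize individuals from feature vectors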
Application of the JDL data fusion process model for cyber security
NASA Astrophysics Data System (ADS)
Giacobe, Nicklaus A.
2010-04-01
A number of cyber security technologies have proposed the use of data fusion to enhance the defensive capabilities of the network and aid in the development of situational awareness for the security analyst. While there have been advances in fusion technologies and the application of fusion in intrusion detection systems (IDSs), in particular, additional progress can be made by gaining a better understanding of a variety of data fusion processes and applying them to the cyber security application domain. This research explores the underlying processes identified in the Joint Directors of Laboratories (JDL) data fusion process model and further describes them in a cyber security context.
Computers for Manned Space Applications Based on Commercial Off-the-Shelf Components
NASA Astrophysics Data System (ADS)
Vogel, T.; Gronowski, M.
2009-05-01
Similar to the consumer markets, there has been an ever-increasing demand in processing power, signal processing capabilities and memory space for computers used for science data processing in space. An important driver of this development has been the payload developers for the International Space Station, requesting high-speed data acquisition and fast control loops in increasingly complex systems. Current experiments now even perform video processing and compression with their payload controllers. Nowadays the requirements for a space-qualified computer are often far beyond the capabilities of, for example, the classic SPARC architecture found in ERC32 or LEON CPUs. An increase in performance usually demands costly and power-consuming application-specific solutions. Continuous developments over the last few years have now led to an alternative approach that is based on complete electronics modules manufactured for commercial and industrial customers. Computer modules used in industrial environments with a high demand for reliability under harsh environmental conditions, such as chemical reactors, electrical power plants or manufacturing lines, are entered into a selection procedure. Promising candidates then undergo a detailed characterisation process developed by Astrium Space Transportation. After thorough analysis and some modifications, these modules can replace fully qualified custom-built electronics in specific, although not safety-critical, applications in manned space. This paper focuses on the benefits of COTS (commercial off-the-shelf) based electronics modules and the necessary analyses and modifications for their utilisation in manned space applications on the ISS. Some considerations regarding overall systems architecture are also included. Furthermore, this paper pinpoints issues that render such modules unsuitable for specific tasks, and justifies the reasons. Finally, the conclusion of this paper advocates the implementation of COTS-based electronics for a range of applications within specifically adapted systems. The findings in this paper are extrapolated from two reference computer systems, both launched in 2008. One was a LEON-2 based computer installed onboard the Columbus Orbital Facility, while the other system consisted mainly of a commercial PowerPC module that was modified for launch mounted on the ICC pallet in the Space Shuttle's cargo bay. Both systems are currently being upgraded and extended for future applications.
Theory of a general class of dissipative processes.
NASA Technical Reports Server (NTRS)
Hale, J. K.; Lasalle, J. P.; Slemrod, M.
1972-01-01
Development of a theory of periodic processes that is of sufficient generality to be applied to systems defined by partial differential equations (distributed parameter systems) and functional differential equations of the retarded and neutral type (hereditary systems), as well as to systems arising in the theory of elasticity. In particular, an attempt is made to develop a meaningful general theory of dissipative periodic systems with a wide range of applications.
Adapting Nielsen’s Design Heuristics to Dual Processing for Clinical Decision Support
Taft, Teresa; Staes, Catherine; Slager, Stacey; Weir, Charlene
2016-01-01
The study objective was to improve the applicability of Nielsen’s standard design heuristics for evaluating electronic health record (EHR) alerts and linked ordering support by integrating them with Dual Process theory. Through an initial heuristic evaluation and a user study of 7 physicians, usability problems were identified. Through independent mapping of specific usability criteria to support for each of the dual cognitive processes (S1 and S2) and deliberation, agreement was reached on mapping criteria. Finally, usability errors from the heuristic evaluation and the user study were mapped to S1 and S2. Adding a dual process perspective to specific heuristic analysis increases the applicability and relevance of computerized health information design evaluations. This mapping enables designers to ensure that their systems are tailored to support attention allocation: System 1 is supported by improving pattern recognition and saliency, and System 2 through efficiency and control of information access. PMID:28269915
Effects of Amine and Anhydride Curing Agents on the VARTM Matrix Processing Properties
NASA Technical Reports Server (NTRS)
Grimsley, Brian W.; Hubert, Pascal; Song, Xiaolan; Cano, Roberto J.; Loos, Alfred C.; Pipes, R. Byron
2002-01-01
To ensure successful application of composite structure for aerospace vehicles, it is necessary to develop material systems that meet a variety of requirements. The industry has recently developed a number of low-viscosity epoxy resins to meet the processing requirements associated with vacuum assisted resin transfer molding (VARTM) of aerospace components. The curing kinetics and viscosity of two of these resins, an amine-cured epoxy system, Applied Poleramic, Inc. VR-56-4, and an anhydride-cured epoxy system, A.T.A.R.D. Laboratories SI-ZG-5A, have been characterized for application in the VARTM process. Simulations were carried out using the process model, COMPRO, to examine heat transfer, curing kinetics and viscosity for different panel thicknesses and cure cycles. Results of these simulations indicate that the two resins have significantly different curing behaviors and flow characteristics.
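The abstract does not give the kinetic model used in COMPRO, but epoxy cure kinetics is often described with a Kamal-type autocatalytic form; the sketch below integrates such a model for an assumed isothermal cure, with all parameters chosen purely for illustration rather than taken from the VR-56-4 or SI-ZG-5A characterizations.

    import numpy as np
    from scipy.integrate import solve_ivp

    R = 8.314  # universal gas constant, J/(mol K)

    def kamal_rate(t, alpha, T):
        # Kamal autocatalytic cure model: d(alpha)/dt = (k1 + k2*alpha^m) * (1 - alpha)^n
        # All pre-exponentials, activation energies and exponents are placeholders.
        k1 = 1.0e5 * np.exp(-60e3 / (R * T))
        k2 = 5.0e5 * np.exp(-55e3 / (R * T))
        m, n = 0.5, 1.5
        return [(k1 + k2 * alpha[0] ** m) * (1.0 - alpha[0]) ** n]

    T_cure = 450.0  # assumed isothermal cure temperature, K
    sol = solve_ivp(kamal_rate, (0.0, 7200.0), [0.0], args=(T_cure,),
                    t_eval=np.linspace(0.0, 7200.0, 9))
    for t, a in zip(sol.t, sol.y[0]):
        print(f"t = {t:6.0f} s  degree of cure = {a:.3f}")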
Process for selecting engineering tools: applied to selecting a SysML tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Spain, Mark J.; Post, Debra S.; Taylor, Jeffrey L.
2011-02-01
Process for Selecting Engineering Tools outlines the process and tools used to select a SysML (Systems Modeling Language) tool. The process is general in nature and users could use the process to select most engineering tools and software applications.
Rath, N; Kato, S; Levesque, J P; Mauel, M E; Navratil, G A; Peng, Q
2014-04-01
Fast digital signal processing (DSP) has many applications. Typical hardware options for performing DSP are field-programmable gate arrays (FPGAs), application-specific integrated DSP chips, or general-purpose personal computer systems. This paper presents a novel DSP platform that has been developed for feedback control on the HBT-EP tokamak device. The system runs all signal processing exclusively on a Graphics Processing Unit (GPU) to achieve real-time performance with latencies below 8 μs. Signals are transferred into and out of the GPU using PCI Express peer-to-peer direct-memory-access transfers without involvement of the central processing unit or host memory. Tests were performed on the feedback control system of the HBT-EP tokamak using forty 16-bit floating-point inputs and outputs each and a sampling rate of up to 250 kHz. Signals were digitized by a D-TACQ ACQ196 module, processing was done on an NVIDIA GTX 580 GPU programmed in CUDA, and analog output was generated by D-TACQ AO32CPCI modules.
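The sketch below is a schematic stand-in for the control cycle described above (acquire samples, apply a per-cycle control law, write outputs, measure latency); it runs on the CPU with NumPy and placeholder I/O rather than the CUDA and PCIe peer-to-peer path of the actual system, and the channel count and control matrix are illustrative.

    import time
    import numpy as np

    n_ch = 40
    control_matrix = np.eye(n_ch) * 0.5            # placeholder control law

    def acquire_samples():
        return np.random.rand(n_ch).astype(np.float32)   # stand-in for the digitizer

    def write_outputs(y):
        pass                                       # stand-in for the analog-output module

    latencies = []
    for _ in range(1000):
        t0 = time.perf_counter()
        x = acquire_samples()
        y = control_matrix @ x                     # per-cycle signal processing
        write_outputs(y)
        latencies.append(time.perf_counter() - t0)

    print(f"median cycle time: {np.median(latencies) * 1e6:.1f} us")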
Application of agent-based system for bioprocess description and process improvement.
Gao, Ying; Kipling, Katie; Glassey, Jarka; Willis, Mark; Montague, Gary; Zhou, Yuhong; Titchener-Hooker, Nigel J
2010-01-01
Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making either to maintain process consistency or to identify optimal operating conditions. To predict the whole bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze the bioprocesses based on a whole process understanding and considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases leading to the capacity for fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components co-operate with each other in performing their tasks. These include the description of the whole process behavior, evaluating process operating conditions, monitoring of the operating processes, predicting critical process performance, and providing guidance to decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, the system can be applied to identify abnormal process operation events and then to provide suggestions as to how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. The implementation of the agent-based approach is illustrated via selected application scenarios, which demonstrate how such a framework may enable the better integration of process operations by providing a plant-wide process description to facilitate process improvement. Copyright 2009 American Institute of Chemical Engineers
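The sketch below is a schematic illustration of the agent-style composition idea (unit-operation models wrapped as agents and chained to predict whole-process behaviour); the yield models, message format and chaining logic are invented for illustration and are not the authors' framework.

    from dataclasses import dataclass

    @dataclass
    class Message:
        topic: str
        payload: dict

    class UnitAgent:
        # Wraps one unit-operation model and responds to messages with its prediction.
        def __init__(self, name, model):
            self.name, self.model = name, model
        def handle(self, msg):
            return Message(topic=f"{self.name}/done", payload=self.model(msg.payload))

    def fermentation(inputs):
        # toy yield model: titre proportional to feed concentration
        return {"titre_g_per_L": 2.0 * inputs["feed_g_per_L"] * 0.4}

    def chromatography(inputs):
        # toy recovery model: 85 % of the incoming titre is recovered
        return {"product_g_per_L": inputs["titre_g_per_L"] * 0.85}

    agents = [UnitAgent("fermentation", fermentation),
              UnitAgent("chromatography", chromatography)]

    msg = Message("start", {"feed_g_per_L": 10.0})
    for agent in agents:                 # the "broker": pass each agent's output downstream
        msg = agent.handle(msg)
    print(msg.payload)                   # whole-process prediction from chained unit models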
Designing the accident and emergency system: lessons from manufacturing.
Walley, P
2003-03-01
To review the literature on manufacturing process design and demonstrate its applicability in health care. Literature review and application of theory using two years' activity data from two healthcare communities and extensive observation of activities over a six-week period by seven researchers. It was possible to identify patient flows that could be used to design treatment processes around the needs of the patient. Some queues are built into existing treatment processes and can be removed by better process design. Capacity imbalance, not capacity shortage, causes some unnecessary waiting in accident and emergency departments. Clinicians would find that modern manufacturing theories produce more acceptable designs of systems. In particular, good quality is seen as a necessary prerequisite of fast, efficient services.
Systems Engineering in NASA's R&TD Programs
NASA Technical Reports Server (NTRS)
Jones, Harry
2005-01-01
Systems engineering is largely the analysis and planning that support the design, development, and operation of systems. The most common application of systems engineering is in guiding systems development projects that use a phased process of requirements, specifications, design, and development. This paper investigates how systems engineering techniques should be applied in research and technology development programs for advanced space systems. These programs should include anticipatory engineering of future space flight systems and a project portfolio selection process, as well as systems engineering for multiple development projects.
NASA Technical Reports Server (NTRS)
1975-01-01
User benefits resulting from the application of space systems to previously described application areas were identified, and methods to assign priorities to application areas and to quantify the benefits were described. The following areas were selected for in-depth review: communications, materials processing in space, weather and climate, and institutional arrangements for space applications. Recommendations concerning studies that should be undertaken to develop a more precise understanding of the source and magnitude of the realizable economic benefits were also presented.
The Aerospace Energy Systems Laboratory: A BITBUS networking application
NASA Technical Reports Server (NTRS)
Glover, Richard D.; Oneill-Rood, Nora
1989-01-01
The NASA Ames-Dryden Flight Research Facility developed a computerized aircraft battery servicing facility called the Aerospace Energy Systems Laboratory (AESL). This system employs distributed processing with communications provided by a 2.4-megabit BITBUS local area network. Customized handlers provide real-time status, remote command, and file transfer protocols between a central system running the iRMX-II operating system and ten slave stations running the iRMX-I operating system. The hardware configuration and software components required to implement this BITBUS application are described.
Conference on Space and Military Applications of Automation and Robotics
NASA Technical Reports Server (NTRS)
1988-01-01
Topics addressed include: robotics; deployment strategies; artificial intelligence; expert systems; sensors and image processing; robotic systems; guidance, navigation, and control; aerospace and missile system manufacturing; and telerobotics.
NASA Astrophysics Data System (ADS)
Lemmen, Carsten; Hofmeister, Richard; Klingbeil, Knut; Hassan Nasermoaddeli, M.; Kerimoglu, Onur; Burchard, Hans; Kösters, Frank; Wirtz, Kai W.
2018-03-01
Shelf and coastal sea processes extend from the atmosphere through the water column and into the seabed. These processes reflect intimate interactions between physical, chemical, and biological states on multiple scales. As a consequence, coastal system modelling requires a high and flexible degree of process and domain integration; this has so far hardly been achieved by current model systems. The lack of modularity and flexibility in integrated models hinders the exchange of data and model components and has historically imposed the supremacy of specific physical driver models. We present the Modular System for Shelves and Coasts (MOSSCO; http://www.mossco.de), a novel domain and process coupling system tailored but not limited to the coupling challenges of and applications in the coastal ocean. MOSSCO builds on the Earth System Modeling Framework (ESMF) and on the Framework for Aquatic Biogeochemical Models (FABM). It goes beyond existing technologies by creating a unique level of modularity in both domain and process coupling, including a clear separation of component and basic model interfaces, flexible scheduling of several tens of models, and facilitation of iterative development at the lab and the station and on the coastal ocean scale. MOSSCO is rich in metadata and its concepts are also applicable outside the coastal domain. For coastal modelling, it contains dozens of example coupling configurations and tested set-ups for coupled applications. Thus, MOSSCO addresses the technology needs of a growing marine coastal Earth system community that encompasses very different disciplines, numerical tools, and research questions.
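The sketch below illustrates the general component/scheduler separation that such coupling frameworks provide: every component exposes the same init/run/finalize interface and a scheduler exchanges named fields between them each time step. It is a generic illustration of the modularity idea, not the ESMF or FABM API used by MOSSCO, and the two components and their fields are invented.

    class Component:
        def __init__(self, name):
            self.name = name
        def init(self):
            pass
        def run(self, t, dt, imports):
            return {}                      # exported fields
        def finalize(self):
            pass

    class Hydrodynamics(Component):
        def run(self, t, dt, imports):
            return {"bottom_stress": 0.1 + 0.05 * t}

    class Biogeochemistry(Component):
        def run(self, t, dt, imports):
            stress = imports.get("bottom_stress", 0.0)
            return {"oxygen_flux": -0.2 * stress}

    def couple(components, t_end, dt):
        # simple lockstep scheduler; the run order is a configuration choice
        state = {}
        for c in components:
            c.init()
        t = 0.0
        while t < t_end:
            for c in components:
                state.update(c.run(t, dt, state))
            t += dt
        for c in components:
            c.finalize()
        return state

    print(couple([Hydrodynamics("hydro"), Biogeochemistry("bgc")], t_end=3.0, dt=1.0))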
Pc-Based Floating Point Imaging Workstation
NASA Astrophysics Data System (ADS)
Guzak, Chris J.; Pier, Richard M.; Chinn, Patty; Kim, Yongmin
1989-07-01
The medical, military, scientific and industrial communities have come to rely on imaging and computer graphics for solutions to many types of problems. Systems based on imaging technology are used to acquire and process images, and analyze and extract data from images that would otherwise be of little use. Images can be transformed and enhanced to reveal detail and meaning that would go undetected without imaging techniques. The success of imaging has increased the demand for faster and less expensive imaging systems and as these systems become available, more and more applications are discovered and more demands are made. From the designer's perspective the challenge to meet these demands forces him to attack the problem of imaging from a different perspective. The computing demands of imaging algorithms must be balanced against the desire for affordability and flexibility. Systems must be flexible and easy to use, ready for current applications but at the same time anticipating new, unthought of uses. Here at the University of Washington Image Processing Systems Lab (IPSL) we are focusing our attention on imaging and graphics systems that implement imaging algorithms for use in an interactive environment. We have developed a PC-based imaging workstation with the goal to provide powerful and flexible, floating point processing capabilities, along with graphics functions in an affordable package suitable for diverse environments and many applications.
NASA Technical Reports Server (NTRS)
Deb, Somnath (Inventor); Ghoshal, Sudipto (Inventor); Malepati, Venkata N. (Inventor); Kleinman, David L. (Inventor); Cavanaugh, Kevin F. (Inventor)
2004-01-01
A network-based diagnosis server for monitoring and diagnosing a system, the server being remote from the system it is observing, comprises a sensor for generating signals indicative of a characteristic of a component of the system, a network-interfaced sensor agent coupled to the sensor for receiving signals therefrom, a broker module coupled to the network for sending signals to and receiving signals from the sensor agent, a handler application connected to the broker module for transmitting signals to and receiving signals therefrom, a reasoner application in communication with the handler application for processing, and responding to signals received from the handler application, wherein the sensor agent, broker module, handler application, and reasoner applications operate simultaneously relative to each other, such that the present invention diagnosis server performs continuous monitoring and diagnosing of said components of the system in real time. The diagnosis server is readily adaptable to various different systems.
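A schematic sketch of the described pipeline (sensor agent, broker, handler, reasoner) is given below; the queue-based broker, the temperature rule and all names are illustrative stand-ins, not the patented implementation.

    import queue

    broker = queue.Queue()          # stands in for the network broker module

    def sensor_agent(reading_c):
        # generates a signal indicative of a component characteristic
        broker.put({"component": "pump", "temperature_c": reading_c})

    def reasoner(msg):
        # trivial rule base: flag overheating components
        if msg["temperature_c"] > 80.0:
            return f"ALERT: {msg['component']} overheating ({msg['temperature_c']} C)"
        return f"OK: {msg['component']} nominal"

    def handler(msg):
        # forwards broker messages to the reasoner and returns its response
        return reasoner(msg)

    # one monitoring cycle
    for reading in (65.0, 91.5):
        sensor_agent(reading)
    while not broker.empty():
        print(handler(broker.get()))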
PI2GIS: processing image to geographical information systems, a learning tool for QGIS
NASA Astrophysics Data System (ADS)
Correia, R.; Teodoro, A.; Duarte, L.
2017-10-01
To perform an accurate interpretation of remote sensing images, it is necessary to extract information using different image processing techniques. Nowadays, it has become usual to use image processing plugins to add new capabilities/functionalities integrated into Geographical Information System (GIS) software. The aim of this work was to develop an open source application to automatically process and classify remote sensing images from a set of satellite input data. The application was integrated into a GIS software package (QGIS), automating several image processing steps. The use of QGIS for this purpose is justified since it is easy and quick to develop new plugins using the Python language. This plugin is inspired by the Semi-Automatic Classification Plugin (SCP) developed by Luca Congedo. SCP allows the supervised classification of remote sensing images, the calculation of vegetation indices such as NDVI (Normalized Difference Vegetation Index) and EVI (Enhanced Vegetation Index), and other image processing operations. When analysing SCP, it was realized that a set of operations that are very useful in teaching remote sensing and image processing classes was lacking, such as the visualization of histograms, the application of filters, different image corrections, unsupervised classification and the computation of several environmental indices. The new set of operations included in the PI2GIS plugin can be divided into three groups: pre-processing, processing, and classification procedures. The application was tested using a Landsat 8 OLI image of an area in the north of Portugal.
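As a concrete example of the index computations mentioned above, the sketch below evaluates NDVI and EVI from placeholder Landsat 8 OLI band arrays and builds an NDVI histogram; in the plugin itself the rasters would come from QGIS layers rather than random data.

    import numpy as np

    # Placeholder reflectance arrays standing in for OLI band 2 (blue), band 4 (red)
    # and band 5 (NIR).
    blue = np.random.rand(100, 100).astype(np.float32)
    red  = np.random.rand(100, 100).astype(np.float32)
    nir  = np.random.rand(100, 100).astype(np.float32)

    eps = 1e-9                                    # avoid division by zero
    ndvi = (nir - red) / (nir + red + eps)
    evi  = 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0 + eps)

    # Histogram of NDVI values, one of the visualizations the plugin adds
    counts, edges = np.histogram(ndvi, bins=20, range=(-1.0, 1.0))
    print(counts)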
NASA Astrophysics Data System (ADS)
Pescaru, A.; Oanta, E.; Axinte, T.; Dascalescu, A.-D.
2015-11-01
Computer-aided engineering is based on models of phenomena that are expressed as algorithms. The implementations of the algorithms are usually software applications that process a large volume of numerical data, regardless of the size of the input data. In this way, finite element method applications used to have an input data generator that created the entire volume of geometrical data, starting from the initial geometrical information and the parameters stored in the input data file. Moreover, there were several data processing stages, such as: renumbering of the nodes to minimize the bandwidth of the system of equations to be solved, computation of the equivalent nodal forces, computation of the element stiffness matrices, assembly of the system of equations, solution of the system of equations, and computation of the secondary variables. Modern software applications use pre-processing and post-processing programs to handle the information easily. Beyond this example, CAE applications use various stages of complex computation, the accuracy of the final results being of particular interest. Over time, the development of CAE applications has been a constant concern of the authors, and the accuracy of the results has been a very important target. The paper presents the various computing techniques that were devised and implemented in the resulting applications: finite element method programs, finite difference element method programs, applied general numerical methods applications, data generators, graphical applications, and experimental data reduction programs. In this context, the use of extended precision data types was one of the solutions, the limitations being imposed by the size of the memory that may be allocated. To avoid memory-related problems the data was stored in files; to minimize the execution time, part of each file was accessed using dynamic memory allocation facilities. One of the most important outcomes of the paper is the design of a library that includes the optimized solutions previously tested, which may be used for the easy development of original cross-platform CAE applications. Last but not least, besides the generality of the data type solutions, the development of a software library is targeted that may be used for the easy development of node-based CAE applications, each node having several known or unknown parameters, with the system of equations being automatically generated and solved.
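The sketch below illustrates the extended-precision point in isolation: the same naive accumulation carried out in float32, float64 and numpy.longdouble drifts from the exact result at very different rates. It is a generic illustration (and longdouble precision is platform dependent), not the authors' library.

    import numpy as np

    # Repeatedly add 1/3 in three floating-point precisions and compare with the
    # exact result; lower-precision types accumulate error much faster.
    n = 1_000_000
    exact = n / 3.0
    for dtype in (np.float32, np.float64, np.longdouble):
        third = dtype(1) / dtype(3)
        total = dtype(0)
        for _ in range(n):
            total = total + third
        print(f"{dtype.__name__:>10}: error = {float(total) - exact:+.3e}")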
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wantuck, P. J.; Hollen, R. M.
2002-01-01
This paper provides an overview of some design and automation-related projects ongoing within the Applied Engineering Technologies (AET) Group at Los Alamos National Laboratory. AET uses a diverse set of technical capabilities to develop and apply processes and technologies to applications for a variety of customers both internal and external to the Laboratory. The Advanced Recovery and Integrated Extraction System (ARIES) represents a new paradigm for the processing of nuclear material from retired weapon systems in an environment that seeks to minimize the radiation dose to workers. To achieve this goal, ARIES relies upon automation-based features to handle and process the nuclear material. Our Chemical Process Development Team specializes in fuzzy logic and intelligent control systems. Neural network technology has been utilized in some advanced control systems developed by team members. Genetic algorithms and neural networks have often been applied for data analysis. Enterprise modeling, or discrete event simulation, as well as chemical process simulation has been employed for chemical process plant design. Fuel cell research and development has historically been an active effort within the AET organization. Under the principal sponsorship of the Department of Energy, the Fuel Cell Team is now focusing on technologies required to produce fuel cell compatible feed gas from reformation of a variety of conventional fuels (e.g., gasoline, natural gas), principally for automotive applications. This effort involves chemical reactor design and analysis, process modeling, catalyst analysis, as well as full scale system characterization and testing. The group's Automation and Robotics team has at its foundation many years of experience delivering automated and robotic systems for nuclear, analytical chemistry, and bioengineering applications. As an integrator of commercial systems and a developer of unique custom-made systems, the team currently supports the automation needs of many Laboratory programs.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 10 2010-07-01 2010-07-01 false Closed vent systems and control devices; or emissions routed to a fuel gas system or process standards. 63.1034 Section 63.1034 Protection... stringent. The 20 parts per million by volume standard is not applicable to the provisions of § 63.1016. (ii...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 10 2011-07-01 2011-07-01 false Closed vent systems and control devices; or emissions routed to a fuel gas system or process standards. 63.1034 Section 63.1034 Protection... stringent. The 20 parts per million by volume standard is not applicable to the provisions of § 63.1016. (ii...