Science.gov

Sample records for processing system development

  1. An Instructional Systems Development Process.

    ERIC Educational Resources Information Center

    Campbell, Clifton P.

    Instructional systems development (ISD) is a systems approach to curriculum development and instructional delivery. It is oriented toward occupational needs, with an emphasis on what students must learn to perform specific tasks, what facilities best provide a setting for the necessary learning, and what instructional methods and media…

  2. Developing an Internal Processing System.

    ERIC Educational Resources Information Center

    DeFord, Diane

    1997-01-01

    The goal in Reading Recovery is to support children in developing "in the head" operations, or strategies, that help them solve problems as they read and write continuous text. To help children organize experience and correct any idiosyncratic or unreliable relationships, teachers must understand how children develop their internal processing…

  3. The message processing and distribution system development

    NASA Astrophysics Data System (ADS)

    Whitten, K. L.

    1981-06-01

    A historical approach is used in presenting the life cycle development of the Navy's message processing and distribution system beginning with the planning phase and ending with the integrated logistic support phase. Several maintenance problems which occurred after the system was accepted for fleet use were examined to determine if they resulted from errors in the acquisition process. The critical decision points of the acquisition process are examined and constructive recommendations are made for avoiding the problems which hindered the successful development of this system.

  4. Teaching Information Systems Development via Process Variants

    ERIC Educational Resources Information Center

    Tan, Wee-Kek; Tan, Chuan-Hoo

    2010-01-01

    Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. It is therefore necessary for educators to impart to students this crucial skill. However, Situational Method Engineering (SME) is an inherently complex process that…

  5. Digital processing system for developing countries

    NASA Technical Reports Server (NTRS)

    Nanayakkara, C.; Wagner, H.

    1977-01-01

    An effort was undertaken to perform simple digital processing tasks using pre-existing general purpose digital computers. An experimental software package, LIGMALS, was obtained and modified for this purpose. The resulting software permits basic processing tasks to be performed including level slicing, gray mapping and ratio processing. The experience gained in this project indicates a possible direction which may be used by other developing countries to obtain digital processing capabilities.
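
    The three basic operations named above can be sketched with simple array manipulations. A minimal illustration follows; the function names and 8-bit conventions are assumptions for this sketch, not LIGMALS's actual interface:

```python
import numpy as np

def level_slice(band, lo, hi):
    """Binary mask of pixels whose values fall inside [lo, hi]."""
    return ((band >= lo) & (band <= hi)).astype(np.uint8)

def gray_map(band, lut):
    """Remap 8-bit pixel values through a 256-entry lookup table."""
    return lut[band]

def band_ratio(b1, b2, scale=128.0):
    """Ratio of two bands, scaled and clipped back to the 8-bit range."""
    r = scale * b1.astype(float) / np.maximum(b2.astype(float), 1.0)
    return np.clip(r, 0, 255).astype(np.uint8)

# Tiny 2x2 example band
band = np.array([[10, 200], [90, 120]], dtype=np.uint8)
mask = level_slice(band, 50, 150)                        # 1 where 50 <= v <= 150
inverted = gray_map(band, np.arange(255, -1, -1).astype(np.uint8))
ratio = band_ratio(band, band)                           # self-ratio -> constant scale
```

    Level slicing yields a thematic mask, gray mapping applies a contrast transform (here, inversion), and ratioing suppresses illumination differences between two spectral bands.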

  6. A Comprehensive Process for Display Systems Development.

    ERIC Educational Resources Information Center

    Simcox, William A.

    A comprehensive development process for display design, focusing on computer-generated cathode ray tube (CRT) displays, is presented. A framework is created for breaking the display into its component parts, which is used to guide the design process. The objective is to design or select the most cost-effective graphics solution (hardware and software) to…

  7. System Development by Process Integrated Knowledge Management

    NASA Astrophysics Data System (ADS)

    Stoll, Margareth; Laner, Dietmar

    Due to globalization and ever shorter change cycles, organizations improve their products, services, technologies, IT, and organization increasingly quickly in line with customer requirements, optimize their efficiency and effectiveness, and reduce costs. Thus the largest potential lies in continual improvement and in the management of information, data, and knowledge. For a long time, organizations developed many separate and frequently independent IT applications. In recent years these have been integrated by interfaces and, increasingly, by common databases. In large enterprises or in public administration, IT must operate many different applications, which requires considerable personnel and cost. Many organizations improve their IT starting from the lived processes using new technologies, but do not ask how they can use technology to support new processes.

  8. Spacelab Data Processing Facility (SLDPF) quality assurance expert systems development

    NASA Technical Reports Server (NTRS)

    Basile, Lisa R.; Kelly, Angelita C.

    1987-01-01

    The Spacelab Data Processing Facility (SLDPF) is an integral part of the Space Shuttle data network for missions that involve attached scientific payloads. Expert system prototypes were developed to aid in the performance of the quality assurance function of the Spacelab and/or Attached Shuttle Payloads processed telemetry data. The Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS), two expert systems, were developed to determine their feasibility and potential in the quality assurance of processed telemetry data. The capabilities and performance of these systems are discussed.

  9. The Systems Engineering Process for Human Support Technology Development

    NASA Technical Reports Server (NTRS)

    Jones, Harry

    2005-01-01

    Systems engineering is designing and optimizing systems. This paper reviews the systems engineering process and indicates how it can be applied in the development of advanced human support systems. Systems engineering develops the performance requirements, subsystem specifications, and detailed designs needed to construct a desired system. Systems design is difficult, requiring both art and science and balancing human and technical considerations. The essential systems engineering activity is trading off and compromising between competing objectives such as performance and cost, schedule and risk. Systems engineering is not a complete independent process. It usually supports a system development project. This review emphasizes the NASA project management process as described in NASA Procedural Requirement (NPR) 7120.5B. The process is a top down phased approach that includes the most fundamental activities of systems engineering - requirements definition, systems analysis, and design. NPR 7120.5B also requires projects to perform the engineering analyses needed to ensure that the system will operate correctly with regard to reliability, safety, risk, cost, and human factors. We review the system development project process, the standard systems engineering design methodology, and some of the specialized systems analysis techniques. We will discuss how they could apply to advanced human support systems development. The purpose of advanced systems development is not directly to supply human space flight hardware, but rather to provide superior candidate systems that will be selected for implementation by future missions. The most direct application of systems engineering is in guiding the development of prototype and flight experiment hardware. However, anticipatory systems engineering of possible future flight systems would be useful in identifying the most promising development projects.

  10. Development of the Diagnostic Expert System for Tea Processing

    NASA Astrophysics Data System (ADS)

    Yoshitomi, Hitoshi; Yamaguchi, Yuichi

    A diagnostic expert system for tea processing, which can infer the cause of defects in processed tea, was developed to contribute to the improvement of tea processing. The system, which consists of several programs, can be used over the Internet. The inference engine at the core of the system adopts a production system, a technique well established in artificial intelligence, and is coded in Prolog, an artificial-intelligence-oriented language. At present, 176 inference rules have been registered in the system. The system will infer better as more rules are added.
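
    The production-system style of inference described above (condition-action rules fired repeatedly until no new conclusions emerge) can be sketched as follows. The actual system is written in Prolog; this Python analogue and its rules are illustrative only, not the 176 tea-processing rules of the paper:

```python
# Minimal forward-chaining production system. Each rule is a pair
# (set of condition facts, conclusion fact). All rule names are hypothetical.
RULES = [
    ({"leaf_color_red"}, "oxidation_too_long"),
    ({"burnt_smell", "high_dryer_temp"}, "dryer_temperature_too_high"),
    ({"oxidation_too_long"}, "reduce_withering_time"),
]

def infer(facts, rules):
    """Fire every rule whose conditions are satisfied, until a fixpoint."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Observing red leaf color lets the engine chain to a suggested remedy.
result = infer({"leaf_color_red"}, RULES)
```

    Chaining matters here: the first rule's conclusion satisfies the third rule's condition, so a single observation can yield both a diagnosis and a remedy.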

  11. Spacelab Data Processing Facility (SLDPF) quality assurance expert systems development

    NASA Technical Reports Server (NTRS)

    Kelly, Angelita C.; Basile, Lisa; Ames, Troy; Watson, Janice; Dallam, William

    1987-01-01

    Spacelab Data Processing Facility (SLDPF) expert system prototypes were developed to assist in the quality assurance of Spacelab and/or Attached Shuttle Payload (ASP) processed telemetry data. The SLDPF functions include the capturing, quality monitoring, processing, accounting, and forwarding of mission data to various user facilities. Prototypes for the two SLDPF functional elements, the Spacelab Output Processing System and the Spacelab Input Processing Element, are described. The prototypes have produced beneficial results including an increase in analyst productivity, a decrease in the burden of tedious analyses, the consistent evaluation of data, and the providing of concise historical records.

  13. Metal containing material processing on coater/developer system

    NASA Astrophysics Data System (ADS)

    Kawakami, Shinichiro; Mizunoura, Hiroshi; Matsunaga, Koichi; Hontake, Koichi; Nakamura, Hiroshi; Shimura, Satoru; Enomoto, Masashi

    2016-03-01

    Challenges of processing metal containing materials need to be addressed in order to apply this technology. The behavior of metal containing materials in coater/developer processing, including the coating process, the develop process, and tool metal contamination, was studied using the CLEAN TRACK(TM) LITHIUS Pro(TM) Z (Tokyo Electron Limited). Through this work, coating uniformity and coating film defectivity were studied. Metal containing material performance was comparable to that of conventional materials. In particular, a new dispense system (NDS) demonstrated up to an 80% reduction in coating defects for metal containing materials. As for processed-wafer metal contamination, coated-wafer metal contamination of less than 1.0E10 atoms/cm2 was achieved with three materials. After-develop metal contamination of less than 1.0E10 atoms/cm2 was also achieved with two materials. Furthermore, through the metal defect study, metal residues and metal contamination were reduced by developer rinse optimization.

  14. Guideline Development Process in a Public Workers' Compensation System.

    PubMed

    Javaher, Simone P

    2015-08-01

    Washington state's public workers' compensation system has had a formal process for developing and implementing evidence-based clinical practice guidelines since 2007. Collaborating with the Industrial Insurance Medical Advisory Committee and clinicians from the medical community, the Office of the Medical Director has provided leadership and staff support necessary to develop guidelines that have improved outcomes and reduced the number of potentially harmful procedures. Guidelines are selected according to a prioritization schema and follow a development process consistent with that of the national Institute of Medicine. Evaluation criteria are also applied. Guidelines continue to be developed to provide clinical recommendations for optimizing care and reducing risk of harm. PMID:26231956

  15. Process approach in developing or improvement of student information systems

    NASA Astrophysics Data System (ADS)

    Jaskowska, Małgorzata

    2015-02-01

    The aim of the research described in this article was to evaluate the usefulness of a university information system ahead of its reorganization. The study was conducted among representatives of all stakeholders and system users: candidates, students, and university authorities. The need expressed by system users in the study was a change of approach in the system's construction, from purely informational to procedural, which is consistent with the current process approach in systems design, reinforced by the popular service oriented architecture (SOA). This thread was developed by conducting literature research and analyzing best practices of student information systems. As a result, processes were selected and described whose implementation may assist the university system. The research results can be used by system designers for its improvement.

  16. Development of video processing based on coal flame detector system

    SciTech Connect

    He Wanqing; Yu Yuefeng; Xu Weiyong; Ma Liqun

    1999-07-01

    The principle and development of a pulverized coal combustion flame detection system, an intelligent image flame detector based on digital video processing, are addressed in this paper. The system realizes multi-burner flame detection and processing using a distributed structure of an engineering workstation and flame detectors connected via multi-serial-port communication. The software handles multiple tasks in parallel using a multi-thread mechanism. Streaming video capture and storage are provided to save and play back AVI clips of accidents. The layer flame detectors give the flame on/off signal through image processing. Pseudo-color visualization of flame temperature calculated from the chromatic CCD signal is integrated into the system. The image flame detector system has been successfully used in thermal power generation units in China.
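
    The pseudo-color visualization step mentioned above can be sketched as a mapping from a temperature field to an RGB image. A simple linear blue-to-red colormap is shown below; the temperature range and the colormap are illustrative assumptions, not the paper's actual CCD-based calibration:

```python
import numpy as np

def pseudo_color(temp, t_min=1200.0, t_max=1800.0):
    """Map a temperature field (K) to a blue->red pseudo-color image.

    Illustrative linear colormap: cold pixels render blue, hot pixels red.
    """
    t = np.clip((temp - t_min) / (t_max - t_min), 0.0, 1.0)
    rgb = np.empty(temp.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = (255 * t).astype(np.uint8)        # red grows with temperature
    rgb[..., 1] = 0
    rgb[..., 2] = (255 * (1 - t)).astype(np.uint8)  # blue fades with temperature
    return rgb

# A 1x2 field spanning the assumed temperature range
img = pseudo_color(np.array([[1200.0, 1800.0]]))
```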

  17. Development of Systems Engineering Model for Spent Fuel Extraction Process

    SciTech Connect

    Sun, Lijian; Royyuru, Haritha; Hsieh, Hsuan-Tsung 'Sean'; Chen, Yitung; Clarksean, Randy; Vandegrift, George; Copple, Jackie; Laidler, James

    2004-07-01

    The mission of the Transmutation Research Program (TRP) at the University of Nevada, Las Vegas (UNLV) is to establish a nuclear engineering test bed that can carry out effective transmutation and advanced reactor research and development. The Nevada Center for Advanced Computational Methods (NCACM) at UNLV is currently developing the systems engineering model TRPSEMPro (Transmutation Research Program System Engineering Model Project), which provides process optimization through automatic adjustment of input parameters, such as feed compositions, stages, and flow rates, based on the extraction efficiency of components and the output factors of concern. An object-oriented programming (OOP) approach is used. The systems engineering model consists of task manager, task integration, and solution monitor modules. An MS SQL Server database is implemented for managing the data flow from optimization processing. The task manager coordinates and interacts with the other two modules. The task integration module works as a flowsheet constructor that builds the task hierarchy, input parameter values, and constraints. The task solution monitor presents both in-progress and final outputs in tabulated and graphical formats. The system can monitor parameter justification outputs from the optimization toolbox of MathWorks' MATLAB commercial software. While initial identification of parameter constraints for the optimization process is tedious and time-consuming, the interface also provides a multiple-run mode that executes a design matrix without invoking any optimization module. Experimental reports can be flexibly generated through database query and formatting. (authors)

  18. The Development of Sun-Tracking System Using Image Processing

    PubMed Central

    Lee, Cheng-Dar; Huang, Hong-Cheng; Yeh, Hong-Yih

    2013-01-01

    This article presents the development of an image-based sun position sensor and an algorithm for aiming at the Sun precisely using image processing. Four-quadrant light sensors and bar-shadow photo sensors have been used to detect the Sun's position in past years. Nevertheless, neither can maintain high accuracy under low-irradiation conditions. Using an image-based sun position sensor with image processing can address this drawback. To verify the performance of the Sun-tracking system, comprising an image-based sun position sensor and a tracking controller with an embedded image processing algorithm, we established a Sun image tracking platform and performed testing in the laboratory; the results show that the proposed Sun-tracking system overcame the problem of unstable tracking in cloudy weather and achieved a tracking accuracy of 0.04°. PMID:23615582
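
    The core of an image-based sun position sensor is locating the Sun in the frame and converting the pixel offset from the optical axis into an angle. A minimal sketch follows, using an intensity-weighted centroid; the plate scale (degrees per pixel) is an assumed illustrative value, not the paper's calibration:

```python
import numpy as np

def sun_offset_deg(image, deg_per_pixel=0.01):
    """Estimate the Sun's angular offset from the optical axis.

    Computes the intensity-weighted centroid of the image and converts
    the pixel offset from the image center to degrees.
    """
    ys, xs = np.indices(image.shape)
    total = image.sum()
    cx = (xs * image).sum() / total       # centroid column
    cy = (ys * image).sum() / total       # centroid row
    h, w = image.shape
    return ((cx - (w - 1) / 2) * deg_per_pixel,
            (cy - (h - 1) / 2) * deg_per_pixel)

# Bright spot one pixel right of the center of a 5x5 frame
frame = np.zeros((5, 5))
frame[2, 3] = 1.0
dx, dy = sun_offset_deg(frame)
```

    A tracking controller would feed (dx, dy) back to the mount drives until both offsets are near zero.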

  19. Development of KIAPS Observation Processing Package for Data Assimilation System

    NASA Astrophysics Data System (ADS)

    Kang, Jeon-Ho; Chun, Hyoung-Wook; Lee, Sihye; Han, Hyun-Jun; Ha, Su-Jin

    2015-04-01

    The Korea Institute of Atmospheric Prediction Systems (KIAPS) was founded in 2011 by the Korea Meteorological Administration (KMA) to develop Korea's own global Numerical Weather Prediction (NWP) system as a nine-year (2011-2019) project. The data assimilation team at KIAPS has been developing the observation processing system (KIAPS Package for Observation Processing: KPOP) to provide optimal observations to the data assimilation system for the KIAPS global model (KIAPS Integrated Model - Spectral Element method based on HOMME: KIM-SH). Currently, KPOP is capable of processing satellite radiance data (AMSU-A, IASI), GPS Radio Occultation (GPS-RO), AIRCRAFT (AMDAR, AIREP, etc.), and synoptic observations (SONDE and SURFACE). KPOP adopted Radiative Transfer for TOVS version 10 (RTTOV_v10) to obtain the brightness temperature (TB) for each channel at the top of the atmosphere (TOA), and the Radio Occultation Processing Package (ROPP) one-dimensional forward module to obtain the bending angle (BA) at each tangent point. The observation data are obtained from the KMA in BUFR format and converted to the ODB format used for operational data assimilation and monitoring at the KMA. The Unified Model (UM), Community Atmosphere - Spectral Element (CAM-SE), and KIM-SH model outputs are used for bias correction (BC) and quality control (QC) of the observations. KPOP provides radiance and RO data for the Local Ensemble Transform Kalman Filter (LETKF), and SONDE, SURFACE, and AIRCRAFT data for Three-Dimensional Variational Assimilation (3DVAR). We expect that all observation types processed in KPOP will soon be usable with both data assimilation methods. Preliminary results for each observation type are introduced along with the current development status of KPOP.
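
    A standard building block of the quality control mentioned above is the background (gross-error) check: an observation is rejected when its innovation (observation minus model background) exceeds a multiple of the combined expected error. A minimal sketch, with illustrative thresholds rather than KPOP's actual settings:

```python
def background_check(obs, background, sigma_o, sigma_b, k=3.0):
    """Gross-error background check used in observation QC.

    Accepts an observation only if |obs - background| is within k times
    the combined observation/background error standard deviation.
    """
    innovation = obs - background
    threshold = k * (sigma_o**2 + sigma_b**2) ** 0.5
    return abs(innovation) <= threshold

# Temperatures in K; error standard deviations are illustrative
ok = background_check(obs=271.5, background=270.0, sigma_o=1.0, sigma_b=1.0)
bad = background_check(obs=280.0, background=270.0, sigma_o=1.0, sigma_b=1.0)
```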

  20. Development and fabrication of a solar cell junction processing system

    NASA Technical Reports Server (NTRS)

    1984-01-01

    A processing system capable of producing solar cell junctions by ion implantation followed by pulsed electron beam annealing was developed and constructed. The machine was to be capable of processing 4-inch diameter single-crystal wafers at a rate of 10^7 wafers per year. A microcomputer-controlled pulsed electron beam annealer with a vacuum-interlocked wafer transport system was designed, built, and demonstrated to produce solar cell junctions on 4-inch wafers with an AM1 efficiency of 12%. Experiments showed that a non-mass-analyzed (NMA) ion beam could implant 10 keV phosphorus dopant to form solar cell junctions which were equivalent to mass-analyzed implants. An NMA ion implanter, compatible with the pulsed electron beam annealer and wafer transport system, was designed in detail but was not built because of program termination.

  1. Development and fabrication of a solar cell junction processing system

    NASA Technical Reports Server (NTRS)

    Bunker, S.

    1981-01-01

    A solar cell junction processing system was developed and fabricated. A pulsed electron beam for four-inch wafers was assembled and tested, wafers were successfully pulsed, and solar cells were fabricated. Assembly of the transport locks was completed. The transport operated successfully but not with sufficient reproducibility. An experimental test facility to examine potential scale-up problems associated with the proposed ion implanter design was constructed and operated. Cells were implanted and found to have efficiency identical to the normal Spire implant process.

  2. Market development directory for solar industrial process heat systems

    SciTech Connect

    1980-02-01

    The purpose of this directory is to provide a basis for market development activities through a location listing of key trade associations, trade periodicals, and key firms for three target groups. Potential industrial users and potential IPH system designers were identified as the prime targets for market development activities. The bulk of the directory is a listing of these two groups. The third group, solar IPH equipment manufacturers, was included to provide an information source for potential industrial users and potential IPH system designers. Trade associations and their publications are listed for selected four-digit Standard Industrial Code (SIC) industries. Since industries requiring relatively lower temperature process heat will probably comprise most of the near-term market for solar IPH systems, the 80 SICs included in this chapter have process temperature requirements below 350°F. Some key statistics and a location list of the largest plants (according to number of employees) in each state are included for 15 of the 80 SICs. Architectural/engineering and consulting firms known to have solar experience are listed. Professional associations and periodicals to which information on solar IPH systems may be directed are also included. Solar equipment manufacturers and their associations are listed. The listing is based on the SERI Solar Energy Information Data Base (SEIDB).

  3. Laser processing system development of large area and high precision

    NASA Astrophysics Data System (ADS)

    Park, Hyeongchan; Ryu, Kwanghyun; Hwang, Taesang

    2013-03-01

    As the PCB (Printed Circuit Board) and display industries grow, they require increasingly high-precision quality, so laser machining is preferred over mechanical machining for current cutting processes. Laser machining of large areas mostly uses the "step and repeat" method, but this method has problems such as poor cut quality at the seams between edge parts, limited cutting speed, and low productivity. To solve these problems over large areas, on-the-fly (stage-scanner synchronized) systems are gradually increasing in use. On-the-fly technology can process large areas at high speed because the stage and scanner move synchronously. We designed a laser-based high-precision system with on-the-fly operation. In this system, we used a UV nanosecond pulse laser, a power controller, and a scanner with a telecentric f-theta lens. The power controller consists of a HWP (half-wave plate), a thin-film plate polarizer, a photodiode, a micro-step motor, and a control board. Using the power controller, laser power can be monitored in real time and adjusted precisely. Using this machine, we tested on-the-fly cutting of large-area coverlay and sheet-type large-area PCB. As a result, our machine can process large areas without the edge-continuity problem and at higher cutting speed than competing systems for coverlay.
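
    The HWP-plus-polarizer power controller described above follows a simple relation: rotating the half-wave plate by an angle theta rotates the beam polarization by 2*theta, so the polarizer transmits P = P_in * cos^2(2*theta) (Malus's law). A worked sketch, assuming ideal lossless optics:

```python
import math

def transmitted_power(p_in, hwp_angle_deg):
    """Power after a half-wave plate followed by a polarizer.

    The HWP rotates the polarization by twice its own angle, so the
    polarizer passes P = P_in * cos^2(2 * theta) (ideal optics assumed).
    """
    theta = math.radians(hwp_angle_deg)
    return p_in * math.cos(2 * theta) ** 2

full = transmitted_power(10.0, 0.0)    # HWP at 0 deg: full transmission
off = transmitted_power(10.0, 45.0)    # HWP at 45 deg: extinction
```

    Because the full 0-to-100% range is covered in a 45-degree HWP rotation, a micro-step motor on the waveplate gives fine, continuous power adjustment.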

  4. Development of Data Processing Software for NBI Spectroscopic Analysis System

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodan; Hu, Chundong; Sheng, Peng; Zhao, Yuanzhe; Wu, Deyun; Cui, Qinglong

    2015-04-01

    A set of data processing software for processing NBI spectroscopic data is presented in this paper. For better and more scientific management and querying, these data are managed uniformly by the NBI data server. The data processing software offers functions for uploading beam spectral original and analytic data to the data server manually and automatically, querying and downloading all the NBI data, and handling local LZO data. The software consists of a server program and a client program. The server software is programmed in C/C++ under a CentOS development environment. The client software is developed on the VC 6.0 platform, which offers a convenient operational human interface. Network communication between the server and the client is based on TCP. With the help of this software, the NBI spectroscopic analysis system realizes unattended automatic operation, and the clear interface makes it much more convenient to provide beam intensity distribution data and beam power data to operators for operational decision-making. supported by National Natural Science Foundation of China (No. 11075183), the Chinese Academy of Sciences Knowledge Innovation
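
    The TCP upload path described above (client sends a data record, server receives and stores it) can be sketched in a few lines. This is a generic illustration, not the paper's C/C++ implementation, and the payload name is hypothetical:

```python
import socket
import threading

def run_server(host="127.0.0.1", port=0):
    """Accept one connection, collect the bytes sent, then close."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))        # port 0: let the OS pick a free port
    srv.listen(1)
    result = {}

    def handler():
        conn, _ = srv.accept()
        chunks = []
        while True:
            data = conn.recv(4096)
            if not data:          # client closed the connection
                break
            chunks.append(data)
        result["payload"] = b"".join(chunks)
        conn.close()
        srv.close()

    t = threading.Thread(target=handler)
    t.start()
    return srv.getsockname()[1], t, result

def upload(port, payload):
    """Client side: connect, send one data record, and close."""
    with socket.create_connection(("127.0.0.1", port)) as c:
        c.sendall(payload)

port, t, result = run_server()
upload(port, b"shot_1234:spectral_data")   # record name is illustrative
t.join()
```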

  5. On the Hilbert-Huang Transform Data Processing System Development

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Flatley, Thomas P.; Huang, Norden E.; Cornwell, Evette; Smith, Darell

    2003-01-01

    One of the main heritage tools used in scientific and engineering data spectrum analysis is the Fourier Integral Transform and its high performance digital equivalent - the Fast Fourier Transform (FFT). The Fourier view of nonlinear mechanics that has existed for a long time, and the associated FFT (a fairly recent development), carry strong a-priori assumptions about the source data, such as linearity and stationarity. Natural phenomena measurements are essentially nonlinear and nonstationary. A very recent development at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC), known as the Hilbert-Huang Transform (HHT), proposes a novel approach to the solution of the nonlinear class of spectrum analysis problems. Using the Empirical Mode Decomposition (EMD) followed by the Hilbert Transform of the empirical decomposition data (HT), the HHT allows spectrum analysis of nonlinear and nonstationary data through an engineering a-posteriori data processing approach based on the EMD algorithm. This results in a non-constrained decomposition of a source real-valued data vector into a finite set of Intrinsic Mode Functions (IMF) that can be further analyzed for spectrum interpretation by the classical Hilbert Transform. This paper describes phase one of the development of a new engineering tool, the HHT Data Processing System (HHTDPS). The HHTDPS allows applying the HHT to a data vector in a fashion similar to the heritage FFT. It is a generic, low cost, high performance personal computer (PC) based system that implements the HHT computational algorithms in a user friendly, file driven environment. This paper also presents a quantitative analysis for a complex waveform data sample, a summary of technology commercialization efforts, and the lessons learned from this new technology development.
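
    The Hilbert spectral step described above (applying the Hilbert transform to an IMF to obtain instantaneous amplitude and frequency) can be sketched as follows. The EMD sifting itself is not implemented; a pure mono-component signal stands in for a single IMF, and all names are illustrative:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (discrete Hilbert transform); even-length input."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = h[n // 2] = 1.0   # keep DC and Nyquist bins
    h[1:n // 2] = 2.0        # double positive frequencies, zero negative ones
    return np.fft.ifft(X * h)

# One mono-component signal standing in for a single IMF from EMD
fs = 1000.0
t = np.arange(0, 1, 1 / fs)
imf = np.cos(2 * np.pi * 50 * t)               # 50 Hz component

z = analytic_signal(imf)
amplitude = np.abs(z)                          # instantaneous amplitude
phase = np.unwrap(np.angle(z))
inst_freq = np.diff(phase) * fs / (2 * np.pi)  # instantaneous frequency in Hz

mid_freq = inst_freq[len(inst_freq) // 2]
```

    Unlike an FFT bin, the instantaneous frequency is defined at every sample, which is what lets the HHT follow nonstationary signals whose frequency drifts over time.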

  6. Development of techniques for processing metal-metal oxide systems

    NASA Technical Reports Server (NTRS)

    Johnson, P. C.

    1976-01-01

    Techniques for producing model metal-metal oxide systems for the purpose of evaluating the results of processing such systems in the low-gravity environment afforded by a drop tower facility are described. Because of the lack of success in producing suitable materials samples and techniques for processing in the 3.5 seconds available, the program was discontinued.

  7. System Engineering Processes at Kennedy Space Center for Development of the SLS and Orion Launch Systems

    NASA Technical Reports Server (NTRS)

    Schafer, Eric J.

    2012-01-01

    There are over 40 subsystems being developed for the future SLS and Orion Launch Systems at Kennedy Space Center. These subsystems, developed at the Kennedy Space Center Engineering Directorate, follow a comprehensive design process which requires several different product deliverables during each phase of each subsystem. This paper describes this process and gives an example of where the process has been applied.

  8. Development of an instructional expert system for hole drilling processes

    NASA Technical Reports Server (NTRS)

    Al-Mutawa, Souhaila; Srinivas, Vijay; Moon, Young Bai

    1990-01-01

    An expert system which captures the expertise of workshop technicians in the drilling domain was developed. The expert system is aimed at novice technicians who know how to operate the machines but have not acquired the decision making skills that are gained with experience. This paper describes the domain background and the stages of development of the expert system.

  9. Development of an automated ammunition processing system for battlefield use

    SciTech Connect

    Speaks, D.M.; Chesser, J.B.; Lloyd, P.D.; Miller, E.D.; Ray, T.L.; Weil, B.S.

    1995-03-01

    The Future Armored Resupply Vehicle (FARV) will be the companion ammunition resupply vehicle to the Advanced Field Artillery System (AFAS). These systems are currently being investigated by the US Army for future acquisition. The FARV will sustain the AFAS with ammunition and fuel and will significantly increase capabilities over current resupply vehicles. Currently ammunition is transferred to field artillery almost entirely by hand. The level of automation to be included into the FARV is still under consideration. At the request of the US Army's Project Manager, AFAS/FARV, Oak Ridge National Laboratory (ORNL) identified and evaluated various concepts for the automated upload, processing, storage, and delivery equipment for the FARV. ORNL, working with the sponsor, established basic requirements and assumptions for concept development and the methodology for concept selection. A preliminary concept has been selected, and the associated critical technologies have been identified. ORNL has provided technology demonstrations of many of these critical technologies. A technology demonstrator which incorporates all individual components into a total process demonstration is planned for late FY 1995.

  10. Multi-kilowatt modularized spacecraft power processing system development

    NASA Technical Reports Server (NTRS)

    Andrews, R. E.; Hayden, J. H.; Hedges, R. T.; Rehmann, D. W.

    1975-01-01

    A review of existing information pertaining to spacecraft power processing systems and equipment was accomplished with a view towards applicability to the modularization of multi-kilowatt power processors. Power requirements for future spacecraft were determined from the NASA mission model-shuttle systems payload data study which provided the limits for modular power equipment capabilities. Three power processing systems were compared to evaluation criteria to select the system best suited for modularity. The shunt regulated direct energy transfer system was selected by this analysis for a conceptual design effort which produced equipment specifications, schematics, envelope drawings, and power module configurations.

  11. Review of Exploration Systems Development (ESD) Integrated Hazard Development Process. Appendices; Volume 2

    NASA Technical Reports Server (NTRS)

    Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.

    2015-01-01

    The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g., missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.

  12. Review of Exploration Systems Development (ESD) Integrated Hazard Development Process. Volume 1; Appendices

    NASA Technical Reports Server (NTRS)

    Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.

    2015-01-01

    The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g., missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.

  13. Risk communication strategy development using the aerospace systems engineering process

    NASA Technical Reports Server (NTRS)

    Dawson, S.; Sklar, M.

    2004-01-01

    This paper explains the goals and challenges of NASA's risk communication efforts and how the Aerospace Systems Engineering Process (ASEP) was used to map the risk communication strategy used at the Jet Propulsion Laboratory to achieve these goals.

  14. Development of Electronic Data Processing /EDP/ augmented management system

    NASA Technical Reports Server (NTRS)

    Scott, J. E.; Waddleton, T. R.

    1968-01-01

    To tailor the existing Unified Flight Analysis System to management data rather than technical data, a pilot model could be produced in breadboard form, using electronic data processing, in a matter of a few months at very moderate cost. Such a system lends itself to continuous refinement.

  15. Tracker: Image-Processing and Object-Tracking System Developed

    NASA Technical Reports Server (NTRS)

    Klimek, Robert B.; Wright, Theodore W.

    1999-01-01

    Tracker is an object-tracking and image-processing program designed and developed at the NASA Lewis Research Center to help with the analysis of images generated by microgravity combustion and fluid physics experiments. Experiments are often recorded on film or videotape for analysis later. Tracker automates the process of examining each frame of the recorded experiment, performing image-processing operations to bring out the desired detail, and recording the positions of the objects of interest. It can load sequences of images from disk files or acquire images (via a frame grabber) from film transports, videotape, laser disks, or a live camera. Tracker controls the image source to automatically advance to the next frame. It can employ a large array of image-processing operations to enhance the detail of the acquired images and can analyze an arbitrarily large number of objects simultaneously. Several different tracking algorithms are available, including conventional threshold and correlation-based techniques, and more esoteric procedures such as "snake" tracking and automated recognition of character data in the image. The Tracker software was written to be operated by researchers, so every attempt was made to make the software as user friendly and self-explanatory as possible. Tracker is used by most of the microgravity combustion and fluid physics experiments performed by Lewis, and by visiting researchers. This includes experiments performed on the space shuttles, Mir, sounding rockets, zero-g research airplanes, drop towers, and ground-based laboratories. This software automates the analysis of the flame's or liquid's physical parameters such as position, velocity, acceleration, size, shape, intensity characteristics, color, and centroid, as well as a number of other measurements. It can perform these operations on multiple objects simultaneously. Another key feature of Tracker is that it performs optical character recognition (OCR). This feature is useful in
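The conventional threshold tracking that this abstract mentions can be illustrated with a minimal sketch (hypothetical code and data, not Tracker itself): threshold the frame, then take the centroid of the bright pixels.

```python
import numpy as np

def track_centroid(frame, threshold):
    """Toy analogue of threshold tracking: return the (row, col)
    centroid of pixels brighter than `threshold`, or None if no
    pixel exceeds it."""
    mask = frame > threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Synthetic 5x5 "frame" with a bright 2x2 object near one corner.
frame = np.zeros((5, 5))
frame[3:5, 3:5] = 255.0
print(track_centroid(frame, 128))  # -> (3.5, 3.5)
```

Correlation-based tracking would instead slide a template over each frame and report the location of maximum correlation; the per-frame loop and multi-object bookkeeping of a real system are omitted here.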

  16. System Engineering Processes at Kennedy Space Center for Development of SLS and Orion Launch Systems

    NASA Technical Reports Server (NTRS)

    Schafer, Eric; Stambolian, Damon; Henderson, Gena

    2013-01-01

    There are over 40 subsystems being developed for the future SLS and Orion Launch Systems at Kennedy Space Center. These subsystems are developed at the Kennedy Space Center Engineering Directorate. The Engineering Directorate at Kennedy Space Center follows a comprehensive design process which requires several different product deliverables during each phase of each of the subsystems. This presentation describes this process with examples of where the process has been applied.

  17. A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The purpose of this program is to demonstrate the technical readiness of a cost-effective process sequence that has the potential for the production of flat-plate photovoltaic modules which meet the 1986 price goal of $0.70 or less per peak watt. Program efforts included: preliminary design review, preliminary cell fabrication using the proposed process sequence, verification of sandblasting back cleanup, study of resist parameters, evaluation of pull strength of the proposed metallization, measurement of contact resistance of electroless Ni contacts, optimization of process parameters, design of the MEPSDU module, identification and testing of insulator tapes, development of a lamination process sequence, identification of, discussions and demonstrations with, and visits to candidate equipment vendors, and evaluation of proposals for tabbing and stringing machines.

  18. A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Design work for a photovoltaic module, fabricated using single crystal silicon dendritic web sheet material, resulted in the identification of a surface treatment for the module glass superstrate which improved module efficiencies. A final solar module environmental test, a simulated hailstone impact test, was conducted on full size module superstrates to verify that the module's tempered glass superstrate can withstand specified hailstone impacts near the corners and edges of the module. Process sequence design work on the selective metallization process, liquid dopant investigation, dry processing, and antireflective/photoresist application technique tasks, as well as the optimum thickness for Ti/Pd, are discussed. A noncontact cleaning method for raw web cleaning was identified, and antireflective and photoresist coatings for the dendritic webs were selected. The design of a cell string conveyor, an interconnect feed system, and a rolling ultrasonic spot bonding head, and the identification of the optimal commercially available programmable control system, are also discussed. An economic analysis to assess cost goals of the process sequence is also given.

  19. Decreasing costs of ground data processing system development using a software product line

    NASA Technical Reports Server (NTRS)

    Chaffin, Brian

    2005-01-01

    In this paper, I describe software product lines and why a Ground Data Processing System should use one. I also describe how to develop a software product line, using examples from an imaginary Ground Data Processing System.

  20. Carbon Dioxide Reduction Post-Processing Sub-System Development

    NASA Technical Reports Server (NTRS)

    Abney, Morgan B.; Miller, Lee A.; Greenwood, Zachary; Barton, Katherine

    2012-01-01

    The state-of-the-art Carbon Dioxide (CO2) Reduction Assembly (CRA) on the International Space Station (ISS) facilitates the recovery of oxygen from metabolic CO2. The CRA utilizes the Sabatier process to produce water with methane as a byproduct. The methane is currently vented overboard as a waste product. Because the CRA relies on hydrogen for oxygen recovery, the loss of methane ultimately results in a loss of oxygen. For missions beyond low Earth orbit, it will prove essential to maximize oxygen recovery. For this purpose, NASA is exploring an integrated post-processor system to recover hydrogen from CRA methane. The post-processor, called a Plasma Pyrolysis Assembly (PPA), partially pyrolyzes methane to recover hydrogen with acetylene as a byproduct. In-flight operation of the post-processor will require a Methane Purification Assembly (MePA) and an Acetylene Separation Assembly (ASepA). Recent efforts have focused on the design, fabrication, and testing of these components. The results and conclusions of these efforts will be discussed as well as future plans.
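The oxygen-recovery argument follows from the stoichiometry of the two reactions this abstract names: the Sabatier reaction (CO2 + 4H2 → CH4 + 2H2O) and methane pyrolysis to acetylene (2CH4 → C2H2 + 3H2). A small hydrogen-balance sketch, assuming ideal single-pass conversion in both reactors (an illustrative simplification, not flight data):

```python
def hydrogen_recovery(co2_moles):
    """Fraction of the Sabatier H2 feed returned by the plasma
    pyrolysis step, assuming complete conversion in both reactors."""
    # Sabatier: CO2 + 4 H2 -> CH4 + 2 H2O
    h2_consumed = 4.0 * co2_moles
    ch4_produced = 1.0 * co2_moles
    # Plasma pyrolysis: 2 CH4 -> C2H2 + 3 H2  (1.5 H2 per CH4)
    h2_recovered = 1.5 * ch4_produced
    return h2_recovered / h2_consumed

print(hydrogen_recovery(1.0))  # 0.375
```

Under these ideal assumptions, venting the methane forfeits the hydrogen bound in it, while pyrolysis to acetylene returns three quarters of that bound hydrogen (37.5% of the original H2 feed), which is why the post-processor raises net oxygen recovery.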

  1. Processing system

    NASA Technical Reports Server (NTRS)

    Hilland, J. E.

    1983-01-01

    To implement the analysis techniques and to provide end-to-end processing, a system was designed with the following capabilities: receive and catalog data from many sources; organize the data on mass storage for rapid access; edit for reasonableness; create new data sets by sorting on parameter, averaging and merging; provide statistical analysis and display tools; and distribute data on demand. Consideration was given to developing a flexible system that could meet immediate workshop needs and respond to future requirements. System architecture and data set details implemented are discussed.
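The sort-on-parameter, averaging, and merging capabilities listed above can be illustrated with a toy sketch (the records and parameter names are hypothetical, not from the workshop system):

```python
from statistics import mean

# Two toy "data sets" of (parameter, value) records from different sources.
source_a = [("wind", 5.0), ("temp", 21.0), ("wind", 7.0)]
source_b = [("temp", 19.0), ("wind", 6.0)]

# Merge the sources, sort on parameter, then average per parameter.
merged = sorted(source_a + source_b)
grouped = {}
for param, value in merged:
    grouped.setdefault(param, []).append(value)
summary = {param: mean(values) for param, values in grouped.items()}
print(summary)  # {'temp': 20.0, 'wind': 6.0}
```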

  2. Systematic, Systemic and Motivating: The K-12 Career Development Process

    ERIC Educational Resources Information Center

    Snyder, Deborah; Jackson, Sherry

    2006-01-01

    In Butler County, Ohio, Butler Technology and Career Development Schools (Butler Tech) firmly believes that systematic delivery of career development theory and practice integrated with academic content standards will enable students to do all of the above. Because of this, Butler Tech's Career Initiatives division delivers a countywide career…

  3. A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Restructuring research objectives from a technical readiness demonstration program to an investigation of high risk, high payoff activities associated with producing photovoltaic modules using non-CZ sheet material is reported. Deletion of the module frame in favor of a frameless design, and modification in cell series parallel electrical interconnect configuration are reviewed. A baseline process sequence was identified for the fabrication of modules using the selected dendritic web sheet material, and economic evaluations of the sequence were completed.

  4. Development and fabrication of a solar cell junction processing system

    NASA Technical Reports Server (NTRS)

    Halvason, W.

    1980-01-01

    Experiments were completed which indicate that single-pulse, liquid-phase epitaxial regrowth is the optimum technique for pulsed electron beam annealing of ion implantation damage in silicon wafers. An electron beam which covers the entire area of the wafer was chosen for the solar cell processor. Beam control experiments to improve beam propagation and to test the concept of partial space charge and current neutralization were initiated. The electrical parameters of the pulsed electron beam subsystem were chosen on the basis of computer calculations and past experience in pulsed electron accelerator design and operation. The pulser, designated SPI-PULSE 7000, is designed to anneal 10 cm diameter silicon wafers at a rate of 30 per minute. The preliminary design of the major elements of the SPI-PULSE 7000 was completed, and the detailed design of many of the components begun. These elements include a capacitive energy store and charging system, an electron accelerator, a beam control system, a wafer handling system and pressure and vacuum assemblies.

  5. Development of GENOA Progressive Failure Parallel Processing Software Systems

    NASA Technical Reports Server (NTRS)

    Abdi, Frank; Minnetyan, Levon

    1999-01-01

    A capability consisting of software development and experimental techniques has been developed and is described. The capability is integrated into GENOA-PFA to model polymer matrix composite (PMC) structures. The capability considers the physics and mechanics of composite materials and structure by integration of a hierarchical multilevel macro-scale (lamina, laminate, and structure) and micro-scale (fiber, matrix, and interface) simulation analyses. The modeling involves (1) ply layering methodology utilizing FEM elements with through-the-thickness representation, (2) simulation of effects of material defects and conditions (e.g., voids, fiber waviness, and residual stress) on global static and cyclic fatigue strengths, (3) including material nonlinearities (by updating properties periodically) and geometrical nonlinearities (by Lagrangian updating), (4) simulating crack initiation and growth to failure under static, cyclic, creep, and impact loads, (5) progressive fracture analysis to determine durability and damage tolerance, (6) identifying the percent contribution of various possible composite failure modes involved in critical damage events, and (7) determining sensitivities of failure modes to design parameters (e.g., fiber volume fraction, ply thickness, fiber orientation, and adhesive-bond thickness). GENOA-PFA progressive failure analysis is now ready for use to investigate the effects on structural responses of PMC material degradation from damage induced by static, cyclic (fatigue), creep, and impact loading in 2D/3D PMC structures subjected to hygrothermal environments. Its use will significantly facilitate targeting design parameter changes that will be most effective in reducing the probability of a given failure mode occurring.

  6. The development process for the space shuttle primary avionics software system

    NASA Technical Reports Server (NTRS)

    Keller, T. W.

    1987-01-01

    Primary avionics software system; software development approach; user support and problem diagnosis; software releases and configuration; quality/productivity programs; and software development/production facilities are addressed. Also examined are the external evaluations of the IBM process.

  7. The Development of a Digital Processing System for Accurate Range Determinations. [for Teleoperator Maneuvering Systems

    NASA Technical Reports Server (NTRS)

    Pujol, A., Jr.

    1983-01-01

    The development of an accurate close range (from 0.0 meters to 30.0 meters) radar system for Teleoperator Maneuvering Systems (TMS) is discussed. The system under investigation is a digital processor that converts incoming signals from the radar system into their related frequency spectra. Identification will be attempted by correlating spectral characteristics with accurate range determinations. The system will utilize an analog-to-digital converter for sampling and converting the signal from the radar system into 16-bit digital words (two bytes) for RAM storage, data manipulations, and computations. To remove unwanted frequency components, the data will be retrieved from RAM and digitally filtered using large scale integration (LSI) circuits. Filtering will be performed by a biquadratic routine within the chip which carries out the required filter algorithm. For conversion to a frequency spectrum, the filtered data will be processed by a Fast Fourier Transform chip. Analysis and identification of spectral characteristics for accurate range determinations will be made by microcomputer computations.
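The sample → biquad filter → FFT chain described above can be sketched with NumPy/SciPy. The sample rate, tone frequencies, and filter cutoff below are illustrative assumptions, not parameters of the TMS hardware:

```python
import numpy as np
from scipy.signal import iirfilter, sosfilt

fs = 10_000                    # assumed sample rate, Hz
t = np.arange(0, 0.1, 1 / fs)  # 0.1 s of samples
# Assumed radar beat signal: a 500 Hz range tone plus 3 kHz interference.
signal = np.sin(2 * np.pi * 500 * t) + 0.5 * np.sin(2 * np.pi * 3000 * t)

# Biquad (second-order-section) low-pass, standing in for the LSI filter stage.
sos = iirfilter(2, 1000, btype="low", fs=fs, output="sos")
filtered = sosfilt(sos, signal)

# FFT stage: locate the dominant beat frequency, which maps to a range estimate.
spectrum = np.abs(np.fft.rfft(filtered))
freqs = np.fft.rfftfreq(len(filtered), 1 / fs)
beat_hz = freqs[np.argmax(spectrum)]
print(beat_hz)  # ~500.0
```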

  8. The development of a coal-fired combustion system for industrial process heating applications

    SciTech Connect

    Not Available

    1992-07-16

    PETC has implemented a number of advanced combustion research projects that will lead to the establishment of a broad, commercially acceptable engineering data base for the advancement of coal as the fuel of choice for boilers, furnaces, and process heaters. Vortec Corporation's Coal-Fired Combustion System for Industrial Process Heating Applications has been selected for Phase III development under contract DE-AC22-91PC91161. This advanced combustion system research program is for the development of innovative coal-fired process heaters which can be used for high temperature melting, smelting, recycling, and refining processes. The process heater concepts to be developed are based on advanced glass melting and ore smelting furnaces developed and patented by Vortec Corporation. The process heater systems to be developed have multiple use applications; however, the Phase III research effort is being focused on the development of a process heater system to be used for producing glass frits and wool fiber from boiler and incinerator ashes. The primary objective of the Phase III project is to develop and integrate all the system components, from fuel through total system controls, and then test the complete system in order to evaluate its potential marketability. The economic evaluation of commercial scale CMS processes has begun. In order to accurately estimate the cost of the primary process vessels, preliminary designs for 25, 50, and 100 ton/day systems have been started under Task 1. This data will serve as input data for life cycle cost analysis performed as part of techno-economic evaluations. The economic evaluations of commercial CMS systems will be an integral part of the commercialization plan.

  9. Low cost solar array project production process and equipment task. A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Technical readiness for the production of photovoltaic modules using single crystal silicon dendritic web sheet material is demonstrated by: (1) selection, design and implementation of solar cell and photovoltaic module process sequence in a Module Experimental Process System Development Unit; (2) demonstration runs; (3) passing of acceptance and qualification tests; and (4) achievement of a cost effective module.

  10. Preliminary paper - Integrated control process for the development of the mined geologic disposal system

    SciTech Connect

    Daniel, Russell B.; Harbert, Kevin R.; Calloway, David E.

    1997-11-26

    The US Department of Energy (DOE) Order 430.1, Life Cycle Asset Management, begins to focus DOE Programs and Projects on the total system life cycle instead of looking at project execution or operation as individual components. As DOE begins to implement this order, the DOE Management and Operating contractors must develop a process to control not only the contract baseline but also the overall life cycle baseline. This paper presents an integrated process that is currently being developed on the Yucca Mountain Project for DOE. The process integrates the current contract/project baseline management process with the management control process for design and the configuration management change control process.

  11. Lessons Learned From Developing Three Generations of Remote Sensing Science Data Processing Systems

    NASA Technical Reports Server (NTRS)

    Tilmes, Curt; Fleig, Albert J.

    2005-01-01

    The Biospheric Information Systems Branch at NASA's Goddard Space Flight Center has developed three generations of Science Investigator-led Processing Systems for use with various remote sensing instruments. The first system is used for data from the MODIS instruments flown on NASA's Earth Observing System (EOS) Terra and Aqua spacecraft launched in 1999 and 2002 respectively. The second generation is for the Ozone Monitoring Instrument flying on the EOS Aura spacecraft launched in 2004. We are now developing a third generation of the system for evaluation science data processing for the Ozone Mapping and Profiler Suite (OMPS) to be flown by the NPOESS Preparatory Project (NPP) in 2006. The initial system was based on large scale proprietary hardware, operating and database systems. The current OMI system and the OMPS system being developed are based on commodity hardware, the Linux operating system, and PostgreSQL, an Open Source RDBMS. The new system distributes its data archive across multiple server hosts and processes jobs on multiple processor boxes. We have created several instances of this system, including one for operational processing, one for testing and reprocessing, and one for applications development and scientific analysis. Prior to receiving the first data from OMI we applied the system to reprocessing information from the Solar Backscatter Ultraviolet (SBUV) and Total Ozone Mapping Spectrometer (TOMS) instruments flown from 1978 until now. The system was able to process 25 years (108,000 orbits) of data and produce 800,000 files (400 GiB) of level 2 and level 3 products in less than a week. We will describe the lessons we have learned and tradeoffs between system design, hardware, operating systems, operational staffing, user support and operational procedures. During each generational phase, the system has become more generic and reusable. While the system is not currently shrink wrapped we believe it is to the point where it could be readily

  12. The Development of a Generic Framework for the Forensic Analysis of SCADA and Process Control Systems

    NASA Astrophysics Data System (ADS)

    Slay, Jill; Sitnikova, Elena

    There is continuing interest in researching generic security architectures and strategies for managing SCADA and process control systems. IT security documentation from various countries is now beginning to recommend security controls for (federal) information systems that include connected process control systems. Little or no work exists in the public domain which takes a big-picture approach to the issue of developing a generic or generalisable approach to SCADA and process control system forensics. The argument raised in this paper is that before one can develop solutions to the problem of SCADA forensics, a good understanding of the forensic computing process, and of the range of technical and procedural issues subsumed within this process, needs to be reached, and also agreed, by governments, industry and academia.

  13. Development of a data acquisition and processing system for precision agriculture

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A data acquisition and processing system for precision agriculture was developed by using MapX 5.0 and Visual C++ 6.0. This system can be used easily and quickly for drawing grid maps in-field, creating parameters for grid-reorganization, guiding in-field data collection, converting data between diffe...

  14. Development of a data acquisition and processing system for precision agriculture

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A data acquisition and processing system for precision agriculture was developed by using MapX 5.0 and Visual C++ 6.0. This system can be used easily and quickly for drawing grid maps in-field, making out parameters for grid-reorganization, guiding for in-field data collection, converting data between ...

  15. BIOLOGICAL TREATABILITY OF KRW ENERGY SYSTEMS GASIFIER PDU (PROCESS DEVELOPMENT UNIT) WASTEWATERS

    EPA Science Inventory

    The report gives results of bench-scale biological treatability tests with wastewaters produced from the KRW Energy Systems gasifier process development unit (KRW-PDU). Goals of the tests were to assess the biotreatability of these aqueous wastes and to develop data for correlati...

  16. 78 FR 47012 - Developing Software Life Cycle Processes Used in Safety Systems of Nuclear Power Plants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ...The U.S. Nuclear Regulatory Commission (NRC) is issuing a revised regulatory guide (RG), revision 1 of RG 1.173, ``Developing Software Life Cycle Processes for Digital Computer Software used in Safety Systems of Nuclear Power Plants.'' This RG endorses the Institute of Electrical and Electronic Engineers (IEEE) Standard (Std.) 1074-2006, ``IEEE Standard for Developing a Software Project Life......

  17. Development of a broadband telehealth system for critical care: process and lessons learned.

    PubMed

    Li, Jane; Wilson, Laurence S; Qiao, Rong-Yu; Percival, Terry; Krumm-Heller, Alex; Stapleton, Stuart; Cregan, Patrick

    2006-10-01

    A broadband telehealth system has been developed for supporting critical care services between a major referral hospital and a rural hospital by transmitting very high-quality, real-time multimedia information, including images, audio and real-time video, over an Internet Protocol (IP)-based network. The technical design team took an iterative and user-centred approach toward the system design. Usability tests with scenario analysis were incorporated into the development process to produce a system that operates seamlessly in the critical care environment. Careful analysis of the reliability of the system was incorporated into the clinical protocols for integration into existing work practices. The use of high-quality multimedia data, consideration of human factors early in the design process, and incorporation of proper development approaches were critical for the success of the system design. PMID:17042709

  18. How Process Helps You in Developing a High Quality Medical Information System

    NASA Astrophysics Data System (ADS)

    Akiyama, Yoshihiro

    A medical information system is an extreme case in its use of tacit knowledge: practitioners and medical experts such as medical doctors rely heavily on knowledge that includes a great deal of experiential information and is not explicitly formulated. This is simply different from other discipline areas such as embedded engineering systems. Developing such a system critically depends on how effectively this varied knowledge is organized and integrated in implementing the system. As such, the development process in which customers, management, engineers, and teams are involved must be evaluated from this point of view. The existence of tacit knowledge may not be sensed well enough at the beginning of a project, yet it is necessary for project success. This paper describes these problems and how the Personal Software Process (PSP) and Team Software Process (TSP) manage them, and then typical performance results are discussed. It may be said that PSP individuals and TSP teams are CMMI level 4 units, respectively.

  19. Materials, Processes and Manufacturing in Ares 1 Upper Stage: Integration with Systems Design and Development

    NASA Technical Reports Server (NTRS)

    Bhat, Biliyar N.

    2008-01-01

    Ares I Crew Launch Vehicle Upper Stage is designed and developed based on sound systems engineering principles. Systems engineering starts with the concept of operations and mission requirements, which in turn determine the launch system architecture and its performance requirements. The Ares I Upper Stage is designed and developed to meet these requirements. Designers depend on the support of materials, processes and manufacturing during the design, development and verification of subsystems and components. The requirements relative to reliability, safety, operability and availability are also dependent on materials availability, characterization, process maturation and vendor support. This paper discusses the roles and responsibilities of materials and manufacturing engineering during the various phases of Ares I Upper Stage development, including design and analysis, hardware development, test and verification. Emphasis is placed on how materials, processes and manufacturing support is integrated across the Upper Stage Project, both horizontally and vertically. In addition, the paper describes the approach used to ensure compliance with materials, processes, and manufacturing requirements during the project cycle, with focus on hardware systems design and development.

  20. Progress in the Development of Direct Osmotic Concentration Wastewater Recovery Process for Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Cath, Tzahi Y.; Adams, Dean V.; Childress, Amy; Gormly, Sherwin; Flynn, Michael

    2005-01-01

    Direct osmotic concentration (DOC) has been identified as a high-potential technology for recycling of wastewater to drinking water in advanced life support (ALS) systems. As a result, the DOC process has been selected for a NASA Rapid Technology Development Team (RTDT) effort. The existing prototype system has been developed to a Technology Readiness Level (TRL) of 3. The current project focuses on advancing the development of this technology from TRL 3 to TRL 6 (appropriate for human-rated testing). A new prototype of a DOC system is being designed and fabricated that addresses the deficiencies encountered during the testing of the original system, allowing the new prototype to achieve TRL 6. Background information is provided about the technologies investigated and their capabilities, results from preliminary tests, and the milestone plan and activities for the RTDT program intended to develop a second-generation prototype of the DOC system.

  1. Functional process descriptions for the program to develop the Nuclear Waste Management System

    SciTech Connect

    Woods, T.W.

    1991-09-01

    The Office of Civilian Radioactive Waste Management (OCRWM) is executing a plan for improvement of the systems implemented to carry out its responsibilities under the Nuclear Waste Policy Act of 1982 (NWPA). As part of the plan, OCRWM is performing a systems engineering analysis of both the physical system, i.e., the Nuclear Waste Management System (NWMS), and the programmatic functions that must be accomplished to bring the physical system into being. The purpose of the program analysis is to provide a systematic identification and definition of all program functions, functional process flows, and function products necessary and sufficient to provide the physical system. The analysis resulting from this approach provides a basis for development of a comprehensive and integrated set of policies, standard practices, and procedures for the effective and efficient execution of the program. Thus, this analysis will form a basis for revising current OCRWM policies and procedures, or developing new ones as necessary. The primary purposes of this report are as follows: (1) summarize the major functional processes and process flows that have been developed as a part of the program analysis, and (2) provide an introduction and assistance in understanding the detailed analysis information contained in the three-volume report titled The Analysis of the Program to Develop the Nuclear Waste Management System (Woods 1991a).

  2. Development of an efficient automated hyperspectral processing system using embedded computing

    NASA Astrophysics Data System (ADS)

    Brown, Matthew S.; Glaser, Eli; Grassinger, Scott; Slone, Ambrose; Salvador, Mark

    2012-06-01

    Automated hyperspectral image processing enables rapid detection and identification of important military targets from hyperspectral surveillance and reconnaissance images. The majority of this processing is done using ground-based CPUs on hyperspectral data after it has been manually exfiltrated from the mobile sensor platform. However, by utilizing high-performance, on-board processing hardware, the data can be immediately processed, and the exploitation results can be distributed over a low-bandwidth downlink, allowing rapid responses to situations as they unfold. Additionally, transitioning to higher-performance and more-compact processing architectures such as GPUs, DSPs, and FPGAs will allow the size, weight, and power (SWaP) demands of the system to be reduced. This will allow the next generation of hyperspectral imaging and processing systems to be deployed on a much wider range of smaller manned and unmanned vehicles. In this paper, we present results on the development of an automated, near-real-time hyperspectral processing system using a commercially available NVIDIA® Tesla™ GPU. The processing chain utilizes GPU-optimized implementations of well-known atmospheric-correction, anomaly-detection, and target-detection algorithms in order to identify target-material spectra from a hyperspectral image. We demonstrate that the system can return target-detection results for HYDICE data with 308×1280 pixels and 145 bands against 30 target spectra in less than four seconds.
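A common choice for the anomaly-detection stage in such chains is the global RX detector, which scores each pixel by the Mahalanobis distance of its spectrum from the scene background. A minimal CPU sketch with synthetic data follows; it is illustrative only, not the paper's GPU implementation:

```python
import numpy as np

def rx_anomaly_scores(cube):
    """Global RX detector: squared Mahalanobis distance of each pixel
    spectrum from the scene mean. cube has shape (rows, cols, bands)."""
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands).astype(float)
    mu = pixels.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(pixels, rowvar=False))
    d = pixels - mu
    # Batched quadratic form d_i^T C^-1 d_i for every pixel i.
    scores = np.einsum("ij,jk,ik->i", d, cov_inv, d)
    return scores.reshape(rows, cols)

# Synthetic 8x8 cube with 4 bands and one spectrally anomalous pixel.
rng = np.random.default_rng(0)
cube = rng.normal(0.0, 1.0, (8, 8, 4))
cube[2, 5] += 10.0
scores = rx_anomaly_scores(cube)
r, c = np.unravel_index(scores.argmax(), scores.shape)
print(int(r), int(c))  # 2 5
```

Detection against a library of target spectra would replace the scene mean with each target spectrum (e.g., a matched filter); a GPU port amounts to batching these same linear-algebra operations over the image.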

  3. System approach to development of CAD and CAM laser technological processes

    NASA Astrophysics Data System (ADS)

    Lopota, Vitaliy A.

    1997-04-01

    Laser technology is among the most highly developed technologies at present; nevertheless, its wide introduction is constrained, on the one hand, by the complexity and high cost of the equipment and, on the other, by the complexity of and insufficient knowledge about the processes of laser welding, cutting, and surface processing, as well as by a shortage of qualified staff. Despite this, laser materials processing has reached a level of development at which further improvement on the basis of accumulated empirical information alone is no longer possible, while the technological potential of processes such as welding, cutting, melting, and marking is not fully exploited. Their successful realization requires choosing the optimum processing mode, which is currently impossible because knowledge of the physical nature of these technological processes is lacking: practically all existing information on them is qualitative in character. Furthermore, ensuring high quality and stability of treatment results requires in-process monitoring and control during most technological operations. For this purpose CAD and CAM systems are necessary; their absence is explained by the fact that physically adequate theoretical descriptions of laser technological processes are still missing, and the experimental data, owing to the unsystematic character of experimental work, do not give a complete picture of the physical phenomena occurring during laser processing of materials.

  4. Design and development of a medical big data processing system based on Hadoop.

    PubMed

    Yao, Qin; Tian, Yu; Li, Peng-Fei; Tian, Li-Li; Qian, Yang-Ming; Li, Jing-Song

    2015-03-01

    Secondary use of medical big data is increasingly popular in healthcare services and clinical research. Understanding the logic behind medical big data reveals tendencies in hospital information technology and is of great significance for hospital information systems that are being designed and expanded. Big data has four characteristics--Volume, Variety, Velocity and Value (the 4 Vs)--that make traditional standalone systems incapable of processing it. Apache Hadoop MapReduce is a promising software framework for developing applications that process vast amounts of data in parallel on large clusters of commodity hardware in a reliable, fault-tolerant manner. With the Hadoop framework and the MapReduce application program interface (API), we can more easily develop our own MapReduce applications to run on a Hadoop cluster that can scale up from a single node to thousands of machines. This paper investigates a practical case of a Hadoop-based medical big data processing system. We developed this system to intelligently process medical big data and uncover some features of hospital information system user behaviors. This paper studies user behaviors regarding various data produced by different hospital information systems for daily work. We also built a five-node Hadoop cluster to execute distributed MapReduce algorithms. Our distributed algorithms show promise in facilitating efficient data processing with medical big data in healthcare services and clinical research compared with single nodes. Additionally, with medical big data analytics, we can design our hospital information systems to be much more intelligent and easier to use by making personalized recommendations. PMID:25666927
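The MapReduce model the system builds on can be illustrated without a Hadoop cluster. The sketch below is a single-process Python analogue of map, shuffle, and reduce; the access-log schema and subsystem names are invented for the example, and a real deployment would use the Hadoop Java API or Hadoop Streaming.

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Single-process illustration of the MapReduce model:
    map -> shuffle (group values by key) -> reduce."""
    groups = defaultdict(list)
    for record in records:
        for key, value in mapper(record):  # map phase emits (key, value)
            groups[key].append(value)      # shuffle groups by key
    return {key: reducer(key, values) for key, values in groups.items()}

# Hypothetical hospital-information-system access log: (user, subsystem).
logs = [
    ("u1", "EMR"), ("u2", "LIS"), ("u1", "EMR"),
    ("u3", "PACS"), ("u2", "EMR"),
]

# Count accesses per subsystem, word-count style.
counts = map_reduce(
    logs,
    mapper=lambda rec: [(rec[1], 1)],
    reducer=lambda key, values: sum(values),
)
print(counts)  # -> {'EMR': 3, 'LIS': 1, 'PACS': 1}
```

Hadoop distributes exactly these three phases across cluster nodes, which is what lets the same mapper/reducer logic scale from one machine to thousands.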

  5. Development of the Data Acquisition and Processing System for a Pulsed 2-Micron Coherent Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.

    2010-01-01

    A general overview is presented of the development of a data acquisition and processing system for a pulsed, 2-micron coherent Doppler lidar system located at NASA Langley Research Center in Hampton, Virginia, USA. It is a comprehensive system that performs high-speed data acquisition, analysis, and data display both in real time and offline. The first flight missions are scheduled for the summer of 2010 as part of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The system and its control software are reviewed, and their requirements and unique features are discussed.

  6. Methodology development of an engineering design expert system utilizing a modular knowledge-base inference process

    NASA Astrophysics Data System (ADS)

    Winter, Steven John

    Methodology development was conducted to incorporate a modular knowledge-base representation into an expert system engineering design application. The objective of using multidisciplinary methodologies in defining a design system was to develop a system framework applicable to a wide range of engineering applications. The technique of "knowledge clustering" was used to construct a general decision tree for all factual information relating to the design application. This construction combined surface knowledge of the design process with depth knowledge of the specific application. Utilization of both levels of knowledge created a system capable of processing multiple controlling tasks, including: organizing factual information relative to the cognitive levels of the design process, building finite element models for depth-knowledge analysis, developing a standardized finite element code for parallel processing, and determining a best solution generated by design optimization procedures. Proof of concept for the methodology developed here is shown in the implementation of an application covering the analysis and optimization of a composite aircraft canard subjected to a general compound loading condition. This application contained a wide range of factual information and heuristic rules. The analysis tools used included a finite element (FE) processor and a numerical optimizer. An advisory knowledge-base was also developed to provide a standard for conversion of serial FE code for parallel processing. All knowledge-bases developed operate as advisory, selection, or classification systems. Laminate properties are limited to even-numbered, quasi-isotropic ply stacking sequences; this retains the full influence of the coupled in-plane and bending effects of the structure's behavior. The canard is modeled as a constant-thickness plate and discretized into a varying number of four- or nine-node, quadrilateral, shear-deformable plate elements. The benefit gained by

  7. A Module Experimental Process System Development Unit (MEPSDU). [development of low cost solar arrays

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The technical readiness of a cost-effective process sequence with the potential for producing flat-plate photovoltaic modules meeting the 1986 price goal of $0.70 or less per peak watt was demonstrated. The proposed process sequence was reviewed and laboratory verification experiments were conducted. The preliminary process includes the following features: semicrystalline silicon (10 cm by 10 cm) as the silicon input material; a spray-on dopant diffusion source; Al paste BSF formation; a spray-on AR coating; electroless Ni plate/solder dip metallization; laser-scribed edges; a K & S tabbing and stringing machine; and laminated EVA modules.

  8. A Scenario-Based Process for Requirements Development: Application to Mission Operations Systems

    NASA Technical Reports Server (NTRS)

    Bindschadler, Duane L.; Boyles, Carole A.

    2008-01-01

    The notion of using operational scenarios as part of requirements development during mission formulation (Phases A & B) is widely accepted as good system engineering practice. In the context of developing a Mission Operations System (MOS), there are numerous practical challenges to translating that notion into the cost-effective development of a useful set of requirements. These challenges can include such issues as a lack of Project-level focus on operations issues, insufficient or improper flowdown of requirements, flowdown of immature or poor-quality requirements from Project level, and MOS resource constraints (personnel expertise and/or dollars). System engineering theory must be translated into a practice that provides enough structure and standards to serve as guidance, but that retains sufficient flexibility to be tailored to the needs and constraints of a particular MOS or Project. We describe a detailed, scenario-based process for requirements development. Identifying a set of attributes for high quality requirements, we show how the portions of the process address many of those attributes. We also find that the basic process steps are robust, and can be effective even in challenging Project environments.

  9. Development of an on-line expert system for integrated alarm processing in nuclear power plants

    SciTech Connect

    Kim, Han Gon; Choi, Seong Soo; Kang, Ki Sig; Chang, Soon Heung

    1994-12-31

    An on-line expert system, called AFDS (Alarm Filtering and Diagnostic System), has been developed to assist operators in effectively maintaining plant safety and to enhance plant availability by using advanced computer technologies for alarm processing. The AFDS is designed to perform alarm filtering and overall plantwide diagnosis when an abnormal state occurs. In addition to these functions, it carries out alarm prognosis to provide the operator with prediction-based messages and to generate high-level alarms that can be used as additional diagnostic information. The system was developed on a SUN SPARC 2 workstation, and its target domain is the alarm system in the main control room of Yonggwang units 1 and 2.

  10. Development and fabrication of a solar cell junction processing system. Quarterly report No. 2, July 1980

    SciTech Connect

    Siesling, R.

    1980-07-01

    The basic objectives of the program are the following: (1) to design, develop, construct and deliver a junction processing system which will be capable of producing solar cell junctions by means of ion implantation followed by pulsed electron beam annealing; (2) to include in the system a wafer transport mechanism capable of transferring 4-inch-diameter wafers into and out of the vacuum chamber where the ion implantation and pulsed electron beam annealing processes take place; (3) to integrate, test and demonstrate the system prior to its delivery to JPL along with detailed operating and maintenance manuals; and (4) to estimate component lifetimes and costs, as necessary for the contract, for the performance of comprehensive analyses in accordance with the Solar Array Manufacturing Industry Costing Standards (SAMICS). Under this contract the automated junction formation equipment to be developed involves a new system design incorporating a modified, government-owned, JPL-controlled ion implanter into a Spire-developed pulsed electron beam annealer and wafer transport system. When modified, the ion implanter will deliver a 16 mA beam of ³¹P⁺ ions with a fluence of 2.5 × 10¹⁵ ions per square centimeter at an energy of 10 keV. The throughput design goal rate for the junction processor is 10⁷ four-inch-diameter wafers per year.

  11. Drug Development Process

    MedlinePlus


  12. Tritium processing for the European test blanket systems: current status of the design and development strategy

    SciTech Connect

    Ricapito, I.; Calderoni, P.; Poitevin, Y.; Aiello, A.; Utili, M.; Demange, D.

    2015-03-15

    Tritium processing technologies for the two European Test Blanket Systems (TBS), HCLL (Helium Cooled Lithium Lead) and HCPB (Helium Cooled Pebble Bed), play an essential role in meeting the main objectives of the TBS experimental campaign in ITER. Compliance with the ITER interface requirements, in terms of space availability, service fluids, limits on tritium release, and constraints on maintenance, is driving the design of the TBS tritium processing systems. Other requirements come from the characteristics of the relevant test blanket module and from the scientific programme that has to be developed and implemented. This paper identifies the main requirements for the design of the TBS tritium systems and equipment and, at the same time, provides an updated overview of the current design status, focusing mainly on the tritium extractor from Pb-16Li and on TBS tritium accountancy. Considerations are also given on the possible extrapolation to the DEMO breeding blanket. (authors)

  13. Development of image processing LSI "SuperVchip" for real-time vision systems

    NASA Astrophysics Data System (ADS)

    Muramatsu, Shoji; Kobayashi, Yoshiki; Otsuka, Yasuo; Shojima, Hiroshi; Tsutsumi, Takayuki; Imai, Toshihiko; Yamada, Shigeyoshi

    2002-03-01

    A new image processing LSI, the SuperVchip, with high-performance computing power has been developed. The SuperVchip provides powerful capabilities for vision systems: (1) general image processing with 3x3, 5x5, and 7x7 kernels for high-speed filtering; (2) 16 parallel gray-scale search engine units for robust template matching; (3) 49 block-matching PEs (processing elements) that calculate the sum of absolute differences in parallel for stereo vision; and (4) a color extraction unit for color object recognition. The SuperVchip also integrates the peripheral functions of vision systems, such as a video interface, an extended PCI interface, a RISC engine interface, and an image memory controller, on a single chip. Small, high-performance vision systems can therefore be realized with the SuperVchip. In this paper, the above circuits are presented, and the architecture of a vision device equipped with the SuperVchip and its performance are also described.
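Block matching by the sum of absolute differences (SAD), which the chip's 49 PEs compute in parallel, can be sketched in software. The exhaustive serial search below is an illustration only (the array sizes and data are invented); the hardware evaluates many candidate offsets simultaneously.

```python
import numpy as np

def sad_match(reference, block):
    """Exhaustive block matching by sum of absolute differences (SAD).

    Slides `block` over `reference` and returns the (row, col)
    offset with the smallest SAD, plus that SAD value.
    """
    rh, rw = reference.shape
    bh, bw = block.shape
    best, best_pos = None, None
    for r in range(rh - bh + 1):
        for c in range(rw - bw + 1):
            sad = np.abs(reference[r:r+bh, c:c+bw] - block).sum()
            if best is None or sad < best:
                best, best_pos = sad, (r, c)
    return best_pos, best

ref = np.arange(100, dtype=float).reshape(10, 10)
blk = ref[3:6, 4:7].copy()          # block taken from a known offset
pos, score = sad_match(ref, blk)
print(pos, score)  # -> (3, 4) 0.0
```

In stereo vision the same search, restricted to one scan line, yields the disparity between left and right images.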

  14. Use of a continuous twin screw granulation and drying system during formulation development and process optimization.

    PubMed

    Vercruysse, J; Peeters, E; Fonteyne, M; Cappuyns, P; Delaet, U; Van Assche, I; De Beer, T; Remon, J P; Vervaet, C

    2015-01-01

    Since small scale is key for the successful introduction of continuous techniques in the pharmaceutical industry, allowing their use during formulation development and process optimization, it is essential to determine whether product quality is similar when small quantities of material are processed compared to the continuous processing of larger quantities. Therefore, the aim of this study was to investigate whether material processed in a single cell of the six-segmented fluid bed dryer of the ConsiGma™-25 system (a continuous twin screw granulation and drying system introduced by GEA Pharma Systems, Collette™, Wommelgem, Belgium) is predictive of granule and tablet quality during full-scale manufacturing when all drying cells are filled. Furthermore, the performance of the ConsiGma™-1 system (a mobile laboratory unit) was evaluated and compared to the ConsiGma™-25 system. A premix of two active ingredients, powdered cellulose, maize starch, pregelatinized starch and sodium starch glycolate was granulated with distilled water. After drying and milling (1000 μm, 800 rpm), granules were blended with magnesium stearate and compressed using a Modul™ P tablet press (tablet weight: 430 mg, main compression force: 12 kN). Single-cell experiments using the ConsiGma™-25 system and the ConsiGma™-1 system were performed in triplicate. Additionally, a 1 h continuous run using the ConsiGma™-25 system was executed. Process outcomes (torque, barrel wall temperature, product temperature during drying) and granule (residual moisture content, particle size distribution, bulk and tapped density, Hausner ratio, friability) as well as tablet (hardness, friability, disintegration time and dissolution) quality attributes were evaluated. The 1 h continuous run revealed that a stabilization period was needed for torque and barrel wall temperature due to initial layering of the screws and the screw chamber walls with material. Consequently, slightly deviating

  15. Microarthroscopy System With Image Processing Technology Developed for Minimally Invasive Surgery

    NASA Technical Reports Server (NTRS)

    Steele, Gynelle C.

    2001-01-01

    In a joint effort, NASA, Micro Medical Devices, and the Cleveland Clinic have developed a microarthroscopy system with digital image processing. This system consists of a disposable endoscope the size of a needle that is aimed at expanding the use of minimally invasive surgery on the knee, ankle, and other small joints. This device not only allows surgeons to make smaller incisions (by improving the clarity and brightness of images), but it gives them a better view of the injured area to make more accurate diagnoses. Because of its small size, the endoscope helps reduce physical trauma and speeds patient recovery. The faster recovery rate also makes the system cost effective for patients. The digital image processing software used with the device was originally developed by the NASA Glenn Research Center to conduct computer simulations of satellite positioning in space. It was later modified to reflect lessons learned in enhancing photographic images in support of the Center's microgravity program. Glenn's Photovoltaic Branch and Graphics and Visualization Lab (G-VIS) computer programmers and software developers enhanced and sped up graphic imaging for this application. Mary Vickerman at Glenn developed algorithms that enabled Micro Medical Devices to eliminate interference and improve the images.

  16. Development of the lateral line canal system through a bone remodeling process in zebrafish.

    PubMed

    Wada, Hironori; Iwasaki, Miki; Kawakami, Koichi

    2014-08-01

    The lateral line system of teleost fish is composed of mechanosensory receptors (neuromasts), comprising superficial receptors and others embedded in canals running under the skin. Canal diameter and size of the canal neuromasts are correlated with increasing body size, thus providing a very simple system to investigate mechanisms underlying the coordination between organ growth and body size. Here, we examine the development of the trunk lateral line canal system in zebrafish. We demonstrated that trunk canals originate from scales through a bone remodeling process, which we suggest is essential for the normal growth of canals and canal neuromasts. Moreover, we found that lateral line cells are required for the formation of canals, suggesting the existence of mutual interactions between the sensory system and surrounding connective tissues. PMID:24836859

  17. Intelligent process development of foam molding for the Thermal Protection System (TPS) of the space shuttle external tank

    NASA Technical Reports Server (NTRS)

    Bharwani, S. S.; Walls, J. T.; Jackson, M. E.

    1987-01-01

    A knowledge-based system to assist process engineers in evaluating the processability and moldability of polyisocyanurate (PIR) formulations for the thermal protection system of the Space Shuttle external tank (ET) is discussed. The Reaction Injection Molding Process Development Advisor (RIM-PDA) is a coupled system that takes advantage of both symbolic and numeric processing techniques. This system aids the process engineer in identifying a start-up set of mold schedules and in refining the mold schedules to remedy specific process problems diagnosed by the system.

  18. The Development of Two Science Investigator-led Processing Systems (SIPS) for NASA's Earth Observation System (EOS)

    NASA Technical Reports Server (NTRS)

    Tilmes, Curt

    2004-01-01

    In 2001, NASA Goddard Space Flight Center's Laboratory for Terrestrial Physics (LTP) started construction of a Science Investigator-led Processing System (SIPS) for processing data from the Ozone Monitoring Instrument (OMI), which will launch on the Aura platform in mid 2004. OMI is a contribution of the Netherlands Agency for Aerospace Programs (NIVR), in collaboration with the Finnish Meteorological Institute (FMI), to the Earth Observing System (EOS) Aura mission. It will continue the Total Ozone Mapping Spectrometer (TOMS) record for total ozone and other atmospheric parameters related to ozone chemistry and climate. OMI measurements will be highly synergistic with the other instruments on the EOS Aura platform. The LTP previously developed the Moderate Resolution Imaging Spectroradiometer (MODIS) Data Processing System (MODAPS), which has been in full operation since the launches of the Terra and Aqua spacecraft in December 1999 and May 2002, respectively. During that time, it has continually evolved to better support the needs of the MODIS team. We now run multiple instances of the system, managing faster-than-real-time reprocessing of the data as well as continuing forward processing. The new OMI Data Processing System (OMIDAPS) was adapted from the MODAPS. It will ingest raw data from the satellite ground station and process it to produce calibrated, geolocated higher-level data products. These data products will be transmitted to the Goddard Distributed Active Archive Center (GDAAC) instance of the EOS Data and Information System (EOSDIS) for long-term archive and distribution to the public. The OMIDAPS will also provide data distribution to the OMI Science Team for quality assessment, algorithm improvement, calibration, etc. We have taken advantage of lessons learned from the MODIS experience and software already developed for MODIS. We made some changes in the hardware system organization, database and

  19. Development of an image processing system in splendid squid quality classification

    NASA Astrophysics Data System (ADS)

    Masunee, Niyada; Chaiprapat, Supapan; Waiyagan, Kriangkrai

    2013-07-01

    Agricultural products typically exhibit high variance in quality characteristics. To assure customer satisfaction and control manufacturing productivity, quality classification is necessary to screen off defective items and to grade the products. This article presents an application of image processing techniques to squid grading and defect discrimination. A preliminary study indicated that surface color was an efficient determinant of the quality of splendid squids. In this study, a computer vision system (CVS) was developed to examine the characteristics of splendid squids. Using image processing techniques, squids could be classified into three quality grades in accordance with an industry standard. The developed system first sifted through squid images to reject ones with black marks. Qualified squids were then graded on the proportions of white, pink, and red regions appearing on their bodies by using fuzzy logic. The system was evaluated on 100 images of squids at different quality levels. The accuracy obtained by the proposed technique was 95% compared with sensory evaluation by an expert.
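Grading by color-region proportions can be sketched with crisp thresholds standing in for the paper's fuzzy-logic rules. The thresholds and grade labels below are hypothetical, not taken from the study; a faithful implementation would replace the `if` tests with fuzzy membership functions tuned to the industry standard.

```python
def grade_squid(white, pink, red):
    """Grade a squid from the proportions of white, pink and red
    pixels on its body (each in [0, 1]).

    Illustrative crisp rules only; the actual system used fuzzy
    logic with membership functions fitted to expert grading.
    """
    if white >= 0.7:
        return "Grade A"          # predominantly white body
    if white + pink >= 0.6:
        return "Grade B"          # mostly white/pink, some red
    return "Grade C"              # largely red or discolored

print(grade_squid(0.80, 0.15, 0.05))  # -> Grade A
print(grade_squid(0.40, 0.30, 0.30))  # -> Grade B
print(grade_squid(0.20, 0.20, 0.60))  # -> Grade C
```

In the full pipeline these proportions would come from segmenting the squid body in HSV or Lab color space after the black-mark rejection step.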

  20. Design and development of an in-line sputtering system and process development of thin film multilayer neutron supermirrors

    NASA Astrophysics Data System (ADS)

    Biswas, A.; Sampathkumar, R.; Kumar, Ajaya; Bhattacharyya, D.; Sahoo, N. K.; Lagoo, K. D.; Veerapur, R. D.; Padmanabhan, M.; Puri, R. K.; Bhattacharya, Debarati; Singh, Surendra; Basu, S.

    2014-12-01

    Neutron supermirrors and supermirror polarizers are thin film multilayer based devices which are used for reflecting and polarizing neutrons in various neutron based experiments. In the present communication, the in-house development of a 9 m long in-line dc sputtering system is described which is suitable for deposition of neutron supermirrors on large size (1500 mm × 150 mm) substrates and in large numbers. The optimisation of the deposition of Co and Ti thin films, Co/Ti periodic multilayers, and aperiodic supermirrors is also described. The system has been used to deposit thin film multilayer supermirror polarizers which show high reflectivity up to a reasonably large critical wavevector transfer of ˜0.06 Å⁻¹ (corresponding to m = 2.5, i.e., 2.5 times the critical wavevector transfer of natural Ni). The computer code for designing these supermirrors has also been developed in-house.

  1. Design and development of an in-line sputtering system and process development of thin film multilayer neutron supermirrors

    SciTech Connect

    Biswas, A.; Sampathkumar, R.; Kumar, Ajaya; Bhattacharyya, D.; Sahoo, N. K.; Lagoo, K. D.; Veerapur, R. D.; Padmanabhan, M.; Puri, R. K.; Bhattacharya, Debarati; Singh, Surendra; Basu, S.

    2014-12-15

    Neutron supermirrors and supermirror polarizers are thin film multilayer based devices which are used for reflecting and polarizing neutrons in various neutron based experiments. In the present communication, the in-house development of a 9 m long in-line dc sputtering system is described which is suitable for deposition of neutron supermirrors on large size (1500 mm × 150 mm) substrates and in large numbers. The optimisation of the deposition of Co and Ti thin films, Co/Ti periodic multilayers, and aperiodic supermirrors is also described. The system has been used to deposit thin film multilayer supermirror polarizers which show high reflectivity up to a reasonably large critical wavevector transfer of ∼0.06 Å⁻¹ (corresponding to m = 2.5, i.e., 2.5 times the critical wavevector transfer of natural Ni). The computer code for designing these supermirrors has also been developed in-house.

  2. Flat-plate solar array project: Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The engineering design, fabrication, assembly, operation, economic analysis, and process support research and development for an Experimental Process System Development Unit for producing semiconductor-grade silicon using the silane-to-silicon process are reported. The design activity was completed. About 95% of purchased equipment was received. The draft of the operations manual was about 50% complete and the design of the free-space system continued. The system using silicon powder transfer, melting, and shotting on a pseudocontinuous basis was demonstrated.

  3. Development and Application of a Process-based River System Model at a Continental Scale

    NASA Astrophysics Data System (ADS)

    Kim, S. S. H.; Dutta, D.; Vaze, J.; Hughes, J. D.; Yang, A.; Teng, J.

    2014-12-01

    Existing global and continental scale river models, mainly designed for integration with global climate models, have very coarse spatial resolutions and lack many important hydrological processes, such as overbank flow, irrigation diversion, and groundwater seepage/recharge, which operate at much finer resolutions. Thus, these models are not suitable for producing streamflow forecasts at fine spatial resolution or water accounts at sub-catchment level, which are important for water resources planning and management at regional and national scales. A large-scale river system model has been developed and implemented for water accounting in Australia as part of the Water Information Research and Development Alliance between Australia's Bureau of Meteorology (BoM) and CSIRO. The model, built on a node-link architecture, includes all major hydrological processes, anthropogenic water utilisation, and storage routing that influence streamflow in both regulated and unregulated river systems. It includes an irrigation model to compute water diversion for irrigation use and the associated fluxes and stores, and a storage-based floodplain inundation model to compute overbank flow from river to floodplain and the associated floodplain fluxes and stores. An auto-calibration tool has been built into the modelling system to automatically calibrate the model in large river systems using the Shuffled Complex Evolution optimiser and user-defined objective functions. The auto-calibration tool makes the model computationally efficient and practical for large basin applications. The model has been implemented in several large basins in Australia, including the Murray-Darling Basin, covering more than 2 million km2. The results of calibration and validation of the model show highly satisfactory performance. The model has been operationalised in BoM for producing various fluxes and stores for national water accounting. This paper introduces this newly developed river system model
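The storage routing along a node-link network can be illustrated with the classic Muskingum scheme for a single reach. The abstract does not state which routing method the model uses, so this is a generic sketch with invented parameter values (K, X, and the hydrograph are illustrative only).

```python
def muskingum_route(inflow, K, X, dt, initial_outflow=None):
    """Route an inflow hydrograph through one river reach with the
    Muskingum storage-routing scheme (K = travel time, X = weighting).

    O[i] = c0*I[i] + c1*I[i-1] + c2*O[i-1], with c0 + c1 + c2 = 1.
    """
    denom = 2.0 * K * (1.0 - X) + dt
    c0 = (dt - 2.0 * K * X) / denom
    c1 = (dt + 2.0 * K * X) / denom
    c2 = (2.0 * K * (1.0 - X) - dt) / denom
    out = [inflow[0] if initial_outflow is None else initial_outflow]
    for i in range(1, len(inflow)):
        out.append(c0 * inflow[i] + c1 * inflow[i - 1] + c2 * out[-1])
    return out

inflow = [10, 40, 80, 60, 30, 15, 10, 10]   # m^3/s, hypothetical flood wave
outflow = muskingum_route(inflow, K=2.0, X=0.2, dt=1.0)
print(max(outflow) < max(inflow))  # the routed peak is attenuated -> True
```

In a node-link model, each link applies a routing step like this and each node sums the routed flows from its upstream links, minus any diversions.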

  4. Evaluation of radiographs developed by a new ultrarapid film processing system.

    PubMed

    Schmidt, R A; Doi, K; Sekiya, M; Xu, X W; Giger, M L; Lu, C T; Mojtahedi, S; MacMahon, H

    1990-05-01

    The image quality of radiographs developed by a new ultrarapid processor was evaluated to determine if faster processing causes degradation in the image. The processor used was the Konica Super-Rapid SRX-501 model. Two films designed for this processor (Konica MGH-SR and MGL-SR) were processed in 45 sec and were compared with standard rapid processing in 90 sec of corresponding conventional films (Kodak TMG and OC). Rare-earth screens (Kodak Lanex Regular and Lanex Medium) used with the new and conventional films interleaved during angiographic studies or for phantom images were assessed for image quality. The basic imaging properties of the screen-film systems were examined by measuring (1) Hurter and Driffield curves, (2) modulation transfer functions by using the slit method, and (3) noise Wiener spectra. Subjective clinical assessment showed that the images obtained with ultrarapid processing were acceptable, with increased contrast and graininess. Hurter and Driffield curve measurements confirmed higher gradients. Modulation transfer function measurements were the same as for the conventional films. Noise Wiener spectrum measurements showed a 10% increase in noise for MGH-SR vs TMG film and a 30% increase for MGL-SR vs OC film. We conclude that acceptable image quality can be obtained using ultrarapid processing, with processing time approximately 60% that of conventional rapid processing. Potential applications include all areas in which rapid availability of the radiograph for interpretation is important. Although the processor studied was the first of its kind available, our evaluation indicates that the technology is available for a new class of ultrarapid processors. PMID:2108553

  5. The development of a zeolite system for upgrade of the Process Waste Treatment Plant

    SciTech Connect

    Robinson, S.M.; Kent, T.E.; Arnold, W.D.; Parrott, J.R. Jr.

    1993-10-01

    Studies have been undertaken to design an efficient zeolite ion exchange system for use at the ORNL Process Waste Treatment Plant to remove cesium and strontium to meet discharge limits. This report focuses on two areas: (1) design of column hardware and pretreatment steps needed to eliminate column plugging and channeling and (2) development of equilibrium models for the wastewater system. Results indicate that zeolite columns do not plug as quickly when the wastewater equalization is performed in the new Bethel Valley Storage Tanks instead of the former equalization basin where suspended solids concentration is high. A down-flow column with spent zeolite was used successfully as a prefilter to prevent plugging of the zeolite columns being used to remove strontium and cesium. Equilibrium studies indicate that a Langmuir isotherm models binary zeolite equilibrium data while the modified Dubinin-Polyani model predicts multicomponent data.
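The Langmuir isotherm used for the binary zeolite equilibrium data has the familiar form q = q_max·K·C/(1 + K·C): loading rises linearly at low concentration and saturates at q_max. A minimal sketch, with hypothetical parameter values rather than the report's fitted constants, is:

```python
def langmuir(C, q_max, K):
    """Langmuir isotherm: solid-phase loading q as a function of the
    equilibrium liquid-phase concentration C.

    q_max is the saturation loading; K is the affinity constant.
    """
    return q_max * K * C / (1.0 + K * C)

# Illustrative parameters only (not the report's fitted values).
q_max, K = 2.0, 0.5
for C in (0.1, 1.0, 10.0, 100.0):
    # Loading rises monotonically toward q_max = 2.0.
    print(C, round(langmuir(C, q_max, K), 3))
```

For the multicomponent (cesium plus strontium) data the report used the modified Dubinin-Polanyi model instead, since simple Langmuir parameters fitted per ion do not capture competitive uptake.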

  6. Development of Three-Layer Simulation Model for Freezing Process of Food Solution Systems

    NASA Astrophysics Data System (ADS)

    Kaminishi, Koji; Araki, Tetsuya; Shirakashi, Ryo; Ueno, Shigeaki; Sagara, Yasuyuki

A numerical model has been developed for simulating freezing phenomena in food solution systems. The cell model was simplified for application to food solution systems, incorporating three regions: an unfrozen layer, a frozen layer, and a moving boundary layer. A model for the moving rate of the freezing front was also introduced, calculated using the variable space network method proposed by Murray and Landis (1957). To demonstrate the validity of the model, it was applied to the freezing of coffee solutions. Since the model requires the phase diagram of the material to be frozen, the initial freezing temperatures of 1-55% coffee solutions were measured by DSC. The effective thermal conductivity of the coffee solutions was determined as a function of temperature and solute concentration using the Maxwell-Eucken model. The one-dimensional freezing process of a 10% coffee solution was simulated based on its phase diagram and thermophysical properties. The results were in good agreement with the experimental data, showing that the model can accurately describe the change in the location of the freezing front and the distributions of temperature and ice fraction during freezing.
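The Maxwell-Eucken mixture model used above for the effective thermal conductivity can be sketched as follows; the conductivity values are illustrative order-of-magnitude numbers, not the study's fitted temperature- and concentration-dependent inputs:

```python
def maxwell_eucken(k_c, k_d, v_d):
    """Maxwell-Eucken effective thermal conductivity of a two-phase mixture.

    k_c : conductivity of the continuous phase
    k_d : conductivity of the dispersed phase
    v_d : volume fraction of the dispersed phase (0..1)
    """
    num = 2 * k_c + k_d - 2 * (k_c - k_d) * v_d
    den = 2 * k_c + k_d + (k_c - k_d) * v_d
    return k_c * num / den

# Example: ice (~2.2 W/m/K) dispersed in an unfrozen solution (~0.55 W/m/K)
k_eff = maxwell_eucken(k_c=0.55, k_d=2.2, v_d=0.3)
```

The model interpolates sensibly between the limits: at v_d = 0 it returns the continuous-phase value and at v_d = 1 the dispersed-phase value.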

  7. Image retrieval and processing system version 2.0 development work

    NASA Technical Reports Server (NTRS)

    Slavney, Susan H.; Guinness, Edward A.

    1991-01-01

The Image Retrieval and Processing System (IRPS) is a software package developed at Washington University and used by the NASA Regional Planetary Image Facilities (RPIF's). The IRPS combines data base management and image processing components to allow the user to examine catalogs of image data, locate the data of interest, and perform radiometric and geometric calibration of the data in preparation for analysis. Version 1.0 of IRPS was completed in Aug. 1989 and was installed at several RPIF's; other RPIF's use remote logins via the NASA Science Internet to access IRPS at Washington University. Work has begun on designing and populating a catalog of Magellan image products that will be part of IRPS Version 2.0, planned for release by the end of calendar year 1991. With this catalog, a user will be able to search by orbit and by location for Magellan Basic Image Data Records (BIDR's), Mosaicked Image Data Records (MIDR's), and Altimetry-Radiometry Composite Data Records (ARCDR's). The catalog will include the Magellan CD-ROM volume, directory, and file name for each data product. The image processing component of IRPS is based on the Planetary Image Cartography Software (PICS) developed by the U.S. Geological Survey, Flagstaff, Arizona. To augment PICS capabilities, a set of image processing programs was developed that is compatible with PICS-format images. This software includes general-purpose functions that PICS does not have, analysis and utility programs for specific data sets, and programs from other sources that were modified to work with PICS images. Some of the software will be integrated into the Version 2.0 release of IRPS. A table is presented that lists the programs with a brief functional description of each.

  8. Investigation of coat-develop track system for placement error of contact hole shrink process

    NASA Astrophysics Data System (ADS)

    Harumoto, Masahiko; Stokes, Harold; Tanaka, Yuji; Kaneyama, Koji; Pieczulewski, Charles; Asai, Masaya; Servin, Isabelle; Argoud, Maxime; Gharbi, Ahmed; Lapeyre, Celine; Tiron, Raluca; Monget, Cedric

    2016-04-01

Directed Self-Assembly (DSA) is a well-known candidate for next-generation sub-15nm half-pitch lithography. [1-2] DSA processes on 300mm wafers have been demonstrated for several years and have made a strong impression with their fine-pattern results. [3-4] On the other hand, specific issues with DSA processes have become clear as a result of these recent challenges. [5-6] Pattern placement error, meaning the pattern shift after DSA fabrication, is recognized as one of these typical issues. Coat-develop track systems contribute to DSA pattern fabrication and also influence DSA pattern performance. [4] In this study, placement error was investigated using a simple contact-hole pattern and a subsequent contact-hole shrink process implemented on the SOKUDO DUO track. We show the placement error of contact-hole shrink using a DSA process and discuss the difference between DSA and other shrink methods.

  9. Barotropic processes associated with the development of the Mei-yu precipitation system

    NASA Astrophysics Data System (ADS)

    Li, Tingting; Li, Xiaofan

    2016-05-01

The barotropic processes associated with the development of a precipitation system are investigated through analysis of cloud-resolving model simulations of Mei-yu torrential rainfall events over eastern China in mid-June 2011. During the model integration period, there were three major heavy rainfall events: 9-12, 13-16 and 16-20 June. Kinetic energy is converted from perturbation to mean circulations in the first and second periods, whereas it is converted from mean to perturbation circulations in the third period. Further analysis shows that the kinetic energy conversion is determined by the vertical transport of zonal momentum. Thus, a prognostic equation for the vertical transport of zonal momentum is derived, in which its tendency is associated with dynamic, pressure-gradient and buoyancy processes. The kinetic energy conversion from perturbation to mean circulations in the first period is mainly associated with the dynamic processes, whereas the conversion from mean to perturbation circulations in the third period is generally related to the pressure-gradient processes.

  10. System design development for microwave and millimeter-wave materials processing

    NASA Astrophysics Data System (ADS)

    Feher, Lambert; Thumm, Manfred

    2002-06-01

The most notable effect in processing dielectrics with micro- and millimeter-waves is volumetric heating of these materials, offering the opportunity of very high heating rates for the samples. In comparison to conventional heating, where the heat transfer is diffusive and depends on the thermal conductivity of the material, the microwave field penetrates the sample and acts as an instantaneous heat source at each point of the sample. Owing to this unique property, microwave heating at the 2.45 GHz and 915 MHz ISM (Industrial, Scientific, Medical) frequencies has been established as an important industrial technology for more than 50 years. Successful application of microwaves in industry has been reported in, e.g., food processing systems, domestic ovens, the rubber industry and vacuum drying. The present paper outlines microwave system development at Forschungszentrum Karlsruhe, IHM, transferring properties from the higher frequency regime (millimeter-waves) to lower frequency applications. However, the need to use higher frequencies such as 24 GHz (an ISM frequency) for industrial applications has to be carefully verified with respect to specific physical and engineering advantages, or to the limits that standard microwave technology meets for the given problem.
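One way to make the frequency trade-off concrete is the plane-wave power penetration depth in a lossy dielectric. A minimal sketch using the standard textbook formula, with purely illustrative material parameters (not values from the paper):

```python
import math

def penetration_depth(freq_hz, eps_r, tan_delta):
    """Power penetration depth of a plane wave in a lossy dielectric.

    D_p = (lambda0 / (2*pi*sqrt(2*eps_r))) / sqrt(sqrt(1 + tan_delta**2) - 1)

    For low-loss materials this reduces to the familiar
    D_p ~ lambda0 * sqrt(eps_r) / (2*pi*eps_r'').
    """
    c = 299792458.0
    lam0 = c / freq_hz
    return (lam0 / (2 * math.pi * math.sqrt(2 * eps_r))
            / math.sqrt(math.sqrt(1 + tan_delta**2) - 1))

# Hypothetical dielectric (eps_r = 5, tan_delta = 0.1) at the two frequencies
d_245 = penetration_depth(2.45e9, eps_r=5.0, tan_delta=0.1)
d_24 = penetration_depth(24e9, eps_r=5.0, tan_delta=0.1)
```

Since the depth scales inversely with frequency for fixed material properties, 24 GHz deposits its energy in a roughly tenfold thinner layer than 2.45 GHz, which is one reason the move to higher ISM frequencies needs case-by-case justification.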

  11. BIOGAS Process development

    SciTech Connect

    Ghosh, S.; Mensinger, M.C.; Sajjad, A.; Henry, M.P.

    1984-01-01

The overall objective of the program is to demonstrate and commercialize the IGT two-phase BIOGAS Process for optimized methane production from, and simultaneous stabilization of, municipal solid waste (MSW). The specific objective of the current program is to conduct a laboratory-scale investigation of simple, cost-effective feed pretreatment techniques and selected digestion reactor designs to optimize methane production from MSW-sludge blends, and to select the best pretreatment and digestion conditions for testing during the subsequent program of process development unit (PDU) operation. A significant portion of the program efforts to date has been directed at evaluating and/or developing feeding, mixing and discharging systems for handling high-concentration, large-particle-size RDF slurries for anaerobic digestion processes. The performance of such processes depends significantly on the operational success of these subsystems. The results of the subsystem testing have been implemented in the design and operation of the 10-L, 20-L, and 125-L digesters. These results will also be utilized to design the CSTR and the upflow digesters of a large two-phase system. Data collected during the initial phase of this research showed in general that methane production from RDF decreased as the loading rate was increased. Thermophilic digestion did not appear to be significantly better than mesophilic digestion. 9 figures, 3 tables.

  12. The Naval Enlisted Professional Development Information System (NEPDIS): Front End Analysis (FEA) Process. Technical Report 159.

    ERIC Educational Resources Information Center

    Aagard, James A.; Ansbro, Thomas M.

    The Naval Enlisted Professional Development Information System (NEPDIS) was designed to function as a fully computerized information assembly and analysis system to support labor force, personnel, and training management. The NEPDIS comprises separate training development, instructional, training record and evaluation, career development, and…

  13. Image Processing System

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Mallinckrodt Institute of Radiology (MIR) is using a digital image processing system which employs NASA-developed technology. MIR's computer system is the largest radiology system in the world. It is used in diagnostic imaging. Blood vessels are injected with x-ray dye, and the images which are produced indicate whether arteries are hardened or blocked. A computer program developed by Jet Propulsion Laboratory known as Mini-VICAR/IBIS was supplied to MIR by COSMIC. The program provides the basis for developing the computer imaging routines for data processing, contrast enhancement and picture display.
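The contrast-enhancement routines mentioned above are not specified in detail; a common technique of this kind is a percentile-based linear grey-level stretch, sketched here as a generic illustration rather than the Mini-VICAR/IBIS implementation:

```python
import numpy as np

def contrast_stretch(img, lo_pct=2, hi_pct=98):
    """Linear grey-level stretch between two percentiles of the histogram.

    Pixel values at or below the lo_pct percentile map to 0, values at or
    above the hi_pct percentile map to 1, and everything between is scaled
    linearly. Clipping the tails keeps a few extreme pixels from wasting
    the available display range.
    """
    lo, hi = np.percentile(img, [lo_pct, hi_pct])
    out = (img.astype(float) - lo) / (hi - lo)
    return np.clip(out, 0.0, 1.0)

# Example: stretch a synthetic 16x16 ramp image with values 0..255
stretched = contrast_stretch(np.arange(256).reshape(16, 16))
```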

  14. Development and Evaluation of a Thai Learning System on the Web Using Natural Language Processing.

    ERIC Educational Resources Information Center

    Dansuwan, Suyada; Nishina, Kikuko; Akahori, Kanji; Shimizu, Yasutaka

    2001-01-01

    Describes the Thai Learning System, which is designed to help learners acquire the Thai word order system. The system facilitates the lessons on the Web using HyperText Markup Language and Perl programming, which interfaces with natural language processing by means of Prolog. (Author/VWL)

  15. Attitude determination of a high altitude balloon system. Part 2: Development of the parameter determination process

    NASA Technical Reports Server (NTRS)

    Nigro, N. J.; Elkouh, A. F.

    1975-01-01

The attitude of the balloon system can be determined as a function of time if: (a) a method for simulating the motion of the system is available, and (b) the initial state is known. The initial state is obtained by fitting the system motion (as measured by sensors) to the corresponding output predicted by the mathematical model. In the case of the LACATE experiment, the sensors consisted of three orthogonally oriented rate gyros and a magnetometer, all mounted on the research platform. The initial state was obtained by fitting the angular velocity components measured with the gyros to the corresponding values obtained from the solution of the math model. A block diagram illustrating the attitude determination process employed for the LACATE experiment is shown. The process consists of three essential parts: a process for simulating the balloon system, an instrumentation system for measuring the output, and a parameter estimation process for systematically and efficiently solving for the initial state. Results are presented and discussed.
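The fitting step described above can be illustrated with a toy least-squares estimate. The scalar dynamics below are hypothetical and stand in for the actual balloon-system model: assuming the simulated gyro output is linear in the unknown initial state x0, the best-fit x0 against the measured rates has a closed form.

```python
import numpy as np

# Hypothetical sensitivity of the model output to the initial state:
# omega_model(t) = phi(t) * x0 (a damped oscillation, for illustration only)
t = np.linspace(0.0, 10.0, 200)
phi = np.exp(-0.1 * t) * np.cos(0.5 * t)

# Synthetic "measured" gyro rates: true initial state plus sensor noise
x0_true = 2.5
rng = np.random.default_rng(0)
omega_meas = phi * x0_true + 0.01 * rng.standard_normal(t.size)

# Least-squares estimate of the initial state:
# x0_hat = (phi . omega_meas) / (phi . phi)
x0_hat = phi @ omega_meas / (phi @ phi)
```

In the real process the model output is nonlinear in the full initial state vector, so the estimation is iterative rather than a one-shot projection, but the fit-to-measurements principle is the same.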

  16. Experience gained with development and commissioning of retrofitted process control systems for large power units

    NASA Astrophysics Data System (ADS)

    Idzon, O. M.; Grekhov, L. L.

    2009-01-01

Experience gained over many years at ZAO Interavtomatika with retrofitting control and monitoring systems of large power units is summarized. The principles on which these systems should be retrofitted are considered, together with the factors influencing the choice of retrofitting option, as well as decisions on constructing a process control system during full and partial retrofitting. Recommendations are given for the optimal scope of functions that should be incorporated in the software and hardware tools of a process control system during its retrofitting.

  17. An Information System Development Method Combining Business Process Modeling with Executable Modeling and its Evaluation by Prototyping

    NASA Astrophysics Data System (ADS)

    Okawa, Tsutomu; Kaminishi, Tsukasa; Hirabayashi, Syuichi; Suzuki, Ryo; Mitsui, Hiroyasu; Koizumi, Hisao

Business in the enterprise is so closely tied to the information system that business activities are difficult without it. A system design technique is needed that properly takes the business process into account and enables rapid system development. In addition, cost pressures on development are more severe than before. To cope with this situation, the modeling technology named BPM (Business Process Management/Modeling) is drawing attention and becoming important as a key technology. BPM is a technology for modeling business activities as business processes and visualizing them to improve business efficiency. However, no general methodology exists for developing an information system using the analysis results of BPM, and only a few development cases have been reported. This paper proposes an information system development method combining business process modeling with executable modeling. We describe a guideline to support consistency and efficiency of development, and a framework enabling the information system to be developed from the model. We have prototyped an information system with the proposed method, and our experience has shown that the methodology is valuable.

  18. Tank Waste Remediation System tank waste pretreatment and vitrification process development testing requirements assessment

    SciTech Connect

    Howden, G.F.

    1994-10-24

A multi-faceted study was initiated in November 1993 to provide assurance that needed testing capabilities, facilities, and support infrastructure (sampling systems, casks, transportation systems, permits, etc.) would be available when needed for process and equipment development to support pretreatment and vitrification facility design and construction schedules. This first major report provides a snapshot of the known testing needs for pretreatment, low-level waste (LLW) and high-level waste (HLW) vitrification, and documents the results of a series of preliminary studies and workshops to define the issues needing resolution by cold or hot testing. Identified in this report are more than 140 Hanford Site tank waste pretreatment and LLW/HLW vitrification technology issues that can only be resolved by testing. The report also broadly characterizes the level of testing needed to resolve each issue. A second report will provide a strategy (or strategies) for ensuring timely test capability. Later reports will assess the capabilities of existing facilities to support needed testing and will recommend siting of the tests together with needed facility and infrastructure upgrades or additions.

  19. Development of digital interactive processing system for NOAA satellites AVHRR data

    NASA Astrophysics Data System (ADS)

    Gupta, R. K.; Murthy, N. N.

The paper discusses a digital image processing system for NOAA/AVHRR data, including land applications, configured around a VAX 11/750 host computer supported with an FPS 100 array processor, Comtal graphic display, and HP plotting devices. The system software comprises a relational database with query and editing facilities; a man-machine interface using form, menu, and prompt inputs, including validation of user entries for data type and range; and preprocessing software for data calibration, Sun-angle correction, geometric corrections for the Earth-curvature effect and Earth-rotation offsets, and Earth location of the AVHRR image. The implemented image enhancement techniques, such as grey-level stretching, histogram equalization, and convolution, are discussed. Implementation details for the computation of the vegetative index and normalized vegetative index using NOAA/AVHRR channels 1 and 2 data are presented together with output; the scientific background for such computations and the obtainability of similar indices from Landsat/MSS data are also included. The paper concludes by specifying further planned software developments and the progress envisaged in the field of vegetation index studies.
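The normalized vegetation index from AVHRR channels 1 (visible/red) and 2 (near-infrared) follows the standard NDVI definition; a minimal sketch (reflectance values here are illustrative):

```python
import numpy as np

def ndvi(ch1_red, ch2_nir):
    """Normalized difference vegetation index from AVHRR channels 1 and 2.

    NDVI = (NIR - RED) / (NIR + RED), bounded in [-1, 1].
    Dense green vegetation absorbs red and reflects strongly in the NIR,
    so it yields high positive values.
    """
    red = np.asarray(ch1_red, dtype=float)
    nir = np.asarray(ch2_nir, dtype=float)
    return (nir - red) / (nir + red)

# Example: a vegetated pixel (red 0.10, NIR 0.50)
nd = ndvi(0.10, 0.50)
```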

  20. The Impact of the Bologna Process on the Development of the Greek Quality Assurance System

    ERIC Educational Resources Information Center

    Asderaki, Foteini

    2009-01-01

    Greece, an EU-member state since 1981, lagged behind other European countries in the development of a national quality assurance system. This article charts the route to the establishment of a quality assurance system in Greece. While national evaluation and accreditation systems were established in most European countries during the mid-1980s and…

  1. Improving Ground Penetrating Radar Imaging in High Loss Environments by Coordinated System Development, Data Processing, Numerical Modeling, & Visualization

    SciTech Connect

    Wright, David L.

    2004-12-01

    Improving Ground Penetrating Radar Imaging in High Loss Environments by Coordinated System Development, Data Processing, Numerical Modeling, and Visualization Methods with Applications to Site Characterization EMSP Project 86992 Progress Report as of 9/2004.

  2. System design, development, and production process modeling: A versatile and powerful acquisition management decision support tool

    SciTech Connect

    Rafuse, H.E.

    1996-12-31

A series of studies have been completed on the manufacturing operations of light, medium, and heavy tactical vehicle system producers to facilitate critical system acquisition resource decisions by the United States Army Program Executive Officer, Tactical Wheeled Vehicles. The principal programs were the Family of Medium Tactical Vehicles (FMTV) production programs at Stewart & Stevenson Services, Inc.; the heavy TWV production programs at the Oshkosh Truck Corporation in Oshkosh, Wisconsin; and the light TWV and 2.5 ton remanufacturing production programs at the AM General Corporation in South Bend, Indiana. Each contractor's production scenarios were analyzed and modeled to accurately quantify the relationship between production rates and unit costs. Specific objectives included identifying (1) Minimum Sustaining Rates to support current and future budgetary requirements and resource programming for potential follow-on procurements, (2) thresholds where production rate changes significantly affect unit costs, and (3) critical production program factors and their impacts to production rate versus unit cost relationships. Two different techniques were utilized initially in conducting the analyses. One technique principally focused on collecting and analyzing applicable historical production program information, where available, to develop a statistical predictive model. A second and much more exhaustive technique focused on a detailed modeling of each contractor's production processes, flows, and operations. A standard architecture of multiple linked functional modules was used for each process model. Using the standard architecture, the individual modules were tailored to specific contractor operations. Each model contains detailed information on manpower, burden rates, material, material price/quantity relationships, capital, manufacturing support, program management, and all related direct and indirect costs applicable to the production programs.

  3. EARSEC SAR processing system

    NASA Astrophysics Data System (ADS)

    Protheroe, Mark; Sloggett, David R.; Sieber, Alois J.

    1994-12-01

Traditionally, the production of high-quality Synthetic Aperture Radar imagery has been an area where a potential user would have to expend large amounts of money, either in the bespoke development of a processing chain dedicated to his requirements or in the purchase of a dedicated hardware platform adapted using accelerator boards and enhanced memory management. Whichever option the user adopted, there were limitations based on the desire for a realistic throughput in data load and time. The user faced a choice, made early in the procurement, between a system that adopted innovative algorithmic manipulation to limit the processing time, and the purchase of expensive hardware. The former limits the quality of the product, while the latter excludes the user from any visibility into the processing chain. Clearly there was a need for a SAR processing architecture that gave the user a choice in the methodology to be adopted for a particular processing sequence, allowing him to decide on either a quick (lower quality) product or a slower, detailed (high quality) product, without having to change the algorithmic base of his processor or the hardware platform. The European Commission, through the Advanced Techniques unit of the Joint Research Centre (JRC) Institute for Remote Sensing at Ispra in Italy, realizing the limitations of current processing abilities, initiated its own program to build airborne SAR and Electro-Optical (EO) sensor systems. This program is called the European Airborne Remote Sensing Capabilities (EARSEC) program. This paper describes the processing system developed for the airborne SAR sensor system. The paper considers the requirements for the system and the design of the EARSEC Airborne SAR Processing System. It highlights the development of an open SAR processing architecture where users have full access to the intermediate products that arise from each of the major processing stages. It also describes the main processing stages in the overall

4. Considerations in developing geographic information systems based on low-cost digital image processing

    NASA Technical Reports Server (NTRS)

    Henderson, F. M.; Dobson, M. W.

    1981-01-01

    The potential of digital image processing systems costing $20,000 or less for geographic information systems is assessed with the emphasis on the volume of data to be handled, the commercial hardware systems available, and the basic software for: (1) data entry, conversion and digitization; (2) georeferencing and geometric correction; (3) data structuring; (4) editing and updating; (5) analysis and retrieval; (6) output drivers; and (7) data management. Costs must also be considered as tangible and intangible factors.

  5. Development and implementation of the verification process for the shuttle avionics system

    NASA Technical Reports Server (NTRS)

    Smith, H. E.; Fouts, W. B.; Mesmer, J.

    1985-01-01

The background of the shuttle avionics system design and the unique drivers associated with the redundant digital multiplexed data processing system are examined. With flight software pervading the lowest elements of the flight-critical subsystems, it was necessary to identify a unique and orderly approach to verifying the system as flight-ready for STS-1. The approach and implementation plan are discussed, and both technical problems and management issues are dealt with.

  6. A Module Experimental Process System Development Unit (MEPSDU). [flat plate solar arrays

    NASA Technical Reports Server (NTRS)

    1981-01-01

The development of a cost-effective process sequence that has the potential for the production of flat-plate photovoltaic modules which meet the 1986 price goal of 70 cents or less per peak Watt is described. The major accomplishments include (1) an improved AR coating technique; (2) the use of sand-blast back clean-up to reduce clean-up costs and to allow much of the Al paste to serve as a back conductor; and (3) the development of wave soldering for use with solar cells. Cells were processed to evaluate different process steps, a cell and minimodule test plan was prepared, and data were collected for a preliminary SAMICS cost analysis.

  7. Renovation of CPF (Chemical Processing Facility) for Development of Advanced Fast Reactor Fuel Cycle System

    SciTech Connect

    Shinichi Aose; Takafumi Kitajima; Kouji Ogasawara; Kazunori Nomura; Shigehiko Miyachi; Yoshiaki Ichige; Tadahiro Shinozaki; Shinichi Ohuchi

    2008-01-15

CPF (Chemical Processing Facility) was constructed at the Nuclear Fuel Cycle Engineering Laboratories of JAEA (Japan Atomic Energy Agency) in 1980 as a basic research facility where spent fuel pins from fast reactors (FR) and high-level liquid waste can be handled. The renovation consists of remodeling of the CA-3 cell and laboratory A, and installation of glove boxes, hoods, and analytical equipment in laboratory C and the analytical laboratory. Maintenance equipment in the CA-5 cell that had been out of order was also repaired. The CA-3 cell is the main cell, in which key equipment such as a dissolver, a clarifier, and extractors is installed for carrying out hot tests using irradiated FR fuel. Since the CPF had originally specialized in research on the Purex process, it was desired to carry out research and development on various new reprocessing processes. Formerly, equipment was arranged in a wide space and connected both to other equipment and to the utility supply system mainly by fixed stainless steel pipes, which caused a shortage of operating space and limited flexibility for basic experimental study. Old equipment in the CA-3 cell, including vessels and pipes, was removed after successful decontamination, and new equipment was installed according to the new design. So that experimental equipment can be easily installed and rearranged, it is now connected largely by flexible pipes. Since the dissolver can easily be replaced, various dissolution experiments can be conducted. Insoluble residue generated by dissolution of spent fuel is clarified by a centrifuge; this small apparatus is effective for saving space. Mini mixer-settlers or centrifugal contactors are placed in the prescribed limited space in front of the backside wall. Fresh reagents such as solvent, scrubbing, and stripping solutions are continuously fed from laboratory A to the extractor by the reagent supply system with semi-automatic observation

  8. Expert System Development in the Classroom: Processes and Outcomes. Technical Report 91-1.

    ERIC Educational Resources Information Center

    Wideman, Herbert H.; Owston, Ronald D.

    This study examined cognitive processes and outcomes associated with student knowledge base development. Sixty-nine students from two grade 8 classes were randomly assigned to one of three groups: a knowledge base development (KBD) group, a problem-solving software group, and a control group. Those in the KBD group received relevant instruction…

  9. Flat-plate solar-array project. Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1981-01-01

The engineering design, fabrication, assembly, operation, economic analysis, and process-support R&D for an Experimental Process System Development Unit (EPSDU) are reported. About 95% of the purchased equipment has been received and will be reshipped to the West Coast location. The data collection system is completed. In the area of melting/consolidation, melting and shotting on a pseudocontinuous basis was demonstrated for the system using silicon powder transfer. It is proposed to continue the very promising fluid-bed work.

  10. Flat-plate solar-array project. Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Astrophysics Data System (ADS)

    1981-09-01

The engineering design, fabrication, assembly, operation, economic analysis, and process-support R&D for an Experimental Process System Development Unit (EPSDU) are reported. About 95% of the purchased equipment has been received and will be reshipped to the West Coast location. The data collection system is completed. In the area of melting/consolidation, melting and shotting on a pseudocontinuous basis was demonstrated for the system using silicon powder transfer. It is proposed to continue the very promising fluid-bed work.

  11. Development of laser cladding system with process monitoring by x-ray imaging

    NASA Astrophysics Data System (ADS)

    Terada, Takaya; Yamada, Tomonori; Nishimura, Akihiko

    2014-02-01

We have been developing a new laser cladding system to repair damage to parts in aging plants. It consists of several devices: a laser torch, a composite-type optical fiber, a QCW fiber laser, etc. All devices are installed in a mobile rack, so we can carry the system to plants, laboratories, or anywhere it is needed. Laser cladding requires that the laser beam and filler wire be aimed at the workpiece with the best possible accuracy, for which a composite-type optical fiberscope is useful. This fiberscope is composed of a center fiber for beam delivery surrounded by 20000 fibers for visible image delivery, so it always keeps the target at the center of the gun-sight. Using our system, we succeeded in laying a line of laser cladding on the inside wall of a 1-inch tube. Before this success, we solved two serious problems: contamination of the optics and deformation of the droplet. Observing the laser cladding process by X-ray imaging with SPring-8 synchrotron radiation, we found that the molten pool formed to a depth of under a hundred micrometers within 10 milliseconds. A quasi-CW fiber laser with 1 kW output was employed as the heat source to generate the shallow molten pool. The X-ray shadowgraph clarified that a molten droplet up to a millimeter in size formed at the edge of the wire, and that it grew if the wire did not contact the tube wall in the initial state. We also succeeded in measuring the thermo-electromotive force between the wire and the tube metal to confirm whether the two were in contact. We propose applying this laser cladding technology to the maintenance of aging industrial plants and nuclear facilities.

  12. MicroRNAs (MiRs) Precisely Regulate Immune System Development and Function in Immunosenescence Process.

    PubMed

    Aalaei-Andabili, Seyed Hossein; Rezaei, Nima

    2016-01-01

Human aging is a complex process with pivotal changes in the gene expression of biological pathways. Immune system dysfunction has been recognized as one of the most important abnormalities induced by senescence, termed immunosenescence. Emerging evidence suggests a role for miRs in immunosenescence. We aimed to systematically review all relevant reports to clearly state the effects of miRs on the immunosenescence process. Sensitive electronic searches were carried out, and quality assessment was performed. Since the majority of the included studies were laboratory works, and therefore heterogeneous, we discuss miR effects on the immunological aging process non-statistically. Forty-six articles were found in the initial search; after exclusion of 34 articles, 12 studies entered the final stage. We found that miRs have crucial roles in the proper function of the immune system. MiRs are involved in regulating the aging process in the immune system components and target certain genes, promoting or inhibiting the immune system's reaction to invasion. MiRs also control the life span of immune system members by regulating genes involved in apoptosis. Interestingly, we found that immunosenescence is controllable by proper manipulation of the expression of various miRs. DNA methylation and histone acetylation have been discovered as novel strategies that alter NF-κB binding ability at miR promoter sites. Evidence of miR effects on the impairment of immune system function due to aging is emerging. Although it is accepted that miRs have determinant roles in the regulation of immunosenescence, most reports are based on animal or laboratory work, suggesting the need for more investigation in humans. PMID:26327579

  13. Development of a microblood-typing system using assembly-free process based on virtual environment

    NASA Astrophysics Data System (ADS)

    Lee, Seung-Jae; Kang, Hyun-Wook; Kim, Yonggoo; Lee, Gyoo-Whung; Lim, Geunbae; Cho, Dong-Woo

    2005-02-01

    ABO typing is the first test performed on blood that is to be used for transfusion. A person must receive ABO-matched blood, as ABO incompatibility is the major cause of fatal transfusion reactions. Until now, this blood typing has been done manually, so there is a need for an automated typing machine that uses a very small volume of blood. In this paper, we present a new micro blood-typing system with a fully 3-dimensional geometry, realized using micro-stereolithography. The system was fabricated with a novel integration process based on a virtual environment, and blood-typing experiments using it were performed successfully.

  14. Development of a System for Thermoelectric Heat Recovery from Stationary Industrial Processes

    NASA Astrophysics Data System (ADS)

    Ebling, D. G.; Krumm, A.; Pfeiffelmann, B.; Gottschald, J.; Bruchmann, J.; Benim, A. C.; Adam, M.; Labs, R.; Herbertz, R. R.; Stunz, A.

    2016-05-01

    The hot forming process of steel requires temperatures of up to 1300°C. Usually, the invested energy is lost to the environment by the subsequent cooling of the forged parts to room temperature. Thermoelectric systems are able to recover this wasted heat by converting the heat into electrical energy and feeding it into the power grid. The proposed thermoelectric system covers an absorption surface of half a square meter, and it is equipped with 50 Bismuth-Telluride based thermoelectric generators, five cold plates, and five inverters. Measurements were performed under production conditions of the industrial environment of the forging process. The heat distribution and temperature profiles are measured and modeled based on the prevailing production conditions and geometric boundary conditions. Under quasi-stationary conditions, the thermoelectric system absorbs a heat radiation of 14.8 kW and feeds electrical power of 388 W into the power grid. The discussed model predicts the measured values with slight deviations.
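    The two figures reported above imply the system's heat-to-electricity conversion efficiency directly; a quick check of the arithmetic:

```python
# Conversion efficiency implied by the reported figures:
# 14.8 kW of absorbed heat radiation, 388 W of electrical output.
absorbed_heat_w = 14_800.0
electrical_out_w = 388.0
efficiency = electrical_out_w / absorbed_heat_w
print(f"system conversion efficiency: {efficiency:.1%}")  # about 2.6%
```

    This is in the typical range for bismuth-telluride generators operating across a moderate temperature difference.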

  15. Development of a System for Thermoelectric Heat Recovery from Stationary Industrial Processes

    NASA Astrophysics Data System (ADS)

    Ebling, D. G.; Krumm, A.; Pfeiffelmann, B.; Gottschald, J.; Bruchmann, J.; Benim, A. C.; Adam, M.; Labs, R.; Herbertz, R. R.; Stunz, A.

    2016-07-01

    The hot forming process of steel requires temperatures of up to 1300°C. Usually, the invested energy is lost to the environment by the subsequent cooling of the forged parts to room temperature. Thermoelectric systems are able to recover this wasted heat by converting the heat into electrical energy and feeding it into the power grid. The proposed thermoelectric system covers an absorption surface of half a square meter, and it is equipped with 50 Bismuth-Telluride based thermoelectric generators, five cold plates, and five inverters. Measurements were performed under production conditions of the industrial environment of the forging process. The heat distribution and temperature profiles are measured and modeled based on the prevailing production conditions and geometric boundary conditions. Under quasi-stationary conditions, the thermoelectric system absorbs a heat radiation of 14.8 kW and feeds electrical power of 388 W into the power grid. The discussed model predicts the measured values with slight deviations.

  16. Enhanced Geothermal Systems Research and Development: Models of Subsurface Chemical Processes Affecting Fluid Flow

    SciTech Connect

    Moller, Nancy; Weare J. H.

    2008-05-29

    Successful exploitation of the vast amount of heat stored beneath the earth’s surface in hydrothermal and fluid-limited, low permeability geothermal resources would greatly expand the Nation’s domestic energy inventory and thereby promote a more secure energy supply, a stronger economy and a cleaner environment. However, a major factor limiting the expanded development of current hydrothermal resources as well as the production of enhanced geothermal systems (EGS) is insufficient knowledge about the chemical processes controlling subsurface fluid flow. With funding from past grants from the DOE geothermal program and other agencies, we successfully developed advanced equation of state (EOS) and simulation technologies that accurately describe the chemistry of geothermal reservoirs and energy production processes via their free energies for wide XTP ranges. Using the specific interaction equations of Pitzer, we showed that our TEQUIL chemical models can correctly simulate behavior (e.g., mineral scaling and saturation ratios, gas break out, brine mixing effects, down hole temperatures and fluid chemical composition, spent brine incompatibilities) within the compositional range (Na-K-Ca-Cl-SO4-CO3-H2O-SiO2-CO2(g)) and temperature range (T < 350°C) associated with many current geothermal energy production sites that produce brines with temperatures below the critical point of water. The goal of research carried out under DOE grant DE-FG36-04GO14300 (10/1/2004-12/31/2007) was to expand the compositional range of our Pitzer-based TEQUIL fluid/rock interaction models to include the important aluminum and silica interactions (T < 350°C). Aluminum is the third most abundant element in the earth’s crust; and, as a constituent of aluminosilicate minerals, it is found in two thirds of the minerals in the earth’s crust. The ability to accurately characterize effects of temperature, fluid mixing and interactions between major rock-forming minerals and hydrothermal and

  17. Development of Energy Models for Production Systems and Processes to Inform Environmentally Benign Decision-Making

    NASA Astrophysics Data System (ADS)

    Diaz-Elsayed, Nancy

    Between 2008 and 2035 global energy demand is expected to grow by 53%. While most industry-level analyses of manufacturing in the United States (U.S.) have traditionally focused on high energy consumers such as the petroleum, chemical, paper, primary metal, and food sectors, the remaining sectors account for the majority of establishments in the U.S. Specifically, of the establishments participating in the Energy Information Administration's Manufacturing Energy Consumption Survey in 2006, the "non-energy intensive" sectors still consumed 4×10^9 GJ of energy, i.e., one-quarter of the energy consumed by the manufacturing sectors, which is enough to power 98 million homes for a year. The increasing use of renewable energy sources and the introduction of energy-efficient technologies in manufacturing operations support the advancement towards a cleaner future, but a good understanding of how the systems and processes function can reduce the environmental burden even further. To facilitate this, methods are developed to model the energy of manufacturing across three hierarchical levels: production equipment, factory operations, and industry; these methods are used to accurately assess the current state and provide effective recommendations to further reduce energy consumption. First, the energy consumption of production equipment is characterized to provide machine operators and product designers with viable methods to estimate the environmental impact of the manufacturing phase of a product. The energy model of production equipment is tested and found to have an average accuracy of 97% for a product requiring machining with a variable material removal rate profile. However, changing the use of production equipment alone will not result in an optimal solution since machines are part of a larger system. Which machines to use, how to schedule production runs while accounting for idle time, the design of the factory layout to facilitate production, and even the

  18. Low cost solar array project: Experimental process system development unit for producing semiconductor-grade silicon using silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The design, fabrication, and installation of an experimental process system development unit (EPSDU) are analyzed. Supporting research and development were performed to provide an information data base usable for the EPSDU and for the technological design and economic analysis of potential scale-up of the process. Iterative economic analyses were conducted to estimate the product cost for the production of semiconductor-grade silicon in a facility capable of producing 1000 MT/yr.

  19. Flat-plate solar array project: Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The process technology for the manufacture of semiconductor-grade silicon in a large commercial plant by 1986, at a price less than $14 per kilogram of silicon (in 1975 dollars), is discussed. The engineering design, installation, checkout, and operation of an Experimental Process System Development Unit are described. Quality control during scale-up of the process and an economic analysis of product and production costs are also discussed.

  20. Development of a new flux map processing code for moveable detector system in PWR

    SciTech Connect

    Li, W.; Lu, H.; Li, J.; Dang, Z.; Zhang, X.

    2013-07-01

    This paper presents an introduction to the development of the flux map processing code MAPLE, developed by the China Nuclear Power Technology Research Institute (CNPPJ), China Guangdong Nuclear Power Group (CGN). The method used to obtain the three-dimensional 'measured' power distribution from the measurement signals is also described. Three methods, namely the Weight Coefficient Method (WCM), the Polynomial Expand Method (PEM), and the Thin Plate Spline (TPS) method, have been applied to fit the deviation between measured and predicted results on the two-dimensional radial plane. The measured flux map data of the LINGAO nuclear power plant (NPP) are processed using MAPLE as a test case to compare the effectiveness of the three methods, combined with the 3D neutronics code COCO. Assembly power distribution results show that the MAPLE results are reasonable and satisfactory. Further verification and validation of the MAPLE code will be carried out in the future. (authors)
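    The PEM option named above amounts to fitting the measured-minus-predicted deviation over the radial plane with a low-order polynomial surface. A minimal sketch of that kind of fit, with a quadratic basis and synthetic data that are illustrative assumptions (MAPLE's actual basis and data are not given here):

```python
import numpy as np

# Sketch of a PEM-style fit: model the measured-minus-predicted power
# deviation over the 2-D radial plane with a quadratic surface fitted
# by least squares. Basis and data are illustrative, not MAPLE's.
def quadratic_basis(x, y):
    return np.column_stack([np.ones_like(x), x, y, x*y, x**2, y**2])

def fit_quadratic_surface(x, y, dev):
    coef, *_ = np.linalg.lstsq(quadratic_basis(x, y), dev, rcond=None)
    return coef

def eval_surface(coef, x, y):
    return quadratic_basis(x, y) @ coef

rng = np.random.default_rng(0)
x, y = rng.uniform(-1, 1, 50), rng.uniform(-1, 1, 50)
dev = 0.3 + 0.1*x - 0.2*y + 0.05*x*y     # synthetic deviation map
coef = fit_quadratic_surface(x, y, dev)
print(coef)  # close to the planted [0.3, 0.1, -0.2, 0.05, 0, 0]
```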

  1. Laser forming process development

    SciTech Connect

    Blake, R.J.

    1996-12-31

    This paper summarizes the activities performed for the process development of laser thermal forming of sheet metal parts in support of rapid prototyping. A 400 watt pulsed Nd:YAG laser and a 50 watt desktop CO{sub 2} laser were used during initial process development. Several tool-assisted laser forming approaches were conceived during development, and simple fixtures for process development/understanding were used throughout all testing. Much of the actual forming was performed with the base material in an unfixtured state. CRES (304) was used for baseline development, but the effort was directed toward forming titanium (e.g., 6Al-4V, 15V-3Cr-3Sn-3Al). Several DOE (i.e., Design of Experiments) techniques were employed during development, and a neural net computer model was conceived for process control. This program was a joint effort in cooperation with the American Welding Society under contract with the Defense Advanced Research Projects Agency (DARPA). A synopsis of the laser forming process development, future opportunities, and applications is presented.

  2. Advanced multilateration theory, software development, and data processing: The MICRODOT system

    NASA Technical Reports Server (NTRS)

    Escobal, P. R.; Gallagher, J. F.; Vonroos, O. H.

    1976-01-01

    The process of geometric parameter estimation to accuracies of one centimeter, i.e., multilateration, is defined and its applications are listed. A brief functional explanation of the theory is presented. Next, various multilateration systems are described in order of increasing system complexity. Expected system accuracy is discussed from a general point of view and a summary of the errors is given. An outline of the design of a software processing system for multilateration, called MICRODOT, is presented next. The links of this software, which can be used for multilateration data simulations or operational data reduction, are examined on an individual basis. Functional flow diagrams are presented to aid in understanding the software capability. MICRODOT capability is described with respect to vehicle configurations, interstation coordinate reduction, geophysical parameter estimation, and orbit determination. Numerical results obtained from MICRODOT via data simulations are displayed for both hypothetical and real-world vehicle/station configurations such as those used in the GEOS-3 Project. These simulations show the inherent power of the multilateration procedure.
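    At its core, multilateration estimates a position from ranges to known stations. A minimal Gauss-Newton least-squares sketch of that step, with synthetic stations and ranges (this is an illustration of the general technique, not MICRODOT itself):

```python
import numpy as np

# Sketch: estimate a point from ranges to known stations by
# Gauss-Newton least squares. Stations/ranges are synthetic.
def multilaterate(stations, ranges, guess, iters=20):
    p = np.asarray(guess, float)
    for _ in range(iters):
        diffs = p - stations                   # (n, 3)
        dists = np.linalg.norm(diffs, axis=1)  # predicted ranges
        J = diffs / dists[:, None]             # d(range)/d(position)
        r = ranges - dists                     # residuals
        step, *_ = np.linalg.lstsq(J, r, rcond=None)
        p = p + step
    return p

stations = np.array([[0.0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]])
truth = np.array([3.0, 4.0, 5.0])
ranges = np.linalg.norm(stations - truth, axis=1)
print(multilaterate(stations, ranges, guess=[1.0, 1, 1]))  # ~ [3, 4, 5]
```

    With exact ranges and well-spread stations, the iteration converges to the true point to machine precision; real data adds noise terms and error modeling, which is what the MICRODOT links handle.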

  3. The influence of gravity on the process of development of animal systems

    NASA Technical Reports Server (NTRS)

    Malacinski, G. M.; Neff, A. W.

    1984-01-01

    The development of animal systems is described in terms of a series of overlapping phases: pattern specification; differentiation; growth; and aging. The extent to which altered (micro) gravity (g) affects those phases is briefly reviewed for several animal systems. As a model, amphibian egg/early embryo is described. Recent data derived from clinostat protocols indicates that microgravity simulation alters early pattern specification (dorsal/ventral polarity) but does not adversely influence subsequent morphogenesis. Possible explanations for the absence of catastrophic microgravity effects on amphibian embryogenesis are discussed.

  4. Facilitating the Specification Capture and Transformation Process in the Development of Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Filho, Aluzio Haendehen; Caminada, Numo; Haeusler, Edward Hermann; vonStaa, Arndt

    2004-01-01

    To support the development of flexible and reusable multi-agent systems (MAS), we have built a framework designated MAS-CF. MAS-CF is a component framework that implements a layered architecture based on contextual composition. Interaction rules, controlled by architecture mechanisms, ensure very low coupling, making possible the sharing of distributed services in a transparent, dynamic, and independent way. These properties facilitate large-scale reuse, since organizational abstractions can be reused and propagated to all instances created from the framework. The objective is to reduce the complexity and development time of multi-agent systems through the reuse of generic organizational abstractions.

  5. Advanced development of a pressurized ash agglomerating fluidized-bed coal gasification system: Topical report, Process analysis, FY 1983

    SciTech Connect

    1987-07-31

    KRW Energy Systems, Inc., is engaged in the continuing development of a pressurized, fluidized-bed gasification process at its Waltz Mill Site in Madison, Pennsylvania. The overall objective of the program is to demonstrate the viability of the KRW process for the environmentally-acceptable production of low- and medium-Btu fuel gas from a variety of fossilized carbonaceous feedstocks and industrial fuels. This report presents process analysis of the 24 ton-per-day Process Development Unit (PDU) operations and is a continuation of the process analysis work performed in 1980 and 1981. Included is work performed on PDU process data; gasification; char-ash separation; ash agglomeration; fines carryover, recycle, and consumption; deposit formation; materials; and environmental, health, and safety issues. 63 figs., 43 tabs.

  6. Development of automatic movement analysis system for a small laboratory animal using image processing

    NASA Astrophysics Data System (ADS)

    Nagatomo, Satoshi; Kawasue, Kikuhito; Koshimoto, Chihiro

    2013-03-01

    Activity analysis of a small laboratory animal is an effective procedure in various bioscience fields. The simplest way to obtain animal activity data is manual observation and recording, but this is labor-intensive and rather subjective. Analyzing animal movement automatically and objectively has usually required expensive equipment. In the present study, we developed a low-cost animal activity analysis system based on a template-matching method applied to video-recorded movements of a laboratory animal.
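    The heart of such a system is locating the animal in each frame by template matching. A toy normalized cross-correlation matcher in pure NumPy, with a synthetic frame and template (a real system would use an optimized routine such as OpenCV's matchTemplate):

```python
import numpy as np

# Sketch: locate a target in a frame by normalized cross-correlation
# template matching. Brute-force toy version for illustration only.
def match_template(frame, template):
    th, tw = template.shape
    t = template - template.mean()
    best, best_pos = -2.0, (0, 0)
    for i in range(frame.shape[0] - th + 1):
        for j in range(frame.shape[1] - tw + 1):
            patch = frame[i:i+th, j:j+tw]
            p = patch - patch.mean()
            denom = np.sqrt((p*p).sum() * (t*t).sum())
            score = (p*t).sum() / denom if denom > 0 else 0.0
            if score > best:            # keep the highest NCC score
                best, best_pos = score, (i, j)
    return best_pos

pattern = np.array([[1.0, 2, 1], [2, 3, 2], [1, 2, 1]])  # "animal" blob
frame = np.zeros((20, 20))
frame[12:15, 5:8] = pattern
print(match_template(frame, pattern))  # (12, 5)
```

    Tracking the matched position frame by frame then yields the movement trajectory from which activity measures are computed.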

  7. Development of an advanced spacecraft water and waste materials processing system

    NASA Technical Reports Server (NTRS)

    Murray, R. W.; Schelkopf, J. D.; Middleton, R. L.

    1975-01-01

    An Integrated Waste Management-Water System (WM-WS) which uses radioisotopes for thermal energy is described and results of its trial in a 4-man, 180 day simulated space mission are presented. It collects urine, feces, trash, and wash water in zero gravity, processes the wastes to a common evaporator, distills and catalytically purifies the water, and separates and incinerates the solid residues using little oxygen and no chemical additives or expendable filters. Technical details on all subsystems are given along with performance specifications. Data on recovered water and heat loss obtained in test trials are presented. The closed loop incinerator and other projects underway to increase system efficiency and capacity are discussed.

  8. Biomass Torrefaction Process Review and Moving Bed Torrefaction System Model Development

    SciTech Connect

    Jaya Shakar Tumuluru; Shahab Sokhansanj; Christopher T. Wright; Richard D. Boardman

    2010-08-01

    Torrefaction is currently developing as an important preprocessing step to improve the quality of biomass in terms of physical properties and proximate and ultimate composition. Torrefaction is a slow heating of biomass in an inert or reduced environment to a maximum temperature of 300°C. It can also be defined as a group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in a temperature range between 200-230°C and 270-280°C. Thus, the process can also be called mild pyrolysis, as it occurs at the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, producing a final product with lower mass but a higher heating value. An important aspect of research is to establish a degree of torrefaction at which gains in heating value offset the loss of mass. There is a lack of literature on torrefaction reactor designs and on design sheets for estimating the dimensions of a torrefier based on capacity. This study includes (a) conducting a detailed review of the torrefaction of biomass in terms of understanding the process, product properties, off-gas compositions, and methods used, and (b) designing a moving bed torrefier, taking into account basic fundamental heat and mass transfer calculations. Specific objectives include calculating dimensions such as the diameter and height of the moving packed bed for different capacities, designing the heat loads and gas flow rates, and developing an interactive Excel sheet in which the user can define design specifications. In this report, capacities of 25-1000 kg/hr are used in the design equations for the torrefier, with examples of calculations and specifications.
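    The sizing step described above can be sketched as: bed holdup from capacity and residence time, bed volume from bulk density, then diameter and height from an assumed aspect ratio. All parameter values below are illustrative assumptions, not the report's numbers:

```python
import math

# Sketch of moving-bed torrefier sizing. Bulk density, residence time,
# and aspect ratio are illustrative assumptions, not the report's values.
def size_moving_bed(capacity_kg_hr, bulk_density_kg_m3=250.0,
                    residence_time_min=30.0, height_to_diameter=3.0):
    holdup_kg = capacity_kg_hr * residence_time_min / 60.0  # mass in bed
    volume_m3 = holdup_kg / bulk_density_kg_m3
    # V = (pi/4) * D^2 * H with H = ratio * D  =>  D = (4V/(pi*ratio))^(1/3)
    diameter = (4.0 * volume_m3 / (math.pi * height_to_diameter)) ** (1.0/3.0)
    return diameter, height_to_diameter * diameter

d, h = size_moving_bed(1000.0)   # upper end of the 25-1000 kg/hr range
print(f"D = {d:.2f} m, H = {h:.2f} m")
```

    This is the kind of calculation an interactive design sheet wraps: the user supplies capacity and material properties, and the geometry follows.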

  9. Biomass Torrefaction Process Review and Moving Bed Torrefaction System Model Development

    SciTech Connect

    Jaya Shakar Tumuluru; Shahab Sokhansanj; Christopher T. Wright

    2010-08-01

    Torrefaction is currently developing as an important preprocessing step to improve the quality of biomass in terms of physical properties and proximate and ultimate composition. Torrefaction is a slow heating of biomass in an inert or reduced environment to a maximum temperature of 300°C. It can also be defined as a group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in a temperature range between 200–230°C and 270–280°C. Thus, the process can also be called mild pyrolysis, as it occurs at the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, producing a final product with lower mass but a higher heating value. An important aspect of research is to establish a degree of torrefaction at which gains in heating value offset the loss of mass. There is a lack of literature on torrefaction reactor designs and on design sheets for estimating the dimensions of a torrefier based on capacity. This study includes (a) conducting a detailed review of the torrefaction of biomass in terms of understanding the process, product properties, off-gas compositions, and methods used, and (b) designing a moving bed torrefier, taking into account basic fundamental heat and mass transfer calculations. Specific objectives include calculating dimensions such as the diameter and height of the moving packed bed for different capacities, designing the heat loads and gas flow rates, and developing an interactive Excel sheet in which the user can define design specifications. In this report, capacities of 25–1000 kg/hr are used in the design equations for the torrefier, with examples of calculations and specifications.

  10. Development of a Digital Signal Processing System for the X-ray Microcalorimeter onboard ASTRO-H

    NASA Astrophysics Data System (ADS)

    Seta, Hiromi; Tashiro, Makoto S.; Terada, Yukikatsu; Shimoda, Yuya; Onda, Kaori; Ishisaki, Yoshitaka; Tsujimoto, Masahiro; Hagihara, Toshishige; Takei, Yoh; Mitsuda, Kazuhisa; Boyce, Kevin R.; Szymkowiak, Andrew E.

    2009-12-01

    A digital signal processing system for the X-ray microcalorimeter array (SXS) is being developed for the next Japanese X-ray astronomy satellite, ASTRO-H. The SXS digital signal processing system evaluates each pulse by an optimal filtering process. For the ASTRO-H project, we decided to employ digital electronics hardware that includes a digital I/O board based on FPGAs and a separate CPU board. It is crucially important for the FPGA to be able to detect the presence of ``secondary'' pulses on the tail of an initial pulse. In order to detect these contaminating pulses, we have developed a new finite impulse response filter that compensates for the undershoot in the derivative. By employing this filter, the FPGA can detect a secondary pulse very close to the first pulse and reduce the load on the CPU in the secondary-pulse search process.
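    The underlying idea of such a compensation filter is to flatten the exponential tail of a pulse so that a second pulse riding on it produces a clean trigger. A minimal two-tap tail-cancellation sketch of that idea, with an illustrative decay constant and threshold (this is an analogous illustration, not the flight FIR filter itself):

```python
import numpy as np

# Sketch: a two-tap filter y[n] = x[n] - r*x[n-1] cancels an exponential
# tail A*r^n, so overlapping pulses become isolated spikes that a simple
# threshold can detect. Decay constant and threshold are illustrative.
def deconvolve_tail(x, decay):
    r = np.exp(-1.0 / decay)
    y = np.empty_like(x)
    y[0] = x[0]
    y[1:] = x[1:] - r * x[:-1]
    return y

n = np.arange(200)
tau = 25.0
# Second pulse (40% amplitude) starts on the tail of the first, 30 samples later.
x = np.where(n >= 20, np.exp(-(n - 20) / tau), 0.0) \
  + np.where(n >= 50, 0.4 * np.exp(-(n - 50) / tau), 0.0)
y = deconvolve_tail(x, tau)
triggers = np.flatnonzero(y > 0.1)
print(triggers)  # both pulses resolved, at samples 20 and 50
```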

  11. Development of a prototype spatial information processing system for hydrologic research

    NASA Technical Reports Server (NTRS)

    Sircar, Jayanta K.

    1991-01-01

    Significant advances have been made in the last decade in the areas of Geographic Information Systems (GIS) and spatial analysis technology, both in hardware and software. Science user requirements are so problem-specific that currently no single system can satisfy all of the needs. The work presented here forms part of a conceptual framework for an all-encompassing science-user workstation system. While definition and development of the system as a whole will take several years, it is intended that small-scale projects such as the current work will address some of the more short-term needs. Such projects provide a quick mechanism to integrate tools into the workstation environment, forming a larger, more complete hydrologic analysis platform. Described here are two components that are very important to the practical use of remote sensing and digital map data in hydrology: a graph-theoretic technique to rasterize elevation contour maps, and a system to manipulate synthetic aperture radar (SAR) data files and extract soil moisture data.

  12. Development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements

    NASA Technical Reports Server (NTRS)

    Rey, Charles A.

    1991-01-01

    The development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements are discussed. Efforts were directed towards the following task areas: design and development of a High Temperature Acoustic Levitator (HAL) for containerless processing and property measurements at high temperatures; testing of the HAL module to establish this technology for use as a positioning device for microgravity uses; construction and evaluation of a brassboard hot wall Acoustic Levitation Furnace; construction and evaluation of a noncontact temperature measurement (NCTM) system based on AGEMA thermal imaging camera; construction of a prototype Division of Amplitude Polarimetric Pyrometer for NCTM of levitated specimens; evaluation of and recommendations for techniques to control contamination in containerless materials processing chambers; and evaluation of techniques for heating specimens to high temperatures for containerless materials experimentation.

  13. Personal Authority in the Family System: Development of a Questionnaire to Measure Personal Authority in Intergenerational Family Processes.

    ERIC Educational Resources Information Center

    Bray, James H.; And Others

    1984-01-01

    Reports a series of studies in the development of the Personal Authority in the Family System (PAFS) questionnaire, designed to measure family processes based on aspects of current intergenerational family theory. Results indicated that the scales have good internal consistency and test-retest reliability, and supported construct validity. (JAC)

  14. 77 FR 50724 - Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ...The U.S. Nuclear Regulatory Commission (NRC or the Commission) is issuing for public comment draft regulatory guide (DG), DG-1210, ``Developing Software Life Cycle Processes for Digital Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1210 is proposed Revision 1 of RG 1.173, dated September 1997. This revision endorses, with clarifications, the enhanced consensus......

  15. A method to develop mission critical data processing systems for satellite based instruments. The spinning mode case.

    NASA Astrophysics Data System (ADS)

    Lazzarotto, Francesco; Fabiani, Sergio; Costa, Enrico; di Persio, Giuseppe; Del Monte, Ettore; Donnarumma, Immacolata; Evangelista, Yuri; Feroci, Marco; Pacciani, Luigi; Rubini, Alda; Soffitta, Paolo

    Modern satellite-based experiments are often very complex real-time systems, composed of flight and ground segments, with challenging resource-related constraints in terms of size, weight, power, real-time response requirements, fault tolerance, and specialized input/output hardware and software; they must also be certified to high levels of assurance. Hardware-software data processing systems have to be responsive to system degradation and to changes in the data acquisition modes, and actions have to be taken to change the organization of the mission operations. A substantial research and development effort by a team of scientists and technologists can produce software systems able to optimize the hardware to reach very high levels of performance, or to compensate for degraded hardware to maintain satisfactory features. We show real-life examples describing a system able to process the data of an X-ray-detecting satellite-based mission in spinning mode.

  16. Automating the training development process

    NASA Technical Reports Server (NTRS)

    Scott, Carol J.

    1993-01-01

    The Automated Training Development System (ATDS) was developed as a training tool for the JPL training environment. ATDS is based on the standard for military training programs and is designed to develop training from a system perspective, focusing on components in terms of the whole process. The principal feature of ATDS is database maintainability: everything is contained and maintained within the database and, if properly developed, it could be a training component of a software delivery, provided to configuration management (CM) as a controlled item. The analysis, development, design, presentation, and reporting phases of the ATDS instructional design method are illustrated.

  17. Development of metallization process

    NASA Technical Reports Server (NTRS)

    Garcia, A., III

    1983-01-01

    A non-lead frit paste is evaluated. A two-step process is discussed in which the bulk of the metallization is Mo/Sn but a small ohmic pad is silver. A new matrix of paste formulations is developed, and a variety of tests are performed on paste samples to determine their electrical, thermal, and structural properties.

  18. Instructional System Development.

    ERIC Educational Resources Information Center

    Department of the Air Force, Washington, DC.

    The manual presents a technology of instructional design and a model for developing and conducting efficient and cost effective Air Force instructional systems. Chapter 1 provides an overview of Instructional System Development (ISD). Chapters 2-6 each focus on a step of the process: analysis of system requirements; definition of…

  19. Electrification of precipitating systems over the Amazon: Physical processes of thunderstorm development

    NASA Astrophysics Data System (ADS)

    Albrecht, Rachel I.; Morales, Carlos A.; Silva Dias, Maria A. F.

    2011-04-01

    This study investigated the physical processes involved in the development of thunderstorms over southwestern Amazonia by hypothesizing causes for the observed cloud-to-ground lightning variability and the local environmental characteristics. Every year, southwestern Amazonia experiences a large variety of environmental factors, such as a gradual increase in atmospheric moisture, extremely high pollution due to biomass burning, and intense deforestation, which directly affect cloud development through differential surface energy partitioning. At the end of the dry period, higher percentages of positive cloud-to-ground (+CG) lightning were observed, due to a relative increase in +CG-dominated thunderstorms (positive thunderstorms). Positive (negative) thunderstorms initiated preferentially over deforested (forest) areas with higher (lower) cloud base heights, shallower (deeper) warm cloud depths, and higher (lower) convective available potential energy. These features characterized the positive (negative) thunderstorms as deeper (relatively shallower) clouds, with stronger (relatively weaker) updrafts and enhanced (decreased) mixed-phase and cold-phase vertically integrated liquid. No significant difference between thunderstorms (negative and positive) and nonthunderstorms was observed in terms of atmospheric pollution, as the atmosphere was overwhelmed by pollution, leading to an updraft-limited regime. In the wet season, however, both negative and positive thunderstorms occurred during periods of relatively higher aerosol concentration and differentiated size distributions, suggesting an aerosol-limited regime in which cloud electrification could depend on the aerosol concentration to suppress the warm phase and enhance the ice phase. The suggested causalities are consistent with the invoked hypotheses, but they are not observed facts; they are hypotheses based on plausible physical mechanisms.

  20. Twenty-four well plate miniature bioreactor system as a scale-down model for cell culture process development.

    PubMed

    Chen, Aaron; Chitta, Rajesh; Chang, David; Amanullah, Ashraf

    2009-01-01

    Increasing the throughput and efficiency of cell culture process development has become increasingly important to rapidly screen and optimize cell culture media and process parameters. This study describes the application of a miniaturized bioreactor system as a scaled-down model for cell culture process development using a CHO cell line expressing a recombinant protein. The microbioreactor system (M24) provides non-invasive online monitoring and control capability for process parameters such as pH, dissolved oxygen (DO), and temperature at the individual well level. A systematic evaluation of the M24 for cell culture process applications was successfully completed. Several challenges were initially identified. These included uneven gas distribution in the wells due to system design and lot-to-lot variability, foaming issues caused by the sparging required for active DO control, and a pH control limitation under conditions of minimal dissolved CO2. A high degree of variability was found, which was addressed by changes in the system design. The foaming issue was resolved by addition of anti-foam, reduction of the sparge rate, and elimination of DO control. The pH control limitation was overcome by a single manual liquid base addition. Intra-well reproducibility, as indicated by measurements of process parameters, cell growth, metabolite profiles, protein titer, and protein quality, as well as scale-equivalency between the M24 and 2 L bioreactor cultures, was very good. This evaluation has shown the feasibility of utilizing the M24 as a scale-down tool for cell culture application development under industrially relevant process conditions. PMID:18683260

  1. The Scenario-Based Engineering Process (SEP): a user-centered approach for the development of health care systems.

    PubMed

    Harbison, K; Kelly, J; Burnell, L; Silva, J

    1995-01-01

    The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks. PMID:8591321

  2. Volcanic alert system (VAS) developed during the 2011-2014 El Hierro (Canary Islands) volcanic process

    NASA Astrophysics Data System (ADS)

    García, Alicia; Berrocoso, Manuel; Marrero, José M.; Fernández-Ros, Alberto; Prates, Gonçalo; De la Cruz-Reyna, Servando; Ortiz, Ramón

    2014-06-01

    The 2011 volcanic unrest at El Hierro Island illustrated the need for a Volcanic Alert System (VAS) specifically designed for the management of volcanic crises developing after long repose periods. The VAS comprises the monitoring network, the software tools for analysis of the monitoring parameters, the Volcanic Activity Level (VAL) management, and hazard assessment. The VAS presented here focuses on phenomena related to moderate eruptions and on potentially destructive volcano-tectonic earthquakes and landslides. We introduce a set of new data analysis tools aimed at detecting data trend changes, as well as spurious signals related to instrumental failure. When data-trend changes and/or malfunctions are detected, a watchdog is triggered, issuing a watch-out warning (WOW) to the Monitoring Scientific Team (MST). The changes in data patterns are then translated by the MST into a VAL that is easy for scientists, technicians, and decision-makers to use and understand. Although the VAS was designed specifically for the unrest episodes at El Hierro, the methodologies may prove useful at other volcanic systems.
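    The watchdog mechanism described above, which issues a watch-out warning (WOW) when data trends change or instruments malfunction, can be illustrated with a minimal rolling-statistics sketch. This is an assumption-laden illustration: the function names and z-score threshold are invented here, and the actual VAS uses more elaborate trend-change and failure detectors.

    ```python
    from collections import deque

    def make_watchdog(window=50, z_threshold=4.0):
        """Return a checker that flags a watch-out warning (WOW) when a new
        sample deviates strongly from the recent trend. Illustrative only:
        the real VAS uses more elaborate trend-change and failure detectors."""
        history = deque(maxlen=window)

        def check(sample):
            if len(history) < window:
                history.append(sample)      # still establishing a baseline
                return False
            mean = sum(history) / len(history)
            var = sum((x - mean) ** 2 for x in history) / len(history)
            std = var ** 0.5 or 1e-12       # avoid division by zero
            wow = abs(sample - mean) / std > z_threshold
            history.append(sample)
            return wow

        return check

    watchdog = make_watchdog(window=5, z_threshold=3.0)
    flags = [watchdog(x) for x in [1.0, 1.1, 0.9, 1.0, 1.05, 1.02, 9.0]]
    # only the final, anomalous sample trips the warning
    ```

    A real deployment would run one such checker per monitoring channel and route any triggered flag to the MST for interpretation into a VAL.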

  3. An overview of the Software Development Process for the NASA Langley Atmospheric Data Center Archive Next Generation system

    NASA Astrophysics Data System (ADS)

    Piatko, P.; Perez, J.; Kinney, J. B.

    2013-12-01

    The Atmospheric Science Data Center (ASDC) at NASA Langley Research Center is responsible for the archive and distribution of Earth science data in the areas of radiation budget, clouds, aerosols, and tropospheric chemistry. The ASDC has developed and implemented the Archive Next Generation (ANGe) system, a state-of-the-art data ingest, archival, and distribution system to serve the atmospheric sciences data provider and user communities. The ANGe project follows a software development process that covers the full life-cycle of the system, from initial requirements to deployment to production to long-term maintenance of the software. The project uses several tools to support the different stages of the process, such as Subversion for source code control, JIRA for change management, Confluence for documentation and collaboration, and Bamboo for continuous integration. Based on our experience with developing ANGe and other projects at the ASDC, we also provide support for local science projects by setting up Subversion repositories and tools such as Trac, and providing training and support on their use. An overview of the software development process and the tools used to support it will be presented.

  4. The process of development of a prioritization tool for a clinical decision support build within a computerized provider order entry system: Experiences from St Luke's Health System.

    PubMed

    Wolf, Matthew; Miller, Suzanne; DeJong, Doug; House, John A; Dirks, Carl; Beasley, Brent

    2016-09-01

    To establish a process for the development of a prioritization tool for a clinical decision support build within a computerized provider order entry system and concurrently to prioritize alerts for Saint Luke's Health System. The process of prioritizing clinical decision support alerts included (a) consensus sessions to establish a prioritization process and identify clinical decision support alerts through a modified Delphi process and (b) a clinical decision support survey to validate the results. All members of our health system's physician quality organization, Saint Luke's Care as well as clinicians, administrators, and pharmacy staff throughout Saint Luke's Health System, were invited to participate in this confidential survey. The consensus sessions yielded a prioritization process through alert contextualization and associated Likert-type scales. Utilizing this process, the clinical decision support survey polled the opinions of 850 clinicians with a 64.7 percent response rate. Three of the top rated alerts were approved for the pre-implementation build at Saint Luke's Health System: Acute Myocardial Infarction Core Measure Sets, Deep Vein Thrombosis Prophylaxis within 4 h, and Criteria for Sepsis. This study establishes a process for developing a prioritization tool for a clinical decision support build within a computerized provider order entry system that may be applicable to similar institutions. PMID:25814483
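    The aggregation step behind such a prioritization, ranking candidate alerts by their mean score on the associated Likert-type scales, can be sketched as follows. The helper function and the sample ratings are hypothetical illustrations, not the actual tool or survey data from Saint Luke's Health System.

    ```python
    def prioritize_alerts(ratings):
        """Rank candidate clinical decision support alerts by mean Likert score.
        `ratings` maps alert name -> list of 1-5 Likert responses. Hypothetical
        helper illustrating the aggregation step, not the actual tool."""
        scored = {alert: sum(r) / len(r) for alert, r in ratings.items()}
        return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

    # Illustrative ratings, not the actual survey data
    survey = {
        "AMI Core Measure Sets": [5, 5, 4, 5],
        "DVT Prophylaxis within 4 h": [4, 5, 4, 4],
        "Criteria for Sepsis": [5, 4, 4, 4],
        "Low-priority alert": [2, 3, 2, 2],
    }
    ranked = prioritize_alerts(survey)
    top_three = [name for name, _ in ranked[:3]]
    ```

    In the study, a comparable ranking over 850 clinician responses surfaced the three top-rated alerts approved for the pre-implementation build.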

  5. Low cost solar array project production process and equipment task: A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Several major modifications were made to the design presented at the PDR. The frame was deleted in favor of a "frameless" design which will provide a substantially improved cell packing factor. Potential shaded-cell damage resulting from operation into a short circuit can be eliminated by a change in the cell series/parallel electrical interconnect configuration. The baseline process sequence defined for the MEPSDU was refined, and equipment design and specification work was completed. SAMICS cost analysis work accelerated: Format A's were prepared and computer simulations completed. Design work on the automated cell interconnect station focused on bond technique selection experiments.

  6. Development of a next-generation automated DICOM processing system in a PACS-less research environment.

    PubMed

    Ziegler, Scott E

    2012-10-01

    The use of clinical imaging modalities within the pharmaceutical research space provides value and challenges. Typical clinical settings will utilize a Picture Archive and Communication System (PACS) to transmit and manage Digital Imaging and Communications in Medicine (DICOM) images generated by clinical imaging systems. However, a PACS is complex and provides many features that are not required within a research setting, making it difficult to generate a business case and determine the return on investment. We have developed a next-generation DICOM processing system using open-source software, commodity server hardware such as Apple Xserve®, high-performance network-attached storage (NAS), and in-house-developed preprocessing programs. DICOM-transmitted files are arranged in a flat file folder hierarchy easily accessible via our downstream analysis tools and a standard file browser. This next-generation system had a minimal construction cost due to the reuse of all the components from our first-generation system with the addition of a second server for a few thousand dollars. Performance metrics were gathered and the system was found to be highly scalable, performed significantly better than the first-generation system, is modular, has satisfactory image integrity, and is easier to maintain than the first-generation system. The resulting system is also portable across platforms and utilizes minimal hardware resources, allowing for easier upgrades and migration to smaller form factors at the hardware end-of-life. This system has been in production successfully for 8 months and services five clinical instruments and three pre-clinical instruments. This system has provided us with the necessary DICOM C-Store functionality, eliminating the need for a clinical PACS for day-to-day image processing. PMID:22546983

  7. Silicon Web Process Development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Hopkins, R. H.; Mchugh, J. P.; Hill, F. E.; Heimlich, M. E.; Driggers, J. M.

    1978-01-01

    Progress in the development of techniques to grow silicon web at a 25 sq cm/min output rate is reported. The feasibility of web growth with simultaneous melt replenishment is discussed. Other factors covered include: (1) tests of aftertrimmers to improve web width; (2) evaluation of growth lid designs to raise speed and output rate; (3) tests of melt replenishment hardware; and (4) investigation of directed gas flow systems to control unwanted oxide deposition in the system and to improve convective cooling of the web. Compatibility with sufficient solar cell performance is emphasized.

  8. Develop Recovery Systems for Separations of Salts from Process Streams for use in Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Colon, Guillermo

    1998-01-01

    The main objectives of this project were the development of a four-compartment electrolytic cell using highly selective membranes to remove nitrate from crop residue leachate and convert it to nitric acid, and the development of a six-compartment electrodialysis cell to selectively remove sodium from urine wastes. The recovery of nutrients from both plant inedible biomass and human wastes to sustain a biomass production system is an important aspect of developing a controlled ecological life support system (CELSS) to provide the basic human needs required for life support during long-term space missions. A four-compartment electrolytic cell has been proposed to selectively remove nitrate from crop residue and to convert it to nitric acid, which is currently used in the NASA-KSC Controlled Ecological Life Support System to control the pH of the aerobic bioreactors and the biomass production chamber. Human activities in a closed system require large amounts of air, water, and minerals to sustain life, and also generate wastes. Before human wastes can be used as nutrients, they must be treated to reduce organic content and to remove some minerals that have adverse effects on plant growth. Of all the minerals present in human urine, sodium chloride (NaCl) is the only one that cannot be used as a nutrient for most plants. Human activities also require sodium chloride as part of the diet. Therefore, technology to remove and recover sodium chloride from wastes is highly desirable. A six-compartment electrodialysis cell using highly selective membranes has been proposed to remove and recover NaCl from human urine.

  9. Development Status of a CVD System to Deposit Tungsten onto UO2 Powder via the WCl6 Process

    NASA Technical Reports Server (NTRS)

    Mireles, O. R.; Kimberlin, A.; Broadway, J.; Hickman, R.

    2014-01-01

    Nuclear Thermal Propulsion (NTP) is under development for deep space exploration. NTP's high specific impulse (>850 seconds) enables a large range of destinations, shorter trip durations, and improved reliability. W-60vol%UO2 CERMET fuel development efforts emphasize fabrication, performance testing, and process optimization to meet service life requirements. Fuel elements must be able to survive operation in excess of 2850 K and exposure to flowing hydrogen (H2), vibration, acoustic, and radiation conditions. The CTE mismatch between W and UO2 results in high thermal stresses that can lead to mechanical failure, and fuel is lost as a result of UO2 reduction by hot hydrogen (H2) [1]. Improved powder metallurgy fabrication process control and mitigated fuel loss can be attained by coating the UO2 starting powders with a layer of high-density tungsten [2]. This paper discusses the advances of a fluidized bed chemical vapor deposition (CVD) system that utilizes the H2-WCl6 reduction process.

  10. Data Processing System (DPS) software with experimental design, statistical analysis and data mining developed for use in entomological research.

    PubMed

    Tang, Qi-Yi; Zhang, Chuan-Xi

    2013-04-01

    A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. PMID:23955865

  11. Evaluating and Understanding Parameterized Convective Processes and Their Role in the Development of Mesoscale Precipitation Systems

    NASA Technical Reports Server (NTRS)

    Fritsch, J. Michael (Principal Investigator); Kain, John S.

    1995-01-01

    Research efforts during the first year focused on numerical simulations of two convective systems with the Penn State/NCAR mesoscale model. The first of these systems was tropical cyclone Irma, which occurred in 1987 in Australia's Gulf of Carpentaria during the AMEX field program. Comparison simulations of this system were done with two different convective parameterization schemes (CPS's), the Kain-Fritsch (1993 - KF) and the Betts-Miller (Betts 1986- BM) schemes. The second system was the June 10-11 1985 squall line simulation, which occurred over the Kansas-Oklahoma region during the PRE-STORM experiment. Simulations of this system using the KF scheme were examined in detail.

  12. Evaluating and Understanding Parameterized Convective Processes and Their Role in the Development of Mesoscale Precipitation Systems

    NASA Technical Reports Server (NTRS)

    Fritsch, J. Michael; Kain, John S.

    1996-01-01

    Research efforts focused on numerical simulations of two convective systems with the Penn State/NCAR mesoscale model. The first of these systems was tropical cyclone Irma, which occurred in 1987 in Australia's Gulf of Carpentaria during the AMEX field program. Comparison simulations of this system were done with two different convective parameterization schemes (CPS's), the Kain-Fritsch (KF) and the Betts-Miller (BM) schemes. The second system was the June 10-11, 1985 squall line simulation, which occurred over the Kansas-Oklahoma region during the PRE-STORM experiment. Simulations of this system using the KF scheme were examined in detail.

  13. Development and testing of a wet oxidation waste processing system. [for waste treatment aboard manned spacecraft

    NASA Technical Reports Server (NTRS)

    Weitzmann, A. L.

    1977-01-01

    The wet oxidation process is considered as a potential treatment method for wastes aboard manned spacecraft for these reasons: (1) Fecal and urine wastes are processed to sterile water and CO2 gas. However, the water requires post-treatment to remove salts and odor; (2) the residual ash is negligible in quantity, sterile and easily collected; and (3) the product CO2 gas can be processed through a reduction step to aid in material balance if needed. Reaction of waste materials with oxygen at elevated temperature and pressure also produces some nitrous oxide, as well as trace amounts of a few other gases.

  14. Signal processing development

    NASA Astrophysics Data System (ADS)

    Barrett, T. B.; Marshall, R.; Bloom, J.; Comer, C.; Caulfield, J.; Warde, C.; Salour, M.

    1989-11-01

    An electron microscope has been applied to the study of the growth of thin epitaxial films on silicon substrates. The nature of platinum-silicide films formed by heating evaporated platinum films on these substrates is discussed. The use of ultra-high-vacuum systems together with a residual gas analyzer (RGA) is discussed as it relates to the preparation of silicides; a dielectric layer of silicon monoxide is evaporated, and an ion beam implanter is used to form a special buried layer as a step toward silicon devices. Synthesis and single-crystal growth of indium phosphide in a one-step in-situ process at high ambient pressures is discussed. An analysis of heat transfer by convection, conduction, and radiation in a closed pressure vessel is given. A set of source modules and NOS procedures has been prepared to permit easy access to a 3-dimensional, non-isotropic ray-tracing program (the Jones-Stephenson program). This system is designed to be run on a CDC CYBER computer system or equivalent under the NOS operating system.

  15. Developing a Microcomputer-Based Decision Support System: People and Process.

    ERIC Educational Resources Information Center

    Starratt, Joseph; And Others

    1990-01-01

    Discusses the need for management information and decision support systems in libraries, and identifies inertia and confusion as the main contributors to the lack of successful implementations. An attempt to initiate a decision support system at the University of Nebraska at Omaha is described, and both problems encountered and benefits gained are…

  16. Clementine Sensor Processing System

    NASA Technical Reports Server (NTRS)

    Feldstein, A. A.

    1993-01-01

    The design of the DSPSE Satellite Controller (DSC) is baselined as a single-string satellite controller. The DSC performs two main functions: health and maintenance of the spacecraft; and image capture, storage, and playback. The DSC contains two processors: a radiation-hardened Mil-Std-1750, and a commercial R3000. The Mil-Std-1750 processor performs all housekeeping operations, while the R3000 is mainly used to perform the image processing functions associated with the navigation functions, as well as performing various experiments. The DSC also contains a data handling unit (DHU) used to interface to various spacecraft imaging sensors and to capture, compress, and store selected images onto the solid-state data recorder. The development of the DSC evolved from several key requirements; the DSPSE satellite was to do the following: (1) have a radiation-hardened spacecraft control system and be immune to single-event upsets (SEU's); (2) use an R3000-based processor to run the star tracker software that was developed by SDIO (due to schedule and cost constraints, there was no time to port the software to a radiation-hardened processor); and (3) fly a commercial processor to verify its suitability for use in a space environment. In order to enhance the DSC reliability, the system was designed with multiple processing paths. These multiple processing paths provide for greater tolerance to various component failures. The DSC was designed so that all housekeeping processing functions are performed by either the Mil-Std-1750 processor or the R3000 processor. The image capture and storage is performed either by the DHU or the R3000 processor.

  17. An interactive image processing system.

    PubMed

    Troxel, D E

    1981-01-01

    A multiuser multiprocessing image processing system has been developed. It is an interactive picture manipulation and enhancement facility which is capable of executing a variety of image processing operations while simultaneously controlling real-time input and output of pictures. It was designed to provide a reliable picture processing system which would be cost-effective in the commercial production environment. Additional goals met by the system include flexibility and ease of operation and modification. PMID:21868923

  18. Development of a strategy for energy efficiency improvement in a Kraft process based on systems interactions analysis

    NASA Astrophysics Data System (ADS)

    Mateos-Espejel, Enrique

    The objective of this thesis is to develop, validate, and apply a unified methodology for the energy efficiency improvement of a Kraft process that addresses globally the interactions of the various process systems that affect its energy performance. An implementation strategy is the final result. An operating Kraft pulping mill situated in Eastern Canada with a production of 700 adt/d of high-grade bleached pulp was the case study. The Pulp and Paper industry is Canada's premier industry. It is characterized by large thermal energy and water consumption. Rising energy costs and more stringent environmental regulations have led the industry to refocus its efforts toward identifying ways to improve energy and water conservation. Energy and water aspects are usually analyzed independently, but in reality they are strongly interconnected. Therefore, there is a need for an integrated methodology, which considers energy and water aspects, as well as the optimal utilization and production of the utilities. The methodology consists of four successive stages. The first stage is the base case definition. The development of a focused, reliable and representative model of an operating process is a prerequisite to the optimization and fine tuning of its energy performance. A four-pronged procedure has been developed: data gathering, master diagram, utilities systems analysis, and simulation. The computer simulation has been focused on the energy and water systems. The second stage corresponds to the benchmarking analysis. The benchmarking of the base case has the objectives of identifying the process inefficiencies and to establish guidelines for the development of effective enhancement measures. The studied process is evaluated by a comparison of its efficiency to the current practice of the industry and by the application of new energy and exergy content indicators. The minimum energy and water requirements of the process are also determined in this step. The third stage is

  19. Development of information-and-control systems as a basis for modernizing the automated process control systems of operating power equipment

    NASA Astrophysics Data System (ADS)

    Shapiro, V. I.; Borisova, E. V.; Chausov, Yu. N.

    2014-03-01

    The main drawbacks inherent in the hardware of outdated control systems of power stations are discussed. It is shown that economically efficient and reliable operation of the process equipment will be impossible if a certain part of these control systems remains in use. It is pointed out that full retrofitting of outdated control systems on operating equipment in a single step, replacing all technical facilities and cable connections with a modern computerized automation system, involves certain difficulties if the work must be carried out with limited financial resources or within a limited period of time. A version of control system modernization is suggested that involves replacement of the most severely worn and outdated equipment (indicating and recording instruments, and local controllers) while retaining the existing cable routes and layout of board facilities. The modernization implies the development of information-and-control systems constructed on the basis of a unified computerized automation system. Software and hardware products that have proven themselves in thermal power engineering are proposed for developing such an automation system. It is demonstrated that the proposed system has considerable potential for functional development and can become a basis for constructing a fully functional automated process control system.

  20. Development of information-measuring channels for a cut-quality monitoring system for the laser cutting of materials

    NASA Astrophysics Data System (ADS)

    Sukhov, Yuri T.; Matiushin, I. V.

    2001-01-01

    This paper presents research on the development of information-measuring channels for monitoring quality indexes of the laser cutting process using acoustic and optical signals in different spectral ranges. Estimates of the information significance of these signals and of the efficiency of their use are given. The structure of an information-measuring research stand is proposed, including the basic measuring channels for determining the process parameters critical to quality and productivity. The research is aimed at the engineering of a system for monitoring and quality control of laser cutting of materials in real-time mode.

  1. High-throughput downstream process development for cell-based products using aqueous two-phase systems.

    PubMed

    Zimmermann, Sarah; Gretzinger, Sarah; Schwab, Marie-Luise; Scheeder, Christian; Zimmermann, Philipp K; Oelmeier, Stefan A; Gottwald, Eric; Bogsnes, Are; Hansson, Mattias; Staby, Arne; Hubbuch, Jürgen

    2016-09-16

    As the clinical development of cell-based therapeutics has evolved immensely in recent years, downstream processing strategies have become more relevant than ever. Aqueous two-phase systems (ATPS) enable the label-free, scalable, and cost-effective separation of cells, making them a promising tool for downstream processing of cell-based therapeutics. Here, we report the development of an automated robotic screening that enables high-throughput cell partitioning analysis in ATPS. We demonstrate that this setup enables fast and systematic investigation of factors influencing cell partitioning. Moreover, we examined and optimized separation conditions for the differentiable promyelocytic cell line HL-60 and used a counter-current distribution model to investigate optimal separation conditions for a multi-stage purification process. Finally, we show that the separation of CD11b-positive and CD11b-negative HL-60 cells is possible after partial DMSO-mediated differentiation towards the granulocytic lineage. The modeling data indicate that complete peak separation is possible with 30 transfers, and >93% of CD11b-positive HL-60 cells can be recovered with >99% purity. The here described screening platform facilitates faster, cheaper, and more directed downstream process development for cell-based therapeutics and presents a powerful tool for translational research. PMID:27567679
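    The counter-current distribution model invoked above can be illustrated with the classical Craig formulation, in which the fraction of a population found in tube k after n transfers follows a binomial distribution. The `p_top` partition fractions below are assumed values chosen for illustration, not the measured HL-60 partition coefficients.

    ```python
    from math import comb

    def ccd_profile(n_transfers, p_top):
        """Craig counter-current distribution: after n transfers, the fraction
        of a population found in tube k is binomial, where p_top is the fraction
        partitioning into the mobile (top) phase at each equilibration step."""
        return [comb(n_transfers, k) * p_top**k * (1 - p_top)**(n_transfers - k)
                for k in range(n_transfers + 1)]

    # Two hypothetical populations with different partition behaviour
    pos = ccd_profile(30, 0.8)   # cells favouring the top phase
    neg = ccd_profile(30, 0.3)   # cells favouring the bottom phase
    peak_pos = max(range(31), key=lambda k: pos[k])
    peak_neg = max(range(31), key=lambda k: neg[k])
    # the two peaks sit far apart, so the populations separate after 30 transfers
    ```

    With sufficiently different partition fractions, 30 transfers place the two population peaks in well-separated tubes, which is the mechanism behind the reported peak separation.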

  2. Scaled Vitrification System III (SVS III) Process Development and Laboratory Tests at the West Valley Demonstration Project

    SciTech Connect

    V. Jain; S. M. Barnes; B. G. Bindi; R. A. Palmer

    2000-04-30

    At the West Valley Demonstration Project (WVDP), the Vitrification Facility (VF) is designed to convert the high-level radioactive waste (HLW) stored on the site to a stable glass for disposal at a Department of Energy (DOE)-specified federal repository. The Scaled Vitrification System III (SVS-III) verification tests were conducted between February 1995 and August 1995 as a supplemental means to support the vitrification process flowsheet, but at only one-seventh the scale. During these tests, the process flowsheet was refined and optimized. The SVS-III test series was conducted with a focus on confirming the applicability of the Redox Forecasting Model, which was based on the Index of Feed Oxidation (IFO) developed during the Functional and Checkout Testing of Systems (FACTS) and SVS-I tests. Additional goals were to investigate the prototypical feed preparation cycle and test the new target glass composition. Included in this report are the basis and current designs of the major components of the Scaled Vitrification System and the results of the SVS-III tests. The major subsystems described are the feed preparation and delivery, melter, and off-gas treatment systems. In addition, correlations between the melter's operation and its various parameters, which included feed rate, cold cap coverage, oxidation-reduction (redox) state of the glass, melter power, plenum temperature, and airlift analysis, were developed.

  3. Design Process for the Development of a New Truck Monitoring System - 13306

    SciTech Connect

    LeBlanc, P.J.; Bronson, Frazier

    2013-07-01

    Canberra Industries, Inc. has designed a new truck monitoring system for a facility in Japan. The customer desires to separately quantify the Cs-137 and Cs-134 content of truck cargo entering and leaving a Waste Consolidation Area. The content of the trucks will be some combination of sand, soil, and vegetation with densities ranging from 0.3 g/cc - 1.6 g/cc. The typical weight of the trucks will be approximately 10 tons, but can vary between 4 and 20 tons. The system must be sensitive enough to detect 100 Bq/kg in 10 seconds (with less than 10% relative standard deviation) but still have enough dynamic range to measure 1,000,000 Bq/kg material. The system will be operated in an outdoor environment. Starting from these requirements, Canberra explored all aspects of the counting system in order to provide the customer with the optimized solution. The desire to separately quantify Cs-137 and Cs-134 favors the use of a spectroscopic system as a solution. Using the In Situ Object Counting System (ISOCS) mathematical efficiency calculation tool, we explored various detector types, number, and physical arrangement for maximum performance. Given the choice of detector, the ISOCS software was used to investigate which geometric parameters (fill height, material density, etc.) caused the most fluctuations in the efficiency results. Furthermore, these variations were used to obtain quantitative estimates of the uncertainties associated with the possible physical variations in the truck size, detector positioning, and material composition, density, and fill height. Various shielding options were also explored to ensure that any measured Cs content would be from the truck and not from the surrounding area. The details of the various calculations along with the final design are given. (authors)
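    The stated sensitivity requirement (<10% relative standard deviation in a 10-second count) can be related to Poisson counting statistics with a back-of-the-envelope sketch. This is a simplification that ignores background subtraction and detector efficiency, and the helper names are illustrative.

    ```python
    from math import sqrt

    def rsd_from_counts(net_counts):
        """Relative standard deviation of a Poisson counting measurement:
        sigma/N = sqrt(N)/N = 1/sqrt(N) for a background-free net count."""
        return 1.0 / sqrt(net_counts)

    def counts_needed(rsd_target):
        """Minimum net counts such that 1/sqrt(N) <= rsd_target."""
        return (1.0 / rsd_target) ** 2

    # Meeting <10% RSD in a 10 s count requires at least 100 net counts,
    # i.e. a net count rate of about 10 counts per second at the
    # 100 Bq/kg sensitivity limit.
    min_counts = counts_needed(0.10)
    ```

    In practice the efficiency modeling (here done with ISOCS) determines how many counts per second a given Cs activity produces, and the counting-statistics bound then sets the achievable precision in the 10-second window.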

  4. Review on Biomass Torrefaction Process and Product Properties and Design of Moving Bed Torrefaction System Model Development

    SciTech Connect

    Jaya Shankar Tumuluru; Christopher T. Wright; Shahab Sokhansanj

    2011-08-01

    A Review on Torrefaction Process and Design of Moving Bed Torrefaction System for Biomass Processing Jaya Shankar Tumuluru1, Shahab Sokhansanj2 and Christopher T. Wright1 Idaho National Laboratory Biofuels and Renewable Energy Technologies Department Idaho Falls, Idaho 83415 Oak Ridge National Laboratory Bioenergy Resource and Engineering Systems Group Oak Ridge, TN 37831 Abstract Torrefaction is currently developing as an important preprocessing step to improve the quality of biomass in terms of physical properties, and proximate and ultimate composition. Torrefaction is a slow heating of biomass in an inert or reduced environment to a maximum temperature of 300 C. Torrefaction can also be defined as a group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in a temperature range of 200-230 C and 270-280 C. Thus, the process can also be called a mild pyrolysis as it occurs at the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, which produces a final product that will have a lower mass but a higher heating value. There is a lack of literature on the design aspects of torrefaction reactors and of design sheets for estimating the dimensions of the torrefier based on capacity. This study includes (a) conducting a detailed review on the torrefaction of biomass in terms of understanding the process, product properties, off-gas compositions, and methods used, and (b) designing a moving bed torrefier, taking into account the basic fundamental heat and mass transfer calculations.
Specific objectives include calculating the dimensions like diameter and height of the moving packed bed torrefier for different capacities ranging from 25-1000 kg/hr, designing the heat loads and gas flow rates, and
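The sizing objective above — diameter and height of a moving packed bed for a given capacity — can be sketched with a simple plug-flow holdup balance: bed volume equals mass flow times residence time divided by bulk density. All numbers here (bulk density, residence time, height-to-diameter ratio) are illustrative assumptions, not values from the report:

```python
import math

def torrefier_dimensions(capacity_kg_hr, bulk_density_kg_m3,
                         residence_time_min, height_to_diameter=3.0):
    """Size a cylindrical moving packed bed from a holdup balance:
    bed volume = mass flow * residence time / bulk density."""
    volume_m3 = (capacity_kg_hr / 60.0) * residence_time_min / bulk_density_kg_m3
    # V = (pi/4) * D^2 * H with H = r * D  ->  D = (4V / (pi * r))^(1/3)
    diameter_m = (4.0 * volume_m3 / (math.pi * height_to_diameter)) ** (1.0 / 3.0)
    return diameter_m, height_to_diameter * diameter_m

# Hypothetical case: 1000 kg/hr, 250 kg/m^3 bulk density, 30 min residence time
d, h = torrefier_dimensions(1000.0, 250.0, 30.0)
print(f"D = {d:.2f} m, H = {h:.2f} m")
```

The real design in the report also balances heat loads and gas flow rates; this sketch covers only the geometric holdup step.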

  5. EDS coal liquefaction process development. Phase V. EDS Consolidation Program: flushing and blowdown system design

    SciTech Connect

    1984-01-01

    The flushing and blowdown system of an EDS plant provides the means of removing viscous coal products and slurry streams from plant vessels and lines. In addition, it provides the flushing oil needed during normal operations for purging instruments in slurry service, for flushing slurry pump and slurry agitator seals, and for flushing slurry safety valve inlet lines. It contains a blowdown system for collecting material from washing operations, including the transport of the collected material to slop tankage. The rerun options for depleting the inventory of collected slop are a related aspect of the flushing and blowdown system design although specific equipment for handling slop is not part of the flushing and blowdown system facilities. This report documents the results of a study which evaluates the flushing and blowdown requirements for a commercial-scale EDS plant. The work was conducted as part of the EDS Consolidation Program. The design recommendations represent a consolidation of learnings accrued during previous phases of the EDS Project including results obtained from ECLP operations, from the ECLP Test Program, and from past EDS Study Design preparations. 1 reference, 4 figures, 2 tables.

  6. Silicon web process development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hill, F. E.; Skutch, M. E.; Driggers, J. M.; Hopkins, R. H.

    1980-01-01

    A barrier crucible design which consistently maintains melt stability over long periods of time was successfully tested and used in long growth runs. The pellet feeder for melt replenishment was operated continuously for growth runs of up to 17 hours. The liquid level sensor, comprising a laser/sensor system, was operated, performed well, and meets the requirements for maintaining liquid level height during growth and melt replenishment. An automated feedback loop connecting the feed mechanism and the liquid level sensing system was designed, constructed, and operated successfully for 3.5 hours, demonstrating the feasibility of semi-automated dendritic web growth. The sensitivity of sheet cost to variations in capital equipment cost and dendrite recycling was calculated, and it was shown that these factors have relatively little impact on sheet cost. Dendrites from web which had gone all the way through the solar cell fabrication process, when melted and grown into web, produce crystals which show no degradation in cell efficiency. Material quality remains high, and cells made from web grown at the start, middle, and end of a run from a replenished melt show comparable efficiencies.

  7. Development of Neural Systems for Processing Social Exclusion from Childhood to Adolescence

    ERIC Educational Resources Information Center

    Bolling, Danielle Z.; Pitskel, Naomi B.; Deen, Ben; Crowley, Michael J.; Mayes, Linda C.; Pelphrey, Kevin A.

    2011-01-01

    Adolescence is a period of development in which peer relationships become especially important. A computer-based game (Cyberball) has been used to explore the effects of social exclusion in adolescents and adults. The current functional magnetic resonance imaging (fMRI) study used Cyberball to extend prior work to the cross-sectional study of…

  8. Spitzer Telemetry Processing System

    NASA Technical Reports Server (NTRS)

    Stanboli, Alice; Martinez, Elmain M.; McAuley, James M.

    2013-01-01

    The Spitzer Telemetry Processing System (SirtfTlmProc) was designed to address objectives of JPL's Multi-mission Image Processing Lab (MIPL) in processing spacecraft telemetry and distributing the resulting data to the science community. To minimize costs and maximize operability, the software design focused on automated error recovery, performance, and information management. The system processes telemetry from the Spitzer spacecraft and delivers Level 0 products to the Spitzer Science Center. SirtfTlmProc is a unique system with automated error notification and recovery, with a real-time continuous service that can go quiescent after periods of inactivity. The software can process 2 GB of telemetry and deliver Level 0 science products to the end user in four hours. It provides analysis tools so the operator can manage the system and troubleshoot problems. It automates telemetry processing in order to reduce staffing costs.

  9. Volcanic Alert System (VAS) developed during the (2011-2013) El Hierro (Canary Islands) volcanic process

    NASA Astrophysics Data System (ADS)

    Ortiz, Ramon; Berrocoso, Manuel; Marrero, Jose Manuel; Fernandez-Ros, Alberto; Prates, Gonçalo; De la Cruz-Reyna, Servando; Garcia, Alicia

    2014-05-01

    In volcanic areas with long repose periods (such as El Hierro), recently installed monitoring networks offer no instrumental record of past eruptions and no experience in handling a volcanic crisis. Both conditions, uncertainty and inexperience, make the communication of hazard more difficult. In fact, in the initial phases of the unrest at El Hierro, the perception of volcanic risk was somewhat distorted, as even relatively low volcanic hazards caused a high political impact. The need for a Volcanic Alert System then became evident. In general, a Volcanic Alert System comprises the monitoring network, the software tools for the analysis of the observables, the management of the Volcanic Activity Level, and the assessment of the threat. The Volcanic Alert System presented here places special emphasis on phenomena associated with moderate eruptions, as well as on volcano-tectonic earthquakes and landslides, which in some cases, as in El Hierro, may be more destructive than an eruption itself. As part of the Volcanic Alert System, we introduce the Volcanic Activity Level, which continuously applies a routine analysis of monitoring data (particularly seismic and deformation data) to detect data trend changes or monitoring network failures. The data trend changes are quantified according to the Failure Forecast Method (FFM). When data changes and/or malfunctions are detected by an automated watchdog, warnings are automatically issued to the Monitoring Scientific Team. Changes in the data patterns are then translated by the Monitoring Scientific Team into a simple Volcanic Activity Level that is easy to use and understand by the scientists and technicians in charge of the technical management of the unrest. The main features of the Volcanic Activity Level are its objectivity, as it does not depend on expert opinions, which are left to the Scientific Committee, and its capability for early detection of precursors. As a consequence of the El Hierro
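The Failure Forecast Method cited in the abstract is commonly applied in its inverse-rate form: the reciprocal of an accelerating precursor rate (e.g. seismic event rate) tends to decrease linearly in time and reaches zero at the forecast failure or eruption time. A minimal sketch under that standard assumption, using synthetic data rather than El Hierro observations:

```python
import numpy as np

def ffm_failure_time(t, rate):
    """Inverse-rate Failure Forecast Method: fit a line to 1/rate vs. time;
    its zero crossing estimates the failure (eruption) time t_f."""
    inv_rate = 1.0 / np.asarray(rate, dtype=float)
    slope, intercept = np.polyfit(np.asarray(t, dtype=float), inv_rate, 1)
    return -intercept / slope

# Synthetic precursor obeying rate = A / (t_f - t) with t_f = 100 (arbitrary units)
t = np.linspace(0.0, 80.0, 50)
rate = 5.0 / (100.0 - t)
print(ffm_failure_time(t, rate))  # ~100.0
```

In an operational system such as the one described, this extrapolation would be re-run continuously by the automated watchdog as new monitoring data arrive.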

  10. Development of neural systems for processing social exclusion from childhood to adolescence.

    PubMed

    Bolling, Danielle Z; Pitskel, Naomi B; Deen, Ben; Crowley, Michael J; Mayes, Linda C; Pelphrey, Kevin A

    2011-11-01

    Adolescence is a period of development in which peer relationships become especially important. A computer-based game (Cyberball) has been used to explore the effects of social exclusion in adolescents and adults. The current functional magnetic resonance imaging (fMRI) study used Cyberball to extend prior work to the cross-sectional study of younger children and adolescents (7 to 17 years), identifying age-related changes in the neural correlates of social exclusion across the important transition from middle childhood into adolescence. Additionally, a control task illustrated the specificity of these age-related changes for social exclusion as distinct from expectancy violation more generally. During exclusion, activation in and functional connectivity between ventrolateral prefrontal cortex and ventral anterior cingulate cortex increased with age. These effects were specific to social exclusion and did not exist for expectancy violation. Our results illustrate developmental changes from middle childhood through adolescence in both affective and regulatory brain regions during social exclusion. PMID:22010901

  11. A database of wavefront measurements for laser system modeling, optical component development and fabrication process qualification

    SciTech Connect

    Wolfe, C.R.; Lawson, J.K.; Aikens, D.M.; English, R.E.

    1995-04-12

    In the second half of the 1990s, LLNL and others anticipate designing and beginning construction of the National Ignition Facility (NIF). The NIF will be capable of producing the world's first laboratory-scale fusion ignition and burn reaction by imploding a small target. The NIF will utilize approximately 192 simultaneous laser beams for this purpose. The laser will be capable of producing a shaped energy pulse of at least 1.8 million joules (MJ) with peak power of at least 500 trillion watts (TW). In total, the facility will require more than 7,000 large optical components. The performance of a high power laser of this kind can be seriously degraded by the presence of low amplitude, periodic modulations in the surface and transmitted wavefronts of the optics used. At high peak power, these phase modulations can convert into large intensity modulations by non-linear optical processes. This in turn can lead to loss in energy on target via many well known mechanisms. In some cases laser damage to the optics downstream of the source of the phase modulation can occur. The database described here contains wavefront phase maps of early prototype optical components for the NIF. It has only recently become possible to map the wavefront of these large aperture components with high spatial resolution. Modern large aperture static fringe and phase shifting interferometers equipped with large area solid state detectors have made this possible. In a series of measurements with these instruments, a wide spatial bandwidth can be detected in the wavefront.
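Periodic modulations of the kind the abstract warns about are typically located in a measured phase map by Fourier analysis of the wavefront. A minimal 1-D sketch of that idea (the profile, units, and ripple parameters below are hypothetical, not NIF measurement data):

```python
import numpy as np

def dominant_ripple(phase_nm, dx_mm):
    """Locate the strongest periodic component in a 1-D wavefront profile
    via the discrete Fourier amplitude spectrum (DC term excluded)."""
    n = len(phase_nm)
    spec = np.abs(np.fft.rfft(phase_nm - np.mean(phase_nm))) * 2.0 / n
    freqs = np.fft.rfftfreq(n, d=dx_mm)   # spatial frequency, cycles/mm
    k = 1 + np.argmax(spec[1:])           # skip the DC bin
    return 1.0 / freqs[k], spec[k]        # (period in mm, amplitude in nm)

# Hypothetical profile: a 5 nm ripple with a 2 mm period on a 100 mm trace
x = np.arange(0.0, 100.0, 0.1)
period, amp = dominant_ripple(5.0 * np.sin(2 * np.pi * x / 2.0), 0.1)
print(period, amp)  # ~2.0 mm, ~5.0 nm
```

A real analysis of the database's 2-D phase maps would use a 2-D transform and power spectral density binning, but the detection principle is the same.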

  12. Development of Automatic Live Linux Rebuilding System with Flexibility in Science and Engineering Education and Applying to Information Processing Education

    NASA Astrophysics Data System (ADS)

    Sonoda, Jun; Yamaki, Kota

    We have developed an automatic Live Linux rebuilding system for science and engineering education, such as information processing education and numerical analysis. Our system can easily and automatically rebuild a customized Live Linux from an ISO image of Ubuntu, a Linux distribution. It also makes it easy to install or uninstall packages and to enable or disable init daemons. When rebuilding a Live Linux CD with our system, only 8 operations are required, and the rebuilding time is about 33 minutes for the CD version and about 50 minutes for the DVD version. Moreover, we applied the rebuilt Live Linux CD in an information processing class at our college. A questionnaire survey of the 43 students who used the Live Linux CD showed that it was useful for about 80 percent of them. From these results, we conclude that our system can easily and automatically rebuild a useful Live Linux in a short time.

  13. Development & Optimization of Materials and Processes for a Cost Effective Photoelectrochemical Hydrogen Production System. Final report

    SciTech Connect

    McFarland, Eric W

    2011-01-17

    The overall project objective was to apply high throughput experimentation and combinatorial methods together with novel syntheses to discover and optimize efficient, practical, and economically sustainable materials for photoelectrochemical production of bulk hydrogen from water. Automated electrochemical synthesis and photoelectrochemical screening systems were designed and constructed and used to study a variety of new photoelectrocatalytic materials. We evaluated photocatalytic performance in the dark and under illumination, with or without applied bias, in a high-throughput manner and performed detailed evaluations of many materials. Significant attention was given to α-Fe2O3 based semiconductor materials, and thin films with different dopants were synthesized by co-electrodeposition techniques. Approximately 30 dopants including Al, Zn, Cu, Ni, Co, Cr, Mo, Ti, Pt, etc. were investigated. Hematite thin films doped with Al, Ti, Pt, Cr, and Mo exhibited significant improvements in efficiency for photoelectrochemical water splitting compared with undoped hematite. In several cases we collaborated with theorists who used density functional theory to help explain performance trends and suggest new materials. The best materials were investigated in detail by X-ray diffraction (XRD), scanning electron microscopy (SEM), ultraviolet-visible spectroscopy (UV-Vis), and X-ray photoelectron spectroscopy (XPS). The photoelectrocatalytic performance of the thin films was evaluated and their incident photon

  14. Fabrication process development of SiC/superalloy composite sheet for exhaust system components

    NASA Technical Reports Server (NTRS)

    Cornie, J. A.; Cook, C. S.; Anderson, C. A.

    1976-01-01

    A chemical compatibility study was conducted between SiC filament and the following P/M matrix alloys: Waspaloy, Hastelloy-X, NiCrAlY, Ha-188, S-57, FeCrAlY, and Incoloy 800. None of the couples demonstrated sufficient chemical compatibility to withstand the minimum HIP consolidation temperatures (996 C) or intended application temperature of the composite (982 C). However, Waspaloy, Haynes 188, and Hastelloy-X were the least reactive with SiC of the candidate alloys. Chemical vapor deposited tungsten was shown to be an effective diffusion barrier between the superalloy matrix and SiC filament providing a defect-free coating of sufficient thickness. However, the coating breaks down when the tungsten is converted into intermetallic compounds by interdiffusion with matrix constituents. Waspaloy was demonstrated to be the most effective matrix alloy candidate in contact with the CVD tungsten barrier because of its relatively low growth rate constant of the intermediate compound and the lack of formation of Kirkendall voids at the matrix-barrier interface. Fabrication methods were developed for producing panels of uniaxial and angle ply composites utilizing CVD tungsten coated filament.

  15. Emerging structure-function relations in the developing face processing system.

    PubMed

    Suzanne Scherf, K; Thomas, Cibu; Doyle, Jaime; Behrmann, Marlene

    2014-11-01

    To evaluate emerging structure-function relations in a neural circuit that mediates complex behavior, we investigated age-related differences among cortical regions that support face recognition behavior and the fiber tracts through which they transmit and receive signals using functional neuroimaging and diffusion tensor imaging. In a large sample of human participants (aged 6-23 years), we derived the microstructural and volumetric properties of the inferior longitudinal fasciculus (ILF), the inferior fronto-occipital fasciculus, and control tracts, using independently defined anatomical markers. We also determined the functional characteristics of core face- and place-selective regions that are distributed along the trajectory of the pathways of interest. We observed disproportionately large age-related differences in the volume, fractional anisotropy, and mean and radial, but not axial, diffusivities of the ILF. Critically, these differences in the structural properties of the ILF were tightly and specifically linked with an age-related increase in the size of a key face-selective functional region, the fusiform face area. This dynamic association between emerging structural and functional architecture in the developing brain may provide important clues about the mechanisms by which neural circuits become organized and optimized in the human cortex. PMID:23765156

  16. Renovation of Chemical Processing Facility for Development of Advanced Fast Reactor Fuel Cycle System in JNC

    SciTech Connect

    Atsushi Aoshima; Shigehiko Miyachi; Takashi Suganuma; Shinichi Nemoto

    2002-07-01

    The CPF had 4 laboratories (operation room A, laboratory A, laboratory C, and the analysis laboratory) in connection with reprocessing technology. The main laboratory, operation room A, has 5 hot cells. Since the equipment in the main cell had been designed for small-scale verification of existing reprocessing steps, it could hardly respond flexibly to experimental studies on advanced technology. It was therefore decided to remodel the cell according to a newly laid-out design in order to ensure the function and space to conduct various basic tests. The other laboratories had no glove boxes for conducting basic experiments on elements important to advanced reprocessing, such as actinides other than U and Pu, lanthanides, and so on. In order to meet the various requirements of innovative technologies in advanced fuel cycle development, one additional laboratory was established for studies on dry reprocessing, and glove boxes, hoods, and analytical equipment such as NMR, FT-IR, and TI-MS were newly installed in the other laboratories during this renovation. After the renovation, hot tests in the CPF will be resumed from April 2002. (authors)

  17. Emerging Structure–Function Relations in the Developing Face Processing System

    PubMed Central

    Suzanne Scherf, K.; Thomas, Cibu; Doyle, Jaime; Behrmann, Marlene

    2014-01-01

    To evaluate emerging structure–function relations in a neural circuit that mediates complex behavior, we investigated age-related differences among cortical regions that support face recognition behavior and the fiber tracts through which they transmit and receive signals using functional neuroimaging and diffusion tensor imaging. In a large sample of human participants (aged 6–23 years), we derived the microstructural and volumetric properties of the inferior longitudinal fasciculus (ILF), the inferior fronto-occipital fasciculus, and control tracts, using independently defined anatomical markers. We also determined the functional characteristics of core face- and place-selective regions that are distributed along the trajectory of the pathways of interest. We observed disproportionately large age-related differences in the volume, fractional anisotropy, and mean and radial, but not axial, diffusivities of the ILF. Critically, these differences in the structural properties of the ILF were tightly and specifically linked with an age-related increase in the size of a key face-selective functional region, the fusiform face area. This dynamic association between emerging structural and functional architecture in the developing brain may provide important clues about the mechanisms by which neural circuits become organized and optimized in the human cortex. PMID:23765156

  18. Software Engineering Processes Used to Develop the NIF Integrated Computer Control System

    SciTech Connect

    Ludwigsen, A P; Carey, R W; Demaret, R D; Lagin, L J; Reddi, U P; Van Arsdall, P J

    2007-10-03

    We have developed a new target platform to study Laser Plasma Interaction in ignition-relevant conditions at the Omega laser facility (LLE/Rochester)[1]. By shooting an interaction beam along the axis of a gas-filled hohlraum heated by up to 17 kJ of heater beam energy, we were able to create a millimeter-scale underdense uniform plasma at electron temperatures above 3 keV. Extensive Thomson scattering measurements allowed us to benchmark our hydrodynamic simulations performed with HYDRA [1]. As a result of this effort, we can use with much confidence these simulations as input parameters for our LPI simulation code pF3d [2]. In this paper, we show that by using accurate hydrodynamic profiles and full three-dimensional simulations including a realistic modeling of the laser intensity pattern generated by various smoothing options, fluid LPI theory reproduces the SBS thresholds and absolute reflectivity values and the absence of measurable SRS. This good agreement was made possible by the recent increase in computing power routinely available for such simulations.

  19. Industrial process surveillance system

    DOEpatents

    Gross, K.C.; Wegerich, S.W.; Singer, R.M.; Mott, J.E.

    1998-06-09

    A system and method are disclosed for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy. 96 figs.
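The monitoring loop claimed in this patent — learn states of normal operation, use the learned state closest to the current observation as the expected value, and alarm when actual values deviate from expected — can be illustrated with a deliberately simplified nearest-neighbor sketch. This is only an illustration of the idea, not the patented estimator, and all data and thresholds are hypothetical:

```python
import numpy as np

def learn_states(training):
    """Memorize exemplar sensor vectors spanning normal operation."""
    return np.asarray(training, dtype=float)

def expected_value(states, current):
    """Expected vector = learned normal state closest (Euclidean) to the current one."""
    distances = np.linalg.norm(states - current, axis=1)
    return states[np.argmin(distances)]

def check(states, current, threshold):
    """Alarm when the residual between actual and expected exceeds the threshold."""
    residual = np.linalg.norm(current - expected_value(states, current))
    return residual > threshold, residual

# Hypothetical two-sensor process (e.g. flow, temperature) under normal operation
normal = learn_states([[10.0, 50.0], [11.0, 52.0], [12.0, 54.0]])
print(check(normal, np.array([11.1, 52.2]), threshold=1.0))  # no alarm: residual ~0.22
print(check(normal, np.array([20.0, 70.0]), threshold=1.0))  # alarm: residual ~17.9
```

The patent's method additionally builds time correlations and pattern tests on the modeled data before alarming; this sketch shows only the learn/compare/alarm skeleton.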

  20. Industrial Process Surveillance System

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W; Singer, Ralph M.; Mott, Jack E.

    2001-01-30

    A system and method for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy.

  1. Industrial process surveillance system

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W.; Singer, Ralph M.; Mott, Jack E.

    1998-01-01

    A system and method for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy.

  2. Research and Development in the Computer and Information Sciences. Volume 2, Processing, Storage, and Output Requirements in Information Processing Systems: A Selective Literature Review.

    ERIC Educational Resources Information Center

    Stevens, Mary Elizabeth

    Areas of concern with respect to processing, storage, and output requirements of a generalized information processing system are considered. Special emphasis is placed on multiple-access systems. Problems of system management and control are discussed, including hierarchies of storage levels. Facsimile, digital, and mass random access storage…

  3. Development of fast data processing electronics for a stacked x-ray detector system with application as a polarimeter

    NASA Astrophysics Data System (ADS)

    Maier, Daniel; Dick, Jürgen; Distratis, Giuseppe; Kendziorra, Eckhard; Santangelo, Andrea; Schanz, Thomas; Tenzer, Christoph; Warth, Gabriele

    2012-09-01

    We have assembled a stacked setup consisting of a soft and hard X-ray detector with cooling capability and control, readout, and data processing electronics at the Institut für Astronomie und Astrophysik Tübingen (IAAT). The detector system is a 64×64 DePFET matrix in front of a CdTe Caliste module. The detectors were developed at the Max-Planck Institute Semiconductor Laboratory (HLL) in Neuperlach and the Commissariat a l'Energie Atomique (CEA) in Saclay, respectively. In this combined structure the DePFET detector works as Low Energy Detector (LED) while the Caliste module (HED) only detects the high energy photons that have passed through the LED. In this work we present the current status of the setup. Furthermore, an intended application of the detector system as a polarimeter is described.

  4. The role of axonopathy in the mechanisms of development of demyelination processes in the central and peripheral nervous system.

    PubMed

    Merkulov, Yu A; Zavalishin, I A; Merkulova, D M

    2009-01-01

    The role of axonopathy in the development of demyelinating processes in the CNS and peripheral nervous system was addressed in studies of 43 patients with multiple sclerosis (MS) and 144 patients with chronic inflammatory demyelinating polyneuropathy (CIDPN). Patients with MS were found to have foci of reduced MRI intensity in the T1 regime ("black holes," present in 28%) and regional atrophy of the cerebral cortex (in 46%), which showed a significant association with the degree of invalidity on the EDSS (Kendall tau = 0.38 and 0.43; p = 0.038 and 0.021, respectively). The mean fatigue score on the FSS was 4.9 (3.6; 5.4). A significant increase in the central conduction time on the background of fatigue (p = 0.016), along with an absence of signs of impaired reliability of neuromuscular transmission and an absence of past-activation phenomena, suggested that central mechanisms were predominant in the formation of fatigue phenomena in MS. In addition, 34.9% of patients with MS showed signs of peripheral nervous system involvement, while the clinical-electrophysiological pattern in 12.5% of patients with CIDPN showed signs of CNS involvement. These data widen existing concepts of the mechanisms of formation of axonopathy in the CNS, based on evidence for the development of axon-demyelinating processes in CIDPN, which is the most accessible model of demyelination for study using contemporary neurophysiological methods. PMID:19089637

  5. Process evaluation distributed system

    NASA Technical Reports Server (NTRS)

    Moffatt, Christopher L. (Inventor)

    2006-01-01

    The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information. The process evaluation module utilizes a personal digital assistant (PDA). A data display module in communication with the database server, including a website for viewing collected process data in a desired metrics form, the data display module also for providing desired editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module, minimizes the requirement for manual input of the collected process data.

  6. Emissions characterization and off-gas system development for processing simulated mixed waste in a plasma centrifugal furnace

    SciTech Connect

    Filius, K.D.; Whitworth, C.G.

    1996-12-31

    Plasma arc technology is a high temperature process that completely oxidizes organic waste fractions; inorganic hazardous and radionuclide waste fractions are oxidized and encapsulated in a highly durable slag. The robust nature of the technology lends itself to application to diverse mixed and hazardous waste streams. Over 500 hours of testing have been completed at the Department of Energy's Western Environmental Technology Office with a pilot-scale system. This testing was designed to demonstrate operability over a wide range of wastes and provide the data required to evaluate potential applications of the technology on both a technical and economic basis. In addition to characterization of the off-gas for typical combustion products, the fate of radionuclide surrogates and hazardous elements within the Plasma Arc Centrifugal Treatment (PACT) system has been investigated extensively. Test results to date demonstrate that cerium, a plutonium surrogate, remains almost exclusively in the slag matrix. Hazardous elements such as chromium and lead volatilize to a greater extent and are captured by the off-gas system. Preliminary design work is underway to develop a minimum-emissions off-gas system for demonstration on an engineering-scale plasma unit. The proposed system will filter particulate matter from the hot gas stream and treat it in an electric ceramic oxidizer, which replaces the conventional afterburner, prior to quenching and acid gas removal. 5 refs., 3 figs., 5 tabs.

  7. Development of Fast Measurement System of Neutron Emission Profile Using a Digital Signal Processing Technique in JT-60U

    SciTech Connect

    Ishikawa, M.; Shinohara, K.; Itoga, T.; Okuji, T.; Nakhostin, M.; Baba, M.; Nishitani, T.

    2008-03-12

    Neutron emission profiles are routinely measured in the JT-60U Tokamak. Stilbene neutron detectors (SNDs), which combine a stilbene organic crystal scintillation detector (Stilbene detector) with an analog neutron-gamma pulse shape discrimination (PSD) circuit, have been used to measure neutron flux efficiently. Although the SND has many advantages as a neutron detector, its maximum count rate is limited to ~1×10^5 counts/s due to the dead time of the analog PSD circuit. To overcome this issue, a digital signal processing (DSP) system using a Flash-ADC has been developed. In this system, anode signals from the photomultiplier of the Stilbene detector are fed to the Flash-ADC and digitized. The PSD between neutrons and gamma rays is then performed in software. The photomultiplier tube was also modified to suppress and correct gain fluctuation of the photomultiplier. The DSP system has been installed in the center channel of the vertical neutron collimator system in JT-60U and applied to measurements of neutron flux in JT-60U experiments. Neutron flux was successfully measured at count rates up to ~1×10^6 counts/s without the effect of pile-up of detected pulses. The performance of the DSP system as a neutron detector is demonstrated.
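Software PSD on digitized pulses, as described above, is often done by charge comparison: in organic scintillators such as stilbene, neutron (recoil-proton) pulses carry a larger fraction of their charge in the slow tail than gamma-ray pulses, so the tail-to-total integral ratio separates the two. A minimal sketch with synthetic two-exponential pulses (the decay constants and tail window are illustrative, not the JT-60U settings):

```python
import numpy as np

def psd_ratio(pulse, tail_start_after_peak=10):
    """Charge-comparison PSD figure: tail integral / total integral.
    Larger ratios indicate slower scintillation decay (neutron-like)."""
    pulse = np.asarray(pulse, dtype=float)
    tail = pulse[np.argmax(pulse) + tail_start_after_peak:].sum()
    return tail / pulse.sum()

# Hypothetical digitized pulses: gammas decay fast, neutrons have a slow component
t = np.arange(200.0)                                   # sample index
gamma = np.exp(-t / 5.0)                               # fast decay only
neutron = 0.7 * np.exp(-t / 5.0) + 0.3 * np.exp(-t / 40.0)  # fast + slow tail
print(psd_ratio(gamma) < psd_ratio(neutron))  # True
```

A real system would histogram this ratio (often against pulse height) and cut between the neutron and gamma populations event by event.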

  8. Development of an image processing support system based on fluorescent dye to prevent elderly people with dementia from wandering.

    PubMed

    Nishigaki, Yutaka; Tanaka, Kentaro; Kim, Juhyon; Nakajima, Kazuki

    2013-01-01

    The wandering of elderly people with dementia is a significant behavioral problem and a heavy burden on caregivers in residential and nursing homes. Thus, warning systems have been developed to prevent elderly people with dementia from leaving the premises. Some of these systems use radio waves. However, systems based on radio waves present several practical problems: the transmitter must be carried and may become lost, and the transmitter battery must be changed. To solve these problems, we developed a support system that prevents elderly people with dementia from wandering. The system employs image processing technology based on fluorescent dye. The support system works as follows: fluorescent dye is painted in a simple shape on the clothes of an elderly person, and the fluorescent color becomes visible under irradiation with long-wavelength ultraviolet light. In the present paper, the relationship between the color of the dye and the cloth was investigated. A 3D video camera was used to acquire a 3D image and detect the simple shape. As a preliminary experiment, 3 colors (red, green, and blue) of fluorescent dye were applied to cloths of 9 different colors. All fluorescent colors were detected on 6 of the cloths, but red and blue dye could not be detected on the other 3. In contrast, green dye was detectable on all 9 cloths. Additionally, we determined whether green dye could be detected in an actual environment. A rectangular patch of green fluorescent dye was painted on the shoulder area of a subject, from the scapula to the clavicle. As a result, the green dye was detected on all 9 different colored cloths. PMID:24111431
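The detection algorithm itself is not given in the abstract; a crude color-dominance sketch conveys the idea (illustrative thresholds, plain RGB arrays rather than the 3D camera pipeline used in the paper):

```python
import numpy as np

def green_mask(rgb):
    """Boolean mask of pixels where green clearly dominates red and
    blue -- a crude stand-in for detecting a UV-excited green
    fluorescent patch. Threshold values are illustrative only."""
    rgb = np.asarray(rgb, dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (g > 100) & (g > 1.5 * r) & (g > 1.5 * b)

def patch_detected(rgb, min_pixels=50):
    """Declare a detection when enough 'green' pixels are present,
    suppressing isolated false positives from sensor noise."""
    return int(green_mask(rgb).sum()) >= min_pixels
```

A real system would additionally verify the patch's shape against the painted pattern, which is what the 3D camera supports.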

  9. Development of an advanced continuous mild gasification process for the production of coproducts. Task 4, System integration studies: Char upgrading

    SciTech Connect

    Jha, M.C.; McCormick, R.L.; Hogsett, R.F.; Rowe, R.M.; Anast, K.R.

    1991-12-01

    This document describes the results of Task 4, under which a 50 pound/hour char-to-carbon (CTC) process research unit (PRU) was designed in the second half of 1989, with construction completed in June 1990. The CTC PRU at Golden was operated for nearly one year, during which 35 runs were completed for a total of nearly 800 hours of operation. Char methanation and carbon production reactor development activities are detailed in this report, as well as the results of integrated runs of the CTC process. Evaluation of the process and the carbon product produced is also included. It was concluded that carbon could be produced from mild gasification char utilizing the CTC process. The char methanation and membrane separation steps performed reasonably well and can be scaled up with confidence. However, the novel directly heated reactor system for methane cracking did not work satisfactorily due to materials-of-construction and heat transfer problems, which adversely affected the quantity and quality of the carbon product. Alternative reactor designs are recommended.

  10. Low cost solar array project: Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1981-01-01

    This phase consists of the engineering design, fabrication, assembly, operation, economic analysis, and process support R&D for an Experimental Process System Development Unit (EPSDU). The mechanical bid package was issued and the bid responses are under evaluation. Similarly, the electrical bid package was issued; however, responses are not yet due. The majority of the equipment is on order or has been received at the EPSDU site. The pyrolysis/consolidation process design package was issued. Preparation of the process and instrumentation diagram for the free-space reactor was started. In the area of melting/consolidation, Kayex successfully melted chunk silicon and has produced silicon shot. The free-space reactor powder was successfully transported pneumatically twenty-five feet up from a storage bin to the auger feeder and melted. The fluid-bed PDU has operated successfully at silane feed concentrations up to 21%. Writing of the operating manual has started. Overall, the design phase is nearing completion.

  11. Laser material processing system

    DOEpatents

    Dantus, Marcos

    2015-04-28

    A laser material processing system and method are provided. A further aspect of the present invention employs a laser for micromachining. In another aspect of the present invention, the system uses a hollow waveguide. In another aspect of the present invention, a laser beam pulse is given broad bandwidth for workpiece modification.

  12. Development of Production PVD-AIN Buffer Layer System and Processes to Reduce Epitaxy Costs and Increase LED Efficiency

    SciTech Connect

    Cerio, Frank

    2013-09-14

    was analyzed and improvements implemented to the Veeco PVD-AlN prototype system to establish a specification and baseline PVD-AlN films on sapphire; in parallel, the evaluation of PVD AlN on silicon substrates began. In Phase II of the project, a Beta tool based on a scaled-up process module capable of depositing uniform films on batches of 4” or 6” diameter substrates in a production-worthy operation was developed and qualified. In Phase III, the means to increase the throughput of the PVD-AlN system was evaluated, focused primarily on minimizing the impact of the substrate heating and cooling times that dominated the overall cycle time.

  13. LIMB PROCESS DEVELOPMENT STUDIES

    EPA Science Inventory

    The report covers basic and applied studies concerned with three Limestone Injection Multistage Burner (LIMB) process objectives: (1) avoiding degradation of collection efficiency in the electrostatic precipitator (ESP) during LIMB, (2) achieving satisfactory sulfur dioxide (SO2)...

  14. Silicon web process development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Skutch, M. E.; Driggers, J. M.; Hopkins, R. H.

    1981-01-01

    The silicon web process takes advantage of natural crystallographic stabilizing forces to grow long, thin single crystal ribbons directly from liquid silicon. The ribbon, or web, is formed by the solidification of a liquid film supported by surface tension between two silicon filaments, called dendrites, which border the edges of the growing strip. The ribbon can be propagated indefinitely by replenishing the liquid silicon as it is transformed to crystal. The dendritic web process has several advantages for achieving low cost, high efficiency solar cells. These advantages are discussed.

  15. Development of an alternating magnetic-field-assisted finishing process for microelectromechanical systems micropore x-ray optics

    SciTech Connect

    Riveros, Raul E.; Yamaguchi, Hitomi; Mitsuishi, Ikuyuki; Takagi, Utako; Ezoe, Yuichiro; Kato, Fumiki; Sugiyama, Susumu; Yamasaki, Noriko; Mitsuda, Kazuhisa

    2010-06-20

    X-ray astronomy research is often limited by the size, weight, complexity, and cost of functioning x-ray optics. Micropore optics promises an economical alternative to traditional (e.g., glass or foil) x-ray optics; however, many manufacturing difficulties prevent micropore optics from being a viable solution. Ezoe et al. introduced microelectromechanical systems (MEMS) micropore optics having curvilinear micropores in 2008. Made by either deep reactive ion etching or x-ray lithography, electroforming, and molding (LIGA), MEMS micropore optics suffer from high micropore sidewall roughness (10-30 nm rms) which, by current standards, cannot be improved. In this research, a new alternating magnetic-field-assisted finishing process was developed using a mixture of ferrofluid and microscale abrasive slurry. A machine was built, and a set of working process parameters including alternating frequency, abrasive size, and polishing time was selected. A polishing experiment on a LIGA-fabricated MEMS micropore optic was performed, and a change in micropore sidewall roughness from 9.3±2.5 nm rms to 5.7±0.7 nm rms was measured. An improvement in x-ray reflectance was also seen. This research shows the feasibility and confirms the effects of this new polishing process on MEMS micropore optics.

  16. Chemical waterflood process development

    SciTech Connect

    Chang, H.L.

    1980-04-01

    A waterflood process is claimed wherein a slug of biopolymer is injected into a formation, followed by a slug of synthetic polymer. The biopolymer slug protects the synthetic polymer from degradation due to presence of salts or surfactants in the formation.

  17. Automated system for the on-line monitoring of powder blending processes using near-infrared spectroscopy. Part I. System development and control.

    PubMed

    Hailey, P A; Doherty, P; Tapsell, P; Oliver, T; Aldridge, P K

    1996-03-01

    An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity. PMID:8738184
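The standard-deviation routine described here is commonly implemented as a moving block standard deviation over consecutive NIR spectra, with a plateau at low values read as blend homogeneity. A minimal sketch under that assumption (block size and threshold are illustrative, not the authors' values):

```python
import numpy as np

def moving_block_std(spectra, block=5):
    """Mean standard deviation across wavelengths for a sliding block
    of consecutive NIR spectra (rows = time points, cols = wavelengths).
    Low, stable values are commonly read as blend homogeneity."""
    spectra = np.asarray(spectra, dtype=float)
    out = []
    for i in range(len(spectra) - block + 1):
        blk = spectra[i:i + block]
        out.append(blk.std(axis=0, ddof=1).mean())
    return np.array(out)

def is_homogeneous(spectra, block=5, threshold=1e-3):
    """Declare homogeneity when the latest block-wise standard
    deviation falls below an empirically chosen threshold."""
    mbsd = moving_block_std(spectra, block)
    return bool(mbsd[-1] < threshold)
```

Because only the most recent block of spectra is needed, this statistic can be updated in real time as each new spectrum arrives, which matches the on-line use described in the abstract.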

  18. Concurrent systems for knowledge processing

    SciTech Connect

    Hewitt, C.; Agha, G.

    1989-01-01

    Actors have catalyzed the development of a new programming methodology and architecture for ultra-concurrent systems. This sourcebook on the development and impact of the actor paradigm brings together more than 20 milestone contributions on the actor concept and its application to knowledge processing. Each contribution is placed in its historical context and explained. This book is divided into four major areas: Foundations of Concurrent Systems covers actors, laws of concurrent systems and mathematical models. Languages takes up design principles, actor languages, meta-interpreters, and comparison with other programming languages. Systems and Architectures discusses monitoring and debugging environments, and multicomputers. Knowledge Processing examines the scientific community metaphor, open systems, and organizational semantics. Future prospects for actors and knowledge processing are discussed in the concluding section.

  19. Silicon web process development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Blais, P. D.; Davis, J. R., Jr.

    1977-01-01

    Thirty-five (35) furnace runs were carried out during this quarter, of which 25 produced a total of 120 web crystals. The two main thermal models for the dendritic growth process were completed and are being used to assist the design of the thermal geometry of the web growth apparatus. The first model, a finite element representation of the susceptor and crucible, was refined to give greater precision and resolution in the critical central region of the melt. The second thermal model, which describes the dissipation of the latent heat to generate thickness-velocity data, was completed. Dendritic web samples were fabricated into solar cells using a standard configuration and a standard process for a N(+) -P-P(+) configuration. The detailed engineering design was completed for a new dendritic web growth facility of greater width capability than previous facilities.

  20. Transparent materials processing system

    NASA Technical Reports Server (NTRS)

    Hetherington, J. S.

    1977-01-01

    A zero gravity processing furnace system was designed that will allow acquisition of photographic or other visual information while the sample is being processed. A low temperature (30 to 400 °C) test model with a flat specimen heated by quartz-halide lamps was constructed. A high temperature (400 to 1000 °C) test model heated by resistance heaters, utilizing a cylindrical specimen and optics, was also built. Each of the test models is discussed in detail. Recommendations are given.

  1. Development of metallization process

    NASA Technical Reports Server (NTRS)

    Garcia, A., III

    1983-01-01

    Solar cells were produced using a Mo/Sn/TiH screen printed paste with a lead/borosilicate frit that are electrically comparable to control silver cells. The process is currently unsuccessful because the soldering of interconnects to these cells has proved difficult. Future work will investigate using CO instead of H2 as the reducing gas and putting an ITO coating on the cell prior to metallization.

  2. Development of metallization process

    NASA Astrophysics Data System (ADS)

    Garcia, A., III

    1983-04-01

    Solar cells were produced using a Mo/Sn/TiH screen printed paste with a lead/borosilicate frit that are electrically comparable to control silver cells. The process is currently unsuccessful because the soldering of interconnects to these cells has proved difficult. Future work will investigate using CO instead of H2 as the reducing gas and putting an ITO coating on the cell prior to metallization.

  3. Development of an on-line fuzzy expert system for integrated alarm processing in nuclear power plants

    SciTech Connect

    Choi, S.S.; Kang, K.S.; Kim, H.G.; Chang, S.H.

    1995-08-01

    An on-line fuzzy expert system, called the alarm filtering and diagnostic system (AFDS), was developed to provide the operator with clean alarm pictures and system-wide failure information during abnormal states through alarm filtering and diagnosis. In addition, it carries out alarm prognosis to warn the operator of process abnormalities. Clean alarm pictures with no overlapping information are generated from multiple activated alarms at the alarm filtering stage. The meta rules for dynamic filtering were established on the basis of the alarm relationship network. In the case of alarm diagnosis, the relations between alarms and abnormal states are represented by means of fuzzy relations, and the compositional inference rule of fuzzy logic is utilized to infer abnormal states from the fuzzy relations. The AFDS offers the operator related operating procedures as well as diagnostic results. At the stage of alarm prognosis, the future values of some important critical safety parameters are predicted by means of the Levinson algorithm, selected from comparative experiments, and the global trends of these parameters are estimated using data smoothing and fuzzy membership. This information enables early failure detection and is also used to supplement diagnostic symptoms.
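The compositional rule of inference mentioned above is typically realized as max-min composition of the alarm membership vector with the alarm-to-state fuzzy relation; a minimal sketch with hypothetical membership values (the AFDS relation matrices are not given in the abstract):

```python
import numpy as np

def max_min_composition(alarms, relation):
    """Fuzzy compositional rule of inference (max-min): given alarm
    membership degrees `alarms` (length m) and a fuzzy relation
    `relation` (m x n) between alarms and abnormal states, return the
    inferred membership degree of each abnormal state:
        state_j = max_i min(alarms_i, relation_ij)."""
    a = np.asarray(alarms, dtype=float)
    R = np.asarray(relation, dtype=float)
    return np.minimum(a[:, None], R).max(axis=0)
```

For example, with alarm degrees [1.0, 0.3] and relation [[0.9, 0.1], [0.2, 0.8]], the inferred state memberships are [0.9, 0.3]: the fully active first alarm drives the first state, while the weakly active second alarm caps the second state at 0.3.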

  4. Low cost solar array project. Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Astrophysics Data System (ADS)

    1980-03-01

    Technical activities are reported in the design of process, facilities, and equipment for producing silicon at a rate and price commensurate with production goals for low cost solar cell modules. The silane-to-silicon process has potential for providing high purity polysilicon on a commercial scale at a price of fourteen dollars per kilogram by 1986 (1980 dollars). Commercial process, economic analysis, process support research and development, and quality control are discussed.

  5. Low cost solar array project. Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Technical activities are reported in the design of process, facilities, and equipment for producing silicon at a rate and price commensurate with production goals for low cost solar cell modules. The silane-to-silicon process has potential for providing high purity polysilicon on a commercial scale at a price of fourteen dollars per kilogram by 1986 (1980 dollars). Commercial process, economic analysis, process support research and development, and quality control are discussed.

  6. SIRU development. Volume 1: System development

    NASA Technical Reports Server (NTRS)

    Gilmore, J. P.; Cooper, R. J.

    1973-01-01

    A complete description of the development and initial evaluation of the Strapdown Inertial Reference Unit (SIRU) system is reported. System development documents the system mechanization with the analytic formulation for fault detection and isolation processing structure; the hardware redundancy design and the individual modularity features; the computational structure and facilities; and the initial subsystem evaluation results.

  7. Application of high-throughput mini-bioreactor system for systematic scale-down modeling, process characterization, and control strategy development.

    PubMed

    Janakiraman, Vijay; Kwiatkowski, Chris; Kshirsagar, Rashmi; Ryll, Thomas; Huang, Yao-Ming

    2015-01-01

    High-throughput systems and processes have typically been targeted for process development and optimization in the bioprocessing industry. For process characterization, bench scale bioreactors have been the system of choice. Due to the need for performing different process conditions for multiple process parameters, the process characterization studies typically span several months and are considered time and resource intensive. In this study, we have shown the application of a high-throughput mini-bioreactor system, viz. the Advanced Microscale Bioreactor (ambr15™), to perform process characterization in less than a month and develop an input control strategy. As a pre-requisite to process characterization, a scale-down model was first developed in the ambr system (15 mL) using statistical multivariate analysis techniques that showed comparability with both manufacturing scale (15,000 L) and bench scale (5 L). Volumetric sparge rates were matched between ambr and manufacturing scale, and the ambr process matched the pCO2 profiles as well as several other process and product quality parameters. The scale-down model was used to perform the process characterization DoE study and product quality results were generated. Upon comparison with DoE data from the bench scale bioreactors, similar effects of process parameters on process yield and product quality were identified between the two systems. We used the ambr data for setting action limits for the critical controlled parameters (CCPs), which were comparable to those from bench scale bioreactor data. In other words, the current work shows that the ambr15™ system is capable of replacing the bench scale bioreactor system for routine process development and process characterization. PMID:26317495

  8. Technology development life cycle processes.

    SciTech Connect

    Beck, David Franklin

    2013-05-01

    This report and set of appendices are a collection of memoranda originally drafted in 2009 for the purpose of providing motivation and the necessary background material to support the definition and integration of engineering and management processes related to technology development. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. As presented herein, the material begins with a survey of open literature perspectives on technology development life cycles, including published data on "what went wrong." The main thrust of the material presents a rational exposé of a structured technology development life cycle that uses the scientific method as a framework, with further rigor added by adapting relevant portions of the systems engineering process. The material concludes with a discussion on the use of multiple measures to assess technology maturity, including consideration of the viewpoint of potential users.

  9. Developments in Signature Process Control

    NASA Astrophysics Data System (ADS)

    Keller, L. B.; Dominski, Marty

    1993-01-01

    Developments in the adaptive process control technique known as Signature Process Control for Advanced Composites (SPCC) are described. This computer control method for autoclave processing of composites was used to develop an optimum cure cycle for AFR 700B polyimide and for an experimental poly-isoimide. An improved process cycle was developed for Avimid N polyimide. The potential for extending the SPCC technique to pre-preg quality control, press molding, pultrusion, and RTM is briefly discussed.

  10. Quartz resonator processing system

    DOEpatents

    Peters, Roswell D. M.

    1983-01-01

    Disclosed is a single-chamber ultra-high vacuum processing system for the production of hermetically sealed quartz resonators, wherein electrode metallization and sealing are carried out along with cleaning and bake-out without any air exposure between the processing steps. The system includes a common vacuum chamber in which is located a rotatable wheel-like member adapted to move a plurality of individual component sets of a flat-pack resonator unit past discretely located processing stations in said chamber, whereupon electrode deposition takes place, followed by the placement of ceramic covers over a frame containing a resonator element, and then to a sealing stage where a pair of hydraulic rams including heating elements effect a metallized bonding of the covers to the frame.

  11. The development of a coal-fired combustion system for industrial process heating applications. Quarterly technical progress report, January 1992--March 1992

    SciTech Connect

    Not Available

    1992-07-16

    PETC has implemented a number of advanced combustion research projects that will lead to the establishment of a broad, commercially acceptable engineering data base for the advancement of coal as the fuel of choice for boilers, furnaces, and process heaters. Vortec Corporation's Coal-Fired Combustion System for Industrial Process Heating Applications has been selected for Phase III development under contract DE-AC22-91PC91161. This advanced combustion system research program is for the development of innovative coal-fired process heaters which can be used for high temperature melting, smelting, recycling, and refining processes. The process heater concepts to be developed are based on advanced glass melting and ore smelting furnaces developed and patented by Vortec Corporation. The process heater systems to be developed have multiple-use applications; however, the Phase III research effort is focused on the development of a process heater system to be used for producing glass frits and wool fiber from boiler and incinerator ashes. The primary objective of the Phase III project is to develop and integrate all the system components, from fuel through total system controls, and then test the complete system in order to evaluate its potential marketability. The economic evaluation of commercial-scale CMS processes has begun. In order to accurately estimate the cost of the primary process vessels, preliminary designs for 25, 50, and 100 ton/day systems have been started under Task 1. These data will serve as input for the life cycle cost analysis performed as part of the techno-economic evaluations. The economic evaluations of commercial CMS systems will be an integral part of the commercialization plan.

  12. Intelligent Work Process Engineering System

    NASA Technical Reports Server (NTRS)

    Williams, Kent E.

    2003-01-01

    Optimizing performance on work activities and processes requires metrics of performance for management to monitor and analyze in order to support further improvements in efficiency, effectiveness, safety, reliability, and cost. Information systems are therefore required to assist management in making timely, informed decisions regarding these work processes and activities. Currently, information systems regarding Space Shuttle maintenance and servicing do not exist to support such timely decisions. The work to be presented details a system which incorporates various automated and intelligent processes and analysis tools to capture, organize, and analyze work process related data and to make the necessary decisions to meet KSC organizational goals. The advantages and disadvantages of design alternatives for the development of such a system will be discussed, including technologies which would need to be designed, prototyped, and evaluated.

  13. Developing a model system for studying the ozone processing of atmospheric aerosols by following changes in surface properties

    NASA Astrophysics Data System (ADS)

    Gonzalez-Labrada, Erick

    Atmospheric aerosols have a significant organic composition as determined by field measurement studies. This organic material is released to the atmosphere from both natural and anthropogenic sources, such as wind bursting of the ocean surface, car exhausts, and meat cooking, among others. An inverted micelle model has been proposed in order to explain the high concentration of organic compounds in aerosol particles. The model describes an organic film coating the air-liquid interface of an aqueous aerosol core. Chemical processing of this organic film by atmospheric oxidants (such as OH radicals, O3, and NO3) through heterogeneous and multiphase reactions can activate the aerosol to participate in atmospheric chemistry. After reaction, the particle has an increased role in the absorption and scattering of incoming solar radiation and cloud formation. Another consequence of this oxidation is the decrease of the atmospheric budget of gas-phase trace species, as well as the formation of volatile products. Several studies have proposed that the ozonolysis of organic films in aerosols takes place mainly at the surface. Therefore, the objective of this research was to develop a suitable model system for following the reaction through quantitative changes of a property inherent to the surface. Several attempts were made to examine the ozonolysis of organic monolayers at either solid or liquid surfaces. The studied monolayers contained unsaturated organic compounds as the only component or as part of a binary mixture with saturated compounds. The study of the ozone processing of monolayers deposited on solid substrates revealed information about changes in the hydrophobic character of the surface that occurred because of the reaction. On the other hand, the processing of a monolayer spread on a pendant drop allowed a real-time monitoring of surface pressure changes. This permitted a kinetic study of the reaction that yielded parameters related exclusively to processes

  14. Annotation methods to develop and evaluate an expert system based on natural language processing in electronic medical records.

    PubMed

    Gicquel, Quentin; Tvardik, Nastassia; Bouvry, Côme; Kergourlay, Ivan; Bittar, André; Segond, Frédérique; Darmoni, Stefan; Metzger, Marie-Hélène

    2015-01-01

    The objective of the SYNODOS collaborative project was to develop a generic IT solution, combining a medical terminology server, a semantic analyser and a knowledge base. The goal of the project was to generate meaningful epidemiological data for various medical domains from the textual content of French medical records. In the context of this project, we built a care pathway oriented conceptual model and corresponding annotation method to develop and evaluate an expert system's knowledge base. The annotation method is based on a semi-automatic process, using a software application (MedIndex). This application exchanges with a cross-lingual multi-termino-ontology portal. The annotator selects the most appropriate medical code proposed for the medical concept in question by the multi-termino-ontology portal and temporally labels the medical concept according to the course of the medical event. This choice of conceptual model and annotation method aims to create a generic database of facts for the secondary use of electronic health records data. PMID:26262366

  15. Processes and process development in Japan

    NASA Technical Reports Server (NTRS)

    Noda, T.

    1986-01-01

    The commercialization of solar power generation necessitates the development of a low cost manufacturing method for silicon suitable for solar cells. The manufacturing methods of semiconductor grade silicon (SEG-Si) and the development of solar grade silicon (SOG-Si) in foreign countries were investigated. It was concluded that the most efficient method of developing such materials was the hydrogen reduction of trichlorosilane (TCS) using a fluidized bed reactor. Low cost production of polysilicon by this reduction process requires cost reductions in raw materials, energy, labor, and capital. These conditions were carefully reviewed. The overall conclusion was that a development program should be based on the TCS-FBR process and that the experimental program should be conducted in test facilities capable of producing 10 tons of silicon granules per year.

  16. Liga developer apparatus system

    DOEpatents

    Boehme, Dale R.; Bankert, Michelle A.; Christenson, Todd R.

    2003-01-01

    A system to fabricate precise, high aspect ratio polymeric molds by a photolithographic process is described. The molds are used for producing micro-scale parts from engineering materials by the LIGA process. The invention is a developer system for developing a PMMA photoresist having exposed patterns comprising features with both very small sizes and very high aspect ratios. The developer system of the present invention comprises a developer tank, an intermediate rinse tank, and a final rinse tank, each tank having a source of high frequency sonic agitation, temperature control, and continuous filtration. It has been found that by moving a patterned wafer through a specific sequence of developer/rinse solutions, where an intermediate rinse solution completes development of those portions of the exposed resist left undeveloped after the development solution, by agitating the solutions with a source of high frequency sonic vibration, and by adjusting and closely controlling the temperatures and continuously filtering and recirculating these solutions, it is possible to maintain the kinetic dissolution of the exposed PMMA polymer as the rate-limiting step.

  17. Integrated water system simulation by considering hydrological and biogeochemical processes: model development, with parameter sensitivity and autocalibration

    NASA Astrophysics Data System (ADS)

    Zhang, Y. Y.; Shao, Q. X.; Ye, A. Z.; Xing, H. T.; Xia, J.

    2016-02-01

    Integrated water system modeling is a feasible approach to understanding severe water crises in the world and promoting the implementation of integrated river basin management. In this study, a classic hydrological model (the time variant gain model: TVGM) was extended to an integrated water system model by coupling multiple water-related processes in hydrology, biogeochemistry, water quality, and ecology, and considering the interference of human activities. A parameter analysis tool, which included sensitivity analysis, autocalibration and model performance evaluation, was developed to improve modeling efficiency. To demonstrate the model performances, the Shaying River catchment, which is the largest highly regulated and heavily polluted tributary of the Huai River basin in China, was selected as the case study area. The model performances were evaluated on the key water-related components including runoff, water quality, diffuse pollution load (or nonpoint sources) and crop yield. Results showed that our proposed model simulated most components reasonably well. The simulated daily runoff at most regulated and less-regulated stations matched well with the observations. The average correlation coefficient and Nash-Sutcliffe efficiency were 0.85 and 0.70, respectively. Both the simulated low and high flows at most stations were improved when the dam regulation was considered. The daily ammonium-nitrogen (NH4-N) concentration was also well captured, with an average correlation coefficient of 0.67. Furthermore, the diffuse source load of NH4-N and the corn yield were reasonably simulated at the administrative region scale. This integrated water system model is expected to improve the simulation performances with extension to more model functionalities, and to provide a scientific basis for implementation in integrated river basin management.
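The Nash-Sutcliffe efficiency quoted for the runoff simulations follows directly from its standard definition (1 minus the ratio of the model error variance to the variance of the observations); a minimal sketch:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / sum of squared deviations
    of the observations from their mean. 1.0 is a perfect fit; values
    <= 0 mean the model predicts no better than the observed mean."""
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
```

By construction, a simulation equal to the observations scores exactly 1.0, and a constant simulation at the observed mean scores exactly 0.0, which makes the 0.70 average reported above easy to interpret.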

  18. Advanced information processing system

    NASA Technical Reports Server (NTRS)

    Lala, J. H.

    1984-01-01

    Design and performance details of the advanced information processing system (AIPS) for fault and damage tolerant data processing on aircraft and spacecraft are presented. AIPS comprises several computers distributed throughout the vehicle and linked by a damage tolerant data bus. Most I/O functions are available to all the computers, which run in a TDMA mode. Each computer performs separate specific tasks in normal operation and assumes other tasks in degraded modes. Redundant software assures that all fault monitoring, logging and reporting are automated, together with control functions. Redundant duplex links and damage-spread limitation provide the fault tolerance. Details of an advanced design of a laboratory-scale proof-of-concept system are described, including functional operations.

  19. Developing Data System Engineers

    NASA Astrophysics Data System (ADS)

    Behnke, J.; Byrnes, J. B.; Kobler, B.

    2011-12-01

    In the early days of general computer systems for science data processing, staff members working on NASA's data systems would most often be hired as mathematicians. Computer engineering positions were very often filled by those with electrical engineering degrees. Today, the Goddard Space Flight Center has special position descriptions for data scientists or, as they are more commonly called, data systems engineers. These staff members are required to have very diverse skills, hence the need for a generalized position description. There is always a need for data systems engineers to develop, maintain and operate the complex data systems for Earth and space science missions. Today's data systems engineers, however, are not just mathematicians; they are computer programmers, GIS experts, software engineers, visualization experts, etc. They represent many different degree fields. To put together distributed systems like the NASA Earth Observing System Data and Information System (EOSDIS), staff are required from many different fields. Sometimes the skilled professional is not available and must be developed in-house. This paper will address the various skills and jobs for data systems engineers at NASA. Further, it explores how to develop staff to become data scientists.

  20. Network command processing system overview

    NASA Technical Reports Server (NTRS)

    Nam, Yon-Woo; Murphy, Lisa D.

    1993-01-01

    The Network Command Processing System (NCPS) developed for the National Aeronautics and Space Administration (NASA) Ground Network (GN) stations is a spacecraft command system utilizing a MULTIBUS I/68030 microprocessor. This system was developed and implemented at ground stations worldwide to provide a Project Operations Control Center (POCC) with command capability for support of spacecraft operations such as the LANDSAT, Shuttle, Tracking and Data Relay Satellite, and Nimbus-7. The NCPS consolidates multiple modulation schemes for supporting various manned/unmanned orbital platforms. The NCPS interacts with the POCC and a local operator to process configuration requests, generate modulated uplink sequences, and inform users of the ground command link status. This paper presents the system functional description, hardware description, and the software design.

  1. Development of an image processing system at the Technology Applications Center, UNM: Landsat image processing in mineral exploration and related activities. Final report

    SciTech Connect

    Budge, T.K.

    1980-09-01

    This project was a demonstration of the capabilities of Landsat satellite image processing applied to the monitoring of mining activity in New Mexico. Study areas included the Navajo coal surface mine, the Jackpile uranium surface mine, and the potash mining district near Carlsbad, New Mexico. Computer classifications of a number of land use categories in these mines were presented and discussed. A literature review of a number of case studies concerning the use of Landsat image processing in mineral exploration and related activities was prepared. Included in this review is a discussion of the Landsat satellite system and the basics of computer image processing. Topics such as destriping, contrast stretches, atmospheric corrections, ratioing, and classification techniques are addressed. Summaries of the STANSORT II and ELAS software packages and the Technology Application Center's Digital Image Processing System (TDIPS) are presented.
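    Two of the enhancement operations listed above, a linear contrast stretch and band ratioing, can be sketched with NumPy; the percentile cut-offs and the synthetic band are illustrative assumptions, not taken from the report.

    ```python
    import numpy as np

    def linear_stretch(band, low_pct=2, high_pct=98):
        """Linear contrast stretch: map the low/high percentiles to 0..255,
        clipping the tails, to spread the scene over the full display range."""
        lo, hi = np.percentile(band, [low_pct, high_pct])
        scaled = (band.astype(float) - lo) / (hi - lo)
        return np.clip(scaled * 255, 0, 255).astype(np.uint8)

    def band_ratio(band_a, band_b, eps=1e-6):
        """Band ratio, commonly used to suppress illumination differences
        between sunlit and shadowed slopes."""
        return band_a.astype(float) / (band_b.astype(float) + eps)

    rng = np.random.default_rng(0)
    scene = rng.integers(40, 200, size=(64, 64))  # synthetic low-contrast band
    stretched = linear_stretch(scene)
    ```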

  2. Chemical optimization of resist/developer systems tuned for sub-0.4-um process window expansion

    NASA Astrophysics Data System (ADS)

    Toukhy, Medhat A.; Schlicht, Karin R.; Tarbox, Kimberly

    1997-07-01

    In order to perform 0.2 micrometer processes, one needs to study the diffusion of photoacid generators within the photoresist system, since diffusion during the post-exposure bake has an influence on the critical dimension (CD). We have developed a new method to study the diffusion of photoacid generators within a polymer film. This new method is based on monitoring the change of the fluorescence intensity of a pH-sensitive fluorescent dye caused by the reaction with photoacid. A simplified version of this experiment has been conducted by introducing acid vapor to quench the fluorescence intensity of this pH sensor. A thin polymer film is spin cast onto the sensor to create a barrier to the acid diffusion process. During the acid diffusion process, the fluorescence intensity of this pH sensor is measured in situ, using excitation and emission wavelengths at 466 nm and 516 nm, respectively. Fluoresceinamine, the pH-sensitive fluorescent dye, is covalently bonded onto the treated quartz substrate to form a single dye layer. Poly(hydroxystyrene) (Mn = 13k, Tg = 180 degrees Celsius) in PGMEA (5% - 18% by weight) is spin cast onto this quartz substrate to form films with varying thickness. The soft bake time is 60 seconds at 90 degrees Celsius and a typical film has a thickness of 1.4 micrometers. Trifluoroacetic acid is introduced into a small chamber while the fluorescence from this quartz window is observed. Our study focuses on finding the diffusion constant of the vaporized acid (trifluoroacetic acid) in the poly(hydroxystyrene) polymer film. By applying Fick's second law, the relation (I_t - I_0)/(I_inf - I_0) = erfc[L/(Dt)^(1/2)] is obtained. The change of fluorescence intensity with respect to the diffusion time is monitored. The above equation is used for the data analysis, where L represents the film thickness and t represents the average time for the acid to diffuse through the film. The diffusion constant is calculated to be on the order of 10
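    The erfc relation above can be inverted numerically to recover D from a measured intensity ratio; this sketch uses only the standard library, and the thickness, time, and ratio plugged in at the end are hypothetical, not the study's measurements.

    ```python
    from math import erfc

    def erfc_inv(y, lo=1e-9, hi=10.0, iters=100):
        """Invert erfc on (0, 1) by bisection; erfc is monotonically decreasing."""
        for _ in range(iters):
            mid = 0.5 * (lo + hi)
            if erfc(mid) > y:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    def diffusion_constant(L, t, ratio):
        """Solve (I_t - I_0)/(I_inf - I_0) = erfc[L / (D*t)**0.5] for D."""
        x = erfc_inv(ratio)          # x = L / sqrt(D * t)
        return L ** 2 / (t * x ** 2)

    # Hypothetical numbers: 1.4 um film (expressed in cm), 300 s, intensity ratio 0.3
    D = diffusion_constant(1.4e-4, 300.0, 0.3)   # cm^2/s
    ```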

  3. Developing the JPL Engineering Processes

    NASA Technical Reports Server (NTRS)

    Linick, Dave; Briggs, Clark

    2004-01-01

    This paper briefly recounts the recent history of process reengineering at the NASA Jet Propulsion Laboratory, with a focus on the engineering processes. The JPL process structure is described and the process development activities of the past several years outlined. The main focus of the paper is on the current process structure, the emphasis on the flight project life cycle, the governance approach that lead to Flight Project Practices, and the remaining effort to capture process knowledge at the detail level of the work group.

  4. A Prototyping Environment for Research on Human-Machine Interfaces in Process Control: Use of Microsoft WPF for Microworld and Distributed Control System Development

    SciTech Connect

    Roger Lew; Ronald L. Boring; Thomas A. Ulrich

    2014-08-01

    Operators of critical processes, such as nuclear power production, must contend with highly complex systems, procedures, and regulations. Developing human-machine interfaces (HMIs) that better support operators is a high priority for ensuring the safe and reliable operation of critical processes. Human factors engineering (HFE) provides a rich and mature set of tools for evaluating the performance of HMIs, but the set of tools for developing and designing HMIs is still in its infancy. Here we propose that Microsoft Windows Presentation Foundation (WPF) is well suited for many roles in the research and development of HMIs for process control.

  5. WRAP process area development control work plan

    SciTech Connect

    Leist, K.L., Fluor Daniel Hanford

    1997-02-27

    This work plan defines the manner in which the Waste Receiving and Processing Facility, Module I Process Area will be maintained under development control status. This status permits resolution of identified design discrepancies, control system changes, as-building of equipment, and modifications to increase process operability and maintainability, as parallel efforts. This work plan maintains configuration control as these efforts are undertaken. This task will end with system testing and reissue of field-verified design drawings.

  6. Development of Educational Support System for Learning Image Processing Enabling Client-Side Programming Aided by Java Servlet Technology

    NASA Astrophysics Data System (ADS)

    Furukawa, Tatsuya; Aoki, Noriyuki; Ohchi, Masashi; Nakao, Masaki

    Image processing has become a useful and important technology in various research and development fields. To meet such demands in engineering problems, we designed and implemented an educational support system for image processing using Java Applet technology. In the conventional system, however, the end user had to follow a tedious procedure to code his or her own programs. Therefore, in this study, we have remedied this defect in the previous system by using Java Servlet technology. The new system makes it possible for a novice user to experience practical digital image processing and advanced programming with ease. We describe the architecture and functions of the proposed system, which has been introduced to facilitate client-side programming.

  7. Mars Aqueous Processing System

    NASA Technical Reports Server (NTRS)

    Berggren, Mark; Wilson, Cherie; Carrera, Stacy; Rose, Heather; Muscatello, Anthony; Kilgore, James; Zubrin, Robert

    2012-01-01

    The goal of the Mars Aqueous Processing System (MAPS) is to establish a flexible process that generates multiple products that are useful for human habitation. Selectively extracting useful components into an aqueous solution, and then sequentially recovering individual constituents, yields a suite of refined or semi-refined products. Similarities in the bulk composition (although not necessarily in the mineralogy) of Martian and Lunar soils potentially make MAPS widely applicable. Similar process steps can be conducted on both Mars and Lunar soils while tailoring the reaction extents and recoveries to the specifics of each location. The MAPS closed-loop process selectively extracts, and then recovers, constituents from soils using acids and bases. The emphasis on Mars involves the production of useful materials such as iron, silica, alumina, magnesia, and concrete with recovery of oxygen as a byproduct. On the Moon, similar chemistry is applied with emphasis on oxygen production. This innovation has been demonstrated to produce high-grade materials, such as metallic iron, aluminum oxide, magnesium oxide, and calcium oxide, from lunar and Martian soil simulants. Most of the target products exhibited purities of 80 to 90 percent or more, allowing direct use for many potential applications. Up to one-fourth of the feed soil mass was converted to metal, metal oxide, and oxygen products. The soil residue contained elevated silica content, allowing for potential additional refining and extraction for recovery of materials needed for photovoltaic, semiconductor, and glass applications. A high-grade iron oxide concentrate derived from lunar soil simulant was used to produce a metallic iron component using a novel, combined hydrogen reduction/metal sintering technique. The part was subsequently machined and found to be structurally sound.
The behavior of the lunar-simulant-derived iron product was very similar to that produced using the same methods on a Michigan iron

  8. A Quality Improvement Customer Service Process and CSS [Customer Service System]. Burlington County College Employee Development Series, Volumes I & II.

    ERIC Educational Resources Information Center

    Burlington County Coll., Pemberton, NJ.

    Prepared for use by staff in development workshops at Burlington County College (BCC), in New Jersey, this handbook offers college-wide guidelines for improving the quality of service provided to internal and external customers, and reviews key elements of BCC's Customer Service System (CSS), a computerized method of recording and following-up on…

  9. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  10. Monolithic Fuel Fabrication Process Development

    SciTech Connect

    C. R. Clark; N. P. Hallinan; J. F. Jue; D. D. Keiser; J. M. Wight

    2006-05-01

    The pursuit of a high uranium density research reactor fuel plate has led to monolithic fuel, which possesses the greatest possible uranium density in the fuel region. Developments in the fabrication process include improvements to friction stir welding tool geometry and cooling, and a reduction in the time required to complete the transient liquid phase bonding process. Annealing effects on the microstructures of the U-10Mo foil and friction stir welded aluminum 6061 cladding are also examined.

  11. Processes of Expressive Behavior Development.

    ERIC Educational Resources Information Center

    Zivin, Gail

    1986-01-01

    Seventeen processes in the development of expressive behavior are reviewed and coordinated in a framework that is shown to accommodate current perspectives on expressive behavior development. Works of Ekman, Izard, Lewis and Michalson, and Sroufe are briefly reviewed. Neglected areas of research are indicated and the course of expressive behavior…

  12. Lunar materials processing system integration

    NASA Technical Reports Server (NTRS)

    Sherwood, Brent

    1992-01-01

    The theme of this paper is that governmental resources will not permit the simultaneous development of all viable lunar materials processing (LMP) candidates. Choices will inevitably be made, based on the results of system integration trade studies comparing candidates to each other for high-leverage applications. It is in the best long-term interest of the LMP community to lead the selection process itself, quickly and practically. The paper is in five parts. The first part explains what systems integration means and why the specialized field of LMP needs this activity now. The second part defines the integration context for LMP -- by outlining potential lunar base functions, their interrelationships and constraints. The third part establishes perspective for prioritizing the development of LMP methods, by estimating realistic scope, scale, and timing of lunar operations. The fourth part describes the use of one type of analytical tool for gaining understanding of system interactions: the input/output model. A simple example solved with linear algebra is used to illustrate. The fifth and closing part identifies specific steps needed to refine the current ability to study lunar base system integration. Research specialists have a crucial role to play now in providing the data upon which this refinement process must be based.

  13. Lunar materials processing system integration

    NASA Astrophysics Data System (ADS)

    Sherwood, Brent

    1992-02-01

    The theme of this paper is that governmental resources will not permit the simultaneous development of all viable lunar materials processing (LMP) candidates. Choices will inevitably be made, based on the results of system integration trade studies comparing candidates to each other for high-leverage applications. It is in the best long-term interest of the LMP community to lead the selection process itself, quickly and practically. The paper is in five parts. The first part explains what systems integration means and why the specialized field of LMP needs this activity now. The second part defines the integration context for LMP -- by outlining potential lunar base functions, their interrelationships and constraints. The third part establishes perspective for prioritizing the development of LMP methods, by estimating realistic scope, scale, and timing of lunar operations. The fourth part describes the use of one type of analytical tool for gaining understanding of system interactions: the input/output model. A simple example solved with linear algebra is used to illustrate. The fifth and closing part identifies specific steps needed to refine the current ability to study lunar base system integration. Research specialists have a crucial role to play now in providing the data upon which this refinement process must be based.

  14. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
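    A toy sketch of the hybrid idea described above: a continuous (system dynamics) productivity state feeds a discrete count of task completions per step. None of the rates, equations, or parameters below come from the authors' model; they only illustrate the coupling.

    ```python
    def simulate(tasks, days, base_rate=2.0, dt=1.0):
        """Each day, productivity relaxes toward 1.0 (continuous piece) but is
        dragged down by schedule pressure; the discrete piece completes an
        integer number of tasks at a rate scaled by productivity."""
        productivity, done, remaining = 1.0, 0, tasks
        for day in range(days):
            pressure = remaining / tasks                    # crude pressure proxy
            productivity += dt * (0.2 * (1.0 - productivity) - 0.05 * pressure)
            finished = min(remaining, int(base_rate * productivity))
            done += finished
            remaining -= finished
            if remaining == 0:
                return day + 1, done
        return days, done

    days_needed, completed = simulate(tasks=20, days=60)
    ```

    Even in this crude form, lowering the relaxation rate or raising the pressure penalty stretches the schedule, which is the kind of workforce-to-cost interaction the combined models are meant to expose.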

  15. Course Development: Industrial or Social Process.

    ERIC Educational Resources Information Center

    Kaufman, David

    The development of course materials at the Open Learning Institute, British Columbia, Canada, is examined from two perspectives: as an industrial process and as a social process. The public institute provides distance education through paced home-study courses. The course team model used at the Institute is a system approach. Course development…

  16. Automated satellite telemetry processing system

    NASA Astrophysics Data System (ADS)

    Parunakian, David; Kalegaev, Vladimir; Barinova, Vera

    In this paper we describe the design and important implementation details of the new automated system for processing satellite telemetry developed at the Skobeltsyn Institute of Nuclear Physics of Moscow State University (SINP MSU). We discuss the most common tasks and pitfalls for such systems built around the data stream from a single spacecraft or a single instrument, and suggest a solution that allows developers to quickly build telemetry processing modules and to integrate them with an existing polling mechanism, support infrastructure and data storage in Oracle or MySQL database systems. We also demonstrate the benefits of this approach using modules for processing three different spacecraft data streams: Coronas-Photon (2009-003A), Tatiana-2 (2009-049D) and Meteor-M no.1 (2009-049A). The data formats and protocols used by each of these spacecraft have distinct peculiarities, which nevertheless did not pose a problem for integrating their modules into the main system. Remote access via a web interface to Oracle databases and sophisticated visualization tools create the possibility of efficient scientific exploitation of satellite data. Such a system is already deployed at the web portal of the Space Monitoring Data Center (SMDC) of SINP MSU (http://smdc.sinp.msu.ru).
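    The per-spacecraft module approach described might be organized around a dispatch table, so the shared polling and storage code never needs to know mission-specific frame formats. The registry pattern and the two-byte frame layout below are invented for illustration; only the international designator string is taken from the abstract.

    ```python
    DECODERS = {}

    def decoder(spacecraft_id):
        """Register a frame-decoding function for one spacecraft."""
        def register(func):
            DECODERS[spacecraft_id] = func
            return func
        return register

    @decoder("2009-049D")  # Tatiana-2's designator, reused illustratively
    def decode_tatiana2(frame):
        # invented format: 2-byte big-endian frame counter followed by raw payload
        return {"counter": int.from_bytes(frame[:2], "big"), "payload": frame[2:]}

    def process(spacecraft_id, frame):
        """Dispatch a raw frame to whichever decoder module registered for it."""
        return DECODERS[spacecraft_id](frame)

    record = process("2009-049D", b"\x00\x07abc")
    ```

    Adding a new mission then amounts to dropping in one decorated function, leaving the polling and database layers untouched.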

  17. Software Model Of Software-Development Process

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Synott, Debra J.; Levary, Reuven R.

    1990-01-01

    Collection of computer programs constitutes software tool for simulation of medium- to large-scale software-development projects. Necessary to include easily identifiable and more-readily quantifiable characteristics like costs, times, and numbers of errors. Mathematical model incorporating these and other factors of dynamics of software-development process implemented in the Software Life Cycle Simulator (SLICS) computer program. Simulates dynamics of software-development process. In combination with input and output expert software systems and knowledge-based management software system, develops information for use in managing large software-development project. Intended to aid managers in planning, managing, and controlling software-development processes by reducing uncertainties in budgets, required personnel, and schedules.

  18. The standards process: X3 information processing systems

    NASA Technical Reports Server (NTRS)

    Emard, Jean-Paul

    1993-01-01

    The topics are presented in viewgraph form and include the following: International Organization for Standards (ISO); International Electrotechnical Committee (IEC); ISO/IEC Joint Technical Committee 1 (JTC-1); U.S. interface to JTC-1; ANSI; national organizations; U.S. standards development processes; national and international standards developing organizations; regional organizations; and X3 information processing systems.

  19. Developing a dynamic framework to examine the interplay between environmental stress, stakeholder participation processes and hydrological systems

    NASA Astrophysics Data System (ADS)

    Carr, G.; Blöschl, G.; Loucks, D. P.

    2014-09-01

    Stakeholder participation is increasingly discussed as essential for sustainable water resource management. Yet detailed understanding of the factors driving its use, the processes by which it is employed, and the outcomes or achievements it can realise remains highly limited, and often contested. This understanding is essential to enable water policy to be shaped for efficient and effective water management. This research proposes and applies a dynamic framework that can explore in which circumstances environmental stress events, such as floods, droughts or pollution, drive changes in water governance towards a more participatory approach, and how this shapes the processes by which participation or stakeholder engagement takes place, and the subsequent water management outcomes that emerge. The framework is able to assess the extent to which environmental events in combination with favourable contextual factors (e.g. institutional support for participatory activities) lead to good participatory processes (e.g. well facilitated and representative) that then lead to good outcomes (e.g. improved ecological conditions). Through applying the framework to case studies from the literature it becomes clear that environmental stress events can stimulate participatory governance changes, when existing institutional conditions promote participatory approaches. The work also suggests that intermediary outcomes, which may be tangible (such as reaching an agreement) or non-tangible (such as developing shared knowledge and understanding among participants, or creating trust), may provide a crucial link between processes and resource management outcomes. If this relationship can be more strongly confirmed, the presence or absence of intermediary outcomes may even be used as a valuable proxy to predict future resource management outcomes.

  20. Central waste processing system

    NASA Technical Reports Server (NTRS)

    Kester, F. L.

    1973-01-01

    A new concept for processing spacecraft type wastes has been evaluated. The feasibility of reacting various waste materials with steam at temperatures of 538 - 760 C in both a continuous and batch reactor with residence times from 3 to 60 seconds has been established. Essentially complete gasification is achieved. Product gases are primarily hydrogen, carbon dioxide, methane, and carbon monoxide. Water soluble synthetic wastes are readily processed in a continuous tubular reactor at concentrations up to 20 weight percent. The batch reactor is able to process wet and dry wastes at steam to waste weight ratios from 2 to 20. Feces, urine, and synthetic wastes have been successfully processed in the batch reactor.

  1. Managing Risk in Systems Development.

    ERIC Educational Resources Information Center

    DePaoli, Marilyn M.; And Others

    Stanford University's use of a risk assessment methodology to improve the management of systems development projects is discussed. After examining the concepts of hazard, peril, and risk as they relate to the system development process, three ways to assess risk are covered: size, structure, and technology. The overall objective for Stanford…

  2. Development of a coal-fired combustion system for industrial process heating applications. Quarterly technical progress report, January--March 1994

    SciTech Connect

    Not Available

    1994-04-30

    This advanced combustion system research program is for the development of innovative coal-fired process heaters which can be used for high temperature melting, smelting and waste vitrification processes. The process heater systems to be developed have multiple use applications; however, the Phase III research effort is being focused on the development of a process heater system to be used for producing value added vitrified glass products from boiler/incinerator ashes and industrial wastes. The primary objective of the Phase III project is to develop and integrate all the system components, from fuel through total system, controls, and then test the complete system in order to evaluate its potential marketability. The past quarter began with a two-day test performed in January to determine the cause of pulsations in the batch feed system observed during pilot-scale testing of surrogate TSCA incinerator ash performed in December of 1993. Two different batch feedstocks were used during this test: flyash and cullet. The cause of the pulsations was traced to a worn part in the feeder located at the bottom of the batch feed tank. The problem was corrected by replacing the worn part with the corresponding part on the existing coal feed tank. A new feeder for the existing coal tank, which had previously been ordered as part of the new coal handling system, was procured and installed. The data from the pilot-scale tests performed on surrogate TSCA incinerator ash during December of 1993 was collected and analyzed. All of the glass produced during the test passed both the Toxicity Characteristic Leaching Procedure (TCLP) and the Product Consistency Test (PCT) by approximately two orders of magnitude.

  3. From Process Models to Decision Making: The Use of Data Mining Techniques for Developing Effective Decision Support Systems

    NASA Astrophysics Data System (ADS)

    Conrads, P. A.; Roehl, E. A.

    2010-12-01

    Natural-resource managers face the difficult problem of controlling the interactions between hydrologic and man-made systems in ways that preserve resources while optimally meeting the needs of disparate stakeholders. Finding success depends on obtaining and employing detailed scientific knowledge about the cause-effect relations that govern the physics of these hydrologic systems. This knowledge is most credible when derived from large field-based datasets that encompass the wide range of variability in the parameters of interest. The means of converting data into knowledge of the hydrologic system often involves developing computer models that predict the consequences of alternative management practices to guide resource managers towards the best path forward. Complex hydrologic systems are typically modeled using computer programs that implement traditional, generalized, physical equations, which are calibrated to match the field data as closely as possible. This type of model commonly is limited in terms of demonstrable predictive accuracy, development time, and cost. The science of data mining presents a powerful complement to physics-based models. Data mining is a relatively new science that assists in converting large databases into knowledge and is uniquely able to leverage the real-time, multivariate data now being collected for hydrologic systems. In side-by-side comparisons with state-of-the-art physics-based hydrologic models, the authors have found data-mining solutions have been substantially more accurate, less time consuming to develop, and embeddable into spreadsheets and sophisticated decision support systems (DSS), making them easy to use by regulators and stakeholders. Three data-mining applications will be presented that demonstrate how data-mining techniques can be applied to existing environmental databases to address regional concerns of long-term consequences. In each case, data were transformed into information, and ultimately, into
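    At the simplest end of the data-mining spectrum the authors describe, an empirical model can be fit directly to multivariate field data; the ordinary-least-squares sketch below uses synthetic data and is not the authors' method (their applications used more advanced techniques).

    ```python
    import numpy as np

    # Synthetic "field data": three explanatory series and one noisy response
    rng = np.random.default_rng(42)
    X = rng.normal(size=(200, 3))                 # e.g. flow, temperature, tide stage
    true_w = np.array([0.8, -1.5, 0.3])
    y = X @ true_w + 2.0 + rng.normal(scale=0.05, size=200)   # e.g. salinity

    # Ordinary least squares with an appended intercept column
    A = np.column_stack([X, np.ones(len(X))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    prediction = A @ coef
    ```

    Because the fitted model is just a coefficient vector, it is trivially embeddable in a spreadsheet or DSS, which is the deployment advantage the abstract emphasizes.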

  4. Software Development to Assist in the Processing and Analysis of Data Obtained Using Fiber Bragg Grating Interrogation Systems

    NASA Technical Reports Server (NTRS)

    Hicks, Rebecca

    2010-01-01

    capable of processing massive amounts of data in both real-time and post-flight settings, and to produce software segments that can be integrated to assist in the task as well. The selected software must be able to: (1) process massive amounts of data (up to 4 GB) at a speed useful in real-time settings (small fractions of a second); (2) process data in post-flight settings to allow test reproduction or further data analysis; (3) produce, or make it easier to produce, three-dimensional plots/graphs to make the data accessible to flight test engineers; and (4) be customized to allow users to use their own processing formulas or functions and display the data in formats they prefer. Several software programs were evaluated to determine their utility in completing the research objectives. These programs include: OriginLab, Graphis, 3D Grapher, Visualization Sciences Group (VSG) Avizo Wind, Interactive Analysis and Display System (IADS), SigmaPlot, and MATLAB.

  5. Phase equilibria in fullerene-containing systems as a basis for development of manufacture and application processes for nanocarbon materials

    NASA Astrophysics Data System (ADS)

    Semenov, K. N.; Charykov, N. A.; Postnov, V. N.; Sharoyko, V. V.; Murin, I. V.

    2016-01-01

    This review is the first attempt to integrate the available data on all types of phase equilibria (solubility, extraction and sorption) in systems containing light fullerenes (C60 and C70). In the case of solubility diagrams, the following types of phase equilibria are considered: individual fullerene (C60 or C70)-solvent under polythermal and polybaric conditions; C60-C70-solvent, individual fullerene-solvent(1)-solvent(2), as well as multicomponent systems comprising a single fullerene or an industrial mixture of fullerenes and vegetable oils, animal fats or essential oils under polythermal conditions. All published experimental data on the extraction equilibria in C60-C70-liquid phase(1)-liquid phase(2) systems are described systematically and the sorption characteristics of various materials towards light fullerenes are estimated. The possibility of application of these experimental data for development of pre-chromatographic and chromatographic methods for separation of fullerene mixtures and application of fullerenes as nanomodifiers are described. The bibliography includes 87 references.

  7. Multipurpose Vacuum Induction Processing System

    NASA Astrophysics Data System (ADS)

    Govindaraju, M.; Kulkarni, Deepak; Balasubramanian, K.

    2012-11-01

    Multipurpose vacuum processing systems are cost-effective, occupy less space, combine multiple functions under one roof, and are user-friendly. A multipurpose vacuum induction system was designed, fabricated, and installed in a record time of six months at NFTDC Hyderabad. It was designed to function as (a) a vacuum induction melting/refining system for oxygen-free electronic copper and pure metals, (b) a vacuum induction melting furnace for ferrous materials, (c) a vacuum induction melting furnace for non-ferrous materials, (d) a large vacuum heat-treatment chamber with resistance heating (via a detachable coil and hot zone), (e) a bottom-discharge vacuum induction melting system for non-ferrous materials, (f) an induction heat-treatment system, and (g) a directional solidification/investment casting system. It includes provision for future capacity addition: the attachments required to manufacture multiple shaped castings and continuous cast rod can be added whenever the need arises. The present capacity, 1.2 tons of liquid copper, was chosen to cover a ten-year development path, with provision to expand to a 2-ton liquid copper handling capacity in easy steps. For ease of operational maintenance and troubleshooting, the design was divided into easily detachable sections. The high-vacuum system is likewise detachable, independent, and easily movable, a first of its kind in the country. Detailed design parameters, advantages, and the development history are presented in this paper.

  8. Challenges and opportunities for policy decisions to address health equity in developing health systems: case study of the policy processes in the Indian state of Orissa

    PubMed Central

    2011-01-01

    Introduction Achieving health equity is a pertinent need of developing health systems. Though the policy process is crucial for planning and attaining health equity, existing evidence on policy processes is scanty in this regard. This article explores the magnitude, determinants, challenges and prospects of the 'health equity approach' in various health policy processes in the Indian state of Orissa, a setting comparable with many other developing health systems. Methods A case study based on the 'Walt-Gilson Policy Triangle' employed key-informant interviews and documentary reviews. Key informants (n = 34) were selected from the departments of Health and Family Welfare, Rural Development, and Women and Child Welfare, and from civil societies. The documentary reviews covered various published and unpublished reports, policy pronouncements and articles on health equity in Orissa and similar settings. Results The 'health policy agenda' of Orissa was centered on 'health equity', envisaging affordable and equitable healthcare for all, integrated with public health interventions. However, the subsequent stages of the policy process, namely 'development, implementation and evaluation', experienced leakage of the equity approach. The impediments to a comprehensive approach towards health equity were the nexus among the national and state health priorities; the role, agenda and capacity of the actors involved; and the existing constraints of the healthcare delivery system. Conclusion The health equity approach of the policy processes was incomprehensive, often inadequately coordinated, and largely ignored the right blend of socio-medical determinants. A multi-sectoral, unified and integrated approach is required, with technical, financial and managerial resources from different actors, for a comprehensive 'health equity approach'. If carefully geared, the ongoing health sector reforms centered on sector-wide approaches, decentralization, communitization and involvement of non-state actors can

  9. Development of a user-friendly system for image processing of electron microscopy by integrating a web browser and PIONE with Eos.

    PubMed

    Tsukamoto, Takafumi; Yasunaga, Takuo

    2014-11-01

    Eos (Extensible object-oriented system) is a powerful application suite for image processing of electron micrographs. Ordinarily, Eos offers only character user interfaces (CUI) under operating systems such as OS X or Linux, which is not user-friendly: users need to be expert at image processing of electron micrographs and also need some knowledge of computer science. However, not everyone who needs Eos is an expert with a CUI. We therefore extended Eos into an OS-independent web system with a graphical user interface (GUI) by integrating a web browser. The advantage of using a web browser is not only that Eos gains a GUI, but also that Eos can work in a distributed computational environment. Using Ajax (Asynchronous JavaScript and XML) technology, we implemented a more comfortable user interface in the browser. Eos has more than 400 commands related to image processing for electron microscopy, and the usage of each command differs from the others. Since the beginning of development, Eos has managed its user interfaces through an interface-definition file, "OptionControlFile", written in CSV (comma-separated value) format: each command has an "OptionControlFile" that records the information needed to generate its interface and describe its usage. The developed GUI system, called "Zephyr" (Zone for Easy Processing of HYpermedia Resources), also reads the "OptionControlFile" and produces a web user interface automatically, because this mechanism is mature and convenient. The basic client-side functions were implemented properly and support auto-generation of web forms with execution, image preview, and file upload to a web server. Thus the system can execute Eos commands with the options unique to each command and carry out image analysis. Problems remain concerning the image file format for visualization and the workspace for analysis: the image file format information is useful to check whether the input/output file is correct and we also
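    The auto-generation mechanism described above, reading a per-command "OptionControlFile" in CSV format and emitting a web form, can be sketched as follows. The column layout (flag, type, default, description), the sample options, and the HTML produced are illustrative assumptions, not the actual Eos/Zephyr file format.

    ```python
    import csv, io

    # Hypothetical sketch of Zephyr-style form auto-generation: parse a
    # CSV "OptionControlFile" and emit an HTML form. The column layout
    # and sample options below are assumptions, not the real Eos format.

    SAMPLE = """-i,filename,required,input image
    -o,filename,required,output image
    -m,int,0,processing mode"""

    def parse_option_control(text):
        """Parse CSV rows into option dictionaries."""
        rows = csv.reader(io.StringIO(text))
        return [dict(zip(("flag", "kind", "default", "help"), [c.strip() for c in row]))
                for row in rows if row]

    def to_html_form(options):
        """Auto-generate a minimal web form, one input per command option."""
        fields = []
        for opt in options:
            widget = "file" if opt["kind"] == "filename" else "text"
            fields.append('<label>%s <input type="%s" name="%s"></label>'
                          % (opt["help"], widget, opt["flag"]))
        return "<form>\n%s\n</form>" % "\n".join(fields)

    print(to_html_form(parse_option_control(SAMPLE)))
    ```

    The same table-driven approach would let a single generator serve all 400-plus commands without hand-written forms.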

  10. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

    The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as the modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.

  11. Development of a Design Supporting System for Nano-Materials based on a Framework for Integrated Knowledge of Functioning-Manufacturing Process

    NASA Astrophysics Data System (ADS)

    Tarumi, Shinya; Kozaki, Kouji; Kitamura, Yoshinobu; Mizoguchi, Riichiro

    In recent materials research, much work aims at realizing ``functional materials'' by changing structure and/or manufacturing process with nanotechnology. However, knowledge about the relationships among function, structure and manufacturing process is not well organized, so material designers have to consider many things at the same time, and it would be very helpful to support their design process with a computer system. In this article, we discuss a conceptual design support system for nano-materials. Firstly, we consider a framework for representing the functional structures and manufacturing processes of nano-materials together with the relationships among them. We extend our former framework for representing functional knowledge, based on our investigation through discussions with experts on nano-materials. The extended framework has two features: 1) it represents functional structures and manufacturing processes comprehensively, and 2) it expresses the parameters of functions and ways together with their dependencies, because these are important for material design. Next, we describe a conceptual design support system developed on the basis of this framework, together with its functionalities. Lastly, we evaluate the utility of the system in terms of its design-support functionality: we represented two real examples of material design and then carried out an evaluation experiment on the conceptual design of a material using our system in collaboration with domain experts.

  12. Applications of text processing using natural processing system in Printer

    NASA Astrophysics Data System (ADS)

    Saito, Tadashi

    DAI NIPPON PRINTING CO., Ltd. developed a natural language processing system for automatic indexing and for assigning readable kana characters (called ruby) to kanji characters. The system is based on the automatic indexing system INDEXER, produced by NTT Communications and Information Processing Laboratories and NTT Data Communications Co., Ltd. This paper describes some applications of the system. It creates kana readings for kanji characters, which is useful for address books, name lists, and books. We have also applied the system to automatic indexing on CD-ROM.

  13. Development of an Operation Support System for the Blast Furnace in the Ironmaking Process: Large-scale Database-based Online Modeling and Integrated Simulators

    NASA Astrophysics Data System (ADS)

    Ogai, Harutoshi; Ogawa, Masatoshi; Uchida, Kenko; Matsuzaki, Shinroku; Ito, Masahiro

    In the pig-ironmaking process, factors that cause operation malfunctions have increased with both the enlargement of the blast furnace and the increasing use of low-quality ore. Therefore, an operation support system that predicts blast furnace performance is in demand. This paper reports the development of a blast furnace operation support system with an integrated simulator and “Large-scale database-based Online Modeling (LOM).” To develop the integrated simulator, a sophisticated burden distribution model is integrated with a two-dimensional total internal phenomenon model for the stationary state by using Java technology. Moreover, an integrated simulator for the partially non-stationary state is developed by modifying the two-dimensional total internal phenomenon model for the stationary state. To incorporate LOM into the operation support system, a cross-platform LOM system with general versatility was rebuilt from an existing LOM system. The operation support system thus combines the physical modeling of the simulator with the local modeling of LOM. As a result, the operation support system predicts the dynamic molten pig-iron temperature in the blast furnace and is expected to provide staff with useful information.
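    The "local modeling method" attributed to LOM above can be illustrated with a minimal database-driven predictor: for each query, fit a local model (here a simple average) over the k nearest stored operating points. The distance metric, k, and the toy operating data are illustrative assumptions, not the actual LOM algorithm.

    ```python
    # Minimal sketch of database-based local modeling in the spirit of LOM:
    # predict a target (e.g., hot-metal temperature) as the average over
    # the k nearest stored operating points. Metric, k, and the toy data
    # are illustrative assumptions.

    def predict(database, query, k=3):
        """database: list of (features, target) pairs; query: feature tuple."""
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        nearest = sorted(database, key=lambda rec: dist(rec[0], query))[:k]
        return sum(target for _, target in nearest) / k

    db = [((1.0, 0.0), 1500.0), ((2.0, 0.0), 1510.0),
          ((3.0, 0.0), 1520.0), ((9.0, 9.0), 1400.0)]
    print(predict(db, (2.0, 0.1)))  # average of the three nearest points
    ```

    A physical simulator covers regimes with little data, while a database-driven local model like this exploits dense historical records, which is the complementarity the paper describes.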

  14. Software Development Standard Processes (SDSP)

    NASA Technical Reports Server (NTRS)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.; Crean, Kathleen A.; Rinker, George C.; Smith, Thomas P.; Lum, Karen T.; Hanna, Robert A.; Erickson, Daniel E.; Gamble, Edward B., Jr.; Morgan, Scott C.; Kelsay, Michael G.; Newport, Brian J.; Lewicki, Scott A.; Stipanuk, Jeane G.; Cooper, Tonja M.; Meshkat, Leila

    2011-01-01

    A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and the Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  15. Processes and process development in Taiwan

    NASA Technical Reports Server (NTRS)

    Hwang, H. L.

    1986-01-01

    Silicon material research in the Republic of China (ROC) parallels its development in the electronic industry. A brief outline of the historical development in ROC silicon material research is given. Emphasis is placed on the recent Silane Project managed by the National Science Council, ROC, including project objectives, task forces, and recent accomplishments. An introduction is also given to industrialization of the key technologies developed in this project.

  16. Process development for production of medium chain triglycerides using immobilized lipase in a solvent-free system.

    PubMed

    Langone, Marta A P; Sant'Anna, Geraldo L

    2002-01-01

    The synthesis of tricaprylin, tricaprin, trilaurin, and trimyristin in a solvent-free system was conducted by mixing a commercial immobilized lipase with the organic reagents (glycerol and fatty acid) in a 20-mL batch reactor with constant stirring. The effects of temperature, fatty acid/glycerol molar ratio, and enzyme concentration on the reaction conversion were determined. The reactions were carried out for 26 h and the nonpolar phase was analyzed by gas chromatography. Appreciable levels of medium chain triglycerides were achieved, except for tricaprylin. The highest selectivity values for the production of triglycerides were attained under the following conditions: a fatty acid/glycerol molar ratio of 5; enzyme concentration of 5 or 9% (w/w); and temperatures of 70 degrees C (tricaprin), 80 degrees C (trilaurin), and 90 degrees C (trimyristin). After completion of the esterification reaction under these conditions, the recovery of the triglyceride and fatty acids, and the reusability of the enzyme, were studied. The unreacted fatty acid and the produced triglyceride were satisfactorily recovered. The commercial immobilized lipase was used in 10 consecutive batch reactions at 80 degrees C, with 100% selectivity in the trilaurin and trimyristin synthesis. The possibility of enzyme reuse and the recovery of residual fatty acid are relevant results that contribute to increasing the viability of the process. PMID:12018320
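    As a worked example of the reaction conditions above, the following sketch computes the mass of lauric acid needed to reach the reported fatty acid/glycerol molar ratio of 5. The molar masses are standard values; the 10 g glycerol basis is an illustrative assumption, not a charge from the paper.

    ```python
    # Back-of-envelope reagent charge for the reported fatty acid/glycerol
    # molar ratio of 5. Molar masses are standard; the 10 g glycerol basis
    # is an illustrative assumption.

    M_GLYCEROL = 92.09      # g/mol
    M_LAURIC_ACID = 200.32  # g/mol

    def acid_mass_for_ratio(glycerol_mass_g, molar_ratio):
        """Mass of fatty acid (g) giving the target acid/glycerol molar ratio."""
        mol_glycerol = glycerol_mass_g / M_GLYCEROL
        return mol_glycerol * molar_ratio * M_LAURIC_ACID

    print(round(acid_mass_for_ratio(10.0, 5), 1))
    ```

    The large excess of acid implied by a 5:1 ratio is what drives the equilibrium toward the triglyceride in a solvent-free system.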

  17. Works carried out by ZAO NPK Del'fin-Informatika on developing distributed and hybrid structures of technical means for automated control systems of process equipment at thermal power stations

    NASA Astrophysics Data System (ADS)

    Shapiro, V. I.; Chausov, Yu. N.; Borisova, E. V.; Pshenichnikova, O. A.; Tolmachev, A. L.

    2011-10-01

    The field for applying distributed structures of technical means is identified on the basis of experience gained with development of information-computation systems and fully functional automated process control systems. Functions of automated process control systems are pointed out for which centralized processing of data is preferable or necessary in order to support their speed of response and reliability. Experience gained from development of hybrid systems with centralized and distributed processing of information is presented and advisability of constructing them is shown.

  18. Research and Development of a New Power Processing Control Unit of Ion Engine System for the Super Low Altitude Test Satellite

    NASA Astrophysics Data System (ADS)

    Nagano, Hiroshi; Kajiwara, Kenichi; Osuga, Hiroyuki; Ozaki, Toshiyuki; Nakagawa, Takafumi

    JAXA has researched the Super Low Altitude Test Satellite (SLATS), which orbits the Earth at the altitude of nearly 200 km, for use in next-generation Earth observation satellites. An electric propulsion system is very useful to compensate for air drag of the SLATS since it has low thrust and long lifetime. Therefore, based on the Kiku-8 ion engine system, we started research and development of a new ion engine system to apply to the SLATS program. The SLATS ion engine system must be small and light. So, we decided to develop a power processing control unit (PPCU) which combines power supplies and a controller. According to the SLATS requirements, the performance requirements for the PPCU were determined. Then, we manufactured and tested the breadboard model of the high-voltage converter in the PPCU. The test results showed that the power efficiency is over 90 percent, which meets the performance requirements for the PPCU.

  19. Digital TV processing system

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Two digital video data compression systems directly applicable to the Space Shuttle TV Communication System are described: (1) For the uplink, a low-rate monochrome data compressor is used. The compression is achieved by using a motion detection technique in the Hadamard domain. To transform the variable source rate into a fixed rate, an adaptive rate buffer is provided. (2) For the downlink, a color data compressor is considered. The compression is achieved first by intra-color transformation of the original signal vector into a vector that has lower information entropy. Then two-dimensional data compression techniques are applied to the Hadamard-transformed components of this last vector. Mathematical models and data reliability analyses are also provided for the above video data compression techniques transmitted over a channel-coded Gaussian channel. It is shown that substantial gains can be achieved by the combination of video source and channel coding.
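    The Hadamard-domain compression described above can be sketched in one dimension: transform a sample block, discard small coefficients, and invert the transform. The block size, threshold, and one-dimensional simplification are illustrative assumptions, not the system's actual parameters.

    ```python
    # Sketch of Hadamard-domain compression: transform a block, keep only
    # large coefficients, inverse-transform. Block size and threshold are
    # illustrative assumptions.

    def hadamard(n):
        """Return the n x n Sylvester Hadamard matrix (n a power of 2)."""
        h = [[1]]
        while len(h) < n:
            h = [row + row for row in h] + [row + [-x for x in row] for row in h]
        return h

    def transform(block, h):
        n = len(block)
        return [sum(h[i][j] * block[j] for j in range(n)) for i in range(n)]

    def compress(block, threshold):
        """Zero out small Hadamard coefficients (the 'compression')."""
        coeffs = transform(block, hadamard(len(block)))
        return [c if abs(c) >= threshold else 0 for c in coeffs]

    def reconstruct(coeffs):
        n = len(coeffs)
        h = hadamard(n)
        # the Hadamard matrix is its own inverse up to a factor of n
        return [sum(h[i][j] * coeffs[j] for j in range(n)) / n for i in range(n)]

    kept = compress([8, 8, 8, 8, 4, 4, 4, 4], threshold=4)
    print(reconstruct(kept))
    ```

    A block with little detail concentrates its energy in a few coefficients, which is why thresholding in the Hadamard domain compresses well; motion detection then decides which blocks need retransmission at all.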

  20. POWER SYSTEMS DEVELOPMENT FACILITY

    SciTech Connect

    Unknown

    2002-11-01

    This report discusses test campaign GCT4 of the Kellogg Brown & Root, Inc. (KBR) transport reactor train with a Siemens Westinghouse Power Corporation (Siemens Westinghouse) particle filter system at the Power Systems Development Facility (PSDF) located in Wilsonville, Alabama. The transport reactor is an advanced circulating fluidized-bed reactor designed to operate as either a combustor or a gasifier using one of two possible particulate control devices (PCDs). The transport reactor was operated as a pressurized gasifier during GCT4. GCT4 was planned as a 250-hour test run to continue characterization of the transport reactor using a blend of several Powder River Basin (PRB) coals and Bucyrus limestone from Ohio. The primary test objectives were: Operational Stability--Characterize reactor loop and PCD operations with short-term tests by varying coal-feed rate, air/coal ratio, riser velocity, solids-circulation rate, system pressure, and air distribution. Secondary objectives included the following: Reactor Operations--Study the devolatilization and tar cracking effects from transient conditions during transition from start-up burner to coal. Evaluate the effect of process operations on heat release, heat transfer, and accelerated fuel particle heat-up rates. Study the effect of changes in reactor conditions on transient temperature profiles, pressure balance, and product gas composition. Effects of Reactor Conditions on Synthesis Gas Composition--Evaluate the effect of air distribution, steam/coal ratio, solids-circulation rate, and reactor temperature on CO/CO2 ratio, synthesis gas Lower Heating Value (LHV), carbon conversion, and cold and hot gas efficiencies. Research Triangle Institute (RTI) Direct Sulfur Recovery Process (DSRP) Testing--Provide syngas in support of the DSRP commissioning. Loop Seal Operations--Optimize loop seal operations and investigate increases to previously achieved maximum solids-circulation rate.

  1. Precision Pointing System Development

    SciTech Connect

    BUGOS, ROBERT M.

    2003-03-01

    The development of precision pointing systems has been underway in Sandia's Electronic Systems Center for over thirty years. Important areas of emphasis are synthetic aperture radars and optical reconnaissance systems. Most applications are in the aerospace arena, with host vehicles including rockets, satellites, and manned and unmanned aircraft. Systems have been used on defense-related missions throughout the world. Presently in development are pointing systems with accuracy goals in the nanoradian regime. Future activity will include efforts to dramatically reduce system size and weight through measures such as the incorporation of advanced materials and MEMS inertial sensors.

  2. Development of mild gasification process

    SciTech Connect

    Chu, C.I.C.; Derting, T.M.

    1988-07-01

    Under a previous contract with Morgantown Energy Technology Center (METC), Department of Energy (DOE) Contract No. AC21-84MC21108, UCC Research Corporation (UCCRC) built and tested a 1500 lb/day Mild Gasification Process Development Unit (MGU). The MGU, as tested under the previous contract, is shown in Figure 1. Testing completed under the previous contract showed that good quality hydrocarbon liquids and good quality char can be produced in the MGU. However, the MGU is not optimized. The primary objectives of the current project are to optimize the MGU and determine the suitability of char for several commercial applications. The program consists of four tasks; Task 1 -- Test Plan; Task 2 -- Optimization of Mild Gasification Process; Task 3 -- Evaluation of Char and Char/Coal Blends as a Boiler/Blast Furnace Fuel; and Task 4 -- Analysis of Data and Preparation of Final Report. Task 1 has been completed while work continued on Task 2.

  3. Development of mild gasification process

    SciTech Connect

    Chu, C.I.C.; Gillespie, B.L.

    1988-02-01

    Under a previous contract with Morgantown Energy Technology Center (METC), Department of Energy (DOE) Contract No. DE-AC21-84MC21108, UCC Research Corporation (UCCRC) built and tested a 1500 lb/day Mild Gasification Process Development Unit (MGU). The MGU, as tested under the previous contract, is shown in Figure 1. Testing completed under the previous contract showed that good quality hydrocarbon liquids and good quality char can be produced in the MGU. However, the MGU is not optimized. The primary objectives of the current project are to optimize the MGU and determine the suitability of char for several commercial applications. The program consists of four tasks; Task 1-Test Plan; Task 2-Optimization of Mild Gasification Process; Task 3-Evaluation of Char and Char/Coal Blends as a Boiler/Blast Furnace Fuel; and Task 4-Analysis of Data and Preparation of Final Report. Task 1 has been completed while work continued on Task 2.

  4. Development of mild gasification process

    SciTech Connect

    Chu, C.I.C.; Williams, S.W.

    1989-01-01

    Under a previous contract with Morgantown Energy Technology Center (METC), Department of Energy (DOE) Contract No. AC21-84MC21108, UCC Research Corporation (UCCRC) built and tested a 1500 lb/day Mild Gasification Process Development Unit (MGU). The MGU, as tested under the previous contract, is shown in Figure 1. Testing completed under the previous contract showed that good quality hydrocarbon liquids and good quality char can be produced in the MGU. However, the MGU is not optimized. The primary objectives of the current project are to optimize the MGU and determine the suitability of char for several commercial applications. The program consists of four tasks; Task 1 -- Test Plan; Task 2 -- Optimization of Mild Gasification Process; Task 3 -- Evaluation of Char and Char/Coal Blends as a Boiler/Blast Furnace Fuel; and Task 4 -- Analysis of Data and Preparation of Final Report. Task 1 has been completed while work continued on Task 2.

  5. Development of mild gasification process

    SciTech Connect

    Chu, C.I.C.; Gillespie, B.L.

    1987-11-01

    Under a previous contract with Morgantown Energy Technology Center (METC), Department of Energy (DOE) Contract No. AC21-84MC21108, UCC Research Corporation (UCCRC) built and tested a 1500 lb/day Mild Gasification Process Development Unit (MGU). The MGU, as tested under the previous contract, is shown in Figure 1. Testing completed under the previous contract showed that good quality hydrocarbon liquids and good quality char can be produced in the MGU. However, the MGU is not optimized. The primary objectives of the current project are to optimize the MGU and determine the suitability of char for several commercial applications. The program consists of four tasks; Task 1 -- Test Plan; Task 2 -- Optimization of Mild Gasification Process; Task 3 -- Evaluation of Char and Char/Coal Blends as a Boiler/Blast Furnace Fuel; and Task 4 -- Analysis of Data and Preparation of Final Report. Task 1 has been completed while work continued on Task 2.

  6. Process gas solidification system

    DOEpatents

    Fort, William G. S.; Lee, Jr., William W.

    1978-01-01

    It has been the practice to (a) withdraw hot, liquid UF6 from various systems, (b) direct the UF6 into storage cylinders, and (c) transport the filled cylinders to another area where the UF6 is permitted to solidify by natural cooling. However, some hazard attends the movement of cylinders containing liquid UF6, which is dense, toxic, and corrosive. As illustrated in terms of one of its applications, the invention is directed to withdrawing hot liquid UF6 from a system including (a) a compressor for increasing the pressure and temperature of a stream of gaseous UF6 to above its triple point and (b) a condenser for liquefying the compressed gas. A network containing block valves and at least first and second portable storage cylinders is connected between the outlet of the condenser and the suction inlet of the compressor. After an increment of liquid UF6 from the condenser has been admitted to the first cylinder, the cylinder is connected to the suction of the compressor to flash off UF6 from the cylinder, thus gradually solidifying UF6 therein. While the first cylinder is being cooled in this manner, an increment of liquid UF6 from the condenser is transferred into the second cylinder. UF6 then is flashed from the second cylinder while another increment of liquid UF6 is being fed to the first. The operations are repeated until both cylinders are filled with solid UF6, after which they can be moved safely. As compared with the previous technique, this procedure is safer, faster, and more economical. The method also provides the additional advantage of removing volatile impurities from the UF6 while it is being cooled.
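    The alternating fill/flash bookkeeping in this record can be sketched as a simple schedule: while one cylinder receives liquid UF6 from the condenser, the previously filled cylinder is vented to the compressor suction to flash vapor and solidify its contents. The cylinder labels and increment count are illustrative.

    ```python
    # Sketch of the alternating two-cylinder fill/flash schedule: the
    # cylinder filled in the previous increment flashes while the other
    # fills. Labels and increment count are illustrative.

    def schedule(increments):
        """Return (filling, flashing) cylinder labels for each increment."""
        steps = []
        flashing = None  # nothing to flash before the first fill
        for i in range(increments):
            filling = "A" if i % 2 == 0 else "B"
            steps.append((filling, flashing))
            flashing = filling  # this cylinder flashes during the next fill
        return steps

    print(schedule(4))
    ```

    The point of the overlap is that cooling (flashing) of one cylinder proceeds in parallel with filling of the other, so no cylinder ever moves while its contents are liquid.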

  7. Developing Software Requirements for a Knowledge Management System That Coordinates Training Programs with Business Processes and Policies in Large Organizations

    ERIC Educational Resources Information Center

    Kiper, J. Richard

    2013-01-01

    For large organizations, updating instructional programs presents a challenge to keep abreast of constantly changing business processes and policies. Each time a process or policy changes, significant resources are required to locate and modify the training materials that convey the new content. Moreover, without the ability to track learning…

  8. Development of an Integrated Multi-Contaminant Removal Process Applied to Warm Syngas Cleanup for Coal-Based Advanced Gasification Systems

    SciTech Connect

    Meyer, Howard

    2010-11-30

    This project met the objective to further the development of an integrated multi-contaminant removal process in which H2S, NH3, HCl and heavy metals including Hg, As, Se and Cd present in the coal-derived syngas can be removed to specified levels in a single/integrated process step. The process supports the mission and goals of the Department of Energy's Gasification Technologies Program, namely to enhance the performance of gasification systems, thus enabling U.S. industry to improve the competitiveness of gasification-based processes. The gasification program will reduce equipment costs, improve process environmental performance, and increase process reliability and flexibility. Two sulfur conversion concepts were tested in the laboratory under this project, i.e., the solvent-based, high-pressure University of California Sulfur Recovery Process (UCSRP-HP) and the catalytic, direct oxidation (DO) section of the CrystaSulf-DO process. Each process required a polishing unit to meet the ultra-clean sulfur content goal of <50 ppbv (parts per billion by volume), as may be necessary for fuel cell or chemical production applications. UCSRP-HP was also tested for the removal of trace, non-sulfur contaminants, including ammonia, hydrogen chloride, and heavy metals. A bench-scale unit was commissioned and limited testing was performed with simulated syngas. Aspen-Plus®-based computer simulation models were prepared and the economics of the UCSRP-HP and CrystaSulf-DO processes were evaluated for a nominal 500 MWe, coal-based, IGCC power plant with carbon capture. This report covers the progress on the UCSRP-HP technology development and the CrystaSulf-DO technology.

  9. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes attracts increasing attention: it can ensure safety and provide products of consistent quality, so the design of a simulation system for batch-process fault diagnosis is of great significance. In this paper, penicillin fermentation, a typical nonlinear, dynamic, multi-stage batch production process, is taken as the research object. A visual, human-machine interactive simulation software system based on the Windows operating system was developed. The simulation system provides an effective platform for research on batch-process fault diagnosis.
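    A batch simulator of the kind described can be illustrated with a toy model: Euler integration of logistic biomass growth with growth-linked product formation. The rate constants and model form are illustrative assumptions, not the penicillin fermentation model used in the paper.

    ```python
    # Toy batch-process simulator: Euler integration of logistic biomass
    # growth with growth-linked product formation. All constants are
    # illustrative, not the paper's penicillin model.

    def simulate(hours, dt=0.1, mu=0.1, x_max=10.0, y_p=0.2):
        """Return (biomass, product) in g/L after the given batch time."""
        x, p = 0.1, 0.0  # initial biomass and product
        for _ in range(int(hours / dt)):
            growth = mu * x * (1.0 - x / x_max)  # logistic growth rate
            x += dt * growth
            p += dt * y_p * growth               # product tied to growth
        return x, p

    x, p = simulate(100)
    print(round(x, 2), round(p, 2))
    ```

    A fault-diagnosis study would inject disturbances (e.g., a step change in `mu`) into such a simulator and check whether the monitoring scheme flags the deviation.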

  10. Process Development for Nanostructured Photovoltaics

    SciTech Connect

    Elam, Jeffrey W.

    2015-01-01

    Photovoltaic manufacturing is an emerging industry that promises a carbon-free, nearly limitless source of energy for our nation. However, the high-temperature manufacturing processes used for conventional silicon-based photovoltaics are extremely energy-intensive and expensive. This high cost imposes a critical barrier to the widespread implementation of photovoltaic technology. Argonne National Laboratory and its partners recently invented new methods for manufacturing nanostructured photovoltaic devices that allow dramatic savings in materials, process energy, and cost. These methods are based on atomic layer deposition, a thin film synthesis technique that has been commercialized for the mass production of semiconductor microelectronics. The goal of this project was to develop these low-cost fabrication methods for the high efficiency production of nanostructured photovoltaics, and to demonstrate these methods in solar cell manufacturing. We achieved this goal in two ways: 1) we demonstrated the benefits of these coatings in the laboratory by scaling-up the fabrication of low-cost dye sensitized solar cells; 2) we used our coating technology to reduce the manufacturing cost of solar cells under development by our industrial partners.

  11. Development of a parallel detection and processing system using a multidetector array for wave field restoration in scanning transmission electron microscopy

    SciTech Connect

    Taya, Masaki; Matsutani, Takaomi; Ikuta, Takashi; Saito, Hidekazu; Ogai, Keiko; Harada, Yoshihito; Tanaka, Takeo; Takai, Yoshizo

    2007-08-15

    A parallel image detection and image processing system for scanning transmission electron microscopy was developed using a multidetector array consisting of a multianode photomultiplier tube arranged in an 8 x 8 square array. The system enables 64 images to be acquired simultaneously from different scattering directions with a scanning time of 2.6 s. Using the 64 images, phase and amplitude contrast images of gold particles on an amorphous carbon thin film could be separately reconstructed by applying a respective 8-shaped bandpass Fourier filter to each image and multiplying by the phase and amplitude reconstructing factors.
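
    The per-image processing described above combines a bandpass Fourier filter with complex reconstruction factors. As a generic illustration only (not the authors' code; the simple annular filter geometry here stands in for the paper's 8-shaped filters, and all sizes and parameters are assumptions), the following NumPy sketch applies a circular bandpass mask in Fourier space:

```python
import numpy as np

def bandpass_filter(image, r_low, r_high):
    """Keep only spatial frequencies whose radius (in pixels of the
    centered 2D FFT) lies in [r_low, r_high]; zero everything else."""
    f = np.fft.fftshift(np.fft.fft2(image))
    ny, nx = image.shape
    y, x = np.ogrid[:ny, :nx]
    r = np.hypot(y - ny // 2, x - nx // 2)   # radius of each frequency bin
    mask = (r >= r_low) & (r <= r_high)
    return np.real(np.fft.ifft2(np.fft.ifftshift(f * mask)))

# A pure low-frequency pattern (4 cycles across 64 px) passes a
# 0-8 px band unchanged, but is removed entirely by a 6-8 px band.
img = np.cos(2 * np.pi * np.arange(64) / 16)[None, :] * np.ones((64, 1))
passed = bandpass_filter(img, 0, 8)
blocked = bandpass_filter(img, 6, 8)
```

    In the actual reconstruction, one such filter would be applied per detector image before the filtered images are weighted by the phase and amplitude reconstructing factors and summed.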

  12. Development of a parallel detection and processing system using a multidetector array for wave field restoration in scanning transmission electron microscopy.

    PubMed

    Taya, Masaki; Matsutani, Takaomi; Ikuta, Takashi; Saito, Hidekazu; Ogai, Keiko; Harada, Yoshihito; Tanaka, Takeo; Takai, Yoshizo

    2007-08-01

    A parallel image detection and image processing system for scanning transmission electron microscopy was developed using a multidetector array consisting of a multianode photomultiplier tube arranged in an 8 x 8 square array. The system enables 64 images to be acquired simultaneously from different scattering directions with a scanning time of 2.6 s. Using the 64 images, phase and amplitude contrast images of gold particles on an amorphous carbon thin film could be separately reconstructed by applying a respective 8-shaped bandpass Fourier filter to each image and multiplying by the phase and amplitude reconstructing factors. PMID:17764327

  13. Development of a coal-fired combustion system for industrial process heating applications. Phase 3 final report, November 1992--December 1994

    SciTech Connect

    1995-09-26

    A three-phase research and development program has resulted in the development and commercialization of a Cyclone Melting System (CMS™), capable of being fueled by pulverized coal, natural gas, and other solid, gaseous, or liquid fuels, for the vitrification of industrial wastes. The Phase 3 research effort focused on the development of a process heater system to be used for producing value-added glass products from the vitrification of boiler/incinerator ashes and industrial wastes. The primary objective of the Phase 3 project was to develop and integrate all the system components, from fuel through total system controls, and then test the complete system in order to evaluate its potential for successful commercialization. The demonstration test consisted of one run with a duration of 105 hours, approximately one-half of which (46 hours) was performed with coal as the primary fuel source (70% to 100%) and the other half with natural gas. Approximately 50 hours of melting operation were performed, vitrifying approximately 50,000 lbs of a coal-fired utility boiler flyash/dolomite mixture and producing a fully-reacted vitrified product.

  14. A Study to Develop an Industrial-Scale, Computer-Controlled High Magnetic Field Processing (HMFP) System to Assist in Commercializing the Novel, Enabling HMFP Manufacturing Technology

    SciTech Connect

    Ludtka, Gail Mackiewicz-; Chourey, Aashish

    2010-08-01

    As the original magnet designer and manufacturer of ORNL's 9 T, 5-inch-ID bore magnet, American Magnetics Inc. (AMI) has collaborated with ORNL's Materials Processing Group, and this partnership has been instrumental in the development of our unique thermo-magnetic facilities and expertise. Consequently, AMI and ORNL have realized that the commercial implementation of the High Magnetic Field Processing (HMFP) technology will require the evolution of robust, automated superconducting (SC) magnet systems that will be cost-effective and easy to operate in an industrial environment. The goal of this project and CRADA is to significantly expedite the timeline for implementing this revolutionary and pervasive cross-cutting technology for future US-produced industrial components. The successful completion of this project is anticipated to significantly assist in the timely commercialization and licensing of our HMFP intellectual property for a broad spectrum of industries, and to open up a new market for AMI. One notable outcome of this project is that the ThermoMagnetic Processing Technology won a prestigious 2009 R&D 100 Award, which recognizes our TMP Technology as one of the top 100 innovative US technologies in 2009. By successfully establishing the design requirements for a commercial-scale magnetic processing system, this project has accomplished a key first step in facilitating the building and demonstration of a superconducting magnetic processing coil, enabling the transition of the High Magnetic Field Processing Technology beyond a laboratory novelty into a commercially viable and industrially scalable manufacturing technology.

  15. SIMULATION OF HEAT AND MASS TRANSFER PROCESSES IN A SURROGATE BRONCHIAL SYSTEM DEVELOPED FOR HYGROSCOPIC AEROSOL STUDIES

    EPA Science Inventory

    A surrogate tracheobronchial (TB) system, capable of simulating the in vivo atmosphere (i.e., temperature and relative humidity) in a physiologically realistic manner, is reported here. This surrogate Toxicology Branch system is a tubular, multicomponent physical model where the a...

  16. Cascade Distillation System Development

    NASA Technical Reports Server (NTRS)

    Callahan, Michael R.; Sargushingh, Miriam; Shull, Sarah

    2014-01-01

    NASA's Advanced Exploration Systems (AES) Life Support System (LSS) Project is chartered with developing advanced life support systems that will enable NASA human exploration beyond low Earth orbit (LEO). The goal of AES is to increase the affordability of long-duration life support missions, and to reduce the risk associated with integrating and infusing new enabling technologies required to ensure mission success. Because of the robust nature of distillation systems, the AES LSS Project is pursuing development of the Cascade Distillation Subsystem (CDS) as part of its technology portfolio. Currently, the system is being developed into a flight-forward Generation 2.0 design.

  17. Development of a continuous roll-to-roll processing system for mass production of plastic optical film

    NASA Astrophysics Data System (ADS)

    Chang, Chih-Yuan; Tsai, Meng-Hsun

    2015-12-01

    This paper reports a highly effective method for the mass production of large-area plastic optical films with a microlens array pattern, based on a continuous roll-to-roll film extrusion and roller embossing process. In this study, a thin steel mold with a micro-circular hole array pattern was fabricated by photolithography and a wet chemical etching process. The thin steel mold was then wrapped onto a metal cylinder to form an embossing roller mold. During roll-to-roll operation, a thermoplastic raw material (polycarbonate grains) was fed into the barrel of the plastic extruder, which was fitted with a flat T-die. The molten polymer film was extruded and immediately pressed against the surface of the embossing roller mold. Under the proper processing conditions, the molten polymer only partially fills the micro-circular holes of the mold and, due to surface tension, forms a convex lens surface. A continuous plastic optical film with a microlens array pattern was thus obtained. Experiments were carried out to investigate the effect of plastic microlens formation on the roll-to-roll process. Finally, the geometrical and optical properties of the fabricated plastic optical film were measured and proved satisfactory. This technique shows great potential for the mass production of large-area plastic optical films with a microlens array pattern.

  18. The Process of Systemic Change

    ERIC Educational Resources Information Center

    Duffy, Francis M.; Reigeluth, Charles M.; Solomon, Monica; Caine, Geoffrey; Carr-Chellman, Alison A.; Almeida, Luis; Frick, Theodore; Thompson, Kenneth; Koh, Joyce; Ryan, Christopher D.; DeMars, Shane

    2006-01-01

    This paper presents several brief papers about the process of systemic change. These are: (1) Step-Up-To-Excellence: A Protocol for Navigating Whole-System Change in School Districts by Francis M. Duffy; (2) The Guidance System for Transforming Education by Charles M. Reigeluth; (3) The Schlechty Center For Leadership In School Reform by Monica…

  19. TECHNOLOGY DEVELOPMENT AND DEPLOYMENT OF SYSTEMS FOR THE RETRIEVAL AND PROCESSING OF REMOTE-HANDLED SLUDGE FROM HANFORD K-WEST FUEL STORAGE BASIN

    SciTech Connect

    RAYMOND RE

    2011-12-27

    In 2011, significant progress was made in developing and deploying technologies to remove, transport, and interim store remote-handled sludge from the 105-K West Fuel Storage Basin on the Hanford Site in south-central Washington State. The sludge in the 105-K West Basin is an accumulation of degraded spent nuclear fuel and other debris that collected during long-term underwater storage of the spent fuel. In 2010, an innovative, remotely operated retrieval system was used to successfully retrieve over 99.7% of the radioactive sludge from 10 submerged temporary storage containers in the K West Basin. In 2011, a full-scale prototype facility was completed for use in technology development, design qualification testing, and operator training on systems used to retrieve, transport, and store highly radioactive K Basin sludge. In this facility, three separate systems for characterizing, retrieving, pretreating, and processing remote-handled sludge were developed. Two of these systems were successfully deployed in 2011. One of these systems was used to pretreat knockout pot sludge as part of the 105-K West Basin cleanup. Knockout pot sludge contains pieces of degraded uranium fuel ranging in size from 600 µm to 6350 µm mixed with pieces of inert material, such as aluminum wire and graphite, in the same size range. The 2011 pretreatment campaign successfully removed most of the inert material from the sludge stream and significantly reduced the remaining volume of knockout pot product material. Removing the inert material significantly minimized the waste stream and reduced costs by reducing the number of transportation and storage containers. Removing the inert material also improved worker safety by reducing the number of remote-handled shipments. Also in 2011, technology development and final design were completed on the system to remove knockout pot material from the basin and transport the material to an onsite facility for interim storage. This system is

  20. Development of a portable hyperspectral imaging system for monitoring the efficacy of sanitation procedures in food processing facilities

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Cleaning and sanitation in food processing facilities is a critical step in reducing the risk of transfer of pathogenic organisms to food consumed by the public. Current methods to check the effectiveness of sanitation procedures rely on visual observation and sub-sampling tests such as ATP biolumin...

  1. The Beady Eye of the Professional Development Appraisal System: A Foucauldian Cross-Case Analysis of the Teacher Evaluation Process

    ERIC Educational Resources Information Center

    Torres, Dalia

    2012-01-01

    The purpose of this deconstructive case study was to conduct a Foucauldian power/knowledge analysis constructed from the perceptions of three teachers at an intermediate school in South Texas regarding the role of the teacher evaluation process and its influence on instructional practices. Using Foucault's (1977a) work on power/knowledge, of…

  2. ECOSTATIC CANE PROCESSING SYSTEM PROTOTYPE PHASE

    EPA Science Inventory

    The overall objective of this project was to demonstrate a systems environmental management approach, from field to final product, for the processing of raw cane sugar. Specific sub-systems which were to be developed and demonstrated as part of this systems approach were: (a) har...

  3. Open source clinical portals: a model for healthcare information systems to support care processes and feed clinical research. An Italian case of design, development, reuse, and exploitation.

    PubMed

    Locatelli, Paolo; Baj, Emanuele; Restifo, Nicola; Origgi, Gianni; Bragagia, Silvia

    2011-01-01

    Open source is a still-unexploited opportunity for healthcare organizations and technology providers to answer a growing demand for innovation and to combine economic benefits with a new way of managing hospital information systems. This chapter presents the case of the web enterprise clinical portal developed in Italy by Niguarda Hospital in Milan with the support of Fondazione Politecnico di Milano, to enable a paperless environment for clinical and administrative activities in the ward. This also represents a rare case of open source technology and reuse in the healthcare sector, as the system's porting is now taking place at the Besta Neurological Institute in Milan. This institute is customizing the portal to feed researchers with structured clinical data collected in its portal's patient records, so that they can be analyzed, e.g., through business intelligence tools. Both organizational and clinical advantages are investigated, from process monitoring, to semantic data structuring, to recognition of common patterns in care processes. PMID:21431608

  4. An integrated health care system's approach to development of a process to collect patient functional outcomes on total joint replacement procedures.

    PubMed

    Topel, Amy M; Schini, Cynthia A

    2014-01-01

    Health care organizations are challenged to find ways to measure not only process of care but also outcomes of care. Gundersen Health System's Orthopaedic Surgery Department in the La Crosse, Wisconsin area developed a process to collect outcomes of care for patients having hip or knee arthroplasty procedures and planned to use these data to determine impact on patients' lives. The Hip Osteoarthritis Outcomes Score and Knee Osteoarthritis Outcomes Score, adapted from the widely used Western Ontario and McMaster Universities Osteoarthritis Index, were collected preoperatively and at 1 year postoperatively. From these data, the health system determined that patients were experiencing significant improvement in 4 of 5 scales. Further recommendations include evaluating the impact of patients' age, sex, and preoperative body mass index on outcomes, as well as evaluating the impact of more patient involvement in goal setting on recovery time and functional outcomes. PMID:23687238

  5. Development of data processing, interpretation and analysis system for the remote sensing of trace atmospheric gas species

    NASA Technical Reports Server (NTRS)

    Casas, Joseph C.; Saylor, Mary S.; Kindle, Earl C.

    1987-01-01

    The major emphasis is on the advancement of remote sensing technology. In particular, the gas filter correlation radiometer (GFCR) technique was applied to the measurement of trace gas species, such as carbon monoxide (CO), from airborne and Earth orbiting platforms. Through a series of low altitude aircraft flights, high altitude aircraft flights, and orbiting space platform flights, data were collected and analyzed, culminating in the first global map of carbon monoxide concentration in the middle troposphere and stratosphere. The four major areas of this remote sensing program, known as the Measurement of Air Pollution from Satellites (MAPS) experiment, are: (1) data acquisition, (2) data processing, analysis, and interpretation algorithms, (3) data display techniques, and (4) information processing.

  6. Launch processing system concept to reality

    NASA Technical Reports Server (NTRS)

    Bailey, W. W.

    1985-01-01

    The Launch Processing System represents Kennedy Space Center's role in providing a major integrated hardware and software system for the test, checkout, and launch of a new space vehicle. Past programs considered the active flight-vehicle-to-ground interfaces part of the flight systems, and the related ground system was therefore provided by the Development Center. The major steps taken to transform the Launch Processing System from a concept to reality with the successful launches of the Shuttle Program's Space Transportation System are addressed.

  7. Thermal processing systems for TRU mixed waste

    SciTech Connect

    Eddy, T.L.; Raivo, B.D.; Anderson, G.L.

    1992-08-01

    This paper presents preliminary ex situ thermal processing system concepts and related processing considerations for remediation of transuranic (TRU)-contaminated wastes (TRUW) buried at the Radioactive Waste Management Complex (RWMC) of the Idaho National Engineering Laboratory (INEL). Anticipated waste stream components and problems are considered. Thermal processing conditions required to obtain a high-integrity, low-leachability glass/ceramic final waste form are considered. Five practical thermal process system designs are compared. Thermal processing of mixed waste and soils with essentially no presorting and using incineration followed by high temperature melting is recommended. Applied research and development necessary for demonstration is also recommended.

  8. Thermal processing systems for TRU mixed waste

    SciTech Connect

    Eddy, T.L.; Raivo, B.D.; Anderson, G.L.

    1992-01-01

    This paper presents preliminary ex situ thermal processing system concepts and related processing considerations for remediation of transuranic (TRU)-contaminated wastes (TRUW) buried at the Radioactive Waste Management Complex (RWMC) of the Idaho National Engineering Laboratory (INEL). Anticipated waste stream components and problems are considered. Thermal processing conditions required to obtain a high-integrity, low-leachability glass/ceramic final waste form are considered. Five practical thermal process system designs are compared. Thermal processing of mixed waste and soils with essentially no presorting and using incineration followed by high temperature melting is recommended. Applied research and development necessary for demonstration is also recommended.

  9. TOUGH2Biot - A simulator for coupled thermal-hydrodynamic-mechanical processes in subsurface flow systems: Application to CO2 geological storage and geothermal development

    NASA Astrophysics Data System (ADS)

    Lei, Hongwu; Xu, Tianfu; Jin, Guangrong

    2015-04-01

    Coupled thermal-hydrodynamic-mechanical processes have become increasingly important in studying the issues affecting subsurface flow systems, such as CO2 sequestration in deep saline aquifers and geothermal development. In this study, a mechanical module based on the extended Biot consolidation model was developed and incorporated into the well-established thermal-hydrodynamic simulator TOUGH2, resulting in an integrated numerical THM simulation program, TOUGH2Biot. A finite element method was employed to discretize space for the rock mechanical calculation, and the Mohr-Coulomb failure criterion was used to determine whether the rock undergoes shear-slip failure. Mechanics is partly coupled with the thermal-hydrodynamic processes and gives feedback to flow through stress-dependent porosity and permeability. TOUGH2Biot was verified against analytical solutions for 1D Terzaghi consolidation and cooling-induced subsidence. TOUGH2Biot was applied to evaluate the thermal, hydrodynamic, and mechanical responses of CO2 geological sequestration at the Ordos CCS Demonstration Project, China and geothermal exploitation at The Geysers geothermal field, California. The results demonstrate that TOUGH2Biot is capable of analyzing change in pressure and temperature, displacement, stress, and potential shear-slip failure caused by large-scale underground man-made activity in subsurface flow systems. TOUGH2Biot can also be easily extended for complex coupled process problems in fractured media and be conveniently updated to parallel versions on different platforms to take advantage of high-performance computing.
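
    The Mohr-Coulomb shear-slip check mentioned above can be written compactly. The following is a minimal standalone sketch of the standard criterion, not TOUGH2Biot code; the compression-positive sign convention, the principal-stress form, and the example numbers are all assumptions:

```python
import math

def mohr_coulomb_fails(sigma1, sigma3, cohesion, friction_angle_deg):
    """Return True if a stress state (sigma1 >= sigma3, compression
    positive) violates the Mohr-Coulomb criterion.

    Failure occurs when the largest Mohr circle touches the envelope
    tau = c + sigma_n * tan(phi), i.e. when
    (sigma1 - sigma3)/2 >= c*cos(phi) + (sigma1 + sigma3)/2 * sin(phi).
    """
    phi = math.radians(friction_angle_deg)
    shear = (sigma1 - sigma3) / 2.0    # radius of the Mohr circle
    normal = (sigma1 + sigma3) / 2.0   # center of the Mohr circle
    return shear >= cohesion * math.cos(phi) + normal * math.sin(phi)

# A mildly deviatoric state (MPa) well inside the envelope: no slip.
fails = mohr_coulomb_fails(10.0, 8.0, cohesion=2.0, friction_angle_deg=30.0)
```

    In a coupled simulation, sigma1 and sigma3 would be effective principal stresses recomputed each time step from the evolving pressure and temperature fields, so rising pore pressure can push a previously stable state onto the failure envelope.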

  10. Power Systems Development Facility

    SciTech Connect

    Southern Company Services

    2009-01-31

    In support of technology development to utilize coal for efficient, affordable, and environmentally clean power generation, the Power Systems Development Facility (PSDF), located in Wilsonville, Alabama, has routinely demonstrated gasification technologies using various types of coals. The PSDF is an engineering scale demonstration of key features of advanced coal-fired power systems, including a Transport Gasifier, a hot gas particulate control device, advanced syngas cleanup systems, and high-pressure solids handling systems. This final report summarizes the results of the technology development work conducted at the PSDF through January 31, 2009. Twenty-one major gasification test campaigns were completed, for a total of more than 11,000 hours of gasification operation. This operational experience has led to significant advancements in gasification technologies.

  11. An ecological vegetation-activated sludge process (V-ASP) for decentralized wastewater treatment: system development, treatment performance, and mathematical modeling.

    PubMed

    Yuan, Jiajia; Dong, Wenyi; Sun, Feiyun; Li, Pu; Zhao, Ke

    2016-05-01

    An environment-friendly decentralized wastewater treatment process comprising an activated sludge process (ASP) and wetland vegetation, named the vegetation-activated sludge process (V-ASP), was developed for decentralized wastewater treatment. The long-term experimental results evidenced that the vegetation sequencing batch reactor (V-SBR) process had consistently stable, higher removal efficiencies of organic substances and nutrients from domestic wastewater compared with a traditional sequencing batch reactor (SBR). The vegetation allocated into the V-SBR system could not only remove nutrients through vegetation transpiration but also provide a large surface area for enhanced microorganism activity. This high vegetation transpiration ratio improved nutrient removal from wastewater mainly through flux enhancement, accelerated oxygen and substrate transport, and stimulated vegetation respiration. A mathematical model based on ASM2d was successfully established by incorporating the specific function of vegetation to simulate system performance. The simulation results on the influence of operational parameters on V-ASP treatment effectiveness demonstrated that V-SBR had a high resistance to seasonal temperature fluctuations and influent loading shocks. PMID:26880524

  12. Hybrid systems process mixed wastes

    SciTech Connect

    Chertow, M.R.

    1989-10-01

    Some technologies, developed recently in Europe, combine several processes to separate and reuse materials from solid waste. These plants generally have in common that they are reasonably small, have a composting component for the organic portion, and often have a refuse-derived fuel component for combustible waste. Many European communities also have very effective drop-off center programs for recyclables such as bottles and cans. By maintaining the integrity of several different fractions of the waste, there is less to landfill and less to burn. The importance of these hybrid systems is that they introduce in one plant an approach that encompasses the key concept of today's solid waste planning: recover as much as possible and landfill as little as possible. The plants also introduce various risks, particularly of finding secure markets. There are a number of companies offering various combinations of materials recovery, composting, and waste combustion. Four examples are included: multiple materials recovery and refuse-derived fuel production in Eden Prairie, Minnesota; multiple materials recovery, composting, and refuse-derived fuel production in Perugia, Italy; composting, refuse-derived fuel, and gasification in Tolmezzo, Italy; and a front-end system on a mass-burning waste-to-energy plant in Neuchatel, Switzerland.

  13. Development of a Versatile Laser Ultrasonic System and Application to On-Line Measurement for Process Control of Wall Thickness and Eccentrictiy of Steel Seamless Mechanical Tubing

    SciTech Connect

    Kisner, R.A.; Kercel, S.W.; Damiano, B.; Bingham, P.R.; Gee, T.F.; Tucker, R.W.; Moore, M.R.; Hileman, M.; Emery, M.; Lenarduzzi, R.; Hardy, J.E.; Weaver, K.; Crutcher, R.; Kolarik, R.V., II; Vandervaart, R.H.

    2002-04-24

    Researchers at the Timken Company conceived a project to develop an on-line instrument for wall thickness measurement of steel seamless mechanical tubing based on laser ultrasonic technology. The instrument, which has been installed and tested at a piercing mill, provides data on tube eccentricity and concentricity. Such measurements permit fine-tuning of manufacturing processes to eliminate excess material in the tube wall and therefore provide a more precisely dimensioned product for their customers. The resulting process energy savings are substantial, as is the lowered environmental burden. The expected savings are $85.8 million per year in seamless mechanical tube piercing alone. Applied across the industry, this measurement has a potential of reducing energy consumption by 6 x 10^12 BTU per year, greenhouse gas emissions by 0.3 million metric tons carbon equivalent per year, and toxic waste by 0.255 million pounds per year. The principal technical contributors to the project were the Timken Company, the Industrial Materials Institute (IMI, a contractor to Timken), and Oak Ridge National Laboratory (ORNL). Timken provided mill access as well as process and metallurgical understanding. Timken researchers had previously developed the fundamental ultrasonic analysis methods on which this project is based. IMI developed and fabricated the laser ultrasonic generation and receiver systems. ORNL developed Bayesian and wavelet-based real-time signal processing and spread-spectrum wireless communication, and explored feature extraction and pattern recognition methods. The resulting instrument has successfully measured production tubes at one of Timken's piercing mills. This report concentrates on ORNL's contribution through the CRADA mechanism. The three components of ORNL's contribution were met with mixed success. The real-time signal-processing task accomplished its goal of improvement in detecting time-of-flight information with a minimum of false data. The signal processing

  14. Software Development to Assist in the Processing and Analysis of Data Obtained Using Fiber Bragg Grating Interrogation Systems

    NASA Technical Reports Server (NTRS)

    Hicks, Rebecca

    2009-01-01

    A fiber Bragg grating is a portion of a core of a fiber optic strand that has been treated to affect the way light travels through the strand. Light within a certain narrow range of wavelengths will be reflected along the fiber by the grating, while light outside that range will pass through the grating mostly undisturbed. Since the range of wavelengths that can penetrate the grating depends on the grating itself as well as temperature and mechanical strain, fiber Bragg gratings can be used as temperature and strain sensors. This capability, along with the light-weight nature of the fiber optic strands in which the gratings reside, make fiber optic sensors an ideal candidate for flight testing and monitoring in which temperature and wing strain are factors. The purpose of this project is to research the availability of software capable of processing massive amounts of data in both real-time and post-flight settings, and to produce software segments that can be integrated to assist in the task as well.
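
    The dependence of the reflected wavelength on the grating itself, temperature, and strain described above is commonly modeled by the Bragg condition lambda_B = 2 * n_eff * Lambda together with a first-order fractional shift (1 - p_e) * strain + (alpha + xi) * dT. The sketch below is illustrative only and is not the project's software; the coefficient values are typical literature numbers for silica fiber, assumed here:

```python
def bragg_wavelength(n_eff, grating_period_nm):
    """Bragg condition: lambda_B = 2 * n_eff * Lambda."""
    return 2.0 * n_eff * grating_period_nm

def wavelength_shift_nm(lambda_b_nm, strain, delta_t_c,
                        p_e=0.22, alpha=0.55e-6, xi=6.7e-6):
    """First-order FBG response to axial strain and temperature change.

    d(lambda)/lambda = (1 - p_e)*strain + (alpha + xi)*dT, where p_e is
    the effective photo-elastic coefficient, alpha the thermal expansion
    coefficient, and xi the thermo-optic coefficient (typical
    silica-fiber values assumed as defaults).
    """
    return lambda_b_nm * ((1.0 - p_e) * strain + (alpha + xi) * delta_t_c)

lam = bragg_wavelength(1.468, 529.0)            # ~1553 nm center wavelength
shift = wavelength_shift_nm(lam, strain=1e-6, delta_t_c=0.0)
```

    With these assumed coefficients the model predicts roughly 1.2 pm per microstrain and about 11 pm per degree Celsius near 1550 nm, which is why an interrogation system must resolve picometer-scale shifts across many gratings in real time.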

  15. Low-Cost Solar Array Project. Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process. Quarterly progress report, October-December 1980

    SciTech Connect

    Not Available

    1980-01-01

    Progress is reported on the engineering design, fabrication, assembly, operation, economic analysis, and process support R&D for an Experimental Process System Development Unit (EPSDU) for producing semiconductor-grade silicon using the silane-to-silicon process. Most of the process-related equipment has been ordered and is being fabricated. Equipment and building foundations have been completed at the EPSDU site, and all the steel was erected for the gantry. The switchgear/control building and the melter building will be completed during the next quarter. The data collection system design is progressing. Various computer programs are being written to convert electrical, pneumatic, and other raw signals into engineering values. The free-space reactor development work was completed with a final 12-hour run in which the free-space reactor PDU ran flawlessly. The quality control method development task was also completed. Slim rods were grown from seed silicon rods for subsequent float-zone operation and impurity characterization. An excellent-quality epitaxial film was deposited on a silicon wafer. Both undoped and doped films were deposited, and the resistivity of the films has been measured. (WHK)

  16. Analysis of numerical methods for computer simulation of kinetic processes: development of KINSIM--a flexible, portable system.

    PubMed

    Barshop, B A; Wrenn, R F; Frieden, C

    1983-04-01

    A flexible and convenient computational method for the simulation of kinetic progress curves has been developed. A mechanism is represented in conventional chemical format with either kinetic or rapid-equilibrium steps separating chemical species. A table describing the differential equations of the mechanism is generated and a direct numerical integration is performed. The same program can be used to simulate any number of mechanisms. The user may interactively set kinetic parameters to seek the optimal fit for a set of experiments, as determined by graphical superimposition of simulated curves on experimental data. Standard error analysis and automatic optimization may also be included. The program is computationally efficient, and its interactive nature makes it a good teaching tool. The source code is written in FORTRAN IV and adheres closely to the ANSI 1966 standard, so as to make it maximally portable and machine-independent. PMID:6688159
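
    The direct numerical integration described above can be illustrated with a toy mechanism. This is a sketch in the spirit of KINSIM, not its FORTRAN IV source; the mechanism A -> B -> C, the rate constants, and the fixed-step fourth-order Runge-Kutta integrator are all assumptions for illustration:

```python
def simulate_abc(k1, k2, a0, t_end, dt=1e-3):
    """Integrate the mechanism A -> B -> C (first order, rate constants
    k1 and k2) by direct numerical integration of its differential
    equations, using classical fourth-order Runge-Kutta.
    Returns the final concentrations [A, B, C]."""
    def deriv(y):
        a, b, c = y
        return [-k1 * a, k1 * a - k2 * b, k2 * b]

    y = [a0, 0.0, 0.0]
    for _ in range(int(round(t_end / dt))):
        s1 = deriv(y)
        s2 = deriv([y[i] + 0.5 * dt * s1[i] for i in range(3)])
        s3 = deriv([y[i] + 0.5 * dt * s2[i] for i in range(3)])
        s4 = deriv([y[i] + dt * s3[i] for i in range(3)])
        y = [y[i] + dt / 6.0 * (s1[i] + 2 * s2[i] + 2 * s3[i] + s4[i])
             for i in range(3)]
    return y

# After long times essentially all of A ends up as C, and total
# concentration is conserved throughout the integration.
final = simulate_abc(k1=2.0, k2=1.0, a0=1.0, t_end=20.0)
```

    Real mechanisms are often stiff, so the choice of numerical method and step size matters in practice; a progress-curve fit would compare such simulated trajectories against experimental data while the rate constants are adjusted.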

  17. Remote systems development

    NASA Technical Reports Server (NTRS)

    Olsen, R.; Schaefer, O.; Hussey, J.

    1992-01-01

    Potential space missions of the nineties and the next century require that we look at the broad category of remote systems as an important means to achieve cost-effective operations, exploration and colonization objectives. This paper addresses such missions, which can use remote systems technology as the basis for identifying required capabilities which must be provided. The relationship of the space-based tasks to similar tasks required for terrestrial applications is discussed. The development status of the required technology is assessed and major issues which must be addressed to meet future requirements are identified. These include the proper mix of humans and machines, from pure teleoperation to full autonomy; the degree of worksite compatibility for a robotic system; and the required design parameters, such as degrees of freedom. Methods for resolution are discussed, including analysis, graphical simulation and the use of laboratory test beds. Grumman experience in the application of these techniques to a variety of design issues is presented, drawing on the Telerobotics Development Laboratory, which includes a 17-DOF robot system, a variety of sensing elements, Deneb/IRIS graphics workstations and control stations. The use of task/worksite mockups, remote system development test beds and graphical analysis is discussed with examples of typical results, such as estimates of task times, task feasibility and resulting recommendations for design changes. The relationship of this experience and lessons learned to future development of remote systems is also discussed.

  18. Chemiluminescence development after initiation of Maillard reaction in aqueous solutions of glycine and glucose: nonlinearity of the process and cooperative properties of the reaction system

    NASA Astrophysics Data System (ADS)

    Voeikov, Vladimir L.; Naletov, Vladimir I.

    1998-06-01

    Nonenzymatic glycation of free or peptide-bound amino acids (the Maillard reaction, MR) plays an important role in aging, diabetic complications and atherosclerosis. MR taking place at high temperatures is accompanied by chemiluminescence (CL). Here, the kinetics of CL development in MR proceeding in model systems at room temperature has been analyzed for the first time. Brief heating of glycine and D-glucose solutions to t greater than 93 degrees Celsius results in their browning and the appearance of fluorescent properties. In solutions rapidly cooled down to 20 degrees Celsius, a wave of CL developed, reaching maximum intensity around 40 min after the heating and cooling of the reaction mixture. The rise in CL intensity was accompanied by a certain decoloration of the solution. The appearance of light-absorbing substances and the development of CL depended critically upon the preincubation temperature (greater than or equal to 93 degrees Celsius), initial pH (greater than or equal to 11.2), sample volume (greater than or equal to 0.5 ml) and reagent concentrations. The dependence of total count accumulation on system volume above the critical volume was non-monotonous. After reaching maximum values, CL began to decline, though only a small part of the glucose and glycine had been consumed. Brief heating of such solutions to the critical temperature resulted in the emergence of a new CL wave, and this procedure could be repeated several times in one and the same reaction system. The whole CL kinetic curve was best fitted by a lognormal distribution. The macrokinetic properties of the process are characteristic of chain reactions with delayed branching. The results also imply that self-organization occurs in this system, and that the course of the process strongly depends upon boundary conditions and periodic interference in its course.
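The lognormal shape of the CL kinetic curve can be illustrated with a short sketch. The peak time and width parameters below are hypothetical assumptions, chosen only to reproduce the reported rise to a maximum near 40 min followed by a slow decline:

```python
import math

def lognormal_intensity(t, peak_t=40.0, sigma=0.5, amplitude=1.0):
    """Lognormal-shaped CL intensity curve; its mode is at t = peak_t (min).
    peak_t and sigma are illustrative, not fitted values from the study."""
    if t <= 0:
        return 0.0
    # The mode of a lognormal is exp(mu - sigma^2), so choosing
    # mu = ln(peak_t) + sigma^2 places the maximum at peak_t.
    mu = math.log(peak_t) + sigma ** 2
    x = (math.log(t) - mu) / sigma
    return amplitude / t * math.exp(-0.5 * x * x)

# Intensity rises to a maximum around 40 min, then slowly declines:
values = [(t, lognormal_intensity(t)) for t in range(1, 121)]
t_max = max(values, key=lambda p: p[1])[0]
```

Fitting such a curve to measured counts would then reduce to estimating mu, sigma and the amplitude from the data.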

  19. Advanced System for Process Engineering

    Energy Science and Technology Software Center (ESTSC)

    1992-02-01

    ASPEN (Advanced System for Process Engineering) is a state of the art process simulator and economic evaluation package which was designed for use in engineering fossil energy conversion processes. ASPEN can represent multiphase streams including solids, and handle complex substances such as coal. The system can perform steady state material and energy balances, determine equipment size and cost, and carry out preliminary economic evaluations. It is supported by a comprehensive physical property system for computation of major properties such as enthalpy, entropy, free energy, molar volume, equilibrium ratio, fugacity coefficient, viscosity, thermal conductivity, and diffusion coefficient for specified phase conditions: vapor, liquid, or solid. The properties may be computed for pure components, mixtures, or components in a mixture, as appropriate. The ASPEN Input Language is oriented towards process engineers.

  20. A Study to Develop an Industrial-Scale, Computer-Controlled High Magnetic Field Processing (HMFP) System to Assist in Commercializing the Novel, Enabling HMFP Manufacturing Technology

    SciTech Connect

    Ludtka, G. M.; Chourey, A.

    2010-05-12

    As the original magnet designer and manufacturer of ORNL’s 9T, 5-inch-ID-bore magnet, American Magnetics Inc. (AMI) has collaborated with ORNL’s Materials Processing Group, and this partnership has been instrumental in the development of our unique thermo-magnetic facilities and expertise. Consequently, AMI and ORNL have realized that the commercial implementation of the High Magnetic Field Processing (HMFP) technology will require the evolution of robust, automated superconducting (SC) magnet systems that will be cost-effective and easy to operate in an industrial environment. The goal of this project and CRADA is to significantly expedite the timeline for implementing this revolutionary and pervasive cross-cutting technology for future US-produced industrial components. The successful completion of this project is anticipated to significantly assist in the timely commercialization and licensing of our HMFP intellectual property for a broad spectrum of industries, and to open up a new market for AMI. One notable outcome of this project is that the ThermoMagnetic Processing Technology won a prestigious 2009 R&D 100 Award, which acknowledges and recognizes our TMP Technology as one of the top 100 innovative US technologies in 2009. By successfully establishing the design requirements for a commercial-scale magnetic processing system, this project effort has accomplished a key first step in facilitating the building and demonstration of a superconducting magnetic processing coil, enabling the transition of the High Magnetic Field Processing Technology beyond a laboratory novelty into a commercially viable and industrially scalable manufacturing technology.

  1. Development and Testing of the Advanced CHP System Utilizing the Off-Gas from the Innovative Green Coke Calcining Process in Fluidized Bed

    SciTech Connect

    Chudnovsky, Yaroslav; Kozlov, Aleksandr

    2013-08-15

    Green petroleum coke (GPC) is an oil refining byproduct that can be used directly as a solid fuel or as a feedstock for the production of calcined petroleum coke. GPC contains a high amount of volatiles and sulfur. During the calcination process, the GPC is heated to remove the volatiles and sulfur to produce purified calcined coke, which is used in the production of graphite, electrodes, metal carburizers, and other carbon products. Currently, more than 80% of calcined coke is produced in rotary kilns or rotary hearth furnaces. These technologies provide partial heat utilization of the calcined coke to increase efficiency of the calcination process, but they also share some operating disadvantages. However, coke calcination in an electrothermal fluidized bed (EFB) opens up a number of potential benefits for production enhancement while reducing capital and operating costs. The increased usage of heavy crude oil in recent years has resulted in higher sulfur content in green coke produced by the oil refining process, which requires a significant increase in the calcination temperature and residence time. The calorific value of the process off-gas is quite substantial and can be effectively utilized as an “opportunity fuel” for combined heat and power (CHP) production to complement the energy demand. Heat recovered from the product cooling can also contribute to the overall economics of the calcination process. Preliminary estimates indicated a decrease in energy consumption of 35-50% as well as a proportional decrease in greenhouse gas emissions. As such, the efficiency improvement of coke calcination systems is attracting the close attention of researchers and engineers throughout the world. The developed technology is intended to accomplish the following objectives: - Reduce the energy and carbon intensity of the calcined coke production process. - Increase utilization of opportunity fuels such as industrial waste off-gas from the novel

  2. System of video observation for electron beam welding process

    NASA Astrophysics Data System (ADS)

    Laptenok, V. D.; Seregin, Y. N.; Bocharov, A. N.; Murygin, A. V.; Tynchenko, V. S.

    2016-04-01

    Equipment for a video observation system for the electron beam welding process was developed. The design of the video observation system reduces negative effects on the video camera during electron beam welding and yields high-quality images of the process.

  3. POWER SYSTEMS DEVELOPMENT FACILITY

    SciTech Connect

    Unknown

    2002-05-01

    This report discusses test campaign GCT3 of the Halliburton KBR transport reactor train with a Siemens Westinghouse Power Corporation (Siemens Westinghouse) particle filter system at the Power Systems Development Facility (PSDF) located in Wilsonville, Alabama. The transport reactor is an advanced circulating fluidized-bed reactor designed to operate as either a combustor or a gasifier using one of two possible particulate control devices (PCDs). The transport reactor was operated as a pressurized gasifier during GCT3. GCT3 was planned as a 250-hour test run to commission the loop seal and continue the characterization of the limits of operational parameter variations using a blend of several Powder River Basin coals and Bucyrus limestone from Ohio. The primary test objectives were: (1) Loop Seal Commissioning--Evaluate the operational stability of the loop seal with sand and limestone as a bed material at different solids circulation rates and establish a maximum solids circulation rate through the loop seal with the inert bed. (2) Loop Seal Operations--Evaluate the loop seal operational stability during coal feed operations and establish maximum solids circulation rate. Secondary objectives included the continuation of reactor characterization, including: (1) Operational Stability--Characterize the reactor loop and PCD operations with short-term tests by varying coal feed, air/coal ratio, riser velocity, solids circulation rate, system pressure, and air distribution. (2) Reactor Operations--Study the devolatilization and tar cracking effects from transient conditions during transition from start-up burner to coal. Evaluate the effect of process operations on heat release, heat transfer, and accelerated fuel particle heat-up rates. Study the effect of changes in reactor conditions on transient temperature profiles, pressure balance, and product gas composition. (3) Effects of Reactor Conditions on Syngas Composition--Evaluate the effect of air distribution, steam

  4. LANL receiver system development

    SciTech Connect

    Laubscher, B.; Cooke, B.; Cafferty, M.; Olivas, N.

    1997-08-01

    The CALIOPE receiver system development at LANL is the story of two technologies. The first of these technologies consists of off-the-shelf mercury-cadmium-telluride (MCT) detectors and amplifiers. The vendor for this system is Kolmar Technologies. This system was fielded in the Tan Trailer I (TTI) in 1995 and is referred to in this paper as GEN I. The second system consists of an MCT detector procured from Santa Barbara Research Center (SBRC) and an amplifier designed and built by LANL. This system was fielded in the Tan Trailer II (TTII) at the NTS tests in 1996 and is referred to as GEN II. The LANL CALIOPE experimental plan for 1996 was to improve the lidar system by progressing to a higher rep-rate laser to perform many shots in a much shorter period of time. In keeping with this plan, the receiver team set a goal of developing a detector system that was background-limited for the projected 100-nanosecond (ns) laser pulse. A set of detailed simulations of the DIAL lidar experiment was performed. From these runs, parameters such as optimal detector size, field of view of the receiver system, and nominal laser return power were extracted. With this information, detector physics and amplifier electronic models were developed to obtain the required specifications for each of these components. These derived specs indicated that a substantial improvement over commercially available, off-the-shelf amplifier and detector technologies would be needed to meet the goals. To determine if the original GEN I detector was usable, the authors performed tests on a 100-micron-square detector at cryogenic temperatures. The results of this test and others convinced them that an advanced detector was required. Eventually, a suitable detector was identified and a number of these single-element detectors were procured from SBRC. These single-element detectors served as witness detectors for the detector arrays built for another DOE project.

  5. Parallel processing spacecraft communication system

    NASA Technical Reports Server (NTRS)

    Bolotin, Gary S. (Inventor); Donaldson, James A. (Inventor); Luong, Huy H. (Inventor); Wood, Steven H. (Inventor)

    1998-01-01

    An uplink controlling assembly speeds data processing using a special parallel codeblock technique. A correct start sequence initiates processing of a frame. Two possible start sequences can be used, and the one which is used determines whether data polarity is inverted or non-inverted. Processing continues until uncorrectable errors are found. The frame ends by intentionally sending a block with an uncorrectable error. Each of the codeblocks in the frame has a channel ID. Each channel ID can be separately processed in parallel. This obviates the problem of waiting for error correction processing. If that channel number is zero, however, it indicates that the frame of data represents a critical command only. That data is handled in a special way, independent of the software. Otherwise, the processed data is further handled using special double buffering techniques to avoid problems from overrun. When overrun does occur, the system takes action to lose only the oldest data.
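The double-buffering scheme described above can be illustrated with a generic sketch. The class, buffer sizes and block names are hypothetical; this shows only the general "one buffer fills while the other drains, and overrun loses only the oldest data" policy, not the patented design:

```python
from collections import deque

class DoubleBuffer:
    """Two buffers: one fills with incoming codeblocks while the
    other is drained by the processing stage. On overrun, only the
    oldest block is discarded (as in the scheme described above)."""

    def __init__(self, capacity=4):
        self.capacity = capacity
        self.fill = deque()    # receives incoming blocks
        self.drain = deque()   # read by the processing stage

    def write(self, block):
        if len(self.fill) == self.capacity:  # overrun: drop oldest only
            self.fill.popleft()
        self.fill.append(block)

    def swap_and_read(self):
        # Swap roles, then hand the previously filled buffer to the
        # processing stage while new writes go to the (now empty) other.
        self.fill, self.drain = self.drain, self.fill
        out = list(self.drain)
        self.drain.clear()
        return out

buf = DoubleBuffer(capacity=3)
for blk in ["b1", "b2", "b3", "b4"]:   # one block more than capacity
    buf.write(blk)
processed = buf.swap_and_read()        # oldest block ("b1") was lost
```

A hardware implementation would swap buffer pointers on a frame boundary instead of copying, but the overrun behavior is the same.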

  6. Oil well fluid processing system

    SciTech Connect

    Cobb, J.R.

    1988-10-25

    This patent describes an oil well fluid processing system, comprising: a skid having a first skid section and a second skid section separable from the first skid section; means for connecting one end of the first skid section to one end of the second skid section; a cylindrical fluid processing apparatus pivotally mounted at a lower end thereof on the first skid section for pivoting movement between a raised position wherein the fluid processing apparatus extends vertically from the first skid section and a lowered position wherein the fluid processing apparatus overlays the second skid section at such times that the two sections of the skid are connected together; and means mounted on the second skid section and connectable to the fluid processing apparatus for moving the fluid processing apparatus between the raised and lowered positions at such times that the two sections of the skid are connected together.

  7. Near real time data processing system

    NASA Astrophysics Data System (ADS)

    Mousessian, Ardvas; Vuu, Christina

    2008-08-01

    Raytheon recently developed and implemented a Near Real Time (NRT) data processing subsystem for the Earth Observing System (EOS) Microwave Limb Sounder (MLS) instrument on the NASA Aura spacecraft. The NRT can be viewed as a customized Science Information Processing System (SIPS) in which the measurements and information provided by the instrument are expeditiously processed, packaged, and delivered. The purpose of the MLS NRT is to process Level 0 data up through Level 2 and distribute standard data products to the customer within 3-5 hours of the arrival of the first set of data.

  8. Series Bosch System Development

    NASA Technical Reports Server (NTRS)

    Abney, Morgan B.; Evans, Christopher; Mansell, Matt; Swickrath, Michael

    2012-01-01

    State-of-the-art (SOA) carbon dioxide (CO2) reduction technology for the International Space Station produces methane as a byproduct. This methane is subsequently vented overboard. The associated loss of hydrogen ultimately reduces the mass of oxygen that can be recovered from CO2 in a closed-loop life support system. As an alternative to SOA CO2 reduction technology, NASA is exploring a Series-Bosch system capable of reducing CO2 with hydrogen to form water and solid carbon. This results in 100% theoretical recovery of oxygen from metabolic CO2. In the past, Bosch-based technology did not trade favorably against SOA technology due to a high power demand, low reaction efficiencies, concerns with carbon containment, and large resupply requirements necessary to replace expended catalyst cartridges. An alternative approach to Bosch technology, labeled "Series-Bosch," employs a new system design with optimized multi-stage reactors and a membrane-based separation and recycle capability. Multi-physics modeling of the first stage reactor, along with chemical process modeling of the integrated system, has resulted in a design with potential to trade significantly better than previous Bosch technology. The modeling process and resulting system architecture selection are discussed.

  9. Advanced Information Processing System (AIPS)

    NASA Technical Reports Server (NTRS)

    Pitts, Felix L.

    1993-01-01

    Advanced Information Processing System (AIPS) is a computer systems philosophy, a set of validated hardware building blocks, and a set of validated services as embodied in system software. The goal of AIPS is to provide the knowledgebase which will allow achievement of validated fault-tolerant distributed computer system architectures, suitable for a broad range of applications, having failure probability requirements of 10E-9 at 10 hours. A background and description is given followed by program accomplishments, the current focus, applications, technology transfer, FY92 accomplishments, and funding.

  10. Architecture for Survivable System Processing (ASSP)

    NASA Astrophysics Data System (ADS)

    Wood, Richard J.

    1991-11-01

    The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground-based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High-technology developments in hardware, software, and networking models address technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, and power (SWAP), and security. Hardware and software design, development, and implementation focus on the interconnectivity and interoperability of an open system architecture, applying new technology to practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, the program provides for regular interaction with standardization working groups, e.g., the International Organization for Standardization (ISO), the American National Standards Institute (ANSI), the Society of Automotive Engineers (SAE), and the Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.

  11. Architecture for Survivable System Processing (ASSP)

    NASA Technical Reports Server (NTRS)

    Wood, Richard J.

    1991-01-01

    The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground-based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High-technology developments in hardware, software, and networking models address technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, and power (SWAP), and security. Hardware and software design, development, and implementation focus on the interconnectivity and interoperability of an open system architecture, applying new technology to practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, the program provides for regular interaction with standardization working groups, e.g., the International Organization for Standardization (ISO), the American National Standards Institute (ANSI), the Society of Automotive Engineers (SAE), and the Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.

  12. PMIS: System Description. PMIS Project. Planning & Management Information System. A Project To Develop a Data Processing System for Support of the Planning and Management Needs of Local School Districts.

    ERIC Educational Resources Information Center

    Council of the Great City Schools, Washington, DC.

    PMIS (Planning and Management Information System) is an information system that supports the decisionmaking process of executive management in local school districts. The system is designed around a comprehensive, longitudinal, and interrelated data base. It utilizes a powerful real-time, interactive data management system for strategic planning;…

  13. XCPU2 process management system

    SciTech Connect

    Ionkov, Latchesar; Van Hensbergen, Eric

    2009-01-01

    Xcpu2 is a new process management system that allows users to specify a custom file system for a running job. Most cluster management systems enforce a single software distribution running on all nodes. Xcpu2 allows programs running on the cluster to work in an environment identical to the user's desktop, using the same versions of the libraries and tools the user installed locally and accessing configuration files in the same places they are located on the desktop. Xcpu2 builds on our earlier work on the Xcpu system. Like Xcpu, Xcpu2's process management interface is represented as a set of files exported by a 9P file server. It supports heterogeneous clusters and multiple head nodes. Unlike Xcpu, it uses a pull model instead of a push model. In this paper we describe the Xcpu2 clustering model, its operation, and how the per-job file system configuration can be used to solve some of the common problems encountered when running a cluster.

  14. Advanced PPA Reactor and Process Development

    NASA Technical Reports Server (NTRS)

    Wheeler, Raymond; Aske, James; Abney, Morgan B.; Miller, Lee A.; Greenwood, Zachary

    2012-01-01

    Design and development of a second-generation Plasma Pyrolysis Assembly (PPA) reactor is currently underway as part of NASA's Atmosphere Revitalization Resource Recovery effort. By recovering up to 75% of the hydrogen currently lost as methane in the Sabatier reactor effluent, the PPA helps to minimize life support resupply costs for extended-duration missions. To date, second-generation PPA development has demonstrated significant technology advancements over the first-generation device by doubling the methane processing rate while, at the same time, more than halving the required power. One development area of particular interest to NASA system engineers is fouling of the PPA reactor with carbonaceous products. As a mitigation plan, NASA MSFC has explored the feasibility of using an oxidative plasma based upon metabolic CO2 to regenerate the reactor window and gas inlet ports. The results and implications of this testing are addressed along with the advanced PPA reactor development work.

  15. Parallel processing and expert systems

    NASA Technical Reports Server (NTRS)

    Lau, Sonie; Yan, Jerry C.

    1991-01-01

    Whether it be monitoring the thermal subsystem of Space Station Freedom, or controlling the navigation of the autonomous rover on Mars, NASA missions in the 1990s cannot enjoy an increased level of autonomy without the efficient implementation of expert systems. Merely increasing the computational speed of uniprocessors may not be able to guarantee that real-time demands are met for larger systems. Speedup via parallel processing must be pursued alongside the optimization of sequential implementations. Prototypes of parallel expert systems have been built at universities and industrial laboratories in the U.S. and Japan. The state-of-the-art research in progress related to parallel execution of expert systems is surveyed. The survey discusses multiprocessors for expert systems, parallel languages for symbolic computations, and mapping expert systems to multiprocessors. Results to date indicate that the parallelism achieved for these systems is small. The main reasons are (1) the body of knowledge applicable in any given situation and the amount of computation executed by each rule firing are small, (2) dividing the problem solving process into relatively independent partitions is difficult, and (3) implementation decisions that enable expert systems to be incrementally refined hamper compile-time optimization. In order to obtain greater speedups, data parallelism and application parallelism must be exploited.

  16. Process Accountability in Curriculum Development.

    ERIC Educational Resources Information Center

    Gooler, Dennis D.; Grotelueschen, Arden

    This paper urges the curriculum developer to assume the accountability for his decisions necessitated by the actual ways our society functions. The curriculum developer is encouraged to recognize that he is a salesman with a commodity (the curriculum). He is urged to realize that if he cannot market the package to the customers (the various…

  17. Trauma system development.

    PubMed

    Lendrum, R A; Lockey, D J

    2013-01-01

    The word 'trauma' describes the disease entity resulting from physical injury. Trauma is one of the leading causes of death worldwide and deaths due to injury look set to increase. As early as the 1970s, it became evident that centralisation of resources and expertise could reduce the mortality rate from serious injury and that organisation of trauma care delivery into formal systems could improve outcome further. Internationally, trauma systems have evolved in various forms, with widespread reports of mortality and functional outcome benefits when major trauma management is delivered in this way. The management of major trauma in England is currently undergoing significant change. The London Trauma System began operating in April 2010 and others throughout England became operational this year. Similar systems exist internationally and continue to be developed. Anaesthetists have been and continue to be involved with all levels of trauma care delivery, from the provision of pre-hospital trauma and retrieval teams, through to chronic pain management and rehabilitation of patients back into society. This review examines the international development of major trauma care delivery and the components of a modern trauma system. PMID:23210554

  18. Internal insulation system development

    NASA Technical Reports Server (NTRS)

    Gille, J. P.

    1973-01-01

    The development of an internal insulation system for cryogenic liquids is described. The insulation system is based on a gas layer concept in which capillary or surface tension effects are used to maintain a stable gas layer within a cellular core structure between the tank wall and the contained cryogen. In this work, a 1.8 meter diameter tank was insulated and tested with liquid hydrogen. Ability to withstand cycling of the aluminum tank wall to 450 K was a design and test condition.

  19. VLSI mixed signal processing system

    NASA Technical Reports Server (NTRS)

    Alvarez, A.; Premkumar, A. B.

    1993-01-01

    An economical and efficient VLSI implementation of a mixed signal processing system (MSP) is presented in this paper. The MSP concept is investigated and the functional blocks of the proposed MSP are described. The requirements of each of the blocks are discussed in detail. A sample application using active acoustic cancellation technique is described to demonstrate the power of the MSP approach.

  20. NOAO observing proposal processing system

    NASA Astrophysics Data System (ADS)

    Bell, David J.; Gasson, David; Hartman, Mia

    2002-12-01

    Since going electronic in 1994, NOAO has continued to refine and enhance its observing proposal handling system. Virtually all related processes are now handled electronically. Members of the astronomical community can submit proposals through email, web form or via Gemini's downloadable Phase-I Tool. NOAO staff can use online interfaces for administrative tasks, technical reviews, telescope scheduling, and compilation of various statistics. In addition, all information relevant to the TAC process is made available online. The system, now known as ANDES, is designed as a thin-client architecture (web pages are now used for almost all database functions) built using open source tools (FreeBSD, Apache, MySQL, Perl, PHP) to process descriptively-marked (LaTeX, XML) proposal documents.

  1. Skylab materials processing facility experiment developer's report

    NASA Technical Reports Server (NTRS)

    Parks, P. G.

    1975-01-01

    The development of the Skylab M512 Materials Processing Facility is traced from the design of a portable, self-contained electron beam welding system for terrestrial applications to the highly complex experiment system ultimately developed for three Skylab missions. The M512 experiment facility was designed to support six in-space experiments intended to explore the advantages of manufacturing materials in the near-zero-gravity environment of Earth orbit. Detailed descriptions of the M512 facility and related experiment hardware are provided, with discussions of hardware verification and man-machine interfaces included. An analysis of the operation of the facility and experiments during the three Skylab missions is presented, including discussions of the hardware performance, anomalies, and data returned to earth.

  2. Development of the auditory system

    PubMed Central

    Litovsky, Ruth

    2015-01-01

    Auditory development involves changes in the peripheral and central nervous system along the auditory pathways; these changes occur both naturally and in response to stimulation. Human development occurs along a trajectory that can last decades and is studied using behavioral psychophysics as well as physiological measurements and neural imaging. The auditory system constructs a perceptual space that takes information from objects and groups, segregates sounds, and provides meaning and access to communication tools such as language. Auditory signals are processed in a series of analysis stages, from peripheral to central. Coding of information has been studied for features of sound, including frequency, intensity, loudness, and location, in quiet and in the presence of maskers. In the latter case, the ability of the auditory system to perform an analysis of the scene becomes highly relevant. While some basic abilities are well developed at birth, there is a clear prolonged maturation of auditory development well into the teenage years. Maturation involves the auditory pathways themselves; however, non-auditory changes (attention, memory, cognition) also play an important role in auditory development. The ability of the auditory system to adapt in response to novel stimuli is a key feature of development throughout the nervous system, known as neural plasticity. PMID:25726262

  3. Development of the selective coagulation process

    SciTech Connect

    Yoon, R.H.; Luttrell, G.H.

    1991-01-01

    Recent studies have resulted in the development of a novel agglomeration process for upgrading ultrafine coal. This process, known as selective hydrophobic coagulation (SHC), is based on the new finding that hydrophobic coal particles can be selectively coagulated in the presence of dispersed mineral matter. The driving force for the coagulation is believed to arise from the structural arrangement of water molecules near the coal surface. In most cases, simple pH control is all that is required to (1) induce the coagulation of the coal particles and (2) effectively disperse the particles of mineral matter. During the past quarter, several important aspects of the SHC process were examined. Direct measurements of the surface forces which control the selective coagulation process were conducted using a Mark 4 surface force apparatus. These preliminary measurements have provided irrefutable evidence for the existence of the hydrophobic force. Key expressions have been presented for a population balance model describing the hydrophobic coagulation process. In order to validate this model, experimental measurements of the size distributions of coal coagula have been initiated. The liberation characteristics of samples obtained from the Elkhorn No. 3 and Pittsburgh No. 8 coal seams were determined using a SEM-IPS image processing system. Mixing studies were carried out to determine the effects of mixer-impeller configurations on the coagula size distributions. Bench-scale continuous testing has also been initiated during the past quarter using a rotating drum screen and sedimentation tank. 25 figs., 8 tabs.
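
The population balance model mentioned in the abstract can be illustrated with a minimal sketch. The code below integrates a discrete Smoluchowski population balance with a constant collision kernel by explicit Euler steps; the kernel value, step size, and monomer-only initial distribution are illustrative assumptions, not parameters from the report.

```python
# Minimal sketch of a discrete Smoluchowski population balance with a
# constant collision kernel K (illustrative assumption), integrated by
# explicit Euler steps. n[k] is the number density of (k+1)-mers; sizes
# beyond len(n) are truncated, so a small amount of mass leaks at long times.

def smoluchowski_step(n, K, dt):
    """dn_k/dt = 0.5*K*sum_{i+j=k} n_i n_j - K*n_k*sum_j n_j (sizes 1-based)."""
    size = len(n)
    total = sum(n)
    out = []
    for k in range(size):
        gain = 0.0
        for i in range(k):            # partner sizes (i+1) + (k-i) = k+1
            gain += n[i] * n[k - 1 - i]
        out.append(n[k] + dt * (0.5 * K * gain - K * n[k] * total))
    return out

# start from monomers only and integrate to t = 1
n = [1.0] + [0.0] * 9
for _ in range(100):
    n = smoluchowski_step(n, K=1.0, dt=0.01)
# total number density decays toward N0/(1 + 0.5*K*N0*t) = 2/3 at t = 1
```

In a selectivity study one would use a hydrophobicity-dependent kernel instead of a constant K, so that coal-coal collisions coagulate while coal-mineral collisions do not.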

  4. Text and Illustration Processing System (TIPS) User's Manual. Volume 2: Graphics Processing System.

    ERIC Educational Resources Information Center

    Cox, Ray; Braby, Richard

    This manual contains the procedures to teach the relatively inexperienced author how to enter and process graphic information on a graphics processing system developed by the Training Analysis and Evaluation Group. It describes the illustration processing routines, including scanning graphics into computer memory, displaying graphics, enhancing…

  5. On the development of a simple lumped system micro-model of ductile iron solidification for application to the control of molten metal processing

    SciTech Connect

    Vijayaraghavan, R.; Bradley, F.J.

    1995-12-31

    This paper presents results to date in a project concerning the application of micro-modeling of ductile iron solidification to the development of a process control methodology for assessing the effectiveness of magnesium treatment and post-inoculation and for predicting shrinkage tendency. The approach is to utilize a simple lumped-system heat transfer model, incorporating a new formulation for simulating eutectic ductile iron solidification, to analyze and interpret cooling curve data routinely acquired for process control purposes. A key feature of the eutectic solidification micro-model is that the graphite kinetics are decoupled from the austenite kinetics, which are completely determined by thermodynamic and mass balance considerations. In addition, the graphite kinetics expression accounts for the evolution of a lognormally distributed nodule size distribution. An empirical two-parameter continuous nucleation model is employed and a new grain impingement function is proposed. Solute redistribution in the ternary Fe-C-Si system is considered. The influence of magnesium on undercooling of the melt is simulated by introducing an additional term into the expression for the graphite liquidus surface. Preliminary results are presented which illustrate the time dependence of the liquidus and eutectic temperatures, the concentrations of carbon and silicon in the liquid, and volume changes occurring during solidification.
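
As a small illustration of one ingredient of the micro-model, the sketch below evaluates a lognormal nodule size density and its mean diameter. The log-mean and log-standard-deviation values are assumed for illustration (roughly 30-micron nodules), not parameters fitted in the paper.

```python
import math

# Sketch of a lognormally distributed nodule size density of the kind the
# micro-model evolves; mu and sigma are illustrative assumptions.

def lognormal_pdf(d, mu, sigma):
    """Probability density of a lognormal nodule diameter d (in metres)."""
    return math.exp(-(math.log(d) - mu) ** 2 / (2.0 * sigma ** 2)) / (
        d * sigma * math.sqrt(2.0 * math.pi))

mu, sigma = math.log(30e-6), 0.4          # assumed: median diameter 30 microns
mean_d = math.exp(mu + sigma ** 2 / 2.0)  # mean diameter of the distribution
```

Integrating `lognormal_pdf` over diameter recovers unit total probability, and multiplying by a nodule count density gives the number of nodules per size class that the graphite kinetics expression tracks.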

  6. Expert System Development Methodology (ESDM)

    NASA Technical Reports Server (NTRS)

    Sary, Charisse; Gilstrap, Lewey; Hull, Larry G.

    1990-01-01

    The Expert System Development Methodology (ESDM) provides an approach to developing expert system software. Because of the uncertainty associated with this process, an element of risk is involved. ESDM is designed to address the issue of risk and to acquire the information needed for this purpose in an evolutionary manner. ESDM presents a life cycle in which a prototype evolves through five stages of development. Each stage consists of five steps, leading to a prototype for that stage. Development may proceed to a conventional development methodology (CDM) at any time if enough has been learned about the problem to write requirements. ESDM produces requirements so that a product may be built with a CDM. ESDM is considered preliminary because it has not yet been applied to actual projects. It has been retrospectively evaluated by comparing it with the methods used in two ongoing expert system development projects that did not explicitly choose to use this methodology but that provided useful insights into actual expert system development practices and problems.

  7. Development of a comprehensive weld process model

    SciTech Connect

    Radhakrishnan, B.; Zacharia, T.; Paul, A.

    1997-05-01

    This cooperative research and development agreement (CRADA) between Concurrent Technologies Corporation (CTC) and Lockheed Martin Energy Systems (LMES) combines CTC's expertise in the welding area with that of LMES to develop computer models and simulation software for welding processes. This development is of significant impact to the industry, including materials producers and fabricators. The main thrust of the research effort was to develop a comprehensive welding simulation methodology. A substantial amount of work has been done by several researchers to numerically model several welding processes. The primary drawback of most of the existing models is the lack of sound linkages between the mechanistic aspects (e.g., heat transfer, fluid flow, and residual stress) and the metallurgical aspects (e.g., microstructure development and control). A comprehensive numerical model which can be used to elucidate the effect of welding parameters/conditions on the temperature distribution, weld pool shape and size, solidification behavior, and microstructure development, as well as stresses and distortion, does not exist. It was therefore imperative to develop a comprehensive model which would predict all of the above phenomena during welding. The CRADA built upon an already existing three-dimensional (3-D) welding simulation model developed by LMES, which is capable of predicting weld pool shape and the temperature history in 3-D single-pass welds. However, that model does not account for multipass welds, microstructural evolution, distortion, or residual stresses. Additionally, it requires large amounts of computing time, which limits its use for practical applications. To overcome this, CTC and LMES have developed through this CRADA the comprehensive welding simulation model described above.

  8. Restructure Staff Development for Systemic Change

    ERIC Educational Resources Information Center

    Kelly, Thomas F.

    2012-01-01

    This paper presents a systems approach, based on the work of W. Edwards Deming, to system-wide, high-impact staff development. Deming has pointed out the significance of structure in systems. By restructuring the process of staff development we can bring about cost-effective improvement of the whole system. We can improve student achievement while…

  9. Advanced System for Process Engineering

    Energy Science and Technology Software Center (ESTSC)

    1998-09-14

    PRO ASPEN/PC1.0 (Advanced System for Process Engineering) is a state-of-the-art process simulator and economic evaluation package which was designed for use in engineering fossil energy conversion processes and has been ported to run on a PC. PRO ASPEN/PC1.0 can represent multiphase streams including solids, and handle complex substances such as coal. The system can perform steady-state material and energy balances, determine equipment size and cost, and carry out preliminary economic evaluations. It is supported by a comprehensive physical property system for computation of major properties such as enthalpy, entropy, free energy, molar volume, equilibrium ratio, fugacity coefficient, viscosity, thermal conductivity, and diffusion coefficient for specified phase conditions: vapor, liquid, or solid. The properties may be computed for pure components, mixtures, or components in a mixture, as appropriate. The PRO ASPEN/PC1.0 Input Language is oriented towards process engineers.
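
The steady-state component material balance at the core of such simulators can be sketched in a few lines. The unit below is a single two-outlet separator; the stream names and split fractions are illustrative assumptions, not ASPEN data or ASPEN syntax.

```python
# Minimal sketch of the steady-state component material balance an
# ASPEN-style flowsheet simulator solves, for one two-outlet separator.
# Component names and split fractions are illustrative assumptions.

def separator_balance(feed, split_fraction):
    """feed: component -> mass flow (kg/h); split_fraction: component ->
    fraction of that component reporting to the top stream."""
    top = {c: m * split_fraction[c] for c, m in feed.items()}
    bottom = {c: feed[c] - top[c] for c in feed}
    return top, bottom

feed = {"coal": 1000.0, "ash": 200.0, "water": 300.0}
split = {"coal": 0.95, "ash": 0.10, "water": 0.50}
top, bottom = separator_balance(feed, split)
# each component's top + bottom flow equals its feed flow (mass conservation)
```

A full simulator chains many such unit balances and iterates on recycle streams until every stream converges; this sketch shows only the per-unit bookkeeping.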

  10. Development of the Selective Hydrophobic Coagulation process

    SciTech Connect

    Yoon, R.H.; Luttrell, G.H.

    1992-01-01

    A novel technique for selectively coagulating and separating coal from dispersed mineral matter has been developed at Virginia Tech. The process, Selective Hydrophobic Coagulation (SHC), has been studied since 1986 under the sponsorship of the US Department of Energy (Contracts AC22-86PC91221 and AC22-90PC90174). The SHC process differs from oil agglomeration, shear or polymer flocculation, and electrolytic coagulation processes in that it does not require reagents or additives to induce the formation of coagula. In most cases, simple pH control is all that is required to (1) induce the coagulation of coal particles and (2) effectively disperse particles of mineral matter. If the coal is oxidized, a small dosage of reagents can be used to enhance the process. During the quarter, the Anutech Mark IV surface force apparatus was used to generate surface force-distance data for the mica/dodecylamine hydrochloride system (Task 2.1.1). Work to characterize the hydrophobicity of this system and the mica/DDOA⁻

  11. Advanced information processing system: Local system services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Alger, Linda; Whittredge, Roy; Stasiowski, Peter

    1989-01-01

    The Advanced Information Processing System (AIPS) is a multi-computer architecture composed of hardware and software building blocks that can be configured to meet a broad range of application requirements. The hardware building blocks are fault-tolerant, general-purpose computers, fault- and damage-tolerant networks (both computer and input/output), and interfaces between the networks and the computers. The software building blocks are the major software functions: local system services, input/output system services, inter-computer system services, and the system manager. The foundation of the local system services is an operating system with the functions required for a traditional real-time multi-tasking computer, such as task scheduling, inter-task communication, memory management, interrupt handling, and time maintenance. Resting on this foundation are the redundancy management functions necessary in a redundant computer and the status reporting functions required for an operator interface. The functional requirements, functional design, and detailed specifications for all the local system services are documented.

  12. Comparison of digital dental X-ray systems with self-developing film and manual processing for endodontic file length determination.

    PubMed

    Eikenberg, S; Vandre, R

    2000-02-01

    Human skulls were sectioned into 15 sextants. Teeth were then removed and 45 canals were instrumented to their apical foramina. Endodontic files were glued in place at random distances from the apical foramina. Image geometry was maintained by a custom mounting jig. Images were captured with self-developing film, manually processed D-speed film, and a digital radiographic system (Dexis). Digital images were read on a conventional color monitor (cathode ray tube) and a laptop screen (active-matrix liquid crystal display). Fifteen dentists measured the distance from the file tip to the apical foramen of the tooth. Results showed that the measurement error was significantly less for the digital images than for the film-based images. However, these statistical differences may not be of great clinical significance, because the digital images could be measured in increments < 0.25 mm. PMID:11194373

  13. Optimizing and developing a continuous separation system for the wet process separation of aluminum and polyethylene in aseptic composite packaging waste.

    PubMed

    Yan, Dahai; Peng, Zheng; Liu, Yuqiang; Li, Li; Huang, Qifei; Xie, Minghui; Wang, Qi

    2015-01-01

    The consumption of milk in China is increasing as living standards rapidly improve, and huge amounts of aseptic composite milk packaging waste are being generated. Aseptic composite packaging is composed of paper, polyethylene, and aluminum. It is difficult to separate the polyethylene and aluminum, so most of the waste is currently sent to landfill or incinerated with other municipal solid waste, meaning that enormous amounts of resources are wasted. A wet process technique for separating the aluminum and polyethylene from the composite materials, after the paper had been removed from the original packaging waste, was studied. The separation efficiencies achieved using different separation reagents were compared, different separation mechanisms were explored, and the impacts of a range of parameters, such as the reagent concentration, temperature, and liquid-solid ratio, on the separation time and aluminum loss ratio were studied. Methanoic acid was found to be the optimal separation reagent, and the suitable conditions were a reagent concentration of 2-4 mol/L, a temperature of 60-80°C, and a liquid-solid ratio of 30 L/kg. These conditions allowed aluminum and polyethylene to be separated in less than 30 min, with an aluminum loss ratio of less than 3%. A mass balance was produced for the aluminum-polyethylene separation system, and a control technique was developed to keep the ion concentrations in the reaction system stable. This allowed a continuous industrial-scale process for separating aluminum and polyethylene to be developed, and a demonstration facility with a capacity of 50 t/d was built. The demonstration facility gave polyethylene and aluminum recovery rates of more than 98% and more than 72%, respectively. Separating 1 t of aluminum-polyethylene composite packaging material gave a profit of 1769 Yuan, meaning that an effective method for recycling aseptic composite packaging waste was achieved. PMID:25458854
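
A quick arithmetic check of the reported operating window: at a liquid-solid ratio of 30 L/kg and a mid-range methanoic (formic) acid concentration of 3 mol/L, each kilogram of composite contacts 90 mol of acid. The sketch assumes only the molar mass of formic acid (46.03 g/mol); it is a worked example, not part of the paper's mass balance.

```python
# Arithmetic check of the reported operating window. The only assumption
# beyond the abstract's numbers is the molar mass of methanoic (formic)
# acid, 46.03 g/mol.

MOLAR_MASS_HCOOH = 46.03  # g/mol

def acid_per_kg_solid(ls_ratio_l_per_kg, conc_mol_per_l):
    """Return (moles, kilograms) of acid contacting 1 kg of composite solid."""
    moles = ls_ratio_l_per_kg * conc_mol_per_l
    return moles, moles * MOLAR_MASS_HCOOH / 1000.0

# mid-range conditions from the abstract: 30 L/kg and 3 mol/L
moles, kg = acid_per_kg_solid(30.0, 3.0)   # 90 mol, about 4.14 kg of pure acid
```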

  14. Atmospheric and Oceanographic Information Processing System (AOIPS) system description

    NASA Technical Reports Server (NTRS)

    Bracken, P. A.; Dalton, J. T.; Billingsley, J. B.; Quann, J. J.

    1977-01-01

    The development of hardware and software for an interactive, minicomputer based processing and display system for atmospheric and oceanographic information extraction and image data analysis is described. The major applications of the system are discussed as well as enhancements planned for the future.

  15. Development of lysozyme-combined antibacterial system to reduce sulfur dioxide and to stabilize Italian Riesling ice wine during aging process

    PubMed Central

    Chen, Kai; Han, Shun-yu; Zhang, Bo; Li, Min; Sheng, Wen-jun

    2015-01-01

    For the purpose of SO2 reduction and ice wine stabilization, a new antibacterial technique was developed and verified to reduce the content of sulfur dioxide (SO2) and simultaneously maintain protein stability during the ice wine aging process. Hazardous bacteria (lactic acid bacteria, LAB) and the protein stability of Italian Riesling ice wine were evaluated in terms of different amounts of lysozyme, SO2, polyphenols, and wine pH by single-factor experiments. Subsequently, a quadratic rotation-orthogonal composite design with four variables was conducted to establish a multiple linear regression model that demonstrated the influence of the different treatments on the synthesis score between LAB inhibition and protein stability of ice wine. The results showed that the synthesis score was influenced by the lysozyme and SO2 concentrations at a highly significant level (P < 0.01). Furthermore, the lysozyme-combined antibacterial system, which is specially designed for ice wine aging, was optimized step by step by response surface methodology and ridge analysis. As a result, the optimal levels to control in ice wine are as follows: 179.31 mg L−1 lysozyme, 177.14 mg L−1 SO2, 0.60 g L−1 polyphenols, and an ice wine pH of 4.01. Under these conditions, the normalized synthesis score between LAB inhibition and protein stability reaches a maximum of 0.920. Finally, verification and comparison experiments indicated that the lysozyme-combined antibacterial system is a practical and promising method to reduce SO2 concentration, effectively prevent contamination by hazardous LAB, and stabilize ice wine during the aging process. PMID:26405531
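
The response-surface step can be sketched as fitting a two-factor quadratic model by least squares and solving for its stationary point. The data below are synthetic, with coded units standing in for lysozyme and SO2 levels; they are not the paper's measurements, and the four-factor design is reduced to two factors for brevity.

```python
import numpy as np

# Sketch of the response-surface methodology step: fit a quadratic model
# to "synthesis scores" and locate the stationary point. Synthetic data;
# x1 and x2 are coded stand-ins for lysozyme and SO2 levels.

rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 30)
x2 = rng.uniform(-1, 1, 30)
# synthetic score surface with an optimum at (0.4, -0.2) plus noise
y = 0.9 - (x1 - 0.4) ** 2 - 0.5 * (x2 + 0.2) ** 2 + rng.normal(0.0, 0.01, 30)

# design matrix for y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
b0, b1, b2, b11, b22, b12 = np.linalg.lstsq(X, y, rcond=None)[0]

# stationary point: solve grad = 0, i.e. [2*b11, b12; b12, 2*b22] x = -[b1, b2]
opt = np.linalg.solve(np.array([[2 * b11, b12], [b12, 2 * b22]]),
                      -np.array([b1, b2]))
```

With negative pure-quadratic coefficients the stationary point is a maximum; ridge analysis, as used in the paper, handles the cases where the fitted surface is a saddle instead.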

  16. NETWORK ANALYSIS OF PLAZA-STREET SYSTEM BASED ON THE HISTORICAL DEVELOPMENT PROCESS OF THE OLD CITY OF BARCELONA IN CONSIDERING THE RANGE OF WALKING DISTANCE

    NASA Astrophysics Data System (ADS)

    Fukuyama, Sachiyo; Hato, Eiji

    In this study, we analyzed the network structure of the old city of Barcelona in light of the historical development of its plaza-street system. We proposed an index based on the betweenness centrality of street networks, calculated within subnetworks constituted by the range of walking distance: 250 m, 500 m, and 1 km. As a result, we obtained a distribution of centrality for each range that could be explained by reference to the historical development process. We found that the network was characterized by three types of streets: main streets formed in the very early stages of development in the Middle Ages, which had high betweenness centrality in the 1 km range; neighborhood streets in the 250 m range; and links between districts in the 500 m range. Open spaces were located at the connection points of these streets and had the characteristics of hubs. We also found that new open spaces constructed in recent urban redevelopment were placed in low-centrality areas, with the intention of making these areas more open.
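
The range-limited betweenness index can be sketched with Brandes' algorithm run on hop-limited subnetworks. The tiny path graph below is illustrative, and hop count stands in for metric walking distance; neither is data from the study.

```python
from collections import deque

# Sketch of a range-limited betweenness index: Brandes' algorithm on a
# hop-limited subnetwork. Illustrative path graph; hop count stands in
# for the paper's 250 m / 500 m / 1 km walking-distance ranges.

def betweenness(adj):
    """Brandes' betweenness centrality for an unweighted graph
    (adj: node -> list of neighbours); endpoints excluded, pairs counted
    once per direction."""
    bc = dict.fromkeys(adj, 0.0)
    for s in adj:
        stack, preds = [], {v: [] for v in adj}
        sigma = dict.fromkeys(adj, 0.0); sigma[s] = 1.0
        dist = dict.fromkeys(adj, -1); dist[s] = 0
        q = deque([s])
        while q:                      # BFS with shortest-path counting
            v = q.popleft(); stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1; q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]; preds[w].append(v)
        delta = dict.fromkeys(adj, 0.0)
        while stack:                  # back-propagate dependencies
            w = stack.pop()
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1.0 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc

def subnetwork(adj, center, radius):
    """Induced subgraph of nodes within `radius` hops of `center`."""
    dist = {center: 0}; q = deque([center])
    while q:
        v = q.popleft()
        if dist[v] < radius:
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1; q.append(w)
    return {v: [w for w in adj[v] if w in dist] for v in dist}

# path 0-1-2-3-4: the middle segment carries the most shortest paths
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
bc = betweenness(adj)
```

The paper's index would evaluate `betweenness(subnetwork(adj, v, r))` for each node `v` at each range `r`, using metric shortest-path distance rather than hops.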

  17. Developing an Expert System for Nursing Practice

    PubMed Central

    Ozbolt, Judy G.; Schultz, Samuel; Swain, Mary Ann P.; Abraham, Ivo L.; Farchaus-Stein, Karen

    1984-01-01

    The American Nurses' Association has set eight Standards of Nursing Practice related to the nursing process. Computer-aided information systems intended to facilitate the nursing process must be designed to promote adherence to these professional standards. For each of the eight standards, the paper tells how a hypothetical expert system could help nurses to meet the standard. A prototype of such an expert system is being developed. The paper describes issues in conceptualizing clinical decision-making and developing decision strategies for the prototype system. The process of developing the prototype system is described.

  18. ERIPS: Earth Resource Interactive Processing System

    NASA Technical Reports Server (NTRS)

    Quinn, M. J.

    1975-01-01

    The ERIPS is an interactive computer system used in the analysis of remotely sensed data. It consists of a set of software programs executed on an IBM System/360 Model 75J computer under the direction of a trained analyst. The software was derived from the Purdue LARSYS program and has evolved to include an extensive pattern recognition system and a number of manipulative preprocessing routines which prepare the imagery for the pattern recognition application. The original purpose of the system was to analyze remotely sensed data, to develop and perfect techniques to process the data, and to determine the feasibility of applying the data to significant earth resources problems. The system subsequently developed into a production system, with error recovery and multi-jobbing capabilities added.

  19. Mobile processing in open systems

    SciTech Connect

    Sapaty, P.S.

    1996-12-31

    A universal spatial automaton, called WAVE, for highly parallel processing in arbitrary distributed systems is described. The automaton is based on a virus principle whereby recursive programs, or waves, self-navigate in networks of data or processes in multiple cooperative parts while controlling and modifying the environment they exist in and move through. The layered general organization of the automaton as well as its distributed implementation in computer networks are discussed. As the automaton dynamically creates, modifies, activates, and processes any knowledge networks arbitrarily distributed in computer networks, it can easily model other paradigms for parallel and distributed computing. A comparison of WAVE with some known programming models and languages, and ideas for their possible integration, are also given.

  20. Parallel processing and expert systems

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Lau, Sonie

    1991-01-01

    Whether it be monitoring the thermal subsystem of Space Station Freedom or controlling the navigation of the autonomous rover on Mars, NASA missions in the 1990s cannot enjoy an increased level of autonomy without the efficient use of expert systems. Merely increasing the computational speed of uniprocessors may not be able to guarantee that real-time demands are met for large expert systems. Speed-up via parallel processing must be pursued alongside the optimization of sequential implementations. Prototypes of parallel expert systems have been built at universities and industrial labs in the U.S. and Japan. The state-of-the-art research in progress related to parallel execution of expert systems was surveyed. The survey is divided into three major sections: (1) multiprocessors for parallel expert systems; (2) parallel languages for symbolic computations; and (3) measurements of parallelism of expert systems. Results to date indicate that the parallelism achieved for these systems is small. In order to obtain greater speed-ups, data parallelism and application parallelism must be exploited.
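
The limit on speed-up noted in the survey is captured by Amdahl's law, a standard result (not from the survey itself): whatever fraction of the workload remains serial bounds the speed-up attainable from parallel processing, regardless of processor count.

```python
# Amdahl's law (standard result, not from the survey): the serial fraction
# of a workload bounds the speed-up attainable from parallel processing.

def amdahl_speedup(parallel_fraction, n_processors):
    """Upper bound on speed-up when only `parallel_fraction` of the
    workload parallelizes across `n_processors`."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_processors)

# a rule-matching phase that is only 50% parallelizable caps the whole
# system below 2x, no matter how many processors are added
cap = amdahl_speedup(0.5, 100)
```

This is why the survey stresses exploiting data parallelism and application parallelism: both raise the parallelizable fraction itself rather than just adding processors.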

  1. Managing the Software Development Process

    NASA Astrophysics Data System (ADS)

    Lubelczyk, J.; Parra, A.

    The goal of any software development project is to produce a product that is delivered on time, within the allocated budget, and with the capabilities expected by the customer; unfortunately, this goal is rarely achieved. However, a properly managed project in a mature software engineering environment can consistently achieve this goal. In this paper we provide an introduction to three project success factors: a properly managed project, a competent project manager, and a mature software engineering environment. We also present an overview of the benefits of a mature software engineering environment, based on 24 years of data from the Software Engineering Lab, and suggest some first steps that an organization can take to begin benefiting from this environment. Because the depth and breadth of software engineering exceed the scope of this paper, various references are cited with the goal of raising awareness and encouraging further investigation into software engineering and project management practices.

  2. Managing the Software Development Process

    NASA Technical Reports Server (NTRS)

    Lubelczky, Jeffrey T.; Parra, Amy

    1999-01-01

    The goal of any software development project is to produce a product that is delivered on time, within the allocated budget, and with the capabilities expected by the customer; unfortunately, this goal is rarely achieved. However, a properly managed project in a mature software engineering environment can consistently achieve this goal. In this paper we provide an introduction to three project success factors: a properly managed project, a competent project manager, and a mature software engineering environment. We also present an overview of the benefits of a mature software engineering environment, based on 24 years of data from the Software Engineering Lab, and suggest some first steps that an organization can take to begin benefiting from this environment. Because the depth and breadth of software engineering exceed the scope of this paper, various references are cited with the goal of raising awareness and encouraging further investigation into software engineering and project management practices.

  3. ASI-Volcanic Risk System (SRV): a pilot project to develop EO data processing modules and products for volcanic activity monitoring, first results.

    NASA Astrophysics Data System (ADS)

    Silvestri, M.; Musacchio, M.; Buongiorno, M. F.; Dini, L.

    2009-04-01

    The Project called Sistema Rischio Vulcanico (SRV) is funded by the Italian Space Agency (ASI) in the frame of the National Space Plan 2003-2005 under the Earth Observations section for natural risks management. The SRV Project is coordinated by the Istituto Nazionale di Geofisica e Vulcanologia (INGV), which is responsible at the national level for volcanic monitoring. The project philosophy is to implement, in incremental versions, specific modules that process, store, and visualize, through Web GIS tools, geophysical parameters suitable for volcanic risk management. ASI-SRV is devoted to the development of an integrated system based on Earth Observation (EO) data to respond to specific needs of the Italian Civil Protection Department (DPC) and to improve the monitoring of Italian active volcanoes during all risk phases (pre-crisis, crisis, and post-crisis). The ASI-SRV system provides support to risk managers during the different volcanic activity phases, and its results are addressed to the DPC. SRV provides the capability to import many different EO data types into the system; it maintains a repository where the acquired data are stored and generates selected volcanic products. The processing modules for EO optical sensor data are based on procedures jointly developed by INGV and the University of Modena. These procedures allow a number of parameters to be estimated, such as surface thermal properties and gas, aerosol, and ash emissions, and allow the volcanic products to be characterized in terms of composition and geometry. For the analysis of surface thermal characteristics, the available algorithms extract information during the prevention phase and during the warning and crisis phases. In the prevention phase, the thermal analysis is directed at identifying temperature variations on volcanic structures which may indicate a change in the state of volcanic activity. At the moment the only sensor that

  4. Dissection of Bacterial Wilt on Medicago truncatula Revealed Two Type III Secretion System Effectors Acting on Root Infection Process and Disease Development

    PubMed Central

    Turner, Marie; Jauneau, Alain; Genin, Stéphane; Tavella, Marie-José; Vailleau, Fabienne; Gentzbittel, Laurent; Jardinaud, Marie-Françoise

    2009-01-01

    Ralstonia solanacearum is the causal agent of the devastating bacterial wilt disease, which colonizes susceptible Medicago truncatula via the intact root tip. Infection involves four steps: appearance of root tip symptoms, root tip cortical cell invasion, vessel colonization, and foliar wilting. We examined this pathosystem by in vitro inoculation of intact roots of susceptible or resistant M. truncatula with the pathogenic strain GMI1000. The infection process was type III secretion system dependent and required two type III effectors, Gala7 and AvrA, which were shown to be involved at different stages of infection. Both effectors were involved in development of root tip symptoms, and Gala7 was the main determinant for bacterial invasion of cortical cells. Vessel invasion depended on the host genetic background and was never observed in the resistant line. The invasion of the root tip vasculature in the susceptible line caused foliar wilting. The avrA mutant showed reduced aggressiveness in all steps of the infection process, suggesting a global role in R. solanacearum pathogenicity. The roles of these two effectors in subsequent stages were studied using an assay that bypassed the penetration step; with this assay, the avrA mutant showed no effect compared with the GMI1000 strain, indicating that AvrA is important in early stages of infection. However, later disease symptoms were reduced in the gala7 mutant, indicating a key role in later stages of infection. PMID:19493968

  5. Chemical production processes and systems

    DOEpatents

    Holladay, Johnathan E.; Muzatko, Danielle S.; White, James F.; Zacher, Alan H.

    2014-06-17

    Hydrogenolysis systems are provided that can include a reactor housing an Ru-comprising hydrogenolysis catalyst and wherein the contents of the reactor are maintained at a neutral or acidic pH. Reactant reservoirs within the system can include a polyhydric alcohol compound and a base, wherein a weight ratio of the base to the compound is less than 0.05. Systems also include the product reservoir comprising a hydrogenolyzed polyhydric alcohol compound and salts of organic acids, and wherein the moles of base are substantially equivalent to the moles of salts of organic acids. Processes are provided that can include an Ru-comprising catalyst within a mixture having a neutral or acidic pH. A weight ratio of the base to the compound can be between 0.01 and 0.05 during exposing.

  6. Chemical production processes and systems

    SciTech Connect

    Holladay, Johnathan E; Muzatko, Danielle S; White, James F; Zacher, Alan H

    2015-04-21

    Hydrogenolysis systems are provided that can include a reactor housing an Ru-comprising hydrogenolysis catalyst and wherein the contents of the reactor are maintained at a neutral or acidic pH. Reactant reservoirs within the system can include a polyhydric alcohol compound and a base, wherein a weight ratio of the base to the compound is less than 0.05. Systems also include the product reservoir comprising a hydrogenolyzed polyhydric alcohol compound and salts of organic acids, and wherein the moles of base are substantially equivalent to the moles of salts of organic acids. Processes are provided that can include an Ru-comprising catalyst within a mixture having a neutral or acidic pH. A weight ratio of the base to the compound can be between 0.01 and 0.05 during exposing.

  7. NDMAS System and Process Description

    SciTech Connect

    Larry Hull

    2012-10-01

    Experimental data generated by the Very High Temperature Reactor Program need to be more readily available to users, either as data tables on Web pages that can be downloaded to Excel or as delimited text that can be used directly as input to analysis and simulation codes, statistical packages, and graphics software. One solution that can provide current and future researchers with direct access to the data they need, while complying with records management requirements, is the Nuclear Data Management and Analysis System (NDMAS). This report describes the NDMAS system and its components, defines roles and responsibilities, describes the functions the system performs, describes the internal processes the NDMAS team uses to carry out its mission, and describes the hardware and software used to meet Very High Temperature Reactor Program needs.
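
    The delimited-text delivery the report describes is the ordinary pattern of exporting a table for Excel or analysis codes. As a sketch (the column names and records are invented, not NDMAS's actual schema):

```python
import csv
import io

# Hypothetical experimental records; NDMAS's real tables differ.
rows = [
    {"run_id": "VHTR-001", "temperature_C": 950, "duration_h": 100},
    {"run_id": "VHTR-002", "temperature_C": 1000, "duration_h": 250},
]

def to_delimited(records, delimiter=","):
    """Serialize records to delimited text usable by Excel or analysis codes."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=records[0].keys(), delimiter=delimiter)
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

print(to_delimited(rows).splitlines()[0])  # run_id,temperature_C,duration_h
```

    The same helper emits tab-delimited text for direct code input by passing `delimiter="\t"`.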

  8. Development of novel microencapsulation processes

    NASA Astrophysics Data System (ADS)

    Yin, Weisi

    of polymer solution suspended in water or from a spray. Hollow PS particles were obtained by swelling PS latex with solvent, freezing in liquid nitrogen, and drying in vacuum. It is shown that the particle morphology is due to phase separation in the polymer emulsion droplets upon freezing in liquid nitrogen, and that morphological changes are driven largely by lowering interfacial free energy. The dried hollow particles were resuspended in a dispersing media and exposed to a plasticizer, which imparts mobility to polymer chains, to close the surface opening and form microcapsules surrounding an aqueous core. The interfacial free energy difference between the hydrophobic inside and hydrophilic outside surfaces is the major driving force for closing the hole on the surface. A controlled release biodegradable vehicle for drug was made by encapsulating procaine hydrochloride, a water-soluble drug, into the core of poly(DL-lactide) (PLA) microcapsules, which were made by the freeze-drying and subsequent closing process. The encapsulation efficiency is affected by the hollow particle morphology, amount of closing agent, exposure time, surfactant, and method of dispersing the hollow particles in water. Controlled release of procaine hydrochloride from the microcapsules into phosphate buffer was observed. The use of benign solvents dimethyl carbonate in spray/freeze-drying and CO2 for closing would eliminate concerns of residual harmful solvent in the product. The ease of separation of CO2 from the drug solution may also enable recycling of the drug solution to increase the overall encapsulation efficiency using these novel hollow particles.

  9. Power Systems Development Facility

    SciTech Connect

    2003-07-01

    This report discusses Test Campaign TC12 of the Kellogg Brown & Root, Inc. (KBR) Transport Gasifier train with a Siemens Westinghouse Power Corporation (SW) particle filter system at the Power Systems Development Facility (PSDF) located in Wilsonville, Alabama. The Transport Gasifier is an advanced circulating fluidized-bed reactor designed to operate as either a combustor or a gasifier using a particulate control device (PCD). While operating as a gasifier, either air or oxygen can be used as the oxidant. Test run TC12 began on May 16, 2003, with the startup of the main air compressor and the lighting of the gasifier start-up burner. The Transport Gasifier operated until May 24, 2003, when a scheduled outage occurred to allow maintenance crews to install the fuel cell test unit and modify the gas clean-up system. On June 18, 2003, the test run resumed when operators relit the start-up burner, and testing continued until the scheduled end of the run on July 14, 2003. TC12 accumulated a total of 733 hours of operation using Powder River Basin (PRB) subbituminous coal. Over the course of the entire test run, gasifier temperatures varied between 1,675 and 1,850 F at pressures from 130 to 210 psig.

  10. Development of a Sample Processing System (SPS) for the in situ search of organic compounds on Mars : application to the Mars Organic Molecule Analyzer (MOMA) experiment

    NASA Astrophysics Data System (ADS)

    Buch, A.; Sternberg, R.; Garnier, C.; Fressinet, C.; Szopa, C.; El Bekri, J.; Coll, P.; Rodier, C.; Raulin, F.; Goesmann, F.

    2008-09-01

    The search for signs of past or present life is one of the primary goals of future Mars exploration missions. To this end, the Mars Organic Molecule Analyzer (MOMA) module of the upcoming European ExoMars 2013 mission is designed for the in situ analysis, in the Martian soil, of organic molecules of exobiological interest such as amino acids, carboxylic acids, nucleobases, and polycyclic aromatic hydrocarbons (PAHs). In the framework of the MOMA experiment, we have been developing a Sample Processing System (SPS) compatible with gas chromatography (GC) analysis. The main goal of the SPS is to allow the extraction and gas chromatographic separation of refractory organic compounds from a solid matrix, at trace level, within space-compatible operating conditions. The SPS is a mini-reactor, containing the solid sample (~500 mg), able to raise (or lower) its internal temperature from 20 to 500 °C within 13 s. The extraction step is performed by thermodesorption, the best extraction yield being obtained at 300 °C for 10 to 20 min. Notably, the temperature can be increased up to 500 °C without a significant loss of efficiency if the heating run time is kept below 3 min. After thermodesorption, chemical derivatization of the extracted compounds is performed directly on the soil with a mixture of MTBSTFA and DMF [Buch et al.]. By decreasing the polarity of the target molecules, this step allows their volatilization at temperatures below 250 °C without chemical degradation. Once derivatized, the targeted volatile molecules are transferred through a heated transfer line to the gas chromatograph, coupled with a mass spectrometer for detection. The SPS is a "one step/one pot" sample preparation system that should allow the MOMA experiment to detect refractory molecules absorbed in the Martian soil at a detection limit below the ppb level. A. Buch, R. Sternberg, C. Szopa, C. Freissinet, C. Garnier, J. El Bekri

  11. Propellant injection systems and processes

    NASA Technical Reports Server (NTRS)

    Ito, Jackson I.

    1995-01-01

    The previous 'Art of Injector Design' is maturing and merging with the more systematic 'Science of Combustion Device Analysis.' This technology builds upon observation, correlation, experimentation, and ultimately analytical modeling based on basic engineering principles. This methodology is more systematic and far superior to the historical injector design process of 'Trial and Error' or blindly 'Copying Past Successes.' The benefit of such an approach is the ability to rank candidate design concepts by relative probability of success, or technical risk, across all the important combustion device design requirements and combustion process development risk categories before committing to an engine development program. Even if no single design concept can be shown analytically to satisfy all requirements simultaneously, a series of key enabling risk-mitigation technologies can be identified for early resolution. Lower-cost subscale or laboratory experimentation to demonstrate proof of principle, critical instrumentation requirements, and design-discriminating test plans can then be developed based on the physical insight provided by these analyses.

  12. Development leading to a 200 kV, 20 kA, 30 hertz radar-like modulator system for intense ion beam processing

    SciTech Connect

    Reass, W.; Davis, H.; Olson, J.; Coates, D.; Schleinitz, H.

    1996-10-01

    This paper presents the electrical system design methodology being developed for use in the Los Alamos CHAMP (Continuous High-Average Power Microsecond Pulser) program. CHAMP is a Magnetically confined Anode Plasma diode (MAP diode) intense ion source. In addition to overall CHAMP diode system requirements, the design of the pertinent electrical pulse modulator systems are presented.

  13. FLIPS: Friendly Lisp Image Processing System

    NASA Astrophysics Data System (ADS)

    Gee, Shirley J.

    1991-08-01

    The Friendly Lisp Image Processing System (FLIPS) is the interface to Advanced Target Detection (ATD), a multi-resolutional image analysis system developed by Hughes in conjunction with the Hughes Research Laboratories. Both menu- and graphics-driven, FLIPS enhances system usability by supporting the interactive nature of research and development. Although much progress has been made, fully automated image understanding technology that is both robust and reliable is not a reality. In situations where highly accurate results are required, skilled human analysts must still verify the findings of these systems. Furthermore, the systems often require processing times several orders of magnitude greater than that needed by veteran personnel to analyze the same image. The purpose of FLIPS is to facilitate the ability of an image analyst to take statistical measurements on digital imagery in a timely fashion, a capability critical in research environments where a large percentage of time is expended in algorithm development. In many cases, this entails minor modifications or code tinkering. Without a well-developed man-machine interface, throughput is unduly constricted. FLIPS provides mechanisms which support rapid prototyping for ATD. This paper examines the ATD/FLIPS system. The philosophy of ATD in addressing image understanding problems is described, and the capabilities of FLIPS are discussed, along with a description of the interaction between ATD and FLIPS. Finally, an overview of current plans for the system is outlined.

  14. Advanced Dewatering Systems Development

    SciTech Connect

    R.H. Yoon; G.H. Luttrell

    2008-07-31

    A new fine coal dewatering technology has been developed and tested in the present work. The work was funded by the Solid Fuels and Feedstocks Grand Challenge PRDA. The objective of this program was to 'develop innovative technical approaches to ensure a continued supply of environmentally sound solid fuels for existing and future combustion systems with minimal incremental fuel cost.' Specifically, this solicitation is aimed at developing technologies that can (i) improve the efficiency or economics of the recovery of carbon when beneficiating fine coal from both current production and existing coal slurry impoundments and (ii) assist in the greater utilization of coal fines by improving the handling characteristics of fine coal via dewatering and/or reconstitution. The results of the test work conducted during Phase I of the current project demonstrated that the new dewatering technologies can substantially reduce the moisture from fine coal, while the test work conducted during Phase II successfully demonstrated the commercial viability of this technology. It is believed that availability of such efficient and affordable dewatering technology is essential to meeting the DOE's objectives.

  15. Development of the selective hydrophobic coagulation process

    SciTech Connect

    Yoon, R.H.; Luttrell, G.H.

    1992-01-01

    A novel technique for selectively coagulating and separating coal from dispersed mineral matter has been developed at Virginia Tech. The process, Selective Hydrophobic Coagulation (SHC), has been studied since 1986 under the sponsorship of the US Department of Energy. The SHC process differs from oil agglomeration, shear or polymer flocculation, and electrolytic coagulation processes in that it does not require reagents or additives to induce the formation of coagula. In most cases, simple pH control is all that is required to (i) induce the coagulation of coal particles and (ii) effectively disperse particles of mineral matter. If the coal is oxidized, a small dosage of reagents can be used to enhance the process. The technical work program was initiated on July 1, 1992. Force-distance curves were generated for DDOABr-coated mica surfaces in water and used to calculate hydrophobicity constants and decay lengths for this system, and a new device for the measurement of water contact angles, similar to the Wilhelmy plate balance, has been built. 225 kg samples of Pittsburgh No. 8 and Elkhorn No. 3 seam coals were obtained; a static mixer test facility for the study of coagula growth was set up and was undergoing shakedown tests at the end of the quarter; a bench-scale lamella thickener was being constructed; and preliminary coagula/mineral separation tests were being conducted in a bench-scale continuous drum filter.
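
    The "hydrophobicity constants and decay lengths" refer to fitting force-distance data to an exponential attraction law; a commonly used single-exponential form is F/R = -C·exp(-H/D), with separation H, constant C, and decay length D. The sketch below uses placeholder parameter values, not the measured DDOABr-mica results:

```python
import math

def hydrophobic_force_per_radius(H_nm: float, C_mN_per_m: float, D_nm: float) -> float:
    """Single-exponential hydrophobic force law, F/R = -C * exp(-H/D).

    H_nm: surface separation, C_mN_per_m: hydrophobicity constant,
    D_nm: decay length. Negative values denote attraction.
    """
    return -C_mN_per_m * math.exp(-H_nm / D_nm)

# At a separation of one decay length, the magnitude falls to C/e.
f = hydrophobic_force_per_radius(H_nm=5.0, C_mN_per_m=0.5, D_nm=5.0)
print(round(f, 4))  # -0.1839
```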

  16. Dynamic security assessment processing system

    NASA Astrophysics Data System (ADS)

    Tang, Lei

    The architecture of a dynamic security assessment processing system (DSAPS) is proposed to address online dynamic security assessment (DSA), with the dissertation focusing on low-probability, high-consequence events. DSAPS upgrades current online DSA functions and adds new functions to fit into the modern power grid. Trajectory sensitivity analysis is introduced and its applications in power systems are reviewed. An index is presented to assess transient voltage dips quantitatively using trajectory sensitivities. Then the framework of an anticipatory computing system (ACS) for cascading defense is presented as an important function of DSAPS. ACS addresses various security problems and the uncertainties in cascading outages. Corrective control design is automated to mitigate system stress in cascading progressions. The corrective controls introduced in the dissertation include corrective security-constrained optimal power flow, a two-stage load control for severe under-frequency conditions, and transient-stability-constrained optimal power flow for cascading outages. With state-of-the-art computing facilities to perform high-speed, extended-term time-domain simulation and optimization for large-scale systems, DSAPS/ACS efficiently addresses online DSA for low-probability, high-consequence events, which are not addressed by today's industrial practice. Human intervention in the computationally burdensome analysis is reduced.

  17. Performance Monitoring of Distributed Data Processing Systems

    NASA Technical Reports Server (NTRS)

    Ojha, Anand K.

    2000-01-01

    Test and checkout systems are essential components in ensuring the safety and reliability of aircraft and related systems for space missions. A variety of such systems, developed over several years, are in use at NASA/KSC. Many of these systems are configured as distributed data processing systems, with functionality spread over several multiprocessor nodes interconnected through networks. To be cost-effective, a system should take the least amount of resources and perform a given testing task in the least amount of time. There are two aspects of performance evaluation: monitoring and benchmarking. While monitoring is valuable to system administrators for operation and maintenance, benchmarking is important in designing and upgrading computer-based systems. These two aspects of performance evaluation are the foci of this project. This paper first discusses various issues related to software, hardware, and hybrid performance monitoring as applicable to distributed systems, and specifically to the TCMS (Test Control and Monitoring System). Next, a comparison of several probing instructions is made to show that the hybrid monitoring technique developed by NIST (the National Institute of Standards and Technology) is the least intrusive and takes only one-fourth of the time taken by software monitoring probes. In the rest of the paper, issues related to benchmarking a distributed system are discussed, and finally a prescription for developing a micro-benchmark for the TCMS is provided.
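
    A software monitoring probe, in its simplest form, is a timestamped event record written by the monitored process itself, which is exactly what makes it intrusive compared with hardware or hybrid monitoring. The sketch below is generic, not the TCMS or NIST instrumentation:

```python
import time

event_log = []

def probe(event: str) -> None:
    """Software monitoring probe: the monitored process records a
    timestamped event itself, so probe cost adds to its run time."""
    event_log.append((time.perf_counter_ns(), event))

def monitored_task(n: int) -> int:
    """A task instrumented with entry/exit probes."""
    probe("task_start")
    total = sum(i * i for i in range(n))
    probe("task_end")
    return total

monitored_task(10_000)
print(f"events recorded: {len(event_log)}")  # events recorded: 2
```

    A hybrid monitor moves the timestamping and buffering into dedicated hardware, leaving only a cheap signal in the monitored code path, which is the source of the roughly four-fold overhead reduction reported above.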

  18. A fuzzy classifier system for process control

    NASA Technical Reports Server (NTRS)

    Karr, C. L.; Phillips, J. C.

    1994-01-01

    A fuzzy classifier system that discovers rules for controlling a mathematical model of a pH titration system was developed by researchers at the U.S. Bureau of Mines (USBM). Fuzzy classifier systems successfully combine the strengths of learning classifier systems and fuzzy logic controllers. Learning classifier systems resemble familiar production rule-based systems, but they represent their IF-THEN rules by strings of characters rather than in the traditional linguistic terms. Fuzzy logic is a tool that allows for the incorporation of abstract concepts into rule-based systems, thereby allowing the rules to resemble the familiar 'rules of thumb' commonly used by humans when solving difficult process control and reasoning problems. Like learning classifier systems, fuzzy classifier systems employ a genetic algorithm to explore and sample new rules for manipulating the problem environment. Like fuzzy logic controllers, fuzzy classifier systems encapsulate knowledge in the form of production rules. The results presented in this paper demonstrate the ability of fuzzy classifier systems to generate a fuzzy logic-based process control system.
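
    The string-encoded IF-THEN rules of a learning classifier system can be sketched as follows. The ternary alphabet with '#' wildcards is the standard encoding; the specific rules and actions here are invented for illustration, not taken from the USBM pH controller:

```python
# IF-THEN rules as fixed-length character strings: '0'/'1' must match
# the corresponding input bit, '#' is a wildcard. The action follows ':'.
RULES = ["1#0:open_valve", "0##:close_valve", "11#:hold"]

def matches(condition: str, state: str) -> bool:
    """A condition matches when every position is a wildcard or equal."""
    return all(c in ("#", s) for c, s in zip(condition, state))

def fire(state: str) -> list[str]:
    """Return the actions of every rule whose condition matches the state.
    (A genetic algorithm would evolve RULES; that step is omitted here.)"""
    actions = []
    for rule in RULES:
        cond, action = rule.split(":")
        if matches(cond, state):
            actions.append(action)
    return actions

print(fire("110"))  # ['open_valve', 'hold']
```

    A fuzzy classifier system replaces the crisp bit-match with fuzzy membership grades, so several rules fire to partial degrees and their actions are blended.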

  19. Telemedicine optoelectronic biomedical data processing system

    NASA Astrophysics Data System (ADS)

    Prosolovska, Vita V.

    2010-08-01

    The telemedicine optoelectronic biomedical data processing system was created to share medical information for health-care oversight and for a timely, rapid response to crises. The system includes the following main blocks: a bioprocessor, analog-digital converters for biomedical images, an optoelectronic module for image processing, an optoelectronic module for parallel recording and storage of biomedical images, and a matrix screen for displaying biomedical images. The rated temporal characteristics of the blocks are defined by the particular triggering optoelectronic couple in the analog-digital converters and by the imaging time of the matrix screen. The element base for the hardware implementation of the developed matrix screen is integrated optoelectronic couples produced by selective epitaxy.

  20. SIT-5 system development.

    NASA Technical Reports Server (NTRS)

    Hyman, J., Jr.

    1972-01-01

    A 5-cm structurally integrated ion thruster (SIT-5) has been developed for attitude control and stationkeeping of synchronous satellites. With two-dimensional thrust-vectoring grids, a first-generation unit has demonstrated a thrust of 0.56 mlb at a beam voltage of 1200 V, a total mass efficiency of 64%, and an electrical efficiency of 46.8%. Structural integrity was demonstrated with a dielectric-coated grid under shock (30 G), sinusoidal (9 G), and random (19.9 G rms) accelerations. The system envelope is 31.8 cm long with a 13.9 cm flange bolt circle, and the mass is 8.5 kg, including 6.2 kg of mercury propellant. Characteristics of a second-generation unit indicate significant performance gains.

  1. [Development of the affect system].

    PubMed

    Moser, U; Von Zeppelin, I

    1996-01-01

    The authors show that the development of the affect system commences with affects of an exclusively communicative nature. These regulate the relationship between subject and object. On a different plane they also provide information on the feeling of self deriving from the interaction. Affect is seen throughout as a special kind of information. One section of the article is given over to intensity regulation and early affect defenses. The development of cognitive processes leads to the integration of affect systems and cognitive structures. In the pre-conceptual concretistic phase, fantasies change the object relation in such a way as to make unpleasant affects disappear. Only at a later stage do fantasies acquire the capacity to deal with affects. Ultimately, the affect system is grounded on an invariant relationship feeling. On a variety of different levels it displays the features typical of situation theory and the theory of the representational world, thus making it possible to entertain complex object relations. In this process the various planes of the affect system are retained and practised. Finally, the authors discuss the consequences of their remarks for the understanding of psychic disturbances and the therapies brought to bear on them. PMID:8584745

  2. Power Systems Development Facility

    SciTech Connect

    Southern Company Services

    2004-04-30

    This report discusses Test Campaign TC15 of the Kellogg Brown & Root, Inc. (KBR) Transport Gasifier train with a Siemens Power Generation, Inc. (SPG) particle filter system at the Power Systems Development Facility (PSDF) located in Wilsonville, Alabama. The Transport Gasifier is an advanced circulating fluidized-bed reactor designed to operate as either a combustor or a gasifier using a particulate control device (PCD). While operating as a gasifier, either air or oxygen can be used as the oxidant. Test run TC15 began on April 19, 2004, with the startup of the main air compressor and the lighting of the gasifier startup burner. The Transport Gasifier was shut down on April 29, 2004, having accumulated 200 hours of operation using Powder River Basin (PRB) subbituminous coal. About 91 hours of the test run occurred during oxygen-blown operations, and another 6 hours were in enriched-air mode. The remainder of the test run, approximately 103 hours, took place during air-blown operations. The highest operating temperature in the gasifier mixing zone varied mostly from 1,800 to 1,850 F. The gasifier exit pressure ran between 200 and 230 psig during air-blown operations and between 110 and 150 psig in oxygen-enhanced air operations.
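
    The reported breakdown of operating modes is consistent with the 200-hour total:

```python
# Hours by oxidant mode, as reported for test run TC15.
oxygen_blown = 91
enriched_air = 6
air_blown = 103  # "approximately", the balance of the run

total = oxygen_blown + enriched_air + air_blown
print(total)  # 200
```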

  3. Spacelab output processing system architectural study

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Two different system architectures are presented, derived from two different data flows within the Spacelab Output Processing System. The major difference between the architectures is the position of the decommutation function: the first architecture performs decommutation in the latter half of the system, while the second performs it at the front end. For examination, the system was divided into five stand-alone subsystems: Work Assembler, Mass Storage System, Output Processor, Peripheral Pool, and Resource Monitor. The workload of each subsystem was estimated independently of the specific devices to be used. Candidate devices were surveyed from a wide sampling of off-the-shelf devices. Analytical expressions were developed to quantify the projected workload in conjunction with typical devices that would adequately handle the subsystem tasks. All of the study efforts were then directed toward preparing performance and cost curves for each architecture subsystem.

  4. Development of the LICADO coal cleaning process

    SciTech Connect

    Not Available

    1990-07-31

    Development of the liquid carbon dioxide process for the cleaning of coal was performed in batch, variable volume (semi-continuous), and continuous tests. Continuous operation at feed rates up to 4.5 kg/hr (10 lb/hr) was achieved with the Continuous System. Coals tested included Upper Freeport, Pittsburgh, Illinois No. 6, and Middle Kittanning seams. Results showed that the ash and pyrite rejections agreed closely with washability data for each coal at the particle size tested (-200 mesh). A 0.91 metric ton (1-ton) per hour Proof-of-Concept Plant was conceptually designed. A 181 metric ton (200 ton) per hour plant and a 45 metric ton (50 ton) per hour plant were sized sufficiently to estimate costs for economic analyses. The processing costs for the 181 metric ton (200 ton) per hour and 45 metric ton (50 ton) per hour plants were estimated to be $18.96 per metric ton ($17.20 per ton) and $11.47 per metric ton ($10.40 per ton), respectively. The costs for the 45 metric ton per hour plant are lower because it is assumed to be a fines recovery plant, which does not require a grinding circuit or a complex waste handling system.
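
    The paired per-ton figures are the same costs in different units; converting from short tons (2,000 lb) to metric tons (about 2,204.62 lb) reproduces the reported numbers to within a cent:

```python
LB_PER_METRIC_TON = 2204.62  # pounds in one metric ton
LB_PER_SHORT_TON = 2000.0    # pounds in one US short ton

def per_metric_ton(cost_per_short_ton: float) -> float:
    """Convert a $/short-ton cost to $/metric-ton."""
    return cost_per_short_ton * LB_PER_METRIC_TON / LB_PER_SHORT_TON

print(f"${per_metric_ton(17.20):.2f}")  # $18.96
```

    The second pair ($10.40/ton, $11.47/metric ton) agrees to within a cent as well, the residual coming from rounding in the underlying estimates.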

  5. Development of the Concise Data Processing Assessment

    ERIC Educational Resources Information Center

    Day, James; Bonn, Doug

    2011-01-01

    The Concise Data Processing Assessment (CDPA) was developed to probe student abilities related to the nature of measurement and uncertainty and to handling data. The diagnostic is a ten question, multiple-choice test that can be used as both a pre-test and post-test. A key component of the development process was interviews with students, which…

  6. Development Process for Science Operation Software

    NASA Astrophysics Data System (ADS)

    Ballester, Pascal

    2015-12-01

    Scientific software development at ESO involves defined processes for the main phases of project inception, monitoring of development performed by instrument consortia, application maintenance, and application support. We discuss the lessons learnt and evolution of the process for the next generation of tools and observing facilities.

  7. Vision Systems Illuminate Industrial Processes

    NASA Technical Reports Server (NTRS)

    2013-01-01

    When NASA designs a spacecraft to undertake a new mission, innovation does not stop after the design phase. In many cases, these spacecraft are firsts of their kind, requiring not only remarkable imagination and expertise in their conception but new technologies and methods for their manufacture. In the realm of manufacturing, NASA has from necessity worked on the cutting edge, seeking new techniques and materials for creating unprecedented structures, as well as capabilities for reducing the cost and increasing the efficiency of existing manufacturing technologies. From friction stir welding enhancements (Spinoff 2009) to thermoset composites (Spinoff 2011), NASA's innovations in manufacturing have often transferred to the public in ways that enable the expansion of the Nation's industrial productivity. NASA has long pursued ways of improving upon and ensuring quality results from manufacturing processes ranging from arc welding to thermal coating applications. But many of these processes generate blinding light (hence the need for special eyewear during welding) that obscures the process while it is happening, making it difficult to monitor and evaluate. In the 1980s, NASA partnered with a company to develop technology to address this issue. Today, that collaboration has spawned multiple commercial products that not only support effective manufacturing for private industry but also may support NASA in the use of an exciting, rapidly growing field of manufacturing ideal for long-duration space missions.

  8. Features, Events, and Processes: system Level

    SciTech Connect

    D. McGregor

    2004-10-15

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760).

  9. ASRM test report: Autoclave cure process development

    NASA Technical Reports Server (NTRS)

    Nachbar, D. L.; Mitchell, Suzanne

    1992-01-01

    ASRM insulated segments will be autoclave cured following insulation pre-form installation and strip wind operations. Following competitive bidding, Aerojet ASRM Division (AAD) Purchase Order 100142 was awarded to American Fuel Cell and Coated Fabrics Company, Inc. (Amfuel), Magnolia, AR, for subcontracted insulation autoclave cure process development. Autoclave cure process development test requirements were included in Task 3 of TM05514, Manufacturing Process Development Specification for Integrated Insulation Characterization and Stripwind Process Development. The test objective was to establish autoclave cure process parameters for ASRM insulated segments. Six tasks were completed to: (1) evaluate cure parameters that control acceptable vulcanization of ASRM Kevlar-filled EPDM insulation material; (2) identify first and second order impact parameters on the autoclave cure process; and (3) evaluate insulation material flow-out characteristics to support pre-form configuration design.

  10. Knowledge base for expert system process control/optimization

    NASA Astrophysics Data System (ADS)

    Lee, C. W.; Abrams, Frances L.

    An expert system based on the philosophy of qualitative process automation has been developed for the autonomous cure cycle development and control of the autoclave curing process. The system's knowledge base in the form of declarative rules is based on the qualitative understanding of the curing process. The knowledge base and examples of the resulting cure cycle are presented.

  11. EUV mask process specifics and development challenges

    NASA Astrophysics Data System (ADS)

    Nesladek, Pavel

    2014-07-01

    EUV lithography is currently the favorite and most promising candidate among the next-generation lithography (NGL) technologies. A decade ago, NGL was expected to be used for the 45 nm technology node. With the introduction of immersion 193 nm lithography, double/triple patterning, and further techniques, the capabilities of 193 nm lithography have been greatly improved, so it is expected to be used successfully, depending on the business decisions of the end user, down to 10 nm logic. Subsequent technology nodes will require EUV or an alternative technology such as DSA. Manufacturing, and especially process development, for EUV technology requires a significant number of unique processes, in several cases performed on dedicated tools. Several of these tools, e.g., the EUV AIMS or an actinic reflectometer, are not yet available on site, so process development relies on external services and tools, with an impact on the single-unit process development timeline and uncertainty in the estimation of process performance. Compromises in process development, caused by assumptions in experiment planning about similarities between optical and EUV masks and by the omission of tests, are further challenges to unit process development. Increased defect risk and uncertainty in process qualification are just two examples that can impact mask quality and process development. The aim of this paper is to identify critical aspects of EUV mask manufacturing with respect to defects on the mask, with a focus on mask cleaning and defect repair, and to discuss the impact of EUV-specific requirements on the experiments needed.

  12. Ground data systems resource allocation process

    NASA Technical Reports Server (NTRS)

    Berner, Carol A.; Durham, Ralph; Reilly, Norman B.

    1989-01-01

    The Ground Data Systems Resource Allocation Process at the Jet Propulsion Laboratory provides medium- and long-range planning for the use of Deep Space Network and Mission Control and Computing Center resources in support of NASA's deep space missions and Earth-based science. Resources consist of radio antenna complexes and associated data processing and control computer networks. A semi-automated system was developed that allows operations personnel to interactively generate, edit, and revise allocation plans spanning periods of up to ten years (as opposed to only two or three weeks under the manual system) based on the relative merit of mission events. It also enhances scientific data return. A software system known as the Resource Allocation and Planning Helper (RALPH) merges the conventional methods of operations research, rule-based knowledge engineering, and advanced database structures. RALPH employs a generic, highly modular architecture capable of solving a wide variety of scheduling and resource sequencing problems. The rule-based RALPH system has saved significant labor in resource allocation. Its successful use affirms the importance of establishing and applying event priorities based on scientific merit, and the benefit of continuity in planning provided by knowledge-based engineering. The RALPH system exhibits strong potential for minimizing development cycles of resource and payload planning systems throughout NASA and the private sector.
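
    The merit-driven allocation idea can be illustrated with a deliberately simple greedy sketch: schedule events in descending priority until antenna hours run out. The event names, priorities, and greedy strategy here are all hypothetical; RALPH's actual planner combines operations research and rule-based methods and is far more sophisticated.

    ```python
    # Greedy, merit-ordered resource allocation sketch (hypothetical data;
    # RALPH's real rule-based planner is far more sophisticated).

    def allocate(events, capacity_hours):
        """Assign antenna hours to mission events in descending order of
        scientific-merit priority until capacity is exhausted."""
        plan = []
        remaining = capacity_hours
        for event in sorted(events, key=lambda e: e["priority"], reverse=True):
            if event["hours"] <= remaining:
                plan.append(event["name"])
                remaining -= event["hours"]
        return plan, remaining

    events = [
        {"name": "encounter_downlink", "priority": 10, "hours": 8},
        {"name": "cruise_telemetry",   "priority": 3,  "hours": 6},
        {"name": "radio_science",      "priority": 7,  "hours": 5},
    ]
    plan, left = allocate(events, capacity_hours=14)
    ```

    Even this toy version shows the property the abstract credits to RALPH: high-merit events survive contention for limited resources.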

  13. Color Image Processing and Object Tracking System

    NASA Technical Reports Server (NTRS)

    Klimek, Robert B.; Wright, Ted W.; Sielken, Robert S.

    1996-01-01

    This report describes a personal-computer-based system for automatic and semiautomatic tracking of objects on film or video tape, developed to meet the needs of the Microgravity Combustion and Fluids Science Research Programs at the NASA Lewis Research Center. The system consists of individual hardware components working under computer control to achieve a high degree of automation. The most important hardware components include 16-mm and 35-mm film transports, a high-resolution digital camera mounted on an x-y-z micro-positioning stage, an S-VHS tapedeck, a Hi8 tapedeck, a video laserdisk, and a framegrabber. All of the image input devices are remotely controlled by a computer. Software was developed to integrate the overall operation of the system, including device frame incrementation, grabbing of image frames, image processing of the object's neighborhood, locating the position of the object being tracked, and storing the coordinates in a file. This process is performed repeatedly until the last frame is reached. Several different tracking methods are supported. To illustrate the process, two representative applications of the system are described. These applications represent typical uses of the system and include tracking the propagation of a flame front and tracking the movement of a liquid-gas interface with extremely poor visibility.
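
    The repeated grab-process-locate-store loop described above can be sketched as a minimal intensity-centroid tracker over a synthetic frame sequence. The window search and all names below are illustrative assumptions, not the actual system's algorithms.

    ```python
    import numpy as np

    def track_centroid(frames, start, window=5):
        """Follow a bright object through frames by searching a small
        neighborhood around the last known position for the intensity peak,
        storing the coordinates found in each frame (as the report describes)."""
        y, x = start
        path = []
        for frame in frames:
            y0, y1 = max(y - window, 0), y + window + 1
            x0, x1 = max(x - window, 0), x + window + 1
            roi = frame[y0:y1, x0:x1]                      # object's neighborhood
            dy, dx = np.unravel_index(np.argmax(roi), roi.shape)
            y, x = int(y0 + dy), int(x0 + dx)              # updated position
            path.append((y, x))                            # store coordinates
        return path

    # Synthetic sequence: a single bright pixel drifting diagonally.
    frames = []
    for i in range(4):
        f = np.zeros((32, 32))
        f[10 + i, 12 + i] = 1.0
        frames.append(f)
    path = track_centroid(frames, start=(10, 12))
    ```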

  14. Robot development for nuclear material processing

    SciTech Connect

    Pedrotti, L.R.; Armantrout, G.A.; Allen, D.C.; Sievers, R.H. Sr.

    1991-07-01

    The Department of Energy is seeking to modernize its special nuclear material (SNM) production facilities and concurrently reduce radiation exposures and the generation of process and incidental radioactive waste. As part of this program, a Lawrence Livermore National Laboratory (LLNL)-led team is developing and adapting generic and specific applications of commercial robotic technologies for SNM pyrochemical processing and other operations. A working gantry robot within a sealed processing glove box and a telerobot control test bed are manifestations of this effort. This paper describes the development challenges and progress in adapting processing, robotic, and nuclear safety technologies to the application. 3 figs.

  15. Development of the selective coagulation process

    SciTech Connect

    Yoon, R.H.; Luttrell, G.H.

    1991-01-01

    The aim of this project is to develop an economical method for producing low-sulfur and low-ash coals using the selective hydrophobic coagulation (SHC) process. This work has been divided into three tasks: (1) project planning and sample acquisition; (2) studies of the fundamental mechanism(s) of the selective coagulation process and the parameters that affect the separation of coal from both the ash-forming minerals and pyritic sulfur; and (3) bench-scale process development test work to establish the best possible method(s) of separating the hydrophobic coal coagula from the dispersed mineral matter.

  16. Pan-STARRS Moving Object Processing System

    NASA Astrophysics Data System (ADS)

    Jedicke, R.; Denneau, L.; Grav, T.; Heasley, J.; Kubica, Jeremy; Pan-STARRS Team

    2005-12-01

    The Institute for Astronomy at the University of Hawaii is developing a large optical astronomical surveying system - the Panoramic Survey Telescope and Rapid Response System (Pan-STARRS). The Moving Object Processing System (MOPS), a client of the Pan-STARRS image processing pipeline, is software under development to automatically discover and identify >90% of near-Earth objects (NEOs) larger than 300 m in diameter and >80% of other classes of asteroids and comets. In developing its software, MOPS has created a synthetic solar system model (SSM) of over 10 million objects whose distributions of orbital characteristics match those expected for objects that Pan-STARRS will observe. MOPS verifies its correct operation by simulating the survey and subsequent discovery of synthetically generated objects. MOPS also employs novel techniques to handle the computationally difficult problem of linking large numbers of unknown asteroids in a field of detections. We will describe the creation and verification of the Pan-STARRS MOPS SSM, demonstrate synthetic detections and observations by the MOPS, describe the MOPS asteroid linking techniques, describe the accuracy and throughput of the entire MOPS system, and provide predictions regarding the numbers and kinds of objects, including as yet undiscovered "extreme objects", that the MOPS expects to find over its 10-year lifetime. Pan-STARRS is funded under a grant from the U.S. Air Force.
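
    The linking problem MOPS addresses can be illustrated with a deliberately naive sketch: pair detections from two exposures whose angular separation is within a plausible asteroid motion limit. The brute-force O(n·m) comparison and the coordinates below are hypothetical; the real MOPS uses far more efficient tree-based search over vastly larger detection sets.

    ```python
    import math

    def link_tracklets(epoch1, epoch2, max_motion):
        """Pair detections (x, y sky coordinates in degrees) from two exposures
        whose separation is below a plausible inter-exposure motion limit.
        Naive all-pairs sketch of the linking step, for illustration only."""
        tracklets = []
        for i, (x1, y1) in enumerate(epoch1):
            for j, (x2, y2) in enumerate(epoch2):
                if math.hypot(x2 - x1, y2 - y1) <= max_motion:
                    tracklets.append((i, j))   # candidate same-object pair
        return tracklets

    # Hypothetical detections: one slow mover near the origin, one unrelated source.
    e1 = [(0.0, 0.0), (5.0, 5.0)]
    e2 = [(0.2, 0.1), (9.0, 9.0)]
    pairs = link_tracklets(e1, e2, max_motion=0.5)
    ```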

  17. Development of a Comprehensive Weld Process Model

    SciTech Connect

    Radhakrishnan, B.; Zacharia, T.

    1997-05-01

    This cooperative research and development agreement (CRADA) between Concurrent Technologies Corporation (CTC) and Lockheed Martin Energy Systems (LMES) combines CTC's expertise in the welding area with that of LMES to develop computer models and simulation software for welding processes. This development is of significant impact to the industry, including materials producers and fabricators. The main thrust of the research effort was to develop a comprehensive welding simulation methodology. A substantial amount of work has been done by several researchers to numerically model welding processes. The primary drawback of most existing models is the lack of sound linkages between the mechanistic aspects (e.g., heat transfer, fluid flow, and residual stress) and the metallurgical aspects (e.g., microstructure development and control). A comprehensive numerical model that can be used to elucidate the effect of welding parameters/conditions on the temperature distribution, weld pool shape and size, solidification behavior, and microstructure development, as well as stresses and distortion, does not exist. It was therefore imperative to develop a comprehensive model that would predict all of the above phenomena during welding. The CRADA built upon an already existing three-dimensional (3-D) welding simulation model developed by LMES, which is capable of predicting weld pool shape and the temperature history in 3-D single-pass welds. However, that model does not account for multipass welds, microstructural evolution, distortion, or residual stresses. Additionally, it requires large amounts of computing time, which limits its use for practical applications. To overcome this, CTC and LMES have developed through this CRADA the comprehensive welding simulation model described above. The following technical tasks have been accomplished as part of the CRADA: (1) the LMES welding code has been ported to the Intel Paragon parallel computer at ORNL

  18. A versatile scalable PET processing system

    SciTech Connect

    Dong, H.; Weisenberger, A.; McKisson, J.; Xi, Wenze; Cuevas, C.; Wilson, J.; Zukerman, L.

    2011-06-01

    Positron Emission Tomography (PET) has historically had major clinical and preclinical applications in oncology, neurology, and cardiovascular disease. Recently, in a new direction, an application-specific PET system is being developed at Thomas Jefferson National Accelerator Facility (Jefferson Lab), in collaboration with Duke University, the University of Maryland at Baltimore (UMAB), and West Virginia University (WVU), targeted at plant eco-physiology research. The new plant-imaging PET system is versatile and scalable such that it can adapt to several plant imaging needs - imaging many important plant organs including leaves, roots, and stems. The mechanical arrangement of the detectors is designed to accommodate the unpredictable and random distribution in space of the plant organs without requiring that the plant be disturbed. Prototyping such a system requires a new data acquisition system (DAQ) and data processing system which are adaptable to the requirements of these unique and versatile detectors.

  19. Aviation System Analysis Capability Executive Assistant Development

    NASA Technical Reports Server (NTRS)

    Roberts, Eileen; Villani, James A.; Anderson, Kevin; Book, Paul

    1999-01-01

    In this technical document, we describe the development of the Aviation System Analysis Capability (ASAC) Executive Assistant (EA) Proof of Concept (POC) and Beta version. We describe the genesis and role of the ASAC system, discuss the objectives of the ASAC system and provide an overview of components and models in the ASAC system, and describe the design process and the results of the ASAC EA POC and Beta system development. We also describe the evaluation process and results for applicable COTS software. The document has seven chapters, a bibliography, and two appendices.

  20. A Central Processing Facility within a Distributed Data Processing System

    NASA Astrophysics Data System (ADS)

    de Witte, S.; Rispens, S. M.; van Hees, R. M.

    2009-04-01

    In a complex scientific data processing project, where raw satellite data (Level 1) is processed to end products (Level 2), specific expertise may be needed from various groups in different locations. Collaboration between these groups can lead to better results and gives the opportunity to try several different scientific approaches and choose, objectively, the best result. Furthermore, such a distributed data processing system (DDPS) can be used for independent validation before the end products are released. All participating groups need common and specific data products for their processing, which involves many interfaces that consume and produce different data products. Without a central storage location, every group involved has to implement its own checking routines and transformations in order to use the data products. A central processing facility, acting as a single point of interface between the DDPS and the main data provider as well as for all groups within the DDPS, can facilitate collecting all scientific data necessary for high-level processing, transforming the Level 1 input data to a DDPS internally agreed format, checking all data products for integrity, format, and validity, distributing these data products within the DDPS, monitoring the whole data distribution chain, and distributing all end products to the main data provider. A DDPS has been implemented for ESA's gravity mission, GOCE (Gravity field and steady-state Ocean Circulation Explorer). GOCE's DDPS is called the High-level Processing Facility (HPF) and is part of the GOCE Ground Segment, developed under ESA contract by the European GOCE Gravity consortium (EGG-c). The HPF is set up as a distributed facility consisting of several sub-processing centers for scientific pre-processing, orbit determination, gravity field analysis, and validation. The sub-processing facilities are connected through a central node, the Central Processing Facility (CPF).
The CPF has been thoroughly tested and is
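
    The CPF's role of checking products for integrity and format before distribution can be sketched as below. The checksum scheme, header magic, and field choices are assumptions for illustration, not the actual GOCE HPF interfaces.

    ```python
    import hashlib

    def check_product(payload: bytes, expected_sha256: str, header_magic: bytes) -> bool:
        """Verify a data product's integrity (checksum match) and format
        (expected header bytes) before distributing it to sub-processing
        centers. All field choices here are illustrative assumptions."""
        ok_integrity = hashlib.sha256(payload).hexdigest() == expected_sha256
        ok_format = payload.startswith(header_magic)
        return ok_integrity and ok_format

    # Hypothetical product: a 4-byte magic header followed by payload bytes.
    product = b"EGG1" + b"\x00" * 16
    digest = hashlib.sha256(product).hexdigest()
    ok = check_product(product, digest, b"EGG1")    # passes both checks
    bad = check_product(product, digest, b"XXXX")   # wrong format, rejected
    ```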

  1. Process and control systems for composites manufacturing

    NASA Technical Reports Server (NTRS)

    Tsiang, T. H.; Wanamaker, John L.

    1992-01-01

    Precise control of composite material processing would not only improve part quality, but would also directly reduce the overall manufacturing cost. The development and incorporation of sensors will help to generate real-time information on material processing relationships and equipment characteristics. In the present work, thermocouple, pressure transducer, and dielectrometer technologies were investigated. The monitoring sensors were integrated with the computerized control system in three non-autoclave fabrication techniques: hot-press, self-contained tool (self-heating and pressurizing), and pressure vessel. The sensors were installed in the parts and the tools.

  2. SCHOOL CONSTRUCTION SYSTEMS DEVELOPMENT PROJECT.

    ERIC Educational Resources Information Center

    BOICE, JOHN,; AND OTHERS

    ONE-HUNDRED MANUFACTURERS EXPRESSED INTEREST IN BIDDING FOR A SYSTEM ON SCHOOL CONSTRUCTION CALLED SCSD OR SCHOOL CONSTRUCTION SYSTEMS DEVELOPMENT TO THE FIRST CALIFORNIA COMMISSION ON SCHOOL CONSTRUCTION SYSTEMS. TWENTY-TWO BUILDINGS COMPRISED THE PROJECT. THE OBJECTIVE WAS TO DEVELOP AN INTEGRATED SYSTEM OF STANDARD SCHOOL BUILDING COMPONENTS…

  3. SOURCE ASSESSMENT SAMPLING SYSTEM: DESIGN AND DEVELOPMENT

    EPA Science Inventory

    The report chronologically describes the design and development of the Source Assessment Sampling System (SASS). The SASS train is the principal sampling element for ducted sources when performing EPA's Level 1 environmental assessment studies. As such, it samples process streams...

  4. Scaleup of IGT MILDGAS Process to a process development unit

    SciTech Connect

    Campbell, J.A.L.; Longanbach, J.; Johnson, R.; Underwood, K.; Mead, J.; Carty, R.H.

    1992-12-31

    The MILDGAS process is capable of processing both eastern caking and western non-caking coals to yield a slate of liquid and solid products. The liquids can be processed to produce feedstocks for chemicals, pitch for use as a binder for electrodes in the aluminum industry, and fuels. Depending on the feed coal characteristics and the operating conditions, the char can be used as an improved fuel for power generation or to make form coke for steel-making blast furnaces or foundry cupola operations. The specific objectives of the program are to design, construct, and operate a 24-ton/day adiabatic process development unit (PDU) to obtain process performance data suitable for design scaleup; obtain large batches of coal-derived co-products for industrial evaluation; prepare a detailed design of a demonstration unit; and develop technical and economic plans for commercialization of the MILDGAS process. In this paper, the authors present the process design of the PDU facility, a description of the expected product distribution, and the project test plan to be implemented in the program.

  5. EUV extendibility via dry development rinse process

    NASA Astrophysics Data System (ADS)

    Sayan, Safak; Zheng, Tao; De Simone, Danilo; Vandenberghe, Geert

    2016-03-01

    Conventional photoresist processing involves resist coating, exposure, post-exposure bake, development, rinse, and spin drying of the wafer. The main mechanism of pattern collapse is the capillary force governed by the surface tension of rinse water and its asymmetrical recession from the two sides of resist lines during the drying step of the develop process. The dry development rinse process (DDRP) mitigates pattern collapse by applying a special polymer material (DDRM) that replaces the exposed/developed part of the photoresist material before the wafer is spin dried. DDRP essentially eliminates these failure mechanisms by replacing the remaining rinse water with DDRM, providing a structural framework that supports resist lines from both sides during the spin-dry process. Because these collapse mechanisms are mitigated without the need for changes in the photoresist itself, the achievable resolution of state-of-the-art EUV photoresists can be further improved.

  6. OSTA/ADS standards development process

    NASA Technical Reports Server (NTRS)

    Walton, B.

    1981-01-01

    A phased approach to data systems standards was developed. The standards survey, user requirements, methodology survey, and evaluation criteria were completed. Remaining to be done are data system planning interim standards, a concept for implementation of a core applications data service, a data systems policy definition, and full capability data services definition.

  7. Career Development: A Systems Approach.

    ERIC Educational Resources Information Center

    Slavenski, Lynn

    1987-01-01

    The author describes a comprehensive career development system implemented by Coca-Cola USA. The system's objectives are (1) to promote from within, (2) to develop talent for the future, (3) to make managers responsible for development efforts, and (4) to make individuals ultimately responsible for their development. (CH)

  8. Precision grinding process development for brittle materials

    SciTech Connect

    Blaedel, K L; Davis, P J; Piscotty, M A

    1999-04-01

    High-performance brittle materials are the materials of choice for many of today's engineering applications. This paper describes three separate precision grinding processes developed at Lawrence Livermore National Laboratory to machine precision ceramic components. Included in the discussion of the precision processes is a variety of grinding wheel dressing, truing, and profiling techniques.

  9. Process for Selecting System Level Assessments for Human System Technologies

    NASA Technical Reports Server (NTRS)

    Watts, James; Park, John

    2006-01-01

    The integration of the many life support systems necessary to construct a stable habitat is difficult. Correct identification of the appropriate technologies and corresponding interfaces is an exhaustive process. Once technologies are selected, secondary issues such as mechanical and electrical interfaces must be addressed. The required analytical and testing work must be approached in a piecewise fashion to achieve timely results. A repeatable process has been developed to identify and prioritize system-level assessments and testing needs. This Assessment Selection Process has been defined to assess cross-cutting integration issues at the system or component level. Assessments are used to identify risks, encourage future actions to mitigate risks, or spur further studies.

  10. Development of a Versatile Laser-Ultrasonic System and Application to the Online Measurement for Process Control of Wall Thickness and Eccentricity of Seamless Tubes

    SciTech Connect

    Robert V. Kolarik II

    2002-10-23

    A system for the online, non-contact measurement of wall thickness in steel seamless mechanical tubing has been developed and demonstrated on a tubing production line at the Timken Company in Canton, Ohio. The system utilizes laser generation of ultrasound and laser detection of time of flight with interferometry, laser-Doppler velocimetry, and pyrometry, all with fiber coupling. Accuracy (<1% error) and precision (1.5%) are at targeted levels. Cost and energy savings have exceeded estimates. The system has shown good reliability, measuring over 200,000 tubes in its first six months of deployment.
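
    The underlying measurement reduces to a pulse-echo relation: the ultrasonic pulse crosses the wall twice, so thickness = v·t/2 for sound speed v and round-trip time of flight t. A minimal sketch, with illustrative values (not Timken production data):

    ```python
    def wall_thickness(time_of_flight_s, sound_speed_m_s):
        """Pulse-echo wall thickness: the pulse traverses the wall twice,
        so thickness = v * t / 2. Inputs are illustrative assumptions."""
        return sound_speed_m_s * time_of_flight_s / 2.0

    # ~5900 m/s longitudinal sound speed in steel, 3.39 us round trip
    # corresponds to roughly a 10 mm wall.
    t_mm = wall_thickness(3.39e-6, 5900.0) * 1000.0
    ```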

  11. Advanced alarm systems: Display and processing issues

    SciTech Connect

    O'Hara, J.M.; Wachtel, J.; Perensky, J.

    1995-05-01

    This paper describes a research program sponsored by the US Nuclear Regulatory Commission to address the human factors engineering (HFE) deficiencies associated with nuclear power plant alarm systems. The overall objective of the study is to develop HFE review guidance for alarm systems. In support of this objective, human performance issues needing additional research were identified. Among the important issues were alarm processing strategies and alarm display techniques. This paper will discuss these issues and briefly describe our current research plan to address them.

  12. Curriculum Development System for Navy Technical Training.

    ERIC Educational Resources Information Center

    Butler, Lucius

    Documentation for the U.S. Navy's curriculum development system is brought together in this paper, beginning with a description of the Naval Technical Training System. This description includes the Navy Training Plan (NTP) process, which is the current mechanism for introducing new courses; the organization and administration of the system; the…

  13. Process-Based Quality (PBQ) Tools Development

    SciTech Connect

    Cummins, J.L.

    2001-12-03

    The objective of this effort is to benchmark the development of process-based quality tools for application in CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced and dependent on model quality and modeling process integrity. The goal is to illustrate and expedite the modeling and downstream model-based technologies for available or conceptual methods and tools to achieve maximum economic advantage and advance process-based quality concepts.

  14. Intelligent systems for KSC ground processing

    NASA Technical Reports Server (NTRS)

    Heard, Astrid E.

    1992-01-01

    The ground processing and launch of Shuttle vehicles and their payloads is the primary task of Kennedy Space Center. It is a process which is largely manual and contains little inherent automation. Business is conducted today much as it was during previous NASA programs such as Apollo. In light of new programs and decreasing budgets, NASA must find more cost-effective ways to do business while retaining the quality and safety of its activities. Advanced technologies, including artificial intelligence, could cut manpower and processing time. This paper is an overview of the research and development in AI technology at KSC, with descriptions of the systems which have been implemented, as well as a few under development which are promising additions to ground processing software. Projects discussed cover many facets of ground processing activities, including computer sustaining engineering, subsystem monitor and diagnosis tools, and launch team assistants. The deployed AI applications have proven an effectiveness which has helped to demonstrate the benefits of utilizing intelligent software in the ground processing task.

  15. System Design as a Three-Phase Dual-Loop (TPDL) Process: Types of Knowledge-Applied Sources of Feedback, and Student Development as Independent Learners

    ERIC Educational Resources Information Center

    Barak, Moshe

    2010-01-01

    This study aimed at exploring how high school students deal with designing an information system, for example, for a small business or a medical clinic, the extent to which students develop as independent learners while working on their projects, and the factors that help or hinder fostering students' design skills. The three-phase dual-loop…

  16. ASRM process development in aqueous cleaning

    NASA Technical Reports Server (NTRS)

    Swisher, Bill

    1992-01-01

    Viewgraphs are included on process development in aqueous cleaning which is taking place at the Aerojet Advanced Solid Rocket Motor (ASRM) Division under a NASA Marshall Space Flight Center contract for design, development, test, and evaluation of the ASRM, including new production facilities. The ASRM will utilize aqueous cleaning in several manufacturing process steps to clean case segments, nozzle metal components, and igniter closures. ASRM manufacturing process development is underway, including agent selection, agent characterization, subscale process optimization, bonding verification, and scale-up validation. Process parameters are currently being tested for optimization utilizing a Taguchi matrix, including agent concentration, cleaning solution temperature, agitation and immersion time, rinse water amount and temperature, and use/non-use of drying air. Based on results of process development testing to date, several observations are offered: aqueous cleaning appears effective for steels and SermeTel-coated metals in ASRM processing; aqueous cleaning agents may stain and/or attack bare aluminum metals to various extents; aqueous cleaning appears unsuitable for thermal-sprayed aluminum-coated steel; aqueous cleaning appears to adequately remove a wide range of contaminants from flat metal surfaces, but supplementary assistance may be needed to remove clumps of tenacious contaminants embedded in holes, etc.; and hot rinse water appears beneficial in drying bare steel and retarding its oxidation rate.

  17. ASRM process development in aqueous cleaning

    NASA Astrophysics Data System (ADS)

    Swisher, Bill

    1992-12-01

    Viewgraphs are included on process development in aqueous cleaning which is taking place at the Aerojet Advanced Solid Rocket Motor (ASRM) Division under a NASA Marshall Space Flight Center contract for design, development, test, and evaluation of the ASRM, including new production facilities. The ASRM will utilize aqueous cleaning in several manufacturing process steps to clean case segments, nozzle metal components, and igniter closures. ASRM manufacturing process development is underway, including agent selection, agent characterization, subscale process optimization, bonding verification, and scale-up validation. Process parameters are currently being tested for optimization utilizing a Taguchi matrix, including agent concentration, cleaning solution temperature, agitation and immersion time, rinse water amount and temperature, and use/non-use of drying air. Based on results of process development testing to date, several observations are offered: aqueous cleaning appears effective for steels and SermeTel-coated metals in ASRM processing; aqueous cleaning agents may stain and/or attack bare aluminum metals to various extents; aqueous cleaning appears unsuitable for thermal-sprayed aluminum-coated steel; aqueous cleaning appears to adequately remove a wide range of contaminants from flat metal surfaces, but supplementary assistance may be needed to remove clumps of tenacious contaminants embedded in holes, etc.; and hot rinse water appears beneficial in drying bare steel and retarding its oxidation rate.

  18. Using Reflection to Develop Higher Order Processes

    ERIC Educational Resources Information Center

    Lerch, Carol; Bilics, Andrea; Colley, Binta

    2006-01-01

    The main purpose of this study was to look at how we used specific writing assignments in our courses to encourage metacognitive reflection in order to increase the learning that takes place. The study also aimed to aid in the development of higher order processing skills through the development of student reflection. The students involved in the…

  19. Information Processing Theory and Conceptual Development.

    ERIC Educational Resources Information Center

    Schroder, H. M.

    An educational program based upon information processing theory has been developed at Southern Illinois University. The integrating theme was the development of conceptual ability for coping with social and personal problems. It utilized student information search and concept formation as foundations for discussion and judgment and was organized…

  20. The Cassini-Huygens Sequence Development Process

    NASA Technical Reports Server (NTRS)

    Long, Jennifer H.; Heventhal, William M., III; Javidnia, Shahram

    2006-01-01

    Each phase of the sequence development process had to overcome many operational challenges due to the immense complexity of the spacecraft, tour design, pointing capabilities, flight rules and software development. This paper will address the specific challenges related to each of those complexities and the methods used to overcome them during operation.

  1. Cognitive Process of Development in Children

    ERIC Educational Resources Information Center

    Boddington, Eulalee N.

    2009-01-01

    In this article we explored the theories of Arnold Gesell, Erik Erikson, and Jean Piaget about how human beings develop. In this component we analyze the cognitive processes of how children perceive and develop, in particular children from cross-cultural backgrounds. How learning takes place, and how the influences of culture, and…

  2. Preform Characterization in VARTM Process Model Development

    NASA Technical Reports Server (NTRS)

    Grimsley, Brian W.; Cano, Roberto J.; Hubert, Pascal; Loos, Alfred C.; Kellen, Charles B.; Jensen, Brian J.

    2004-01-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) is a Liquid Composite Molding (LCM) process in which both resin injection and fiber compaction are achieved at pressures of 101.3 kPa or less. Originally developed over a decade ago for marine composite fabrication, VARTM is now considered a viable process for the fabrication of aerospace composites (1,2). In order to optimize and further improve the process, a finite element analysis (FEA) process model is being developed to include the coupled phenomena of resin flow, preform compaction, and resin cure. The model input parameters are obtained from resin and fiber-preform characterization tests. In this study, the compaction behavior and the Darcy permeability of a commercially available carbon fabric are characterized. The resulting empirical model equations are input to the 3-Dimensional Infiltration, version 5 (3DINFILv.5) process model to simulate infiltration of a composite panel.
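
    The resin-flow side of such a model rests on Darcy's law, v = -(K/μ)·dP/dx, relating superficial resin velocity to preform permeability K, resin viscosity μ, and the pressure gradient. A minimal sketch with illustrative values (the actual characterized fabric data are not reproduced here):

    ```python
    def darcy_velocity(permeability_m2, viscosity_pa_s, pressure_gradient_pa_m):
        """Superficial resin velocity from Darcy's law: v = -(K / mu) * dP/dx.
        Parameter values below are illustrative, not the characterized fabric."""
        return -(permeability_m2 / viscosity_pa_s) * pressure_gradient_pa_m

    # K = 1e-10 m^2, mu = 0.1 Pa.s, and one atmosphere of vacuum drive
    # dropped over 1 m of flow length (dP/dx = -101325 Pa/m).
    v = darcy_velocity(1e-10, 0.1, -101325.0)
    ```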

  3. Compact Microscope Imaging System Developed

    NASA Technical Reports Server (NTRS)

    McDowell, Mark

    2001-01-01

    The Compact Microscope Imaging System (CMIS) is a diagnostic tool with intelligent controls for use in space, industrial, medical, and security applications. The CMIS can be used in situ with a minimum amount of user intervention. This system, which was developed at the NASA Glenn Research Center, can scan, find areas of interest, focus, and acquire images automatically. Large numbers of multiple cell experiments require microscopy for in situ observations; this is only feasible with compact microscope systems. CMIS is a miniature machine vision system that combines intelligent image processing with remote control capabilities. The software also has a user-friendly interface that can be used independently of the hardware for post-experiment analysis. CMIS has potential commercial uses in the automated online inspection of precision parts, medical imaging, security industry (examination of currency in automated teller machines and fingerprint identification in secure entry locks), environmental industry (automated examination of soil/water samples), biomedical field (automated blood/cell analysis), and microscopy community. CMIS will improve research in several ways: It will expand the capabilities of MSD experiments utilizing microscope technology. It may be used in lunar and Martian experiments (Rover Robot). Because of its reduced size, it will enable experiments that were not feasible previously. It may be incorporated into existing shuttle orbiter and space station experiments, including glove-box-sized experiments as well as ground-based experiments.

  4. Adult Personality Development: Dynamics and Processes

    PubMed Central

    Diehl, Manfred; Hooker, Karen

    2013-01-01

    The focus of this special issue of Research in Human Development is on adult personality and how personality may contribute to and be involved in adult development. Specifically, the contributions in this issue focus on the links between personality structures (e.g., traits) and personality processes (e.g., goal pursuit, self-regulation) and emphasize the contributions that intensive repeated measurement approaches can make to the understanding of personality and development across the adult life span. PMID:24068889

  5. Multi-processing system study

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The results of a multiprocessor systems design review are summarized. The systems were reviewed for use in a space station environment. The proposed designs were evaluated from a systems viewpoint in general and from a systems software viewpoint in particular. The recommendations resulting from this evaluation are expected to inform the design of a multiprocessing system built around a SUMC. Two multiprocessing system designs were reviewed. The designs reviewed were at a high functional level, and many questions could not be answered. However, several major issues were uncovered which could be evaluated in some detail and which could greatly impact the SUMC-MP design. The major issues relevant to a multiprocessing system design revolve around the following functions: (1) Storage Management, (2) Processor Management, (3) Intermodule Communication, (4) Memory Access Interference, (5) System Efficiency, and (6) System Recovery/Reliability.

  6. Framework for control system development

    SciTech Connect

    Cork, C.; Nishimura, Hiroshi.

    1991-11-01

    Control systems being developed for the present generation of accelerators will need to adapt to changing machine and operating state conditions. Such systems must also be capable of evolving over the life of the accelerator operation. In this paper we present a framework for the development of adaptive control systems.

  7. Arcjet system integration development

    NASA Technical Reports Server (NTRS)

    Zafran, Sidney

    1994-01-01

    Compatibility between an arcjet propulsion system and a communications satellite was verified by testing a Government-furnished, 1.4 kW hydrazine arcjet system with the FLTSATCOM qualification model satellite in a 9.1-meter (30-foot) diameter thermal-vacuum test chamber. Background pressure was maintained at 10(exp -5) torr during arcjet operation by cryopumping the thruster exhaust with an array of 5 K liquid helium cooled panels. Power for the arcjet system was obtained from the FLTSATCOM battery simulator. Spacecraft telemetry was monitored during each thruster firing period. No changes in telemetry data attributable to arcjet operation were detected in any of the tests. Electromagnetic compatibility data obtained included radiated emission measurements, conducted emission measurements, and cable coupling measurements. Significant noise was observed at lower frequencies. Above 500 MHz, radiated emissions were generally within limits, indicating that communication links at S-band and higher frequencies will not be affected. Other test data taken with a diagnostic array of calorimeters, radiometers, witness plates, and a residual gas analyzer evidenced compatible operation, and added to the database for arcjet system integration. Two test series were conducted. The first series only included the arcjet and diagnostic array operating at approximately 0.1 torr background pressure. The second series added the qualification model spacecraft, a solar panel, and the helium cryopanels. Tests were conducted at 0.1 torr and 10(exp -5) torr. The arcjet thruster was canted 20 degrees relative to the solar panel axis, typical of the configuration used for stationkeeping thrusters on geosynchronous communications satellites.

  8. Arcjet system integration development

    NASA Astrophysics Data System (ADS)

    Zafran, Sidney

    1994-03-01

    Compatibility between an arcjet propulsion system and a communications satellite was verified by testing a Government-furnished, 1.4 kW hydrazine arcjet system with the FLTSATCOM qualification model satellite in a 9.1-meter (30-foot) diameter thermal-vacuum test chamber. Background pressure was maintained at 10(exp -5) torr during arcjet operation by cryopumping the thruster exhaust with an array of 5 K liquid helium cooled panels. Power for the arcjet system was obtained from the FLTSATCOM battery simulator. Spacecraft telemetry was monitored during each thruster firing period. No changes in telemetry data attributable to arcjet operation were detected in any of the tests. Electromagnetic compatibility data obtained included radiated emission measurements, conducted emission measurements, and cable coupling measurements. Significant noise was observed at lower frequencies. Above 500 MHz, radiated emissions were generally within limits, indicating that communication links at S-band and higher frequencies will not be affected. Other test data taken with a diagnostic array of calorimeters, radiometers, witness plates, and a residual gas analyzer evidenced compatible operation, and added to the database for arcjet system integration. Two test series were conducted. The first series only included the arcjet and diagnostic array operating at approximately 0.1 torr background pressure. The second series added the qualification model spacecraft, a solar panel, and the helium cryopanels. Tests were conducted at 0.1 torr and 10(exp -5) torr. The arcjet thruster was canted 20 degrees relative to the solar panel axis, typical of the configuration used for stationkeeping thrusters on geosynchronous communications satellites.

  9. AOIPS - An interactive image processing system. [Atmospheric and Oceanic Information Processing System

    NASA Technical Reports Server (NTRS)

    Bracken, P. A.; Dalton, J. T.; Quann, J. J.; Billingsley, J. B.

    1978-01-01

    The Atmospheric and Oceanographic Information Processing System (AOIPS) was developed to help applications investigators perform required interactive image data analysis rapidly and to eliminate the inefficiencies and problems associated with batch operation. This paper describes the configuration and processing capabilities of AOIPS and presents unique subsystems for displaying, analyzing, storing, and manipulating digital image data. Applications of AOIPS to research investigations in meteorology and earth resources are featured.

  10. DEMONSTRATION BULLETIN: ZENOGEM™ WASTEWATER TREATMENT PROCESS - ZENON ENVIRONMENTAL SYSTEMS

    EPA Science Inventory

    Zenon Environmental Systems (Zenon) has developed the ZenoGem™ process to remove organic compounds from wastewater by integrating biological treatment and membrane-based ultrafiltration. This innovative system combines biological treatment to remove biodegradable organic compou...

  11. The application of intelligent process control to space based systems

    NASA Technical Reports Server (NTRS)

    Wakefield, G. Steve

    1990-01-01

    The application of Artificial Intelligence to electronic and process control can help attain the autonomy and safety requirements of manned space systems. An overview of documented applications within various industries is presented. The development process is discussed along with associated issues for implementing an intelligent process control system.

  12. NASA System Engineering Design Process

    NASA Technical Reports Server (NTRS)

    Roman, Jose

    2011-01-01

    This slide presentation reviews NASA's use of systems engineering for the complete life cycle of a project. Systems engineering is a methodical, disciplined approach for the design, realization, technical management, operations, and retirement of a system. Each phase of a NASA project is terminated with a Key Decision Point (KDP), which is supported by major reviews.

  13. Geodyn systems development

    NASA Technical Reports Server (NTRS)

    Putney, B. H.

    1984-01-01

    The purpose of the GEODYN Orbit Determination and Parameter Estimation program and the SOLVE and ERODYN programs is to recover geodetic and geophysical parameters from satellite and other data in a state-of-the-art manner. Continued solutions for the gravity field, pole positions, Earth rotation, GM, and baselines were made as part of the Crustal Dynamics Project. Some tidal parameters were recovered as well. The eight-digit station identification number was incorporated in the software, and new techniques for constraining monthly station parameters to each other are being developed. This gives analysts even more flexibility in shaping solutions from monthly sets of normal equations and right-hand sides.

  14. Development of flow systems by direct-milling on poly(methyl methacrylate) substrates using UV-photopolymerization as sealing process.

    PubMed

    Rodrigues, Eunice R G O; Lapa, Rui A S

    2009-03-01

    An alternative process for the design and construction of fluidic devices is presented. Several sealing processes were studied, as well as the hydrodynamic characteristics of the proposed fluidic devices. Manifolds were imprinted on polymeric substrates by direct-write milling, according to Computer Assisted Design (CAD) data. Poly(methyl methacrylate) (PMMA) was used as substrate due to its physical and chemical properties. Different bonding approaches for the imprinted channels were evaluated and UV-photopolymerization of acrylic acid (AA) was selected. The hydrodynamic characteristics of the proposed flow devices were assessed and compared to those obtained in similar flow systems using PTFE reactors and micro-pumps as propulsion units (multi-pumping approach). The applicability of the imprinted reactors was evaluated in the sequential determination of calcium and magnesium in water samples. Results obtained were in good agreement with those obtained by the reference procedure. PMID:19276605

  15. Reliable software and communication 2: Controlling the software development process

    NASA Astrophysics Data System (ADS)

    Dalal, Siddhartha R.; Horgan, Joseph R.; Kettenring, Jon R.

    1994-01-01

    The software created by industrial, educational, and research organizations is increasingly large and complex. It also occupies a central role in the reliability and safety of many essential services. We examine the software development process and suggest opportunities for improving the process by using a combination of statistical and other process control techniques. Data, analysis of data, and tools for collecting data are crucial to our approach. Although our views are based upon experiences with large telecommunications systems, they are likely to be useful to many other developers of large software systems.

  16. Development of an automated platform for the verification, testing, processing and benchmarking of Evaluated Nuclear Data at the NEA Data Bank. Status of the NDEC system

    NASA Astrophysics Data System (ADS)

    Michel-Sendis, F.; Díez, C. J.; Cabellos, O.

    2016-03-01

    Modern nuclear data Quality Assurance (QA) is, in practice, a multistage process that aims at establishing a thorough assessment of the validity of the physical information contained in an evaluated nuclear data file as compared to our best knowledge of available experimental data and theoretical models. It should systematically address the performance of the evaluated file against available pertinent integral experiments, with proper and prior verification that the information encoded in the evaluation is accurately processed and reconstructed for the application conditions. The aim of the NDEC (Nuclear Data Evaluation Cycle) platform currently being developed by the Data Bank is to provide a correct and automated handling of these diverse QA steps in order to facilitate the expert human assessment of evaluated nuclear data files, both by the evaluators and by the end users of nuclear data.

  17. Development of modified FT (MFT) process

    SciTech Connect

    Jinglai Zhou; Zhixin Zhang; Wenjie Shen

    1995-12-31

    A two-stage Modified FT (MFT) process has been developed for producing high-octane gasoline from coal-based syngas. The main R&D effort is focused on the development of catalysts and process technologies. Duration tests were completed in a single-tube reactor, a pilot plant (100 T/y), and an industrial demonstration plant (2000 T/y). A series of satisfactory results has been obtained in terms of operating reliability of equipment, performance of catalysts, purification of coal-based syngas, optimum operating conditions, properties of the gasoline, and economics. Further scale-up to a commercial plant is being considered.

  18. Apoptotic processes during mammalian preimplantation development.

    PubMed

    Fabian, Dusan; Koppel, Juraj; Maddox-Hyttel, Poul

    2005-07-15

    The paper provides a review of the current state of knowledge on apoptosis during normal preimplantation development based on the literature and on the authors' own findings. Information is focused on the occurrence and the characteristics of spontaneous apoptotic processes. Reports concerning the chronology and the incidence of programmed cell death in mouse, cow, pig and human embryos in early preimplantation stages up to the blastocyst stage are summarized. In addition, specific attributes of the apoptotic process in mammalian preimplantation development are provided, including the description of both morphological and biochemical features of cell death. PMID:15955348

  19. Development of superplastic steel processing. Final report

    SciTech Connect

    Goldberg, A.

    1995-04-01

    The objective was to provide a basis for producing, processing, and forming UHCS (ultrahigh carbon steel) on a commercial scale. Business plans were developed for potential commercialization. Effort was directed at improving the combination of flow stress and forming rates in UHCS alloys in order to make near-net-shape superplastic forming competitive; the result was the development of a series of UHCS alloys and processing routes, the selection of which depends on the specific requirements of the commercial application. Useful ancillary properties of these materials include improved mechanical properties, wear resistance, and oxidation resistance at elevated temperatures.

  20. Engineering monitoring expert system's developer

    NASA Technical Reports Server (NTRS)

    Lo, Ching F.

    1991-01-01

    This research project is designed to apply artificial intelligence technology, including expert systems, dynamic interface of neural networks, and hypertext, to construct an expert system developer. The developer environment is specifically suited to building expert systems which monitor the performance of ground support equipment for propulsion systems and testing facilities. The expert system developer, through the use of a graphics interface and a rule network, will be transparent to the user during rule constructing and data scanning of the knowledge base. The project will result in a software system that allows its user to build specific monitoring-type expert systems which monitor various equipment used for propulsion systems or ground testing facilities and accrue system performance information in a dynamic knowledge base.

  1. Development of emission factors for polycarbonate processing.

    PubMed

    Rhodes, Verne L; Kriek, George; Lazear, Nelson; Kasakevich, Jean; Martinko, Marie; Heggs, R P; Holdren, M W; Wisbith, A S; Keigley, G W; Williams, J D; Chuang, J C; Satola, J R

    2002-07-01

    Emission factors for selected volatile organic compounds (VOCs) and particulate emissions were developed while processing eight commercial grades of polycarbonate (PC) and one grade of a PC/acrylonitrile-butadiene-styrene (ABS) blend. A small commercial-type extruder was used, and the extrusion temperature was held constant at 304 degrees C. An emission factor was calculated for each substance measured and is reported as pounds released to the atmosphere per million pounds of polymer resin processed [ppm (wt/wt)]. Scaled to production volumes, these emission factors can be used by processors to estimate emission quantities from similar PC processing operations. PMID:12139342
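
    The bookkeeping behind such an emission factor is simple enough to sketch: pounds of a substance released per million pounds of resin processed, i.e. ppm (wt/wt), which can then be scaled to a production volume. The masses below are illustrative assumptions, not measured values from the study.

```python
# Hypothetical emission-factor calculation: lb of substance released per
# million lb of polymer resin processed, reported as ppm (wt/wt).

def emission_factor_ppm(mass_emitted_lb, mass_processed_lb):
    """Emission factor in lb released per million lb processed (ppm wt/wt)."""
    return mass_emitted_lb / mass_processed_lb * 1e6

# e.g. 0.012 lb of a VOC collected while extruding 500 lb of resin (assumed)
ef = emission_factor_ppm(0.012, 500.0)
print(f"emission factor = {ef:.1f} ppm (wt/wt)")

# scaling to a hypothetical annual production volume of 2 million lb
annual_release_lb = ef * 2_000_000 / 1e6
print(f"estimated annual release = {annual_release_lb:.1f} lb")
```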

  2. Volcanic processes in the Solar System

    USGS Publications Warehouse

    Carr, M.H.

    1987-01-01

    This article stresses that terrestrial volcanism represents only part of the range of volcanism in the solar system. Earth processes of volcanicity are dominated by plate tectonics, which does not seem to operate on other planets, except possibly on Venus. Lunar volcanicity is dominated by lava effusion at enormous rates. Mars is similar, with the addition of huge shield volcanoes developed over fixed hotspots. Io, the moon closest to Jupiter, is the most active body in the Solar System and, for example, much sulphur and silicates are emitted. The eruptions of Io are generated by heating caused by tides induced by Jupiter. Europa nearby seems to emit water from fractures, and Ganymede is similar. The satellites of Saturn and Uranus are also marked by volcanic craters, but they are of very low temperature melts, possibly of ammonia and water. The volcanism of the solar system is generally more exotic, the greater the distance from Earth. -A.Scarth

  3. The Khoros software development environment for image and signal processing.

    PubMed

    Konstantinides, K; Rasure, J R

    1994-01-01

    Data flow visual language systems allow users to graphically create a block diagram of their applications and interactively control input, output, and system variables. Khoros is an integrated software development environment for information processing and visualization. It is particularly attractive for image processing because of its rich collection of tools for image and digital signal processing. This paper presents a general overview of Khoros with emphasis on its image processing and DSP tools. Various examples are presented and the future direction of Khoros is discussed. PMID:18291923
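
    The data-flow idea that Khoros embodies can be illustrated with a minimal sketch. This is not the Khoros API; the operators and the pipeline helper below are hypothetical, showing only the underlying model: each processing operator is a node, and a pipeline is a chain of nodes applied in order to the data flowing through.

```python
# Minimal sketch of the data-flow model behind systems like Khoros (not the
# Khoros API): operators are nodes, and a pipeline chains them together.

def threshold(img, t):
    """Binarize a 2-D image: 1 where a pixel exceeds t, else 0."""
    return [[1 if p > t else 0 for p in row] for row in img]

def invert(img):
    """Invert a binary image."""
    return [[1 - p for p in row] for row in img]

def pipeline(*stages):
    """Compose processing stages into a single callable data-flow graph."""
    def run(data):
        for stage in stages:
            data = stage(data)
        return data
    return run

image = [[10, 200], [150, 30]]
flow = pipeline(lambda im: threshold(im, 100), invert)
print(flow(image))  # -> [[1, 0], [0, 1]]
```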

  4. Common Rail Injection System Development

    SciTech Connect

    Electro-Motive,

    2005-12-30

    The collaborative research program between the Department of Energy and Electro-Motive Diesels, Inc. on the development of a common rail fuel injection system for locomotive diesel engines that can meet US EPA Tier 2 exhaust emissions has been completed. This final report summarizes the objectives of the program, the work scope, key accomplishments, and research findings. The major objectives of this project encompassed identification of appropriate injection strategies by using advanced analytical tools, development of required prototype hardware/controls, investigation of fuel spray characteristics including cavitation phenomena, and validation of hardware using a single-cylinder research locomotive diesel engine. Major milestones included: (1) a detailed modeling study using advanced mathematical models - several injection profiles that show simultaneous reduction of NOx and particulates on a four-stroke-cycle locomotive diesel engine were identified; (2) development of new common rail fuel injection hardware capable of providing these injection profiles while meeting EMD engine and injection performance specifications. This hardware was developed together with EMD's current fuel injection component supplier. (3) Analysis of fuel spray characteristics. Fuel spray numerical studies and high-speed photographic imaging analyses were performed. (4) Validation of new hardware and fuel injection profiles. EMD's single-cylinder research diesel engine located at Argonne National Laboratory was used to confirm emissions and performance predictions. These analytical and experimental investigations resulted in optimized fuel injection profiles and engine operating conditions that yield reductions in NOx emissions from 7.8 g/bhp-hr to 5.0 g/bhp-hr at full (rated) load. Additionally, hydrocarbon and particulate emissions were reduced considerably when compared to baseline Tier I levels. The most significant finding from the injection optimization process was a 2% to 3

  5. Development of the onboard digital processing system for the soft x-ray spectrometer of ASTRO-H: performance in the engineering model tests

    NASA Astrophysics Data System (ADS)

    Seta, H.; Tashiro, M. S.; Ishisaki, Y.; Tsujimoto, M.; Shimoda, Y.; Takeda, S.; Yamaguchi, S.; Mitsuda, K.; Fujimoto, R.; Takei, Y.; Kelley, R. L.; Boyce, K. R.; Kilbourne, C. A.; Porter, F. S.; Miko, J. J.; Masukawa, K.; Matsuda, K.

    2012-09-01

    We present the development status of the Pulse Shape Processor (PSP), which is the on-board digital electronics responsible for the signal processing of the X-ray microcalorimeter spectrometer instrument (the Soft X-ray Spectrometer; SXS) for the ASTRO-H satellite planned to be launched in 2014. We finished the design and fabrication for the engineering model, and are currently undertaking a series of performance verification and environmental tests. In this report, we summarize the results obtained in a part of the tests completed in the first half of this year.

  6. Library Information-Processing System

    NASA Technical Reports Server (NTRS)

    1985-01-01

    System works with Library of Congress MARC II format. System composed of subsystems that provide wide range of library information-processing capabilities. Format is American National Standards Institute (ANSI) format for machine-readable bibliographic data. Adaptable to any medium-to-large library.

  7. The Development of Face Processing in Autism

    ERIC Educational Resources Information Center

    Sasson, Noah J.

    2006-01-01

    Both behavioral and neuroimaging evidence indicate that individuals with autism demonstrate marked abnormalities in the processing of faces. These abnormalities are often explained as either the result of an innate impairment to specialized neural systems or as a secondary consequence of reduced levels of social interest. A review of the…

  8. Development of a low altitude airborne remote sensing system for supporting the processing of satellite remotely sensed data intended for archaeological investigations

    NASA Astrophysics Data System (ADS)

    Agapiou, Athos; Hadjimitsis, Diofantos G.; Georgopoulos, Andreas; Themistocleous, Kyriacos; Alexakis, Dimitris D.; Papadavid, George

    2012-10-01

    Earth observation techniques intended for archaeological research, such as satellite images and ground geophysical surveys, are well established in the literature. In contrast, low altitude airborne systems for supporting archaeological research are still very limited. The "ICAROS" project, funded by the Cyprus Research Promotion Foundation, aims to develop an airborne system for archaeological investigations. The system will incorporate both a GER 1500 field spectroradiometer and a NIR camera in a balloon system operated from the ground. The GER 1500 field spectroradiometer has the capability to record reflectance values from 400 nm up to 1050 nm (blue/green/red and NIR bands). The Field of View (FOV) of the instrument is 4°, and a calibrated Spectralon panel will be used in order to minimize illumination errors during data collection. Existing atmospheric conditions will be monitored using a sun photometer and a meteorological station. The overall methodology of the project and the preliminary results from different case studies in Cyprus are presented and discussed in this paper. Some practical problems are also discussed, and the overall results are compared with satellite and ground measurements. Spectroradiometric measurements and NIR images will be taken from different heights from the balloon system. The results will be compared with different satellite images.

  9. PMIS Project. Planning & Management Information System. A Project To Develop a Data Processing System for Support of the Planning and Management Needs of Local School Districts. Final Report, Year 2.

    ERIC Educational Resources Information Center

    Council of the Great City Schools, Washington, DC.

    This document examines the design and structure of PMIS (Planning and Management Information System), an information system that supports the decisionmaking process of executive management in local school districts. The system is designed around a comprehensive, longitudinal, and interrelated data base. It utilizes a powerful real-time,…

  10. Development of the Low-Pressure Hydride/Dehydride Process

    SciTech Connect

    Rueben L. Gutierrez

    2001-04-01

    The low-pressure hydride/dehydride process was developed from the need to recover thin-film coatings of plutonium metal from the inner walls of an isotope separation chamber located at Los Alamos and to improve the safety of a hydride recovery process operating with hydrogen at a pressure of 0.7 atm at Rocky Flats. This process is now the heart of the Advanced Recovery and Integrated Extraction System (ARIES) project.

  11. ESS Cryogenic System Process Design

    NASA Astrophysics Data System (ADS)

    Arnold, P.; Hees, W.; Jurns, J.; Su, X. T.; Wang, X. L.; Weisend, J. G., II

    2015-12-01

    The European Spallation Source (ESS) is a neutron-scattering facility funded and supported in collaboration with 17 European countries in Lund, Sweden. Cryogenic cooling at ESS is vital particularly for the linear accelerator, the hydrogen target moderators, a test stand for cryomodules, the neutron instruments and their sample environments. The paper will focus on specific process design criteria, design decisions and their motivations for the helium cryoplants and auxiliary equipment. Key issues for all plants and their process concepts are energy efficiency, reliability, smooth turn-down behaviour and flexibility. The accelerator cryoplant (ACCP) and the target moderator cryoplant (TMCP) in particular need to be prepared for a range of refrigeration capacities due to the intrinsic uncertainties regarding heat load definitions. Furthermore the paper addresses questions regarding process arrangement, 2 K cooling methodology, LN2 precooling, helium storage, helium purification and heat recovery.

  12. Development of a short-term irradiance prediction system using post-processing tools on WRF-ARW meteorological forecasts in Spain

    NASA Astrophysics Data System (ADS)

    Rincón, A.; Jorba, O.; Baldasano, J. M.

    2010-09-01

    The increased contribution of solar energy in power generation sources requires an accurate estimation of surface solar irradiance conditioned by geographical, temporal and meteorological conditions. The knowledge of the variability of these factors is essential to estimate the expected energy production and therefore helps to stabilize the electricity grid and increase the reliability of available solar energy. The use of numerical meteorological models in combination with statistical post-processing tools may have the potential to satisfy the requirements for short-term forecasting of solar irradiance for up to several days ahead and its application in solar devices. In this contribution, we present an assessment of a short-term irradiance prediction system based on the WRF-ARW mesoscale meteorological model (Skamarock et al., 2005) and several post-processing tools in order to improve the overall skill of the system in an annual simulation of the year 2004 in Spain. The WRF-ARW model is applied with 4 km x 4 km horizontal resolution and 38 vertical layers over the Iberian Peninsula. The hourly model irradiance is evaluated against more than 90 surface stations. The stations are used to assess the temporal and spatial fluctuations and trends of the system, evaluating three different post-processing techniques: the Model Output Statistics technique (MOS; Glahn and Lowry, 1972), a recursive statistical method (REC; Boi, 2004) and the Kalman Filter Predictor (KFP; Bozic, 1994; Roeger et al., 2003). A first evaluation of the system without post-processing tools shows an overestimation of the surface irradiance, due to the omission from the meteorological model of attenuation by atmospheric absorbers other than clouds. This produces an annual BIAS of 16 W m-2 h-1, annual RMSE of 106 W m-2 h-1 and annual NMAE of 42%. The largest errors are observed in spring and summer, reaching RMSE of 350 W m-2 h-1. Results using Kalman Filter Predictor show a reduction of 8% of RMSE, 83% of BIAS
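
    The Kalman-filter style of bias correction mentioned above can be sketched in scalar form: treat the slowly varying forecast bias as a random-walk state, update it from each day's observed forecast error, and subtract the estimate from the raw forecast. The noise settings and bias values below are illustrative assumptions, not those of the study.

```python
# Illustrative scalar Kalman filter for forecast-bias correction, in the
# spirit of Kalman Filter Predictor post-processing (values are assumed).

def kalman_bias(biases, q=1.0, r=4.0):
    """Track a slowly varying forecast bias x from noisy daily bias
    observations; q is process noise, r is observation noise."""
    x, p = 0.0, 10.0          # initial state estimate and variance
    for z in biases:
        p = p + q             # predict: bias modeled as a random walk
        k = p / (p + r)       # Kalman gain
        x = x + k * (z - x)   # update with the latest observed bias
        p = (1.0 - k) * p
    return x

observed_bias = [18.0, 15.0, 17.0, 16.0, 14.0]  # W m-2, hypothetical
bias_hat = kalman_bias(observed_bias)
corrected = 120.0 - bias_hat  # subtract estimated bias from a raw forecast
print(f"estimated bias = {bias_hat:.1f} W m-2")
```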

  13. Lexical Morphology: Structure, Process, and Development

    ERIC Educational Resources Information Center

    Jarmulowicz, Linda; Taran, Valentina L.

    2013-01-01

    Recent work has demonstrated the importance of derivational morphology to later language development and has led to a consensus that derivation is a lexical process. In this review, derivational morphology is discussed in terms of lexical representation models from both linguistic and psycholinguistic perspectives. Input characteristics, including…

  14. Development of a New Simultaneous Processing Scale.

    ERIC Educational Resources Information Center

    Keefe, James W.; Languis, Marlin L.

    The Learning Style Profile (LSP), developed in four phases from the fall of 1983 to early 1986, identifies perceptual responses, cognitive skills, study preferences, and instructional preferences. This study assesses the LSP Simultaneous Processing Skill subscale, which is modeled after the work of the Russian neuropsychologist, A. R. Luria.…

  15. L2 Chinese: Grammatical Development and Processing

    ERIC Educational Resources Information Center

    Mai, Ziyin

    2016-01-01

    Two recent books (Jiang, 2014, "Advances in Chinese as a second language"; Wang, 2013, "Grammatical development of Chinese among non-native speakers") provide new resources for exploring the role of processing in acquiring Chinese as a second language (L2). This review article summarizes, assesses and compares some of the…

  16. Developing Qualitative Research Questions: A Reflective Process

    ERIC Educational Resources Information Center

    Agee, Jane

    2009-01-01

    The reflective and interrogative processes required for developing effective qualitative research questions can give shape and direction to a study in ways that are often underestimated. Good research questions do not necessarily produce good research, but poorly conceived or constructed questions will likely create problems that affect all…

  17. Oil shale fines process developments in Brazil

    SciTech Connect

    Lisboa, A.C.; Nowicki, R.E. ); Piper, E.M. )

    1989-01-01

    The Petrobras oil shale retorting process utilizes the particle range of +1/4 inch to 3 1/2 inches. The UPI plant in Sao Mateus do Sul has over 106,000 hours of operation, has processed over 6,200,000 metric tons of shale, and has produced almost 3,000,000 barrels of shale oil. However, the nature of the raw oil shale is such that the amount of shale less than 1/4 inch that is mined, crushed, and returned to the mine site is about 20 percent, thereby increasing the cost of oil produced by a substantial amount. Petrobras has investigated several systems to process the fines that are not handled by the 65 MTPH UPI plant and the 260 MTPH commercial plant. This paper provides an updated status of each of these processes in regard to the tests performed, potential contributions to an integrated use of the oil shale mine, and future considerations.

  18. Integrating system safety into the basic systems engineering process

    NASA Technical Reports Server (NTRS)

    Griswold, J. W.

    1971-01-01

    The basic elements of a systems engineering process are given, along with a detailed description of what system safety requires from the systems engineering process. Also discussed is the support that system safety provides to other subfunctions of systems engineering.

  19. MVC: A user-based on-line optimal control system for gas processing and treating plants. Development and results for claus sulfur recovery and sweetening modules. Topical report, June 1992-September 1993

    SciTech Connect

    Berkowitz, P.N.; Papadopoulos, M.N.; Colwell, L.W.; Poe, W.; Yiu, Y.

    1993-09-01

    The objective of this project was to develop and field validate modular, on-line, advanced control systems to optimize the operation of Claus sulfur recovery and sweetening in gas processing plants with emphasis on small and mid-sized facilities.

  20. Real-Time Sensor Validation System Developed

    NASA Technical Reports Server (NTRS)

    Zakrajsek, June F.

    1998-01-01

    Real-time sensor validation improves process monitoring and control system dependability by ensuring data integrity through automated detection of sensor data failures. The NASA Lewis Research Center, Expert Microsystems, and Intelligent Software Associates have developed an innovative sensor validation system that can automatically detect sensor data failures in real time for all types of mission-critical systems. This system consists of a sensor validation network development system and a real-time kernel. The network development system provides tools that enable systems engineers to automatically generate software that can be embedded within an application. The sensor validation methodology captured by these tools can be scaled to validate any number of sensors, and permits users to specify system sensitivity. The resulting software reliably detects all types of sensor data failures.
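    The failure-detection idea described in this record can be sketched in a few lines. The following is a minimal illustration using range and rate-of-change checks; the class name and thresholds are invented for the example and do not represent the actual NASA/Expert Microsystems implementation:

    ```python
    # Illustrative sketch only: a toy sensor-data validator combining a
    # plausible-range check with a rate-of-change (spike) check.
    class SensorValidator:
        def __init__(self, lo, hi, max_step):
            self.lo, self.hi = lo, hi      # plausible physical range
            self.max_step = max_step       # largest credible sample-to-sample jump
            self.last = None               # last accepted reading

        def validate(self, value):
            """Return True if the reading passes both checks."""
            if not (self.lo <= value <= self.hi):
                return False               # out-of-range failure
            if self.last is not None and abs(value - self.last) > self.max_step:
                return False               # spike / rate-of-change failure
            self.last = value
            return True

    v = SensorValidator(lo=0.0, hi=100.0, max_step=5.0)
    flags = [v.validate(x) for x in [20.0, 21.5, 90.0, 22.0]]
    # → [True, True, False, True]  (the jump to 90.0 fails the rate check)
    ```

    A production validator of the kind described would add model-based redundancy and configurable sensitivity per sensor; this sketch shows only the simplest building blocks.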

  1. System and method for deriving a process-based specification

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael Gerard (Inventor); Rash, James Larry (Inventor); Rouff, Christopher A. (Inventor)

    2009-01-01

    A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.

  2. Developing the E-Scape Software System

    ERIC Educational Resources Information Center

    Derrick, Karim

    2012-01-01

    Most innovations have contextual precursors that prompt new ways of thinking and, in their turn, help to give form to the new reality. This was the case with the e-scape software development process. The origins of the system existed in software components and ideas that we had developed through previous projects, but the ultimate direction we took…

  3. Orbiter Thermal Protection System Development

    NASA Technical Reports Server (NTRS)

    Greenshields, D. H.

    1977-01-01

    The development of the Space Shuttle Orbiter Thermal Protection System (TPS) is traced from concept definition, through technical development, to final design and qualification for manned flight. A sufficiently detailed description of the TPS design is presented to support an in-depth discussion of the key issues encountered in conceptual design, materials development, and structural integration. Emphasis is placed on the unique combination of requirements which resulted in the use not only of revolutionary design concepts and materials, but also of unique design criteria, newly developed analysis, testing and manufacturing methods, and finally of an unconventional approach to system certification for operational flight. The conclusion is drawn that a significant advance in all areas of thermal protection system development has been achieved which results in a highly efficient, flexible, and cost-effective thermal protection system for the Orbiter of the Space Shuttle System.

  4. Process development for scum to biodiesel conversion.

    PubMed

    Bi, Chong-hao; Min, Min; Nie, Yong; Xie, Qing-long; Lu, Qian; Deng, Xiang-yuan; Anderson, Erik; Li, Dong; Chen, Paul; Ruan, Roger

    2015-06-01

    A novel process was developed for converting scum, a waste material from wastewater treatment facilities, to biodiesel. Scum is an oily waste that is skimmed from the surface of primary and secondary settling tanks in wastewater treatment plants. Currently scum is treated either by anaerobic digestion or landfilling, which raises several environmental issues. The newly developed process used a six-step method to convert scum to biodiesel, a higher value product. A combination of acid washing and acid catalyzed esterification was developed to remove soap and impurities while converting free fatty acids to methyl esters. A glycerol washing was used to facilitate the separation of biodiesel and glycerin after base catalyzed transesterification. As a result, 70% of dried and filtered scum was converted to biodiesel, which is equivalent to about 134,000 gallons of biodiesel per year for the Saint Paul wastewater treatment plant in Minnesota. PMID:25770465

  5. Development of enhanced sulfur rejection processes

    SciTech Connect

    Yoon, R.H.; Luttrell, G.H.; Adel, G.T.; Richardson, P.E.

    1996-03-01

    Research at Virginia Tech led to the development of two complementary concepts for improving the removal of inorganic sulfur from many eastern U.S. coals. These concepts are referred to as Electrochemically Enhanced Sulfur Rejection (EESR) and Polymer Enhanced Sulfur Rejection (PESR) processes. The EESR process uses electrochemical techniques to suppress the formation of hydrophobic oxidation products believed to be responsible for the floatability of coal pyrite. The PESR process uses polymeric reagents that react with pyrite and convert floatable middlings, i.e., composite particles composed of pyrite with coal inclusions, into hydrophilic particles. These new pyritic-sulfur rejection processes do not require significant modifications to existing coal preparation facilities, thereby enhancing their adoptability by the coal industry. It is believed that these processes can be used simultaneously to maximize the rejection of both well-liberated pyrite and composite coal-pyrite particles. The project was initiated on October 1, 1992 and all technical work has been completed. This report is based on the research carried out under Tasks 2-7 described in the project proposal. These tasks include Characterization, Electrochemical Studies, In Situ Monitoring of Reagent Adsorption on Pyrite, Bench Scale Testing of the EESR Process, Bench Scale Testing of the PESR Process, and Modeling and Simulation.

  6. Information Processing in Cognition Process and New Artificial Intelligent Systems

    NASA Astrophysics Data System (ADS)

    Zheng, Nanning; Xue, Jianru

    In this chapter, we discuss, in depth, visual information processing and a new artificial intelligent (AI) system that is based upon cognitive mechanisms. The relationship between a general model of intelligent systems and cognitive mechanisms is described, and in particular we explore visual information processing with selective attention. We also discuss a methodology for studying the new AI system and propose some important basic research issues that have emerged in the intersecting fields of cognitive science and information science. To this end, a new scheme for associative memory and a new architecture for an AI system with attractors of chaos are addressed.

  7. ISE System Development Methodology Manual

    SciTech Connect

    Hayhoe, G.F.

    1992-02-17

    The Information Systems Engineering (ISE) System Development Methodology Manual (SDM) is a framework of life cycle management guidelines that provide ISE personnel with direction, organization, consistency, and improved communication when developing and maintaining systems. These guidelines were designed to allow ISE to build and deliver Total Quality products, and to meet the goals and requirements of the US Department of Energy (DOE), Westinghouse Savannah River Company, and Westinghouse Electric Corporation.

  8. Emergent Systems Energy Laws for Predicting Myosin Ensemble Processivity

    PubMed Central

    Egan, Paul; Moore, Jeffrey; Schunn, Christian; Cagan, Jonathan; LeDuc, Philip

    2015-01-01

    In complex systems with stochastic components, systems laws often emerge that describe higher level behavior regardless of lower level component configurations. In this paper, emergent laws for describing mechanochemical systems are investigated for processive myosin-actin motility systems. On the basis of prior experimental evidence that longer processive lifetimes are enabled by larger myosin ensembles, it is hypothesized that emergent scaling laws could coincide with myosin-actin contact probability or system energy consumption. Because processivity is difficult to predict analytically and measure experimentally, agent-based computational techniques are developed to simulate processive myosin ensembles and produce novel processive lifetime measurements. It is demonstrated that only systems energy relationships hold regardless of isoform configurations or ensemble size, and a unified expression for predicting processive lifetime is revealed. The finding of such laws provides insight for how patterns emerge in stochastic mechanochemical systems, while also informing understanding and engineering of complex biological systems. PMID:25885169

  9. Reasoning with case histories of process knowledge for efficient process development

    NASA Technical Reports Server (NTRS)

    Bharwani, Seraj S.; Walls, Joe T.; Jackson, Michael E.

    1988-01-01

    The significance of compiling case histories of empirical process knowledge and the role of such histories in improving the efficiency of manufacturing process development is discussed in this paper. Methods of representing important investigations as cases and using the information from such cases to eliminate redundancy of empirical investigations in analogous process development situations are also discussed. A system is proposed that uses such methods to capture the problem-solving framework of the application domain. A conceptual design of the system is presented and discussed.

  10. The application of expert systems to process control

    NASA Astrophysics Data System (ADS)

    Wu, Bevan P. F.

    1991-01-01

    An expert system is a computer software technology developed from artificial intelligence research. It may be used for intelligent manufacturing process control and, when properly designed, has the capability to imitate human behavior. An expert system's value lies in assisting a human in executing real-time process control decisions in a complex system. This article provides an introduction to the concepts of intelligent process control and includes a process control scenario applying an expert system as well as statistical and optimization technologies. A simple guide to getting started and a description of expert system tools are also presented.

  11. MVC: A user-based on-line optimal control system for small gas processing and treating plants. Development and results for lean oil absorption/desorption modules. Topical report, January-November 1991

    SciTech Connect

    Berkowitz, P.N.; Papadopoulos, M.N.; Klein, R.A.

    1992-01-01

    The two-phase project involved the development and field validation of an optimal process control system for lean oil absorption/desorption gas processing plants. Phase 1 consisted of a field survey and software module development activity for the control modules. Phase 2 consisted of the field validation of the total package. The software package (called MVC) is a modular, on-line, advanced control system designed for gas processing and treating facilities. MVC is 386/486 PC-based and relatively inexpensive. Software modules (standardized self-contained process software packages for specific process units) were developed for Refrigerated Lean Oil Absorption and Lean Oil Recovery (Desorption) and were installed for field performance validation at ARCO's Denver City, Texas gas processing plant. Each module consists of process simulations, feedforward control equations, feedback trim equations and adaptive control features including re-linearization (for simulation equation coefficients) and drift factor equations (to correct for the normal drifting of on-line analyzers). The process optimization uses an economic module containing prices and operating costs which forms the basis of the profit maximization. The project duration was 10 1/2 months and was completed in December 1991. Current process control optimization is expensive and generally infeasible for small and mid-sized gas plants. MVC is designed to gain more than 90% of the benefits of optimized advanced control while running for over 90% of the available service time. MVC utilizes standardized software algorithms with coefficients tailored to specific process units. MVC will give producers the economic incentive to process more gas and thus increase the availability to consumers of both natural gas and natural gas liquids at lower prices.
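    The control structure this abstract names (feedforward equations, feedback trim, and a drift-factor correction for analyzer drift) can be illustrated with a toy scalar loop. The gains, process model, and function names below are assumptions chosen for clarity, not the MVC package itself:

    ```python
    # Hedged sketch: feedforward control with feedback trim and an adaptive
    # analyzer drift estimate, as a single scalar control step.
    def feedforward(setpoint, disturbance, k_ff=0.8):
        # Model-based move: compensate the measured disturbance before it acts.
        return setpoint - k_ff * disturbance

    def control_step(setpoint, disturbance, measured, drift,
                     k_fb=0.3, k_drift=0.05):
        corrected = measured - drift              # remove estimated analyzer drift
        u = feedforward(setpoint, disturbance)    # feedforward term
        u += k_fb * (setpoint - corrected)        # feedback trim on residual error
        drift += k_drift * (measured - setpoint)  # slowly adapt the drift estimate
        return u, drift

    u, drift = control_step(setpoint=10.0, disturbance=2.0,
                            measured=10.5, drift=0.0)
    ```

    In a real gas-plant module the feedforward term would come from a process simulation rather than a single linear equation, but the division of labor (model-based move, trim on the error, slow drift adaptation) is the same.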

  12. Onboard Image Processing System for Hyperspectral Sensor.

    PubMed

    Hihara, Hiroki; Moritani, Kotaro; Inoue, Masao; Hoshi, Yoshihiro; Iwasaki, Akira; Takada, Jun; Inada, Hitomi; Suzuki, Makoto; Seki, Taeko; Ichikawa, Satoshi; Tanii, Jun

    2015-01-01

    Onboard image processing systems for a hyperspectral sensor have been developed in order to maximize image data transmission efficiency for large volume and high speed data downlink capacity. Since more than 100 channels are required for hyperspectral sensors on Earth observation satellites, fast and small-footprint lossless image compression capability is essential for reducing the size and weight of a sensor system. A fast lossless image compression algorithm has been developed, and is implemented in the onboard correction circuitry of sensitivity and linearity of Complementary Metal Oxide Semiconductor (CMOS) sensors in order to maximize the compression ratio. The employed image compression method is based on the Fast, Efficient, Lossless Image compression System (FELICS), which is a hierarchical predictive coding method with resolution scaling. To improve FELICS's performance of image decorrelation and entropy coding, we apply a two-dimensional interpolation prediction and adaptive Golomb-Rice coding. It supports progressive decompression using resolution scaling while still maintaining superior performance measured as speed and complexity. Coding efficiency and compression speed enlarge the effective capacity of signal transmission channels, which leads to reduced onboard hardware by multiplexing sensor signals into a smaller number of compression circuits. The circuitry is embedded into the data formatter of the sensor system without adding size, weight, power consumption, or fabrication cost. PMID:26404281
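    As background to the entropy-coding stage mentioned above, Golomb-Rice coding of prediction residuals can be sketched briefly. This is a generic textbook illustration, with an assumed zigzag mapping for signed residuals; it is not the onboard circuitry described in the abstract:

    ```python
    # Illustrative Golomb-Rice coding of the kind FELICS-style coders use
    # for prediction residuals. The parameter k tunes the code to the data:
    # small residuals get short codes when k matches their statistics.
    def zigzag(n):
        # Map signed residuals to non-negative integers: 0,-1,1,-2,2 -> 0,1,2,3,4
        return n * 2 if n >= 0 else -n * 2 - 1

    def golomb_rice_encode(value, k):
        # Unary-coded quotient (terminated by a 0), then k binary remainder bits.
        q = value >> k
        code = "1" * q + "0"
        if k:
            code += format(value & ((1 << k) - 1), f"0{k}b")
        return code

    bits = golomb_rice_encode(zigzag(-3), k=2)   # zigzag(-3) = 5 → "1001"
    ```

    A hardware implementation would also adapt k per context and pack the bits into words, but the core code construction is this simple, which is what makes it attractive for small-footprint onboard circuits.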

  13. Onboard Image Processing System for Hyperspectral Sensor

    PubMed Central

    Hihara, Hiroki; Moritani, Kotaro; Inoue, Masao; Hoshi, Yoshihiro; Iwasaki, Akira; Takada, Jun; Inada, Hitomi; Suzuki, Makoto; Seki, Taeko; Ichikawa, Satoshi; Tanii, Jun

    2015-01-01

    Onboard image processing systems for a hyperspectral sensor have been developed in order to maximize image data transmission efficiency for large volume and high speed data downlink capacity. Since more than 100 channels are required for hyperspectral sensors on Earth observation satellites, fast and small-footprint lossless image compression capability is essential for reducing the size and weight of a sensor system. A fast lossless image compression algorithm has been developed, and is implemented in the onboard correction circuitry of sensitivity and linearity of Complementary Metal Oxide Semiconductor (CMOS) sensors in order to maximize the compression ratio. The employed image compression method is based on the Fast, Efficient, Lossless Image compression System (FELICS), which is a hierarchical predictive coding method with resolution scaling. To improve FELICS’s performance of image decorrelation and entropy coding, we apply a two-dimensional interpolation prediction and adaptive Golomb-Rice coding. It supports progressive decompression using resolution scaling while still maintaining superior performance measured as speed and complexity. Coding efficiency and compression speed enlarge the effective capacity of signal transmission channels, which leads to reduced onboard hardware by multiplexing sensor signals into a smaller number of compression circuits. The circuitry is embedded into the data formatter of the sensor system without adding size, weight, power consumption, or fabrication cost. PMID:26404281

  14. New developments of process technologies for microfabrication

    NASA Astrophysics Data System (ADS)

    Piotter, Volker; Hanemann, Thomas; Ruprecht, Robert; Thies, Andreas; Hausselt, Juergen H.

    1997-09-01

    Economic success of microsystems technology requires cost-effective fabrication in large series as well as a great diversity of materials processing technologies. The different techniques of micro molding meet all these requirements. An important economic factor is the reduction of cycle time by process and tool optimization with simulation techniques. Currently, minimum cycle times are about two minutes in certain cases. Evolution of thermoplastics processing technologies is demonstrated by the application of technical or even high-performance polymers like PEEK, PMMA or PSU. For the manufacture of metal microstructures, we are developing three approaches: microstructures like stepped LIGA gear wheels are obtained by galvanization on lost molds, which have been injection molded using conductively filled polymers. Additionally, electroless plating is used to replicate nonconducting plastic microstructures, and the metal injection molding (MIM) process is under development. A quite different approach uses polymer precursors containing monomer/polymer mixtures in reaction injection molding. We chose photoinduced polymerization without any preheating step, using photopolymerizable resins. By avoiding the time-consuming thermal cycle, molding takes place at ambient temperature. Due to the low viscosity, the microcavities should be filled completely. The process is characterized by the integration of a powerful UV source and a molding tool made partially of glass.

  15. Process development of thin strip steel casting

    SciTech Connect

    Sussman, R.C.; Williams, R.S.

    1990-12-01

    An important new frontier is being opened in steel processing with the emergence of thin strip casting. Casting steel directly to thin strip has enormous benefits in energy savings by potentially eliminating the need for hot reduction in a hot strip mill. This has been the driving force for numerous current research efforts into the direct strip casting of steel. The US Department of Energy initiated a program to evaluate the development of thin strip casting in the steel industry. In earlier phases of this program, planar flow casting on an experimental caster was studied by a team of engineers from Westinghouse Electric Corporation and Armco Inc. A subsequent research program was designed as a fundamental and developmental study of both planar flow and melt overflow casting processes. This study was arranged as several separate and distinct tasks which were often completed by different teams of researchers. An early task was to design and build a water model to study fluid flow through different designs of planar flow casting nozzles. Another important task was mathematical modeling of the melt overflow casting process; a solidification model for the formation of the strip in the melt overflow process was written. A study of the material and conditioning of casting substrates was made on the small wheel caster using the melt overflow casting process. This report discusses work on the development of thin strip steel casting.

  16. TECHNOLOGY DEVELOPMENT ON THE DUPIC SAFEGUARDS SYSTEM

    SciTech Connect

    Kim, H.; Cha, H.; et al.

    2001-02-01

    A safeguards system has been under development since 1993 in the course of supporting a fuel cycle process to fabricate CANDU fuel with spent PWR fuel (known as Direct Use of PWR spent fuel In CANDU, DUPIC). The major safeguards technology involved was the design and fabrication of a neutron coincidence counting system for process accountability, together with an unattended continuous monitoring system in association with independent verification by the IAEA. This combined technology produces information on nuclear material content and maintains knowledge of the continuity of nuclear material flow. In addition to hardware development, diagnosis software is being developed to assist data acquisition, data review, and data evaluation based on a neural network system on the IAEA C/S system.

  17. Pilot testing and development of a full-scale Carrousel{reg_sign} activated sludge system for treating potato processing wastewaters

    SciTech Connect

    Menon, R.; Grames, L.M.

    1996-11-01

    Pilot Carrousel testing was conducted for about three months on wastewaters generated at a major potato processing facility in 1993. The testing focused on the removal of BOD, NH{sub 3} and NO{sub 3}, and Total-P. After the five to six weeks required for the system to reach steady-state operation, the pilot plant was able to treat the wastewaters quite well. Effluent BOD{sub 5} and TKN values were less than 8 and 4 mg/L, respectively, during the second half of testing. Total-P in the effluent was less than 10 mg/L, although this step was not optimized. Based on the pilot testing, a full-scale Carrousel activated sludge plant was designed and commissioned in 1994. This plant is currently treating all the wastewaters from the facility and achieving very high levels of contaminant removal.

  18. Development of high-speed reactive processing system for carbon fiber-reinforced polyamide-6 composite: In-situ anionic ring-opening polymerization

    NASA Astrophysics Data System (ADS)

    Kim, Sang-Woo; Seong, Dong Gi; Yi, Jin-Woo; Um, Moon-Kwang

    2016-05-01

    In order to manufacture carbon fiber-reinforced polyamide-6 (PA-6) composite, we optimized the reactive processing system. The in-situ anionic ring-opening polymerization of ɛ-caprolactam was utilized with a proper catalyst and initiator for the PA-6 matrix. The mechanical properties, such as tensile strength, inter-laminar shear strength and compressive strength, of the produced carbon fiber-reinforced PA-6 composite were measured and compared with the corresponding scanning electron microscope (SEM) images to investigate the polymer properties as well as the interfacial interaction between fiber and polymer matrix. Furthermore, the kinetics of the in-situ anionic ring-opening polymerization of ɛ-caprolactam is discussed from the viewpoint of increasing manufacturing speed and interfacial bonding between the PA-6 matrix and carbon fiber during polymerization.

  19. Thermal EOR process research and development

    SciTech Connect

    Engi, D.; Aeschliman, D.P.; Moreno, J.B.

    1985-01-01

    This paper describes recent results of research and development activities associated with thermal EOR processes in the areas of process mapping, downhole steam generator (DSG) materials performance, and insulated tubulars. Field measurements which can be used to map the overall geometry and local stability of the displacement fronts would be particularly useful in the context of process control. Electromagnetic techniques which are being developed to make these measurements are currently limited to qualitative interpretations. This paper first describes lab-scale physical simulations being conducted to develop a fundamental understanding of the variations in electrical properties associated with the constitutive zones of a recovery process, to improve our ability to interpret field data. Field tests in 1981 to 1982 identified DSG combustor material failure as the primary technical concern in DSG development, with the presumption that the failure mode was due to mixed hot gas corrosion. Experimental results are summarized here for a variety of candidate combustor materials, chosen largely for corrosion resistance. No important differences in performance were found. More recently, analytical studies supported by a single experiment on a thin-walled combustor liner have suggested that thermally-induced stress in the thick-walled liners used is the probable cause of failure. Insulated steam injection tubulars have been observed to be less effective in reducing wellbore heat loss if operated in a wet wellbore - a typical situation. Wellbore refluxing, a process analogous to the action of a heat pipe, was proposed in 1983 to be the source of the reduced efficiency. The results of recently completed tests on insulated tubing in wet and dry wellbores are reported. 23 references, 16 figures, 3 tables.

  20. Exothermic furnace module development. [space processing

    NASA Technical Reports Server (NTRS)

    Darnell, R. R.; Poorman, R. M.

    1982-01-01

    An exothermic furnace module was developed to rapidly heat and cool a 0.820-in. (2.1 cm) diameter by 2.75-in. (7.0 cm) long TZM molybdenum alloy crucible. The crucible contains copper, oxygen, and carbon for processing in a low-g environment. Peak temperatures of 1270 C were obtainable 3.5 min after start of ignition, and cooling below 950 C some 4.5 min later. These time-temperature relationships were conditioned for a foam-copper experiment, Space Processing Applications Rocket experiment 77-9, in a sounding rocket having a low-g period of 5 min.

  1. Compact handheld digital holographic microscopy system development

    NASA Astrophysics Data System (ADS)

    Singh, Vijay Raj; Sui, Liansheng; Asundi, Anand

    2009-12-01

    Development of a commercial prototype of a reflection handheld digital holographic microscope system is presented in this paper. The concept is based on lensless magnification using diverging wave geometry and a miniaturized optical design which provides a compact packaged system. The optical geometry design gives the object and reference waves the same curvature, so phase aberration is automatically compensated. The basic methodology of the system is developed and further explored for 3D imaging, static deflection and vibration measurement applications. Based on the developed methodology, user-friendly software suitable for an industrial shop floor environment was developed. The applications of the system are presented for 3D imaging, static deflection measurement and vibration analysis of MEMS samples. The developed system is well suited to the testing of MEMS and Microsystems samples, with full-field and real-time features, for static and dynamic inspection and characterization and for monitoring micro-fabrication processes.

  2. Compact handheld digital holographic microscopy system development

    NASA Astrophysics Data System (ADS)

    Singh, Vijay Raj; Sui, Liansheng; Asundi, Anand

    2010-03-01

    Development of a commercial prototype of a reflection handheld digital holographic microscope system is presented in this paper. The concept is based on lensless magnification using diverging wave geometry and a miniaturized optical design which provides a compact packaged system. The optical geometry design gives the object and reference waves the same curvature, so phase aberration is automatically compensated. The basic methodology of the system is developed and further explored for 3D imaging, static deflection and vibration measurement applications. Based on the developed methodology, user-friendly software suitable for an industrial shop floor environment was developed. The applications of the system are presented for 3D imaging, static deflection measurement and vibration analysis of MEMS samples. The developed system is well suited to the testing of MEMS and Microsystems samples, with full-field and real-time features, for static and dynamic inspection and characterization and for monitoring micro-fabrication processes.

  3. Advanced technology development for image gathering, coding, and processing

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.

    1990-01-01

    Three overlapping areas of research activities are presented: (1) Information theory and optimal filtering are extended to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing. (2) Focal-plane processing techniques and technology are developed to combine effectively image gathering with coding. The emphasis is on low-level vision processing akin to the retinal processing in human vision. (3) A breadboard adaptive image-coding system is being assembled. This system will be used to develop and evaluate a number of advanced image-coding technologies and techniques as well as research the concept of adaptive image coding.

  4. Information Processing in Living Systems

    NASA Astrophysics Data System (ADS)

    Tkačik, Gašper; Bialek, William

    2016-03-01

    Life depends as much on the flow of information as on the flow of energy. Here we review the many efforts to make this intuition precise. Starting with the building blocks of information theory, we explore examples where it has been possible to measure, directly, the flow of information in biological networks, or more generally where information-theoretic ideas have been used to guide the analysis of experiments. Systems of interest range from single molecules (the sequence diversity in families of proteins) to groups of organisms (the distribution of velocities in flocks of birds), and all scales in between. Many of these analyses are motivated by the idea that biological systems may have evolved to optimize the gathering and representation of information, and we review the experimental evidence for this optimization, again across a wide range of scales.

  5. Compact Process Development at Babcock & Wilcox

    SciTech Connect

    Eric Shaber; Jeffrey Phillips

    2012-03-01

    Multiple process approaches have been used historically to manufacture cylindrical nuclear fuel compacts. Scale-up of fuel compacting was required for the Next Generation Nuclear Plant (NGNP) project to achieve an economically viable automated production process capable of providing a minimum of 10 compacts/minute with high production yields. In addition, the scale-up effort was required to achieve matrix density equivalent to baseline historical production processes, and allow compacting at fuel packing fractions up to 46% by volume. The scale-up approach of jet milling, fluid-bed overcoating, and hot-press compacting adopted in the U.S. Advanced Gas Reactor (AGR) Fuel Development Program involves significant paradigm shifts to capitalize on distinct advantages in simplicity, yield, and elimination of mixed waste. A series of compaction trials have been completed to optimize compaction conditions of time, temperature, and forming pressure using natural uranium oxycarbide (NUCO) fuel at packing fractions exceeding 46% by volume. Results from these trials are included. The scale-up effort is nearing completion with the process installed and operable using nuclear fuel materials. Final process testing is in progress to certify the process for manufacture of qualification test fuel compacts in 2012.

  6. Crop monitoring & yield forecasting system based on Synthetic Aperture Radar (SAR) and process-based crop growth model: Development and validation in South and South East Asian Countries

    NASA Astrophysics Data System (ADS)

    Setiyono, T. D.

    2014-12-01

Accurate and timely information on rice crop growth and yield helps governments and other stakeholders adapt their economic policies and enables relief organizations to better anticipate and coordinate relief efforts in the wake of a natural catastrophe. Such delivery of rice growth and yield information is made possible by regular earth observation using space-borne Synthetic Aperture Radar (SAR) technology combined with a crop modeling approach to estimate yield. Radar-based remote sensing is capable of observing rice vegetation growth irrespective of cloud coverage, an important feature given that in incidences of flooding the sky is often cloud-covered. The system allows rapid damage assessment over the area of interest. Rice yield monitoring is based on a crop growth simulation and SAR-derived key information, particularly start of season and leaf growth rate. Results from pilot study sites in South and South East Asian countries suggest that incorporation of SAR data into the crop model improves estimation of actual yields. Remote-sensing data assimilation into the crop model effectively captures responses of rice crops to environmental conditions over large spatial coverage, which otherwise is practically impossible to achieve. Such improvement of actual yield estimates offers practical applications such as in a crop insurance program. A process-based crop simulation model is used in the system to ensure climate information is adequately captured and to enable mid-season yield forecasts.
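The coupling the abstract describes, SAR-derived start of season and leaf growth rate driving a process-based growth model, can be caricatured in a few lines. This toy model (all parameter names and values are illustrative assumptions, not the operational system) grows a canopy, intercepts light by Beer-Lambert absorption, and converts intercepted radiation to yield:

```python
import math

def rice_yield_estimate(start_doy, leaf_growth_rate, end_doy=280,
                        rue=1.2, par_per_day=10.0, harvest_index=0.45):
    """Toy process-based yield estimate (t/ha) driven by two SAR-derived
    inputs: start of season (day of year) and early leaf growth rate."""
    season_days = max(end_doy - start_doy, 0)
    lai, biomass = 0.1, 0.0
    for _ in range(season_days):
        lai = min(lai * (1.0 + leaf_growth_rate), 6.0)  # capped canopy growth
        f_int = 1.0 - math.exp(-0.5 * lai)              # Beer-Lambert light capture
        biomass += rue * par_per_day * f_int            # g/m2 accumulated per day
    return biomass * harvest_index / 100.0              # g/m2 -> t/ha

# A later SAR-detected start of season shortens the growing window:
print(round(rice_yield_estimate(start_doy=160, leaf_growth_rate=0.08), 2))
print(round(rice_yield_estimate(start_doy=200, leaf_growth_rate=0.08), 2))
```

The point of the sketch is the data flow: the two remotely sensed quantities are the only per-field inputs, exactly the role SAR plays in the described system.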

  7. Development of the T+M coupled flow-geomechanical simulator to describe fracture propagation and coupled flow-thermal-geomechanical processes in tight/shale gas systems

    NASA Astrophysics Data System (ADS)

    Kim, Jihoon; Moridis, George J.

    2013-10-01

We developed a hydraulic fracturing simulator, the T+M simulator, by coupling a flow simulator to a geomechanics code. Modeling of the vertical fracture development involves continuous updating of the boundary conditions and of the data connectivity, based on the finite element method for geomechanics. The T+M simulator can model the initial fracture development during the hydraulic fracturing operations, after which the domain description changes from a single continuum to double or multiple continua in order to rigorously model both flow and geomechanics for fracture-rock matrix systems. The simulator provides two-way coupling between fluid-heat flow and geomechanics, accounting for thermo-poro-mechanics, treats nonlinear permeability and geomechanical moduli explicitly, and dynamically tracks changes in the fracture(s) and in the pore volume. We also fully account for leak-off in all directions during hydraulic fracturing. We first test the T+M simulator, matching numerical solutions with the analytical solutions for poromechanical effects, static fractures, and fracture propagation. Then, in numerical simulations of various cases of planar fracture propagation, we find that shear failure can limit the vertical propagation of the tensile fracture because of leak-off into the reservoirs. Slow injection causes more leak-off than fast injection when the same amount of fluid is injected. Changes in initial total stress and contributions of shear effective stress to tensile failure can also affect formation of the fractured areas, and the geomechanical responses remain well-posed.

  8. Development of the T+M coupled flow–geomechanical simulator to describe fracture propagation and coupled flow–thermal–geomechanical processes in tight/shale gas systems

    SciTech Connect

    Kim, Jihoon; Moridis, George J.

    2013-10-01

We developed a hydraulic fracturing simulator, the T+M simulator, by coupling a flow simulator to a geomechanics code. Modeling of the vertical fracture development involves continuous updating of the boundary conditions and of the data connectivity, based on the finite element method for geomechanics. The T+M simulator can model the initial fracture development during the hydraulic fracturing operations, after which the domain description changes from a single continuum to double or multiple continua in order to rigorously model both flow and geomechanics for fracture-rock matrix systems. The simulator provides two-way coupling between fluid-heat flow and geomechanics, accounting for thermoporomechanics, treats nonlinear permeability and geomechanical moduli explicitly, and dynamically tracks changes in the fracture(s) and in the pore volume. We also fully account for leak-off in all directions during hydraulic fracturing. We first validate the T+M simulator, matching numerical solutions with the analytical solutions for poromechanical effects, static fractures, and fracture propagation. Then, in numerical simulations of various cases of planar fracture propagation, we find that shear failure can limit the vertical propagation of the tensile fracture because of leak-off into the reservoirs. Slow injection causes more leak-off than fast injection when the same amount of fluid is injected. Changes in initial total stress and contributions of shear effective stress to tensile failure can also affect formation of the fractured areas, and the geomechanical responses remain well-posed.

  9. ENGINEERED BARRIER SYSTEM FEATURES, EVENTS, AND PROCESSES

    SciTech Connect

    na

    2005-05-30

This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis'' (Figure 1-1). The objective of this analysis was to develop the BDCFs for the

  10. SOFC system with integrated catalytic fuel processing

    NASA Astrophysics Data System (ADS)

    Finnerty, Caine; Tompsett, Geoff. A.; Kendall, Kevin; Ormerod, R. Mark

In recent years, there has been much interest in the development of solid oxide fuel cell technology operating directly on hydrocarbon fuels. The development of a catalytic fuel processing system, which is integrated with the solid oxide fuel cell (SOFC) power source, is outlined here. The catalytic device utilises a novel three-way catalytic system consisting of an in situ pre-reformer catalyst, the fuel cell anode catalyst and a platinum-based combustion catalyst. The three individual catalytic stages have been tested in a model catalytic microreactor. Both temperature-programmed and isothermal reaction techniques have been applied. Results from these experiments were used to design the demonstration SOFC unit. The apparatus used for catalytic characterisation can also perform in situ electrochemical measurements as described in previous papers [C.M. Finnerty, R.H. Cunningham, K. Kendall, R.M. Ormerod, Chem. Commun. (1998) 915-916; C.M. Finnerty, N.J. Coe, R.H. Cunningham, R.M. Ormerod, Catal. Today 46 (1998) 137-145]. This enabled the performance of the SOFC to be determined at a range of temperatures and reaction conditions, with a current output of 290 mA cm-2 at 0.5 V being recorded. Methane and butane have been evaluated as fuels. Thus, optimisation of the in situ partial oxidation pre-reforming catalyst was essential, with catalysts producing high H2/CO ratios at reaction temperatures between 873 K and 1173 K being chosen. These included Ru and Ni/Mo-based catalysts. Hydrocarbon fuels were directly injected into the catalytic SOFC system. Microreactor measurements revealed the reaction mechanisms as the fuel was transported through the three-catalyst device. The demonstration system showed that the fuel processing could be successfully integrated with the SOFC stack.

  11. Development of a one-dimensional electro-thermophysical model of the snow sea-ice system: Arctic climate processes and microwave remote sensing applications

    NASA Astrophysics Data System (ADS)

    Hanesiak, John Michael

Snow-covered sea ice plays a crucial role in the earth's climate. This includes polar biology, local, regional and world weather and ocean circulations, as well as indigenous people's way of life. Recent research has indicated significant climate change in the polar regions, especially the Canadian arctic. Polar climate processes are also among the most poorly represented within global circulation models (GCMs). The goal of this thesis is to improve our understanding and capability to simulate arctic climate processes in a predictive sense. An electro-thermophysical relationship exists between the thermophysical characteristics (climate variables and processes) and electrical properties (dielectrics) that control microwave remote sensing of snow-covered first-year sea ice (FYI). This work explicitly links microwave dielectrics and a thermodynamic model of snow and sea ice by addressing four key issues: (1) ensuring the existing one-dimensional sea ice models treat the surface energy balance (SEB) and snow/ice thermodynamics on the time scales we see occurring in field experiments, (2) ensuring the snow/ice thermodynamics are not compromised by differences in environmental and spatial representation within components of the SEB, (3) ensuring the snow layer is properly handled in the modeling environment, and (4) determining how we can make use of satellite microwave remote sensing data within the model environment. Results suggest that diurnal processes are critical and need to be accounted for in modeling snow-covered FYI, similar to the time scales acting in microwave remote sensing signatures. Output from the coupled snow sea-ice model provides the required input to microwave dielectric models of snow and sea ice to predict microwave penetration depths within the snow and sea ice (an Electro-Thermophysical model of the Snow Sea Ice System (ETSSIS)). Results suggest ETSSIS can accurately simulate microwave penetration depths in the cold dry snow season and
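The link from thermophysical state to microwave observables runs through the complex permittivity. A standard low-loss approximation for penetration depth (a textbook formula, not the ETSSIS implementation; the snow permittivity values below are illustrative assumptions typical of dry snow) can be sketched as:

```python
import math

def penetration_depth(freq_ghz, eps_real, eps_imag):
    """Low-loss microwave penetration depth in metres:
    delta_p ~ lambda0 * sqrt(eps') / (2 * pi * eps'')."""
    c = 299_792_458.0                     # speed of light, m/s
    lam0 = c / (freq_ghz * 1e9)           # free-space wavelength, m
    return lam0 * math.sqrt(eps_real) / (2.0 * math.pi * eps_imag)

# Dry snow near 10 GHz, eps ~ 1.5 - j0.003: penetration of ~2 m,
# so the radiometer "sees" deep into the cold dry snowpack.
print(round(penetration_depth(10.0, 1.5, 0.003), 2))  # 1.95
```

Warming or brine wetting raises the imaginary part by orders of magnitude, collapsing the penetration depth, which is why the thermodynamic state of the snow/ice system controls the microwave signature.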

  12. Processed pseudogenes acquired somatically during cancer development.

    PubMed

    Cooke, Susanna L; Shlien, Adam; Marshall, John; Pipinikas, Christodoulos P; Martincorena, Inigo; Tubio, Jose M C; Li, Yilong; Menzies, Andrew; Mudie, Laura; Ramakrishna, Manasa; Yates, Lucy; Davies, Helen; Bolli, Niccolo; Bignell, Graham R; Tarpey, Patrick S; Behjati, Sam; Nik-Zainal, Serena; Papaemmanuil, Elli; Teixeira, Vitor H; Raine, Keiran; O'Meara, Sarah; Dodoran, Maryam S; Teague, Jon W; Butler, Adam P; Iacobuzio-Donahue, Christine; Santarius, Thomas; Grundy, Richard G; Malkin, David; Greaves, Mel; Munshi, Nikhil; Flanagan, Adrienne M; Bowtell, David; Martin, Sancha; Larsimont, Denis; Reis-Filho, Jorge S; Boussioutas, Alex; Taylor, Jack A; Hayes, Neil D; Janes, Sam M; Futreal, P Andrew; Stratton, Michael R; McDermott, Ultan; Campbell, Peter J

    2014-01-01

    Cancer evolves by mutation, with somatic reactivation of retrotransposons being one such mutational process. Germline retrotransposition can cause processed pseudogenes, but whether this occurs somatically has not been evaluated. Here we screen sequencing data from 660 cancer samples for somatically acquired pseudogenes. We find 42 events in 17 samples, especially non-small cell lung cancer (5/27) and colorectal cancer (2/11). Genomic features mirror those of germline LINE element retrotranspositions, with frequent target-site duplications (67%), consensus TTTTAA sites at insertion points, inverted rearrangements (21%), 5' truncation (74%) and polyA tails (88%). Transcriptional consequences include expression of pseudogenes from UTRs or introns of target genes. In addition, a somatic pseudogene that integrated into the promoter and first exon of the tumour suppressor gene, MGA, abrogated expression from that allele. Thus, formation of processed pseudogenes represents a new class of mutation occurring during cancer development, with potentially diverse functional consequences depending on genomic context. PMID:24714652
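One of the genomic hallmarks listed, the polyA tail found in 88% of events, lends itself to a simple screen. A simplified sketch of that kind of feature detection (not the study's actual pipeline; the example sequence is invented):

```python
def polyA_tail_length(seq, min_len=5):
    """Length of a terminal poly-A run, a hallmark of retrotransposed
    (processed) pseudogene insertions; returns 0 below the minimum run."""
    n = 0
    for base in reversed(seq.upper()):
        if base != 'A':
            break
        n += 1
    return n if n >= min_len else 0

# Hypothetical insertion junction containing the consensus TTTTAA motif
# and a 12-base polyA tail:
insertion = "ATGGCGTTTTAACCGT" + "A" * 12
print(polyA_tail_length(insertion))  # 12
```

Combined with checks for target-site duplications and 5' truncation, such per-event feature calls are what let the authors match the somatic insertions to the signature of germline LINE-mediated retrotransposition.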

  13. Processed pseudogenes acquired somatically during cancer development

    PubMed Central

    Cooke, Susanna L.; Shlien, Adam; Marshall, John; Pipinikas, Christodoulos P.; Martincorena, Inigo; Tubio, Jose M.C.; Li, Yilong; Menzies, Andrew; Mudie, Laura; Ramakrishna, Manasa; Yates, Lucy; Davies, Helen; Bolli, Niccolo; Bignell, Graham R.; Tarpey, Patrick S.; Behjati, Sam; Nik-Zainal, Serena; Papaemmanuil, Elli; Teixeira, Vitor H.; Raine, Keiran; O’Meara, Sarah; Dodoran, Maryam S.; Teague, Jon W.; Butler, Adam P.; Iacobuzio-Donahue, Christine; Santarius, Thomas; Grundy, Richard G.; Malkin, David; Greaves, Mel; Munshi, Nikhil; Flanagan, Adrienne M.; Bowtell, David; Martin, Sancha; Larsimont, Denis; Reis-Filho, Jorge S.; Boussioutas, Alex; Taylor, Jack A.; Hayes, Neil D.; Janes, Sam M.; Futreal, P. Andrew; Stratton, Michael R.; McDermott, Ultan; Campbell, Peter J.; Provenzano, Elena; van de Vijver, Marc; Richardson, Andrea L.; Purdie, Colin; Pinder, Sarah; Mac Grogan, Gaetan; Vincent-Salomon, Anne; Larsimont, Denis; Grabau, Dorthe; Sauer, Torill; Garred, Øystein; Ehinger, Anna; Van den Eynden, Gert G.; van Deurzen, C.H.M; Salgado, Roberto; Brock, Jane E.; Lakhani, Sunil R.; Giri, Dilip D.; Arnould, Laurent; Jacquemier, Jocelyne; Treilleux, Isabelle; Caldas, Carlos; Chin, Suet-Feung; Fatima, Aquila; Thompson, Alastair M.; Stenhouse, Alasdair; Foekens, John; Martens, John; Sieuwerts, Anieta; Brinkman, Arjen; Stunnenberg, Henk; Span, Paul N.; Sweep, Fred; Desmedt, Christine; Sotiriou, Christos; Thomas, Gilles; Broeks, Annegein; Langerod, Anita; Aparicio, Samuel; Simpson, Peter T.; van ’t Veer, Laura; Erla Eyfjörd, Jórunn; Hilmarsdottir, Holmfridur; Jonasson, Jon G.; Børresen-Dale, Anne-Lise; Lee, Ming Ta Michael; Wong, Bernice Huimin; Tan, Benita Kiat Tee; Hooijer, Gerrit K.J.

    2014-01-01

    Cancer evolves by mutation, with somatic reactivation of retrotransposons being one such mutational process. Germline retrotransposition can cause processed pseudogenes, but whether this occurs somatically has not been evaluated. Here we screen sequencing data from 660 cancer samples for somatically acquired pseudogenes. We find 42 events in 17 samples, especially non-small cell lung cancer (5/27) and colorectal cancer (2/11). Genomic features mirror those of germline LINE element retrotranspositions, with frequent target-site duplications (67%), consensus TTTTAA sites at insertion points, inverted rearrangements (21%), 5′ truncation (74%) and polyA tails (88%). Transcriptional consequences include expression of pseudogenes from UTRs or introns of target genes. In addition, a somatic pseudogene that integrated into the promoter and first exon of the tumour suppressor gene, MGA, abrogated expression from that allele. Thus, formation of processed pseudogenes represents a new class of mutation occurring during cancer development, with potentially diverse functional consequences depending on genomic context. PMID:24714652

  14. Development of Bio-GAS systems

    NASA Technical Reports Server (NTRS)

    Takayanagi, M.; Kitamura, S.; Nemoto, H.; Kimura, T.; Zaiki, Y.; Kitakohji, T.; Fujita, S.; Kameda, M.; Noda, M.; Kawasaki, Y.

    1988-01-01

    Four experiment systems which have fundamental significance in the field of biotechnology are developed for the Get Away Special (GAS). Unique considerations were necessary to develop the systems which carry out biotechnological experiments under GAS's restricted conditions: delicate thermal control, fluid handling and protection from contamination. All experimental processes are controlled by internal sequencers and results of the experiments are recorded as images and numerical data within the systems. The systems are standardized in order to enable repeated use with a variety of experiments by replacement of the experiment modules and modification of experiment sequencing programs.

  15. Development of the selective coagulation process

    SciTech Connect

    Yoon, R.H.; Luttrell, G.H.

    1992-07-01

    The selective hydrophobic coagulation (SHC) process is based on the recent finding that hydrophobic particles can be selectively coagulated without using traditional agglomerating agents or flocculants. The driving force for the coagulation is the attractive energy between hydrophobic surfaces, an interaction that has been overlooked in classical colloid chemistry. In most cases, selective separations can be achieved using simple pH control to disperse the mineral matter, followed by recovery of the coal coagula using techniques that take advantage of the size enlargement. In the present work, studies have been carried out to further investigate the fundamental mechanisms of the SHC process and the parameters that affect the process of separating coal from the ash-forming minerals and pyritic sulfur. Studies have included direct force measurements of the attractive interaction between model hydrophobic surfaces, in-situ measurements of the size distributions of coagula formed under a variety of operating conditions, and development of a population balance model to describe the coagulation process. An extended DLVO colloid stability model which includes a hydrophobic interaction energy term has also been developed to explain the findings obtained from the experimental studies. In addition to the fundamental studies, bench-scale process development test work has been performed to establish the best possible method of separating the coagula from dispersed mineral matter. Two types of separators, i.e., a sedimentation tank and a rotating drum screen, were examined in this study. The sedimentation tank proved to be the more efficient unit, achieving ash reductions as high as 60% in a single pass while recovering more than 90% of the combustible material. This device, which minimizes turbulence and coagula breakage, was used in subsequent test work to optimize design and operating parameters.
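The population balance model mentioned in the abstract is typically built on the Smoluchowski coagulation equation. A minimal discrete sketch with a constant collision kernel (an assumption for illustration; the actual model's kernel and parameters are not given in the abstract):

```python
import numpy as np

def coagulate(n, beta, dt, steps):
    """Discrete Smoluchowski population balance: n[i] is the number density
    of coagula containing i+1 primary particles; beta is a constant
    collision kernel.  Coagula larger than the array are truncated."""
    n = np.array(n, dtype=float)
    for _ in range(steps):
        dn = np.zeros_like(n)
        for i in range(len(n)):
            for j in range(len(n)):
                rate = beta * n[i] * n[j] * dt
                dn[i] -= rate                      # i-mer consumed by collision
                if i + j + 1 < len(n):
                    dn[i + j + 1] += 0.5 * rate    # larger coagulum formed
        n = np.maximum(n + dn, 0.0)
    return n

# Start from monomers only; number density falls as coagula grow:
n = coagulate([1.0, 0, 0, 0, 0, 0], beta=0.1, dt=0.1, steps=50)
print(n.round(4))
```

The size enlargement this predicts is exactly what the sedimentation tank and rotating drum screen exploit to separate hydrophobic coal coagula from dispersed mineral matter.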

  16. Electrochemical decontamination system for actinide processing gloveboxes

    SciTech Connect

    Wedman, D.E.; Lugo, J.L.; Ford, D.K.; Nelson, T.O.; Trujillo, V.L.; Martinez, H.E.

    1998-03-01

An electrolytic decontamination technology has been developed and successfully demonstrated at Los Alamos National Laboratory (LANL) for the decontamination of actinide processing gloveboxes. The technique decontaminates the interior surfaces of stainless steel gloveboxes utilizing a process similar to electropolishing. The decontamination device is compact and transportable, allowing it to be placed entirely within the glovebox line. In this way, decontamination does not require the operator to wear any additional personal protective equipment, and there is no need for additional air handling or containment systems. Decontamination prior to glovebox decommissioning reduces the potential for worker exposure and environmental releases during the decommissioning, transport, and size reduction procedures which follow. The goal of this effort is to reduce contamination levels of alpha-emitting nuclides for a resultant reduction in waste category from High Level Transuranic (TRU) to Low Specific Activity (LSA, less than or equal to 100 nCi/g). This reduction in category results in a 95% reduction in disposal and disposition costs for the decontaminated gloveboxes. The resulting contamination levels following decontamination by this method are generally five orders of magnitude below the LSA specification. Additionally, the sodium sulfate based electrolyte utilized in the process is fully recyclable, which results in a minimum of secondary waste. The process has been implemented on seven gloveboxes within LANL's Plutonium Facility at Technical Area 55. Of these gloveboxes, two have been discarded as low level waste items and the remaining five have been reused.

  17. RDD-100 and the systems engineering process

    NASA Technical Reports Server (NTRS)

    Averill, Robert D.

    1994-01-01

    An effective systems engineering approach applied through the project life cycle can help Langley produce a better product. This paper demonstrates how an enhanced systems engineering process for in-house flight projects assures that each system will achieve its goals with quality performance and within planned budgets and schedules. This paper also describes how the systems engineering process can be used in combination with available software tools.

  18. New technological developments in processing solid waste to energy

    SciTech Connect

    Sherwin, E.T.; Nollet, A.R.

    1980-01-01

The state of the art in production of energy from municipal solid waste is briefly reviewed, outlining the relative advantages, limitations, and economics of various systems, including mass-burning versus RDF in spreader-stoker fired boilers; suspension-firing of RDF; pulverized fuel; pelletized fuels; and gaseous fuels generated by pyrolysis processes. A new system for processing solid waste for resource recovery separates the incoming waste by air-classification as the first processing step; conventional systems shred as the first processing step. This new system, originally developed to guard against shredder explosions, has the following supplemental advantages: it produces a refuse-derived fuel (RDF) having higher heating values and less ash than conventional systems, and it reclaims waste paper which can be used as paper-making furnish, utilizing current beneficiating and cleaning techniques. Production of paper from virgin materials requires 20 to 30 million Btu per ton of paper, versus 10 million Btu when waste paper is utilized as furnish. A new system proposed for storage, handling, and feeding refuse-derived fuel to large suspension-fired boilers is examined. This system proposes coarse shredding only of the light fraction at the solid waste processing plant; shipment in compactor trailers; storage in the same trailers; fine-shredding near the boiler; and air transport from the shredders using material handling fans injecting directly into the boilers. This system provides more efficient operation at less capital cost than systems utilized to date.

  19. A multiarchitecture parallel-processing development environment

    NASA Technical Reports Server (NTRS)

    Townsend, Scott; Blech, Richard; Cole, Gary

    1993-01-01

    A description is given of the hardware and software of a multiprocessor test bed - the second generation Hypercluster system. The Hypercluster architecture consists of a standard hypercube distributed-memory topology, with multiprocessor shared-memory nodes. By using standard, off-the-shelf hardware, the system can be upgraded to use rapidly improving computer technology. The Hypercluster's multiarchitecture nature makes it suitable for researching parallel algorithms in computational field simulation applications (e.g., computational fluid dynamics). The dedicated test-bed environment of the Hypercluster and its custom-built software allows experiments with various parallel-processing concepts such as message passing algorithms, debugging tools, and computational 'steering'. Such research would be difficult, if not impossible, to achieve on shared, commercial systems.
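The hypercube topology underlying the Hypercluster has a compact addressing rule: each of the 2^d nodes is a d-bit identifier, and its neighbors differ from it in exactly one bit. A small sketch of how message-passing code typically enumerates neighbors (illustrative, not the Hypercluster's actual software):

```python
def hypercube_neighbors(node, dim):
    """In a d-dimensional hypercube, flipping each of the d bits of a
    node's id with XOR yields its directly connected neighbors."""
    return [node ^ (1 << b) for b in range(dim)]

# 3-D hypercube (8 nodes): node 5 = 0b101 connects to 4, 7, and 1.
print(hypercube_neighbors(5, 3))  # [4, 7, 1]
```

This rule also gives the routing distance for free: the number of hops between two nodes is the Hamming distance between their identifiers.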

  20. Video processing for DLP display systems

    NASA Astrophysics Data System (ADS)

    Markandey, Vishal; Clatanoff, Todd; Pettitt, Greg

    1996-03-01

Texas Instruments' Digital Light Processing™ (DLP™) technology provides all-digital projection displays that offer superior picture quality in terms of resolution, brightness, contrast, and color fidelity. This paper provides an overview of the digital video processing solutions that have been developed by Texas Instruments for the all-digital display. The video processing solutions include: progressive scan conversion, digital video resampling, picture enhancements, color processing, and gamma processing. The real-time implementation of the digital video processing is also discussed, highlighting the use of the scanline video processor (SVP) and the development of custom ASIC solutions.
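Of the listed processing steps, gamma processing is the simplest to illustrate: a DLP's mirrors produce light linearly via pulse-width modulation, so gamma-encoded video levels must be mapped through a degamma lookup table. A generic sketch (the gamma value and table form are common practice, not TI's actual pipeline):

```python
def gamma_lut(gamma=2.2, bits=8):
    """Degamma lookup table mapping gamma-encoded video levels to the
    linear light levels a pulse-width-modulated display needs."""
    max_level = (1 << bits) - 1
    return [round(max_level * (v / max_level) ** gamma)
            for v in range(max_level + 1)]

lut = gamma_lut()
# Mid-gray (code 128) maps well below half of full scale:
print(lut[0], lut[128], lut[255])  # 0 56 255
```

In hardware this table is a small ROM indexed per pixel, which is why gamma processing costs almost nothing in an ASIC implementation.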

  1. Transport processes in biological systems: Tumoral cells and human brain

    NASA Astrophysics Data System (ADS)

    Lucia, Umberto

    2014-01-01

The entropy generation approach has been developed for the analysis of complex systems, with particular regard to biological systems, in order to evaluate their stationary states. The entropy generation is linked to the transport processes associated with exergy flows. Moreover, cancer can be described as an open, complex, dynamic and self-organizing system. Consequently, it is used here as an example for evaluating the different thermo-chemical quantities of the transport processes in normal and tumoral cell systems.
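For steady heat transport between two reservoirs, the entropy generation rate has a one-line form. A worked sketch of this textbook balance (the numbers are illustrative, not values from the cited cell analysis):

```python
def entropy_generation(q_watts, t_hot, t_cold):
    """Entropy generation rate (W/K) for steady heat flow q from a hot
    to a cold reservoir: sigma = q * (1/T_cold - 1/T_hot) >= 0."""
    return q_watts * (1.0 / t_cold - 1.0 / t_hot)

# ~100 W of metabolic heat leaving a body at 310 K into a 293 K room:
print(round(entropy_generation(100.0, 310.0, 293.0), 4))  # 0.0187
```

The approach in the abstract generalizes this balance to the fluxes of mass, heat and chemical species crossing a cell membrane, comparing the resulting generation rates between normal and tumoral cells.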

  2. Development of NIL processes for PV applications

    NASA Astrophysics Data System (ADS)

    Hauser, H.; Tucher, N.; Tokai, K.; Schneider, P.; Wellens, Ch.; Volk, A.; Barke, S.; Müller, C.; Glinsner, T.; Bläsi, B.

    2015-03-01

Due to its high resolution and applicability for large area patterning, Nanoimprint Lithography (NIL) is a promising technology for photovoltaic (PV) applications. However, a successful industrial application of NIL processes is only possible if large-area processing on thin, brittle and potentially rough substrates can be achieved in a high-throughput process. In this work, the development of NIL processes using the novel SmartNIL™ technology from EV Group with a focus on PV applications is described. We applied this tooling to realize a honeycomb texture (8 μm period) on the front side of multicrystalline silicon solar cells leading to an improvement in optical efficiency of 7% relative and a total efficiency gain of 0.5% absolute compared to the industrial standard texture (isotexture). On the rear side of monocrystalline silicon solar cells, we realized diffraction gratings to make use of light trapping effects. An absorption enhancement of up to 35% absolute at a wavelength of 1100 nm is demonstrated. Furthermore, we combined photolithography and NIL processes to introduce features for metal contacts into honeycomb master structures, which initially were realized using interference lithography. As final application, we investigated the realization of very fine contact fingers with prismatic shape in order to minimize reflection losses.

  3. Development of nanoimprint processes for photovoltaic applications

    NASA Astrophysics Data System (ADS)

    Hauser, Hubert; Tucher, Nico; Tokai, Katharina; Schneider, Patrick; Wellens, Christine; Volk, Anne; Seitz, Sonja; Benick, Jan; Barke, Simon; Dimroth, Frank; Müller, Claas; Glinsner, Thomas; Bläsi, Benedikt

    2015-07-01

    Due to its high resolution and applicability for large area patterning, nanoimprint lithography (NIL) is a promising technology for photovoltaic (PV) applications. However, a successful industrial application of NIL processes is only possible if large-area processing on thin, brittle, and potentially rough substrates can be achieved in a high-throughput process. The development of NIL processes using the SmartNIL technology from EV Group with a focus on PV applications is described. The authors applied this tooling to realize a honeycomb texture (8 μm period) on the front side of multicrystalline silicon solar cells, leading to an improvement in optical efficiency of 7% relative and a total efficiency gain of 0.5% absolute compared to the industrial standard texture (isotexture). On the rear side of monocrystalline silicon solar cells, the authors realized diffraction gratings to make use of light trapping effects. An absorption enhancement of up to 35% absolute at a wavelength of 1100 nm is demonstrated. Furthermore, photolithography was combined with NIL processes to introduce features for metal contacts into honeycomb master structures, which were initially realized using interference lithography. As a final application, the authors investigated the realization of very fine contact fingers with prismatic shape in order to minimize reflection losses.

  4. Process for Managing and Customizing HPC Operating Systems

    SciTech Connect

    Brown, David ML

    2014-04-02

A process for maintaining a custom HPC operating system was developed at the Environmental Molecular Sciences Laboratory (EMSL) over the past ten years. This process is generic and flexible enough to manage continuous change, keeping systems updated while managing communication through well-defined pieces of software.

  5. Image processing system performance prediction and product quality evaluation

    NASA Technical Reports Server (NTRS)

    Stein, E. K.; Hammill, H. B. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.

  6. Landsat 7 Science Data Processing: A System's Overview

    NASA Technical Reports Server (NTRS)

    Schweiss, Robert; Daniels, Nate; Derrick, Debora

    2000-01-01

The Landsat Science Data Processing System, developed by NASA for the Landsat 7 Project, provides the science data handling infrastructure used at the EROS Data Center Landsat 7 Data Handling Facility of the USGS, Department of the Interior. This paper presents an overview of the designs, architectures, and details of the various systems used in the processing of the Landsat 7 science data.

  7. Information Processing Capacity of Dynamical Systems

    PubMed Central

    Dambre, Joni; Verstraeten, David; Schrauwen, Benjamin; Massar, Serge

    2012-01-01

    Many dynamical systems, both natural and artificial, are stimulated by time dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This is bounded by the number of linearly independent state variables of the dynamical system, equaling it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory. PMID:22816038
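The paper's central quantity, the capacity with which a system computes a given function of its input, is the squared correlation between that target function and its best linear readout from the system's state variables. A small sketch following that definition (the one-unit "system" below is an invented toy that simply stores the previous input):

```python
import numpy as np

def capacity(states, target):
    """Capacity in [0, 1]: squared correlation between a target function of
    the input and its best linear readout from the system's states."""
    X = states - states.mean(axis=0)
    y = target - target.mean()
    w, *_ = np.linalg.lstsq(X, y, rcond=None)   # optimal linear readout
    y_hat = X @ w
    return float((y_hat @ y) ** 2 / ((y_hat @ y_hat) * (y @ y)))

rng = np.random.default_rng(0)
u = rng.uniform(-1, 1, 1000)
# One state variable that holds the previous input perfectly:
states = np.stack([np.concatenate(([0.0], u[:-1]))], axis=1)
print(round(capacity(states[1:], u[:-1]), 3))  # delay-1 memory: 1.0
print(round(capacity(states[1:], u[1:]), 3))   # current input: near zero
```

Summing such capacities over an orthogonal family of target functions gives the total computational capacity, which the paper shows is bounded by the number of linearly independent state variables.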

  8. Digital system for monitoring and controlling remote processes

    NASA Astrophysics Data System (ADS)

    Roach, Dennis P.

    The need to operate increasingly complex and potentially hazardous facilities at higher degrees of efficiency can be met through the development of automated process control systems. The availability of microcomputers capable of interfacing with data acquisition and control equipment makes it possible to develop such systems at low investment cost. An automated control system is described which maintains a constant or time-varying pressure in a pressure vessel. Process control data acquisition and analysis are carried out using a commercially available microcomputer and a data scanner interface device. In this system, a computer interface was developed to allow precise positioning of custom-designed proportional valves. Continuous real-time process control is achieved through a direct digital control algorithm. The advantages to be gained by adapting this system to other process control applications are discussed. The modular design and the ability of this system to operate many types of hardware control mechanisms make it adaptable to a wide variety of industrial applications.
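
    The abstract does not give the direct digital control algorithm itself; a generic positional PID loop regulating a toy first-order pressure vessel illustrates the usual shape of such an algorithm. All gains, valve limits, and plant constants below are invented for the sketch.

```python
def pid_step(error, state, kp=2.0, ki=0.5, kd=0.1, dt=0.1, i_max=100.0):
    """One iteration of a discrete positional PID law, the common form
    of a direct digital control algorithm, with integral anti-windup."""
    integral, prev_error = state
    integral = max(-i_max, min(i_max, integral + error * dt))
    derivative = (error - prev_error) / dt
    out = kp * error + ki * integral + kd * derivative
    return out, (integral, error)

# Toy first-order pressure vessel: dP/dt = 0.8*u - 0.2*P
setpoint, P, u = 100.0, 0.0, 0.0
state = (0.0, 0.0)
dt = 0.1
for _ in range(600):
    u, state = pid_step(setpoint - P, state, dt=dt)
    u = max(0.0, min(u, 50.0))   # proportional-valve travel limits
    P += dt * (0.8 * u - 0.2 * P)
print(f"final pressure: {P:.1f} (setpoint {setpoint})")
```

    In a real controller the same step function runs at a fixed sample rate against sensor readings rather than a simulated plant; the anti-windup clamp matters precisely because real valves saturate, as the proportional valves here would.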

  9. Research on machine vision system of monitoring injection molding processing

    NASA Astrophysics Data System (ADS)

    Bai, Fan; Zheng, Huifeng; Wang, Yuebing; Wang, Cheng; Liao, Si'an

    2016-01-01

    With the wide adoption of the injection molding process, an embedded monitoring system based on machine vision has been developed to automatically detect abnormalities in injection molding processing. First, the hardware system and the embedded software system were designed. Camera calibration was then carried out to establish an accurate camera model and correct distortion. Next, a segmentation algorithm was applied to extract the monitored objects of the injection molding process system. The realization procedure of the system comprised initialization, process monitoring, and product detail detection. Finally, the experimental results were analyzed, including the detection rate for several kinds of abnormality. The system achieved multi-zone monitoring and product detail detection of the injection molding process with high accuracy and good stability.
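
    The abstract does not specify which segmentation algorithm was used; as one plausible stand-in, a simple Otsu threshold separating a bright molded part from a dark background can be sketched as follows. The synthetic frame and all parameters are invented.

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: choose the threshold that maximizes the
    between-class variance of the grayscale histogram."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    omega = np.cumsum(p)                   # class-0 probability
    mu = np.cumsum(p * np.arange(256))     # class-0 mean * omega
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    return int(np.nanargmax(sigma_b))

# Synthetic frame: dark background with a bright rectangular "part".
rng = np.random.default_rng(1)
frame = rng.normal(40, 8, (120, 160))
frame[40:80, 60:120] += 120
frame = np.clip(frame, 0, 255)

t = otsu_threshold(frame)
mask = frame > t
print(f"threshold={t}, foreground pixels={mask.sum()}")
```

    A production system would follow this with connected-component analysis over the mask per monitored zone, flagging a zone whenever the extracted region's area or position departs from its reference.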

  10. High-power ultrasonic processing: Recent developments and prospective advances

    NASA Astrophysics Data System (ADS)

    Gallego-Juarez, Juan A.

    2010-01-01

    Although the application of ultrasonic energy to produce or enhance a wide variety of processes has been explored since about the middle of the 20th century, only a small number of ultrasonic processes have been established at the industrial level. However, during the last ten years interest in ultrasonic processing has revived, particularly in industrial sectors where ultrasonic technology may represent a clean and efficient tool for improving classical existing processes or an innovative alternative for the development of new processes. Such seems to be the case in relevant sectors such as the food industry, the environment, pharmaceutical and chemical manufacture, machinery, mining, etc., where power ultrasound is becoming an emerging technology for process development. Possibly the major problem in applying high-intensity ultrasound to industrial processing is the design and development of efficient power ultrasonic systems (generators and reactors) capable of successful large-scale operation and specifically adapted to each individual process. In the area of ultrasonic processing in fluid media, and more specifically in gases, the development of stepped-plate transducers and other power generators with extensive radiating surfaces has strongly contributed to the implementation, at the semi-industrial and industrial stage, of several commercial applications in sectors such as the food and beverage industry (defoaming, drying, extraction, etc.), the environment (air cleaning, sludge filtration, etc.), and machinery and manufacturing processes (textile washing, paint manufacture, etc.). The development of different cavitational reactors for liquid treatment in continuous flow is helping to introduce the wide potential of sonochemistry into industry. Processes such as water and effluent treatment, crystallization, and soil remediation have already been implemented at the semi-industrial and/or industrial stage. Other single advances in sectors like mining or energy have

  11. Development of Waste Reduction System of Wastewater Treatment Process Using a Moss: Production of Useful Materials from Remainder of a Moss

    NASA Astrophysics Data System (ADS)

    Fumihisa, Kobayashi

    Landfill leachate pollution presents a serious environmental problem. It would be valuable to develop a sustainable method, one that is inexpensive and requires little energy, to eliminate the pollution and dispose of the waste. In a previous study, we reported the results of a leachate treatment for landfills in which we relied on the moss, Scopelophia cataractae, to support a sustainable method of waste reduction. In this study, toward a waste reduction system for landfill leachate treatment, we attempted to produce zinc as a useful metal and ethanol as a fuel from the remainder of the moss after wastewater treatment. Steam explosion, a physicochemical pretreatment that exposes the raw material to saturated steam under high pressure and temperature, was used to pretreat the moss. Zinc was recovered by electrolysis; the maximum zinc recovery after wastewater treatment was 0.504, obtained at 2.0 MPa steam pressure (211 °C) and 5 min steaming time. Then, by simultaneous saccharification and fermentation using Meicelase and Saccharomyces cerevisiae AM12, a maximum ethanol concentration of 0.42 g dm-3 was produced from 10 g dm-3 of exploded moss at 2.5 MPa steam pressure (223 °C) and 1 min steaming time.
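
    The reported figures can be sanity-checked against the standard stoichiometric maxima for cellulosic ethanol (1.111 g glucose per g cellulose on hydrolysis, 0.511 g ethanol per g glucose on fermentation). The moss's cellulose content is not given in the abstract, so the 30% figure below is only an assumed placeholder.

```python
# Back-of-the-envelope yield check using the reported figures.
ethanol = 0.42               # g/dm^3, reported maximum concentration
substrate = 10.0             # g/dm^3 of exploded moss
cellulose_fraction = 0.30    # assumed placeholder, not from the abstract

glucose_max = substrate * cellulose_fraction * 1.111  # hydrolysis gain
ethanol_max = glucose_max * 0.511                     # fermentation max

print(f"overall yield: {ethanol / substrate * 100:.1f}% of biomass")
print(f"fraction of theoretical max (at 30% cellulose): "
      f"{ethanol / ethanol_max * 100:.0f}%")
```

    The point of the check is only that the reported 0.42 g dm-3 is well below any stoichiometric ceiling, i.e. internally plausible for a lignocellulosic substrate after partial conversion.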

  12. Developing the Manufacturing Process for Hylene MP Curing Agent

    SciTech Connect

    Eastwood, Eric

    2009-02-16

    This report details efforts to scale up and re-establish the manufacturing process for the curing agent known as Hylene MP. First, small-scale reactions were completed under varying conditions to determine the key drivers for yielding a high-quality product. Once the optimum conditions were determined at the small scale, the scaled-up process conditions were established. New equipment was incorporated into the manufacturing process to create a closed production system, improving chemical exposure controls and worker safety. A safe, efficient manufacturing process was developed to produce high-quality Hylene MP in large quantities.

  13. A Reverse Osmosis System for an Advanced Separation Process Laboratory.

    ERIC Educational Resources Information Center

    Slater, C. S.; Paccione, J. D.

    1987-01-01

    Focuses on the development of a pilot unit for use in an advanced separations process laboratory in an effort to develop experiments on such processes as reverse osmosis, ultrafiltration, adsorption, and chromatography. Discusses reverse osmosis principles, the experimental system design, and some experimental studies. (TW)
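
    The reverse osmosis principles mentioned can be illustrated with the standard solution-diffusion flux relation Jw = A(ΔP − Δπ) together with the van 't Hoff expression for osmotic pressure. The membrane permeability and operating point below are assumed textbook-scale values, not numbers from this work.

```python
R = 0.083145      # gas constant, L*bar/(mol*K)
T = 298.15        # temperature, K

def osmotic_pressure(molarity, ions=2):
    """van 't Hoff: pi = i * M * R * T (bar); i = 2 for fully
    dissociated NaCl."""
    return ions * molarity * R * T

A = 3.0e-2        # L/(m^2*h*bar), assumed membrane water permeability
dP = 30.0         # bar, applied transmembrane pressure
feed = 0.1        # mol/L NaCl (about 5.8 g/L)

pi = osmotic_pressure(feed)
Jw = A * (dP - pi)   # water flux, L/(m^2*h)
print(f"osmotic pressure: {pi:.1f} bar, water flux: {Jw:.2f} LMH")
```

    The same relation is what makes such a pilot unit instructive in a laboratory course: varying feed concentration changes Δπ, and the measured flux should fall on a straight line against the net driving pressure.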

  14. Integration mockup and process material management system

    NASA Astrophysics Data System (ADS)

    Verble, Adas James, Jr.

    1992-02-01

    Work to define and develop a full-scale Space Station Freedom (SSF) mockup with the flexibility to evolve into future designs, to validate techniques for maintenance and logistics, and to verify human task allocations and support trade studies is described. This work began in early 1985 and ended in August 1991. The mockups are presently being used at MSFC in Building 4755 as a technology and design testbed, as well as for public display. Micro Craft also began work on the Process Material Management System (PMMS) under this contract. The PMMS simulator was a sealed enclosure for testing to identify liquid, gaseous, and particulate samples and specimens, including urine, waste water, condensate, hazardous gases, surrogate gases, liquids, and solids. Because the SSF would require many trade studies to validate techniques for maintenance and logistics and to verify system task allocations, it was necessary to develop a full-scale mockup representative of the current SSF design that could easily be changed as the SSF design evolved. The tasks defined for Micro Craft were to provide the personnel, services, tools, and materials for the SSF mockup, which would consist of four modules, nodes, interior components, and part-task mockups of MSFC-responsible engineering systems, including the Environmental Control and Life Support System (ECLSS) testbed. For the initial study, the mockups were low-fidelity soft mockups of graphic arts and other low-cost materials, which evolved into higher-fidelity mockups as the R&D design evolved, by modifying or rebuilding, an important cost-saving factor in the design process. We designed, fabricated, and maintained the full-size mockup shells and support stands. The shells consisted of cylinders, end cones, rings, longerons, docking ports, crew airlocks, and windows. The ECLSS required a heavier cylinder to support the ECLSS systems test program. Details of this activity will be covered.
Support stands were

  16. Expert system development for probabilistic load simulation

    NASA Technical Reports Server (NTRS)

    Ho, H.; Newell, J. F.

    1991-01-01

    A knowledge-based system, LDEXPT, using the intelligent database paradigm was developed for the Composite Load Spectra (CLS) project to simulate the probabilistic loads of a space propulsion system. The knowledge-base approach provides a systematic framework for organizing load information and facilitates coupling numerical processing with symbolic (information) processing. It provides an incremental development environment for building generic probabilistic load models and bookkeeping the associated load information. A large volume of load data is stored in the database and can be retrieved and updated by a built-in database management system, which standardizes data storage and retrieval procedures, helps maintain data integrity, and avoids data redundancy. The intelligent database paradigm also provides ways to build expert-system rules for shallow and deep reasoning, supplying expert knowledge that helps users obtain the required probabilistic load spectra.
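
    The coupling of a plain data store with expert-system rules can be caricatured in a few lines: records are retrieved by ordinary filtering, while if-then rules expand the query with load types the user did not explicitly ask for. This toy is not LDEXPT; every component name, load type, and rule below is invented.

```python
# A minimal record store standing in for the load database.
records = [
    {"component": "turbine_blade", "load": "thermal",   "mean": 820.0},
    {"component": "turbine_blade", "load": "vibration", "mean": 45.0},
    {"component": "duct",          "load": "pressure",  "mean": 12.5},
]

# Each rule: (condition on the query, extra load types to include).
rules = [
    (lambda q: q["phase"] == "startup",           {"thermal"}),
    (lambda q: q["component"] == "turbine_blade", {"vibration"}),
]

def retrieve(query):
    """Shallow forward chaining: fire every matching rule to expand
    the requested load types, then filter the record store."""
    wanted = set(query.get("loads", []))
    for cond, extra in rules:
        if cond(query):
            wanted |= extra
    return [r for r in records
            if r["component"] == query["component"] and r["load"] in wanted]

hits = retrieve({"component": "turbine_blade", "phase": "startup"})
print([r["load"] for r in hits])
```

    The design point this illustrates is the one the abstract makes: the rules carry the expert knowledge about which loads matter for a given situation, while the database management layer keeps the load data itself in one consistent place.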

  17. Developing an Environmental Scanning System.

    ERIC Educational Resources Information Center

    Morrison, James L.

    A step-by-step approach is provided for developing an environmental scanning system for colleges and universities to assist them in planning for the future. The objectives of such a system are to detect social, scientific, economic, technical, and political interactions important to the organization; define potential threats and opportunities from…

  18. Developing a Carbon Observing System

    NASA Astrophysics Data System (ADS)

    Moore, B., III

    2015-12-01

    There is a clear need to better understand and predict future climate change, so that science can more confidently inform climate policy, including adaptation planning and future mitigation strategies. Understanding carbon cycle feedbacks, and the relationship between emissions (fossil and land use) and the resulting atmospheric carbon dioxide (CO2) and methane (CH4) concentrations in a changing climate, has been recognized as an important goal by the IPCC. The existing surface greenhouse gas observing networks provide accurate and precise measurements of background values, but they are not configured to target the extended, complex, and dynamic regions of the carbon budget. Space agencies around the globe are committed to CO2 and CH4 observations: GOSAT-1/2, OCO-2/3, MERLin, TanSat, and CarbonSat. In addition to these Low Earth Orbit (LEO) missions, a new mission in Geostationary Orbit (GEO), geoCARB, which would provide mapping-like measurements of carbon dioxide, methane, and carbon monoxide concentrations over major land areas, has recently been proposed to the NASA Venture Program. These pioneering missions do not provide the spatial and temporal coverage needed to answer the key carbon-climate questions at process-relevant scales, nor do they address the distribution and quantification of anthropogenic sources at urban scales. They do demonstrate, however, that a well-planned future system of systems integrating space-based LEO and GEO missions with extensive in situ observations could provide the accuracy, spatial resolution, and coverage needed to address critical open issues in the carbon-climate system. Dr. Diana Wickland devoted enormous energy to developing a comprehensive approach to understanding the global carbon cycle; she understood well that an integrated, coordinated, international approach is needed. This shines through in her recent contribution in co-chairing the team that produced the "CEOS Strategy for Carbon Observations from Space."
A NASA-funded community

  19. Aging Processes and the Development of Osteoarthritis

    PubMed Central

    Loeser, Richard F.

    2013-01-01

    Purpose of review Aging is a primary risk factor for the development of osteoarthritis (OA), and understanding how aging processes contribute to the development of OA is an important area of active research. The most recent literature in this area was reviewed in order to update investigators on the status of the field. Recent findings The field is beginning to move beyond a cartilage focus to include other joint tissues relevant to OA, such as ligaments, meniscus, and bone. Synovitis also appears to play a role in OA but has not been a focus of aging studies. Studies in small animals, including mice and rats, demonstrate age-related changes that can contribute to OA and show that animal age is a key factor to consider in interpreting the results of studies using surgically induced models of OA. There is accumulating evidence that cellular processes such as damage-induced cell senescence contribute to OA, and a growing body of literature addresses the role of epigenetic regulation of gene expression in aging and OA. Summary Not all OA is due to aging processes in joint tissues, but the age-related changes being discovered certainly could play a major contributing role. PMID:23080227

  20. Systems, Development, and Early Intervention.

    ERIC Educational Resources Information Center

    Sameroff, Arnold J.

    1992-01-01

    This commentary on the study reported in this monograph focuses on three topics raised by the study: (1) social systems, or individuals in the context of institutions; (2) the study of development through the use of disabled populations as experiments in human growth; and (3) the ability of intervention programs to manipulate development. (BC)