Science.gov

Sample records for processing system development

  1. Spaceport Processing System Development Lab

    NASA Technical Reports Server (NTRS)

    Dorsey, Michael

    2013-01-01

    The Spaceport Processing System Development Lab (SPSDL), developed and maintained by the Systems Hardware and Engineering Branch (NE-C4), is a development lab with its own private/restricted networks. A private/restricted network is a network with restricted or no communication with other networks. This allows users from different groups to work on their own projects in their own configured environments without interfering with others using the lab's resources. Because the lab's networks are configured so that they cannot communicate with one another, the way a user configures software, an operating system, or equipment does not interfere with or carry over to any of the other networks in the lab. The SPSDL is available to any project at KSC that needs a lab environment. My job in the SPSDL was to help maintain the lab and keep it accessible for users. This included, but was not limited to, making sure the computers in the lab were running properly and patched with updated hardware and software. I also assisted users who had issues in utilizing the lab's resources, which could include helping to configure a restricted network for their own environment. All of this was to ensure workers could use the SPSDL for their projects without difficulty, which would in turn benefit work done throughout KSC. When I was not working in the SPSDL, I helped other coworkers with smaller tasks, including the proper disposal of, moving of, or search for essential equipment. During free time, I also used NASA's resources to increase my knowledge and skills in a variety of subjects related to my computer engineering major, particularly UNIX, networking, and embedded systems.

  2. An Instructional Systems Development Process.

    ERIC Educational Resources Information Center

    Campbell, Clifton P.

    Instructional systems development (ISD) is a systems approach to curriculum development and instructional delivery. It is oriented toward occupational needs, with an emphasis on what students must learn to perform specific tasks, what facilities best provide a setting for the necessary learning, and what instructional methods and media…

  3. Teaching Information Systems Development via Process Variants

    ERIC Educational Resources Information Center

    Tan, Wee-Kek; Tan, Chuan-Hoo

    2010-01-01

    Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. It is therefore necessary for educators to impart to students this crucial skill. However, Situational Method Engineering (SME) is an inherently complex process that…

  4. Digital processing system for developing countries

    NASA Technical Reports Server (NTRS)

    Nanayakkara, C.; Wagner, H.

    1977-01-01

    An effort was undertaken to perform simple digital processing tasks using pre-existing general purpose digital computers. An experimental software package, LIGMALS, was obtained and modified for this purpose. The resulting software permits basic processing tasks to be performed including level slicing, gray mapping and ratio processing. The experience gained in this project indicates a possible direction which may be used by other developing countries to obtain digital processing capabilities.
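
    The operations named above are standard point operations on a digital image. The sketch below gives generic Python/NumPy versions of level slicing, gray mapping, and ratio processing; it is illustrative only and is not the LIGMALS code, whose implementation is not described here, and the thresholds and lookup table are arbitrary examples.

      # Generic NumPy versions of the three basic operations named in the abstract.
      import numpy as np

      def level_slice(band, lo, hi):
          """Binary mask of pixels whose values fall within [lo, hi]."""
          return ((band >= lo) & (band <= hi)).astype(np.uint8)

      def gray_map(band, lut):
          """Remap pixel values through a 256-entry lookup table (e.g., a contrast stretch)."""
          return lut[np.clip(band, 0, 255).astype(np.uint8)]

      def ratio(band_a, band_b, scale=128.0, eps=1e-6):
          """Band ratio scaled into an 8-bit display range."""
          r = band_a.astype(float) / (band_b.astype(float) + eps)
          return np.clip(r * scale, 0, 255).astype(np.uint8)

      band = np.arange(16, dtype=np.uint8).reshape(4, 4) * 16   # tiny synthetic image
      mask = level_slice(band, 64, 160)
      stretched = gray_map(band, np.linspace(0, 255, 256).astype(np.uint8))
      print(mask, stretched, ratio(band, band + 1), sep="\n")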

  5. System Development by Process Integrated Knowledge Management

    NASA Astrophysics Data System (ADS)

    Stoll, Margareth; Laner, Dietmar

    Due to globalization and ever shorter change cycles, organizations improve their products, services, technologies, IT, and organization increasingly quickly according to customer requirements, optimize their efficiency and effectiveness, and reduce costs. Thus the largest potential lies in continual improvement and in the management of information, data, and knowledge. For a long time, organizations developed many separate and frequently independent IT applications. In recent years these have been integrated through interfaces and, increasingly, through common databases. In large enterprises or in public administration, IT must operate many different applications, which requires considerable personnel and cost. Many organizations improve their IT starting from the processes as they are currently lived, using new technologies, but do not ask how they can use technology to support new processes.

  6. Spacelab Data Processing Facility (SLDPF) quality assurance expert systems development

    NASA Technical Reports Server (NTRS)

    Basile, Lisa R.; Kelly, Angelita C.

    1987-01-01

    The Spacelab Data Processing Facility (SLDPF) is an integral part of the Space Shuttle data network for missions that involve attached scientific payloads. Expert system prototypes were developed to aid in the performance of the quality assurance function of the Spacelab and/or Attached Shuttle Payloads processed telemetry data. The Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS), two expert systems, were developed to determine their feasibility and potential in the quality assurance of processed telemetry data. The capabilities and performance of these systems are discussed.

  7. The Systems Engineering Process for Human Support Technology Development

    NASA Technical Reports Server (NTRS)

    Jones, Harry

    2005-01-01

    Systems engineering is designing and optimizing systems. This paper reviews the systems engineering process and indicates how it can be applied in the development of advanced human support systems. Systems engineering develops the performance requirements, subsystem specifications, and detailed designs needed to construct a desired system. Systems design is difficult, requiring both art and science and balancing human and technical considerations. The essential systems engineering activity is trading off and compromising between competing objectives such as performance and cost, schedule and risk. Systems engineering is not a complete independent process. It usually supports a system development project. This review emphasizes the NASA project management process as described in NASA Procedural Requirement (NPR) 7120.5B. The process is a top down phased approach that includes the most fundamental activities of systems engineering - requirements definition, systems analysis, and design. NPR 7120.5B also requires projects to perform the engineering analyses needed to ensure that the system will operate correctly with regard to reliability, safety, risk, cost, and human factors. We review the system development project process, the standard systems engineering design methodology, and some of the specialized systems analysis techniques. We will discuss how they could apply to advanced human support systems development. The purpose of advanced systems development is not directly to supply human space flight hardware, but rather to provide superior candidate systems that will be selected for implementation by future missions. The most direct application of systems engineering is in guiding the development of prototype and flight experiment hardware. However, anticipatory systems engineering of possible future flight systems would be useful in identifying the most promising development projects.

  8. Development of the Diagnostic Expert System for Tea Processing

    NASA Astrophysics Data System (ADS)

    Yoshitomi, Hitoshi; Yamaguchi, Yuichi

    A diagnostic expert system for tea processing, which can infer the cause of defects in processed tea, was developed to contribute to the improvement of tea processing. The system, which consists of several programs, can be used over the Internet. The inference engine at the core of the system adopts the production-system approach widely used in artificial intelligence and is coded in Prolog, an artificial-intelligence-oriented language. At present, 176 inference rules have been registered in the system, and its diagnoses will improve as more rules are added.
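
    As a rough illustration of the production-system style of inference described above, the sketch below implements simple forward chaining in Python. The rules are invented placeholders, not any of the 176 actual tea-processing rules, and the original system is written in Prolog rather than Python.

      # Minimal forward-chaining production system; rules are illustrative only.
      RULES = [
          # (set of required facts, conclusion)
          ({"smoky odor", "high drying temperature"}, "leaves were likely over-fired"),
          ({"dull color", "long withering time"}, "withering step was likely too long"),
      ]

      def infer(facts):
          """Fire rules whose conditions hold until no new conclusions are added."""
          facts = set(facts)
          changed = True
          while changed:
              changed = False
              for conditions, conclusion in RULES:
                  if conditions <= facts and conclusion not in facts:
                      facts.add(conclusion)
                      changed = True
          return facts

      print(infer({"smoky odor", "high drying temperature"}))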

  9. Vulnerable periods and processes during central nervous system development.

    PubMed Central

    Rodier, P M

    1994-01-01

    The developing central nervous system (CNS) is the organ system most frequently observed to exhibit congenital abnormalities. While the developing CNS lacks a blood brain barrier, the characteristics of known teratogens indicate that differential doses to the developing vs mature brain are not the major factor in differential sensitivity. Instead, most agents seem to act on processes that occur only during development. Thus, it appears that the susceptibility of the developing brain compared to the mature one depends to a great extent on the presence of processes sensitive to disruption. Yet cell proliferation, migration, and differentiation characterize many other developing organs, so the difference between CNS and other organs must depend on other properties of the developing CNS. The most important of these is probably the fact that nervous system development takes much longer than development of other organs, making it subject to injury over a longer period. PMID:7925182

  10. Spacelab Data Processing Facility (SLDPF) quality assurance expert systems development

    NASA Technical Reports Server (NTRS)

    Kelly, Angelita C.; Basile, Lisa; Ames, Troy; Watson, Janice; Dallam, William

    1987-01-01

    Spacelab Data Processing Facility (SLDPF) expert system prototypes have been developed to assist in the quality assurance of Spacelab and/or Attached Shuttle Payload (ASP) processed telemetry data. SLDPF functions include the capturing, quality monitoring, processing, accounting, and forwarding of mission data to various user facilities. Prototypes for the two SLDPF functional elements, the Spacelab Output Processing System and the Spacelab Input Processing Element, are described. The prototypes have produced beneficial results including an increase in analyst productivity, a decrease in the burden of tedious analyses, the consistent evaluation of data, and the providing of concise historical records.

  11. Spacelab Data Processing Facility (SLDPF) quality assurance expert systems development

    NASA Technical Reports Server (NTRS)

    Kelly, Angelita C.; Basile, Lisa; Ames, Troy; Watson, Janice; Dallam, William

    1987-01-01

    Spacelab Data Processing Facility (SLDPF) expert system prototypes were developed to assist in the quality assurance of Spacelab and/or Attached Shuttle Payload (ASP) processed telemetry data. The SLDPF functions include the capturing, quality monitoring, processing, accounting, and forwarding of mission data to various user facilities. Prototypes for the two SLDPF functional elements, the Spacelab Output Processing System and the Spacelab Input Processing Element, are described. The prototypes have produced beneficial results including an increase in analyst productivity, a decrease in the burden of tedious analyses, the consistent evaluation of data, and the providing of concise historical records.

  12. Mechanical Design Support System Based on Thinking Process Development Diagram

    NASA Astrophysics Data System (ADS)

    Mase, Hisao; Kinukawa, Hiroshi; Morii, Hiroshi; Nakao, Masayuki; Hatamura, Yotaro

    This paper describes a system that directly supports the design process in the mechanical domain. The system is based on a thinking process development diagram that distinguishes between requirements, tasks, solutions, and implementation, enabling designers to expand and deepen their design thinking. The system provides five main functions that designers require in each phase of the proposed design process: (1) thinking process description support, which enables designers to record their thoughts; (2) creativity support through term association with thesauri; (3) timely display of design knowledge, including know-how obtained from earlier failures, general design theories, standard-parts data, and past designs; (4) design problem-solving support using 46 kinds of thinking operations; and (5) technology transfer support that accumulates not only design conclusions but also the design process itself. Though the system is applied to mechanical engineering as the first target domain, it can be readily extended to many other domains such as architecture and electrical engineering.

  13. Metal containing material processing on coater/developer system

    NASA Astrophysics Data System (ADS)

    Kawakami, Shinichiro; Mizunoura, Hiroshi; Matsunaga, Koichi; Hontake, Koichi; Nakamura, Hiroshi; Shimura, Satoru; Enomoto, Masashi

    2016-03-01

    Challenges of processing metal-containing materials need to be addressed before this technology can be applied. The behavior of metal-containing materials in coater/developer processing, including the coating process, the develop process, and tool metal contamination, is studied using the CLEAN TRACK(TM) LITHIUS Pro(TM) Z (Tokyo Electron Limited). Through this work, coating uniformity and coating film defectivity were studied, and the performance of metal-containing materials was comparable to that of conventional materials. In particular, a new dispense system (NDS) demonstrated up to an 80% reduction in coating defects for metal-containing materials. As for processed-wafer metal contamination, coated-wafer metal contamination of less than 1.0E10 atoms/cm2 was achieved with three materials, and after-develop metal contamination of less than 1.0E10 atoms/cm2 was achieved with two materials. Furthermore, the metal defect study showed that metal residues and metal contamination were reduced by developer rinse optimization.

  14. Guideline Development Process in a Public Workers' Compensation System.

    PubMed

    Javaher, Simone P

    2015-08-01

    Washington state's public workers' compensation system has had a formal process for developing and implementing evidence-based clinical practice guidelines since 2007. Collaborating with the Industrial Insurance Medical Advisory Committee and clinicians from the medical community, the Office of the Medical Director has provided leadership and staff support necessary to develop guidelines that have improved outcomes and reduced the number of potentially harmful procedures. Guidelines are selected according to a prioritization schema and follow a development process consistent with that of the national Institute of Medicine. Evaluation criteria are also applied. Guidelines continue to be developed to provide clinical recommendations for optimizing care and reducing risk of harm.

  15. Process approach in developing or improvement of student information systems

    NASA Astrophysics Data System (ADS)

    Jaskowska, Małgorzata

    2015-02-01

    The aim of the research described in the article was to evaluate the usefulness of a university information system prior to its reorganization. The study was conducted among representatives of all stakeholder groups of system users: candidates, students, and university authorities. The need expressed by system users in the study, a change of approach in the system's construction from purely informational to procedural, is consistent with the current process approach in systems design, reinforced by service-oriented architecture (SOA). This thread was developed through literature research and an analysis of best practices for student information systems. As a result, the processes whose implementation may assist the university system were selected and described. The research results can be used by system designers to improve the system.

  16. The Development of Sun-Tracking System Using Image Processing

    PubMed Central

    Lee, Cheng-Dar; Huang, Hong-Cheng; Yeh, Hong-Yih

    2013-01-01

    This article presents the development of an image-based Sun position sensor and an algorithm for aiming at the Sun precisely using image processing. In past years, four-quadrant light sensors and bar-shadow photo sensors were used to detect the Sun's position; however, neither can maintain high accuracy under low-irradiation conditions. An image-based Sun position sensor with image processing can address this drawback. To verify the performance of the Sun-tracking system, which includes an image-based Sun position sensor and a tracking controller with an embedded image processing algorithm, we established a Sun image tracking platform and performed testing in the laboratory. The results show that the proposed Sun-tracking system can overcome the problem of unstable tracking in cloudy weather and achieve a tracking accuracy of 0.04°. PMID:23615582
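
    The core image-processing step of such a sensor can be reduced to locating the Sun's centroid in a camera frame. The Python/NumPy sketch below shows one generic way to do this; the threshold and the pixel-to-degree scale are invented example values, not parameters from the paper.

      # Intensity-weighted centroid of the Sun in a grayscale frame (illustrative only).
      import numpy as np

      def sun_centroid(frame, threshold=200):
          """Return the (row, col) centroid of pixels brighter than `threshold`, or None."""
          mask = frame >= threshold
          if not mask.any():
              return None                      # Sun not detected (e.g., heavy cloud)
          rows, cols = np.nonzero(mask)
          weights = frame[rows, cols].astype(float)
          return (np.average(rows, weights=weights),
                  np.average(cols, weights=weights))

      def pointing_error_deg(centroid, frame_shape, deg_per_pixel=0.01):
          """Offset of the centroid from the image center, converted to degrees."""
          cy, cx = (frame_shape[0] - 1) / 2.0, (frame_shape[1] - 1) / 2.0
          return ((centroid[0] - cy) * deg_per_pixel,
                  (centroid[1] - cx) * deg_per_pixel)

      frame = np.zeros((120, 160)); frame[40:44, 90:94] = 255.0
      print(pointing_error_deg(sun_centroid(frame), frame.shape))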

  17. Development of KIAPS Observation Processing Package for Data Assimilation System

    NASA Astrophysics Data System (ADS)

    Kang, Jeon-Ho; Chun, Hyoung-Wook; Lee, Sihye; Han, Hyun-Jun; Ha, Su-Jin

    2015-04-01

    The Korea Institute of Atmospheric Prediction Systems (KIAPS) was founded in 2011 by the Korea Meteorological Administration (KMA) to develop Korea's own global Numerical Weather Prediction (NWP) system as a nine-year (2011-2019) project. The data assimilation team at KIAPS has been developing the observation processing system (KIAPS Package for Observation Processing: KPOP) to provide optimal observations to the data assimilation system for the KIAPS Global Model (KIAPS Integrated Model - Spectral Element method based on HOMME: KIM-SH). Currently, KPOP is capable of processing satellite radiance data (AMSU-A, IASI), GPS Radio Occultation (GPS-RO), aircraft observations (AMDAR, AIREP, etc.), and synoptic observations (SONDE and SURFACE). KPOP adopted Radiative Transfer for TOVS version 10 (RTTOV_v10) to obtain the brightness temperature (TB) for each channel at the top of the atmosphere (TOA), and the Radio Occultation Processing Package (ROPP) one-dimensional forward module to obtain the bending angle (BA) at each tangent point. The observation data are obtained from the KMA in BUFR format and converted to ODB, which is used for operational data assimilation and monitoring at the KMA. The Unified Model (UM), Community Atmosphere - Spectral Element (CAM-SE), and KIM-SH model outputs are used for bias correction (BC) and quality control (QC) of the observations. KPOP provides radiance and RO data to the Local Ensemble Transform Kalman Filter (LETKF), and provides SONDE, SURFACE, and AIRCRAFT data to Three-Dimensional Variational Assimilation (3DVAR). We expect that all observation types processed in KPOP will soon be usable with both data assimilation methods. Preliminary results for each observation type will be introduced along with the current development status of KPOP.
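
    One representative observation-processing step is a gross-error (background) check, in which observations that depart too far from the model first guess are rejected. The sketch below is a generic Python illustration of that idea only; the threshold and error values are invented and are not the actual KPOP quality-control criteria.

      # Generic gross-error check against a model background (illustrative only).
      import numpy as np

      def gross_error_check(obs, background, obs_error, bg_error, k=3.0):
          """Return a boolean mask of observations accepted by the check."""
          innovation = obs - background                          # O - B departures
          tolerance = k * np.sqrt(obs_error**2 + bg_error**2)    # combined error estimate
          return np.abs(innovation) <= tolerance

      obs = np.array([287.1, 290.4, 310.2])    # e.g., observed temperatures (K)
      bkg = np.array([286.8, 289.9, 288.0])    # model first-guess values (K)
      print(gross_error_check(obs, bkg, obs_error=1.0, bg_error=1.5))
      # -> [ True  True False]  (third value rejected as a gross error)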

  1. Development and fabrication of a solar cell junction processing system

    NASA Technical Reports Server (NTRS)

    1984-01-01

    A processing system capable of producing solar cell junctions by ion implantation followed by pulsed electron beam annealing was developed and constructed. The machine was to be capable of processing 4-inch-diameter single-crystal wafers at a rate of 10^7 wafers per year. A microcomputer-controlled pulsed electron beam annealer with a vacuum-interlocked wafer transport system was designed, built, and demonstrated to produce solar cell junctions on 4-inch wafers with an AM1 efficiency of 12%. Experiments showed that a non-mass-analyzed (NMA) ion beam could implant 10 keV phosphorus dopant to form solar cell junctions equivalent to mass-analyzed implants. An NMA ion implanter compatible with the pulsed electron beam annealer and wafer transport system was designed in detail but was not built because of program termination.

  2. Development and fabrication of a solar cell junction processing system

    NASA Technical Reports Server (NTRS)

    Bunker, S.

    1981-01-01

    A solar cell junction processing system was developed and fabricated. A pulsed electron beam for the four inch wafers is being assembled and tested, wafers were successfully pulsed, and solar cells fabricated. Assembly of the transport locks is completed. The transport was operated successfully but not with sufficient reproducibility. An experiment test facility to examine potential scaleup problems associated with the proposed ion implanter design was constructed and operated. Cells were implanted and found to have efficiency identical to the normal Spire implant process.

  3. Rapid prototyping in the development of image processing systems

    NASA Astrophysics Data System (ADS)

    von der Fecht, Arno; Kelm, Claus Thomas

    2004-08-01

    This contribution presents a rapid prototyping approach for the real-time demonstration of image processing algorithms. As an example, EADS/LFK has developed a basic IR target tracking system implementing this approach. Traditionally, in research and industry, time-independent simulation of image processing algorithms is performed on a host computer. This method is good for demonstrating the algorithms' capabilities. Rarely is a time-dependent simulation, or a real-time demonstration on a target platform, performed to prove the real-time capabilities. In 1D signal processing applications, time-dependent simulation and real-time demonstration have already been used for quite a while. For time-dependent simulation, Simulink from The MathWorks has become established as an industry standard. Combined with The MathWorks' Real-Time Workshop, the simulation model can be transferred to a real-time target processor; the executable is generated automatically by the Real-Time Workshop directly from the simulation model. In 2D signal processing applications such as image processing, The MathWorks' Matlab is commonly used for time-independent simulation. To achieve time-dependent simulation and real-time demonstration capabilities, the algorithms can be transferred to Simulink, which in fact runs on top of Matlab. Additionally, to increase performance, Simulink models or parts of them can be transferred to Xilinx FPGAs using Xilinx' System Generator. With a single model and the automatic workflow, both time-dependent simulation and real-time demonstration are covered, leading to an easy and flexible rapid prototyping approach. EADS/LFK is going to use this approach for a wider spectrum of IR image processing applications such as automatic target recognition, image-based navigation, and imaging laser radar target recognition.

  4. Market development directory for solar industrial process heat systems

    SciTech Connect

    1980-02-01

    The purpose of this directory is to provide a basis for market development activities through a location listing of key trade associations, trade periodicals, and key firms for three target groups. Potential industrial users and potential IPH system designers were identified as the prime targets for market development activities, and the bulk of the directory is a listing of these two groups. The third group, solar IPH equipment manufacturers, was included to provide an information source for potential industrial users and potential IPH system designers. Trade associations and their publications are listed for selected four-digit Standard Industrial Code (SIC) industries. Since industries requiring relatively low-temperature process heat will probably comprise most of the near-term market for solar IPH systems, the 80 SICs included in this chapter have process temperature requirements of less than 350°F. Some key statistics and a location list of the largest plants (by number of employees) in each state are included for 15 of the 80 SICs. Architectural/engineering and consulting firms known to have solar experience are listed. Professional associations and periodicals to which information on solar IPH systems may be directed are also included. Solar equipment manufacturers and their associations are listed. The listing is based on the SERI Solar Energy Information Data Base (SEIDB).

  5. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.

  6. Development of Data Processing Software for NBI Spectroscopic Analysis System

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodan; Hu, Chundong; Sheng, Peng; Zhao, Yuanzhe; Wu, Deyun; Cui, Qinglong

    2015-04-01

    A set of data processing software for processing NBI spectroscopic data is presented in this paper. For better and more scientific management and querying of these data, they are managed uniformly by the NBI data server. The data processing software offers functions for uploading beam spectral original and analytic data to the data server manually and automatically, querying and downloading all NBI data, and handling local LZO data. The software is composed of a server program and a client program. The server software is programmed in C/C++ under a CentOS development environment. The client software is developed on a VC 6.0 platform, which offers convenient operational human interfaces. The network communications between the server and the client are based on TCP. With the help of this software, the NBI spectroscopic analysis system realizes unattended automatic operation, and the clear interface also makes it much more convenient to provide beam intensity distribution data and beam power data to operators for operational decision-making. Supported by the National Natural Science Foundation of China (No. 11075183) and the Chinese Academy of Sciences Knowledge Innovation
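
    The client/server exchange described above follows a conventional TCP request/acknowledge pattern. The Python sketch below illustrates that pattern only; the actual software is a C/C++ server with a VC 6.0 client, and the payload format here is invented.

      # Minimal TCP upload/acknowledge exchange (illustrative only).
      import socket, threading

      srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
      srv.bind(("127.0.0.1", 0))            # let the OS pick a free port
      srv.listen(1)
      port = srv.getsockname()[1]

      def handle_one():
          conn, _ = srv.accept()
          with conn:
              record = conn.recv(4096)      # receive one spectral record
              conn.sendall(b"ACK")          # acknowledge receipt

      t = threading.Thread(target=handle_one)
      t.start()

      with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
          cli.connect(("127.0.0.1", port))
          cli.sendall(b"shot=12345;channel=Halpha;...")   # placeholder payload
          print(cli.recv(16))               # b'ACK'
      t.join()
      srv.close()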

  7. On the Hilbert-Huang Transform Data Processing System Development

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Flatley, Thomas P.; Huang, Norden E.; Cornwell, Evette; Smith, Darell

    2003-01-01

    One of the main heritage tools used in scientific and engineering data spectrum analysis is the Fourier Integral Transform and its high-performance digital equivalent, the Fast Fourier Transform (FFT). The Fourier view of nonlinear mechanics that has existed for a long time, and the associated FFT (a fairly recent development), carry strong a priori assumptions about the source data, such as linearity and stationarity. Natural phenomena measurements are essentially nonlinear and nonstationary. A very recent development at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC), known as the Hilbert-Huang Transform (HHT), proposes a novel approach to the solution of the nonlinear class of spectrum analysis problems. Using the Empirical Mode Decomposition (EMD) followed by the Hilbert Transform (HT) of the empirical decomposition data, the HHT allows spectrum analysis of nonlinear and nonstationary data through a posteriori data processing based on the EMD algorithm. This results in a non-constrained decomposition of a source real-valued data vector into a finite set of Intrinsic Mode Functions (IMF) that can be further analyzed for spectrum interpretation by the classical Hilbert Transform. This paper describes phase one of the development of a new engineering tool, the HHT Data Processing System (HHTDPS). The HHTDPS allows applying the HHT to a data vector in a fashion similar to the heritage FFT. It is a generic, low cost, high performance personal computer (PC) based system that implements the HHT computational algorithms in a user friendly, file driven environment. This paper also presents a quantitative analysis for a complex waveform data sample, a summary of technology commercialization efforts, and the lessons learned from this new technology development.
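
    For readers unfamiliar with the method, the sketch below shows the Hilbert step of the HHT applied to a single intrinsic mode function (IMF), using a synthetic chirp to stand in for the output of the EMD sifting step, which is not shown. It is a minimal Python/SciPy illustration, not the HHTDPS implementation.

      # Hilbert spectral step for one IMF (illustrative only).
      import numpy as np
      from scipy.signal import hilbert

      fs = 1000.0                                    # sample rate (Hz)
      t = np.arange(0, 1, 1 / fs)
      imf = np.sin(2 * np.pi * (5 * t + 10 * t**2))  # chirp: frequency 5 -> 25 Hz

      analytic = hilbert(imf)                        # analytic signal x + i*H[x]
      amplitude = np.abs(analytic)                   # instantaneous amplitude envelope
      phase = np.unwrap(np.angle(analytic))
      inst_freq = np.diff(phase) / (2 * np.pi) * fs  # instantaneous frequency (Hz)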

  8. Development of techniques for processing metal-metal oxide systems

    NASA Technical Reports Server (NTRS)

    Johnson, P. C.

    1976-01-01

    Techniques for producing model metal-metal oxide systems for the purpose of evaluating the results of processing such systems in the low-gravity environment afforded by a drop tower facility are described. Because of the lack of success in producing suitable materials samples and techniques for processing in the 3.5 seconds available, the program was discontinued.

  9. System Engineering Processes at Kennedy Space Center for Development of the SLS and Orion Launch Systems

    NASA Technical Reports Server (NTRS)

    Schafer, Eric J.

    2012-01-01

    There are over 40 subsystems being developed for the future SLS and Orion Launch Systems at Kennedy Space Center. These subsystems, developed by the Kennedy Space Center Engineering Directorate, follow a comprehensive design process that requires several different product deliverables during each phase of each subsystem. This paper describes this process and gives an example of where the process has been applied.

  10. Development of an instructional expert system for hole drilling processes

    NASA Technical Reports Server (NTRS)

    Al-Mutawa, Souhaila; Srinivas, Vijay; Moon, Young Bai

    1990-01-01

    An expert system which captures the expertise of workshop technicians in the drilling domain was developed. The expert system is aimed at novice technicians who know how to operate the machines but have not acquired the decision making skills that are gained with experience. This paper describes the domain background and the stages of development of the expert system.

  11. Development of an automated ammunition processing system for battlefield use

    SciTech Connect

    Speaks, D.M.; Chesser, J.B.; Lloyd, P.D.; Miller, E.D.; Ray, T.L.; Weil, B.S.

    1995-03-01

    The Future Armored Resupply Vehicle (FARV) will be the companion ammunition resupply vehicle to the Advanced Field Artillery System (AFAS). These systems are currently being investigated by the US Army for future acquisition. The FARV will sustain the AFAS with ammunition and fuel and will significantly increase capabilities over current resupply vehicles. Currently, ammunition is transferred to field artillery almost entirely by hand. The level of automation to be included in the FARV is still under consideration. At the request of the US Army's Project Manager, AFAS/FARV, Oak Ridge National Laboratory (ORNL) identified and evaluated various concepts for the automated upload, processing, storage, and delivery equipment for the FARV. ORNL, working with the sponsor, established basic requirements and assumptions for concept development and the methodology for concept selection. A preliminary concept has been selected, and the associated critical technologies have been identified. ORNL has provided technology demonstrations of many of these critical technologies. A technology demonstrator which incorporates all individual components into a total process demonstration is planned for late FY 1995.

  12. Multi-kilowatt modularized spacecraft power processing system development

    NASA Technical Reports Server (NTRS)

    Andrews, R. E.; Hayden, J. H.; Hedges, R. T.; Rehmann, D. W.

    1975-01-01

    A review of existing information pertaining to spacecraft power processing systems and equipment was accomplished with a view towards applicability to the modularization of multi-kilowatt power processors. Power requirements for future spacecraft were determined from the NASA mission model-shuttle systems payload data study which provided the limits for modular power equipment capabilities. Three power processing systems were compared to evaluation criteria to select the system best suited for modularity. The shunt regulated direct energy transfer system was selected by this analysis for a conceptual design effort which produced equipment specifications, schematics, envelope drawings, and power module configurations.

  13. Review of Exploration Systems Development (ESD) Integrated Hazard Development Process. Appendices; Volume 2

    NASA Technical Reports Server (NTRS)

    Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.

    2015-01-01

    The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g. missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.

  14. Review of Exploration Systems Development (ESD) Integrated Hazard Development Process. Volume 1; Appendices

    NASA Technical Reports Server (NTRS)

    Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.

    2015-01-01

    The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g., missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.

  15. Risk communication strategy development using the aerospace systems engineering process

    NASA Technical Reports Server (NTRS)

    Dawson, S.; Sklar, M.

    2004-01-01

    This paper explains the goals and challenges of NASA's risk communication efforts and how the Aerospace Systems Engineering Process (ASEP) was used to map the risk communication strategy used at the Jet Propulsion Laboratory to achieve these goals.

  16. Systems Engineering of Unmanned DoD Systems: Following the Joint Capabilities Integration and Development System/Defense Acquisition System Process to Develop an Unmanned Ground Vehicle System

    DTIC Science & Technology

    2015-12-01

    and Development System/Defense Acquisition System (JCIDS/DAS) process to gain insight into JCIDS/DAS as it relates to unmanned robotics systems...DAS regulations to tailor an SE approach in designing and building the TECHMAN robot, starting with the mission needs and requirements followed by...into the JCIDS/DAS process with regard to procurement of robotics systems. Subject terms: JCIDS, DAS, unmanned systems, unmanned ground vehicle

  17. Tracker: Image-Processing and Object-Tracking System Developed

    NASA Technical Reports Server (NTRS)

    Klimek, Robert B.; Wright, Theodore W.

    1999-01-01

    Tracker is an object-tracking and image-processing program designed and developed at the NASA Lewis Research Center to help with the analysis of images generated by microgravity combustion and fluid physics experiments. Experiments are often recorded on film or videotape for analysis later. Tracker automates the process of examining each frame of the recorded experiment, performing image-processing operations to bring out the desired detail, and recording the positions of the objects of interest. It can load sequences of images from disk files or acquire images (via a frame grabber) from film transports, videotape, laser disks, or a live camera. Tracker controls the image source to automatically advance to the next frame. It can employ a large array of image-processing operations to enhance the detail of the acquired images and can analyze an arbitrarily large number of objects simultaneously. Several different tracking algorithms are available, including conventional threshold and correlation-based techniques, and more esoteric procedures such as "snake" tracking and automated recognition of character data in the image. The Tracker software was written to be operated by researchers, thus every attempt was made to make the software as user friendly and self-explanatory as possible. Tracker is used by most of the microgravity combustion and fluid physics experiments performed by Lewis, and by visiting researchers. This includes experiments performed on the space shuttles, Mir, sounding rockets, zero-g research airplanes, drop towers, and ground-based laboratories. This software automates the analysis of the flame's or liquid's physical parameters such as position, velocity, acceleration, size, shape, intensity characteristics, color, and centroid, as well as a number of other measurements. It can perform these operations on multiple objects simultaneously. Another key feature of Tracker is that it performs optical character recognition (OCR). This feature is useful in
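
    The conventional threshold-based tracking mentioned above amounts to segmenting the bright object in each frame and logging its centroid and geometry over time. The Python/NumPy sketch below illustrates that idea generically; it is not the Tracker code, and the synthetic frames and threshold are invented.

      # Threshold-and-centroid tracking across frames (illustrative only).
      import numpy as np

      def track_object(frames, threshold):
          """Yield per-frame measurements for the pixels at or above `threshold`."""
          for i, frame in enumerate(frames):
              ys, xs = np.nonzero(frame >= threshold)
              if ys.size == 0:
                  yield {"frame": i, "found": False}
                  continue
              yield {"frame": i, "found": True,
                     "centroid": (ys.mean(), xs.mean()),
                     "area": int(ys.size),
                     "bbox": (int(ys.min()), int(xs.min()), int(ys.max()), int(xs.max()))}

      # Two tiny synthetic frames with a bright blob moving one pixel to the right
      f0 = np.zeros((8, 8)); f0[3:5, 2:4] = 255
      f1 = np.zeros((8, 8)); f1[3:5, 3:5] = 255
      for m in track_object([f0, f1], threshold=128):
          print(m)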

  18. System Engineering Processes at Kennedy Space Center for Development of SLS and Orion Launch Systems

    NASA Technical Reports Server (NTRS)

    Schafer, Eric; Stambolian, Damon; Henderson, Gena

    2013-01-01

    There are over 40 subsystems being developed for the future SLS and Orion Launch Systems at Kennedy Space Center. These subsystems are developed at the Kennedy Space Center Engineering Directorate, which follows a comprehensive design process that requires several different product deliverables during each phase of each subsystem. This presentation describes this process with examples of where the process has been applied.

  1. Process development

    NASA Technical Reports Server (NTRS)

    Bickler, D. B.

    1985-01-01

    An overview is given of seven process development activities presented at this session. Pulsed excimer laser processing of photovoltaic cells was presented, and a different pulsed excimer laser annealing process using a 50 W laser was described. Diffusion barrier research focused on lowering the chemical reactivity of amorphous thin films on silicon; in another effort, adherent and conductive films were successfully achieved. Other efforts were aimed at achieving a simultaneous front and back junction. Microwave-enhanced plasma deposition experiments were performed. An updated version of the Solar Array Manufacturing Industry Costing Standards (SAMICS) was presented, along with a life-cycle cost analysis of high-efficiency cells. The last presentation was on the evaluation of the ethyl vinyl acetate encapsulation system.

  2. A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The purpose of this program is to demonstrate the technical readiness of a cost-effective process sequence that has the potential for the production of flat-plate photovoltaic modules which meet the 1986 price goal of $0.70 or less per peak watt. Program efforts included: a preliminary design review; preliminary cell fabrication using the proposed process sequence; verification of sandblasting back cleanup; a study of resist parameters; evaluation of the pull strength of the proposed metallization; measurement of the contact resistance of electroless Ni contacts; optimization of process parameters; design of the MEPSDU module; identification and testing of insulator tapes; development of a lamination process sequence; identification of, discussions and demonstrations with, and visits to candidate equipment vendors; and evaluation of proposals for a tabbing and stringing machine.

  3. Developing a Mobile Application "Educational Process Remote Management System" on the Android Operating System

    ERIC Educational Resources Information Center

    Abildinova, Gulmira M.; Alzhanov, Aitugan K.; Ospanova, Nazira N.; Taybaldieva, Zhymatay; Baigojanova, Dametken S.; Pashovkin, Nikita O.

    2016-01-01

    Nowadays, when there is a need to introduce various innovations into the educational process, most efforts are aimed at simplifying the learning process. To that end, electronic textbooks, testing systems, and other software are being developed. Most of them are intended to run on personal computers, with limited mobility. Smart education is…

  4. A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Design work for a photovoltaic module fabricated using single-crystal silicon dendritic web sheet material resulted in the identification of a surface treatment for the module glass superstrate which improved module efficiencies. A final solar module environmental test, a simulated hailstone impact test, was conducted on full-size module superstrates to verify that the module's tempered glass superstrate can withstand specified hailstone impacts near the corners and edges of the module. Process sequence design work on the metallization process, selective liquid dopant investigation, dry processing, and antireflective/photoresist application technique tasks, and the optimum thickness for Ti/Pd, are discussed. A noncontact cleaning method for raw web cleaning was identified, and antireflective and photoresist coatings for the dendritic webs were selected. The design of a cell string conveyor, an interconnect feed system, and a rolling ultrasonic spot bonding head, and the identification of the optimal commercially available programmable control system, are also discussed. An economic analysis to assess cost goals of the process sequence is also given.

  5. DEVS Unified Process for Web-Centric Development and Testing of System of Systems

    DTIC Science & Technology

    2008-05-20

    gathering from the user. Further, methodologies have been developed to generate DEVS models from BPMN/BPEL-based and message-based requirement specifications... 3. BPMN/BPEL-based system specifications: Business Process Modeling Notation (BPMN) [bpm] or Business Process Execution Language (BPEL) provide a...information is stored in .wsdl and .bpel files for BPEL but in a proprietary format for BPMN. 4. DoDAF-based requirement specifications: Department of

  6. Enterprise and system of systems capability development life-cycle processes.

    SciTech Connect

    Beck, David Franklin

    2014-08-01

    This report and set of appendices are a collection of memoranda originally drafted circa 2007-2009 for the purpose of describing and detailing a models-based systems engineering approach for satisfying enterprise and system-of-systems life cycle process requirements. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. The main thrust of the material presents a rational exposé of a structured enterprise development life cycle that uses the scientific method as a framework, with further rigor added from adapting relevant portions of standard systems engineering processes. While the approach described invokes application of the Department of Defense Architectural Framework (DoDAF), it is suitable for use with other architectural description frameworks.

  7. Decreasing costs of ground data processing system development using a software product line

    NASA Technical Reports Server (NTRS)

    Chaffin, Brian

    2005-01-01

    In this paper, I describe software product lines and why a Ground Data Processing System should use one. I also describe how to develop a software product line, using examples from an imaginary Ground Data Processing System.

  8. Carbon Dioxide Reduction Post-Processing Sub-System Development

    NASA Technical Reports Server (NTRS)

    Abney, Morgan B.; Miller, Lee A.; Greenwood, Zachary; Barton, Katherine

    2012-01-01

    The state-of-the-art Carbon Dioxide (CO2) Reduction Assembly (CRA) on the International Space Station (ISS) facilitates the recovery of oxygen from metabolic CO2. The CRA utilizes the Sabatier process to produce water, with methane as a byproduct. The methane is currently vented overboard as a waste product. Because the CRA relies on hydrogen for oxygen recovery, the loss of methane ultimately results in a loss of oxygen. For missions beyond low Earth orbit, it will prove essential to maximize oxygen recovery. For this purpose, NASA is exploring an integrated post-processor system to recover hydrogen from CRA methane. The post-processor, called a Plasma Pyrolysis Assembly (PPA), partially pyrolyzes methane to recover hydrogen, with acetylene as a byproduct. In-flight operation of the post-processor will require a Methane Purification Assembly (MePA) and an Acetylene Separation Assembly (ASepA). Recent efforts have focused on the design, fabrication, and testing of these components. The results and conclusions of these efforts will be discussed as well as future plans.
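
    A back-of-the-envelope stoichiometry check shows why venting methane forfeits hydrogen, and hence oxygen, recovery. It assumes the idealized reactions with complete conversion (Sabatier: CO2 + 4 H2 -> CH4 + 2 H2O; plasma pyrolysis: 2 CH4 -> C2H2 + 3 H2), which the abstract implies but does not quantify:

      # Idealized hydrogen bookkeeping per mole of CO2 reduced (assumed stoichiometry).
      h2_into_sabatier_per_co2 = 4.0
      h2_locked_in_methane     = 2.0        # one CH4 carries 4 H atoms = 2 H2 equivalents
      h2_recovered_by_ppa      = 3.0 / 2.0  # 3 H2 recovered per 2 CH4 pyrolyzed

      print(h2_locked_in_methane / h2_into_sabatier_per_co2)  # 0.5: half the feed H2 leaves in CH4
      print(h2_recovered_by_ppa / h2_locked_in_methane)       # 0.75: PPA can reclaim up to 75% of that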

  9. A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    A cost effective process sequence and machinery for the production of flat plate photovoltaic modules are described. Cells were fabricated using the process sequence which was optimized, as was a lamination procedure. Insulator tapes and edge seal material were identified and tested. Encapsulation materials were evaluated.

  10. The Instructional Developer, Expert Systems, and the Front End Process.

    ERIC Educational Resources Information Center

    Dills, Charles R.; Romiszowski, Alexander

    This paper is intended to provide the instructional technologist already possessing some understanding of expert systems with some insight into two of the many steps involved in the design and production of such systems: knowledge acquisition and knowledge structuring or representation. It is also intended to help technologists to see how they…

  11. Systematic, Systemic and Motivating: The K-12 Career Development Process

    ERIC Educational Resources Information Center

    Snyder, Deborah; Jackson, Sherry

    2006-01-01

    In Butler County, Ohio, Butler Technology and Career Development Schools (Butler Tech) firmly believes that systematic delivery of career development theory and practice integrated with academic content standards will enable students to do all of the above. Because of this, Butler Tech's Career Initiatives division delivers a countywide career…

  12. A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Restructuring research objectives from a technical readiness demonstration program to an investigation of high risk, high payoff activities associated with producing photovoltaic modules using non-CZ sheet material is reported. Deletion of the module frame in favor of a frameless design, and modification in cell series parallel electrical interconnect configuration are reviewed. A baseline process sequence was identified for the fabrication of modules using the selected dendritic web sheet material, and economic evaluations of the sequence were completed.

  13. Development and fabrication of a solar cell junction processing system

    NASA Technical Reports Server (NTRS)

    Kiesling, R.

    1981-01-01

    The major component fabrication program was completed. Assembly and system testing of the pulsed electron beam annealing machine are described. The design program for the transport reached completion, and the detailed drawings were released for fabrication and procurement of the long lead time components.

  14. Development of GENOA Progressive Failure Parallel Processing Software Systems

    NASA Technical Reports Server (NTRS)

    Abdi, Frank; Minnetyan, Levon

    1999-01-01

    A capability consisting of software development and experimental techniques has been developed and is described. The capability is integrated into GENOA-PFA to model polymer matrix composite (PMC) structures. The capability considers the physics and mechanics of composite materials and structure by integration of a hierarchical multilevel macro-scale (lamina, laminate, and structure) and micro-scale (fiber, matrix, and interface) simulation analyses. The modeling involves (1) ply layering methodology utilizing FEM elements with through-the-thickness representation, (2) simulation of effects of material defects and conditions (e.g., voids, fiber waviness, and residual stress) on global static and cyclic fatigue strengths, (3) including material nonlinearities (by updating properties periodically) and geometrical nonlinearities (by Lagrangian updating), (4) simulating crack initiation and growth to failure under static, cyclic, creep, and impact loads, (5) progressive fracture analysis to determine durability and damage tolerance, (6) identifying the percent contribution of various possible composite failure modes involved in critical damage events, and (7) determining sensitivities of failure modes to design parameters (e.g., fiber volume fraction, ply thickness, fiber orientation, and adhesive-bond thickness). GENOA-PFA progressive failure analysis is now ready for use to investigate the effects on structural responses of PMC material degradation from damage induced by static, cyclic (fatigue), creep, and impact loading in 2D/3D PMC structures subjected to hygrothermal environments. Its use will significantly facilitate targeting design parameter changes that will be most effective in reducing the probability of a given failure mode occurring.

  15. The development process for the space shuttle primary avionics software system

    NASA Technical Reports Server (NTRS)

    Keller, T. W.

    1987-01-01

    Primary avionics software system; software development approach; user support and problem diagnosis; software releases and configuration; quality/productivity programs; and software development/production facilities are addressed. Also examined are the external evaluations of the IBM process.

  16. A process of material development towards teaching the subject of parabola using computer algebra systems

    NASA Astrophysics Data System (ADS)

    Ardıç, Mehmet Alper; Işleyen, Tevfik

    2017-04-01

    This study discusses a process of material development towards teaching the subject of the graphs of quadratic functions (parabola) by utilizing computer algebra systems. Additionally, the results obtained during and after the process of developing materials are summarized. The last section of the study provides recommendations for teachers and researchers who want to develop computer-assisted instruction materials.
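
    As an illustration of the kind of computer-algebra computation such teaching material builds on, the sketch below finds the roots and vertex of an example quadratic with SymPy; the abstract does not specify which CAS the study actually used, and the coefficients here are arbitrary.

      # Roots and vertex of y = x^2 - 4x + 3 with a CAS (illustrative only).
      import sympy as sp

      x = sp.symbols('x')
      y = x**2 - 4*x + 3

      roots = sp.solve(sp.Eq(y, 0), x)           # [1, 3]
      x_vertex = sp.solve(sp.diff(y, x), x)[0]   # 2
      vertex = (x_vertex, y.subs(x, x_vertex))   # (2, -1)
      print(roots, vertex)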

  17. Low cost solar array project production process and equipment task. A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Technical readiness for the production of photovoltaic modules using single crystal silicon dendritic web sheet material is demonstrated by: (1) selection, design and implementation of solar cell and photovoltaic module process sequence in a Module Experimental Process System Development Unit; (2) demonstration runs; (3) passing of acceptance and qualification tests; and (4) achievement of a cost effective module.

  18. The development of a coal-fired combustion system for industrial process heating applications

    SciTech Connect

    Not Available

    1992-07-16

    PETC has implemented a number of advanced combustion research projects that will lead to the establishment of a broad, commercially acceptable engineering data base for the advancement of coal as the fuel of choice for boilers, furnaces, and process heaters. Vortec Corporation's Coal-Fired Combustion System for Industrial Process Heating Applications has been selected for Phase III development under contract DE-AC22-91PC91161. This advanced combustion system research program is for the development of innovative coal-fired process heaters which can be used for high temperature melting, smelting, recycling, and refining processes. The process heater concepts to be developed are based on advanced glass melting and ore smelting furnaces developed and patented by Vortec Corporation. The process heater systems to be developed have multiple-use applications; however, the Phase III research effort is being focused on the development of a process heater system to be used for producing glass frits and wool fiber from boiler and incinerator ashes. The primary objective of the Phase III project is to develop and integrate all the system components, from fuel through total system controls, and then test the complete system in order to evaluate its potential marketability. The economic evaluation of commercial scale CMS processes has begun. In order to accurately estimate the cost of the primary process vessels, preliminary designs for 25, 50, and 100 ton/day systems have been started under Task 1. These data will serve as input for the life-cycle cost analysis performed as part of the techno-economic evaluations. The economic evaluations of commercial CMS systems will be an integral part of the commercialization plan.

  19. Development of a Systems Engineering Model of the Chemical Separations Process

    SciTech Connect

    Sun, Lijian; Li, Jianhong; Chen, Yitung; Clarksean, Randy; Ladler, Jim; Vandergrift, George

    2002-07-01

    Work is being performed to develop a general-purpose systems engineering model for the AAA separation process. The work centers on the development of a new user interface for the AMUSE code and on the specification of a systems engineering model. This paper presents background information and an overview of work completed to date. (authors)

  20. Lessons Learned From Developing Three Generations of Remote Sensing Science Data Processing Systems

    NASA Technical Reports Server (NTRS)

    Tilmes, Curt; Fleig, Albert J.

    2005-01-01

    The Biospheric Information Systems Branch at NASA's Goddard Space Flight Center has developed three generations of Science Investigator-led Processing Systems for use with various remote sensing instruments. The first system is used for data from the MODIS instruments flown on NASA's Earth Observing System (EOS) Terra and Aqua spacecraft, launched in 1999 and 2002 respectively. The second generation is for the Ozone Monitoring Instrument (OMI) flying on the EOS Aura spacecraft launched in 2004. We are now developing a third generation of the system for evaluation science data processing for the Ozone Mapping and Profiler Suite (OMPS) to be flown by the NPOESS Preparatory Project (NPP) in 2006. The initial system was based on large scale proprietary hardware, operating and database systems. The current OMI system and the OMPS system being developed are based on commodity hardware, the Linux operating system and PostgreSQL, an open source RDBMS. The new system distributes its data archive across multiple server hosts and processes jobs on multiple processor boxes. We have created several instances of this system, including one for operational processing, one for testing and reprocessing and one for applications development and scientific analysis. Prior to receiving the first data from OMI we applied the system to reprocessing information from the Solar Backscatter Ultraviolet (SBUV) and Total Ozone Mapping Spectrometer (TOMS) instruments flown from 1978 until now. The system was able to process 25 years (108,000 orbits) of data and produce 800,000 files (400 GiB) of level 2 and level 3 products in less than a week. We will describe the lessons we have learned and tradeoffs between system design, hardware, operating systems, operational staffing, user support and operational procedures. During each generational phase, the system has become more generic and reusable. While the system is not currently shrink wrapped we believe it is to the point where it could be readily
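    As a quick sanity check on the reprocessing figures quoted above, the short Python sketch below converts the quoted totals into average rates; treating "less than a week" as a full seven days is an assumption, so the true rates were at least this high.

        # Hedged back-of-the-envelope rates implied by the SBUV/TOMS reprocessing run
        orbits, files, gib, days = 108_000, 800_000, 400.0, 7.0
        print(round(orbits / days), "orbits per day")                    # ~15,400
        print(round(files / (days * 24 * 3600), 2), "files per second on average")
        print(round(gib * 1024 / files, 2), "MiB per output file on average")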

  1. Developing a Multi Sensor Scanning System for Hardwood Inspection and Processing

    Treesearch

    Richard W. Conners; D.Earl Kline; Philip A. Araman

    1995-01-01

    For the last few years the authors as part of the Center for Automated Processing of Hardwoods have been attempting to develop a multiple sensor hardwood defect detection system. This development activity has been ongoing for approximately 6 years, a very long time in the commercial development world. This paper will report the progress that has been made and will...

  2. The Development of a Generic Framework for the Forensic Analysis of SCADA and Process Control Systems

    NASA Astrophysics Data System (ADS)

    Slay, Jill; Sitnikova, Elena

    There is continuing interest in researching generic security architectures and strategies for managing SCADA and process control systems. Documentation from various countries on IT security is now beginning to provide recommendations for security controls for (federal) information systems, including connected process control systems. Little or no work exists in the public domain that takes a big-picture approach to developing a generic or generalisable approach to SCADA and process control system forensics. The argument raised in this paper is that before one can develop solutions to the problem of SCADA forensics, the forensic computing process, and the range of technical and procedural issues subsumed within this process, need to be well understood and agreed upon by governments, industry and academia.

  3. Processes and process development

    NASA Astrophysics Data System (ADS)

    Hwang, H. L.

    1986-02-01

    Silicon material research in the Republic of China (ROC) parallels its development in the electronic industry. A brief outline of the historical development in ROC silicon material research is given. Emphasis is placed on the recent Silane Project managed by the National Science Council, ROC, including project objectives, task forces, and recent accomplishments. An introduction is also given to industrialization of the key technologies developed in this project.

  4. FRENDY: A new nuclear data processing system being developed at JAEA

    NASA Astrophysics Data System (ADS)

    Tada, Kenichi; Nagaya, Yasunobu; Kunieda, Satoshi; Suyama, Kenya; Fukahori, Tokio

    2017-09-01

    JAEA has provided the evaluated nuclear data library JENDL and nuclear application codes such as MARBLE, SRAC, MVP and PHITS. These domestic codes have been widely used in many universities and industrial companies in Japan. However, we sometimes find problems in imported processing systems and need to revise them when a new version of JENDL is released. To overcome such problems and immediately process the nuclear data when it is released, JAEA started developing a new nuclear data processing system, FRENDY, in 2013. This paper describes the outline of the development of FRENDY and its capabilities and performance, demonstrated through analyses of criticality experiments. The verification results indicate that FRENDY properly generates ACE files.

  5. Recent development for the ITS code system: Parallel processing and visualization

    SciTech Connect

    Fan, W.C.; Turner, C.D.; Halbleib, J.A. Sr.; Kensek, R.P.

    1996-03-01

    A brief overview is given for two software developments related to the ITS code system. These developments provide parallel processing and visualization capabilities and thus allow users to perform ITS calculations more efficiently. Timing results and a graphical example are presented to demonstrate these capabilities.

  6. Simulation platform for application development on a vision-system-on-chip with integrated signal processing

    NASA Astrophysics Data System (ADS)

    Reichel, Peter; Döge, Jens; Hoppe, Christoph; Peter, Nico; Reichel, Andreas; Schneider, Peter

    2016-07-01

    Image sensors with integrated, programmable signal processing execute computationally intensive processing steps during or immediately after image acquisition, thereby allowing output data to be reduced to relevant features only. In contrast to conventional image processing systems, the tasks of image acquisition and actual image processing in such a "vision chip" cannot be viewed independently of each other. Both for validating the architecture and for supporting programming in the course of application development, system-level modeling has been performed as part of the design process of the vision-system-on-chip. Apart from the implementation of all essential components of the integrated control unit as well as digital and analog signal processing, special attention has been paid to integration into the development environment. Being able to purposefully insert parameter deviations and/or defects at different points of the analog processing enables investigations of their influence on image processing algorithms performed on the image sensor. Due to its high simulation speed and compatibility with the real system, especially regarding the programs to be executed, the resulting simulation model is very well suited for use in application development.

  7. Test processing system (SEE)

    NASA Technical Reports Server (NTRS)

    Gaulene, P.

    1986-01-01

    The SEE data processing system, developed in 1985, manages and processes test results. General information is provided on the SEE system: objectives, characteristics, basic principles, general organization, and operation. Full documentation is accessible by computer using the HELP SEE command.

  8. The open source, object- and process oriented simulation system OpenGeoSys - concepts, development, community

    NASA Astrophysics Data System (ADS)

    Bauer, S.; Li, D.; Beyer, C.; Wang, W.; Bilke, L.; Graupner, B.

    2011-12-01

    Many geoscientific problems, such as underground waste disposal, nuclear waste disposal, CO2 sequestration, and geothermal energy, require a numerical simulation system for the prediction of ongoing processes as well as for risk and safety assessment. The governing processes are thermal heat transfer (T), hydraulic flow in multi-phase systems (H), mechanical deformation (M) and geochemical reactions (C), which interact in a complex way (THMC). The development of suitable simulation systems requires a large amount of effort for code development, verification and applications. OpenGeoSys (OGS) is an open source scientific initiative for the simulation of these THMC processes in porous media. A flexible numerical framework based on the Finite Element Method is provided and applied to the governing process equations. Due to the object- and process-oriented character of the code, functionality enhancement and code coupling with external simulators can be performed effectively. This structure also allows for distributed development, with developers at different locations contributing to the common code. The code is platform independent, accessible via the internet for development and application, and checked regularly by an automated benchmarking system.

  9. How Process Helps You in Developing a High Quality Medical Information System

    NASA Astrophysics Data System (ADS)

    Akiyama, Yoshihiro

    A medical information system is an extreme case in its reliance on tacit knowledge: practitioners and medical experts such as medical doctors draw heavily on knowledge that is rooted in experience and is not explicitly formulated, only implied. This is quite different from other disciplines such as embedded engineering systems. Developing such a system depends critically on how effectively this varied knowledge is organized and integrated when implementing the system. As such, the development process in which customers, management, engineers, and teams are involved must be evaluated from this point of view. The existence of tacit knowledge may not be recognized well enough at the beginning of a project, yet dealing with it is necessary for project success. This paper describes these problems and how the Personal Software Process (PSP) and Team Software Process (TSP) manage them, and then discusses typical performance results. It may be said that a PSP individual and a TSP team are CMMI level 4 units, respectively.

  10. Materials, Processes and Manufacturing in Ares 1 Upper Stage: Integration with Systems Design and Development

    NASA Technical Reports Server (NTRS)

    Bhat, Biliyar N.

    2008-01-01

    The Ares I Crew Launch Vehicle Upper Stage is designed and developed based on sound systems engineering principles. Systems engineering starts with the Concept of Operations and mission requirements, which in turn determine the launch system architecture and its performance requirements. The Ares I Upper Stage is designed and developed to meet these requirements. Designers depend on the support of materials, processes and manufacturing during the design, development and verification of subsystems and components. The requirements relative to reliability, safety, operability and availability are also dependent on materials availability, characterization, process maturation and vendor support. This paper discusses the roles and responsibilities of materials and manufacturing engineering during the various phases of Ares I Upper Stage development, including design and analysis, hardware development, test and verification. Emphasis is placed on how materials, processes and manufacturing support is integrated across the Upper Stage Project, both horizontally and vertically. In addition, the paper describes the approach used to ensure compliance with materials, processes, and manufacturing requirements during the project cycle, with a focus on hardware systems design and development.

  11. Progress in the Development of Direct Osmotic Concentration Wastewater Recovery Process for Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Cath, Tzahi Y.; Adams, Dean V.; Childress, Amy; Gormly, Sherwin; Flynn, Michael

    2005-01-01

    Direct osmotic concentration (DOC) has been identified as a high potential technology for recycling of wastewater to drinking water in advanced life support (ALS) systems. As a result, the DOC process has been selected for a NASA Rapid Technology Development Team (RTDT) effort. The existing prototype system has been developed to Technology Readiness Level (TRL) 3. The current project focuses on advancing the development of this technology from TRL 3 to TRL 6 (appropriate for human-rated testing). A new prototype of the DOC system has been designed and fabricated that addresses the deficiencies encountered during testing of the original system, allowing the new prototype to achieve TRL 6. Background information is provided about the technologies investigated and their capabilities, results from preliminary tests, and the milestone plan and activities for the RTDT program intended to develop a second-generation prototype of the DOC system.

  13. Functional process descriptions for the program to develop the Nuclear Waste Management System

    SciTech Connect

    Woods, T.W.

    1991-09-01

    The Office of Civilian Radioactive Waste Management (OCRWM) is executing a plan for improvement of the systems implemented to carry out its responsibilities under the Nuclear Waste Policy Act of 1982 (NWPA). As part of the plan, OCRWM is performing a systems engineering analysis of both the physical system, i.e., the Nuclear Waste Management System (NWMS), and the programmatic functions that must be accomplished to bring the physical system into being. The purpose of the program analysis is to provide a systematic identification and definition of all program functions, functional process flows, and function products necessary and sufficient to provide the physical system. The analysis resulting from this approach provides a basis for development of a comprehensive and integrated set of policies, standard practices, and procedures for the effective and efficient execution of the program. Thus, this analysis will form a basis for revising current OCRWM policies and procedures, or developing new ones as necessary. The primary purposes of this report are to (1) summarize the major functional processes and process flows that have been developed as part of the program analysis, and (2) provide an introduction and assistance in understanding the detailed analysis information contained in the three-volume report titled The Analysis of the Program to Develop the Nuclear Waste Management System (Woods 1991a).

  14. The Ergonomist’s Role in the Weapon System Development Process in Canada,

    DTIC Science & Technology

    1983-01-01

    D. Beevis, Defence and Civil Institute of Environmental Medicine, Downsview. In the Canadian Forces, a weapons system is defined as a composite of equipment, facilities, skills, and techniques forming a self-sufficient instrument ...

  15. TRW’s Ada Process Model for Incremental Development of Large Software Systems

    DTIC Science & Technology

    1990-01-01

    TRW's Ada Process Model has proven to be key to the Command Center Processing and Display System-Replacement (CCPDS-R) project's success to date in developing over 300,000 lines of Ada source code executing in a distributed VAX VMS environment. The Ada Process Model is, in simplest terms, a ... software progress metrics. This paper provides an overview of the techniques and benefits of the Ada Process Model and describes some of the experience and ...

  16. Implementation of a configurable laboratory information management system for use in cellular process development and manufacturing.

    PubMed

    Russom, Diana; Ahmed, Amira; Gonzalez, Nancy; Alvarnas, Joseph; DiGiusto, David

    2012-01-01

    Regulatory requirements for the manufacturing of cell products for clinical investigation require a significant level of record-keeping, starting early in process development and continuing through to the execution and requisite follow-up of patients on clinical trials. Central to record-keeping is the management of documentation related to patients, raw materials, processes, assays and facilities. To support these requirements, we evaluated several laboratory information management systems (LIMS), including their cost, flexibility, regulatory compliance, ongoing programming requirements and ability to integrate with laboratory equipment. After selecting a system, we performed a pilot study to develop a user-configurable LIMS for our laboratory in support of our pre-clinical and clinical cell-production activities. We report here on the design and utilization of this system to manage accrual with a healthy blood-donor protocol, as well as manufacturing operations for the production of a master cell bank and several patient-specific stem cell products. The system was used successfully to manage blood donor eligibility, recruiting, appointments, billing and serology, and to provide annual accrual reports. Quality management reporting features of the system were used to capture, report and investigate process and equipment deviations that occurred during the production of a master cell bank and patient products. Overall the system has served to support the compliance requirements of process development and phase I/II clinical trial activities for our laboratory and can be easily modified to meet the needs of similar laboratories.

  17. Development of high-throughput fabrication process of HTS SQUID for 51-ch MCG system

    NASA Astrophysics Data System (ADS)

    Tsukamoto, A.; Saitoh, K.; Yokosawa, K.; Suzuki, D.; Seki, Y.; Kandori, A.; Tsukada, K.

    2005-10-01

    A high-throughput high-Tc SQUID fabrication process that can provide the appropriate number of SQUIDs for a 51-channel magnetocardiograph (MCG) has been developed. A new deposition system, based on a pulsed-laser-deposition technique to increase the process throughput in fabricating superconducting YBa2Cu3Oy thin films, was developed. In this system, nine superconducting thin films are successively deposited on bicrystal substrates in one deposition sequence. A mask aligner, customized for the bicrystal substrate, was also developed. This system enables mask alignment to the bicrystal grain boundary without the need for preprocessing to visualize it. In addition, the magnetometer pattern was designed to improve the yield of magnetometer fabrication. In this directly coupled magnetometer, four SQUIDs were connected to the same pickup coil. Accordingly, the yield of the magnetometer could be enhanced by selecting the best SQUID among the four.
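    The yield benefit of wiring four SQUIDs to one pickup coil can be illustrated with a short calculation; the sketch below is an assumption-laden illustration (the single-SQUID yield values are invented), not data from the paper.

        # If one SQUID works with probability p, at least one of four works
        # with probability 1 - (1 - p)^4.
        def magnetometer_yield(p_single, n_squids=4):
            return 1.0 - (1.0 - p_single) ** n_squids

        for p in (0.5, 0.7, 0.9):   # illustrative single-SQUID yields
            print(f"single-SQUID yield {p:.0%} -> magnetometer yield {magnetometer_yield(p):.1%}")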

  18. Development of a 3-D Measuring System for Upper Limb Movements Using Image Processing

    NASA Astrophysics Data System (ADS)

    Ogata, Kohichi; Toume, Tadashi; Nakanishi, Ryoji

    This paper describes a 3-D motion capture system for the quantitative evaluation of a finger-nose test using image processing. In the field of clinical medicine, qualitative and quantitative evaluation of voluntary movements is necessary for correct diagnosis of disorders. For this purpose, we have developed a 3-D measuring system with a multi-camera setup. The configuration of the system is described and examples of movement data are shown for normal subjects and patients. In the finger-nose test at a fast trial speed, a discriminant analysis using Mahalanobis generalized distances shows a discrimination rate of 93% between normal subjects and spinocerebellar degeneration (SCD) patients.
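    A minimal sketch, in the spirit of the analysis described above, of a two-class discriminant based on Mahalanobis distances; the feature vectors, class data, and pooled-covariance choice are hypothetical illustrations rather than the authors' actual implementation.

        import numpy as np

        def mahalanobis_sq(x, mean, cov_inv):
            d = x - mean
            return float(d @ cov_inv @ d)          # squared Mahalanobis distance

        def classify(x, class_means, pooled_cov):
            cov_inv = np.linalg.inv(pooled_cov)
            dists = {label: mahalanobis_sq(x, m, cov_inv) for label, m in class_means.items()}
            return min(dists, key=dists.get)        # assign to the nearer class

        # toy feature vectors (e.g. movement duration, peak velocity) - invented
        normals = np.array([[0.8, 1.2], [0.9, 1.1], [0.7, 1.4]])
        patients = np.array([[1.5, 0.6], [1.6, 0.8], [1.4, 0.5]])
        means = {"normal": normals.mean(axis=0), "SCD": patients.mean(axis=0)}
        pooled = (np.cov(normals.T) + np.cov(patients.T)) / 2.0
        print(classify(np.array([0.85, 1.15]), means, pooled))   # -> "normal"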

  19. The development of data acquisition and processing application system for RF ion source

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodan; Wang, Xiaoying; Hu, Chundong; Jiang, Caichao; Xie, Yahong; Zhao, Yuanzhe

    2017-07-01

    As the key ion source component of nuclear fusion auxiliary heating devices, the radio frequency (RF) ion source has been developed and applied gradually to provide a source plasma with the advantages of ease of control and high reliability. In addition, it easily achieves long-pulse steady-state operation. During the development and testing of the RF ion source, a large amount of raw experimental data is generated. Therefore, it is necessary to develop a stable and reliable computer data acquisition and processing application system for realizing the functions of data acquisition, storage, access, and real-time monitoring. In this paper, the development of a data acquisition and processing application system for the RF ion source is presented. The hardware platform is based on the PXI system and the software is programmed in the LabVIEW development environment. The key technologies used in the implementation of this software mainly include long-pulse data acquisition, multi-threaded processing, the transmission control protocol for communication, and the Lempel-Ziv-Oberhumer data compression algorithm. This design has been tested and applied on the RF ion source. The test results show that it works reliably and steadily. With the help of this design, the stable plasma discharge data of the RF ion source are collected, stored, accessed, and monitored in real time, demonstrating its practical significance for RF ion source experiments.
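    A minimal sketch of the acquire-compress-store pattern such a system typically uses, assuming a producer thread for sampling and a consumer thread for storage; the sampling routine and file layout are hypothetical, and zlib stands in for the Lempel-Ziv-Oberhumer compressor mentioned above.

        import queue, threading, time, zlib

        buf = queue.Queue(maxsize=100)              # bounded buffer between the two threads

        def acquire(n_blocks, block_size=4096):
            for i in range(n_blocks):
                samples = bytes(block_size)         # placeholder for digitizer readout
                buf.put((i, samples))
                time.sleep(0.01)                    # pacing stands in for hardware timing
            buf.put(None)                           # sentinel: acquisition finished

        def store(path):
            with open(path, "wb") as f:
                while True:
                    item = buf.get()
                    if item is None:
                        break
                    idx, samples = item
                    f.write(zlib.compress(samples))  # compress each block before storage

        t1 = threading.Thread(target=acquire, args=(50,))
        t2 = threading.Thread(target=store, args=("shot_0001.dat",))
        t1.start(); t2.start(); t1.join(); t2.join()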

  20. Development of emergent processing loops as a system of systems concept

    NASA Astrophysics Data System (ADS)

    Gainey, James C., Jr.; Blasch, Erik P.

    1999-03-01

    This paper describes an engineering approach toward implementing the current neuroscientific understanding of how the primate brain fuses, or integrates, 'information' in the decision-making process. We describe a System of Systems (SoS) design for improving the overall performance, capabilities, operational robustness, and user confidence in identification (ID) systems and show how it could be applied to biometric security. We use the Physio-associative temporal sensor integration algorithm (PATSIA), which is motivated by observed functions and interactions of the thalamus, hippocampus, and cortical structures in the brain. PATSIA utilizes signal theory mathematics to model how the human efficiently perceives and uses information from the environment. The hybrid architecture implements a possible SoS-level description of the Joint Directors of Laboratories (JDL) Data Fusion Working Group's functional model involving five levels of fusion and their associated definitions. This SoS architecture proposes dynamic sensor and knowledge-source integration by implementing multiple emergent processing loops for predicting, feature extracting, matching, and searching both static and dynamic databases, like MSTAR's PEMS loops. Biologically, this effort demonstrates these objectives by modeling similar processes from the eyes, ears, and somatosensory channels, through the thalamus, and to the cortices as appropriate, while using the hippocampus for short-term memory search and storage as necessary. The particular approach demonstrated incorporates commercially available speaker verification and face recognition software and hardware to collect data and extract features for the PATSIA. The PATSIA maximizes the confidence levels for target identification or verification in dynamic situations using a belief filter. The proof of concept described here is easily adaptable and scalable to other military and nonmilitary sensor fusion applications.

  1. Development of the Data Acquisition and Processing System for a Pulsed 2-Micron Coherent Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.

    2010-01-01

    A general overview of the development of a data acquisition and processing system is presented for a pulsed, 2-micron coherent Doppler lidar system located at NASA Langley Research Center in Hampton, Virginia, USA. It is a comprehensive system that performs high-speed data acquisition, analysis, and data display both in real time and offline. The first flight missions are scheduled for the summer of 2010 as part of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The system as well as the control software is reviewed, and its requirements and unique features are discussed.

  2. Design and development of a medical big data processing system based on Hadoop.

    PubMed

    Yao, Qin; Tian, Yu; Li, Peng-Fei; Tian, Li-Li; Qian, Yang-Ming; Li, Jing-Song

    2015-03-01

    Secondary use of medical big data is increasingly popular in healthcare services and clinical research. Understanding the logic behind medical big data demonstrates tendencies in hospital information technology and shows great significance for hospital information systems that are designing and expanding services. Big data has four characteristics, Volume, Variety, Velocity and Value (the 4 Vs), that make traditional standalone systems incapable of processing them. Apache Hadoop MapReduce is a promising software framework for developing applications that process vast amounts of data in parallel on large clusters of commodity hardware in a reliable, fault-tolerant manner. With the Hadoop framework and MapReduce application program interface (API), we can more easily develop our own MapReduce applications to run on a Hadoop framework that can scale up from a single node to thousands of machines. This paper investigates a practical case of a Hadoop-based medical big data processing system. We developed this system to intelligently process medical big data and uncover some features of hospital information system user behaviors. This paper studies user behaviors regarding various data produced by different hospital information systems for daily work. We also built a five-node Hadoop cluster to execute distributed MapReduce algorithms. Our distributed algorithms show promise in facilitating efficient data processing with medical big data in healthcare services and clinical research compared with single nodes. Additionally, with medical big data analytics, we can design our hospital information systems to be much more intelligent and easier to use by making personalized recommendations.
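    A minimal sketch of the MapReduce pattern involved, written as a Hadoop Streaming style mapper/reducer pair that counts log events per hospital-system user; the log format and field names are assumptions, and the local driver below only emulates what the Hadoop framework would do across a cluster.

        from itertools import groupby

        def mapper(lines):
            # emit (user_id, 1) for each log line of the assumed form "user_id<TAB>action"
            for line in lines:
                user_id = line.rstrip("\n").split("\t", 1)[0]
                yield user_id, 1

        def reducer(pairs):
            # sum counts per user; on a cluster, Hadoop delivers the pairs grouped by key
            for user_id, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
                yield user_id, sum(count for _, count in group)

        if __name__ == "__main__":
            sample = ["u001\tlogin", "u002\tquery", "u001\tquery", "u001\tlogout"]
            print(dict(reducer(mapper(sample))))    # {'u001': 3, 'u002': 1}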

  3. Development of the process management system of coke-oven batteries at Raahe Steel Works

    SciTech Connect

    Swanljung, J.; Palmu, P. . Raahe Steel Works)

    1994-09-01

    The latest stage of development of the process management system in the coke plant at the Raahe Steel works is presented. The operation environment has been updated twice since commissioning (Oct. 18, 1987). When the second battery was put into operation (Nov. 28, 1992), the process computer was also changed to a model with a larger capacity. The process automation system is the same as at the start of coke production (Damatic by Valmet). Only the necessary enlargements were made when doubling coke production. The heating control system of the coke-oven batteries has been under strong development during the existence of the coke plant. The first generation system was a statistical heating model (1987--1991). The principle of the second generation heating model is based, on the one hand, on the energy balance calculated from measured energy supply, amount of coal charged and coking time data and, on the other, on the estimated coke temperature as a function of coking index. The reliability and regularity of coke production has been developed using the dynamic oven scheduling system.

  4. Development of coke-oven battery process management system at Rautaruukki Oy steelworks

    SciTech Connect

    Swanljung, J.; Palmu, P.

    1996-01-01

    Coke production in Finland is based on one coke-oven plant located at the Raahe steelworks. The first battery was brought into operation in Oct. 1987 and the second in Nov. 1992. The latest stage of development of the process management system in the coke plant at the Raahe steelworks is presented. When No. 2 battery was placed in operation the process computer was also changed to a larger capacity model. The process automation system is the same as it was at the start of coke production (Damatic). The necessary upgrades were made only when coke production was doubled. The heating control system has been under continuous development during the existence of the coke-oven plant. The first generation system was a statistical heating model (1987--1991). The second generation heating model is based on an energy balance calculated from the measured energy supply, amount of coal charged and coking time data, and also on the estimated coke temperature as a function of the coking index. The reliability and regularity of coke production has been developed using the dynamic oven scheduling system.
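    To make the energy-balance idea concrete, the sketch below computes a heating-gas setpoint from coal charged and coking time and trims it with a coking-index correction; the coefficients, units, and correction law are invented for illustration and are not the plant's actual model.

        def heating_setpoint(coal_charged_t, coking_time_h, coking_index,
                             spec_energy_mj_per_t=2400.0, gas_lhv_mj_per_m3=17.0,
                             target_index=1.0, gain=0.2):
            energy_mj = coal_charged_t * spec_energy_mj_per_t         # heat demand per charge
            base_flow_m3_per_h = energy_mj / (gas_lhv_mj_per_m3 * coking_time_h)
            correction = 1.0 + gain * (target_index - coking_index)   # trim on estimated coke temperature
            return base_flow_m3_per_h * correction

        print(round(heating_setpoint(32.0, 18.5, 0.95), 1), "m3/h of heating gas")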

  5. Development of an automated processing system for potential fishing zone forecast

    NASA Astrophysics Data System (ADS)

    Ardianto, R.; Setiawan, A.; Hidayat, J. J.; Zaky, A. R.

    2017-01-01

    The Institute for Marine Research and Observation (IMRO) - Ministry of Marine Affairs and Fisheries Republic of Indonesia (MMAF) has developed a potential fishing zone (PFZ) forecast using satellite data, called Peta Prakiraan Daerah Penangkapan Ikan (PPDPI). Since 2005, IMRO has disseminated daily PPDPI maps for fishery marine ports and 3-day averages for national areas. The accuracy in determining the PFZ and the processing time of the maps depend greatly on the experience of the operators creating them. This paper presents our research in developing an automated processing system for PPDPI in order to increase accuracy and shorten processing time. PFZs are identified by combining MODIS sea surface temperature (SST) and chlorophyll-a (CHL) data in order to detect the presence of upwelling, thermal fronts and biological productivity enhancement, where the integration of these phenomena generally represents the PFZ. The whole process involves data download, map geoprocessing and layout, all carried out automatically by Python and ArcPy. The results showed that the automated processing system could be used to reduce dependence on operator experience in determining the PFZ and to speed up processing time.
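    A minimal sketch of the core PFZ rule, flagging grid cells where a strong SST gradient (thermal front) coincides with elevated chlorophyll-a; the thresholds and the use of plain NumPy arrays (instead of the ArcPy geoprocessing used operationally) are simplifying assumptions.

        import numpy as np

        def potential_fishing_zone(sst, chl, grad_thresh=0.5, chl_thresh=0.3):
            gy, gx = np.gradient(sst)                    # degC per grid cell
            front = np.hypot(gx, gy) > grad_thresh       # thermal front indicator
            productive = chl > chl_thresh                # mg m^-3 chlorophyll-a
            return front & productive                    # candidate PFZ mask

        sst = np.array([[29.0, 29.1, 27.8], [29.0, 29.2, 27.7], [29.1, 29.3, 27.6]])
        chl = np.array([[0.1, 0.4, 0.6], [0.1, 0.5, 0.7], [0.1, 0.4, 0.8]])
        print(potential_fishing_zone(sst, chl).astype(int))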

  6. ENERGY SYSTEM DEVELOPMENT AND LOAD MANAGEMENT THROUGH THE REHABILITATION AND RETURN TO PLAY PROCESS

    PubMed Central

    Ward, Patrick; duManoir, Gregory R

    2017-01-01

    Return-to-play from injury is a complex process involving many factors including the balancing of tissue healing rates with the development of biomotor abilities. This process requires interprofessional cooperation to ensure success. An often-overlooked aspect of return-to-play is the development and maintenance of sports specific conditioning while monitoring training load to ensure that the athlete's training stimulus over the rehabilitation period is appropriate to facilitate a successful return to play. The purpose of this clinical commentary is to address the role of energy systems training as part of the return-to-play process. Additionally the aim is to provide practitioners with an overview of practical sports conditioning training methods and monitoring strategies to allow them to direct and quantify the return-to-play process. Level of Evidence 5 PMID:28900575

  7. Transferase activity function and system development process are critical in cattle embryo development.

    PubMed

    Adams, Heather A; Southey, Bruce R; Everts, Robin E; Marjani, Sadie L; Tian, Cindy X; Lewin, Harris A; Rodriguez-Zas, Sandra L

    2011-03-01

    Microarray gene expression experiments often consider specific developmental stages, tissue sources, or reproductive technologies. This focus hinders the understanding of the cattle embryo transcriptome. To address this, four microarray experiments encompassing three developmental stages (7, 25, 280 days), two tissue sources (embryonic or extra-embryonic), and two reproductive technologies (artificial insemination or AI and somatic cell nuclear transfer or NT) were combined using two sets of meta-analyses. The first set of meta-analyses uncovered 434 genes differentially expressed between AI and NT (regardless of stage or source) that were not detected by the individual-experiment analyses. The molecular function of transferase activity was enriched among these genes, which included ECE2, SLC22A1, and a gene similar to CAMK2D. Gene POLG2 was over-expressed in AI versus NT 7-day embryos and was under-expressed in AI versus NT 25-day embryos. Gene HAND2 was over-expressed in AI versus NT extra-embryonic samples at 280 days yet under-expressed in AI versus NT embryonic samples at 7 days. The second set of meta-analyses uncovered enrichment of system, organ, and anatomical structure development among the genes differentially expressed between 7- and 25-day embryos from either reproductive technology. Genes PRDX1 and SLC16A1 were over-expressed in 7- versus 25-day AI embryos and under-expressed in 7- versus 25-day NT embryos. Changes in stage were associated with the highest number of differentially expressed genes, followed by technology and source. Genes with transferase activity may hold a clue to the differences in efficiency between reproductive technologies.

  8. A Scenario-Based Process for Requirements Development: Application to Mission Operations Systems

    NASA Technical Reports Server (NTRS)

    Bindschadler, Duane L.; Boyles, Carole A.

    2008-01-01

    The notion of using operational scenarios as part of requirements development during mission formulation (Phases A & B) is widely accepted as good system engineering practice. In the context of developing a Mission Operations System (MOS), there are numerous practical challenges to translating that notion into the cost-effective development of a useful set of requirements. These challenges can include such issues as a lack of Project-level focus on operations issues, insufficient or improper flowdown of requirements, flowdown of immature or poor-quality requirements from Project level, and MOS resource constraints (personnel expertise and/or dollars). System engineering theory must be translated into a practice that provides enough structure and standards to serve as guidance, but that retains sufficient flexibility to be tailored to the needs and constraints of a particular MOS or Project. We describe a detailed, scenario-based process for requirements development. Identifying a set of attributes for high quality requirements, we show how the portions of the process address many of those attributes. We also find that the basic process steps are robust, and can be effective even in challenging Project environments.

  10. Development of an on-line expert system for integrated alarm processing in nuclear power plants

    SciTech Connect

    Kim, Han Gon; Choi, Seong Soo; Kang, Ki Sig; Chang, Soon Heung

    1994-12-31

    An on-line expert system, called AFDS (Alarm Filtering and Diagnostic System), has been developed to assist operators in effectively maintaining plant safety and to enhance plant availability using advanced computer technologies for alarm processing. The AFDS is designed to perform alarm filtering and overall plantwide diagnosis when an abnormal state occurs. In addition to these functions, it carries out alarm prognosis to provide the operator with prediction-based messages and to generate high-level alarms that can be used as additional diagnostic information. The system is developed on a SUN SPARC 2 workstation, and its target domain is the alarm system in the main control room of Yonggwang units 1 and 2.
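    One standard alarm-filtering rule, suppressing consequence alarms whose cause alarm is already active, can be sketched as below; the alarm names and the cause-consequence table are hypothetical and not taken from AFDS.

        # map each consequence alarm to the cause alarm that explains it (invented entries)
        CONSEQUENCE_OF = {
            "LOW_STEAM_GEN_LEVEL": "FEEDWATER_PUMP_TRIP",
            "LOW_TURBINE_LOAD": "FEEDWATER_PUMP_TRIP",
        }

        def filter_alarms(active_alarms):
            active = set(active_alarms)
            keep = []
            for alarm in active_alarms:
                cause = CONSEQUENCE_OF.get(alarm)
                if cause in active:
                    continue                  # suppressed: already explained by its cause
                keep.append(alarm)
            return keep

        print(filter_alarms(["FEEDWATER_PUMP_TRIP", "LOW_STEAM_GEN_LEVEL", "LOW_TURBINE_LOAD"]))
        # -> ['FEEDWATER_PUMP_TRIP']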

  11. Development and evaluation of an intelligent traceability system for frozen tilapia fillet processing.

    PubMed

    Xiao, Xinqing; Fu, Zetian; Qi, Lin; Mira, Trebar; Zhang, Xiaoshuan

    2015-10-01

    The main export varieties in China are brand-name, high-quality bred aquatic products. Among them, tilapia has become the most important and fastest-growing species, since extensive consumer markets in North America and Europe have evolved as a result of commodity prices, year-round availability and the quality of fresh and frozen products. As the largest tilapia farming country, China has over one-third of its tilapia production devoted to further processing to meet foreign market demand. Using tilapia fillet processing as a case study, this paper introduces the efforts in developing and evaluating ITS-TF: an intelligent traceability system integrated with statistical process control (SPC) and fault tree analysis (FTA). Observations, literature review and expert questionnaires were used for system requirements and knowledge acquisition; scenario simulation was applied to evaluate and validate ITS-TF performance. The results show that the traceability requirement has evolved from a firefighting model to a proactive model for enhancing process management capacity for food safety; ITS-TF transforms itself into an intelligent system providing functions for early warnings and process management through integrated SPC and FTA. A valuable suggestion, that automatic data acquisition and communication technology should be integrated into ITS-TF, was obtained for further system optimization and performance improvement. © 2014 Society of Chemical Industry.
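    The SPC component of such a traceability system can be illustrated with Shewhart-style 3-sigma limits on a monitored process variable; the variable (frozen-fillet core temperature) and all readings below are invented for the example.

        from statistics import mean, stdev

        def control_limits(baseline):
            mu, sigma = mean(baseline), stdev(baseline)
            return mu - 3 * sigma, mu + 3 * sigma        # lower and upper control limits

        def out_of_control(readings, baseline):
            lcl, ucl = control_limits(baseline)
            return [(i, x) for i, x in enumerate(readings) if not (lcl <= x <= ucl)]

        baseline = [-18.2, -18.0, -18.3, -18.1, -17.9, -18.2, -18.0]   # degC, in-control history
        print(out_of_control([-18.1, -17.8, -15.9], baseline))         # flags the -15.9 excursion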

  12. A Module Experimental Process System Development Unit (MEPSDU). [development of low cost solar arrays

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The technical readiness of a cost effective process sequence that has the potential for the production of flat plate photovoltaic modules which meet the 1986 price goal of $0.70 or less per peak watt was demonstrated. The proposed process sequence was reviewed and laboratory verification experiments were conducted. The preliminary process includes the following features: semicrystalline silicon (10 cm by 10 cm) as the silicon input material; spray-on dopant diffusion source; Al paste BSF formation; spray-on AR coating; electroless Ni plate, solder dip metallization; laser scribed edges; K & S tabbing and stringing machine; and laminated EVA modules.

  13. Development and fabrication of a solar cell junction processing system. Quarterly report No. 2, July 1980

    SciTech Connect

    Siesling, R.

    1980-07-01

    The basic objectives of the program are the following: (1) to design, develop, construct and deliver a junction processing system which will be capable of producing solar cell junctions by means of ion implantation followed by pulsed electron beam annealing; (2) to include in the system a wafer transport mechanism capable of transferring 4-inch-diameter wafers into and out of the vacuum chamber where the ion implantation and pulsed electron beam annealing processes take place; (3) to integrate, test and demonstrate the system prior to its delivery to JPL along with detailed operating and maintenance manuals; and (4) to estimate component lifetimes and costs, as necessary for the contract, for the performance of comprehensive analyses in accordance with the Solar Array Manufacturing Industry Costing Standards (SAMICS). Under this contract the automated junction formation equipment to be developed involves a new system design incorporating a modified, government-owned, JPL-controlled ion implanter into a Spire-developed pulsed electron beam annealer and wafer transport system. When modified, the ion implanter will deliver a 16 mA beam of 31P+ ions with a fluence of 2.5 x 10^15 ions per square centimeter at an energy of 10 keV. The throughput design goal for the junction processor is 10^7 four-inch-diameter wafers per year.
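    A short, hedged calculation shows the per-wafer implant time implied by these parameters, assuming the full 16 mA beam is delivered uniformly to a 4-inch wafer (an idealization that ignores beam scanning overhead and wafer transport time).

        import math

        Q_E = 1.602e-19                       # C per singly charged ion
        fluence = 2.5e15                      # ions / cm^2
        radius_cm = 10.16 / 2.0               # 4-inch wafer radius
        area_cm2 = math.pi * radius_cm ** 2   # ~81 cm^2
        beam_a = 16e-3                        # A

        charge_c = fluence * area_cm2 * Q_E   # total implanted charge per wafer
        print(round(charge_c / beam_a, 2), "s per wafer")   # on the order of a couple of seconds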

  14. Development of ink transfer monitoring system for roll-to-plate gravure offset printing process

    NASA Astrophysics Data System (ADS)

    Lee, Seung-Hyun; Lee, Taik-Min; Kim, Dong-Soo; Kim, Byoung Jae; Lee, Seungwoo

    2010-11-01

    The gravure offset printing process is very cost-effective for printed electronics, such as printed solar cells, printed batteries, printed TFTs, printed RFID tags and so on. In gravure offset printing, there are two kinds of ink transfer processes: the off and set processes. In the off process, an elastic blanket cylinder picks up the ink from a patterned plate or patterned cylinder. In the set process, ink on the elastic blanket cylinder is transferred onto the target substrate. These two ink transfer processes determine printing quality; therefore, understanding the ink transfer mechanism during the off and set processes is very important for controlling printing quality. In this study, we developed an ink transfer monitoring system for roll-to-plate gravure printing. We visualized ink transfer from the patterned plate to the rolling blanket cylinder (off process) and from the rolling blanket cylinder to the plate substrate (set process) by using a high-speed camera and a long range microscope. We investigated the effects of pattern size, printing speed, rotational effect of the blanket cylinder, contact angle and rheological properties of the ink to understand the gravure offset printing mechanism.

  15. Development of ultra-short pulse VUV laser system for nanoscale processing

    NASA Astrophysics Data System (ADS)

    Katto, Masahito; Zushi, Hironari; Nagaya, Wataru; Harano, Shinya; Matsumoto, Ryota; Yokotani, Atushi; Kaku, Masanori; Kubodera, Shoichi; Miyanaga, Noriaki

    2010-11-01

    We have developed intense vacuum ultraviolet (VUV) radiation sources for advanced material processing, such as photochemical surface reactions and precise processing on a nanometer scale. We have constructed a new VUV laser system to generate sub-picosecond pulses at a wavelength of 126 nm. A seed VUV pulse was generated in Xe as the 7th harmonic of an 882-nm Ti:sapphire laser. The optimum conversion was achieved at a pressure of 1.2 Torr. The seed pulse will be amplified by Ar2* media generated in an optical-field-induced ionization Ar plasma produced by the Ti:sapphire laser. We have obtained a gain coefficient of g = 0.16 cm^-1. The system under development will provide VUV ultra-short pulses with sub-microjoule energy at a repetition rate of 1 kHz.

  16. Applying Early Systems Engineering: Injecting Knowledge into the Capability Development Process

    DTIC Science & Technology

    2012-10-01

    The CDD provides several key benefits. First, it provides a framework upon which PMs can base their own plans, synchronizing the overall effort ... each KP event. The results of the KP should be summarized in a Memorandum for Record (MFR) stored in a location accessible to those who need to ... accomplished via an MFR summarizing ...

  17. Process System Development for Proportioning and Control of Compounded Dehydrated Components of Subsistence Items

    DTIC Science & Technology

    1962-01-01

    ... exposure to a moist atmosphere occurs, whether this be immediately after the food has been dried or at the time it is packed. ... Containers may be opened on the platform at the machine, and hand-pumped into the filler hoppers. The light-weight characteristic of dehydrated foods ... Industrial Preparedness Measures Study: Process System Development for Proportioning and Control of Compounded Dehydrated Components of Subsistence Items.

  18. Tritium processing for the European test blanket systems: current status of the design and development strategy

    SciTech Connect

    Ricapito, I.; Calderoni, P.; Poitevin, Y.; Aiello, A.; Utili, M.; Demange, D.

    2015-03-15

    Tritium processing technologies of the two European Test Blanket Systems (TBS), HCLL (Helium Cooled Lithium Lead) and HCPB (Helium Cooled Pebble Bed), play an essential role in meeting the main objectives of the TBS experimental campaign in ITER. Compliance with the ITER interface requirements, in terms of space availability, service fluids, limits on tritium release, and constraints on maintenance, is driving the design of the TBS tritium processing systems. Other requirements come from the characteristics of the relevant test blanket module and the scientific programme that has to be developed and implemented. This paper identifies the main requirements for the design of the TBS tritium systems and equipment and, at the same time, provides an updated overview of the current design status, mainly focusing on the tritium extractor from Pb-16Li and TBS tritium accountancy. Considerations are also given on the possible extrapolation to the DEMO breeding blanket. (authors)

  19. Image processing and analysis system for development and use of free flow electrophoresis chips.

    PubMed

    Kochmann, Sven; Krylov, Sergey N

    2017-01-17

    We present an image processing and analysis system to facilitate detailed performance analysis of free flow electrophoresis (FFE) chips. It consists of a cost-effective self-built imaging setup and a comprehensive customizable software suite. Both components were designed modularly to be accessible, adaptable, versatile, and automatable. The system provides tools for i) automated identification of chip features (e.g. separation zone and flow markers), ii) extraction and analysis of stream trajectories, and iii) evaluation of flow profiles and separation quality (e.g. determination of resolution). Equipped with these tools, the presented image processing and analysis system will enable faster development of FFE chips and applications. It will also serve as a robust detector for fluorescence-based analytical applications of FFE.
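    As an illustration of the separation-quality tools listed, the sketch below computes the resolution of two FFE streams from the centres and widths of Gaussian fits to their lateral profiles; the resolution definition Rs = |x2 - x1| / (2 (sigma1 + sigma2)) and the numbers are assumptions, not necessarily those used by the software suite.

        def resolution(x1, sigma1, x2, sigma2):
            # two Gaussian stream profiles: centres x1, x2 and standard deviations sigma1, sigma2
            return abs(x2 - x1) / (2.0 * (sigma1 + sigma2))

        # stream centres and widths in mm across the separation zone (hypothetical values)
        print(round(resolution(x1=4.2, sigma1=0.35, x2=6.1, sigma2=0.40), 2))   # ~1.27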

  20. Development of an automated processing and screening system for the space shuttle orbiter flight test data

    NASA Technical Reports Server (NTRS)

    Mccutchen, D. K.; Brose, J. F.; Palm, W. E.

    1982-01-01

    One nemesis of the structural dynamist is the tedious task of reviewing large quantities of data. This data, obtained from various types of instrumentation, may be represented by oscillogram records, root-mean-squared (rms) time histories, power spectral densities, shock spectra, 1/3 octave band analyses, and various statistical distributions. In an attempt to reduce the laborious task of manually reviewing all of the space shuttle orbiter wideband frequency-modulated (FM) analog data, an automated processing system was developed to perform the screening process based upon predefined or predicted threshold criteria.
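    The screening idea reduces to comparing each measurement's rms time history against a predefined threshold and retaining only the exceedances for review, as in the minimal sketch below; the channel identifiers, thresholds, and values are hypothetical.

        def screen(rms_histories, thresholds):
            flagged = {}
            for channel, series in rms_histories.items():
                limit = thresholds[channel]
                hits = [(t, v) for t, v in series if v > limit]   # time, rms pairs over the limit
                if hits:
                    flagged[channel] = hits
            return flagged

        rms = {"V08D9721A": [(0.0, 1.2), (0.5, 3.9), (1.0, 1.1)],
               "V08D9722A": [(0.0, 0.8), (0.5, 0.9), (1.0, 0.7)]}
        print(screen(rms, {"V08D9721A": 3.5, "V08D9722A": 3.5}))
        # only the 3.9 g-rms excursion on the first channel is flagged for manual review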

  1. Use of a continuous twin screw granulation and drying system during formulation development and process optimization.

    PubMed

    Vercruysse, J; Peeters, E; Fonteyne, M; Cappuyns, P; Delaet, U; Van Assche, I; De Beer, T; Remon, J P; Vervaet, C

    2015-01-01

    Since small scale is key for the successful introduction of continuous techniques in the pharmaceutical industry, to allow their use during formulation development and process optimization, it is essential to determine whether product quality is similar when small quantities of material are processed compared to the continuous processing of larger quantities. Therefore, the aim of this study was to investigate whether material processed in a single cell of the six-segmented fluid bed dryer of the ConsiGma™-25 system (a continuous twin screw granulation and drying system introduced by GEA Pharma Systems, Collette™, Wommelgem, Belgium) is predictive of granule and tablet quality during full-scale manufacturing when all drying cells are filled. Furthermore, the performance of the ConsiGma™-1 system (a mobile laboratory unit) was evaluated and compared to the ConsiGma™-25 system. A premix of two active ingredients, powdered cellulose, maize starch, pregelatinized starch and sodium starch glycolate was granulated with distilled water. After drying and milling (1000 μm, 800 rpm), granules were blended with magnesium stearate and compressed using a Modul™ P tablet press (tablet weight: 430 mg, main compression force: 12 kN). Single-cell experiments using the ConsiGma™-25 system and ConsiGma™-1 system were performed in triplicate. Additionally, a 1 h continuous run using the ConsiGma™-25 system was executed. Process outcomes (torque, barrel wall temperature, product temperature during drying) and granule (residual moisture content, particle size distribution, bulk and tapped density, Hausner ratio, friability) as well as tablet (hardness, friability, disintegration time and dissolution) quality attributes were evaluated. By performing a 1 h continuous run, it was detected that a stabilization period was needed for torque and barrel wall temperature due to initial layering of the screws and the screw chamber walls with material. Consequently, slightly deviating

  2. Microarthroscopy System With Image Processing Technology Developed for Minimally Invasive Surgery

    NASA Technical Reports Server (NTRS)

    Steele, Gynelle C.

    2001-01-01

    In a joint effort, NASA, Micro Medical Devices, and the Cleveland Clinic have developed a microarthroscopy system with digital image processing. This system consists of a disposable endoscope the size of a needle that is aimed at expanding the use of minimally invasive surgery on the knee, ankle, and other small joints. This device not only allows surgeons to make smaller incisions (by improving the clarity and brightness of images), but it gives them a better view of the injured area to make more accurate diagnoses. Because of its small size, the endoscope helps reduce physical trauma and speeds patient recovery. The faster recovery rate also makes the system cost effective for patients. The digital image processing software used with the device was originally developed by the NASA Glenn Research Center to conduct computer simulations of satellite positioning in space. It was later modified to reflect lessons learned in enhancing photographic images in support of the Center's microgravity program. Glenn's Photovoltaic Branch and Graphics and Visualization Lab (G-VIS) computer programmers and software developers enhanced and sped up the graphic imaging for this application. Mary Vickerman at Glenn developed algorithms that enabled Micro Medical Devices to eliminate interference and improve the images.

  4. Development of the lateral line canal system through a bone remodeling process in zebrafish.

    PubMed

    Wada, Hironori; Iwasaki, Miki; Kawakami, Koichi

    2014-08-01

    The lateral line system of teleost fish is composed of mechanosensory receptors (neuromasts), comprising superficial receptors and others embedded in canals running under the skin. Canal diameter and size of the canal neuromasts are correlated with increasing body size, thus providing a very simple system to investigate mechanisms underlying the coordination between organ growth and body size. Here, we examine the development of the trunk lateral line canal system in zebrafish. We demonstrated that trunk canals originate from scales through a bone remodeling process, which we suggest is essential for the normal growth of canals and canal neuromasts. Moreover, we found that lateral line cells are required for the formation of canals, suggesting the existence of mutual interactions between the sensory system and surrounding connective tissues. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Integrated Process Model Development and Systems Analyses for the LIFE Power Plant

    SciTech Connect

    Meier, W R; Anklam, T; Abbott, R; Erlandson, A; Halsey, W; Miles, R; Simon, A J

    2009-07-15

    We have developed an integrated process model (IPM) for a Laser Inertial Fusion-Fission Energy (LIFE) power plant. The model includes cost and performance algorithms for the major subsystems of the plant, including the laser, fusion target fabrication and injection, fusion-fission chamber (including the tritium and fission fuel blankets), heat transfer and power conversion systems, and other balance of plant systems. The model has been developed in Visual Basic with an Excel spreadsheet user interface in order to allow experts in various aspects of the design to easily integrate their individual modules and provide a convenient, widely accessible platform for conducting the system studies. Subsystem modules vary in level of complexity; some are based on top-down scaling from fission power plant costs (for example, electric plant equipment), while others are bottom-up models based on conceptual designs being developed by LLNL (for example, the fusion-fission chamber and laser systems). The IPM is being used to evaluate design trade-offs, do design optimization, and conduct sensitivity analyses to identify high-leverage areas for R&D. We describe key aspects of the IPM and report on the results of our systems analyses. Designs are compared and evaluated as a function of key design variables such as fusion target yield and pulse repetition rate.
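    A minimal sketch of the kind of top-down cost scaling such subsystem modules use is shown below; the reference cost, reference capacity and scaling exponent are illustrative assumptions, not values from the LIFE model, and the actual IPM is implemented in Visual Basic behind an Excel interface rather than in Python.

      # Illustrative top-down scaling of a balance-of-plant subsystem cost from a
      # fission-plant reference point (all numbers are assumed, not LIFE values).
      def scaled_subsystem_cost(capacity_mwe, ref_capacity_mwe=1000.0,
                                ref_cost_musd=300.0, exponent=0.8):
          """Scale a reference subsystem cost to a new plant capacity."""
          return ref_cost_musd * (capacity_mwe / ref_capacity_mwe) ** exponent

      print(scaled_subsystem_cost(1500.0))  # rough estimate for a 1500 MWe plant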

  6. Development of a safeguards data acquisition system for the process monitoring of a simulated reprocessing facility

    SciTech Connect

    Wachter, J.W.

    1986-01-01

    As part of the Consolidated Fuel Reprocessing Program of the Fuel Recycle Division at the Oak Ridge National Laboratory (ORNL), an Integrated Process Demonstration (IPD) facility has been constructed for development of reprocessing plant technology. Through the use of cold materials, the IPD facility provides for the integrated operation of the major equipment items of the chemical-processing portion of a nuclear fuel reprocessing plant. The equipment, processes, and the extensive use of computers in data acquisition and control are prototypical of future reprocessing facilities and provide a unique test-bed for nuclear safeguards demonstrations. The data acquisition and control system consists of several microprocessors that communicate with one another and with a host minicomputer over a common data highway. At intervals of a few minutes, a ''snapshot'' is taken of the process variables, and the data are transmitted to a safeguards computer and minicomputer work station for analysis. This paper describes this data acquisition system and the data-handling procedures leading to microscopic process monitoring for safeguards purposes.
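    The snapshot scheme described above can be pictured as a simple polling loop; the variable names, the five-minute interval and the transport callback in the sketch below are assumptions made for illustration, not details of the ORNL system.

      import time

      def read_process_variables():
          """Placeholder for reading process variables from the common data highway."""
          return {"tank_level_L": 1250.4, "flow_rate_Lph": 83.1, "density_g_ml": 1.21}

      def snapshot_loop(send_to_safeguards, interval_s=300):
          """Every few minutes take a 'snapshot' of the process variables and
          forward it to the safeguards computer for analysis."""
          while True:
              snapshot = {"timestamp": time.time(), **read_process_variables()}
              send_to_safeguards(snapshot)
              time.sleep(interval_s)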

  7. Development of visualization tools and data processing for the PRISM earth system model

    NASA Astrophysics Data System (ADS)

    de Martino, G.; Prism Work Package 4a Visualization Group

    2003-04-01

    The PRISM project includes development of a set of visualization and processing tools for use by earth system scientists. A list of requirements has been formulated, based upon information provided by the PRISM community. After having conducted a review of the requirements and of the software packages available, the team is ready to begin development of two visualization systems: a web-enabled system designed for monitoring and quality controlling model runs as they are running (Low-End graphics), and another system for high quality analysis of data which includes the ability to do 3-D plots, animations etc. with the option of controlling plot generation through scripts or using graphical interfaces (High-End graphics). Both Low-End and High-End graphics tools will use netCDF-CF metadata, the chosen PRISM System standard type of data. This poster is intended to be a showcase for our current ideas and early plans. We wish to invite comments from the wide community of earth system modellers about what functionalities would be most useful.

  8. Dealing with the Archetypes Development Process for a Regional EHR System

    PubMed Central

    Santos, M.R.; Bax, M.P.; Kalra, D.

    2012-01-01

    Objectives This paper aims to present the archetype modelling process used for the Health Department of Minas Gerais State, Brazil (SES/MG), to support building its regional EHR system, and the lessons learned during this process. Methods This study was undertaken within the Minas Gerais project. The EHR system architecture was built assuming the reference model from the ISO 13606 norm. The whole archetype development process took about ten months, coordinated by a clinical team of three health professionals and one systems analyst from the SES/MG. They were supported by around 30 health professionals from the internal SES/MG areas, and 5 systems analysts from the PRODEMGE. Based on a bottom-up approach, the project team used technical interviews and brainstorming sessions to conduct the modelling process. Results The main steps of the archetype modelling process were identified and described, and 20 archetypes were created. Lessons learned:
    – The set of principles established during the selection of PCS elements helped the clinical team to keep the focus on their objectives;
    – The initial focus on the archetype structural organization aspects was important;
    – The data elements identified were subjected to a rigorous analysis aimed at determining the most suitable clinical domain;
    – Levelling the concepts to accommodate them within the hierarchical levels in the reference model was definitely no easy task, and the use of a mind mapping tool facilitated the modelling process;
    – Part of the difficulty experienced by the clinical team was related to a view focused on the original forms previously used;
    – The use of worksheets facilitated the modelling process by health professionals;
    – It was important to have a health professional who knew about the domain tables and health classifications from the Brazilian Federal Government as a member of the clinical team.
    Conclusion The archetypes (referencing terminology, domain tables and term lists) provided a

  9. Intelligent process development of foam molding for the Thermal Protection System (TPS) of the space shuttle external tank

    NASA Technical Reports Server (NTRS)

    Bharwani, S. S.; Walls, J. T.; Jackson, M. E.

    1987-01-01

    A knowledge-based system to assist process engineers in evaluating the processability and moldability of poly-isocyanurate (PIR) formulations for the thermal protection system of the Space Shuttle external tank (ET) is discussed. The Reaction Injection Molding-Process Development Advisor (RIM-PDA) is a coupled system that takes advantage of both symbolic and numeric processing techniques. This system will aid the process engineer in identifying a startup set of mold schedules and in refining those schedules to remedy specific process problems diagnosed by the system.

  10. Flow cytometry as a useful tool for process development: rapid evaluation of expression systems.

    PubMed

    Patkar, Anant; Vijayasankaran, Natarajan; Urry, Dan W; Srienc, Friedrich

    2002-02-28

    Flow cytometry is an established tool in fundamental studies of single-cell microbial physiology. Here we show that it can also provide valuable information for process development. Using recombinant Escherichia coli strains, which express the protein-based polymer (GVGIP)(260)GVGVP, the utility of flow cytometry in monitoring and optimization of fermentations is demonstrated. Single cell right angle light scatter was found to be significantly affected by intracellular product formation possibly due to the formation of inclusion bodies. Translational fusions with green fluorescent protein (GFP) enabled monitoring of product accumulation, as well as plasmid free cell fraction (PFCF). Such fusions also allowed rapid evaluation of induction strategies and three different expression systems based on the T7 promoter, T7-lac promoter and the P(BAD) promoter. The expression system based on the P(BAD) promoter was found to be superior to the T7-based system.

  11. The Development of Two Science Investigator-led Processing Systems (SIPS) for NASA's Earth Observation System (EOS)

    NASA Technical Reports Server (NTRS)

    Tilmes, Curt

    2004-01-01

    In 2001, NASA Goddard Space Flight Center's Laboratory for Terrestrial Physics started the construction of a science Investigator-led Processing System (SIPS) for processing data from the Ozone Monitoring Instrument (OMI) which will launch on the Aura platform in mid 2004. The Ozone Monitoring Instrument (OMI) is a contribution of the Netherlands Agency for Aerospace Programs (NIVR) in collaboration with the Finnish Meteorological Institute (FMI) to the Earth Observing System (EOS) Aura mission. It will continue the Total Ozone Mapping Spectrometer (TOMS) record for total ozone and other atmospheric parameters related to ozone chemistry and climate. OMI measurements will be highly synergistic with the other instruments on the EOS Aura platform. The LTP previously developed the Moderate Resolution Imaging Spectroradiometer (MODIS) Data Processing System (MODAPS), which has been in full operations since the launches of the Terra and Aqua spacecraft in December 1999 and May 2002, respectively. During that time, it has continually evolved to better support the needs of the MODIS team. We now run multiple instances of the system managing faster than real time reprocessings of the data as well as continuing forward processing. The new OMI Data Processing System (OMIDAPS) was adapted from the MODAPS. It will ingest raw data from the satellite ground station and process it to produce calibrated, geolocated higher level data products. These data products will be transmitted to the Goddard Distributed Active Archive Center (GDAAC) instance of the Earth Observing System (EOS) Data and Information System (EOSDIS) for long term archive and distribution to the public. The OMIDAPS will also provide data distribution to the OMI Science Team for quality assessment, algorithm improvement, calibration, etc. We have taken advantage of lessons learned from the MODIS experience and software already developed for MODIS. We made some changes in the hardware system organization, database and

  13. Development of an image processing system in splendid squid quality classification

    NASA Astrophysics Data System (ADS)

    Masunee, Niyada; Chaiprapat, Supapan; Waiyagan, Kriangkrai

    2013-07-01

    Agricultural products typically exhibit high variance in quality characteristics. To assure customer satisfaction and control manufacturing productivity, quality classification is necessary to screen off defective items and to grade the products. This article presents an application of image processing techniques to squid grading and defect discrimination. A preliminary study indicated that surface color was an efficient determinant to justify the quality of splendid squids. In this study, a computer vision system (CVS) was developed to examine the characteristics of splendid squids. Using image processing techniques, squids could be classified into three different quality grades in accordance with an industry standard. The developed system first sifted through squid images to reject ones with black marks. Qualified squids were graded on the proportion of white, pink, and red regions appearing on their bodies by using fuzzy logic. The system was evaluated on 100 images of squids at different quality levels. It was found that the accuracy obtained by the proposed technique was 95% compared with the sensory evaluation of an expert.
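    As a rough illustration of the grading step described above, the sketch below assigns each pixel of a squid image to the nearest of three assumed reference colors and grades the squid from the resulting proportions; the reference colors and crisp cutoffs are illustrative assumptions standing in for the paper's calibrated fuzzy-logic rules.

      import numpy as np

      # Assumed RGB reference colors for the three body-surface classes.
      REFERENCE = {"white": (230, 230, 230), "pink": (240, 170, 180), "red": (190, 60, 60)}

      def color_proportions(rgb_image):
          """Assign each pixel to the nearest reference color and return the
          proportion of white, pink and red regions over the image."""
          pixels = rgb_image.reshape(-1, 3).astype(float)
          names = list(REFERENCE)
          refs = np.array([REFERENCE[n] for n in names], dtype=float)
          nearest = np.argmin(((pixels[:, None, :] - refs[None, :, :]) ** 2).sum(-1), axis=1)
          counts = np.bincount(nearest, minlength=len(names))
          return dict(zip(names, counts / counts.sum()))

      def grade(proportions, white_cutoff=0.6, red_cutoff=0.3):
          # Crisp thresholds stand in here for the paper's fuzzy-logic rules.
          if proportions["white"] >= white_cutoff:
              return "Grade A"
          if proportions["red"] >= red_cutoff:
              return "Grade C"
          return "Grade B"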

  14. Flat-plate solar array project: Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The engineering design, fabrication, assembly, operation, economic analysis, and process support research and development for an Experimental Process System Development Unit for producing semiconductor-grade silicon using the silane-to-silicon process are reported. The design activity was completed. About 95% of purchased equipment was received. The draft of the operations manual was about 50% complete and the design of the free-space system continued. The system using silicon powder transfer, melting, and shotting on a pseudocontinuous basis was demonstrated.

  15. Design and development of an in-line sputtering system and process development of thin film multilayer neutron supermirrors

    SciTech Connect

    Biswas, A.; Sampathkumar, R.; Kumar, Ajaya; Bhattacharyya, D.; Sahoo, N. K.; Lagoo, K. D.; Veerapur, R. D.; Padmanabhan, M.; Puri, R. K.; Bhattacharya, Debarati; Singh, Surendra; Basu, S.

    2014-12-15

    Neutron supermirrors and supermirror polarizers are thin film multilayer based devices which are used for reflecting and polarizing neutrons in various neutron based experiments. In the present communication, the in-house development of a 9 m long in-line dc sputtering system has been described which is suitable for deposition of neutron supermirrors on large size (1500 mm × 150 mm) substrates and in large numbers. The optimisation process for deposition of Co and Ti thin films, Co/Ti periodic multilayers, and a-periodic supermirrors has also been described. The system has been used to deposit thin film multilayer supermirror polarizers which show high reflectivity up to a reasonably large critical wavevector transfer of ∼0.06 Å⁻¹ (corresponding to m = 2.5, i.e., 2.5 times the critical wavevector transfer of natural Ni). The computer code for designing these supermirrors has also been developed in-house.

  16. Development of an anthropogenic emissions processing system for Asia using SMOKE

    NASA Astrophysics Data System (ADS)

    Woo, Jung-Hun; Choi, Ki-Chul; Kim, Hyeon Kook; Baek, Bok H.; Jang, Meongdo; Eum, Jeong-Hee; Song, Chul Han; Ma, Young-Il; Sunwoo, Young; Chang, Lim-Seok; Yoo, Seung Heon

    2012-10-01

    Air quality modeling is a useful methodology to investigate air quality degradation in various locations and to analyze effectiveness of emission reduction plans. A comprehensive air quality model usually requires a coordinated set of emissions input of all necessary chemical species. We have developed an anthropogenic emissions processing system for Asia in support of air quality modeling and analysis over Asia (named SMOKE-Asia). The SMOKE (Sparse Matrix Operator kernel Emissions) system, which was developed by U.S. EPA and has been maintained by the Carolina Environmental Program (CEP) of the University of North Carolina, was used to develop our emissions processing system. A merged version of INTEX 2006 and TRACE-P 2000 inventories was used as an initial Asian emissions inventory. The IDA (Inventory Data Analyzer) format was used to create SMOKE-ready emissions. Source Classification Codes (SCCs) and country/state/county (FIPS) code, which are the two key data fields of SMOKE IDA data structure, were created for Asia. The 38 SCCs and 2752 FIPS codes were allocated to our SMOKE-ready emissions for more comprehensive processing. US EPA's MIMS (Multimedia Integrated Modeling System) Spatial Allocator software, along with many global and regional GIS shapes, were used to create spatial allocation profiles for Asia. Temporal allocation and chemical speciation profiles were partly regionalized using Asia-based studies. Initial data production using the developed SMOKE-Asia system was successfully performed. NOx and VOC emissions for the year 2009 were projected to be increased by 50% from those of 1997. The emission hotspots, such as large cities and large point sources, are distinguished in the domain due to spatial allocation. Regional emission peaks were distinguished due to temporally resolved emission information. The PAR (Paraffin carbon bond) and XYL (Xylene and other polyalkyl aromatics) showed the first and second largest emission rate among VOC species

  17. Development and Application of a Process-based River System Model at a Continental Scale

    NASA Astrophysics Data System (ADS)

    Kim, S. S. H.; Dutta, D.; Vaze, J.; Hughes, J. D.; Yang, A.; Teng, J.

    2014-12-01

    Existing global and continental scale river models, mainly designed for integration with global climate models, are of very coarse spatial resolution and lack many important hydrological processes, such as overbank flow, irrigation diversion and groundwater seepage/recharge, which operate at a much finer resolution. Thus, these models are not suitable for producing streamflow forecasts at fine spatial resolution and water accounts at sub-catchment levels, which are important for water resources planning and management at regional and national scale. A large-scale river system model has been developed and implemented for water accounting in Australia as part of the Water Information Research and Development Alliance between Australia's Bureau of Meteorology (BoM) and CSIRO. The model, developed using a node-link architecture, includes all major hydrological processes, anthropogenic water utilisation and storage routing that influence the streamflow in both regulated and unregulated river systems. It includes an irrigation model to compute water diversion for irrigation use and associated fluxes and stores, and a storage-based floodplain inundation model to compute overbank flow from river to floodplain and associated floodplain fluxes and stores. An auto-calibration tool has been built within the modelling system to automatically calibrate the model in large river systems using the Shuffled Complex Evolution optimiser and user-defined objective functions. The auto-calibration tool makes the model computationally efficient and practical for large basin applications. The model has been implemented in several large basins in Australia including the Murray-Darling Basin, covering more than 2 million km2. The results of calibration and validation of the model show highly satisfactory performance. The model has been operationalised in the BoM for producing various fluxes and stores for national water accounting. This paper introduces this newly developed river system model
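    The auto-calibration step can be pictured as searching the parameter space to maximise a user-defined objective such as the Nash-Sutcliffe efficiency. The sketch below illustrates that idea under stated assumptions: run_model is a placeholder for the node-link river model, and scipy's differential evolution stands in for the Shuffled Complex Evolution optimiser actually used.

      import numpy as np
      from scipy.optimize import differential_evolution  # stand-in for the SCE optimiser

      def nse(observed, simulated):
          """Nash-Sutcliffe efficiency, a typical user-defined streamflow objective."""
          observed, simulated = np.asarray(observed), np.asarray(simulated)
          return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

      def calibrate(run_model, observed_flow, bounds):
          """Search parameter space for the set that maximises NSE.
          run_model(params) is assumed to return a simulated streamflow series."""
          result = differential_evolution(lambda p: -nse(observed_flow, run_model(p)), bounds)
          return result.x, -result.fun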

  18. Real-time multiuser image processing system for research and development

    NASA Astrophysics Data System (ADS)

    Andersson, Ingmar A.

    1993-01-01

    This is a description of the SAAB Missiles image processing laboratory. At present, four people can work simultaneously with advanced image processing, and the system is easy to expand. Many functions can be realized in real time since the laboratory comprises 21 image processing boards. It is the combination of extremely fast image processing and the multi-user capability that makes this system unique.

  19. The development of a zeolite system for upgrade of the Process Waste Treatment Plant

    SciTech Connect

    Robinson, S.M.; Kent, T.E.; Arnold, W.D.; Parrott, J.R. Jr.

    1993-10-01

    Studies have been undertaken to design an efficient zeolite ion exchange system for use at the ORNL Process Waste Treatment Plant to remove cesium and strontium to meet discharge limits. This report focuses on two areas: (1) design of column hardware and pretreatment steps needed to eliminate column plugging and channeling, and (2) development of equilibrium models for the wastewater system. Results indicate that zeolite columns do not plug as quickly when the wastewater equalization is performed in the new Bethel Valley Storage Tanks instead of the former equalization basin, where the suspended solids concentration is high. A down-flow column with spent zeolite was used successfully as a prefilter to prevent plugging of the zeolite columns being used to remove strontium and cesium. Equilibrium studies indicate that a Langmuir isotherm models binary zeolite equilibrium data while the modified Dubinin-Polanyi model predicts multicomponent data.
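    For reference, a common competitive (binary) Langmuir form of the kind such equilibrium fits use is shown below; here q_i is the loading of ion i on the zeolite, C_i its solution concentration, and q_max,i and K_i the fitted capacity and affinity constants (the notation is illustrative, not taken from the report).

      % Competitive (binary) Langmuir isotherm for ions i = 1, 2
      q_i = \frac{q_{\max,i}\, K_i C_i}{1 + K_1 C_1 + K_2 C_2}, \qquad i = 1, 2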

  20. Development of Three-Layer Simulation Model for Freezing Process of Food Solution Systems

    NASA Astrophysics Data System (ADS)

    Kaminishi, Koji; Araki, Tetsuya; Shirakashi, Ryo; Ueno, Shigeaki; Sagara, Yasuyuki

    A numerical model has been developed for simulating freezing phenomena of food solution systems. The cell model was simplified for application to food solution systems, incorporating the existence of three parts: an unfrozen layer, a frozen layer and a moving boundary layer. Moreover, a model for the moving rate of the freezing front was also introduced and calculated by using the variable space network method proposed by Murray and Landis (1957). To demonstrate the validity of the model, it was applied to the freezing processes of coffee solutions. Since the model required the phase diagram of the material to be frozen, the initial freezing temperatures of 1-55 % coffee solutions were measured by the DSC method. The effective thermal conductivity for coffee solutions was determined as a function of temperature and solute concentration by using the Maxwell-Eucken model. The one-dimensional freezing process of a 10 % coffee solution was simulated based on its phase diagram and thermo-physical properties. The results were in good agreement with the experimental data and showed that the model could accurately describe the change in the location of the freezing front and the distributions of temperature as well as ice fraction during a freezing process.
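    For orientation, the Maxwell-Eucken relation mentioned above is commonly written as below, where k_c and k_d are the thermal conductivities of the continuous and dispersed phases and v_d is the volume fraction of the dispersed phase (symbols chosen here for illustration; the paper's own notation may differ).

      % Maxwell-Eucken effective thermal conductivity of a two-phase mixture
      k_{\mathrm{eff}} = k_c \,\frac{2 k_c + k_d - 2 (k_c - k_d)\, v_d}{2 k_c + k_d + (k_c - k_d)\, v_d}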

  1. Developing and implementing a standardized process for global trigger tool application across a large health system.

    PubMed

    Garrett, Paul R; Sammer, Christine; Nelson, Antoinette; Paisley, Kathleen A; Jones, Cason; Shapiro, Eve; Tonkel, Jackie; Housman, Michael

    2013-07-01

    To complement voluntary adverse event reporting, which may detect only specific categories of harms and may represent merely a fraction of actual adverse events, the Adventist Health System (AHS) began using the Institute for Healthcare Improvement (IHI) Global Trigger Tool (GTT) to more accurately gauge the number, types, and severity levels of adverse events and developed a centralized process to do so uniformly. AHS began using the GTT in 2009 in 25 of its 42 hospitals that used a common electronic medical record (EMR). The common EMR and centralized record review enables AHS to apply the GTT uniformly and provides consistency of data collected. AHS sends quarterly reports to participating facilities to communicate findings and provides case studies illustrating the most egregious harms. Case study recipients are encouraged to further examine patient records, explore events leading to harm, and share the information with process/quality improvement committees, medical executive committees, and boards of directors to identify opportunities for quality improvement. AHS staffing and record review processes have evolved since 2009. A GTT review of 17,295 patient records indicated that adverse events clustered as medication-related glycemic events; medication-related delirium, confusion, or oversedation related to analgesics, sedatives, and muscle relaxants; pressure ulcers; medication-related bleeding; and medication-related skin/mucosal reaction/itching. The AHS process demonstrates how a large health system uses the GTT to detect harms. Since 2009 AHS has improved and streamlined its reporting, data entry and review processes. AHS used major harms findings to initiate systemwide collaborative improvement projects for glycemic management and pressure ulcers.

  2. Image retrieval and processing system version 2.0 development work

    NASA Technical Reports Server (NTRS)

    Slavney, Susan H.; Guinness, Edward A.

    1991-01-01

    The Image Retrieval and Processing System (IRPS) is a software package developed at Washington University and used by the NASA Regional Planetary Image Facilities (RPIF's). The IRPS combines data base management and image processing components to allow the user to examine catalogs of image data, locate the data of interest, and perform radiometric and geometric calibration of the data in preparation for analysis. Version 1.0 of IRPS was completed in Aug. 1989 and was installed at several RPIF's. Other RPIF's use remote logins via NASA Science Internet to access IRPS at Washington University. Work was begun on designing and populating a catalog of Magellan image products that will be part of IRPS Version 2.0, planned for release by the end of calendar year 1991. With this catalog, a user will be able to search by orbit and by location for Magellan Basic Image Data Records (BIDR's), Mosaicked Image Data Records (MIDR's), and Altimetry-Radiometry Composite Data Records (ARCDR's). The catalog will include the Magellan CD-ROM volume, directory, and file name for each data product. The image processing component of IRPS is based on the Planetary Image Cartography Software (PICS) developed by the U.S. Geological Survey, Flagstaff, Arizona. To augment PICS capabilities, a set of image processing programs compatible with PICS-format images was developed. This software includes general-purpose functions that PICS does not have, analysis and utility programs for specific data sets, and programs from other sources that were modified to work with PICS images. Some of the software will be integrated into the Version 2.0 release of IRPS. A table is presented that lists the programs with a brief functional description of each.

  3. Barotropic processes associated with the development of the Mei-yu precipitation system

    NASA Astrophysics Data System (ADS)

    Li, Tingting; Li, Xiaofan

    2016-05-01

    The barotropic processes associated with the development of a precipitation system are investigated through analysis of cloud-resolving model simulations of Mei-yu torrential rainfall events over eastern China in mid-June 2011. During the model integration period, there were three major heavy rainfall events: 9-12, 13-16 and 16-20 June. The kinetic energy is converted from perturbation to mean circulations in the first and second period, whereas it is converted from mean to perturbation circulations in the third period. Further analysis shows that kinetic energy conversion is determined by vertical transport of zonal momentum. Thus, the prognostic equation of vertical transport of zonal momentum is derived, in which its tendency is associated with dynamic, pressure gradient and buoyancy processes. The kinetic energy conversion from perturbation to mean circulations in the first period is mainly associated with the dynamic processes. The kinetic energy conversion from mean to perturbation circulations in the third period is generally related to the pressure gradient processes.

  4. Software Technology for Adaptable, Reliable Systems (STARS) Program. The Cleanroom Engineering Software Development Process

    DTIC Science & Technology

    1991-02-28

    Describes what is required for performing a Cleanroom Engineering effort from the standpoint of specifiers, developers, certifiers, and managers. The manual was developed... Keywords: Process, Process Management, Defined Process, Cleanroom, Software Engineering.

  5. Image Processing System

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Mallinckrodt Institute of Radiology (MIR) is using a digital image processing system which employs NASA-developed technology. MIR's computer system is the largest radiology system in the world. It is used in diagnostic imaging. Blood vessels are injected with x-ray dye, and the images which are produced indicate whether arteries are hardened or blocked. A computer program developed by Jet Propulsion Laboratory known as Mini-VICAR/IBIS was supplied to MIR by COSMIC. The program provides the basis for developing the computer imaging routines for data processing, contrast enhancement and picture display.

  6. Development and validation of a notational system to study the offensive process in football.

    PubMed

    Sarmento, Hugo; Anguera, Teresa; Campaniço, Jorge; Leitão, José

    2010-01-01

    The most striking change within football development is the application of science to its problems and in particular the use of increasingly sophisticated technology that, supported by scientific data, allows us to establish a "code of reading" for the reality of the game. This study therefore describes the process of development and validation of an ad hoc system of categorization, which allows the different methods of offensive play in football and their interactions to be analyzed. Through an exploratory phase of the study, we identified 10 vertebrate criteria and the respective behaviors observed for each of these criteria. A panel of five experts was consulted for content validation. The resulting instrument is characterized by a combination of field formats and systems of categories. The reliability of the instrument was calculated by intraobserver agreement, and values above 0.95 for all criteria were achieved. Two FC Barcelona games were coded and analyzed, which allowed the detection of various T-patterns. The results show that the instrument serves the purpose for which it was developed and can provide important information for the understanding of game interaction in football.

  7. A Plan to Develop a Red Tide Warning System for Seawater Desalination Process Management

    NASA Astrophysics Data System (ADS)

    Kim, Tae Woo; Yun, Hong Sik

    2017-04-01

    The halt of the seawater desalination process for fifty-five days, due to the eight-month-long red tide of 2008 in the Persian Gulf in the Middle East, caused a loss of about 10 billion KRW. The POSCO Seawater Desalination facility, located in the Gwangyang Bay Area in the Southern Sea, has produced 30,000 tons of fresh water per day since 2014. Since there was a red tide incident in the area lasting three months from August 2012, it is necessary to establish a warning system for red tide that threatens the stable operation of the seawater desalination facility. A red tide warning system can offer the seawater desalination facility manager customized services on red tide information and potential red tide inflow to the water intake. This study aimed to develop a red tide warning system in the Gwangyang Bay Area by combining RS, modeling and monitoring technologies, providing red tide forecasting information with which to effectively control the seawater desalination process. Using the proposed system, the seawater desalination facility manager can take phased measures to cope with the inflow of red tide. ACKNOWLEDGMENTS This research was supported by a grant (16IFIP-C088924-03) from the Industrial Facilities & Infrastructure Research Program funded by the Ministry of Land, Infrastructure and Transport (MOLIT) of the Korea government and the Korea Agency for Infrastructure Technology Advancement (KAIA). This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (NRF-2014R1A1A2054975).

  8. System design development for microwave and millimeter-wave materials processing

    NASA Astrophysics Data System (ADS)

    Feher, Lambert; Thumm, Manfred

    2002-06-01

    The most notable effect in processing dielectrics with micro- and millimeter-waves is volumetric heating of these materials, offering the opportunity of very high heating rates for the samples. In comparison to conventional heating, where the heat transfer is diffusive and depends on the thermal conductivity of the material, the microwave field penetrates the sample and acts as an instantaneous heat source at each point of the sample. Owing to this unique property, microwave heating at the 2.45 GHz and 915 MHz ISM (Industrial, Scientific, Medical) frequencies has been established as an important industrial technology for more than 50 years. Successful applications of microwaves in industry have been reported, e.g., in food processing systems, domestic ovens, the rubber industry, vacuum drying, etc. The present paper outlines microwave system development at Forschungszentrum Karlsruhe, IHM, transferring properties from the higher frequency regime (millimeter-waves) to lower frequency applications. In any case, the need for using higher frequencies such as 24 GHz (an ISM frequency) for industrial applications has to be carefully verified with respect to specific physical/engineering advantages or to the limits that standard microwave technology meets for the specific problem.

  9. Image processing, geometric modeling and data management for development of a virtual bone surgery system.

    PubMed

    Niu, Qiang; Chi, Xiaoyi; Leu, Ming C; Ochoa, Jorge

    2008-01-01

    This paper describes image processing, geometric modeling and data management techniques for the development of a virtual bone surgery system. Image segmentation is used to divide CT scan data into different segments representing various regions of the bone. A region-growing algorithm is used to extract cortical bone and trabecular bone structures systematically and efficiently. Volume modeling is then used to represent the bone geometry based on the CT scan data. Material removal simulation is achieved by continuously performing Boolean subtraction of the surgical tool model from the bone model. A quadtree-based adaptive subdivision technique is developed to handle the large set of data in order to achieve the real-time simulation and visualization required for virtual bone surgery. A Marching Cubes algorithm is used to generate polygonal faces from the volumetric data. Rendering of the generated polygons is performed with the publicly available VTK (Visualization Tool Kit) software. Implementation of the developed techniques consists of developing a virtual bone-drilling software program, which allows the user to manipulate a virtual drill to make holes with the use of a PHANToM device on a bone model derived from real CT scan data.
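    As a sketch of the surface-extraction step described above, the snippet below runs VTK's Marching Cubes filter on a CT volume and wraps the result for rendering; the iso-value and the assumption that the volume is already loaded as a vtkImageData object are illustrative, not details taken from the paper.

      import vtk

      def extract_bone_surface(ct_volume, iso_value=400.0):
          """Extract a polygonal bone surface from a vtkImageData CT volume with
          Marching Cubes; iso_value is an assumed intensity threshold."""
          mc = vtk.vtkMarchingCubes()
          mc.SetInputData(ct_volume)          # volumetric data from the CT scan
          mc.SetValue(0, iso_value)           # iso-surface at the chosen intensity
          mapper = vtk.vtkPolyDataMapper()
          mapper.SetInputConnection(mc.GetOutputPort())
          actor = vtk.vtkActor()
          actor.SetMapper(mapper)
          return actor                        # add to a vtkRenderer to display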

  10. The Naval Enlisted Professional Development Information System (NEPDIS): Front End Analysis (FEA) Process. Technical Report 159.

    ERIC Educational Resources Information Center

    Aagard, James A.; Ansbro, Thomas M.

    The Naval Enlisted Professional Development Information System (NEPDIS) was designed to function as a fully computerized information assembly and analysis system to support labor force, personnel, and training management. The NEPDIS comprises separate training development, instructional, training record and evaluation, career development, and…

  11. Development and Evaluation of a Thai Learning System on the Web Using Natural Language Processing.

    ERIC Educational Resources Information Center

    Dansuwan, Suyada; Nishina, Kikuko; Akahori, Kanji; Shimizu, Yasutaka

    2001-01-01

    Describes the Thai Learning System, which is designed to help learners acquire the Thai word order system. The system facilitates the lessons on the Web using HyperText Markup Language and Perl programming, which interfaces with natural language processing by means of Prolog. (Author/VWL)

  13. BIOGAS Process development

    SciTech Connect

    Ghosh, S.; Mensinger, M.C.; Sajjad, A.; Henry, M.P.

    1984-01-01

    The overall objective of the program is to demonstrate and commercialize the IGT two-phase BIOGAS Process for optimized methane production from, and simultaneous stabilization of, municipal solid waste (MSW). The specific objective of the current program is to conduct a laboratory-scale investigation of simple, cost-effective feed pretreatment techniques and selected digestion reactor designs to optimize methane production from MSW-sludge blends, and to select the best pretreatment and digestion conditions for testing during the subsequent program for process development unit (PDU) operation. A significant portion of the program efforts to date has been directed at evaluating and/or developing feeding, mixing and discharging systems for handling high concentration, large particle size RDF slurries for anaerobic digestion processes. The performance of such processes depends significantly on the operational success of these subsystems. The results of the subsystem testing have been implemented in the design and operation of the 10-L, 20-L, and 125-L digesters. These results will also be utilized to design the CSTR and the upflow digesters of a large two-phase system. Data collected during the initial phase of this research showed in general that methane production from RDF decreased as the loading rate was increased. Thermophilic digestion did not appear to be significantly better than mesophilic digestion. 9 figures, 3 tables.

  14. Attitude determination of a high altitude balloon system. Part 2: Development of the parameter determination process

    NASA Technical Reports Server (NTRS)

    Nigro, N. J.; Elkouh, A. F.

    1975-01-01

    The attitude of the balloon system is determined as a function of time if: (a) a method for simulating the motion of the system is available, and (b) the initial state is known. The initial state is obtained by fitting the system motion (as measured by sensors) to the corresponding output predicted by the mathematical model. In the case of the LACATE experiment the sensors consisted of three orthogonally oriented rate gyros and a magnetometer, all mounted on the research platform. The initial state was obtained by fitting the angular velocity components measured with the gyros to the corresponding values obtained from the solution of the math model. A block diagram illustrating the attitude determination process employed for the LACATE experiment is shown. The process consists of three essential parts: a process for simulating the balloon system, an instrumentation system for measuring the output, and a parameter estimation process for systematically and efficiently solving for the initial state. Results are presented and discussed.
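    The fitting step described above amounts to a nonlinear least-squares problem in the initial state. The sketch below shows that idea under stated assumptions: simulate_rates is a placeholder for integrating the balloon attitude model, and the measured gyro rates are supplied as an array; this is not the report's actual estimation code.

      import numpy as np
      from scipy.optimize import least_squares

      def estimate_initial_state(simulate_rates, measured_rates, times, x0_guess):
          """Fit the initial state so the simulated angular-velocity components
          match the rate-gyro measurements in a least-squares sense.
          simulate_rates(x0, times) is assumed to return an array of shape (len(times), 3)."""
          def residuals(x0):
              return (simulate_rates(x0, times) - np.asarray(measured_rates)).ravel()
          return least_squares(residuals, x0_guess).x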

  15. Development of a web-based video management and application processing system

    NASA Astrophysics Data System (ADS)

    Chan, Shermann S.; Wu, Yi; Li, Qing; Zhuang, Yueting

    2001-07-01

    How to facilitate efficient video manipulation and access in a web-based environment is becoming a popular trend for video applications. In this paper, we present a web-oriented video management and application processing system, based on our previous work on multimedia database and content-based retrieval. In particular, we extend the VideoMAP architecture with specific web-oriented mechanisms, which include: (1) Concurrency control facilities for the editing of video data among different types of users, such as Video Administrator, Video Producer, Video Editor, and Video Query Client; different users are assigned various priority levels for different operations on the database. (2) Versatile video retrieval mechanism which employs a hybrid approach by integrating a query-based (database) mechanism with content- based retrieval (CBR) functions; its specific language (CAROL/ST with CBR) supports spatio-temporal semantics of video objects, and also offers an improved mechanism to describe visual content of videos by content-based analysis method. (3) Query profiling database which records the `histories' of various clients' query activities; such profiles can be used to provide the default query template when a similar query is encountered by the same kind of users. An experimental prototype system is being developed based on the existing VideoMAP prototype system, using Java and VC++ on the PC platform.

  16. EARSEC SAR processing system

    NASA Astrophysics Data System (ADS)

    Protheroe, Mark; Sloggett, David R.; Sieber, Alois J.

    1994-12-01

    Traditionally, the production of high quality Synthetic Aperture Radar imagery has been an area where a potential user would have to expend large amounts of money either in the bespoke development of a processing chain dedicated to his requirements or in the purchase of a dedicated hardware platform adapted using accelerator boards and enhanced memory management. Whichever option the user adopted, there were limitations based on the desire for a realistic throughput in data load and time. The user had a choice, made early in the purchase, between a system that adopted innovative algorithmic manipulation to limit the processing time, and the purchase of expensive hardware. The former limits the quality of the product, while the latter excludes the user from any visibility into the processing chain. Clearly there was a need for a SAR processing architecture that gave the user a choice in the methodology to be adopted for a particular processing sequence, allowing him to decide on either a quick (lower quality) product or a detailed, slower (high quality) product, without having to change the algorithmic base of his processor or the hardware platform. The European Commission, through the Advanced Techniques unit of the Joint Research Centre (JRC) Institute for Remote Sensing at Ispra in Italy, realizing the limitations of current processing abilities, initiated its own program to build airborne SAR and Electro-Optical (EO) sensor systems. This program is called the European Airborne Remote Sensing Capabilities (EARSEC) program. This paper describes the processing system developed for the airborne SAR sensor system. The paper considers the requirements for the system and the design of the EARSEC Airborne SAR Processing System. It highlights the development of an open SAR processing architecture where users have full access to intermediate products that arise from each of the major processing stages. It also describes the main processing stages in the overall

  17. Development of polymer MEMS process technology as an approach to a sustainable production system

    NASA Astrophysics Data System (ADS)

    Sugiyama, Susumu; Amaya, Satoshi; Viet Dao, Dzung

    2012-03-01

    Polymethyl methacrylate (PMMA) has been proposed as a material for micro-electromechanical systems (MEMS) to initiate research on environmentally friendly micro-nano machining technology using polymer materials. A polymer MEMS process has been developed using hot embossing and precision machining. MEMS structures smaller than 2 μm were successfully embossed. The PMMA layer that remained after hot embossing was removed by a polishing process to release the movable parts. A PMMA electrostatic comb-drive microactuator was fabricated. Both the finger width and the gap between fingers were 5 μm, and the thickness was larger than 70 μm. A displacement of 11 μm at a drive voltage of 100 V was obtained, 20 times larger than that of an identical silicon device. A torsional micromirror device driven by a vertical comb actuator was fabricated. The size of the mirror was 1×1 mm2. A maximum tilt angle of 5.6° was obtained with a driving voltage of 100 V and frequencies up to 100 Hz. A chevron-shaped PMMA thermal actuator with a thickness of about 50 μm has been fabricated and tested successfully. Its displacement was about 5 times larger than that of a Si counterpart at the same power consumption.

  18. An Information System Development Method Combining Business Process Modeling with Executable Modeling and its Evaluation by Prototyping

    NASA Astrophysics Data System (ADS)

    Okawa, Tsutomu; Kaminishi, Tsukasa; Hirabayashi, Syuichi; Suzuki, Ryo; Mitsui, Hiroyasu; Koizumi, Hisao

    Business in the enterprise is so closely related to the information system that business activities are difficult without it. A system design technique that properly considers the business process and enables quick system development is therefore requested. In addition, demands regarding development cost are more severe than before. To cope with this situation, the modeling technology named BPM (Business Process Management/Modeling) is drawing attention and becoming important as a key technology. BPM is a technology to model business activities as business processes and visualize them to improve business efficiency. However, a general methodology to develop the information system using the analysis results of BPM does not exist, and only a few development cases have been reported. This paper proposes an information system development method combining business process modeling with executable modeling. We describe a guideline to support consistency and efficiency of development, and a framework enabling the information system to be developed from the model. We have prototyped an information system with the proposed method, and our experience has shown that the methodology is valuable.

  19. Improving Ground Penetrating Radar Imaging in High Loss Environments by Coordinated System Development, Data Processing, Numerical Modeling, & Visualization

    SciTech Connect

    Wright, David L.

    2004-12-01

    Improving Ground Penetrating Radar Imaging in High Loss Environments by Coordinated System Development, Data Processing, Numerical Modeling, and Visualization Methods with Applications to Site Characterization EMSP Project 86992 Progress Report as of 9/2004.

  20. DoD Software-intensive Systems Development: A Hit and Miss Process

    DTIC Science & Technology

    2015-04-30

    The Defense Acquisition System: The DoD Acquisition, Technology, and Logistics Life Cycle Management System is the framework for control... schedule growth as the true demands of the software development effort are discovered only after contract award. Technology Readiness Assessment and... Risk Management: another important management aspect is addressing the readiness of the key technologies for successful development and deployment.

  1. Tank Waste Remediation System tank waste pretreatment and vitrification process development testing requirements assessment

    SciTech Connect

    Howden, G.F.

    1994-10-24

    A multi-faceted study was initiated in November 1993 to provide assurance that needed testing capabilities, facilities, and support infrastructure (sampling systems, casks, transportation systems, permits, etc.) would be available when needed for process and equipment development to support pretreatment and vitrification facility design and construction schedules. This first major report provides a snapshot of the known testing needs for pretreatment, low-level waste (LLW) and high-level waste (HLW) vitrification, and documents the results of a series of preliminary studies and workshops to define the issues needing resolution by cold or hot testing. Identified in this report are more than 140 Hanford Site tank waste pretreatment and LLW/HLW vitrification technology issues that can only be resolved by testing. The report also broadly characterizes the level of testing needed to resolve each issue. A second report will provide a strategy(ies) for ensuring timely test capability. Later reports will assess the capabilities of existing facilities to support needed testing and will recommend siting of the tests together with needed facility and infrastructure upgrades or additions.

  2. Development of digital interactive processing system for NOAA satellites AVHRR data

    NASA Astrophysics Data System (ADS)

    Gupta, R. K.; Murthy, N. N.

    The paper discusses a digital image processing system for NOAA/AVHRR data, including land applications, configured around a VAX 11/750 host computer supported with an FPS 100 Array Processor, a Comtal graphic display and HP plotting devices. The system software comprises a relational database with query and editing facilities; a man-machine interface using form, menu and prompt inputs, including validation of user entries for data type and range; and preprocessing software for data calibration, Sun-angle correction, geometric corrections for the Earth curvature effect and Earth rotation offsets, and Earth location of the AVHRR image. The implemented image enhancement techniques, such as grey level stretching, histogram equalization and convolution, are discussed. The software implementation details for the computation of the vegetative index and normalized vegetative index using NOAA/AVHRR channels 1 and 2 data, together with output, are presented; the scientific background for such computations and the obtainability of similar indices from Landsat/MSS data are also included. The paper concludes by specifying the further software developments planned and the progress envisaged in the field of vegetation index studies.
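    As an illustration of the index computations mentioned above, the sketch below evaluates the normalized vegetation index from calibrated AVHRR channel 1 (visible red) and channel 2 (near-infrared) reflectances; the function name and the small epsilon guard are assumptions made for this example rather than details of the described system.

      import numpy as np

      def ndvi(ch1_visible, ch2_nir):
          """Normalized vegetation index from AVHRR channel 1 (visible red)
          and channel 2 (near infrared) reflectances."""
          ch1 = np.asarray(ch1_visible, dtype=float)
          ch2 = np.asarray(ch2_nir, dtype=float)
          return (ch2 - ch1) / (ch2 + ch1 + 1e-12)  # epsilon avoids division by zero

      # The simple (non-normalized) vegetative index is the difference ch2 - ch1.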

  3. Considerations in developing geographic information systems based on low-cost digital image processing

    NASA Technical Reports Server (NTRS)

    Henderson, F. M.; Dobson, M. W.

    1981-01-01

    The potential of digital image processing systems costing $20,000 or less for geographic information systems is assessed with the emphasis on the volume of data to be handled, the commercial hardware systems available, and the basic software for: (1) data entry, conversion and digitization; (2) georeferencing and geometric correction; (3) data structuring; (4) editing and updating; (5) analysis and retrieval; (6) output drivers; and (7) data management. Costs must also be considered as tangible and intangible factors.

  5. A Module Experimental Process System Development Unit (MEPSDU). [flat plate solar arrays

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of a cost-effective process sequence that has the potential for the production of flat-plate photovoltaic modules which meet the 1986 price goal of 70 cents or less per Watt peak is described. The major accomplishments include (1) an improved AR coating technique; (2) the use of sand blast back clean-up to reduce clean-up costs and to allow much of the Al paste to serve as a back conductor; and (3) the development of wave soldering for use with solar cells. Cells were processed to evaluate different process steps, a cell and minimodule test plan was prepared, and data were collected for a preliminary SAMICS cost analysis.

  6. Renovation of CPF (Chemical Processing Facility) for Development of Advanced Fast Reactor Fuel Cycle System

    SciTech Connect

    Shinichi Aose; Takafumi Kitajima; Kouji Ogasawara; Kazunori Nomura; Shigehiko Miyachi; Yoshiaki Ichige; Tadahiro Shinozaki; Shinichi Ohuchi

    2008-01-15

    The CPF (Chemical Processing Facility) was constructed at the Nuclear Fuel Cycle Engineering Laboratories of JAEA (Japan Atomic Energy Agency) in 1980 as a basic research facility where spent fuel pins from fast reactors (FR) and high-level liquid waste can be handled. The renovation consists of remodeling of the CA-3 cell and laboratory A, and installation of glove boxes, hoods and analytical equipment in laboratory C and the analytical laboratory. Maintenance equipment in the CA-5 cell, which had been out of order, was also repaired. The CA-3 cell is the main cell in which key equipment such as a dissolver, a clarifier and extractors is installed for carrying out hot tests using irradiated FR fuel. Since the CPF had originally specialized in research on the Purex process, it was desired to carry out research and development on various new reprocessing processes. Formerly, equipment was arranged over a wide space and connected not only to each other but also to the utility supply system mainly by fixed stainless steel pipes, which caused a shortage of operating space and a lack of flexibility for basic experimental studies. Old equipment in the CA-3 cell, including vessels and pipes, was removed after successful decontamination, and new equipment was installed in conformance with the new design. For ease of installation and rearrangement of the experimental equipment, equipment is basically connected by flexible pipes. Since the dissolver can easily be replaced, various dissolution experiments can be conducted. Insoluble residue generated by dissolution of spent fuel is clarified by centrifugation; this small apparatus is effective for saving space. Mini mixer-settlers or centrifugal contactors are placed in the prescribed limited space in front of the backside wall. Fresh reagents such as solvent, scrubbing and stripping solutions are continuously fed from laboratory A to the extractor by the reagent supply system with semi-automatic observation

  7. Flat-plate solar-array project. Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The engineering design, fabrication, assembly, operation, economic analysis, and process support R and D for an Experimental Process System Development Unit (EPSDU) are reported. About 95% of the purchased equipment has been received and will be reshipped to the West Coast location. The data collection system is complete. In the area of melting/consolidation, melting and shotting on a pseudocontinuous basis were demonstrated with the system using silicon powder transfer. It is proposed to continue the very promising fluid-bed work.

  8. Expert System Development in the Classroom: Processes and Outcomes. Technical Report 91-1.

    ERIC Educational Resources Information Center

    Wideman, Herbert H.; Owston, Ronald D.

    This study examined cognitive processes and outcomes associated with student knowledge base development. Sixty-nine students from two grade 8 classes were randomly assigned to one of three groups: a knowledge base development (KBD) group, a problem-solving software group, and a control group. Those in the KBD group received relevant instruction…

  9. 78 FR 47012 - Developing Software Life Cycle Processes Used in Safety Systems of Nuclear Power Plants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... This RG endorses the Institute of Electrical and Electronic Engineers (IEEE) Standard (Std.) 1074-2006, "IEEE Standard for Developing a Software Project Life Cycle Process," issued 2006, with the ... guidance in IEEE Std. 1074-2006, "IEEE Standard for Developing a Software Project Life Cycle ...

  10. Development and processing of hyperspectral images in optical-electronic remote sensing systems

    NASA Astrophysics Data System (ADS)

    Kozinov, I. A.; Maltsev, G. N.

    2016-12-01

    The development and processing of three-dimensional images as a "hypercube" of spectral data in hyperspectral optical-electronic remote sensing systems are described in a formalized manner. The correlation-based identification of observed objects from their spectral features is considered; the criterion for determining similarity between the vectors of recorded and reference spectral images of objects is based on their cross-correlation. Taking into account that the total spectral data array recorded by current hyperspectrometers is excessive for many Earth remote sensing tasks, this paper proposes a method to reduce spectral data redundancy by selecting the most informative spectral channels. The essential dimension of the spectral data makes it possible to identify and classify objects by spectral features using a limited number of highly informative spectral channels, selected in the regions where the function describing the spectral image of the observed object undergoes well-defined changes in behavior. An algorithm for selecting the most informative spectral channels, based on determining the coordinates of jumps (major changes) in a spectral image, is substantiated; the selected channels meet the maximum likelihood criterion. Experimental results on object identification quality obtained with real hyperspectral data from aerospace Earth remote sensing systems are reported. Five to twenty spectral readouts are needed for identification with a limited number of highly informative spectral channels, which confirms the existence of an essential dimensionality of the spectral data.
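
    A minimal sketch of the channel-selection idea described above, assuming synthetic spectra: informative channels are taken where the reference spectral signature shows the largest jumps, and identification is scored by normalized cross-correlation over those channels only. The spectra, channel counts, and thresholds are illustrative assumptions, not the authors' data or algorithm.

```python
# Hypothetical sketch: pick informative channels from the largest jumps in a
# reference spectral signature, then identify an observed spectrum by normalized
# cross-correlation restricted to those channels.
import numpy as np

def informative_channels(reference_spectrum, n_channels=10):
    """Return indices where the spectral signature changes most sharply."""
    jumps = np.abs(np.diff(reference_spectrum))       # first differences between channels
    return np.argsort(jumps)[-n_channels:]            # channels with the largest jumps

def correlation_score(observed, reference, channels):
    """Normalized cross-correlation over the selected channels only."""
    x = observed[channels] - observed[channels].mean()
    y = reference[channels] - reference[channels].mean()
    return float(np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    wavelengths = np.linspace(400, 2500, 200)                             # nm, 200 channels
    vegetation = 0.1 + 0.4 / (1 + np.exp(-(wavelengths - 700) / 15))      # red-edge-like signature
    soil = 0.05 + 0.0002 * (wavelengths - 400)                            # gently sloping signature
    observed = vegetation + rng.normal(0, 0.01, wavelengths.size)         # noisy scene pixel

    channels = informative_channels(vegetation, n_channels=15)
    for name, ref in [("vegetation", vegetation), ("soil", soil)]:
        print(name, round(correlation_score(observed, ref, channels), 3))
```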

  11. 6-D, A Process Framework for the Design and Development of Web-based Systems.

    ERIC Educational Resources Information Center

    Christian, Phillip

    2001-01-01

    Explores how the 6-D framework can form the core of a comprehensive systemic strategy and help provide a supporting structure for more robust design and development while allowing organizations to support whatever methods and models best suit their purpose. 6-D stands for the phases of Web design and development: Discovery, Definition, Design,…

  12. Development of expert systems for the design of a hot-forging process based on material workability

    NASA Astrophysics Data System (ADS)

    Ravi, R.; Prasad, Y. V. R. K.; Sarma, V. V. S.

    2003-12-01

    Most of the time (and cost) involved in planning a hot-forging process is related to activities strongly dependent on human expertise, intuition, and creativity, and also to iterative procedures involving extensive experimental work. In this paper, the development of an expert system for forging process design, which emphasizes material workability, is discussed. Details of the forging process design expert system, its basic modules, design and implementation details, and deliverables are explained. The system uses the vast database available on the hot workability of more than 200 technologically important materials and the knowledge acquired from a materials expert. The C Language Integrated Production System (CLIPS) has been adopted to develop this expert system. The expert system addresses three types of functions, namely, forging process design, a materials information system, and forging defect analysis. The expert system will aid and prompt a novice engineer in designing a forging process by providing accurate information on the process parameters, lubricants, type of machine, die material, and type of process (isothermal versus non-isothermal) for a given material with a known specification or code and prior history.
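
    The following is a hypothetical, heavily simplified stand-in for the kind of rule the CLIPS-based advisory system might encode (material code in; recommended temperature window, strain-rate window, lubricant, and process type out). The materials and parameter values are placeholders, not entries from the workability database cited in the abstract.

```python
# Minimal rule-lookup sketch of the forging process-design advice described above.
# Knowledge-base entries are illustrative placeholders only.
from dataclasses import dataclass

@dataclass
class ForgingAdvice:
    temperature_c: tuple      # recommended hot-working temperature window (deg C)
    strain_rate_s: tuple      # recommended strain-rate window (1/s)
    lubricant: str
    process: str              # isothermal vs. non-isothermal

KNOWLEDGE_BASE = {
    "Ti-6Al-4V": ForgingAdvice((900, 950), (0.001, 0.1), "glass coating", "isothermal"),
    "AISI 304":  ForgingAdvice((1100, 1200), (0.1, 10.0), "graphite", "non-isothermal"),
}

def advise(material: str) -> ForgingAdvice:
    """Fire the matching rule for a material code, mimicking a CLIPS-style lookup."""
    try:
        return KNOWLEDGE_BASE[material]
    except KeyError:
        raise ValueError(f"No workability data for material '{material}'")

print(advise("Ti-6Al-4V"))
```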

  13. Development of a prototype multi-processing interactive software invocation system

    NASA Technical Reports Server (NTRS)

    Berman, W. J.

    1983-01-01

    The Interactive Software Invocation System (NASA-ISIS) was first transported to the M68000 microcomputer and then rewritten in the programming language Path Pascal. Path Pascal is a significantly enhanced derivative of Pascal that allows concurrent algorithms to be expressed using the simple and elegant concept of Path Expressions. The primary result of this contract was verification of the viability of Path Pascal as a systems development language. The NASA-ISIS implementation is a prototype of a large, interactive system in Path Pascal and, as such, is an excellent demonstration of the feasibility of using Path Pascal to write even more extensive systems. It is hoped that future efforts will build upon this research and, ultimately, that a full Path Pascal/ISIS Operating System (PPIOS) might be developed.

  14. MicroRNAs (MiRs) Precisely Regulate Immune System Development and Function in Immunosenescence Process.

    PubMed

    Aalaei-Andabili, Seyed Hossein; Rezaei, Nima

    2016-01-01

    Human aging is a complex process with pivotal changes in the gene expression of biological pathways. Immune system dysfunction has been recognized as one of the most important abnormalities induced by senescence, termed immunosenescence. Emerging evidence suggests a role for miRs in immunosenescence. We aimed to systematically review all relevant reports to clearly state the effects of miRs on the immunosenescence process. Sensitive electronic searches were carried out, and quality assessment was performed. Since the majority of the included studies were laboratory works, and therefore heterogeneous, we discuss miR effects on the immunological aging process non-statistically. Forty-six articles were found in the initial search. After exclusion of 34 articles, 12 studies were included in the final stage. We found that miRs have crucial roles in the precise functioning of the immune system. MiRs are involved in regulating the aging process in immune system components and target certain genes, promoting or inhibiting immune system reactions to invasion. MiRs also control the life span of immune system members by regulating genes involved in apoptosis. Interestingly, we found that immunosenescence is controllable by proper manipulation of the expression of various miRs. DNA methylation and histone acetylation have been identified as novel strategies that alter NF-κB binding ability at miR promoter sites. Evidence for the effect of miRs on age-related impairment of immune system function is emerging. Although it is accepted that miRs have determinant roles in the regulation of immunosenescence, most reports are based on animal/laboratory work, suggesting the need for more investigations in humans.

  15. Development of a microblood-typing system using assembly-free process based on virtual environment

    NASA Astrophysics Data System (ADS)

    Lee, Seung-Jae; Kang, Hyun-Wook; Kim, Yonggoo; Lee, Gyoo-Whung; Lim, Geunbae; Cho, Dong-Woo

    2005-02-01

    ABO typing is the first test done on blood that is to be used for transfusion. A person must receive ABO-matched blood, as ABO incompatibility is the major cause of fatal transfusion reactions. Until now, this blood typing has been done manually, and there is therefore a need for an automated typing machine that uses a very small volume of blood. In this paper, we present a new micro blood-typing system with a fully 3-dimensional geometry, realized using micro-stereolithography. This system was fabricated with a novel integration process based on a virtual environment, and blood-typing experiments using the system were performed successfully.

  16. Launch and Propulsion Systems Materials and Process Development for Rocket Engine Components: Turbomachinery

    NASA Technical Reports Server (NTRS)

    Cannon, James L.; Katz, Allan; Bampton, Cliff; Marchol, Paul; Rhemer, Chris; Effinger, Mike; Genge, Gary

    1998-01-01

    This presentation identifies the key materials with the highest payoff for advancing the state of the art in materials for turbomachinery. Current showstoppers to advancing the state of the art in materials and processes are identified. Technical issues associated with incorporating key materials into new systems are discussed. Opportunities, where they exist, are identified to overcome these technical challenges.

  18. The Design, Development and Testing of a Multi-process Real-time Software System

    DTIC Science & Technology

    2007-03-01

    was divided into separate executable processes, which are individually controllable (Deitel, 2004). Memory space for storing data is normally local ... Biloxi, Mississippi. Deitel, H. M., P. J. Deitel, and D. Choffnes, 2004. Operating Systems. Upper Saddle River, NJ: Pearson/Prentice Hall, ISBN 0-13...

  19. Development of a System for Thermoelectric Heat Recovery from Stationary Industrial Processes

    NASA Astrophysics Data System (ADS)

    Ebling, D. G.; Krumm, A.; Pfeiffelmann, B.; Gottschald, J.; Bruchmann, J.; Benim, A. C.; Adam, M.; Labs, R.; Herbertz, R. R.; Stunz, A.

    2016-07-01

    The hot forming process of steel requires temperatures of up to 1300°C. Usually, the invested energy is lost to the environment through the subsequent cooling of the forged parts to room temperature. Thermoelectric systems are able to recover this wasted heat by converting it into electrical energy and feeding it into the power grid. The proposed thermoelectric system covers an absorption surface of half a square meter and is equipped with 50 bismuth-telluride-based thermoelectric generators, five cold plates, and five inverters. Measurements were performed under production conditions in the industrial environment of the forging process. The heat distribution and temperature profiles were measured and modeled based on the prevailing production conditions and geometric boundary conditions. Under quasi-stationary conditions, the thermoelectric system absorbs a heat radiation of 14.8 kW and feeds electrical power of 388 W into the power grid. The model predicts the measured values with only slight deviations.
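
    As a quick cross-check of the figures quoted above, the implied system-level conversion efficiency follows directly from the absorbed heat and the grid-fed electrical power:

```python
# Back-of-envelope check of the system-level conversion efficiency implied by the
# reported figures (heat absorbed vs. electrical power fed into the grid).
absorbed_heat_w = 14_800.0   # W, quasi-stationary radiative heat absorbed
grid_power_w = 388.0         # W, electrical power delivered after the inverters

efficiency = grid_power_w / absorbed_heat_w
print(f"System conversion efficiency: {efficiency:.1%}")   # about 2.6%
```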

  20. Advancing biopharmaceutical process development by system-level data analysis and integration of omics data.

    PubMed

    Schaub, Jochen; Clemens, Christoph; Kaufmann, Hitto; Schulz, Torsten W

    2012-01-01

    Development of efficient bioprocesses is essential for cost-effective manufacturing of recombinant therapeutic proteins. To achieve further process improvement and process rationalization, comprehensive analysis of both process data and phenotypic cell-level data is essential. Here, we present a framework for advanced bioprocess data analysis consisting of multivariate data analysis (MVDA), metabolic flux analysis (MFA), and pathway analysis for mapping of large-scale gene expression data sets. This data analysis platform was applied in a process development project with an IgG-producing Chinese hamster ovary (CHO) cell line in which the maximal product titer could be increased from about 5 to 8 g/L. Principal component analysis (PCA), k-means clustering, and partial least-squares (PLS) models were applied to analyze the macroscopic bioprocess data. MFA and gene expression analysis revealed intracellular information on the characteristics of high-performance cell cultivations. By MVDA, for example, correlations between several essential amino acids and the product concentration were observed, and a grouping into cell-specific-productivity-driven and process-control-driven processes could be unraveled. By MFA, phenotypic characteristics in glycolysis, glutaminolysis, the pentose phosphate pathway, the citrate cycle, the coupling of amino acid metabolism to the citrate cycle, and the energy yield could be identified. By gene expression analysis, 247 deregulated metabolic genes were identified which are involved, inter alia, in amino acid metabolism, transport, and protein synthesis.
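
    A brief sketch of the kind of multivariate workflow named above (PCA for an overview, k-means for grouping runs, PLS for correlating process variables with titer), applied to synthetic run-level data with scikit-learn; it illustrates the method class, not the authors' pipeline or data.

```python
# Sketch of the MVDA step: PCA, k-means, and PLS on synthetic run-level data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n_runs, n_vars = 40, 12                        # e.g. amino acids, VCD, pH, osmolality, ...
X = rng.normal(size=(n_runs, n_vars))
titer = X[:, 0] * 1.5 + X[:, 3] * 0.8 + rng.normal(scale=0.3, size=n_runs)  # g/L

scores = PCA(n_components=2).fit_transform(X)                       # overview of the runs
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
pls = PLSRegression(n_components=2).fit(X, titer)                   # variables vs. titer

print("cluster sizes:", np.bincount(labels))
print("PLS R^2 on training runs:", round(pls.score(X, titer), 2))
```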

  1. Development of Energy Models for Production Systems and Processes to Inform Environmentally Benign Decision-Making

    NASA Astrophysics Data System (ADS)

    Diaz-Elsayed, Nancy

    Between 2008 and 2035 global energy demand is expected to grow by 53%. While most industry-level analyses of manufacturing in the United States (U.S.) have traditionally focused on high energy consumers such as the petroleum, chemical, paper, primary metal, and food sectors, the remaining sectors account for the majority of establishments in the U.S. Specifically, of the establishments participating in the Energy Information Administration's Manufacturing Energy Consumption Survey in 2006, the "non-energy intensive" sectors still consumed 4×10⁹ GJ of energy, i.e., one-quarter of the energy consumed by the manufacturing sectors, which is enough to power 98 million homes for a year. The increasing use of renewable energy sources and the introduction of energy-efficient technologies in manufacturing operations support the advancement towards a cleaner future, but having a good understanding of how the systems and processes function can reduce the environmental burden even further. To facilitate this, methods are developed to model the energy of manufacturing across three hierarchical levels: production equipment, factory operations, and industry; these methods are used to accurately assess the current state and provide effective recommendations to further reduce energy consumption. First, the energy consumption of production equipment is characterized to provide machine operators and product designers with viable methods to estimate the environmental impact of the manufacturing phase of a product. The energy model of production equipment is tested and found to have an average accuracy of 97% for a product requiring machining with a variable material removal rate profile. However, changing the use of production equipment alone will not result in an optimal solution since machines are part of a larger system. Which machines to use, how to schedule production runs while accounting for idle time, the design of the factory layout to facilitate production, and even the
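
    A common form of machine-tool energy model consistent with the description above treats total power as a constant idle/auxiliary term plus a cutting term proportional to the material removal rate; the coefficients and the MRR profile below are assumptions for illustration only.

```python
# Sketch of a machine-tool energy model of the type discussed above:
# power = idle/auxiliary power + specific cutting energy * material removal rate.
import numpy as np

P_IDLE_W = 1200.0          # spindle, coolant, controller overhead (assumed)
K_J_PER_MM3 = 2.5          # specific cutting energy (assumed, material dependent)

t = np.linspace(0, 120, 1201)                                              # s, one operation
mrr = np.where((t > 10) & (t < 110), 40.0 + 20.0 * np.sin(0.1 * t), 0.0)   # mm^3/s, variable profile
power = P_IDLE_W + K_J_PER_MM3 * mrr                                       # W
energy_kj = np.trapz(power, t) / 1e3

print(f"Estimated energy for the operation: {energy_kj:.1f} kJ")
```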

  2. Enhanced Geothermal Systems Research and Development: Models of Subsurface Chemical Processes Affecting Fluid Flow

    SciTech Connect

    Moller, Nancy; Weare J. H.

    2008-05-29

    Successful exploitation of the vast amount of heat stored beneath the earth’s surface in hydrothermal and fluid-limited, low permeability geothermal resources would greatly expand the Nation’s domestic energy inventory and thereby promote a more secure energy supply, a stronger economy and a cleaner environment. However, a major factor limiting the expanded development of current hydrothermal resources as well as the production of enhanced geothermal systems (EGS) is insufficient knowledge about the chemical processes controlling subsurface fluid flow. With funding from past grants from the DOE geothermal program and other agencies, we successfully developed advanced equation of state (EOS) and simulation technologies that accurately describe the chemistry of geothermal reservoirs and energy production processes via their free energies for wide XTP ranges. Using the specific interaction equations of Pitzer, we showed that our TEQUIL chemical models can correctly simulate behavior (e.g., mineral scaling and saturation ratios, gas break out, brine mixing effects, down hole temperatures and fluid chemical composition, spent brine incompatibilities) within the compositional range (Na-K-Ca-Cl-SO4-CO3-H2O-SiO2-CO2(g)) and temperature range (T < 350°C) associated with many current geothermal energy production sites that produce brines with temperatures below the critical point of water. The goal of research carried out under DOE grant DE-FG36-04GO14300 (10/1/2004-12/31/2007) was to expand the compositional range of our Pitzer-based TEQUIL fluid/rock interaction models to include the important aluminum and silica interactions (T < 350°C). Aluminum is the third most abundant element in the earth’s crust; and, as a constituent of aluminosilicate minerals, it is found in two thirds of the minerals in the earth’s crust. The ability to accurately characterize effects of temperature, fluid mixing and interactions between major rock-forming minerals and hydrothermal and

  3. Education for Development: An Evaluation System. Piagetian Theory Applied to the Evaluation of Educational Outcomes and Processes.

    ERIC Educational Resources Information Center

    Lengel, James G.

    This paper outlines an evaluation system for elementary education based on cognitive and social development research. Piaget has defined certain core mental structures that develop in all humans in the same order. Both Piaget and Kohlberg have done work on the process of developmental change and the conditions necessary for optimal growth of the…

  4. Flat-plate solar array project: Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The process technology for the manufacture of semiconductor-grade silicon in a large commercial plant by 1986, at a price of less than $14 per kilogram of silicon (in 1975 dollars), is discussed. The engineering design, installation, checkout, and operation of an Experimental Process System Development Unit are described. Quality control in scaling up the process and an economic analysis of product and production costs are also discussed.

  5. Low cost solar array project: Experimental process system development unit for producing semiconductor-grade silicon using silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The design, fabrication, and installation of an experimental process system development unit (EPSDU) were analyzed. Supporting research and development were performed to provide an information data base usable for the EPSDU and for the technological design and economic analysis of potential scale-up of the process. Iterative economic analyses were conducted of the estimated product cost for the production of semiconductor-grade silicon in a facility capable of producing 1000 MT/yr.

  6. A Computer Based Decision Support System (DSS) for Developing Logistic Support Analysis (LSA) Requirements as Part of the System Engineering Process

    DTIC Science & Technology

    1989-09-01

    Master's thesis by Michael G. Heffner, Captain, USAF, September 1989; approved for public ... system development. System engineering is a process structured to develop the optimum system for a required mission. The Defense System Management ... not the system engineer, was given the responsibility for implementing and managing LSA as part of the overall logistics program. The engineer had ...

  7. Interactions between glia, the immune system and pain processes during early development.

    PubMed

    Barr, Gordon A; Hunter, Deirtra A

    2014-12-01

    Pain is a serious problem for infants and children and treatment options are limited. Moreover, infants born prematurely or hospitalized for illness likely have concurrent infection that activates the immune system. It is now recognized that the immune system in general and glia in particular influence neurotransmission and that the neural bases of pain are intimately connected to immune function. We know that injuries that induce pain activate immune function and suppressing the immune system alleviates pain. Despite this advance in our understanding, virtually nothing is known of the role that the immune system plays in pain processing in infants and children, even though pain is a serious clinical issue in pediatric medicine. This brief review summarizes the existing data on immune-neural interactions in infants, providing evidence for the immaturity of these interactions.

  8. Powder towpreg process development

    NASA Technical Reports Server (NTRS)

    Baucom, Robert M.; Marchello, Joseph M.

    1991-01-01

    The process for dry powder impregnation of carbon fiber tows being developed at LaRC overcomes many of the difficulties associated with melt, solution, and slurry prepregging. In the process, fluidized powder is deposited on spread tow bundles and fused to the fibers by radiant heating. Impregnated tows have been produced for preform, weaving, and composite materials applications. Design and operating data correlations were developed for scale up of the process to commercial operation. Bench scale single tow experiments at tow speeds up to 50 cm/sec have demonstrated that the process can be controlled to produce weavable towpreg. Samples were woven and molded into preform material of good quality.

  9. Development of a new flux map processing code for moveable detector system in PWR

    SciTech Connect

    Li, W.; Lu, H.; Li, J.; Dang, Z.; Zhang, X.

    2013-07-01

    This paper presents an introduction to the development of the flux map processing code MAPLE, developed by the China Nuclear Power Technology Research Institute (CNPPJ), China Guangdong Nuclear Power Group (CGN). The method used to obtain the three-dimensional 'measured' power distribution from the measurement signals is also described. Three methods, namely the Weight Coefficient Method (WCM), the Polynomial Expansion Method (PEM), and the Thin Plate Spline (TPS) method, have been applied to fit the deviation between measured and predicted results on the two-dimensional radial plane. The measured flux map data of the LINGAO nuclear power plant (NPP) are processed using MAPLE as a test case to compare the effectiveness of the three methods, in combination with the 3D neutronics code COCO. Assembly power distribution results show that the MAPLE results are reasonable and satisfactory. More verification and validation of the MAPLE code will be carried out in the future. (authors)
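
    The thin-plate-spline step can be sketched with SciPy's radial basis function interpolator on synthetic detector data; this is only an illustration of the fitting idea, not the MAPLE code.

```python
# Sketch: fit the (x, y) deviation between measured and predicted assembly powers
# at instrumented locations with a thin plate spline, then evaluate it on a grid.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(2)
instrumented_xy = rng.uniform(-1.5, 1.5, size=(58, 2))      # e.g. movable-detector locations
deviation = (0.02 * instrumented_xy[:, 0]
             - 0.01 * instrumented_xy[:, 1] ** 2
             + rng.normal(0, 0.002, 58))                    # measured minus predicted power

tps = RBFInterpolator(instrumented_xy, deviation, kernel="thin_plate_spline")

# Evaluate the fitted correction on a regular grid covering the radial plane.
gx, gy = np.meshgrid(np.linspace(-1.5, 1.5, 15), np.linspace(-1.5, 1.5, 15))
grid = np.column_stack([gx.ravel(), gy.ravel()])
correction = tps(grid).reshape(gx.shape)
print("max |correction| on grid:", round(float(np.abs(correction).max()), 4))
```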

  10. Advanced multilateration theory, software development, and data processing: The MICRODOT system

    NASA Technical Reports Server (NTRS)

    Escobal, P. R.; Gallagher, J. F.; Vonroos, O. H.

    1976-01-01

    The process of geometric parameter estimation to accuracies of one centimeter, i.e., multilateration, is defined and applications are listed. A brief functional explanation of the theory is presented. Next, various multilateration systems are described in order of increasing system complexity. Expected system accuracy is discussed from a general point of view, and a summary of the errors is given. An outline of the design of a software processing system for multilateration, called MICRODOT, is presented next. The links of this software, which can be used for multilateration data simulations or operational data reduction, are examined on an individual basis. Functional flow diagrams are presented to aid in understanding the software capability. MICRODOT capability is described with respect to vehicle configurations, interstation coordinate reduction, geophysical parameter estimation, and orbit determination. Numerical results obtained from MICRODOT via data simulations are displayed both for hypothetical and real-world vehicle/station configurations such as used in the GEOS-3 Project. These simulations show the inherent power of the multilateration procedure.
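
    The core multilateration computation can be illustrated by a small nonlinear least-squares solve for a position from ranges to known stations (synthetic geometry and noise; not the MICRODOT software itself):

```python
# Illustrative multilateration: recover a vehicle position from range measurements
# to known station coordinates by nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

stations = np.array([[0.0, 0.0, 0.0],
                     [5000.0, 0.0, 0.0],
                     [0.0, 5000.0, 0.0],
                     [5000.0, 5000.0, 1000.0]])      # m, assumed station layout
true_position = np.array([1200.0, 3400.0, 800.0])    # m
ranges = np.linalg.norm(stations - true_position, axis=1)
ranges += np.random.default_rng(3).normal(0, 0.01, ranges.size)   # cm-level range noise

def residuals(p):
    """Difference between modeled and measured ranges for candidate position p."""
    return np.linalg.norm(stations - p, axis=1) - ranges

solution = least_squares(residuals, x0=np.array([0.0, 0.0, 0.0]))
print("estimated position (m):", np.round(solution.x, 2))
```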

  11. Development of Power Supply System with Distributed Generators using Parallel Processing Method

    NASA Astrophysics Data System (ADS)

    Hirose, Kenichi; Takeda, Takashi; Okui, Yoshiaki; Yukita, Kazuto; Goto, Yasuyuki; Ichiyanagi, Katsuhiro; Matsumura, Toshiro

    This paper describes a novel power system which consists of distributed energy resources (DER) with a static switch at the point of common coupling. Use of the static switch with parallel processing control is a new application of the line-interactive type of uninterruptible power supply (UPS). In recent years, various design, operation, and control methods have been studied in order to find more effective ways to utilize renewable energy and to reduce environmental impact. One feature of the proposed power system is that it can interconnect to the existing utility grid without interruption: electrical power distribution to the loads can continue seamlessly between interconnected and isolated (islanded) operation. The power system has other benefits, such as higher efficiency, demand-side management, easy control of the power system internals, improved reliability of power distribution, and a minimum requirement of protection relays for grid interconnection. The proposed power system has been operated with actual loads of 20 kW on the campus of the Aichi Institute of Technology since 2007.

  12. Channel simulation and development of signal processing techniques for a scanner-based optical storage system

    NASA Astrophysics Data System (ADS)

    Pillai, Usha; Vijaya Kumar, Bhagavatula

    1998-10-01

    A scanner-based storage system employs a head mounted on a scanner which oscillates over the moving media. The head moves in an approximately sinusoidal path relative to the media at a high frequency, time-multiplexing the read/write signals of several tracks. The resulting multi-channel readback can yield higher data rates than a conventional system with a head that moves linearly relative to the media. Scanner-based storage systems are not commercially available at present. We envision a system that uses an opto-electronic scanner, developed at CMU, in which the deflection of a laser beam is controlled by an input voltage. Since no mechanical motion is involved, this scanner has a high bandwidth, which makes it well suited to our application.

  13. The influence of gravity on the process of development of animal systems

    NASA Technical Reports Server (NTRS)

    Malacinski, G. M.; Neff, A. W.

    1984-01-01

    The development of animal systems is described in terms of a series of overlapping phases: pattern specification; differentiation; growth; and aging. The extent to which altered (micro) gravity (g) affects those phases is briefly reviewed for several animal systems. As a model, amphibian egg/early embryo is described. Recent data derived from clinostat protocols indicates that microgravity simulation alters early pattern specification (dorsal/ventral polarity) but does not adversely influence subsequent morphogenesis. Possible explanations for the absence of catastrophic microgravity effects on amphibian embryogenesis are discussed.

  15. Development of a BR-UASB-DHS system for natural rubber processing wastewater treatment.

    PubMed

    Watari, Takahiro; Thanh, Nguyen Thi; Tsuruoka, Natsumi; Tanikawa, Daisuke; Kuroda, Kyohei; Huong, Nguyen Lan; Tan, Nguyen Minh; Hai, Huynh Trung; Hatamoto, Masashi; Syutsubo, Kazuaki; Fukuda, Masao; Yamaguchi, Takashi

    2015-11-21

    Natural rubber processing wastewater contains high concentrations of organic compounds, nitrogen, and other contaminants. In this study, a treatment system composed of a baffled reactor (BR), an upflow anaerobic sludge blanket (UASB) reactor, and a downflow hanging sponge (DHS) reactor was used to treat natural rubber processing wastewater in Vietnam. The BR showed good total suspended solids removal of 47.6%, as well as acidification of the wastewater. The UASB reactor achieved a high chemical oxygen demand (COD) removal efficiency of 92.7% ± 2.3% and energy recovery in the form of methane at an organic loading rate of 12.2 ± 6.6 kg-COD·m⁻³·day⁻¹. The DHS reactor showed high performance in removing residual organic matter from the UASB effluent. In total, the system achieved high-level total COD removal of 98.6% ± 1.2% and total suspended solids removal of 98.0% ± 1.4%. Massively parallel 16S rRNA gene sequencing of the retained sludge in the UASB reactor showed the predominant microbial phyla to be Bacteroidetes, Firmicutes, Proteobacteria, WWE1, and Euryarchaeota. Uncultured bacteria belonging to the phyla Bacteroidetes and WWE1 were predominant in the UASB reactor; this microbial assemblage utilizes the organic compounds contained in natural rubber processing wastewater. In addition, the methane-producing archaea Methanosaeta sp. and Methanolinea sp. were detected.

  16. Development of Materials Processing Systems for Use in Space on Low-g Simulation Devices

    NASA Technical Reports Server (NTRS)

    Aldrich, B. R.; Whitt, W. D.

    1985-01-01

    Advanced furnace systems are being developed for use in space. Systems are being tested for current experiment applications and modified for future experiment requirements. Future projects are: (1) fabrication and testing of the Advanced Automated Directional Solidification Furnace (AADSF) flight hardware; (2) development of a Heat Pipe Furnace (HPF) for use in space. Heat pipes will be tested for space flight qualification in conjunction with the furnace development. The HPF design will be based on the AADSF development and will be of modular design including capabilities of operating with or without heat pipes; and (3) the AADSF furnace will be modified and tested to operate at temperatures up to 1700 C in the heated cavity. This will be accomplished by developing a new hot end heating module and insulation package for the existing AADSF. Refurbishment of the Drop Tower Furnace (DTF) is under way. The DTF can operate at temperatures up to 1700 C. The sample size will be approximately 3/8 in. dia. x 5/8 in. long. Design improvements for the General Purpose Rocket Furnace (GPRF) for use in the Material Experiment Assembly (MEA) are to be accomplished.

  17. Facilitating the Specification Capture and Transformation Process in the Development of Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Filho, Aluzio Haendehen; Caminada, Numo; Haeusler, Edward Hermann; vonStaa, Arndt

    2004-01-01

    To support the development of flexible and reusable MAS, we have built a framework designated MAS-CF. MAS-CF is a component framework that implements a layered architecture based on contextual composition. Interaction rules, controlled by architecture mechanisms, ensure very low coupling, making possible the sharing of distributed services in a transparent, dynamic and independent way. These properties facilitate large-scale reuse, since organizational abstractions can be reused and propagated to all instances created from the framework. The objective is to reduce the complexity and development time of multi-agent systems through the reuse of generic organizational abstractions.

  18. Advanced development of a pressurized ash agglomerating fluidized-bed coal gasification system: Topical report, Process analysis, FY 1983

    SciTech Connect

    1987-07-31

    KRW Energy Systems, Inc., is engaged in the continuing development of a pressurized, fluidized-bed gasification process at its Waltz Mill Site in Madison, Pennsylvania. The overall objective of the program is to demonstrate the viability of the KRW process for the environmentally-acceptable production of low- and medium-Btu fuel gas from a variety of fossilized carbonaceous feedstocks and industrial fuels. This report presents process analysis of the 24 ton-per-day Process Development Unit (PDU) operations and is a continuation of the process analysis work performed in 1980 and 1981. Included is work performed on PDU process data; gasification; char-ash separation; ash agglomeration; fines carryover, recycle, and consumption; deposit formation; materials; and environmental, health, and safety issues. 63 figs., 43 tabs.

  19. Development of automatic movement analysis system for a small laboratory animal using image processing

    NASA Astrophysics Data System (ADS)

    Nagatomo, Satoshi; Kawasue, Kikuhito; Koshimoto, Chihiro

    2013-03-01

    Activity analysis of small laboratory animals is an effective procedure in various bioscience fields. The simplest way to obtain animal activity data is direct observation and manual recording, but this is labor intensive and rather subjective. Automatic and objective analysis of animal movement usually requires expensive equipment. In the present study, we develop a low-cost animal activity analysis system based on template matching of video-recorded movements of a laboratory animal.
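
    A minimal sketch of the template-matching step, assuming the OpenCV package and using synthetic frames in place of the recorded video; the best-match position per frame gives a simple activity measure.

```python
# Sketch of template matching for animal tracking on synthetic frames (assumes
# the opencv-python package; frame sizes and motion are hypothetical).
import numpy as np
import cv2

rng = np.random.default_rng(4)
template = rng.uniform(0, 255, (20, 20)).astype(np.uint8)     # appearance of the animal

positions = []
for step in range(5):                                         # stand-in for video frames
    frame = rng.uniform(0, 40, (240, 320)).astype(np.uint8)   # dark cage background
    x, y = 30 + 12 * step, 100 + 5 * step                     # simulated animal motion
    frame[y:y + 20, x:x + 20] = template                      # paste the animal patch
    score = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(score)                   # best-match top-left corner
    positions.append(max_loc)

print("tracked positions (x, y):", positions)
# Total path length gives a simple activity measure for the recording.
path = sum(np.hypot(x2 - x1, y2 - y1)
           for (x1, y1), (x2, y2) in zip(positions, positions[1:]))
print("activity (pixels moved):", round(path, 1))
```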

  20. DoD Software Intensive Systems Development: A Hit and Miss Process

    DTIC Science & Technology

    2015-05-01

    ... from scratch; software TRLs are ineffective at reducing development risk; the contractor is assessed for risk (CMMI), but the PM team has no 'maturity' ... non-critical system attributes; SEI's Software Acquisition (SA)-CMM assesses the Government's PM team maturity. Remaining slide fragments: ATAM, QAW, requirements elicitation (explicit, derived), prototyping, code, build, integrate, test, IOT&E, prototype LUT & EUTE, CDR, CPD.

  1. Representation in development: from a model system to some general processes.

    PubMed

    Montuori, Luke M; Honey, Robert C

    2015-03-01

    The view that filial imprinting might serve as a useful model system for studying the neurobiological basis of memory was inspired, at least in part, by a simple idea: acquired filial preferences reflect the formation of a memory or representation of the imprinting object itself, as opposed to a change in the efficacy of stimulus-response pathways, for example. We provide a synthesis of the evidence that supports this idea and show that the processes of memory formation observed in filial imprinting find surprisingly close counterparts in other species, including our own.

  2. Development of an advanced spacecraft water and waste materials processing system

    NASA Technical Reports Server (NTRS)

    Murray, R. W.; Schelkopf, J. D.; Middleton, R. L.

    1975-01-01

    An Integrated Waste Management-Water System (WM-WS) which uses radioisotopes for thermal energy is described and results of its trial in a 4-man, 180 day simulated space mission are presented. It collects urine, feces, trash, and wash water in zero gravity, processes the wastes to a common evaporator, distills and catalytically purifies the water, and separates and incinerates the solid residues using little oxygen and no chemical additives or expendable filters. Technical details on all subsystems are given along with performance specifications. Data on recovered water and heat loss obtained in test trials are presented. The closed loop incinerator and other projects underway to increase system efficiency and capacity are discussed.

  4. Biomass Torrefaction Process Review and Moving Bed Torrefaction System Model Development

    SciTech Connect

    Jaya Shakar Tumuluru; Shahab Sokhansanj; Christopher T. Wright; Richard D. Boardman

    2010-08-01

    Torrefaction is currently developing as an important preprocessing step to improve the quality of biomass in terms of physical properties and proximate and ultimate composition. Torrefaction is a slow heating of biomass in an inert or reduced environment to a maximum temperature of 300°C. Torrefaction can also be defined as a group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in a temperature range of 200-230°C and 270-280°C. Thus, the process can also be called a mild pyrolysis, as it occurs at the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, producing a final product that will have a lower mass but a higher heating value. An important aspect of research is to establish a degree of torrefaction where gains in heating value offset the loss of mass. There is a lack of literature on torrefaction reactor designs and a design sheet for estimating the dimensions of the torrefier based on capacity. This study includes (a) conducting a detailed review of the torrefaction of biomass in terms of understanding the process, product properties, off-gas compositions, and methods used, and (b) designing a moving bed torrefier, taking into account the basic fundamental heat and mass transfer calculations. Specific objectives include calculating dimensions such as the diameter and height of the moving packed bed for different capacities, designing the heat loads and gas flow rates, and developing an interactive Excel sheet where the user can define design specifications. In this report, capacities of 25-1000 kg/hr are used in the torrefier design equations, example calculations, and torrefier specifications.
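
    As an example of the sizing calculation described above, a moving-bed volume and height can be estimated from throughput, solids residence time, and bulk density; every input below is an assumed value within the stated 25-1000 kg/hr design range.

```python
# Illustrative moving-bed torrefier sizing: bed volume from mass throughput,
# solids residence time, and bulk density, then bed height from an assumed diameter.
import math

throughput_kg_h = 500.0        # assumed, within the 25-1000 kg/hr range discussed
residence_time_h = 0.5         # assumed solids residence time at torrefaction temperature
bulk_density_kg_m3 = 250.0     # assumed bulk density of the biomass feed
bed_diameter_m = 1.2           # assumed bed diameter

bed_volume_m3 = throughput_kg_h * residence_time_h / bulk_density_kg_m3
bed_height_m = bed_volume_m3 / (math.pi * bed_diameter_m ** 2 / 4)

print(f"required bed volume: {bed_volume_m3:.2f} m^3")
print(f"bed height for {bed_diameter_m} m diameter: {bed_height_m:.2f} m")
```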

  6. Development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements

    NASA Technical Reports Server (NTRS)

    Rey, Charles A.

    1991-01-01

    The development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements are discussed. Efforts were directed towards the following task areas: design and development of a High Temperature Acoustic Levitator (HAL) for containerless processing and property measurements at high temperatures; testing of the HAL module to establish this technology for use as a positioning device for microgravity uses; construction and evaluation of a brassboard hot wall Acoustic Levitation Furnace; construction and evaluation of a noncontact temperature measurement (NCTM) system based on AGEMA thermal imaging camera; construction of a prototype Division of Amplitude Polarimetric Pyrometer for NCTM of levitated specimens; evaluation of and recommendations for techniques to control contamination in containerless materials processing chambers; and evaluation of techniques for heating specimens to high temperatures for containerless materials experimentation.

  7. Instrument control software development process for the multi-star AO system ARGOS

    NASA Astrophysics Data System (ADS)

    Kulas, M.; Barl, L.; Borelli, J. L.; Gässler, W.; Rabien, S.

    2012-09-01

    The ARGOS project (Advanced Rayleigh guided Ground layer adaptive Optics System) will upgrade the Large Binocular Telescope (LBT) with an AO System consisting of six Rayleigh laser guide stars. This adaptive optics system integrates several control loops and many different components like lasers, calibration swing arms and slope computers that are dispersed throughout the telescope. The purpose of the instrument control software (ICS) is running this AO system and providing convenient client interfaces to the instruments and the control loops. The challenges for the ARGOS ICS are the development of a distributed and safety-critical software system with no defects in a short time, the creation of huge and complex software programs with a maintainable code base, the delivery of software components with the desired functionality and the support of geographically distributed project partners. To tackle these difficult tasks, the ARGOS software engineers reuse existing software like the novel middleware from LINC-NIRVANA, an instrument for the LBT, provide many tests at different functional levels like unit tests and regression tests, agree about code and architecture style and deliver software incrementally while closely collaborating with the project partners. Many ARGOS ICS components are already successfully in use in the laboratories for testing ARGOS control loops.

  8. Development of a prototype spatial information processing system for hydrologic research

    NASA Technical Reports Server (NTRS)

    Sircar, Jayanta K.

    1991-01-01

    Significant advances have been made in the last decade in the areas of Geographic Information Systems (GIS) and spatial analysis technology, both in hardware and software. Science user requirements are so problem specific that currently no single system can satisfy all of the needs. The work presented here forms part of a conceptual framework for an all-encompassing science-user workstation system. While definition and development of the system as a whole will take several years, it is intended that small-scale projects such as the current work will address some of the more short-term needs. Such projects provide a quick mechanism to integrate tools into the workstation environment, forming a larger, more complete hydrologic analysis platform. Two components that are very important to the practical use of remote sensing and digital map data in hydrology are described: a graph-theoretic technique to rasterize elevation contour maps, and a system to manipulate synthetic aperture radar (SAR) data files and extract soil moisture data.

  9. Processing requirements of secure C3/I and battle management systems - Development of Gemini trusted multiple microcomputer base

    NASA Astrophysics Data System (ADS)

    Tao, T. F.; Schell, R. R.

    The present investigation is concerned with the potential applications of trusted computer system technologies in space. It is suggested that the rapidly expanding roles of new space defense missions will require space-borne command, control, communication, intelligence, and battle management (C3/I-BM) systems. Trusted computer system technology can be extended to develop new computer architectures able to support the broader requirements of C3/I-BM processing. The Gemini Trusted Multiple Microcomputer Base product is being developed to meet these demanding requirements and to support the multiple capabilities simultaneously. Attention is given to recent important developments in trusted computer systems and to the Gemini system architecture.

  10. GeVaDSs – decision support system for novel Genetic Vaccine development process

    PubMed Central

    2012-01-01

    Background The lack of a uniform way to evaluate vaccine candidates under development, both qualitatively and quantitatively, led us to set up a standardized scheme for vaccine efficacy and safety evaluation. We developed and implemented molecular and immunology methods, and designed support tools for immunization data storage and analyses. Such a collection creates a unique opportunity for immunologists to analyse data delivered from their laboratories. Results We designed and implemented GeVaDSs (Genetic Vaccine Decision Support system), an interactive system for efficient storage, integration, retrieval and representation of data. Moreover, GeVaDSs allows for relevant association and interpretation of data, and thus for knowledge-based generation of testable hypotheses of vaccine responses. Conclusions GeVaDSs has been tested by several laboratories in Europe and has proved its usefulness in vaccine analysis. A case study of its application is presented in the additional files. The system is available at: http://gevads.cs.put.poznan.pl/preview/ (login: viewer, password: password). PMID:22574945

  11. Analysis of hydrological processes across the Northern Eurasia with recently re-developed online informational system

    NASA Astrophysics Data System (ADS)

    Shiklomanov, A. I.; Proussevitch, A. A.; Gordov, E. P.; Okladnikov, I.; Titov, A. G.

    2016-12-01

    The volume of georeferenced datasets used for hydrology and climate research is growing immensely due to recent advances in modeling, high performance computers, and sensor networks, as well as the initiation of a set of large-scale, complex global and regional monitoring experiments. To facilitate the management and analysis of these extensive data pools, we developed a Web-based data management, visualization, and analysis system, RIMS (Rapid Integrated Mapping and Analysis System, http://earthatlas.sr.unh.edu/), with a focus on hydrological applications. Recently, in collaboration with Russian colleagues from the Institute of Monitoring of Climatic and Ecological Systems SB RAS, Russia, we significantly re-designed RIMS to include the latest Web and GIS technologies in compliance with Open Geospatial Consortium (OGC) standards. The upgraded RIMS can be applied to multiple research problems using an extensive data archive and embedded tools for data computation, visualization, and distribution. We demonstrate the current capabilities of the system with several applied data analyses for the territory of Northern Eurasia. These results include the analysis of historical, contemporary, and future changes in climate and hydrology based on station and gridded data; investigations of recent extreme hydrological events, their anomalies, causes, and potential impacts; and the creation and analysis of new data sets through integration of social and geophysical data.

  12. Design and development process of patient-centered computer-based support system for patients with schizophrenia spectrum psychosis.

    PubMed

    Valimaki, Maritta; Anttila, Minna; Hatonen, Heli; Koivunen, Marita; Jakobsson, Tiina; Pitkanen, Anneli; Herrala, Jaakko; Kuosmanen, Lauri

    2008-06-01

    Schizophrenia is a serious mental illness requiring self-management skills and information about the illness, its treatment, and where to get help with daily routines. Despite the systematic development of computer-based approaches in mental health, less systematic development of such methods can be found for patients with schizophrenia or psychosis. The aim is to describe the design and development process of a patient-centered computer-based support system (Mieli.Net portal) for patients with schizophrenia spectrum psychoses. The process, which followed a mixed-methods approach, included four phases: analysis of users' needs, development of key patient information areas, development and piloting of a software prototype of the portal, and user evaluation by health care staff. The computer-based patient support system is a promising health-promoting service for patients with schizophrenia. It is important that users of the technology are involved in the development process, which helps ensure that sites are user-friendly, information can be personalized, and patients' voices are heard in the development of patient education. The effectiveness needs to be evaluated carefully in future clinical trials, which will offer valuable information for policymakers, organizations, and health care practitioners about the usability of web-based patient education in mental health care.

  13. Development of metallization process

    NASA Technical Reports Server (NTRS)

    Garcia, A., III

    1983-01-01

    A non-lead frit paste is evaluated. A two-step process is discussed in which the bulk of the metallization is Mo/Sn but a small ohmic pad is silver. A new matrix of paste formulations is developed, and a variety of tests are performed on paste samples to determine electrical, thermal, and structural properties.

  14. Formic Acid: Development of an Analytical Method and Use as Process Indicator in Anaerobic Systems

    DTIC Science & Technology

    1992-03-01

    [Scanned-document fragments only: Georgia Institute of Technology, School of Civil Engineering, Atlanta, Georgia 30332; title page and acknowledgements. No abstract text was recovered.]

  15. Volcanic alert system (VAS) developed during the 2011-2014 El Hierro (Canary Islands) volcanic process

    NASA Astrophysics Data System (ADS)

    García, Alicia; Berrocoso, Manuel; Marrero, José M.; Fernández-Ros, Alberto; Prates, Gonçalo; De la Cruz-Reyna, Servando; Ortiz, Ramón

    2014-06-01

    The 2011 volcanic unrest at El Hierro Island illustrated the need for a Volcanic Alert System (VAS) specifically designed for the management of volcanic crises developing after long repose periods. The VAS comprises the monitoring network, the software tools for analysis of the monitoring parameters, the Volcanic Activity Level (VAL) management, and the assessment of hazard. The VAS presented here focuses on phenomena related to moderate eruptions, and on potentially destructive volcano-tectonic earthquakes and landslides. We introduce a set of new data analysis tools, aimed to detect data trend changes, as well as spurious signals related to instrumental failure. When data-trend changes and/or malfunctions are detected, a watchdog is triggered, issuing a watch-out warning (WOW) to the Monitoring Scientific Team (MST). The changes in data patterns are then translated by the MST into a VAL that is easy to use and understand by scientists, technicians, and decision-makers. Although the VAS was designed specifically for the unrest episodes at El Hierro, the methodologies may prove useful at other volcanic systems.
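
    A minimal sketch of a data-trend watchdog of the kind described, comparing recent monitoring samples against a trailing baseline and raising a watch-out warning (WOW) when the deviation exceeds a threshold; the window lengths and threshold are illustrative, not the operational VAS settings.

```python
# Minimal trend-change watchdog: flag a WOW when the recent mean departs from a
# trailing baseline by more than a multiple of the baseline standard deviation.
import numpy as np

def watchdog(series, baseline_len=48, recent_len=6, threshold_sigma=4.0):
    """Return True if the recent mean departs from the baseline by > threshold_sigma."""
    baseline = np.asarray(series[-(baseline_len + recent_len):-recent_len])
    recent = np.asarray(series[-recent_len:])
    sigma = baseline.std(ddof=1) + 1e-12
    return abs(recent.mean() - baseline.mean()) / sigma > threshold_sigma

rng = np.random.default_rng(5)
quiet = list(rng.normal(10.0, 1.0, 100))            # e.g. hourly seismic event counts
unrest = quiet + list(rng.normal(25.0, 2.0, 6))     # sudden increase in activity

print("quiet period triggers WOW:", watchdog(quiet))
print("unrest period triggers WOW:", watchdog(unrest))
```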

  16. Development of signal processing system of avalanche photo diode for space observations by Astro-H

    NASA Astrophysics Data System (ADS)

    Ohno, M.; Goto, K.; Hanabata, Y.; Takahashi, H.; Fukazawa, Y.; Yoshino, M.; Saito, T.; Nakamori, T.; Kataoka, J.; Sasano, M.; Torii, S.; Uchiyama, H.; Nakazawa, K.; Watanabe, S.; Kokubun, M.; Ohta, M.; Sato, T.; Takahashi, T.; Tajima, H.

    2013-01-01

    Astro-H is the sixth Japanese X-ray space observatory, to be launched in 2014. Two of the onboard instruments of Astro-H, the Hard X-ray Imager and the Soft Gamma-ray Detector, are surrounded by a large number of bismuth germanate (Bi4Ge3O12; BGO) scintillators. An optimum readout system for the scintillation light from these BGOs is essential to reduce the background and achieve high performance for the main detectors, because most gamma-rays arriving from outside the field of view of the main detectors, or produced inside them by activation-induced radio-isotopes, can be eliminated by an anti-coincidence technique using the BGO signals. We use avalanche photodiodes (APDs) as the light sensors for these BGO detectors, since their compactness and high quantum efficiency ease the design of such a large BGO detector system. For processing the signals from the APDs, digital filters and other trigger logic on a Field-Programmable Gate Array (FPGA) are used instead of discrete analog circuits because of the limited circuit implementation area on the spacecraft. For efficient observations, the anti-coincidence threshold must be made as low as possible by exploiting the digital filtering. In addition, the anti-coincidence signals must reach the main detector within 5 μs so that they arrive in time to veto the A-D conversion. Considering this requirement and the constraints imposed by the FPGA logic size, we adopt two types of filter: an 8-tap filter with only 2-bit coefficients and a 16-tap filter with 8-bit coefficients. The former, simple filter provides the anti-coincidence signal quickly in orbit, while the latter is used for detailed analysis after the data are downlinked.
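
    The two-filter idea can be sketched as FIR filters with coarsely quantized coefficients applied to a simulated APD waveform, with a threshold trigger on the short filter; the coefficients and pulse model are illustrative, not the flight FPGA firmware.

```python
# Sketch: an 8-tap FIR filter with 2-bit coefficients for a fast trigger and a
# 16-tap FIR filter with 8-bit coefficients for finer analysis, on a synthetic
# ADC waveform containing one BGO-like scintillation pulse.
import numpy as np

def quantize(coeffs, bits):
    """Quantize filter coefficients to signed integers with the given bit width."""
    scale = (2 ** (bits - 1) - 1) / np.max(np.abs(coeffs))
    return np.round(coeffs * scale).astype(int)

ideal = np.exp(-np.arange(16) / 4.0)                     # matched-filter-like shaping weights
fast_taps = quantize(ideal[:8], bits=2)                  # 8 taps, 2-bit coefficients
fine_taps = quantize(ideal, bits=8)                      # 16 taps, 8-bit coefficients

rng = np.random.default_rng(6)
waveform = rng.normal(0, 2, 200)                         # ADC samples of noise
waveform[100:116] += 30 * np.exp(-np.arange(16) / 4.0)   # one scintillation pulse

fast_out = np.convolve(waveform, fast_taps, mode="same")
threshold = 8 * fast_out[:80].std()                      # trigger level from the noise region
trigger_samples = np.nonzero(fast_out > threshold)[0]
fine_out = np.convolve(waveform, fine_taps, mode="same")

print("fast-filter trigger near sample:", trigger_samples[:3])
print("fine-filter peak estimate:", round(float(fine_out.max()), 1))
```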

  17. Development and fabrication of a solar cell junction processing system. Quarterly progress report No. 3, October 1980

    SciTech Connect

    Shiesling, R.

    1980-10-01

    The basic objectives of the program are the following: (1) to design, develop, construct and deliver a junction processing system which will be capable of producing solar cell junctions by means of ion implantation followed by pulsed electron beam annealing; (2) to include in the system a wafer transport mechanism capable of transferring 4-inch-diameter wafers into and out of the vacuum chamber where the ion implantation and pulsed electron beam annealing processes take place; (3) to integrate, test and demonstrate the system prior to its delivery to JPL along with detailed operating and maintenance manuals; and (4) to estimate component lifetimes and costs, as necessary for the contract, for the performance of comprehensive analyses in accordance with the Solar Array Manufacturing Industry Costing Standards (SAMICS). Under this contract the automated junction formation equipment to be developed involves a new system design incorporating a modified, government-owned, JPL-controlled ion implanter into a Spire-developed pulsed electron beam annealer and wafer transport system. When modified, the ion implanter will deliver a 16 mA beam of ³¹P⁺ ions with a fluence of 2.5 x 10¹⁵ ions per square centimeter at an energy of 10 keV. The throughput design goal rate for the junction processor is 10⁷ four-inch-diameter wafers per year. Work on the pulsed electron beam subsystem development is described. (WHK)
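    As a back-of-the-envelope check of the stated throughput goal, the implant time per wafer follows from the quoted fluence and beam current; the wafer area and machine duty cycle below are assumptions.

```python
# Rough throughput check for the junction processor, using only numbers quoted
# above plus assumed values for the implanted wafer area and machine duty cycle.
ELEMENTARY_CHARGE = 1.602e-19          # C per singly charged 31P+ ion

fluence = 2.5e15                       # ions / cm^2 (from the abstract)
beam_current = 16e-3                   # A (from the abstract)
wafer_area = 3.1416 * (10.16 / 2)**2   # cm^2, full 4-inch (10.16 cm) wafer, assumed
duty_cycle = 0.7                       # assumed fraction of time the beam is on a wafer

charge_per_wafer = fluence * wafer_area * ELEMENTARY_CHARGE   # coulombs
implant_time = charge_per_wafer / beam_current                # seconds per wafer

seconds_per_year = 3600 * 24 * 365
wafers_per_year = duty_cycle * seconds_per_year / implant_time

print(f"implant time per wafer: {implant_time:.2f} s")
print(f"throughput: {wafers_per_year:.2e} wafers/year")   # ~1e7, consistent with the goal
```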

  18. The MINERVA Software Development Process

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.

    2017-01-01

    This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.
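    A hypothetical sketch of step (4) of the process, numerical evaluation of both the specification and the implementation on stress cases. The geo-containment check is reduced here to a point-in-polygon test with two independently coded versions; the function names and the test harness are illustrative, not the MINERVA artifacts.

```python
# Hypothetical sketch of MINERVA step (4): numerically evaluating a "spec"
# version and a mirrored "implementation" on the same stress cases.
# The point-in-polygon example is illustrative only.
import math, random

def spec_inside(point, polygon):
    """'Specification' version: winding-number test (stands in for the formal spec)."""
    x, y = point
    angle = 0.0
    for (x1, y1), (x2, y2) in zip(polygon, polygon[1:] + polygon[:1]):
        a1 = math.atan2(y1 - y, x1 - x)
        a2 = math.atan2(y2 - y, x2 - x)
        d = a2 - a1
        while d > math.pi:  d -= 2 * math.pi
        while d < -math.pi: d += 2 * math.pi
        angle += d
    return abs(angle) > math.pi   # ~2*pi if inside, ~0 if outside

def impl_inside(point, polygon):
    """'Implementation' version: ray casting, as deployed code might do it."""
    x, y = point
    inside = False
    for (x1, y1), (x2, y2) in zip(polygon, polygon[1:] + polygon[:1]):
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Stress both versions with random points around a square containment region.
polygon = [(0, 0), (10, 0), (10, 10), (0, 10)]
random.seed(1)
mismatches = 0
for _ in range(10000):
    p = (random.uniform(-5, 15), random.uniform(-5, 15))
    mismatches += spec_inside(p, polygon) != impl_inside(p, polygon)
print("disagreements on 10000 stress cases:", mismatches)
```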

  19. Developments in triple quadrupole mass spectrometry. I. Distributed processing control system. II. Screening applications for fuel analysis

    SciTech Connect

    Myerholtz, C.A.

    1984-01-01

    A data acquisition and control system for a triple quadrupole mass spectrometer has been developed using several microprocessors in a distributed processing system. This system includes four processors, one acting as the system master controlling three slave processors. In such a distributed processing system each processor is assigned a specific task. Critical to this application is the allocation of the tasks of data acquisition, ion path control, and peak finding to separate slave processors. This modular approach leads to a system where each major section of the instrument has its own dedicated intelligence. This parallel processing system allows operations that are often implemented in hardware (for speed considerations) to be performed in software. The use of triple quadrupole mass spectrometry, an MS/MS technique, to detect selected species in middle distillate fuels was examined. Collision-activated dissociation (CAD) spectra were obtained for reference compounds from several heteroatom-containing compound classes. The CAD results were used to select screening reactions for each compound class. The effectiveness of these screening reactions was demonstrated by identifying the presence of various species in samples of Jet A aviation fuel, a shale oil derived fuel and No. 2 diesel fuel.

  20. The process of development of a prioritization tool for a clinical decision support build within a computerized provider order entry system: Experiences from St Luke's Health System.

    PubMed

    Wolf, Matthew; Miller, Suzanne; DeJong, Doug; House, John A; Dirks, Carl; Beasley, Brent

    2016-09-01

    To establish a process for the development of a prioritization tool for a clinical decision support build within a computerized provider order entry system and concurrently to prioritize alerts for Saint Luke's Health System. The process of prioritizing clinical decision support alerts included (a) consensus sessions to establish a prioritization process and identify clinical decision support alerts through a modified Delphi process and (b) a clinical decision support survey to validate the results. All members of our health system's physician quality organization, Saint Luke's Care, as well as clinicians, administrators, and pharmacy staff throughout Saint Luke's Health System, were invited to participate in this confidential survey. The consensus sessions yielded a prioritization process through alert contextualization and associated Likert-type scales. Utilizing this process, the clinical decision support survey polled the opinions of 850 clinicians with a 64.7 percent response rate. Three of the top-rated alerts were approved for the pre-implementation build at Saint Luke's Health System: Acute Myocardial Infarction Core Measure Sets, Deep Vein Thrombosis Prophylaxis within 4 h, and Criteria for Sepsis. This study establishes a process for developing a prioritization tool for a clinical decision support build within a computerized provider order entry system that may be applicable to similar institutions. © The Author(s) 2015.

  1. Low cost solar array project production process and equipment task: A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Several major modifications were made to the design presented at the PDR. The frame was deleted in favor of a "frameless" design which will provide a substantially improved cell packing factor. Potential shaded cell damage resulting from operation into a short circuit can be eliminated by a change in the cell series/parallel electrical interconnect configuration. The baseline process sequence defined for the MEPSDU was refined and equipment design and specification work was completed. SAMICS cost analysis work accelerated, Format A's were prepared, and computer simulations were completed. Design work on the automated cell interconnect station was focused on bond technique selection experiments.

  2. Development of a next-generation automated DICOM processing system in a PACS-less research environment.

    PubMed

    Ziegler, Scott E

    2012-10-01

    The use of clinical imaging modalities within the pharmaceutical research space provides value and challenges. Typical clinical settings will utilize a Picture Archive and Communication System (PACS) to transmit and manage Digital Imaging and Communications in Medicine (DICOM) images generated by clinical imaging systems. However, a PACS is complex and provides many features that are not required within a research setting, making it difficult to generate a business case and determine the return on investment. We have developed a next-generation DICOM processing system using open-source software, commodity server hardware such as Apple Xserve®, high-performance network-attached storage (NAS), and in-house-developed preprocessing programs. DICOM-transmitted files are arranged in a flat file folder hierarchy easily accessible via our downstream analysis tools and a standard file browser. This next-generation system had a minimal construction cost due to the reuse of all the components from our first-generation system with the addition of a second server for a few thousand dollars. Performance metrics were gathered and the system was found to be highly scalable, performed significantly better than the first-generation system, is modular, has satisfactory image integrity, and is easier to maintain than the first-generation system. The resulting system is also portable across platforms and utilizes minimal hardware resources, allowing for easier upgrades and migration to smaller form factors at the hardware end-of-life. This system has been in production successfully for 8 months and services five clinical instruments and three pre-clinical instruments. This system has provided us with the necessary DICOM C-Store functionality, eliminating the need for a clinical PACS for day-to-day image processing.
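    The abstract does not name the software stack; the sketch below shows one way such a PACS-less C-Store receiver could be built with the open-source pynetdicom and pydicom libraries, filing each received instance into an assumed flat patient/study folder hierarchy.

```python
# Sketch of a PACS-less DICOM receiver in the spirit described above: a
# C-Store service that files incoming images into a flat folder hierarchy.
# pynetdicom/pydicom are used purely for illustration, and the archive root
# and folder layout are assumptions.
import os
from pynetdicom import AE, evt, AllStoragePresentationContexts

ARCHIVE_ROOT = "/data/dicom"   # hypothetical NAS mount

def handle_store(event):
    """Write each received instance under <root>/<PatientID>/<StudyUID>/."""
    ds = event.dataset
    ds.file_meta = event.file_meta
    folder = os.path.join(ARCHIVE_ROOT,
                          str(ds.get("PatientID", "UNKNOWN")),
                          str(ds.get("StudyInstanceUID", "UNKNOWN")))
    os.makedirs(folder, exist_ok=True)
    ds.save_as(os.path.join(folder, ds.SOPInstanceUID + ".dcm"),
               write_like_original=False)
    return 0x0000   # Success status

ae = AE(ae_title="RESEARCH_STORE")
ae.supported_contexts = AllStoragePresentationContexts
ae.start_server(("0.0.0.0", 11112), block=True,
                evt_handlers=[(evt.EVT_C_STORE, handle_store)])
```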

  3. Development and verification of signal processing system of avalanche photo diode for the active shields onboard ASTRO-H

    NASA Astrophysics Data System (ADS)

    Ohno, M.; Kawano, T.; Edahiro, I.; Shirakawa, H.; Ohashi, N.; Okada, C.; Habata, S.; Katsuta, J.; Tanaka, Y.; Takahashi, H.; Mizuno, T.; Fukazawa, Y.; Murakami, H.; Kobayashi, S.; Miyake, K.; Ono, K.; Kato, Y.; Furuta, Y.; Murota, Y.; Okuda, K.; Wada, Y.; Nakazawa, K.; Mimura, T.; Kataoka, J.; Ichinohe, Y.; Uchida, Y.; Katsuragawa, M.; Yoneda, H.; Sato, G.; Sato, R.; Kawaharada, M.; Harayama, A.; Odaka, H.; Hayashi, K.; Ohta, M.; Watanabe, S.; Kokubun, M.; Takahashi, T.; Takeda, S.; Kinoshita, M.; Yamaoka, K.; Tajima, H.; Yatsu, Y.; Uchiyama, H.; Saito, S.; Yuasa, T.; Makishima, K.

    2016-09-01

    The Hard X-ray Imager and Soft Gamma-ray Detector onboard ASTRO-H provide high sensitivity to hard X-rays (5-80 keV) and soft gamma-rays (60-600 keV), respectively. To reduce the background, both instruments are actively shielded by large, thick Bismuth Germanate scintillators. We have developed the signal processing system for the avalanche photodiodes in the BGO active shields and have demonstrated its effectiveness after assembly in the flight model of the HXI/SGD sensor and after integration into the satellite. The energy threshold achieved is about 150 keV and the anti-coincidence efficiency for cosmic-ray events is almost 100%. Installed in the BGO active shield, the developed signal processing system successfully reduces the background level of the main detector.

  4. Develop Recovery Systems for Separations of Salts from Process Streams for use in Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Colon, Guillermo

    1998-01-01

    The main objectives of this project were the development of a four-compartment electrolytic cell using highly selective membranes to remove nitrate from crop residue leachate and convert it to nitric acid, and the development of a six-compartment electrodialysis cell to selectively remove sodium from urine wastes. The recovery of nutrients from both inedible plant biomass and human wastes to sustain a biomass production system is an important aspect of developing a controlled ecological life support system (CELSS) to provide the basic human needs required for life support during long-term space missions. A four-compartment electrolytic cell has been proposed to selectively remove nitrate from crop residue and convert it to nitric acid, which is currently used in the NASA-KSC Controlled Ecological Life Support System to control the pH of the aerobic bioreactors and the biomass production chamber. Human activities in a closed system require large amounts of air, water, and minerals to sustain life, and also generate wastes. Before human wastes can be used as nutrients, they must be treated to reduce their organic content and to remove some minerals that have adverse effects on plant growth. Of all the minerals present in human urine, sodium chloride (NaCl) is the only one that cannot be used as a nutrient for most plants. Human activities also require sodium chloride as part of the diet. Therefore, technology to remove and recover sodium chloride from wastes is highly desirable. A six-compartment electrodialysis cell using highly selective membranes has been proposed to remove and recover NaCl from human urine.

  6. Development Status of a CVD System to Deposit Tungsten onto UO2 Powder via the WCl6 Process

    NASA Technical Reports Server (NTRS)

    Mireles, O. R.; Kimberlin, A.; Broadway, J.; Hickman, R.

    2014-01-01

    Nuclear Thermal Propulsion (NTP) is under development for deep space exploration. NTP's high specific impulse (> 850 seconds) enables a large range of destinations, shorter trip durations, and improved reliability. W-60vol%UO2 CERMET fuel development efforts emphasize fabrication, performance testing, and process optimization to meet service-life requirements. Fuel elements must be able to survive operation in excess of 2850 K, exposure to flowing hydrogen (H2), and vibration, acoustic, and radiation conditions. The CTE mismatch between W and UO2 results in high thermal stresses and leads to mechanical failure, as does UO2 reduction by hot hydrogen (H2) [1]. Improved powder metallurgy fabrication process control and mitigated fuel loss can be attained by coating the UO2 starting powders with a layer of high-density tungsten [2]. This paper discusses the advances of a fluidized bed chemical vapor deposition (CVD) system that utilizes the H2-WCl6 reduction process.

  7. Data Processing System (DPS) software with experimental design, statistical analysis and data mining developed for use in entomological research.

    PubMed

    Tang, Qi-Yi; Zhang, Chuan-Xi

    2013-04-01

    A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology.

  8. Silicon Web Process Development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Hopkins, R. H.; Mchugh, J. P.; Hill, F. E.; Heimlich, M. E.; Driggers, J. M.

    1978-01-01

    Progress in the development of techniques to grow silicon web at a 25 sq cm/min output rate is reported. Feasibility of web growth with simultaneous melt replenishment is discussed. Other factors covered include: (1) tests of aftertrimmers to improve web width; (2) evaluation of growth lid designs to raise speed and output rate; (3) tests of melt replenishment hardware; and (4) investigation of directed gas flow systems to control unwanted oxide deposition in the system and to improve convective cooling of the web. Compatibility with sufficient solar cell performance is emphasized.

  9. Evaluating and Understanding Parameterized Convective Processes and Their Role in the Development of Mesoscale Precipitation Systems

    NASA Technical Reports Server (NTRS)

    Fritsch, J. Michael (Principal Investigator); Kain, John S.

    1995-01-01

    Research efforts during the first year focused on numerical simulations of two convective systems with the Penn State/NCAR mesoscale model. The first of these systems was tropical cyclone Irma, which occurred in 1987 in Australia's Gulf of Carpentaria during the AMEX field program. Comparison simulations of this system were done with two different convective parameterization schemes (CPS's), the Kain-Fritsch (KF; 1993) and the Betts-Miller (BM; Betts 1986) schemes. The second system was the June 10-11, 1985 squall line simulation, which occurred over the Kansas-Oklahoma region during the PRE-STORM experiment. Simulations of this system using the KF scheme were examined in detail.

  10. Evaluating and Understanding Parameterized Convective Processes and Their Role in the Development of Mesoscale Precipitation Systems

    NASA Technical Reports Server (NTRS)

    Fritsch, J. Michael; Kain, John S.

    1996-01-01

    Research efforts focused on numerical simulations of two convective systems with the Penn State/NCAR mesoscale model. The first of these systems was tropical cyclone Irma, which occurred in 1987 in Australia's Gulf of Carpentaria during the AMEX field program. Comparison simulations of this system were done with two different convective parameterization schemes (CPS's), the Kain-Fritsch (KF) and the Betts-Miller (BM) schemes. The second system was the June 10-11, 1985 squall line simulation, which occurred over the Kansas-Oklahoma region during the PRE-STORM experiment. Simulations of this system using the KF scheme were examined in detail.

  11. Development and testing of a wet oxidation waste processing system. [for waste treatment aboard manned spacecraft

    NASA Technical Reports Server (NTRS)

    Weitzmann, A. L.

    1977-01-01

    The wet oxidation process is considered as a potential treatment method for wastes aboard manned spacecraft for these reasons: (1) Fecal and urine wastes are processed to sterile water and CO2 gas. However, the water requires post-treatment to remove salts and odor; (2) the residual ash is negligible in quantity, sterile and easily collected; and (3) the product CO2 gas can be processed through a reduction step to aid in material balance if needed. Reaction of waste materials with oxygen at elevated temperature and pressure also produces some nitrous oxide, as well as trace amounts of a few other gases.

  12. Development of a Real-Time General-Purpose Digital Signal Processing Laboratory System.

    DTIC Science & Technology

    1983-12-01

    [List-of-figures/tables residue omitted ("Structure Chart", "Block Diagram of Correlation Method", "User Interface"). Legible abstract fragments:] ...transmitted and received by complex electrical apparatus, the performance of which could be subjected to mathematical analysis. Signal processing now... The definition of the term "signal" now includes almost any physical variable of interest, and the techniques of signal analysis and processing are

  13. Developing a Microcomputer-Based Decision Support System: People and Process.

    ERIC Educational Resources Information Center

    Starratt, Joseph; And Others

    1990-01-01

    Discusses the need for management information and decision support systems in libraries, and identifies inertia and confusion as the main contributors to the lack of successful implementations. An attempt to initiate a decision support system at the University of Nebraska at Omaha is described, and both problems encountered and benefits gained are…

  14. Development of a Natural Language Processing System to Identify Timing and Status of Colonoscopy Testing in Electronic Medical Records

    PubMed Central

    Denny, Joshua C.; Peterson, Josh F.; Choma, Neesha N.; Xu, Hua; Miller, Randolph A.; Bastarache, Lisa; Peterson, Neeraja B.

    2009-01-01

    Colorectal cancer (CRC) screening rates are low despite proven benefits. We developed natural language processing (NLP) algorithms to identify temporal expressions and status indicators, such as “patient refused” or “test scheduled.” The authors incorporated the algorithms into the KnowledgeMap Concept Identifier system in order to detect references to completed colonoscopies within electronic text. The modified NLP system was evaluated using 200 randomly selected electronic medical records (EMRs) from a primary care population aged ≥50 years. The system detected completed colonoscopies with recall and precision of 0.93 and 0.92. The system was superior to a query of colonoscopy billing codes to determine screening status. PMID:20351837
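    A toy illustration of the status-indicator idea described above (the published algorithms extend the KnowledgeMap Concept Identifier and are more sophisticated); the regular expressions and labels below are assumptions for demonstration.

```python
# Toy status/temporal classification of colonoscopy mentions. The patterns are
# simple illustrative regexes, not the published NLP algorithms.
import re

NEGATING_STATUS = re.compile(r"\b(refused|declined|scheduled|planned|recommended)\b", re.I)
MENTION = re.compile(r"\bcolonoscopy\b", re.I)
COMPLETED = re.compile(r"\b(underwent|completed|performed|had a)\b", re.I)
DATE = re.compile(r"\b(19|20)\d{2}\b")

def classify(sentence):
    if not MENTION.search(sentence):
        return "no mention"
    if NEGATING_STATUS.search(sentence):
        return "not completed (status indicator)"
    if COMPLETED.search(sentence) or DATE.search(sentence):
        return "completed" + (" with date" if DATE.search(sentence) else "")
    return "mention, status unclear"

for s in ["Patient refused colonoscopy screening.",
          "He underwent colonoscopy in 2006 with normal findings.",
          "Colonoscopy scheduled for next month."]:
    print(classify(s), "<-", s)
```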

  15. Spitzer Telemetry Processing System

    NASA Technical Reports Server (NTRS)

    Stanboli, Alice; Martinez, Elmain M.; McAuley, James M.

    2013-01-01

    The Spitzer Telemetry Processing System (SirtfTlmProc) was designed to address objectives of JPL's Multi-mission Image Processing Lab (MIPL) in processing spacecraft telemetry and distributing the resulting data to the science community. To minimize costs and maximize operability, the software design focused on automated error recovery, performance, and information management. The system processes telemetry from the Spitzer spacecraft and delivers Level 0 products to the Spitzer Science Center. SirtfTlmProc is a unique system with automated error notification and recovery, with a real-time continuous service that can go quiescent after periods of inactivity. The software can process 2 GB of telemetry and deliver Level 0 science products to the end user in four hours. It provides analysis tools so the operator can manage the system and troubleshoot problems. It automates telemetry processing in order to reduce staffing costs.

  16. Development of a strategy for energy efficiency improvement in a Kraft process based on systems interactions analysis

    NASA Astrophysics Data System (ADS)

    Mateos-Espejel, Enrique

    The objective of this thesis is to develop, validate, and apply a unified methodology for the energy efficiency improvement of a Kraft process that addresses globally the interactions of the various process systems that affect its energy performance. An implementation strategy is the final result. An operating Kraft pulping mill situated in Eastern Canada with a production of 700 adt/d of high-grade bleached pulp was the case study. The Pulp and Paper industry is Canada's premier industry. It is characterized by large thermal energy and water consumption. Rising energy costs and more stringent environmental regulations have led the industry to refocus its efforts toward identifying ways to improve energy and water conservation. Energy and water aspects are usually analyzed independently, but in reality they are strongly interconnected. Therefore, there is a need for an integrated methodology, which considers energy and water aspects, as well as the optimal utilization and production of the utilities. The methodology consists of four successive stages. The first stage is the base case definition. The development of a focused, reliable and representative model of an operating process is a prerequisite to the optimization and fine tuning of its energy performance. A four-pronged procedure has been developed: data gathering, master diagram, utilities systems analysis, and simulation. The computer simulation has been focused on the energy and water systems. The second stage corresponds to the benchmarking analysis. The benchmarking of the base case has the objectives of identifying the process inefficiencies and to establish guidelines for the development of effective enhancement measures. The studied process is evaluated by a comparison of its efficiency to the current practice of the industry and by the application of new energy and exergy content indicators. The minimum energy and water requirements of the process are also determined in this step. The third stage is

  17. Results of the Aeronautical Systems Division Critical Process Team on Integrated Product Development

    DTIC Science & Technology

    1990-11-01

    [Table-of-contents residue omitted ("7.3 Facilities", "8.0 Integrated Business Requirements"). Legible two-column fragments, de-interleaved:] The unique offeror's capabilities must be understood in order to evaluate the proposed... business requirements and contracting methods. Key processes that... business requirements should address competition and breakout policies, work measurement techniques... proper application of activity-based costing and

  18. Cognition-based development and evaluation of ergonomic user interfaces for medical image processing and archiving systems.

    PubMed

    Demiris, A M; Meinzer, H P

    1997-01-01

    Whether or not a computerized system enhances the conditions of work in the application domain depends very much on the user interface. Graphical user interfaces attract the interest of users but often ignore basic rules of visual information processing, leading to systems that are difficult to use, lowering productivity and increasing cognitive and work load. In this work we present some fundamental ergonomic considerations and their application to the medical image processing and archiving domain. We introduce the extensions to an existing concept needed to control and guide the development of GUIs with respect to domain-specific ergonomics. The suggested concept, called Model-View-Controller Constraints (MVCC), can be used to programmatically implement ergonomic constraints, and thus has some advantages over written style guides. We conclude with a presentation of existing norms and methods for evaluating user interfaces.

  19. Development of the Process Index for NiCrAlY Coatings with the Mettech Axial III™ System

    NASA Astrophysics Data System (ADS)

    Gao, Feng; Yang, Qi; Huang, Xiao; Liu, Rong

    2013-03-01

    NiCrAlY coatings were deposited using the Mettech Axial III™ plasma spray system. The microstructural features of the coatings, such as porosity, cracks, unmelted particles, and oxide content, were analyzed to investigate the effects of the spray process parameters on these features. Two Taguchi arrays were used to examine the effects of the spray process parameters, such as powder size, ratio of (H2 + N2) gas flow over total gas flow, current, spray-gun nozzle size, and spray distance, on the microstructural features of the coatings. The results from statistical analysis are used to create regression equations to predict the microstructural features of the coatings. In the regression equations, a process index (PI) is used as a complex variable incorporating a number of process parameters. The results from an additional set of experiments are used to verify the validity of the regression equations. It has been demonstrated that the equations correlate well with the results from the subsequent set of experiments. It is concluded from this study that the PI can be used to categorize coating qualities with respect to the extent of cracks, porosity, unmelted particles, and oxide content in the coating. These equations can also serve as an initial step in developing process parameters by means of the Mettech Axial III™ System.
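    The regression step can be illustrated with a short sketch; the process-index values and porosity data below are invented for demonstration and do not reproduce the published equations or coefficients.

```python
# Illustrative least-squares fit of one coating feature against a process index
# (PI). The PI values and porosity data are hypothetical.
import numpy as np

pi = np.array([1.2, 1.8, 2.3, 2.9, 3.4, 4.1, 4.8])        # hypothetical process index
porosity = np.array([6.1, 5.2, 4.6, 3.9, 3.5, 2.8, 2.2])  # hypothetical porosity, %

# Fit porosity = a * PI + b and report the goodness of fit.
a, b = np.polyfit(pi, porosity, 1)
pred = np.polyval([a, b], pi)
r2 = 1 - np.sum((porosity - pred) ** 2) / np.sum((porosity - porosity.mean()) ** 2)

print(f"porosity ~ {a:.2f} * PI + {b:.2f}   (R^2 = {r2:.3f})")
print("predicted porosity at PI = 3.0:", np.polyval([a, b], 3.0))
```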

  20. SAR processing using SHARC signal processing systems

    NASA Astrophysics Data System (ADS)

    Huxtable, Barton D.; Jackson, Christopher R.; Skaron, Steve A.

    1998-09-01

    Synthetic aperture radar (SAR) is uniquely suited to help solve the Search and Rescue problem since it can be utilized either day or night and through both dense fog and thick cloud cover. Other papers in this session, and in this session in 1997, describe the various SAR image processing algorithms that are being developed and evaluated within the Search and Rescue Program. All of these approaches to using SAR data require substantial amounts of digital signal processing: for the SAR image formation, and possibly for the subsequent image processing. In recognition of the demanding processing that will be required for an operational Search and Rescue Data Processing System (SARDPS), NASA/Goddard Space Flight Center and NASA/Stennis Space Center are conducting a technology demonstration utilizing SHARC multi-chip modules from Boeing to perform SAR image formation processing.

  1. Development and implementation of the Caribbean Laboratory Quality Management Systems Stepwise Improvement Process (LQMS-SIP) Towards Accreditation

    PubMed Central

    Alemnji, George; Edghill, Lisa; Wallace-Sankarsingh, Sacha; Albalak, Rachel; Cognat, Sebastien; Nkengasong, John; Gabastou, Jean-Marc

    2017-01-01

    Background Implementing quality management systems and accrediting laboratories in the Caribbean has been a challenge. Objectives We report the development of a stepwise process for quality systems improvement in the Caribbean Region. Methods The Caribbean Laboratory Stakeholders met under a joint Pan American Health Organization/US Centers for Disease Control and Prevention initiative and developed a user-friendly framework called ‘Laboratory Quality Management System – Stepwise Improvement Process (LQMS-SIP) Towards Accreditation’ to support countries in strengthening laboratory services through a stepwise approach toward fulfilling the ISO 15189: 2012 requirements. Results This approach consists of a three-tiered framework. Tier 1 represents the minimum requirements corresponding to the mandatory criteria for obtaining a licence from the Ministry of Health of the participating country. The next two tiers are quality improvement milestones that are achieved through the implementation of specific quality management system requirements. Laboratories that meet the requirements of the three tiers will be encouraged to apply for accreditation. The Caribbean Regional Organisation for Standards and Quality hosts the LQMS-SIP Secretariat and will work with countries, including the Ministry of Health and stakeholders, including laboratory staff, to coordinate and implement LQMS-SIP activities. The Caribbean Public Health Agency will coordinate and advocate for the LQMS-SIP implementation. Conclusion This article presents the Caribbean LQMS-SIP framework and describes how it will be implemented among various countries in the region to achieve quality improvement. PMID:28879149

  2. Scaled Vitrification System III (SVS III) Process Development and Laboratory Tests at the West Valley Demonstration Project

    SciTech Connect

    V. Jain; S. M. Barnes; B. G. Bindi; R. A. Palmer

    2000-04-30

    At the West Valley Demonstration Project (WVDP), the Vitrification Facility (VF) is designed to convert the high-level radioactive waste (HLW) stored on the site to a stable glass for disposal at a Department of Energy (DOE)-specified federal repository. The Scaled Vitrification System III (SVS-III) verification tests were conducted between February 1995 and August 1995 as a supplemental means to support the vitrification process flowsheet, but at only one seventh the scale. During these tests, the process flowsheet was refined and optimized. The SVS-III test series was conducted with a focus on confirming the applicability of the Redox Forecasting Model, which was based on the Index of Feed Oxidation (IFO) developed during the Functional and Checkout Testing of Systems (FACTS) and SVS-I tests. Additional goals were to investigate the prototypical feed preparation cycle and test the new target glass composition. Included in this report are the basis and current designs of the major components of the Scaled Vitrification System and the results of the SVS-III tests. The major subsystems described are the feed preparation and delivery, melter, and off-gas treatment systems. In addition, correlations between the melter's operation and its various parameters, which included feed rate, cold cap coverage, oxidation-reduction (redox) state of the glass, melter power, plenum temperature, and airlift analysis, were developed.

  3. Development of a Fully Automated Guided Wave System for In-Process Cure Monitoring of CFRP Composite Laminates

    NASA Technical Reports Server (NTRS)

    Hudson, Tyler B.; Hou, Tan-Hung; Grimsley, Brian W.; Yaun, Fuh-Gwo

    2016-01-01

    A guided wave-based in-process cure monitoring technique for carbon fiber reinforced polymer (CFRP) composites was investigated at NASA Langley Research Center. A key cure transition point (vitrification) was identified and the degree of cure was monitored using metrics such as amplitude and time of arrival (TOA) of guided waves. Using an automated system preliminarily developed in this work, high-temperature piezoelectric transducers were utilized to interrogate a twenty-four ply unidirectional composite panel fabricated from Hexcel (Registered Trademark) IM7/8552 prepreg during cure. It was shown that the amplitude of the guided wave increased sharply around vitrification and the TOA curve possessed an inverse relationship with degree of cure. The work is a first step in demonstrating the feasibility of transitioning the technique to perform in-process cure monitoring in an autoclave, defect detection during cure, and ultimately a closed-loop process control to maximize composite part quality and consistency.
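    A small sketch of how the two metrics named above, guided-wave amplitude and time of arrival (TOA), might be extracted from a received waveform; the synthetic signal, sampling rate, and threshold are assumptions.

```python
# Illustrative extraction of guided-wave amplitude and time of arrival (TOA)
# from a synthetic received waveform; all signal parameters are assumed.
import numpy as np
from scipy.signal import hilbert

fs = 1.0e6                                   # sampling rate, Hz (assumed)
t = np.arange(0, 2e-3, 1 / fs)               # 2 ms record
true_toa = 0.8e-3                            # s, location of the simulated wave packet
signal = np.exp(-((t - true_toa) / 5e-5) ** 2) * np.sin(2 * np.pi * 100e3 * t)
signal += 0.02 * np.random.default_rng(4).normal(size=t.size)

envelope = np.abs(hilbert(signal))           # amplitude envelope of the guided wave
amplitude = envelope.max()
toa = t[np.argmax(envelope > 0.3 * amplitude)]   # first crossing of 30% of peak

print(f"peak amplitude: {amplitude:.3f}")
print(f"time of arrival: {toa * 1e3:.3f} ms")
```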

  4. High-throughput downstream process development for cell-based products using aqueous two-phase systems.

    PubMed

    Zimmermann, Sarah; Gretzinger, Sarah; Schwab, Marie-Luise; Scheeder, Christian; Zimmermann, Philipp K; Oelmeier, Stefan A; Gottwald, Eric; Bogsnes, Are; Hansson, Mattias; Staby, Arne; Hubbuch, Jürgen

    2016-09-16

    As the clinical development of cell-based therapeutics has evolved immensely within the past years, downstream processing strategies become more relevant than ever. Aqueous two-phase systems (ATPS) enable the label-free, scalable, and cost-effective separation of cells, making them a promising tool for downstream processing of cell-based therapeutics. Here, we report the development of an automated robotic screening platform that enables high-throughput cell partitioning analysis in ATPS. We demonstrate that this setup enables fast and systematic investigation of factors influencing cell partitioning. Moreover, we examined and optimized separation conditions for the differentiable promyelocytic cell line HL-60 and used a counter-current distribution model to investigate optimal separation conditions for a multi-stage purification process. Finally, we show that the separation of CD11b-positive and CD11b-negative HL-60 cells is possible after partial DMSO-mediated differentiation towards the granulocytic lineage. The modeling data indicate that complete peak separation is possible with 30 transfers, and >93% of CD11b-positive HL-60 cells can be recovered with >99% purity. The here described screening platform facilitates faster, cheaper, and more directed downstream process development for cell-based therapeutics and presents a powerful tool for translational research. Copyright © 2016 Elsevier B.V. All rights reserved.
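    The counter-current distribution model referred to above can be sketched with the classical Craig formula, in which a population with single-stage partition fraction p ends up in tube r after n transfers with binomial probability; the partition fractions below are assumed values, not the measured HL-60 data.

```python
# Sketch of a counter-current (Craig) distribution model: after n transfers a
# population with per-stage top-phase fraction p lies in tube r with binomial
# probability C(n, r) * p^r * (1-p)^(n-r). Partition fractions are assumptions.
from math import comb

def ccd_profile(n_transfers, p):
    """Fraction of the population in each tube 0..n after n transfers."""
    return [comb(n_transfers, r) * p**r * (1 - p)**(n_transfers - r)
            for r in range(n_transfers + 1)]

n = 30
pos = ccd_profile(n, p=0.75)   # CD11b-positive cells, assumed to favor the top phase
neg = ccd_profile(n, p=0.30)   # CD11b-negative cells, assumed to favor the bottom phase

# Pool the high-numbered tubes and see what recovery/purity that would give.
cut = 20
recovery_pos = sum(pos[cut:])
contamination = sum(neg[cut:])
purity = recovery_pos / (recovery_pos + contamination)
print(f"pooling tubes {cut}-{n}: recovery {recovery_pos:.1%}, purity {purity:.1%}")
```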

  5. Design Process for the Development of a New Truck Monitoring System - 13306

    SciTech Connect

    LeBlanc, P.J.; Bronson, Frazier

    2013-07-01

    Canberra Industries, Inc. has designed a new truck monitoring system for a facility in Japan. The customer desires to separately quantify the Cs-137 and Cs-134 content of truck cargo entering and leaving a Waste Consolidation Area. The content of the trucks will be some combination of sand, soil, and vegetation with densities ranging from 0.3 g/cc - 1.6 g/cc. The typical weight of the trucks will be approximately 10 tons, but can vary between 4 and 20 tons. The system must be sensitive enough to detect 100 Bq/kg in 10 seconds (with less than 10% relative standard deviation) but still have enough dynamic range to measure 1,000,000 Bq/kg material. The system will be operated in an outdoor environment. Starting from these requirements, Canberra explored all aspects of the counting system in order to provide the customer with the optimized solution. The desire to separately quantify Cs-137 and Cs-134 favors the use of a spectroscopic system as a solution. Using the In Situ Object Counting System (ISOCS) mathematical efficiency calculation tool, we explored various detector types, number, and physical arrangement for maximum performance. Given the choice of detector, the ISOCS software was used to investigate which geometric parameters (fill height, material density, etc.) caused the most fluctuations in the efficiency results. Furthermore, these variations were used to obtain quantitative estimates of the uncertainties associated with the possible physical variations in the truck size, detector positioning, and material composition, density, and fill height. Various shielding options were also explored to ensure that any measured Cs content would be from the truck and not from the surrounding area. The details of the various calculations along with the final design are given. (authors)
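    A rough counting-statistics reading of the stated sensitivity requirement (100 Bq/kg in 10 s with <10% relative standard deviation) is sketched below; the cargo mass is taken from the abstract, while gamma yield and background are ignored for simplicity.

```python
# Rough counting-statistics check of the stated sensitivity requirement.
# Gamma yield and background are neglected; this is an order-of-magnitude sketch.
activity_conc = 100.0    # Bq/kg (requirement)
cargo_mass = 10000.0     # kg, typical 10-ton load (from the abstract)
count_time = 10.0        # s (requirement)
target_rsd = 0.10        # <10 % relative standard deviation (requirement)

# Poisson statistics: RSD = 1/sqrt(N), so the net peak counts needed are
required_counts = (1.0 / target_rsd) ** 2           # = 100 counts
source_rate = activity_conc * cargo_mass            # decays per second in the load

# Minimum overall detection efficiency (counts per decay) the design must reach.
min_efficiency = required_counts / (source_rate * count_time)
print(f"required net counts: {required_counts:.0f}")
print(f"minimum net detection efficiency: {min_efficiency:.1e} counts per decay")
```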

  6. Review on Biomass Torrefaction Process and Product Properties and Design of Moving Bed Torrefaction System Model Development

    SciTech Connect

    Jaya Shankar Tumuluru; Christopher T. Wright; Shahab Sokhansanj

    2011-08-01

    Torrefaction is currently developing as an important preprocessing step to improve the quality of biomass in terms of physical properties and proximate and ultimate composition. Torrefaction is a slow heating of biomass in an inert or reducing environment to a maximum temperature of 300 °C. Torrefaction can also be defined as a group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in a temperature range of 200-230 °C and 270-280 °C. Thus, the process can also be called mild pyrolysis, as it occurs in the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, which produces a final product with lower mass but a higher heating value. There is a lack of literature on the design aspects of torrefaction reactors and of a design sheet for estimating the dimensions of the torrefier based on capacity. This study includes (a) conducting a detailed review of the torrefaction of biomass in terms of understanding the process, product properties, off-gas compositions, and methods used, and (b) designing a moving bed torrefier, taking into account basic fundamental heat and mass transfer calculations. Specific objectives include calculating dimensions such as the diameter and height of the moving packed bed torrefier for capacities ranging from 25-1000 kg/hr, designing the heat loads and gas flow rates, and
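    The sizing step can be illustrated with a minimal sketch that converts capacity, residence time, and bulk density into a bed volume and then into diameter and height for an assumed height-to-diameter ratio; all property values are placeholders, not the report's design numbers.

```python
# Illustrative sizing of a cylindrical moving packed-bed torrefier. Residence
# time, bulk density, and aspect ratio are assumed values for demonstration.
from math import pi

def torrefier_dimensions(capacity_kg_h, residence_min=30.0,
                         bulk_density=250.0, aspect_ratio=3.0):
    """Return (diameter_m, height_m) for the given throughput."""
    volume = (capacity_kg_h * residence_min / 60.0) / bulk_density  # m^3 of bed hold-up
    diameter = (4.0 * volume / (pi * aspect_ratio)) ** (1.0 / 3.0)
    return diameter, aspect_ratio * diameter

for capacity in (25, 100, 500, 1000):   # kg/hr range considered in the study
    d, h = torrefier_dimensions(capacity)
    print(f"{capacity:5d} kg/hr -> D = {d:.2f} m, H = {h:.2f} m")
```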

  7. Object-Oriented Development Process for Department of Defense Information Systems.

    DTIC Science & Technology

    1995-07-01

    [Figure residue omitted. The legible fragments describe an incremental development model (total system specification; increments 1..n; definition and validation; development; production; operations) and an object taxonomy (Object split into Physical-object and Abstract-object; Animate-object and Inanimate-object), with a note assuming that mutual inheritance and looping are excluded.]

  8. Development of a Scalable Process Control System for Chemical Soil Washing to Remove Uranyl Oxide

    DTIC Science & Technology

    2015-05-01

    management under both the Clean Water Act (CWA) and the Resource Conservation and Recovery Act (RCRA). Likewise, material contaminated with explosive... volatile organic pollutants. This contamination can include greasing solvents such as trichloroethane and petroleum products from leaking underground storage... engineering and environmental challenges. ERDC develops innovative solutions in civil and military engineering, geospatial sciences, water resources, and

  9. High-Performance Digital Imaging System for Development and Characterization of Novel Materials and Processes

    DTIC Science & Technology

    2006-08-08

    taking into account the effects of polycrystalline microstructures, elastic anisotropy of the crystals, and material damages due to microplasticity and... anisotropic crystal elasticity, intragranular microplasticity and intergranular microfracture have been developed and implemented into the ABAQUS codes... Zhang, K. S., Wu, M. S., and Feng, R. (2005). Simulation of microplasticity-induced deformation in uniaxially strained ceramics by 3-D Voronoi

  10. Development of Neural Systems for Processing Social Exclusion from Childhood to Adolescence

    ERIC Educational Resources Information Center

    Bolling, Danielle Z.; Pitskel, Naomi B.; Deen, Ben; Crowley, Michael J.; Mayes, Linda C.; Pelphrey, Kevin A.

    2011-01-01

    Adolescence is a period of development in which peer relationships become especially important. A computer-based game (Cyberball) has been used to explore the effects of social exclusion in adolescents and adults. The current functional magnetic resonance imaging (fMRI) study used Cyberball to extend prior work to the cross-sectional study of…

  12. Volcanic Alert System (VAS) developed during the (2011-2013) El Hierro (Canary Islands) volcanic process

    NASA Astrophysics Data System (ADS)

    Ortiz, Ramon; Berrocoso, Manuel; Marrero, Jose Manuel; Fernandez-Ros, Alberto; Prates, Gonçalo; De la Cruz-Reyna, Servando; Garcia, Alicia

    2014-05-01

    In volcanic areas with long repose periods (as at El Hierro), recently installed monitoring networks offer no instrumental record of past eruptions and no experience in handling a volcanic crisis. Both conditions, uncertainty and inexperience, make the communication of hazard more difficult. In fact, in the initial phases of the unrest at El Hierro, the perception of volcanic risk was somewhat distorted, as even relatively low volcanic hazards caused a high political impact. The need for a Volcanic Alert System then became evident. In general, the Volcanic Alert System comprises the monitoring network, the software tools for the analysis of the observables, the management of the Volcanic Activity Level, and the assessment of the threat. The Volcanic Alert System presented here places special emphasis on phenomena associated with moderate eruptions, as well as on volcano-tectonic earthquakes and landslides, which in some cases, as at El Hierro, may be more destructive than an eruption itself. As part of the Volcanic Alert System, we introduce here the Volcanic Activity Level, which continuously applies a routine analysis of monitoring data (particularly seismic and deformation data) to detect data-trend changes or monitoring network failures. The data-trend changes are quantified according to the Failure Forecast Method (FFM). When data changes and/or malfunctions are detected by an automated watchdog, warnings are automatically issued to the Monitoring Scientific Team. Changes in the data patterns are then translated by the Monitoring Scientific Team into a simple Volcanic Activity Level that is easy to use and understand by the scientists and technicians in charge of the technical management of the unrest. The main features of the Volcanic Activity Level are its objectivity, as it does not depend on expert opinions, which are left to the Scientific Committee, and its capability for early detection of precursors. As a consequence of the El Hierro
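    The Failure Forecast Method step can be sketched in its simplest and most common form, in which the inverse precursor rate decreases linearly toward zero at the forecast time; the synthetic data below are illustrative, not El Hierro observations.

```python
# Sketch of the Failure Forecast Method (FFM) in its simplest form: fit a
# straight line to the inverse rate of an accelerating precursor and forecast
# the time at which it reaches zero. Synthetic data, not real observations.
import numpy as np

# Synthetic accelerating precursor: rate ~ 1 / (t_f - t), with t_f = 100 days.
t_f_true = 100.0
t = np.arange(60.0, 95.0, 1.0)                     # observation days
rng = np.random.default_rng(2)
rate = 1.0 / (t_f_true - t) * (1 + 0.05 * rng.normal(size=t.size))

inv_rate = 1.0 / rate
slope, intercept = np.polyfit(t, inv_rate, 1)      # linear fit of 1/rate vs time
t_forecast = -intercept / slope                    # zero crossing of the fit

print(f"forecast failure/eruption time: day {t_forecast:.1f} (true value: day {t_f_true:.0f})")
```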

  13. Silicon web process development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hill, F. E.; Skutch, M. E.; Driggers, J. M.; Hopkins, R. H.

    1980-01-01

    A barrier crucible design which consistently maintains melt stability over long periods of time was successfully tested and used in long growth runs. The pellet feeder for melt replenishment was operated continuously for growth runs of up to 17 hours. The liquid level sensor comprising a laser/sensor system was operated, performed well, and met the requirements for maintaining liquid level height during growth and melt replenishment. An automated feedback loop connecting the feed mechanism and the liquid level sensing system was designed, constructed, and operated successfully for 3.5 hours, demonstrating the feasibility of semi-automated dendritic web growth. The sensitivity of the cost of sheet to variations in capital equipment cost and dendrite recycling was calculated, and it was shown that these factors have relatively little impact on sheet cost. Dendrites from web which had gone all the way through the solar cell fabrication process, when melted and grown into web, produce crystals which show no degradation in cell efficiency. Material quality remains high, and cells made from web grown at the start, during, and at the end of a run from a replenished melt show comparable efficiencies.

  14. Industrial Process Surveillance System

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W; Singer, Ralph M.; Mott, Jack E.

    2001-01-30

    A system and method for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy.
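    A minimal sketch of the surveillance idea in the abstract, learning states of normal operation, generating expected values from the closest learned state, and alarming on deviation from normalcy; the nearest-neighbour model and thresholds below are simplifications, not the patented algorithms.

```python
# Minimal sketch of a learned-state surveillance model: store normal operating
# states, produce an expected value from the closest state, and alarm when the
# residual exceeds a learned scale. Thresholds are illustrative simplifications.
import numpy as np

class SurveillanceModel:
    def __init__(self, n_sigma=5.0):
        self.n_sigma = n_sigma

    def train(self, normal_obs):
        """Store learned states (here simply the normal observations themselves)."""
        self.states = np.asarray(normal_obs, dtype=float)   # shape (n_states, n_signals)
        # Residual scale learned from leave-one-out nearest-neighbour distances.
        d = np.linalg.norm(self.states[:, None, :] - self.states[None, :, :], axis=2)
        np.fill_diagonal(d, np.inf)
        self.residual_sigma = np.median(d.min(axis=1)) + 1e-12

    def check(self, obs):
        """Return (expected_value, alarm) for one new multivariate observation."""
        obs = np.asarray(obs, dtype=float)
        idx = np.argmin(np.linalg.norm(self.states - obs, axis=1))
        expected = self.states[idx]
        alarm = np.linalg.norm(obs - expected) > self.n_sigma * self.residual_sigma
        return expected, alarm

# Train on correlated "normal" operation of three sensors, then test.
rng = np.random.default_rng(3)
base = rng.uniform(0, 1, (200, 1))
normal = np.hstack([base, 2 * base, 3 * base]) + 0.01 * rng.normal(size=(200, 3))
model = SurveillanceModel()
model.train(normal)
print(model.check([0.5, 1.0, 1.5])[1])   # consistent sensor pattern -> no alarm expected
print(model.check([0.5, 1.0, 2.5])[1])   # broken correlation       -> alarm expected
```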

  15. Industrial process surveillance system

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W.; Singer, Ralph M.; Mott, Jack E.

    1998-01-01

    A system and method for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy.

  16. Industrial process surveillance system

    DOEpatents

    Gross, K.C.; Wegerich, S.W.; Singer, R.M.; Mott, J.E.

    1998-06-09

    A system and method are disclosed for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy. 96 figs.

  17. Development of a Cost Estimation Process for Human Systems Integration Practitioners During the Analysis of Alternatives

    DTIC Science & Technology

    2010-12-01

    Canadian Defense Technology Center (DTC) (2006); Brooks, Greenley, Dyck, Salwaycott, Scipione and Shaw (2008); and Liu (2009) provide guidance for... effective operations (Greenley & Associates, 2008). However, these costs were not used in a total cost-benefit analysis because of their multi... December 6, 2010, from http://www.hfidtc.com/ Brooks, J., Greenley, M., Salwaycott, A., & Scipione, A. (2008). The development and validation of a

  18. Customer information and the quality improvement process: developing a customer information system.

    PubMed

    Orme, C N; Parsons, R J; McBride, G Z

    1992-01-01

    As growing numbers of health care organizations institute quality improvement programs, the demand within these organizations for reliable information about customers increases. By establishing a customer information system (CIS)--a model for collecting, archiving, and accessing customer information--health care organizations can eliminate the duplication of research, ensure that customer information is properly collected and interpreted, and provide decision makers access to better, more reliable customer information. Customer-supplier relationships are defined, guidelines for determining information needs are provided, and ways to set up and manage a CIS are suggested.

  19. Development of Conceptual Design Support Tool Founded on Formalization of Conceptual Design Process for Regenerative Life Support Systems

    NASA Astrophysics Data System (ADS)

    Miyajima, Hiroyuki; Yuhara, Naohiro

    Regenerative Life Support Systems (RLSS), which maintain human lives by recycling substances essential for living, are comprised of humans, plants, and material circulation systems. The plants supply food to the humans and regenerate water and gases by photosynthesis, while the material circulation systems physicochemically recycle and circulate substances disposed of by humans and plants. RLSS has attracted attention as manned space activities have shifted from short trips to long-term stays at bases such as a space station, a lunar base, or a Mars base. The present typical space base is the International Space Station (ISS), a manned experimental base for prolonged stays, where the RLSS recycles only water and air. To accommodate prolonged and extended manned activity at future space bases, an RLSS that also implements food production and resource regeneration using plants is expected. The configuration of an RLSS should be designed to suit its own duty, which may give rise to design requirements for RLSS with unprecedented configurations. Accordingly, it is necessary to establish a conceptual design method for generalized RLSS. It is difficult, however, to systematize the design process by analyzing previous designs, because there are only a few ground-experimental facilities, namely CEEF (Closed Ecology Experiment Facilities) in Japan, BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) in the U.S., and BIOS3 in Russia. Thus, for the above reasons, a conceptual design method that does not rely on previous design examples is required for generalized RLSS. This study formalizes a conceptual design process and develops a conceptual design support tool for RLSS based on this design process.

  20. Standardization developments for large scale biobanks in smoking related diseases - a model system for blood sample processing and storage.

    PubMed

    Malm, Johan; Fehniger, Thomas E; Danmyr, Pia; Végvári, Ákos; Welinder, Charlotte; Lindberg, Henrik; Upton, Paul; Carter, Stephanie; Appelqvist, Roger; Sjödin, Karin; Wieslander, Elisabet; Dahlbäck, Magnus; Rezeli, Melinda; Erlinge, David; Marko-Varga, György

    2013-12-01

    Samples stored in biobanks give researchers and respiratory healthcare institutions access to datasets of analytes valuable for both diagnostic and research practices. The usefulness of these samples in clinical decision-making is highly dependent on their quality and integrity. New procedures that better preserve sample integrity and reduce degradation are being developed to meet the needs of both present and future biobanking. Here we present an automatic sample workflow scheme that is designed to handle high numbers of blood samples. Blood fractions are aliquoted, heat sealed using novel technology, and stored in 384-tube high-density sample arrays. The newly developed 384 biobank rack system is especially suited for preserving identical small aliquots. We provide data on robotic processing of clinical samples at -80°C, following initial processing, analysis, and shipping between laboratories throughout Europe. Subsequent to unpacking, re-sorting, and storage at these sites, the samples have been returned for analysis. Biomarker analysis of 13 common tests in the clinical chemistry unit of the hospital provides evidence of qualitative and stable logistics using the 384-sample tube system. This technology development allows rapid access to a given sample in the frozen archive while maintaining individual sample integrity with sample tube confinement and quality management.

  1. Touchscreen questionnaire patient data collection in rheumatology practice: development of a highly successful system using process redesign.

    PubMed

    Newman, Eric D; Lerch, Virginia; Jones, J B; Stewart, Walter

    2012-04-01

    While questionnaires have been developed to capture patient-reported outcomes (PROs) in rheumatology practice, these instruments are not widely used. We developed a touchscreen interface designed to provide reliable and efficient data collection. Using the touchscreen to obtain PROs, we compared 2 different workflow models implemented separately in 2 rheumatology clinics. The Plan-Do-Study-Act methodology was used in 2 cycles of workflow redesign. Cycle 1 relied on off-the-shelf questionnaire builder software, and cycle 2 relied on a custom programmed software solution. During cycle 1, clinic 1 (private practice model, resource replete, simple flow) demonstrated a high completion rate at the start, averaging between 74% and 92% for the first 12 weeks. Clinic 2 (academic model, resource deficient, complex flow) did not achieve a consistent completion rate above 60%. The revised cycle 2 implementation protocol incorporated a 15-minute "nurse visit," an instant messaging system, and a streamlined authentication process, all of which contributed to substantial improvement in touchscreen questionnaire completion rates of ∼80% that were sustained without the need for any additional clinic staff support. Process redesign techniques and touchscreen technology were used to develop a highly successful, efficient, and effective process for the routine collection of PROs in a busy, complex, and resource-depleted academic practice and in typical private practice. The successful implementation required both a touchscreen questionnaire, human behavioral redesign, and other technical solutions. Copyright © 2012 by the American College of Rheumatology.

  2. Development of neural systems for processing social exclusion from childhood to adolescence.

    PubMed

    Bolling, Danielle Z; Pitskel, Naomi B; Deen, Ben; Crowley, Michael J; Mayes, Linda C; Pelphrey, Kevin A

    2011-11-01

    Adolescence is a period of development in which peer relationships become especially important. A computer-based game (Cyberball) has been used to explore the effects of social exclusion in adolescents and adults. The current functional magnetic resonance imaging (fMRI) study used Cyberball to extend prior work to the cross-sectional study of younger children and adolescents (7 to 17 years), identifying age-related changes in the neural correlates of social exclusion across the important transition from middle childhood into adolescence. Additionally, a control task illustrated the specificity of these age-related changes for social exclusion as distinct from expectancy violation more generally. During exclusion, activation in and functional connectivity between ventrolateral prefrontal cortex and ventral anterior cingulate cortex increased with age. These effects were specific to social exclusion and did not exist for expectancy violation. Our results illustrate developmental changes from middle childhood through adolescence in both affective and regulatory brain regions during social exclusion.

  3. Process evaluation distributed system

    NASA Technical Reports Server (NTRS)

    Moffatt, Christopher L. (Inventor)

    2006-01-01

    The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information. The process evaluation module utilizes a personal digital assistant (PDA). The data display module, in communication with the database server, includes a website for viewing collected process data in a desired metrics form and provides for editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module minimizes the requirement for manual input of the collected process data.
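
    The patent describes this architecture at the level of modules and data flows rather than code. Purely as an illustrative sketch of that flow, the Python below models how an administration module might register observation criteria, a process evaluation module (for example, running on a PDA) might push collected observations, and a display module might pull an aggregated report; all class and function names here are hypothetical and are not taken from the patent.

    ```python
    # Minimal illustrative sketch (not from the patent): an administration module
    # registers observation criteria, a process evaluation module stores collected
    # observations, and a display module pulls an aggregated report.
    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class ObservationCriterion:      # hypothetical structure
        criterion_id: str
        description: str

    @dataclass
    class Observation:               # hypothetical structure
        criterion_id: str
        value: float

    class DatabaseServer:            # stands in for the central database server
        def __init__(self) -> None:
            self.criteria: Dict[str, ObservationCriterion] = {}
            self.observations: List[Observation] = []

        def add_criterion(self, c: ObservationCriterion) -> None:   # administration module
            self.criteria[c.criterion_id] = c

        def store(self, collected: List[Observation]) -> None:      # process evaluation module (PDA)
            self.observations.extend(collected)

        def report(self) -> Dict[str, float]:                       # data display module
            grouped: Dict[str, List[float]] = {}
            for o in self.observations:
                grouped.setdefault(o.criterion_id, []).append(o.value)
            return {cid: sum(v) / len(v) for cid, v in grouped.items()}

    db = DatabaseServer()
    db.add_criterion(ObservationCriterion("task_time", "Time to complete task (min)"))
    db.store([Observation("task_time", 12.5), Observation("task_time", 10.0)])
    print(db.report())   # {'task_time': 11.25}
    ```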

  4. Development & Optimization of Materials and Processes for a Cost Effective Photoelectrochemical Hydrogen Production System. Final report

    SciTech Connect

    McFarland, Eric W

    2011-01-17

    The overall project objective was to apply high throughput experimentation and combinatorial methods together with novel syntheses to discover and optimize efficient, practical, and economically sustainable materials for photoelectrochemical production of bulk hydrogen from water. Automated electrochemical synthesis and photoelectrochemical screening systems were designed and constructed and used to study a variety of new photoelectrocatalytic materials. We evaluated photocatalytic performance in the dark and under illumination with or without applied bias in a high-throughput manner and did detailed evaluation on many materials. Significant attention was given to α-Fe2O3 based semiconductor materials, and thin films with different dopants were synthesized by co-electrodeposition techniques. Approximately 30 dopants including Al, Zn, Cu, Ni, Co, Cr, Mo, Ti, Pt, etc. were investigated. Hematite thin films doped with Al, Ti, Pt, Cr, and Mo exhibited significant improvements in efficiency for photoelectrochemical water splitting compared with undoped hematite. In several cases we collaborated with theorists who used density functional theory to help explain performance trends and suggest new materials. The best materials were investigated in detail by X-ray diffraction (XRD), scanning electron microscopy (SEM), ultraviolet-visible spectroscopy (UV-Vis), and X-ray photoelectron spectroscopy (XPS). The photoelectrocatalytic performance of the thin films was evaluated and their incident photon

  5. Protein Crystallization in Agarose Gel with High Strength: Developing an Automated System for Protein Crystallographic Processes

    NASA Astrophysics Data System (ADS)

    Sugiyama, Shigeru; Tanabe, Kana; Hirose, Mika; Kitatani, Tomoya; Hasenaka, Hitoshi; Takahashi, Yoshinori; Adachi, Hiroaki; Takano, Kazufumi; Murakami, Satoshi; Mori, Yusuke; Inoue, Tsuyoshi; Matsumura, Hiroyoshi

    2009-07-01

    Agarose gel media reduce convection and prevent crystal sedimentation, resulting in the production of high-quality protein crystals. However, crystallographers have only tested agarose gel at concentrations between 0.0 and 0.6% (w/v), where it exhibits low gel strength. The effect of agarose gel on protein structures remains to be elucidated, because only a few structural studies have been performed using gel-grown protein crystals. Here, we crystallize thaumatin and elastase using a variety of crystallization methods in 2.0% (w/v) agarose gels, which are completely gellified and have sufficiently high-strength. This new crystallization approach using semi-solid agarose gels is compatible with several conventional crystallization techniques. A comparison of structures crystallized in non-gelled solution and those crystallized in 2.0% (w/v) agarose gels indicates that the crystal structures were not affected by the high-concentration agarose gels. This technique offers the practical advantages of efficient protection by the semi-solid gel media surrounding the protein crystals, allowing them to be handled and transported without affecting any later crystallographic analysis, and thereby providing an automated system for crystal capturing and mounting.

  6. Research and Development in the Computer and Information Sciences. Volume 2, Processing, Storage, and Output Requirements in Information Processing Systems: A Selective Literature Review.

    ERIC Educational Resources Information Center

    Stevens, Mary Elizabeth

    Areas of concern with respect to processing, storage, and output requirements of a generalized information processing system are considered. Special emphasis is placed on multiple-access systems. Problems of system management and control are discussed, including hierarchies of storage levels. Facsimile, digital, and mass random access storage…

  7. Software Engineering Processes Used to Develop the NIF Integrated Computer Control System

    SciTech Connect

    Ludwigsen, A P; Carey, R W; Demaret, R D; Lagin, L J; Reddi, U P; Van Arsdall, P J

    2007-10-03

    We have developed a new target platform to study Laser Plasma Interaction in ignition-relevant condition at the Omega laser facility (LLE/Rochester)[1]. By shooting an interaction beam along the axis of a gas-filled hohlraum heated by up to 17 kJ of heater beam energy, we were able to create a millimeter-scale underdense uniform plasma at electron temperatures above 3 keV. Extensive Thomson scattering measurements allowed us to benchmark our hydrodynamic simulations performed with HYDRA [1]. As a result of this effort, we can use with much confidence these simulations as input parameters for our LPI simulation code pF3d [2]. In this paper, we show that by using accurate hydrodynamic profiles and full three-dimensional simulations including a realistic modeling of the laser intensity pattern generated by various smoothing options, fluid LPI theory reproduces the SBS thresholds and absolute reflectivity values and the absence of measurable SRS. This good agreement was made possible by the recent increase in computing power routinely available for such simulations.

  8. Fabrication process development of SiC/superalloy composite sheet for exhaust system components

    NASA Technical Reports Server (NTRS)

    Cornie, J. A.; Cook, C. S.; Anderson, C. A.

    1976-01-01

    A chemical compatibility study was conducted between SiC filament and the following P/M matrix alloys: Waspaloy, Hastelloy-X, NiCrAlY, Ha-188, S-57, FeCrAlY, and Incoloy 800. None of the couples demonstrated sufficient chemical compatibility to withstand the minimum HIP consolidation temperatures (996 C) or intended application temperature of the composite (982 C). However, Waspaloy, Haynes 188, and Hastelloy-X were the least reactive with SiC of the candidate alloys. Chemical vapor deposited tungsten was shown to be an effective diffusion barrier between the superalloy matrix and SiC filament providing a defect-free coating of sufficient thickness. However, the coating breaks down when the tungsten is converted into intermetallic compounds by interdiffusion with matrix constituents. Waspaloy was demonstrated to be the most effective matrix alloy candidate in contact with the CVD tungsten barrier because of its relatively low growth rate constant of the intermediate compound and the lack of formation of Kirkendall voids at the matrix-barrier interface. Fabrication methods were developed for producing panels of uniaxial and angle ply composites utilizing CVD tungsten coated filament.

  9. Laser material processing system

    DOEpatents

    Dantus, Marcos

    2015-04-28

    A laser material processing system and method are provided. A further aspect of the present invention employs a laser for micromachining. In another aspect of the present invention, the system uses a hollow waveguide. In another aspect of the present invention, a laser beam pulse is given broad bandwidth for workpiece modification.

  10. Problem Solving for Volatilizing Situation in Nursing: Developing Thinking Process Supporting System using NursingNAVI® Contents.

    PubMed

    Tsuru, Satoko; Wako, Fumiko; Omori, Miho; Sudo, Kumiko

    2015-01-01

    We have identified three foci each for nursing observation and nursing action. Using these frameworks, we have developed the structured knowledge model for a number of diseases and medical interventions. We developed this structure-based NursingNAVI® content in collaboration with several quality-centered hospitals. The authors analysed the nursing care documentation of post-gastrectomy patients in light of the standardized nursing care plan in the "NursingNAVI®" developed by ourselves and revealed the "failure to observe" and "failure to document", which led to the volatility of the patients' data, conditions and situations. This phenomenon should have been avoided if nurses had employed a standardized nursing care plan. We therefore developed a thinking process support system for planning, delivering, recording and evaluating daily nursing care using NursingNAVI® contents. A hospital decided to use the NursingNAVI® contents in its hospital information system (HIS). The results suggest that the system is useful for nursing OJT and reduces the time needed for planning and recording without volatilizing the situation.

  11. Development of Fast Measurement System of Neutron Emission Profile Using a Digital Signal Processing Technique in JT-60U

    SciTech Connect

    Ishikawa, M.; Shinohara, K.; Itoga, T.; Okuji, T.; Nakhostin, M.; Baba, M.; Nishitani, T.

    2008-03-12

    Neutron emission profiles are routinely measured in the JT-60U Tokamak. Stilbene neutron detectors (SNDs), which combine a Stilbene organic crystal scintillation detector (Stilbene detector) with an analog neutron-gamma pulse shape discrimination (PSD) circuit, have been used to measure neutron flux efficiently. Although the SND has many advantages as a neutron detector, the maximum count rate is limited to ∼1 × 10^5 counts/s due to the dead time of the analog PSD circuit. To overcome this issue, a digital signal processing (DSP) system using a Flash-ADC has been developed. In this system, anode signals from the photomultiplier of the Stilbene detector are fed to the Flash-ADC and digitized. The PSD between neutrons and gamma-rays is then performed in software. The photomultiplier tube is also modified to suppress and correct for gain fluctuations. The DSP system has been installed in the center channel of the vertical neutron collimator system in JT-60U and applied to measurements of neutron flux in JT-60U experiments. Neutron flux is successfully measured at count rates up to ∼1 × 10^6 counts/s without pile-up of detected pulses. The performance of the DSP system as a neutron detector is demonstrated.
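
    The abstract does not spell out the discrimination algorithm that the software applies to the digitized anode pulses. A common software approach for organic scintillators such as stilbene is charge-comparison (tail-to-total) pulse shape discrimination, and the Python sketch below illustrates that general technique only; the window lengths and threshold are illustrative assumptions, not parameters from the JT-60U system.

    ```python
    import numpy as np

    def tail_to_total(pulse, baseline_samples=20, tail_start=15, window=120):
        """Charge-comparison PSD ratio for one digitized anode pulse (1-D array).

        A larger tail fraction indicates the slower scintillation decay that, in
        stilbene, is characteristic of neutron (proton-recoil) events.
        """
        p = np.asarray(pulse, dtype=float)
        p = p - p[:baseline_samples].mean()          # baseline subtraction
        if abs(p.min()) > abs(p.max()):              # make the pulse positive-going
            p = -p
        peak = int(np.argmax(p))
        total = p[max(peak - 5, 0): peak + window].sum()
        tail = p[peak + tail_start: peak + window].sum()
        return tail / total if total > 0 else 0.0

    def classify(pulses, threshold=0.18):
        """Label each pulse 'n' (neutron) or 'g' (gamma); the threshold is illustrative."""
        return ['n' if tail_to_total(p) > threshold else 'g' for p in pulses]
    ```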

  12. Development of an image processing support system based on fluorescent dye to prevent elderly people with dementia from wandering.

    PubMed

    Nishigaki, Yutaka; Tanaka, Kentaro; Kim, Juhyon; Nakajima, Kazuki

    2013-01-01

    The wandering of elderly people with dementia is a significant behavioral problem and is a heavy burden on caregivers in residential and nursing homes. Thus, warning systems have been developed to prevent elderly people with dementia from leaving the premises. Some of these systems use radio waves. However, systems based on radio waves present several practical problems. For instance, the transmitter must be carried and may become lost; in addition, the battery of the transmitter must be changed. To solve these problems, we developed a support system that prevents elderly people with dementia from wandering. The system employs image processing technology based on fluorescent dye. The composition of the support system can be described as follows: fluorescent dye is painted in a simple shape on the clothes of an elderly person. The fluorescent color becomes visible by irradiation with a long wavelength of ultraviolet light. In the present paper, the relationship between the color of the dye and the cloth was investigated. A 3D video camera was used to acquire a 3D image and detect the simple shape. As a preliminary experiment, 3 colors (red, green and blue) of fluorescent dye were applied to cloths of 9 different colors. All fluorescent colors were detected on 6 of the cloths, but red and blue dye could not be detected on the other 3 cloths. In contrast, green dye was detectable on all 9 of the cloths. Additionally, we determined whether green dye could be detected in an actual environment. A rectangular shaped patch of green fluorescent dye was painted on the shoulder area of a subject, from the scapula to the clavicle. As a result, the green dye was detected on all 9 different colored cloths.
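
    The abstract describes detecting a simple painted shape of green fluorescent dye under long-wavelength ultraviolet illumination using a 3D video camera. As a rough, hypothetical illustration of the color-segmentation step only (not the authors' actual 3D algorithm), the OpenCV sketch below thresholds a 2D frame in HSV space for a green patch; the hue/saturation limits and minimum area are placeholder values that would need calibration for the dye, lamp, and camera.

    ```python
    import cv2            # assumes OpenCV 4.x
    import numpy as np

    def detect_green_patch(frame_bgr, min_area=500):
        """Return bounding boxes (x, y, w, h) of green fluorescent regions in a BGR frame.

        The HSV limits and min_area below are placeholders; a real system would
        calibrate them for the specific dye, UV lamp, and camera in use.
        """
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array([40, 80, 80]), np.array([85, 255, 255]))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
    ```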

  13. Low cost solar array project: Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1981-01-01

    This phase consists of the engineering design, fabrication, assembly, operation, economic analysis, and process support R&D for an Experimental Process System Development Unit (EPSDU). The mechanical bid package was issued and the bid responses are under evaluation. Similarly, the electrical bid package was issued, however, responses are not yet due. The majority of all equipment is on order or has been received at the EPSDU site. The pyrolysis/consolidation process design package was issued. Preparation of process and instrumentation diagram for the free-space reactor was started. In the area of melting/consolidation, Kayex successfully melted chunk silicon and have produced silicon shot. The free-space reactor powder was successfully transported pneumatically from a storage bin to the auger feeder twenty-five feet up and was melted. The fluid-bed PDU has successfully operated at silane feed concentrations up to 21%. The writing of the operating manual has started. Overall, the design phase is nearing completion.

  14. Development of Production PVD-AIN Buffer Layer System and Processes to Reduce Epitaxy Costs and Increase LED Efficiency

    SciTech Connect

    Cerio, Frank

    2013-09-14

    was analyzed and improvements implemented to the Veeco PVD-AlN prototype system to establish a specification and baseline PVD-AlN films on sapphire and in parallel the evaluation of PVD AlN on silicon substrates began. In Phase II of the project a Beta tool based on a scaled-up process module capable of depositing uniform films on batches of 4” or 6” diameter substrates in a production worthy operation was developed and qualified. In Phase III, the means to increase the throughput of the PVD-AlN system was evaluated and focused primarily on minimizing the impact of the substrate heating and cooling times that dominated the overall cycle time.

  15. Transparent materials processing system

    NASA Technical Reports Server (NTRS)

    Hetherington, J. S.

    1977-01-01

    A zero gravity processing furnace system was designed that will allow acquisition of photographic or other visual information while the sample is being processed. A low temperature (30 to 400 C) test model with a flat specimen heated by quartz-halide lamps was constructed. A high temperature (400 to 1000 C) test model heated by resistance heaters, utilizing a cylindrical specimen and optics, was also built. Each of the test models is discussed in detail. Recommendations are given.

  16. Business Development Process

    DTIC Science & Technology

    2001-10-31

    entity’s real estate situation and condition for use by customers including (but not limited to) the business entity. Information is processed to ... the score to provide a well-rounded picture of a particular real estate situation. Stratmann discloses a method for assisting an individual in ... identify a potential flaw in the opportunity analysis. These criteria include whether the process is dealing with a real customer, if it is

  17. Development of an alternating magnetic-field-assisted finishing process for microelectromechanical systems micropore x-ray optics.

    PubMed

    Riveros, Raul E; Yamaguchi, Hitomi; Mitsuishi, Ikuyuki; Takagi, Utako; Ezoe, Yuichiro; Kato, Fumiki; Sugiyama, Susumu; Yamasaki, Noriko; Mitsuda, Kazuhisa

    2010-06-20

    X-ray astronomy research is often limited by the size, weight, complexity, and cost of functioning x-ray optics. Micropore optics promises an economical alternative to traditional (e.g., glass or foil) x-ray optics; however, many manufacturing difficulties prevent micropore optics from being a viable solution. Ezoe et al. introduced microelectromechanical systems (MEMS) micropore optics having curvilinear micropores in 2008. Made by either deep reactive ion etching or x-ray lithography, electroforming, and molding (LIGA), MEMS micropore optics suffer from high micropore sidewall roughness (10-30 nm rms) which, by current standards, cannot be improved. In this research, a new alternating magnetic-field-assisted finishing process was developed using a mixture of ferrofluid and microscale abrasive slurry. A machine was built, and a set of working process parameters including alternating frequency, abrasive size, and polishing time was selected. A polishing experiment on a LIGA-fabricated MEMS micropore optic was performed, and a change in micropore sidewall roughness of 9.3 ± 2.5 nm rms to 5.7 ± 0.7 nm rms was measured. An improvement in x-ray reflectance was also seen. This research shows the feasibility and confirms the effects of this new polishing process on MEMS micropore optics.

  18. Development of an alternating magnetic-field-assisted finishing process for microelectromechanical systems micropore x-ray optics

    SciTech Connect

    Riveros, Raul E.; Yamaguchi, Hitomi; Mitsuishi, Ikuyuki; Takagi, Utako; Ezoe, Yuichiro; Kato, Fumiki; Sugiyama, Susumu; Yamasaki, Noriko; Mitsuda, Kazuhisa

    2010-06-20

    X-ray astronomy research is often limited by the size, weight, complexity, and cost of functioning x-ray optics. Micropore optics promises an economical alternative to traditional (e.g., glass or foil) x-ray optics; however, many manufacturing difficulties prevent micropore optics from being a viable solution. Ezoe et al. introduced microelectromechanical systems (MEMS) micropore optics having curvilinear micropores in 2008. Made by either deep reactive ion etching or x-ray lithography, electroforming, and molding (LIGA), MEMS micropore optics suffer from high micropore sidewall roughness (10-30 nm rms) which, by current standards, cannot be improved. In this research, a new alternating magnetic-field-assisted finishing process was developed using a mixture of ferrofluid and microscale abrasive slurry. A machine was built, and a set of working process parameters including alternating frequency, abrasive size, and polishing time was selected. A polishing experiment on a LIGA-fabricated MEMS micropore optic was performed, and a change in micropore sidewall roughness of 9.3 ± 2.5 nm rms to 5.7 ± 0.7 nm rms was measured. An improvement in x-ray reflectance was also seen. This research shows the feasibility and confirms the effects of this new polishing process on MEMS micropore optics.

  19. Development of a flash flood warning system based on real-time radar data and process-based erosion modelling

    NASA Astrophysics Data System (ADS)

    Schindewolf, Marcus; Kaiser, Andreas; Buchholtz, Arno; Schmidt, Jürgen

    2017-04-01

    Extreme rainfall events and the resulting flash floods led to massive devastation in Germany during spring 2016. The study presented aims at the development of an early warning system which allows the simulation and assessment of negative effects on infrastructure from radar-based heavy rainfall predictions, serving as input data for the process-based soil loss and deposition model EROSION 3D. Our approach enables a detailed identification of runoff and sediment fluxes in agriculturally used landscapes. In a first step, documented historical events were analyzed concerning the accordance between measured radar rainfall and large-scale erosion risk maps. A second step focused on small-scale erosion monitoring via UAV of the source areas of heavy flooding events and a model reconstruction of the processes involved. In all examples, damage was caused to local infrastructure. Both analyses are promising for detecting runoff- and sediment-delivering areas even at high temporal and spatial resolution. Results prove the important role of late-covering crops such as maize, sugar beet or potatoes in runoff generation. While, e.g., winter wheat positively affects extensive runoff generation on undulating landscapes, massive soil loss and thus muddy flows are observed and depicted in the model results. Future research aims at large-scale model parameterization and application in real time, uncertainty estimation of the precipitation forecast, and interface development.

  20. Automated system for the on-line monitoring of powder blending processes using near-infrared spectroscopy. Part I. System development and control.

    PubMed

    Hailey, P A; Doherty, P; Tapsell, P; Oliver, T; Aldridge, P K

    1996-03-01

    An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity.
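
    The abstract mentions a standard-deviation routine applied to the NIR spectra in real time to judge blend homogeneity. One widely used variant of this idea is a moving-block standard deviation of consecutive spectra; the Python sketch below illustrates that general approach only, and the block size and cutoff are arbitrary assumptions rather than values from the paper.

    ```python
    import numpy as np

    def moving_block_std(spectra, block=5):
        """Mean standard deviation across wavelengths for each block of consecutive spectra.

        spectra: 2-D array, shape (n_scans, n_wavelengths), one NIR spectrum per scan.
        Returns one value per block; low, stable values suggest the blend is homogeneous.
        """
        spectra = np.asarray(spectra, dtype=float)
        n_blocks = spectra.shape[0] - block + 1
        return np.array([spectra[i:i + block].std(axis=0).mean() for i in range(n_blocks)])

    def is_homogeneous(block_stds, cutoff=1e-3, consecutive=3):
        """Declare homogeneity once `consecutive` blocks fall below an arbitrary cutoff."""
        run = 0
        for value in block_stds:
            run = run + 1 if value < cutoff else 0
            if run >= consecutive:
                return True
        return False
    ```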

  1. Development of a web-based support system for both homogeneous and heterogeneous air quality control networks: process and product.

    PubMed

    Andrade, J; Ares, J; García, R; Presa, J; Rodríguez, S; Piñeiro-Iglesias, M; López-Mahía, P; Muniategui, S; Prada, D

    2007-10-01

    The Environmental Laboratories Automation Software System or PALMA (Spanish abbreviation) was developed by a multidisciplinary team in order to support the main tasks of heterogeneous air quality control networks. The software process for PALMA development, which can be applied equally well to similar multidisciplinary projects, was (a) well-defined, (b) arranged between environmental technicians and IT specialists, (c) based on quality guides, and (d) clearly user-centred. Moreover, it introduces some interesting advantages with regard to the classical step-by-step approaches. PALMA is a web-based system that allows 'off-line' and automated telematic data acquisition from distributed immission stations belonging not only to homogeneous but also to heterogeneous air quality control networks. It provides graphic and tabular representations for a comprehensive and centralised analysis of acquired data, and considers the daily work that is associated with such networks: validation of the acquired data, alerts with regard to (periodical) tasks (e.g., analysers verification), downloading of files with environmental information (e.g., dust forecasts), etc. The implementation of PALMA has provided qualitative and quantitative improvements in the work performed by the people in charge of the control network considered.

  2. Silicon web process development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Skutch, M. E.; Driggers, J. M.; Hopkins, R. H.

    1981-01-01

    The silicon web process takes advantage of natural crystallographic stabilizing forces to grow long, thin single crystal ribbons directly from liquid silicon. The ribbon, or web, is formed by the solidification of a liquid film supported by surface tension between two silicon filaments, called dendrites, which border the edges of the growing strip. The ribbon can be propagated indefinitely by replenishing the liquid silicon as it is transformed to crystal. The dendritic web process has several advantages for achieving low cost, high efficiency solar cells. These advantages are discussed.

  3. Quartz resonator processing system

    DOEpatents

    Peters, Roswell D. M.

    1983-01-01

    Disclosed is a single chamber ultra-high vacuum processing system for the production of hermetically sealed quartz resonators wherein electrode metallization and sealing are carried out along with cleaning and bake-out without any air exposure between the processing steps. The system includes a common vacuum chamber in which is located a rotatable wheel-like member which is adapted to move a plurality of individual component sets of a flat pack resonator unit past discretely located processing stations in said chamber, whereupon electrode deposition takes place, followed by the placement of ceramic covers over a frame containing a resonator element, and then to a sealing stage where a pair of hydraulic rams including heating elements effect a metallized bonding of the covers to the frame.

  4. Low cost solar array project. Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Technical activities are reported in the design of process, facilities, and equipment for producing silicon at a rate and price commensurate with production goals for low cost solar cell modules. The silane-to-silicon process has the potential to provide high purity polysilicon on a commercial scale at a price of fourteen dollars per kilogram by 1986 (in 1980 dollars). Commercial process, economic analysis, process support research and development, and quality control are discussed.

  5. Log processing systems

    SciTech Connect

    Bowlin, W.P.; Kneer, M.P.; Ballance, J.D.

    1989-11-07

    This patent describes an improvement in a computer controlled processing system for lumber production. It comprises a computer and a sequence of processing stations for processing a log segment, including: an excess material removing station for generating opposed flat side surfaces on the log segment, the flat side surfaces determined by the computer to become sides of boards to be severed from the log segment; a profiling station for forming profiled edges above and below the flat side surfaces to become the side edges of the boards to be severed from the log segment; and a severing station for severing the boards from the log segment. A conveyance means establishes a path of conveyance and maintains continuous control of the log segment while conveying it along the path and through the above sequence of processing stations.

  6. Plutonium process monitoring (PPM) system

    NASA Astrophysics Data System (ADS)

    Wong, A. S.; Ricketts, T. E.; Pansoy-Hejlvik, M. E.; Ramsey, K. B.; Hansel, K. M.; Romero, M. K.

    2000-07-01

    In mid-1980, Marsh and Pope developed an online gamma system to monitor americium, uranium and plutonium gamma rays during the anion-exchange process for plutonium aqueous recovery operations. It has been shown that real-time elution profiles of actinide impurities are important with respect to plutonium loss via breakthrough, waste minimization, and process monitoring. However, the current monitoring equipment and data acquisition software are obsolete and frequently problematic. In 1999, we redesigned the on-line gamma monitoring system in collaboration with Perkin-Elmer ORTEC (Oak Ridge, TN) to enhance and upgrade the current system. This paper describes the new integrated plutonium process monitoring (PPM) system for the aqueous plutonium recovery and anion-exchange processes at the Los Alamos Plutonium Facility.

  7. Intelligent Work Process Engineering System

    NASA Technical Reports Server (NTRS)

    Williams, Kent E.

    2003-01-01

    Optimizing performance on work activities and processes requires metrics of performance for management to monitor and analyze in order to support further improvements in efficiency, effectiveness, safety, reliability and cost. Information systems are therefore required to assist management in making timely, informed decisions regarding these work processes and activities. Currently, information systems regarding Space Shuttle maintenance and servicing do not exist to support such timely decisions. The work to be presented details a system which incorporates various automated and intelligent processes and analysis tools to capture, organize, and analyze work-process-related data in order to make the necessary decisions to meet KSC organizational goals. The advantages and disadvantages of design alternatives for the development of such a system will be discussed, including technologies which would need to be designed, prototyped, and evaluated.

  8. Silicon web process development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Blais, P. D.; Davis, J. R., Jr.

    1977-01-01

    Thirty-five (35) furnace runs were carried out during this quarter, of which 25 produced a total of 120 web crystals. The two main thermal models for the dendritic growth process were completed and are being used to assist the design of the thermal geometry of the web growth apparatus. The first model, a finite element representation of the susceptor and crucible, was refined to give greater precision and resolution in the critical central region of the melt. The second thermal model, which describes the dissipation of the latent heat to generate thickness-velocity data, was completed. Dendritic web samples were fabricated into solar cells using a standard configuration and a standard process for a N(+) -P-P(+) configuration. The detailed engineering design was completed for a new dendritic web growth facility of greater width capability than previous facilities.

  9. Application of high-throughput mini-bioreactor system for systematic scale-down modeling, process characterization, and control strategy development.

    PubMed

    Janakiraman, Vijay; Kwiatkowski, Chris; Kshirsagar, Rashmi; Ryll, Thomas; Huang, Yao-Ming

    2015-01-01

    High-throughput systems and processes have typically been targeted for process development and optimization in the bioprocessing industry. For process characterization, bench scale bioreactors have been the system of choice. Due to the need for performing different process conditions for multiple process parameters, the process characterization studies typically span several months and are considered time and resource intensive. In this study, we have shown the application of a high-throughput mini-bioreactor system viz. the Advanced Microscale Bioreactor (ambr15(TM) ), to perform process characterization in less than a month and develop an input control strategy. As a pre-requisite to process characterization, a scale-down model was first developed in the ambr system (15 mL) using statistical multivariate analysis techniques that showed comparability with both manufacturing scale (15,000 L) and bench scale (5 L). Volumetric sparge rates were matched between ambr and manufacturing scale, and the ambr process matched the pCO2 profiles as well as several other process and product quality parameters. The scale-down model was used to perform the process characterization DoE study and product quality results were generated. Upon comparison with DoE data from the bench scale bioreactors, similar effects of process parameters on process yield and product quality were identified between the two systems. We used the ambr data for setting action limits for the critical controlled parameters (CCPs), which were comparable to those from bench scale bioreactor data. In other words, the current work shows that the ambr15(TM) system is capable of replacing the bench scale bioreactor system for routine process development and process characterization.

  10. Beam Instrument Development System

    SciTech Connect

    DOOLITTLE, LAWRENCE; HUANG, GANG; DU, QIANG; SERRANO, CARLOS

    2016-01-08

    Beam Instrumentation Development System (BIDS) is a collection of common support libraries and modules developed during a series of Low-Level Radio Frequency (LLRF) control and timing/synchronization projects. BIDS includes a collection of Hardware Description Language (HDL) libraries and software libraries. The BIDS can be used for the development of any FPGA-based system, such as LLRF controllers. HDL code in this library is generic and supports common Digital Signal Processing (DSP) functions, FPGA-specific drivers (high-speed serial link wrappers, clock generation, etc.), ADC/DAC drivers, Ethernet MAC implementation, etc.

  11. The development of a coal-fired combustion system for industrial process heating applications. Quarterly technical progress report, January 1992--March 1992

    SciTech Connect

    Not Available

    1992-07-16

    PETC has implemented a number of advanced combustion research projects that will lead to the establishment of a broad, commercially acceptable engineering data base for the advancement of coal as the fuel of choice for boilers, furnaces, and process heaters. Vortec Corporation's Coal-Fired Combustion System for Industrial Process Heating Applications has been selected for Phase III development under contract DE-AC22-91PC91161. This advanced combustion system research program is for the development of innovative coal-fired process heaters which can be used for high temperature melting, smelting, recycling, and refining processes. The process heater concepts to be developed are based on advanced glass melting and ore smelting furnaces developed and patented by Vortec Corporation. The process heater systems to be developed have multiple-use applications; however, the Phase III research effort is being focused on the development of a process heater system to be used for producing glass frits and wool fiber from boiler and incinerator ashes. The primary objective of the Phase III project is to develop and integrate all the system components, from fuel through total system controls, and then test the complete system in order to evaluate its potential marketability. The economic evaluation of commercial scale CMS processes has begun. In order to accurately estimate the cost of the primary process vessels, preliminary designs for 25, 50, and 100 ton/day systems have been started under Task 1. These data will serve as input for the life cycle cost analysis performed as part of the techno-economic evaluations. The economic evaluations of commercial CMS systems will be an integral part of the commercialization plan.

  12. SIRU development. Volume 1: System development

    NASA Technical Reports Server (NTRS)

    Gilmore, J. P.; Cooper, R. J.

    1973-01-01

    A complete description of the development and initial evaluation of the Strapdown Inertial Reference Unit (SIRU) system is reported. System development documents the system mechanization with the analytic formulation for fault detection and isolation processing structure; the hardware redundancy design and the individual modularity features; the computational structure and facilities; and the initial subsystem evaluation results.

  13. Technology development life cycle processes.

    SciTech Connect

    Beck, David Franklin

    2013-05-01

    This report and set of appendices are a collection of memoranda originally drafted in 2009 for the purpose of providing motivation and the necessary background material to support the definition and integration of engineering and management processes related to technology development. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. As presented herein, the material begins with a survey of open literature perspectives on technology development life cycles, including published data on "what went wrong." The main thrust of the material presents a rational exposé of a structured technology development life cycle that uses the scientific method as a framework, with further rigor added from adapting relevant portions of the systems engineering process. The material concludes with a discussion on the use of multiple measures to assess technology maturity, including consideration of the viewpoint of potential users.

  14. Advanced information processing system

    NASA Technical Reports Server (NTRS)

    Lala, J. H.

    1984-01-01

    Design and performance details of the advanced information processing system (AIPS) for fault and damage tolerant data processing on aircraft and spacecraft are presented. AIPS comprises several computers distributed throughout the vehicle and linked by a damage tolerant data bus. Most I/O functions are available to all the computers, which run in a TDMA mode. Each computer performs separate specific tasks in normal operation and assumes other tasks in degraded modes. Redundant software assures that all fault monitoring, logging and reporting are automated, together with control functions. Redundant duplex links and damage-spread limitation provide the fault tolerance. Details of an advanced design of a laboratory-scale proof-of-concept system are described, including functional operations.

  15. Annotation methods to develop and evaluate an expert system based on natural language processing in electronic medical records.

    PubMed

    Gicquel, Quentin; Tvardik, Nastassia; Bouvry, Côme; Kergourlay, Ivan; Bittar, André; Segond, Frédérique; Darmoni, Stefan; Metzger, Marie-Hélène

    2015-01-01

    The objective of the SYNODOS collaborative project was to develop a generic IT solution, combining a medical terminology server, a semantic analyser and a knowledge base. The goal of the project was to generate meaningful epidemiological data for various medical domains from the textual content of French medical records. In the context of this project, we built a care pathway oriented conceptual model and corresponding annotation method to develop and evaluate an expert system's knowledge base. The annotation method is based on a semi-automatic process, using a software application (MedIndex). This application exchanges with a cross-lingual multi-termino-ontology portal. The annotator selects the most appropriate medical code proposed for the medical concept in question by the multi-termino-ontology portal and temporally labels the medical concept according to the course of the medical event. This choice of conceptual model and annotation method aims to create a generic database of facts for the secondary use of electronic health records data.

  16. Exploring the Dynamics and Modeling National Budget as a Supply Chain System: A Proposal for Reengineering the Budgeting Process and for Developing a Management Flight Simulator

    DTIC Science & Technology

    2012-09-01

    beer production and distribution. The whole system consists of four entities: Retailer, Wholesaler, Distributor, and Factory (R, W, D, and F). It is ...

  17. Processes and process development in Japan

    NASA Technical Reports Server (NTRS)

    Noda, T.

    1986-01-01

    The commercialization of solar power generation necessitates the development of a low cost manufacturing method for silicon suitable for solar cells. The manufacturing methods for semiconductor grade silicon (SEG-Si) and the development of solar grade silicon (SOG-Si) in foreign countries were investigated. It was concluded that the most efficient method of developing such materials was the hydrogen reduction of trichlorosilane (TCS) using a fluidized bed reactor. Low cost production of polysilicon requires cost reductions in raw materials, energy, labor, and capital. These conditions were carefully reviewed. The overall conclusion was that a development program should be based on the TCS-FBR process and that the experimental program should be conducted in test facilities capable of producing 10 tons of silicon granules per year.

  18. Development of a Software Evolution Process for Military Systems Composed of Integrated Commercial off the Shelf Components

    DTIC Science & Technology

    2000-03-01

    system architectures. Traditional DoD source code development and evolution methodologies do not effectively support COTS-intensive systems. To fully realize the benefits of COTS technologies and products, the DoD must adopt new ways to sustain system evolution in the face of a dynamic market environment subject to constant change. This thesis proposes a new software evolution methodology to effectively maintain COTS-intensive military systems

  19. Network command processing system overview

    NASA Technical Reports Server (NTRS)

    Nam, Yon-Woo; Murphy, Lisa D.

    1993-01-01

    The Network Command Processing System (NCPS) developed for the National Aeronautics and Space Administration (NASA) Ground Network (GN) stations is a spacecraft command system utilizing a MULTIBUS I/68030 microprocessor. This system was developed and implemented at ground stations worldwide to provide a Project Operations Control Center (POCC) with command capability for support of spacecraft operations such as the LANDSAT, Shuttle, Tracking and Data Relay Satellite, and Nimbus-7. The NCPS consolidates multiple modulation schemes for supporting various manned/unmanned orbital platforms. The NCPS interacts with the POCC and a local operator to process configuration requests, generate modulated uplink sequences, and inform users of the ground command link status. This paper presents the system functional description, hardware description, and the software design.

  20. Integrated water system simulation by considering hydrological and biogeochemical processes: model development, with parameter sensitivity and autocalibration

    NASA Astrophysics Data System (ADS)

    Zhang, Y. Y.; Shao, Q. X.; Ye, A. Z.; Xing, H. T.; Xia, J.

    2016-02-01

    Integrated water system modeling is a feasible approach to understanding severe water crises in the world and promoting the implementation of integrated river basin management. In this study, a classic hydrological model (the time variant gain model: TVGM) was extended to an integrated water system model by coupling multiple water-related processes in hydrology, biogeochemistry, water quality, and ecology, and considering the interference of human activities. A parameter analysis tool, which included sensitivity analysis, autocalibration and model performance evaluation, was developed to improve modeling efficiency. To demonstrate the model performances, the Shaying River catchment, which is the largest highly regulated and heavily polluted tributary of the Huai River basin in China, was selected as the case study area. The model performances were evaluated on the key water-related components including runoff, water quality, diffuse pollution load (or nonpoint sources) and crop yield. Results showed that our proposed model simulated most components reasonably well. The simulated daily runoff at most regulated and less-regulated stations matched well with the observations. The average correlation coefficient and Nash-Sutcliffe efficiency were 0.85 and 0.70, respectively. Both the simulated low and high flows at most stations were improved when the dam regulation was considered. The daily ammonium-nitrogen (NH4-N) concentration was also well captured with the average correlation coefficient of 0.67. Furthermore, the diffuse source load of NH4-N and the corn yield were reasonably simulated at the administrative region scale. This integrated water system model is expected to improve the simulation performances with extension to more model functionalities, and to provide a scientific basis for the implementation in integrated river basin managements.
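
    For readers less familiar with the two skill scores quoted above, the short Python sketch below computes the correlation coefficient and the Nash-Sutcliffe efficiency for paired observed and simulated series; these are the standard textbook definitions, not code taken from the model described.

    ```python
    import numpy as np

    def nash_sutcliffe(obs, sim):
        """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means no better than the observed mean."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def correlation(obs, sim):
        """Pearson correlation coefficient between observed and simulated series."""
        return float(np.corrcoef(obs, sim)[0, 1])

    # Toy example with made-up values:
    obs = [1.2, 3.4, 2.8, 5.0, 4.1]
    sim = [1.0, 3.0, 3.1, 4.6, 4.3]
    print(round(nash_sutcliffe(obs, sim), 2), round(correlation(obs, sim), 2))   # 0.94 0.97
    ```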

  1. 77 FR 50724 - Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... nuclear power plants described in the Institute of Electrical and Electronic Engineers (IEEE) Standard 1074-2006, ``IEEE Standard for Developing a Software Project Life Cycle Process,'' issued 2006. DATES... 1997. This RG endorses IEEE Std. 1074-2006, ``IEEE Standard for Developing a Software Project...

  2. Turbine Blade Image Processing System

    NASA Astrophysics Data System (ADS)

    Page, Neal S.; Snyder, Wesley E.; Rajala, Sarah A.

    1983-10-01

    A vision system has been developed at North Carolina State University to identify the orientation and three dimensional location of steam turbine blades that are stacked in an industrial A-frame cart. The system uses a controlled light source for structured illumination and a single camera to extract the information required by the image processing software to calculate the position and orientation of a turbine blade in real time.

  3. Development and field application of a littoral processes monitoring system for examination of the relevant time scales of sediment suspension processes

    NASA Astrophysics Data System (ADS)

    Thosteson, Eric David

    A microcontroller-based system of oceanographic instrumentation providing a comprehensive set of measurements relevant to sediment transport processes has been developed. Analysis of the data provided by the system yields time series of vertical profiles of mean sediment size and concentration, horizontal profiles of bedform geometry, and single location measurements of flow velocity, pressure, turbidity, and water temperature. Details of the system architecture, including capabilities provided by both hardware and software contained within the system are given. An improved method for the determination of suspended sediment size and concentration from the system's acoustic backscatter intensity measurements is presented. By retaining the size dependence throughout the derivation for an explicit solution for concentration, a new explicit solution to the acoustic backscatter equation results. This new concentration solution improves the technique for determining median sediment size by incorporating sediment attenuation in the calculation. Because this new technique relies on the minimization of the variance in concentration as determined by different frequency transducers, the previous technique of pairing transducers of different frequencies is replaced by a technique making use of any number of different frequency transducers. The new size/concentration inversion technique is tested using both simulated and laboratory data. Numerical precision is shown to be the only source of error with the use of simulated data. Laboratory tests result in less than 20% error in the determination of both concentration and size over a range of nearly one meter. Finally, suspended sediment concentration data from the nearshore region obtained from an experiment performed in Duck, North Carolina, are examined to find the relevant time scales of sediment suspension. In this location, low frequency forcing mechanisms are as significant in suspending sediment as the incident-band wave
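
    The abstract describes selecting the median sediment size that minimizes the variance of the concentration estimates obtained from transducers of different frequencies. The Python sketch below illustrates that selection logic only; the function concentration_from_backscatter is a toy, hypothetical stand-in for the actual frequency- and size-dependent acoustic backscatter inversion developed in the dissertation.

    ```python
    import numpy as np

    def concentration_from_backscatter(intensity, frequency_khz, grain_size_um):
        """Toy stand-in for the real acoustic inversion (purely illustrative)."""
        return intensity / np.sqrt(frequency_khz * grain_size_um)

    def pick_size(intensities, frequencies_khz, candidate_sizes_um):
        """Return (grain size, mean concentration) chosen so that the per-frequency
        concentration estimates agree best, i.e. have minimum variance across transducers."""
        best = None
        for size in candidate_sizes_um:
            c = np.array([concentration_from_backscatter(i, f, size)
                          for i, f in zip(intensities, frequencies_khz)])
            if best is None or c.var() < best[0]:
                best = (c.var(), size, c.mean())
        return best[1], best[2]

    # Example call with made-up numbers (meaningful only with a real inversion model):
    print(pick_size([1.0, 1.4, 2.1], [1000, 2500, 5000], [100, 200, 300, 400]))
    ```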

  4. Liga developer apparatus system

    DOEpatents

    Boehme, Dale R.; Bankert, Michelle A.; Christenson, Todd R.

    2003-01-01

    A system to fabricate precise, high aspect ratio polymeric molds by a photolithographic process is described. The molds are used for producing micro-scale parts from engineering materials by the LIGA process. The invention is a developer system for developing a PMMA photoresist having exposed patterns comprising features with both very small sizes and very high aspect ratios. The developer system of the present invention comprises a developer tank, an intermediate rinse tank and a final rinse tank, each tank having a source of high frequency sonic agitation, temperature control, and continuous filtration. It has been found that by moving a patterned wafer through a specific sequence of developer/rinse solutions, in which an intermediate rinse solution completes development of those portions of the exposed resist left undeveloped by the development solution, by agitating the solutions with a source of high frequency sonic vibration, and by adjusting and closely controlling the temperatures while continuously filtering and recirculating these solutions, it is possible to maintain the kinetic dissolution of the exposed PMMA polymer as the rate limiting step.

  5. Developing Data System Engineers

    NASA Astrophysics Data System (ADS)

    Behnke, J.; Byrnes, J. B.; Kobler, B.

    2011-12-01

    In the early days of general computer systems for science data processing, staff members working on NASA's data systems would most often be hired as mathematicians. Computer engineering was very often filled by those with electrical engineering degrees. Today, the Goddard Space Flight Center has special position descriptions for data scientists or as they are more commonly called: data systems engineers. These staff members are required to have very diverse skills, hence the need for a generalized position description. There is always a need for data systems engineers to develop, maintain and operate the complex data systems for Earth and space science missions. Today's data systems engineers however are not just mathematicians, they are computer programmers, GIS experts, software engineers, visualization experts, etc... They represent many different degree fields. To put together distributed systems like the NASA Earth Observing Data and Information System (EOSDIS), staff are required from many different fields. Sometimes, the skilled professional is not available and must be developed in-house. This paper will address the various skills and jobs for data systems engineers at NASA. Further it explores how to develop staff to become data scientists.

  6. Femtosecond Laser Processing of Agarose Gel Surrounding Protein Crystals for Development of an Automated Crystal Capturing System

    NASA Astrophysics Data System (ADS)

    Sugiyama, Shigeru; Hasenaka, Hitoshi; Hirose, Mika; Shimizu, Noriko; Kitatani, Tomoya; Takahashi, Yoshinori; Adachi, Hiroaki; Takano, Kazufumi; Murakami, Satoshi; Inoue, Tsuyoshi; Mori, Yusuke; Matsumura, Hiroyoshi

    2009-10-01

    Protein crystals must be captured to be mounted onto the goniometer head of X-ray diffraction equipment for structural analysis. However, this capturing operation has to be performed manually under microscopic observation. Crystallographers often face problems with this operation because protein crystals are very soft and fragile. Here, we crystallized elastase, thaumatin, glucose isomerase, and lysozyme in 2.0% (w/v) agarose gels and applied a femtosecond laser to process the agarose gel surrounding the protein crystals. A software-based operation system was established to enable automated laser processing. This new approach allows high-speed, high-precision, and reproducible processing of the gel without unsealing the crystallization trays. The processed gel containing crystals could be captured using a nylon loop without difficulty, followed by mounting the crystal onto the goniometer head of the X-ray diffraction equipment. X-ray diffraction analysis of such crystals suggested that the processed agarose gel with a thickness of approximately <0.2 mm has little effect on the background X-ray scattering. Furthermore, the effect of laser irradiation was investigated by X-ray diffraction and subsequent structural analyses, which demonstrated that the quality of the diffraction data and obtained electron density was essentially the same as that obtained before laser irradiation. On the other hand, the manually processed gel-grown crystals gave higher values on the background X-ray scattering. These comparative experimental results show clear advantages of our laser processing system. This approach leads to the possibility that protein crystals can be captured reproducibly without affecting any later crystallographic analysis, thereby providing an automated system for crystal capture.

  7. Mars Aqueous Processing System

    NASA Technical Reports Server (NTRS)

    Berggren, Mark; Wilson, Cherie; Carrera, Stacy; Rose, Heather; Muscatello, Anthony; Kilgore, James; Zubrin, Robert

    2012-01-01

    The goal of the Mars Aqueous Processing System (MAPS) is to establish a flexible process that generates multiple products that are useful for human habitation. Selectively extracting useful components into an aqueous solution, and then sequentially recovering individual constituents, can obtain a suite of refined or semi-refined products. Similarities in the bulk composition (although not necessarily of the mineralogy) of Martian and Lunar soils potentially make MAPS widely applicable. Similar process steps can be conducted on both Mars and Lunar soils while tailoring the reaction extents and recoveries to the specifics of each location. The MAPS closed-loop process selectively extracts, and then recovers, constituents from soils using acids and bases. The emphasis on Mars involves the production of useful materials such as iron, silica, alumina, magnesia, and concrete with recovery of oxygen as a byproduct. On the Moon, similar chemistry is applied with emphasis on oxygen production. This innovation has been demonstrated to produce high-grade materials, such as metallic iron, aluminum oxide, magnesium oxide, and calcium oxide, from lunar and Martian soil simulants. Most of the target products exhibited purities of 80 to 90 percent or more, allowing direct use for many potential applications. Up to one-fourth of the feed soil mass was converted to metal, metal oxide, and oxygen products. The soil residue contained elevated silica content, allowing for potential additional refining and extraction for recovery of materials needed for photovoltaic, semiconductor, and glass applications. A high-grade iron oxide concentrate derived from lunar soil simulant was used to produce a metallic iron component using a novel, combined hydrogen reduction/metal sintering technique. The part was subsequently machined and found to be structurally sound. The behavior of the lunar-simulant-derived iron product was very similar to that produced using the same methods on a Michigan iron

  8. Developing the JPL Engineering Processes

    NASA Technical Reports Server (NTRS)

    Linick, Dave; Briggs, Clark

    2004-01-01

    This paper briefly recounts the recent history of process reengineering at the NASA Jet Propulsion Laboratory, with a focus on the engineering processes. The JPL process structure is described and the process development activities of the past several years outlined. The main focus of the paper is on the current process structure, the emphasis on the flight project life cycle, the governance approach that lead to Flight Project Practices, and the remaining effort to capture process knowledge at the detail level of the work group.

  9. A Prototyping Environment for Research on Human-Machine Interfaces in Process Control: Use of Microsoft WPF for Microworld and Distributed Control System Development

    SciTech Connect

    Roger Lew; Ronald L. Boring; Thomas A. Ulrich

    2014-08-01

    Operators of critical processes, such as nuclear power production, must contend with highly complex systems, procedures, and regulations. Developing human-machine interfaces (HMIs) that better support operators is a high priority for ensuring the safe and reliable operation of critical processes. Human factors engineering (HFE) provides a rich and mature set of tools for evaluating the performance of HMIs, but the set of tools for developing and designing HMIs is still in its infancy. Here we propose that Microsoft Windows Presentation Foundation (WPF) is well suited for many roles in the research and development of HMIs for process control.

  10. Improving Ground Penetrating Radar Imaging in High Loss Environments by Coordinated System Development, Data Processing, Numerical Modeling, & Visualization ...

    SciTech Connect

    Powers, Michael H.

    2003-06-01

    The Department of Energy has identified the location and characterization of subsurface contaminants and the characterization of the subsurface as a priority need. Many DOE facilities are in need of subsurface imaging in the vadose and saturated zones. This includes (1) the detection and characterization of metal and concrete structures, (2) the characterization of waste pits (for both contents and integrity) and (3) mapping the complex geological/hydrological framework of the vadose and saturated zones. The DOE has identified ground penetrating radar (GPR) as a method that can non-invasively map transportation pathways and vadose zone heterogeneity. An advanced GPR system and advanced subsurface modeling, processing, imaging, and inversion techniques can be directly applied to several DOE science needs in more than one focus area and at many sites. Needs for enhanced subsurface imaging have been identified at Hanford, INEEL, SRS, ORNL, LLNL, SNL, LANL, and many other sites. In fact, needs for better subsurface imaging probably exist at all DOE sites. However, GPR performance is often inadequate due to increased attenuation and dispersion when soil conductivities are high. Our objective is to extend the limits of performance of GPR by improvements to both hardware and numerical computation. The key features include (1) greater dynamic range through real time digitizing, receiver gain improvements, and high output pulser, (2) modified, fully characterized antennas with sensors to allow dynamic determination of the changing radiated waveform, (3) modified deconvolution and depth migration algorithms exploiting the new antenna output information, (4) development of automatic full waveform inversion made possible by the known radiated pulse shape.
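
    The modified deconvolution step listed above relies on the radiated source waveform being known from the antenna sensors. As a minimal illustration only (not the project's algorithm), the following Python sketch performs a water-level-stabilized frequency-domain deconvolution of a synthetic trace with an assumed measured source pulse; all names and values are hypothetical.

        import numpy as np

        def water_level_deconvolve(trace, source_pulse, water_level=1e-2):
            """Remove a measured source wavelet from a GPR trace in the
            frequency domain, stabilised with a simple water-level term."""
            n = len(trace)
            T = np.fft.rfft(trace, n)
            S = np.fft.rfft(source_pulse, n)
            power = np.abs(S) ** 2
            floor = water_level * power.max()        # stabilise small spectral values
            R = T * np.conj(S) / np.maximum(power, floor)
            return np.fft.irfft(R, n)

        # Toy example: delayed, scaled copies of the pulse should deconvolve
        # to an impulse-like reflectivity series.
        pulse = np.exp(-np.linspace(-3, 3, 64) ** 2)
        trace = np.zeros(512)
        trace[100:164] += 0.8 * pulse
        trace[300:364] -= 0.4 * pulse
        reflectivity = water_level_deconvolve(trace, np.pad(pulse, (0, 512 - 64)))
        print(reflectivity.argmax())   # ~100, the position of the first reflector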

  11. Development of a real-time lumbar ultrasound image processing system for epidural needle entry site localization.

    PubMed

    Yusong Leng; Shuang Yu; Kok Kiong Tan; Tildsley, Philip; Sia, Alex Tiong Heng; Ban Leong Sng

    2016-08-01

    A fully automatic ultrasound image processing system that can determine the needle entry site for epidural anesthesia (EA) in real time is presented in this paper. Neither specialist knowledge from the anesthetists nor additional hardware is required to operate the system, which first directs the anesthetists to the desired insertion region in the longitudinal view, i.e., lumbar level L3-L4, and then locates the ideal puncture site by instructing the anesthetists to rotate and slightly adjust the position of the ultrasound probe. To implement these functions, modules for image processing, panorama stitching, feature extraction/selection, template matching, and support vector machine (SVM) classification are incorporated in the system. Additionally, a user-friendly graphical user interface (GUI), which displays the processing results and guides anesthetists intuitively, is provided to conceal the intricacy of the algorithms. The feasibility and effectiveness of the proposed system have been evaluated through a set of real-time tests on 53 volunteers from a local hospital.
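
    The decision step described above combines extracted image features with an SVM classifier. The sketch below is not the published pipeline; it is a hedged Python example, using scikit-learn on synthetic feature vectors, of how frames showing a suitable entry window might be classified.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        # Hypothetical data: each row is a feature vector extracted from an
        # ultrasound frame patch; labels mark frames showing a suitable
        # interspinous window (1) versus not (0).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 12))
        y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=200) > 0).astype(int)

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
        scores = cross_val_score(clf, X, y, cv=5)
        print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))

        # In a real-time loop, the fitted classifier would score each incoming
        # frame and the GUI would prompt the anesthetist when a frame is
        # classified as a suitable entry site.
        clf.fit(X, y)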

  12. WRAP process area development control work plan

    SciTech Connect

    Leist, K.L., Fluor Daniel Hanford

    1997-02-27

    This work plan defines the manner in which the Waste Receiving and Processing Facility, Module I Process Area will be maintained under development control status. This status permits resolution of identified design discrepancies, control system changes, as-building of equipment, and modifications to increase process operability and maintainability to proceed as parallel efforts. This work plan maintains configuration control as these efforts are undertaken. The task will end with system testing and reissue of field-verified design drawings.

  13. A Quality Improvement Customer Service Process and CSS [Customer Service System]. Burlington County College Employee Development Series, Volumes I & II.

    ERIC Educational Resources Information Center

    Burlington County Coll., Pemberton, NJ.

    Prepared for use by staff in development workshops at Burlington County College (BCC), in New Jersey, this handbook offers college-wide guidelines for improving the quality of service provided to internal and external customers, and reviews key elements of BCC's Customer Service System (CSS), a computerized method of recording and following-up on…

  14. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  15. Development and Testing of an Experimental Polysensory Instructional System for Teaching Electric Arc Welding Processes. Report No. 24. Final Report.

    ERIC Educational Resources Information Center

    Sergeant, Harold A.

    The population of the study consisted of 15 high school industrial arts students, 10 freshman and sophomore college students, and 10 adults. A polysensory, self-pacing instructional system was developed which included (1) pretests and post tests, (2) a general instruction book, (3) equipment to practice arc welding, (4) programed instruction…

  16. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  17. The Impact of City-level Permitting Processes on Residential Photovoltaic Installation Prices and Development Times: An Empirical Analysis of Solar Systems in California Cities

    SciTech Connect

    Wiser, Ryan; Dong, Changgui

    2013-04-01

    Business process or “soft” costs account for well over 50% of the installed price of residential photovoltaic (PV) systems in the United States, so understanding these costs is crucial for identifying PV cost-reduction opportunities. Among these costs are those imposed by city-level permitting processes, which may add both expense and time to the PV development process. Building on previous research, this study evaluates the effect of city-level permitting processes on the installed price of residential PV systems and on the time required to develop and install those systems. The study uses a unique dataset from the U.S. Department of Energy’s Rooftop Solar Challenge Program, which includes city-level permitting process “scores,” plus data from the California Solar Initiative and the U.S. Census. Econometric methods are used to quantify the price and development-time effects of city-level permitting processes on more than 3,000 PV installations across 44 California cities in 2011. Results indicate that city-level permitting processes have a substantial and statistically significant effect on average installation prices and project development times. The results suggest that cities with the most favorable (i.e., highest-scoring) permitting practices can reduce average residential PV prices by $0.27–$0.77/W (4%–12% of median PV prices in California) compared with cities with the most onerous (i.e., lowest-scoring) permitting practices, depending on the regression model used. Though the empirical models for development times are less robust, results suggest that the most streamlined permitting practices may shorten development times by around 24 days on average (25% of the median development time). These findings illustrate the potential price and development-time benefits of streamlining local permitting procedures for PV systems.
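
    The reported price effects come from regression models relating installation-level prices to city permitting scores. A hedged Python sketch of that kind of fit is given below, using statsmodels on a synthetic stand-in dataset; the column names and coefficient values are illustrative, not the study's data.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical stand-in for the installation-level dataset: price per
        # watt, a city permitting score, and a system-size control.
        rng = np.random.default_rng(1)
        n = 500
        df = pd.DataFrame({
            "permit_score": rng.uniform(0, 100, n),
            "system_kw": rng.uniform(2, 10, n),
        })
        df["price_per_watt"] = (6.0 - 0.005 * df["permit_score"]
                                - 0.05 * df["system_kw"]
                                + rng.normal(scale=0.4, size=n))

        # More favorable permitting (higher score) should appear as a
        # negative coefficient on price.
        model = smf.ols("price_per_watt ~ permit_score + system_kw", data=df).fit()
        print(model.params["permit_score"])   # ~ -0.005 $/W per score point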

  18. Development of a coal-fired combustion system for industrial process heating applications. Quarterly technical progress report, January 1993--March 1993

    SciTech Connect

    Not Available

    1993-04-30

    This advanced combustion system research program is for the development of innovative coal-fired process heaters which can be used for high temperature melting, smelting and waste vitrification processes. The process heater concepts to be developed are based on advanced glass melting and ore smelting furnaces developed and patented by Vortec Corporation. The process heater systems to be developed have multiple use applications; however, the Phase III research effort is being focused on the development of a process heater system to be used for producing value added vitrified glass products from boiler/incinerator ashes and industrial wastes. The primary objective of the Phase III project is to develop and integrate all the system components, from fuel through total system controls, and then test the complete system in order to evaluate its potential marketability. During the current reporting period, a majority of the effort was spent performing the initial industrial proof-of-concept test and installing and integrating the Wet Electrostatic Precipitator (WESP). The other system modifications are well underway with the designs of the modifications to the batch/coal feed system being completed. A Purchase Order has been issued to a material conveying equipment vendor for the purchase of the batch/coal feeding equipment. The delivery and installation of the material conveying equipment is expected to occur in July and early August. The commercialization planning is continuing with the completion of a draft Business Plan. This plan is currently undergoing internal review, and will be submitted to Dawnbreaker, a DOE contracted small business consulting firm, for review.

  19. Modeling Kanban Processes in Systems Engineering

    DTIC Science & Technology

    2012-06-01

    Modeling Kanban Processes in Systems Engineering Richard Turner School of Systems and Enterprises Stevens Institute of Technology Hoboken, NJ...dingold@usc.edu, jolane@usc.edu Abstract—Systems engineering processes using pull scheduling methods (kanban) are being evaluated with hybrid...development projects incrementally evolve capabilities of existing systems and/or systems of systems. A kanban-based scheduling system was defined and

  20. Lunar materials processing system integration

    NASA Technical Reports Server (NTRS)

    Sherwood, Brent

    1992-01-01

    The theme of this paper is that governmental resources will not permit the simultaneous development of all viable lunar materials processing (LMP) candidates. Choices will inevitably be made, based on the results of system integration trade studies comparing candidates to each other for high-leverage applications. It is in the best long-term interest of the LMP community to lead the selection process itself, quickly and practically. The paper is in five parts. The first part explains what systems integration means and why the specialized field of LMP needs this activity now. The second part defines the integration context for LMP -- by outlining potential lunar base functions, their interrelationships and constraints. The third part establishes perspective for prioritizing the development of LMP methods, by estimating realistic scope, scale, and timing of lunar operations. The fourth part describes the use of one type of analytical tool for gaining understanding of system interactions: the input/output model. A simple example solved with linear algebra is used to illustrate. The fifth and closing part identifies specific steps needed to refine the current ability to study lunar base system integration. Research specialists have a crucial role to play now in providing the data upon which this refinement process must be based.
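
    The input/output model mentioned above reduces to a linear-algebra balance: gross output x must cover both inter-process consumption A x and the base's final demand d, i.e., (I - A) x = d. A minimal Python sketch, with purely illustrative processes and coefficients, follows.

        import numpy as np

        # Illustrative inter-process requirements matrix A: entry A[i, j] is the
        # amount of product i consumed to make one unit of product j.
        # Processes (hypothetical): 0 = oxygen, 1 = metals, 2 = propellant.
        A = np.array([
            [0.00, 0.10, 0.30],   # oxygen needed per unit of each product
            [0.05, 0.00, 0.10],   # metals needed per unit of each product
            [0.10, 0.05, 0.00],   # propellant needed per unit of each product
        ])

        # Final demand d from the lunar base (units per month, illustrative).
        d = np.array([100.0, 20.0, 50.0])

        # Gross output x must satisfy x = A x + d, i.e. (I - A) x = d.
        x = np.linalg.solve(np.eye(3) - A, d)
        print(np.round(x, 1))   # total production needed, including internal use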

  1. Monolithic Fuel Fabrication Process Development

    SciTech Connect

    C. R. Clark; N. P. Hallinan; J. F. Jue; D. D. Keiser; J. M. Wight

    2006-05-01

    The pursuit of a high uranium density research reactor fuel plate has led to monolithic fuel, which possesses the greatest possible uranium density in the fuel region. Fabrication process developments include improvements to friction stir welding tool geometry and cooling, and a reduction in the length of time required to complete the transient liquid phase bonding process. Annealing effects on the microstructures of the U-10Mo foil and the friction stir welded aluminum 6061 cladding are also examined.

  2. Improved television signal processing system

    NASA Technical Reports Server (NTRS)

    Wong, R. Y.

    1967-01-01

    Digital system processes spacecraft television pictures by converting images sensed on a photostorage vidicon to pulses which can be transmitted by telemetry. This system can be applied in the processing of medical X ray photographs and in electron microscopy.

  3. Processes of Expressive Behavior Development.

    ERIC Educational Resources Information Center

    Zivin, Gail

    1986-01-01

    Seventeen processes in the development of expressive behavior are reviewed and coordinated in a framework that is shown to accommodate current perspectives on expressive behavior development. Works of Ekman, Izard, Lewis and Michalson, and Sroufe are briefly reviewed. Neglected areas of research are indicated and the course of expressive behavior…

  4. Processing Poetry to Develop Literacy.

    ERIC Educational Resources Information Center

    Nelson, Marguerite Hansen

    1994-01-01

    Explains the use of experimental poetry forms for decoding practice to help develop reading skills in elementary school students. The use of computers for word processing capabilities is discussed; seven forms of poetry are described; and results are examined in terms of motivation and the development of literacy. (four references) (LRW)

  5. Development of a Reference Image Collection Library for Histopathology Image Processing, Analysis and Decision Support Systems Research.

    PubMed

    Kostopoulos, Spiros; Ravazoula, Panagiota; Asvestas, Pantelis; Kalatzis, Ioannis; Xenogiannopoulos, George; Cavouras, Dionisis; Glotsos, Dimitris

    2017-06-01

    Histopathology image processing, analysis and computer-aided diagnosis have been shown to be effective assisting tools towards reliable and intra-/inter-observer invariant decisions in traditional pathology. Especially for cancer patients, decisions need to be as accurate as possible in order to increase the probability of optimal treatment planning. In this study, we propose a new image collection library (HICL-Histology Image Collection Library) comprising 3831 histological images of three different diseases, for fostering research in histopathology image processing, analysis and computer-aided diagnosis. Raw data comprised 93, 116 and 55 cases of brain, breast and laryngeal cancer respectively collected from the archives of the University Hospital of Patras, Greece. The 3831 images were generated from the most representative regions of the pathology, specified by an experienced histopathologist. The HICL Image Collection is free for access under an academic license at http://medisp.bme.teiath.gr/hicl/ . Potential exploitations of the proposed library span a broad spectrum, such as in image processing to improve visualization, in segmentation for nuclei detection, in decision support systems for second opinion consultations, in statistical analysis for investigation of potential correlations between clinical annotations and imaging findings and, generally, in fostering research on histopathology image processing and analysis. To the best of our knowledge, the HICL constitutes the first attempt towards creation of a reference image collection library in the field of traditional histopathology, publicly and freely available to the scientific community.

  6. Cell culture process development: advances in process engineering.

    PubMed

    Heath, Carole; Kiss, Robert

    2007-01-01

    Representatives from the cell culture process development community met on September 11 and 12, 2006 at the ACS National Meeting in San Francisco to discuss "Cell Culture Process Development: Advances in Process Engineering". This oral session was held as part of the Division of Biochemical Technology (BIOT) program. The presentations addressed the very small scale (less than 1 mL) to the very large scale (20,000 L). The topics covered included development of high throughput cell culture screening systems, modeling and characterization of bioreactor environments from mixing and shear perspectives at both small and large scales, systematic approaches for improving scale-up and scale-down activities, development of disposable bioreactor technologies, and novel perfusion culture approaches. All told, this well-attended session resulted in a valuable exchange of technical information and demonstrated a high level of interest within the process development community.

  7. RSMASS system model development

    SciTech Connect

    Marshall, A.C.; Gallup, D.R.

    1998-07-01

    RSMASS system mass models have been used for more than a decade to make rapid estimates of space reactor power system masses. This paper reviews the evolution of the RSMASS models and summarizes present capabilities. RSMASS has evolved from a simple model used to make rough estimates of space reactor and shield masses to a versatile space reactor power system model. RSMASS uses unique reactor and shield models that permit rapid mass optimization calculations for a variety of space reactor power and propulsion systems. The RSMASS-D upgrade of the original model includes algorithms for the balance of the power system, a number of reactor and shield modeling improvements, and an automatic mass optimization scheme. The RSMASS-D suite of codes cover a very broad range of reactor and power conversion system options as well as propulsion and bimodal reactor systems. Reactor choices include in-core and ex-core thermionic reactors, liquid metal cooled reactors, particle bed reactors, and prismatic configuration reactors. Power conversion options include thermoelectric, thermionic, Stirling, Brayton, and Rankine approaches. Program output includes all major component masses and dimensions, efficiencies, and a description of the design parameters for a mass optimized system. In the past, RSMASS has been used as an aid to identify and select promising concepts for space power applications. The RSMASS modeling approach has been demonstrated to be a valuable tool for guiding optimization of the power system design; consequently, the model is useful during system design and development as well as during the selection process. An improved in-core thermionic reactor system model RSMASS-T is now under development. The current development of the RSMASS-T code represents the next evolutionary stage of the RSMASS models. RSMASS-T includes many modeling improvements and is planned to be more user-friendly. RSMASS-T will be released as a fully documented, certified code at the end of

  8. Chemical roots of biological evolution: the origins of life as a process of development of autonomous functional systems

    PubMed Central

    Ruiz-Mirazo, Kepa; Briones, Carlos

    2017-01-01

    In recent years, an extension of the Darwinian framework is being considered for the study of prebiotic chemical evolution, shifting the attention from homogeneous populations of naked molecular species to populations of heterogeneous, compartmentalized and functionally integrated assemblies of molecules. Several implications of this shift of perspective are analysed in this critical review, both in terms of the individual units, which require an adequate characterization as self-maintaining systems with an internal organization, and also in relation to their collective and long-term evolutionary dynamics, based on competition, collaboration and selection processes among those complex individuals. On these lines, a concrete proposal for the set of molecular control mechanisms that must be coupled to bring about autonomous functional systems, at the interface between chemistry and biology, is provided. PMID:28446711

  9. Chemical roots of biological evolution: the origins of life as a process of development of autonomous functional systems.

    PubMed

    Ruiz-Mirazo, Kepa; Briones, Carlos; de la Escosura, Andrés

    2017-04-01

    In recent years, an extension of the Darwinian framework is being considered for the study of prebiotic chemical evolution, shifting the attention from homogeneous populations of naked molecular species to populations of heterogeneous, compartmentalized and functionally integrated assemblies of molecules. Several implications of this shift of perspective are analysed in this critical review, both in terms of the individual units, which require an adequate characterization as self-maintaining systems with an internal organization, and also in relation to their collective and long-term evolutionary dynamics, based on competition, collaboration and selection processes among those complex individuals. On these lines, a concrete proposal for the set of molecular control mechanisms that must be coupled to bring about autonomous functional systems, at the interface between chemistry and biology, is provided. © 2017 The Authors.

  10. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
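
    As a toy illustration of the combination described above (not the authors' model), the Python sketch below lets a continuously integrated "productivity" state, driven by a simple system-dynamics rate equation, determine how long each discrete development task takes.

        # Continuous piece (toy system-dynamics rate equation): productivity
        # erodes under schedule pressure and recovers toward 1.0 otherwise.
        def d_productivity(p, pressure, recovery=0.05, strain=0.08):
            return recovery * (1.0 - p) - strain * pressure * p

        # Discrete-event piece: tasks are completed one after another; each
        # task's duration depends on the productivity prevailing while it runs.
        tasks = [8.0, 5.0, 13.0, 3.0, 8.0]   # nominal effort in person-days
        clock, productivity = 0.0, 1.0
        dt = 0.25                             # integration step for the continuous state

        for i, effort in enumerate(tasks):
            remaining = effort
            pressure = (len(tasks) - i) / len(tasks)   # crude backlog-based pressure
            while remaining > 0.0:                     # integrate until the task event fires
                remaining -= productivity * dt
                productivity += d_productivity(productivity, pressure) * dt
                clock += dt
            print("task %d done at t = %5.1f days (productivity %.2f)"
                  % (i + 1, clock, productivity))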

  11. Central waste processing system

    NASA Technical Reports Server (NTRS)

    Kester, F. L.

    1973-01-01

    A new concept for processing spacecraft type wastes has been evaluated. The feasibility of reacting various waste materials with steam at temperatures of 538 - 760 C in both a continuous and batch reactor with residence times from 3 to 60 seconds has been established. Essentially complete gasification is achieved. Product gases are primarily hydrogen, carbon dioxide, methane, and carbon monoxide. Water soluble synthetic wastes are readily processed in a continuous tubular reactor at concentrations up to 20 weight percent. The batch reactor is able to process wet and dry wastes at steam to waste weight ratios from 2 to 20. Feces, urine, and synthetic wastes have been successfully processed in the batch reactor.

  12. The standards process: X3 information processing systems

    NASA Technical Reports Server (NTRS)

    Emard, Jean-Paul

    1993-01-01

    The topics are presented in viewgraph form and include the following: International Organization for Standards (ISO); International Electrotechnical Committee (IEC); ISO/IEC Joint Technical Committee 1 (JTC-1); U.S. interface to JTC-1; ANSI; national organizations; U.S. standards development processes; national and international standards developing organizations; regional organizations; and X3 information processing systems.

  13. Development of data processing interpretation and analysis system for the remote sensing of trace atmospheric gas species

    NASA Technical Reports Server (NTRS)

    Casas, J. C.; Koziana, J. V.; Saylor, M. S.; Kindle, E. C.

    1982-01-01

    Problems associated with the development of the measurement of air pollution from satellites (MAPS) experiment program are addressed. The primary thrust of this research was the utilization of the MAPS experiment data in three application areas: low altitude aircraft flights (one to six km); mid altitude aircraft flights (eight to 12 km); and orbiting space platforms. Extensive research work in four major areas of data management was the framework for implementation of the MAPS experiment technique. These areas are: (1) data acquisition; (2) data processing, analysis and interpretation algorithms; (3) data display techniques; and (4) information production.

  14. Course Development: Industrial or Social Process.

    ERIC Educational Resources Information Center

    Kaufman, David

    The development of course materials at the Open Learning Institute, British Columbia, Canada, is examined from two perspectives: as an industrial process and as a social process. The public institute provides distance education through paced home-study courses. The course team model used at the Institute is a system approach. Course development…

  15. Course Development: Industrial or Social Process.

    ERIC Educational Resources Information Center

    Kaufman, David

    The development of course materials at the Open Learning Institute, British Columbia, Canada, is examined from two perspectives: as an industrial process and as a social process. The public institute provides distance education through paced home-study courses. The course team model used at the Institute is a system approach. Course development…

  16. Software Model Of Software-Development Process

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Synott, Debra J.; Levary, Reuven R.

    1990-01-01

    Collection of computer programs constitutes software tool for simulation of medium- to large-scale software-development projects. Such a tool must include easily identifiable and readily quantifiable characteristics like costs, times, and numbers of errors. Mathematical model incorporating these and other factors of dynamics of software-development process implemented in the Software Life Cycle Simulator (SLICS) computer program. Simulates dynamics of software-development process. In combination with input and output expert software systems and knowledge-based management software system, develops information for use in managing large software-development project. Intended to aid managers in planning, managing, and controlling software-development processes by reducing uncertainties in budgets, required personnel, and schedules.
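
    A much-reduced sketch of the kind of error dynamics such a simulator tracks (not SLICS itself) is given below in Python; the rates are invented purely for illustration.

        # Toy rate model (not the SLICS model): errors are injected in
        # proportion to coding effort and removed in proportion to the number
        # of latent errors once testing starts.
        injection_rate = 2.5      # errors introduced per staff-day of coding (assumed)
        detection_frac = 0.04     # fraction of latent errors found per test day (assumed)
        coding_days, test_days = 120, 80
        staff = 6

        latent = 0.0
        for day in range(coding_days + test_days):
            if day < coding_days:
                latent += injection_rate * staff   # coding injects errors
            else:
                latent -= detection_frac * latent  # testing removes them
        print("estimated latent errors at release: %.0f" % latent)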

  17. Improving Process Heating System Performance v3

    SciTech Connect

    2016-04-11

    Improving Process Heating System Performance: A Sourcebook for Industry is a development of the U.S. Department of Energy (DOE) Advanced Manufacturing Office (AMO) and the Industrial Heating Equipment Association (IHEA). The AMO and IHEA undertook this project as part of a series of sourcebook publications developed by AMO on energy-consuming industrial systems and opportunities to improve their performance. Other topics in this series include compressed air systems, pumping systems, fan systems, steam systems, and motors and drives.

  18. Development of a coal-fired combustion system for industrial process heating applications. Quarterly technical progress report, January--March 1994

    SciTech Connect

    Not Available

    1994-04-30

    This advanced combustion system research program is for the development of innovative coal-fired process heaters which can be used for high temperature melting, smelting and waste vitrification processes. The process heater systems to be developed have multiple use applications; however, the Phase III research effort is being focused on the development of a process heater system to be used for producing value added vitrified glass products from boiler/incinerator ashes and industrial wastes. The primary objective of the Phase III project is to develop and integrate all the system components, from fuel through total system controls, and then test the complete system in order to evaluate its potential marketability. The past quarter began with a two-day test performed in January to determine the cause of pulsations in the batch feed system observed during pilot-scale testing of surrogate TSCA incinerator ash performed in December of 1993. Two different batch feedstocks were used during this test: flyash and cullet. The cause of the pulsations was traced to a worn part in the feeder located at the bottom of the batch feed tank. The problem was corrected by replacing the worn part with the corresponding part on the existing coal feed tank. A new feeder for the existing coal tank, which had previously been ordered as part of the new coal handling system, was procured and installed. The data from the pilot-scale tests performed on surrogate TSCA incinerator ash during December of 1993 were collected and analyzed. All of the glass produced during the test passed both the Toxicity Characteristic Leaching Procedure (TCLP) and the Product Consistency Test (PCT) by approximately two orders of magnitude.

  19. Developing a dynamic framework to examine the interplay between environmental stress, stakeholder participation processes and hydrological systems

    NASA Astrophysics Data System (ADS)

    Carr, G.; Blöschl, G.; Loucks, D. P.

    2014-09-01

    Stakeholder participation is increasingly discussed as essential for sustainable water resource management. Yet detailed understanding of the factors driving its use, the processes by which it is employed, and the outcomes or achievements it can realise remains highly limited, and often contested. This understanding is essential to enable water policy to be shaped for efficient and effective water management. This research proposes and applies a dynamic framework that can explore in which circumstances environmental stress events, such as floods, droughts or pollution, drive changes in water governance towards a more participatory approach, and how this shapes the processes by which participation or stakeholder engagement takes place, and the subsequent water management outcomes that emerge. The framework is able to assess the extent to which environmental events in combination with favourable contextual factors (e.g. institutional support for participatory activities) lead to good participatory processes (e.g. well facilitated and representative) that then lead to good outcomes (e.g. improved ecological conditions). Through applying the framework to case studies from the literature it becomes clear that environmental stress events can stimulate participatory governance changes, when existing institutional conditions promote participatory approaches. The work also suggests that intermediary outcomes, which may be tangible (such as reaching an agreement) or non-tangible (such as developing shared knowledge and understanding among participants, or creating trust), may provide a crucial link between processes and resource management outcomes. If this relationship can be more strongly confirmed, the presence or absence of intermediary outcomes may even be used as a valuable proxy to predict future resource management outcomes.

  20. From Process Models to Decision Making: The Use of Data Mining Techniques for Developing Effective Decision Support Systems

    NASA Astrophysics Data System (ADS)

    Conrads, P. A.; Roehl, E. A.

    2010-12-01

    Natural-resource managers face the difficult problem of controlling the interactions between hydrologic and man-made systems in ways that preserve resources while optimally meeting the needs of disparate stakeholders. Finding success depends on obtaining and employing detailed scientific knowledge about the cause-effect relations that govern the physics of these hydrologic systems. This knowledge is most credible when derived from large field-based datasets that encompass the wide range of variability in the parameters of interest. The means of converting data into knowledge of the hydrologic system often involves developing computer models that predict the consequences of alternative management practices to guide resource managers towards the best path forward. Complex hydrologic systems are typically modeled using computer programs that implement traditional, generalized, physical equations, which are calibrated to match the field data as closely as possible. This type of model commonly is limited in terms of demonstrable predictive accuracy, development time, and cost. The science of data mining presents a powerful complement to physics-based models. Data mining is a relatively new science that assists in converting large databases into knowledge and is uniquely able to leverage the real-time, multivariate data now being collected for hydrologic systems. In side-by-side comparisons with state-of-the-art physics-based hydrologic models, the authors have found data-mining solutions have been substantially more accurate, less time consuming to develop, and embeddable into spreadsheets and sophisticated decision support systems (DSS), making them easy to use by regulators and stakeholders. Three data-mining applications will be presented that demonstrate how data-mining techniques can be applied to existing environmental databases to address regional concerns of long-term consequences. In each case, data were transformed into information, and ultimately, into

  1. Development and Testing of an Ultra Low Power System-On-Chip (SOC) Platform for Marine Mammal Tags and Passive Acoustic Signal Processing

    DTIC Science & Technology

    2014-09-30

    tag, called Nano-power Electronics MOdule (NEMO). The NEMO tag has the specific application goal of determining the response of deep diving whales to...flowchart. IMPACT/APPLICATIONS The goal of this project is to develop a marine mammal tag to last for several weeks. This tag will allow data...Development and Testing of an Ultra Low Power System-On-Chip (SOC) Platform for Marine Mammal Tags and Passive Acoustic Signal Processing Benton H

  2. Multipurpose Vacuum Induction Processing System

    NASA Astrophysics Data System (ADS)

    Govindaraju, M.; Kulkarni, Deepak; Balasubramanian, K.

    2012-11-01

    Multipurpose vacuum processing systems are cost effective, occupy less space, serve multiple functions under one roof, and are user friendly. A multipurpose vacuum induction system was designed, fabricated, and installed in a record time of six months at NFTDC Hyderabad. It was designed to function as (a) a vacuum induction melting/refining system for oxygen-free electronic copper and pure metals, (b) a vacuum induction melting furnace for ferrous materials, (c) a vacuum induction melting system for non-ferrous materials, (d) a large vacuum heat treatment chamber heated by resistance (with a detachable coil and hot zone), (e) a bottom-discharge vacuum induction melting system for non-ferrous materials, (f) an induction heat treatment system, and (g) a directional solidification/investment casting facility. It contains provision for future capacity addition; the attachments required to manufacture multiple shaped castings and continuous rod castings can be added whenever the need arises. The present capacity was decided on the basis of a 10-year development path: the system currently handles 1.2 tons of liquid copper, with provision for expansion to 2 tons in the future, and this capacity addition can be carried out quickly in easy steps. For easy operational maintenance and troubleshooting, the design was made in easily detachable sections. The high-vacuum system is also detachable, independent, and easily movable, which is a first of its kind in the country. Detailed design parameters, advantages, and the development history are presented in this paper.

  3. Program Development and Evaluation: A Modeling Process.

    ERIC Educational Resources Information Center

    Green, Donald W.; Corgiat, RayLene

    A model of program development and evaluation was developed at Genesee Community College, utilizing a system theory/process of deductive and inductive reasoning to ensure coherence and continuity within the program. The model links activities to specific measurable outcomes. Evaluation checks and feedback are built in at various levels so that…

  4. Software Development to Assist in the Processing and Analysis of Data Obtained Using Fiber Bragg Grating Interrogation Systems

    NASA Technical Reports Server (NTRS)

    Hicks, Rebecca

    2010-01-01

    capable of processing massive amounts of data in both real-time and post-flight settings, and to produce software segments that can be integrated to assist in the task as well. The selected software must be able to: (1) process massive amounts of data (up to 4 GB) at a speed useful in real-time settings (small fractions of a second); (2) process data in post-flight settings to allow test reproduction or further data analysis; (3) produce, or make it easier to produce, three-dimensional plots/graphs that make the data accessible to flight test engineers; and (4) be customizable so that users can apply their own processing formulas or functions and display the data in the formats they prefer. Several software programs were evaluated to determine their utility in completing the research objectives. These programs include: OriginLab, Graphis, 3D Grapher, Visualization Sciences Group (VSG) Avizo Wind, Interactive Analysis and Display System (IADS), SigmaPlot, and MATLAB.

  5. Managing Risk in Systems Development.

    ERIC Educational Resources Information Center

    DePaoli, Marilyn M.; And Others

    Stanford University's use of a risk assessment methodology to improve the management of systems development projects is discussed. After examining the concepts of hazard, peril, and risk as they relate to the system development process, three ways to assess risk are covered: size, structure, and technology. The overall objective for Stanford…

  6. Managing Risk in Systems Development.

    ERIC Educational Resources Information Center

    DePaoli, Marilyn M.; And Others

    Stanford University's use of a risk assessment methodology to improve the management of systems development projects is discussed. After examining the concepts of hazard, peril, and risk as they relate to the system development process, three ways to assess risk are covered: size, structure, and technology. The overall objective for Stanford…

  7. Phase equilibria in fullerene-containing systems as a basis for development of manufacture and application processes for nanocarbon materials

    NASA Astrophysics Data System (ADS)

    Semenov, K. N.; Charykov, N. A.; Postnov, V. N.; Sharoyko, V. V.; Murin, I. V.

    2016-01-01

    This review is the first attempt to integrate the available data on all types of phase equilibria (solubility, extraction and sorption) in systems containing light fullerenes (C60 and C70). In the case of solubility diagrams, the following types of phase equilibria are considered: individual fullerene (C60 or C70)-solvent under polythermal and polybaric conditions; C60-C70-solvent; individual fullerene-solvent(1)-solvent(2); and multicomponent systems comprising a single fullerene or an industrial mixture of fullerenes and vegetable oils, animal fats or essential oils under polythermal conditions. All published experimental data on the extraction equilibria in C60-C70-liquid phase(1)-liquid phase(2) systems are described systematically and the sorption characteristics of various materials towards light fullerenes are estimated. The possibility of applying these experimental data to the development of pre-chromatographic and chromatographic methods for the separation of fullerene mixtures and to the application of fullerenes as nanomodifiers is described. The bibliography includes 87 references.

  8. Firmware Development Improves System Efficiency

    NASA Technical Reports Server (NTRS)

    Chern, E. James; Butler, David W.

    1993-01-01

    Most manufacturing processes require physical pointwise positioning of the components or tools from one location to another. Typical mechanical systems utilize either stop-and-go or fixed feed-rate procession to accomplish the task. The first approach achieves positional accuracy but prolongs overall time and increases wear on the mechanical system. The second approach sustains the throughput but compromises positional accuracy. A computer firmware approach has been developed to optimize this pointwise mechanism by utilizing programmable interrupt controls to synchronize engineering processes on the fly. This principle has been implemented in an eddy current imaging system to demonstrate the improvement. Software programs were developed that enable a mechanical controller card to transmit interrupts to a system controller as trigger signals to initiate an eddy current data acquisition routine. The advantages are: (1) optimized manufacturing processes, (2) increased throughput of the system, (3) improved positional accuracy, and (4) reduced wear and tear on the mechanical system.

  9. Development of a software and hardware system for monitoring the air cleaning process using a cyclone-separator

    NASA Astrophysics Data System (ADS)

    Nicolaeva, B. K.; Borisov, A. P.; Zlochevskiy, V. L.

    2017-08-01

    The article is devoted to the development of a hardware-software complex for monitoring and controlling the process of air purification by means of a cyclone-separator. The hardware of this complex is built around the Arduino platform, to which pressure sensors, air velocity sensors, and dust meters are connected, allowing the main parameters of the cyclone-separator to be monitored. A frequency converter was also developed to regulate the rotation speed of the asynchronous motor and thereby adjust the flow rate; its control signals come from the Arduino. The software part of the complex is written as a web application in the JavaScript programming language, with CSS and HTML for the user interface. The program receives data from the sensors, plots the measured dependencies in real time, and controls the rotation speed of the asynchronous electric drive. The conducted experiment shows that the cleaning efficiency is 95-99.9%, while the airflow at the cyclone inlet is 16-18 m/s and at the exit 50-70 m/s.
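
    On the host side, the sensor readings streamed by the Arduino could be consumed over a serial link before being plotted by the web application. The Python sketch below is an assumption-laden illustration (the port name and the comma-separated line format are not specified in the article) using the pyserial package.

        import serial   # pyserial; assumed line format "pressure,velocity,dust\n"

        PORT = "/dev/ttyUSB0"      # hypothetical port; adjust for the actual setup

        with serial.Serial(PORT, baudrate=9600, timeout=1.0) as link:
            for _ in range(100):                       # read a short burst of samples
                line = link.readline().decode("ascii", errors="ignore").strip()
                if not line:
                    continue
                try:
                    pressure, velocity, dust = (float(v) for v in line.split(","))
                except ValueError:
                    continue                           # skip malformed frames
                print(f"P={pressure:7.1f} Pa  v={velocity:5.1f} m/s  dust={dust:6.1f} mg/m3")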

  10. NORSAR Detection Processing System.

    DTIC Science & Technology

    1987-05-31

    systems have been reliable. NTA/Lillestrom and Hamar will take a new initiative in mid-April regarding 04C. The line will be remeasured and if a certain...estimate of the ambient noise level at the site of the FINESA array, ground motion spectra were calculated for four time intervals. Two intervals were

  11. Challenges and opportunities for policy decisions to address health equity in developing health systems: case study of the policy processes in the Indian state of Orissa

    PubMed Central

    2011-01-01

    Introduction Achieving health equity is a pertinent need of the developing health systems. Though policy process is crucial for planning and attaining health equity, the existing evidences on policy processes are scanty in this regard. This article explores the magnitude, determinants, challenges and prospects of 'health equity approach' in various health policy processes in the Indian State of Orissa - a setting comparable with many other developing health systems. Methods A case-study involving 'Walt-Gilson Policy Triangle' employed key-informant interviews and documentary reviews. Key informants (n = 34) were selected from the departments of Health and Family Welfare, Rural Development, and Women and Child Welfare, and civil societies. The documentary reviews involved various published and unpublished reports, policy pronouncements and articles on health equity in Orissa and similar settings. Results The 'health policy agenda' of Orissa was centered on 'health equity' envisaging affordable and equitable healthcare to all, integrated with public health interventions. However, the subsequent stages of policy process such as 'development, implementation and evaluation' experienced leakage in the equity approach. The impediment for a comprehensive approach towards health equity was the nexus among the national and state health priorities; role, agenda and capacity of actors involved; and existing constraints of the healthcare delivery system. Conclusion The health equity approach of policy processes was incomprehensive, often inadequately coordinated, and largely ignored the right blend of socio-medical determinants. A multi-sectoral, unified and integrated approach is required with technical, financial and managerial resources from different actors for a comprehensive 'health equity approach'. If carefully geared, the ongoing health sector reforms centered on sector-wide approaches, decentralization, communitization and involvement of non-state actors can

  12. A Flash-ADC data acquisition system developed for a drift chamber array and a digital filter algorithm for signal processing

    NASA Astrophysics Data System (ADS)

    Yi, Han; Lü, Li-Ming; Zhang, Zhao; Cheng, Wen-Jing; Ji, Wei; Huang, Yan; Zhang, Yan; Li, Hong-Jie; Cui, Yin-Ping; Lin, Ming; Wang, Yi-Jie; Duan, Li-Min; Hu, Rong-Jiang; Xiao, Zhi-Gang

    2016-11-01

    A Flash-ADC data acquisition (DAQ) system has been developed for the drift chamber array designed for the External-Target-Experiment at the Cooling Storage Ring at the Heavy Ion Research Facility, Lanzhou. A simplified readout electronics system has been built from the Flash-ADC modules, and the whole waveform in the sampling window is recorded, from which the time and energy information can be deduced by offline processing. A digital filter algorithm has been developed to discriminate noise from the useful signal. With the digital filtering process, the signal-to-noise ratio (SNR) is increased and better time and energy resolution can be obtained. Supported by National Basic Research Program of China (973) (2015CB856903 and 2014CB845405), partly by National Science Foundation of China (U1332207 and 11375094), and by Tsinghua University Initiative Scientific Research Program.
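
    The published digital filter is not specified in this abstract, so the Python sketch below stands in for it with a simple moving-average FIR filter applied to a synthetic Flash-ADC waveform, followed by a threshold-based time pick and a pulse integral as a crude energy estimate.

        import numpy as np

        def moving_average(waveform, width=8):
            """Simple FIR smoothing filter used here as a stand-in for the
            digital filter that suppresses high-frequency noise."""
            kernel = np.ones(width) / width
            return np.convolve(waveform, kernel, mode="same")

        # Synthetic Flash-ADC trace: baseline noise plus one pulse.
        rng = np.random.default_rng(2)
        t = np.arange(1024)
        trace = rng.normal(scale=5.0, size=t.size)
        trace += 80.0 * np.exp(-0.5 * ((t - 400) / 12.0) ** 2)

        smoothed = moving_average(trace)
        baseline = smoothed[:100].mean()
        signal = smoothed - baseline

        # Time from the threshold crossing, "energy" from the pulse integral.
        above = np.flatnonzero(signal > 20.0)
        arrival = above[0] if above.size else None
        energy = signal[above].sum()
        print("arrival sample:", arrival, " pulse integral:", round(float(energy), 1))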

  13. Development of a user-friendly system for image processing of electron microscopy by integrating a web browser and PIONE with Eos.

    PubMed

    Tsukamoto, Takafumi; Yasunaga, Takuo

    2014-11-01

    Eos (Extensible object-oriented system) is one of the powerful applications for image processing of electron micrographs. Ordinarily, Eos works only through character user interfaces (CUI) under operating systems (OS) such as OS X or Linux, which is not user-friendly. Users of Eos therefore need to be expert at image processing of electron micrographs and also to have some knowledge of computer science, yet not everyone who requires Eos is comfortable with a CUI. We therefore extended Eos into an OS-independent web system with a graphical user interface (GUI) by integrating a web browser. The advantage of using a web browser is not only that Eos gains a GUI, but also that Eos can operate in a distributed computational environment. Using Ajax (Asynchronous JavaScript and XML) technology, we implemented a more comfortable user interface in the web browser. Eos has more than 400 commands related to image processing for electron microscopy, and the usage of each command differs from the others. Since the beginning of development, Eos has managed its user interface through an interface definition file, "OptionControlFile", written in CSV (comma-separated values) format; each command has an OptionControlFile that records the information needed to generate its interface and usage. The developed GUI system, called "Zephyr" (Zone for Easy Processing of HYpermedia Resources), also accesses the OptionControlFile and produces a web user interface automatically, because this mechanism is mature and convenient. The basic actions of the client-side system were implemented properly and support auto-generation of web forms, with functions for execution, image preview, and file uploading to a web server. Thus the system can execute Eos commands, each with its own unique options, and carry out image analysis. Problems remain concerning the image file format for visualization and the workspace for analysis: the image file format information is useful to check whether the input/output file is correct and we also
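
    The exact layout of an OptionControlFile is not given in this abstract, so the Python sketch below assumes a simple flag/description/type/default CSV layout purely to illustrate the idea of auto-generating a web-form specification from such a definition file.

        import csv
        import io
        import json

        # Assumed (not documented here) CSV layout: flag, description, type, default
        OPTION_CONTROL = """\
        -i,input image file,filename,
        -o,output image file,filename,
        -bin,binning factor,int,2
        -sigma,gaussian width,float,1.5
        """

        def form_spec_from_option_control(text):
            """Turn an OptionControlFile-like CSV into a list of form fields
            that a web front end could render as inputs."""
            fields = []
            for row in csv.reader(io.StringIO(text)):
                flag, desc, kind, default = (item.strip() for item in row)
                fields.append({
                    "name": flag.lstrip("-"),
                    "label": desc,
                    "widget": "file" if kind == "filename" else "number",
                    "default": default or None,
                })
            return fields

        print(json.dumps(form_spec_from_option_control(OPTION_CONTROL), indent=2))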

  14. Digital TV processing system

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Two digital video data compression systems directly applicable to the Space Shuttle TV Communication System were described: (1) For the uplink, a low-rate monochrome data compressor is used. The compression is achieved by using a motion detection technique in the Hadamard domain. To transform the variable source rate into a fixed rate, an adaptive rate buffer is provided. (2) For the downlink, a color data compressor is considered. The compression is achieved first by intra-color transformation of the original signal vector into a vector which has lower information entropy. Then two-dimensional data compression techniques are applied to the Hadamard-transformed components of this last vector. Mathematical models and data reliability analyses were also provided for the above video data compression techniques transmitted over a channel-coded Gaussian channel. It was shown that substantial gains can be achieved by the combination of video source and channel coding.
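
    Both compressors operate in the Hadamard domain. As a minimal, hedged illustration (not the flight algorithms), the Python sketch below applies a 2-D Hadamard transform to an 8x8 block with SciPy and keeps only the largest coefficients as a crude form of compression.

        import numpy as np
        from scipy.linalg import hadamard

        N = 8
        H = hadamard(N) / np.sqrt(N)          # orthonormal Hadamard matrix

        # Toy 8x8 image block with a smooth two-dimensional gradient.
        block = np.add.outer(np.arange(N), np.arange(N)) * 8.0

        coeffs = H @ block @ H.T              # 2-D Hadamard transform
        # Keep only the 8 largest-magnitude coefficients (crude "compression").
        keep = np.abs(coeffs) >= np.sort(np.abs(coeffs), axis=None)[-8]
        reconstructed = H.T @ (coeffs * keep) @ H

        print("max reconstruction error:", float(np.abs(block - reconstructed).max()))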

  15. Secure Reliable Processing Systems

    DTIC Science & Technology

    1981-07-01

    independent security control, i.e. when access control decisions do not depend on stored application data values. This particular case is of considerable prac...kernel supports. It is true that the values stored as access control data, the information used by the system to determine which users may access...following responsibilities: 1. assure that a given data item is stored with the correct name labelling it, 2. check the access control

  16. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

    The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software Life-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.

  17. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

    The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software Life-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.

  18. The Postnatal Development of Spinal Sensory Processing

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Maria; Jennings, Ernest

    1999-07-01

    The mechanisms by which infants and children process pain should be viewed within the context of a developing sensory nervous system. The study of the neurophysiological properties and connectivity of sensory neurons in the developing spinal cord dorsal horn of the intact postnatal rat has shed light on the way in which the newborn central nervous system analyzes cutaneous innocuous and noxious stimuli. The receptive field properties and evoked activity of newborn dorsal horn cells to single repetitive and persistent innocuous and noxious inputs are developmentally regulated and reflect the maturation of excitatory transmission within the spinal cord. These changes will have an important influence on pain processing in the postnatal period.

  19. Gravimelt Process development. Final report

    SciTech Connect

    Not Available

    1983-06-01

    This final report contains the results of a bench-scale program to continue the development of the TRW proprietary Gravimelt Process for chemically cleaning coal. This project consisted of two major efforts, a laboratory study aimed at identifying parameters which would influence the operation of a bench unit for desulfurization and demineralization of coal and the design, construction and operation of two types of continuous plug-flow type bench-scale fused caustic leachers. This present bench scale project has demonstrated modes for the continuous operation of fused caustic leaching of coal at coal throughputs of 1 to 5 pounds per hour. The remaining process unit operations of leach solutions regeneration and coal washing and filtration should be tested at bench scale together with fused caustic leaching of coal to demonstrate the complete Gravimelt Process. 22 figures, 11 tables.

  20. AVIRIS ground data-processing system

    NASA Technical Reports Server (NTRS)

    Reimer, John H.; Heyada, Jan R.; Carpenter, Steve C.; Deich, William T. S.; Lee, Meemong

    1987-01-01

    The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) has been under development at JPL for the past four years. During this time, a dedicated ground data-processing system has been designed and implemented to store and process the large amounts of data expected. This paper reviews the objectives of this ground data-processing system and describes the hardware. An outline of the data flow through the system is given, and the software and incorporated algorithms developed specifically for the systematic processing of AVIRIS data are described.

  1. Processes and process development in Taiwan

    NASA Technical Reports Server (NTRS)

    Hwang, H. L.

    1986-01-01

    Silicon material research in the Republic of China (ROC) parallels its development in the electronic industry. A brief outline of the historical development in ROC silicon material research is given. Emphasis is placed on the recent Silane Project managed by the National Science Council, ROC, including project objectives, task forces, and recent accomplishments. An introduction is also given to industrialization of the key technologies developed in this project.

  2. Autism and the development of face processing.

    PubMed

    Golarai, Golijeh; Grill-Spector, Kalanit; Reiss, Allan L

    2006-10-01

    Autism is a pervasive developmental condition, characterized by impairments in non-verbal communication, social relationships and stereotypical patterns of behavior. A large body of evidence suggests that several aspects of face processing are impaired in autism, including anomalies in gaze processing, memory for facial identity and recognition of facial expressions of emotion. In search of neural markers of anomalous face processing in autism, much interest has focused on a network of brain regions that are implicated in social cognition and face processing. In this review, we will focus on three such regions, namely the superior temporal sulcus (STS) for its role in processing gaze and facial movements, the fusiform face area (FFA) in face detection and identification, and the amygdala in processing facial expressions of emotion. Much evidence suggests that a better understanding of the normal development of these specialized regions is essential for discovering the neural bases of face processing anomalies in autism. Thus, we will also examine the available literature on the normal development of face processing. Key unknowns in this research area are the neurodevelopmental processes, the role of experience, and the interactions among components of the face processing system that shape each of the specialized regions for processing faces during normal development and in autism.

  3. Software Development Standard Processes (SDSP)

    NASA Technical Reports Server (NTRS)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.; Crean, Kathleen A.; Rinker, George C.; Smith, Thomas P.; Lum, Karen T.; Hanna, Robert A.; Erickson, Daniel E.; Gamble, Edward B., Jr.; Morgan, Scott C.; Kelsay, Michael G.; Newport, Brian J.; Lewicki, Scott A.; Stipanuk, Jeane G.; Cooper, Tonja M.; Meshkat, Leila

    2011-01-01

    A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  4. Process gas solidification system

    DOEpatents

    Fort, William G. S.; Lee, Jr., William W.

    1978-01-01

    It has been the practice to (a) withdraw hot, liquid UF₆ from various systems, (b) direct the UF₆ into storage cylinders, and (c) transport the filled cylinders to another area where the UF₆ is permitted to solidify by natural cooling. However, some hazard attends the movement of cylinders containing liquid UF₆, which is dense, toxic, and corrosive. As illustrated in terms of one of its applications, the invention is directed to withdrawing hot liquid UF₆ from a system including (a) a compressor for increasing the pressure and temperature of a stream of gaseous UF₆ to above its triple point and (b) a condenser for liquefying the compressed gas. A network containing block valves and at least first and second portable storage cylinders is connected between the outlet of the condenser and the suction inlet of the compressor. After an increment of liquid UF₆ from the condenser has been admitted to the first cylinder, the cylinder is connected to the suction of the compressor to flash off UF₆ from the cylinder, thus gradually solidifying UF₆ therein. While the first cylinder is being cooled in this manner, an increment of liquid UF₆ from the condenser is transferred into the second cylinder. UF₆ then is flashed from the second cylinder while another increment of liquid UF₆ is being fed to the first. The operations are repeated until both cylinders are filled with solid UF₆, after which they can be moved safely. As compared with the previous technique, this procedure is safer, faster, and more economical. The method also provides the additional advantage of removing volatile impurities from the UF₆ while it is being cooled.
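
    The alternating fill-and-flash sequence described in this abstract amounts to simple scheduling logic. The sketch below is a schematic illustration of that alternation only; cylinder capacity, fill increments, and the flash fraction are made-up numbers, not values from the patent.

      # Schematic sketch of the alternating two-cylinder sequence: while one
      # cylinder receives an increment of liquid UF6 from the condenser, the
      # other is vented to the compressor suction so flash evaporation cools
      # and solidifies its contents. All quantities are hypothetical.

      CAPACITY = 100.0      # kg of solid UF6 a cylinder can hold (assumed)
      INCREMENT = 10.0      # kg of liquid admitted per fill step (assumed)
      FLASH_LOSS = 0.1      # fraction flashed back to the compressor per step

      def fill_cycle():
          cylinders = [{"liquid": 0.0, "solid": 0.0} for _ in range(2)]
          filling = 0                                   # index of cylinder being filled
          step = 0
          while any(c["solid"] < CAPACITY for c in cylinders):
              fill, cool = cylinders[filling], cylinders[1 - filling]
              if fill["solid"] + fill["liquid"] < CAPACITY:
                  fill["liquid"] += INCREMENT           # admit liquid from condenser
              flashed = FLASH_LOSS * cool["liquid"]     # vapor drawn off by compressor
              cool["solid"] += cool["liquid"] - flashed # remainder solidifies as it cools
              cool["liquid"] = 0.0
              filling = 1 - filling                     # swap roles each step
              step += 1
          return step, cylinders

      if __name__ == "__main__":
          steps, cyls = fill_cycle()
          print(f"both cylinders solidified after {steps} steps: {cyls}")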

  5. Expert system for the plasma spray process

    SciTech Connect

    Wang, H.; Petrone, S.

    1994-12-31

    The plasma spray process, like other thermal spray processes, has few on-line monitoring sensors and many process variables which cannot be easily and precisely formulated. This provides an opportunity for improving and controlling the process through artificial intelligence. An expert system has been constructed for selecting plasma spray parameters in the development of new coatings. The expert system is based on operator experience and heuristics, using symbolic reasoning coupled with numerical calculations. For less experienced users, the system can assist in solving process problems.

  6. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes is attracting increasing attention because it can ensure safety and provide products of consistent quality. The design of a simulation system for batch-process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system can provide an effective platform for research on batch-process fault diagnosis.

  7. Development of a system for treatment of coconut industry wastewater using electrochemical processes followed by Fenton reaction.

    PubMed

    Gomes, Lúcio de Moura; Duarte, José Leandro da Silva; Pereira, Nathalia Marcelino; Martínez-Huitle, Carlos A; Tonholo, Josealdo; Zanta, Carmen Lúcia de Paiva E Silva

    2014-01-01

    The coconut processing industry generates a significant amount of liquid waste. New technologies targeting the treatment of industrial effluents have emerged, including advanced oxidation processes, the Fenton reaction, and electrochemical processes, which produce strong oxidizing species to remove organic matter. In this study we combined the Fenton reaction and electrochemical process to treat wastewater generated by the coconut industry. We prepared a synthetic wastewater consisting of a mixture of coconut milk and water and assessed how the Fenton reagents' concentration, the cathode material, the current density, and the implementation of associated technologies affect its treatment. Electrochemical treatment followed by the Fenton reaction diminished turbidity and chemical oxygen demand (COD) by 85 and 95%, respectively. The Fenton reaction followed by the electrochemical process reduced turbidity and COD by 93 and 85%, respectively. Therefore, a combination of the Fenton and electrochemical technologies can effectively treat the effluent from the coconut processing industry.

  8. POWER SYSTEMS DEVELOPMENT FACILITY

    SciTech Connect

    Unknown

    2002-11-01

    This report discusses test campaign GCT4 of the Kellogg Brown & Root, Inc. (KBR) transport reactor train with a Siemens Westinghouse Power Corporation (Siemens Westinghouse) particle filter system at the Power Systems Development Facility (PSDF) located in Wilsonville, Alabama. The transport reactor is an advanced circulating fluidized-bed reactor designed to operate as either a combustor or a gasifier using one of two possible particulate control devices (PCDs). The transport reactor was operated as a pressurized gasifier during GCT4. GCT4 was planned as a 250-hour test run to continue characterization of the transport reactor using a blend of several Powder River Basin (PRB) coals and Bucyrus limestone from Ohio. The primary test objectives were: Operational Stability--Characterize reactor loop and PCD operations with short-term tests by varying coal-feed rate, air/coal ratio, riser velocity, solids-circulation rate, system pressure, and air distribution. Secondary objectives included the following: Reactor Operations--Study the devolatilization and tar cracking effects from transient conditions during transition from start-up burner to coal. Evaluate the effect of process operations on heat release, heat transfer, and accelerated fuel particle heat-up rates. Study the effect of changes in reactor conditions on transient temperature profiles, pressure balance, and product gas composition. Effects of Reactor Conditions on Synthesis Gas Composition--Evaluate the effect of air distribution, steam/coal ratio, solids-circulation rate, and reactor temperature on CO/CO₂ ratio, synthesis gas Lower Heating Value (LHV), carbon conversion, and cold and hot gas efficiencies. Research Triangle Institute (RTI) Direct Sulfur Recovery Process (DSRP) Testing--Provide syngas in support of the DSRP commissioning. Loop Seal Operations--Optimize loop seal operations and investigate increases to previously achieved maximum solids-circulation rate.
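
    The synthesis gas metrics cited above (CO/CO₂ ratio, LHV, cold and hot gas efficiencies) follow from straightforward heating-value arithmetic on a gas analysis. The sketch below shows a cold-gas-efficiency calculation; the gas composition, flows, and coal heating value are hypothetical, and the component heating values are rounded literature figures.

      # Sketch: compute syngas lower heating value (LHV), CO/CO2 ratio, and
      # cold gas efficiency from a dry gas analysis. Composition, flows, and
      # the coal heating value are hypothetical; component LHVs are rounded
      # literature values (kJ per mol of gas burned).

      LHV_KJ_PER_MOL = {"CO": 283.0, "H2": 241.8, "CH4": 802.3}

      def syngas_metrics(mole_frac, gas_flow_mol_s, coal_feed_kg_s, coal_lhv_kj_kg):
          lhv_gas = sum(mole_frac.get(g, 0.0) * h for g, h in LHV_KJ_PER_MOL.items())
          co_co2 = mole_frac["CO"] / mole_frac["CO2"]
          chemical_energy_out = lhv_gas * gas_flow_mol_s          # kW
          chemical_energy_in = coal_lhv_kj_kg * coal_feed_kg_s    # kW
          cold_gas_eff = chemical_energy_out / chemical_energy_in
          return lhv_gas, co_co2, cold_gas_eff

      if __name__ == "__main__":
          frac = {"CO": 0.10, "H2": 0.09, "CH4": 0.02, "CO2": 0.09, "N2": 0.70}
          lhv, ratio, cge = syngas_metrics(frac, gas_flow_mol_s=150.0,
                                           coal_feed_kg_s=1.0, coal_lhv_kj_kg=20000.0)
          print(f"LHV = {lhv:.0f} kJ/mol, CO/CO2 = {ratio:.2f}, cold gas eff = {cge:.1%}")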

  9. Precision Pointing System Development

    SciTech Connect

    BUGOS, ROBERT M.

    2003-03-01

    The development of precision pointing systems has been underway in Sandia's Electronic Systems Center for over thirty years. Important areas of emphasis are synthetic aperture radars and optical reconnaissance systems. Most applications are in the aerospace arena, with host vehicles including rockets, satellites, and manned and unmanned aircraft. Systems have been used on defense-related missions throughout the world. Presently in development are pointing systems with accuracy goals in the nanoradian regime. Future activity will include efforts to dramatically reduce system size and weight through measures such as the incorporation of advanced materials and MEMS inertial sensors.

  10. Developing Software Requirements for a Knowledge Management System That Coordinates Training Programs with Business Processes and Policies in Large Organizations

    ERIC Educational Resources Information Center

    Kiper, J. Richard

    2013-01-01

    For large organizations, updating instructional programs presents a challenge to keep abreast of constantly changing business processes and policies. Each time a process or policy changes, significant resources are required to locate and modify the training materials that convey the new content. Moreover, without the ability to track learning…

  12. Development of mild gasification process

    SciTech Connect

    Chu, C.I.C.; Gillespie, B.L.

    1988-02-01

    Under a previous contract with Morgantown Energy Technology Center (METC), Department of Energy (DOE) Contract No. DE-AC21-84MC21108, UCC Research Corporation (UCCRC) built and tested a 1500 lb/day Mild Gasification Process Development Unit (MGU). The MGU, as tested under the previous contract, is shown in Figure 1. Testing completed under the previous contract showed that good quality hydrocarbon liquids and good quality char can be produced in the MGU. However, the MGU is not optimized. The primary objectives of the current project are to optimize the MGU and determine the suitability of char for several commercial applications. The program consists of four tasks: Task 1-Test Plan; Task 2-Optimization of Mild Gasification Process; Task 3-Evaluation of Char and Char/Coal Blends as a Boiler/Blast Furnace Fuel; and Task 4-Analysis of Data and Preparation of Final Report. Task 1 has been completed while work continued on Task 2.

  13. Development of mild gasification process

    SciTech Connect

    Chu, C.I.C.; Gillespie, B.L.

    1987-11-01

    Under a previous contract with Morgantown Energy Technology Center (METC), Department of Energy (DOE) Contract No. AC21-84MC21108, UCC Research Corporation (UCCRC) built and tested a 1500 lb/day Mild Gasification Process Development Unit (MGU). The MGU, as tested under the previous contract, is shown in Figure 1. Testing completed under the previous contract showed that good quality hydrocarbon liquids and good quality char can be produced in the MGU. However, the MGU is not optimized. The primary objectives of the current project are to optimize the MGU and determine the suitability of char for several commercial applications. The program consists of four tasks: Task 1 -- Test Plan; Task 2 -- Optimization of Mild Gasification Process; Task 3 -- Evaluation of Char and Char/Coal Blends as a Boiler/Blast Furnace Fuel; and Task 4 -- Analysis of Data and Preparation of Final Report. Task 1 has been completed while work continued on Task 2.

  14. Development of mild gasification process

    SciTech Connect

    Chu, C.I.C.; Derting, T.M.

    1988-07-01

    Under a previous contract with Morgantown Energy Technology Center (METC), Department of Energy (DOE) Contract No. AC21-84MC21108, UCC Research Corporation (UCCRC) built and tested a 1500 lb/day Mild Gasification Process Development Unit (MGU). The MGU, as tested under the previous contract, is shown in Figure 1. Testing completed under the previous contract showed that good quality hydrocarbon liquids and good quality char can be produced in the MGU. However, the MGU is not optimized. The primary objectives of the current project are to optimize the MGU and determine the suitability of char for several commercial applications. The program consists of four tasks: Task 1 -- Test Plan; Task 2 -- Optimization of Mild Gasification Process; Task 3 -- Evaluation of Char and Char/Coal Blends as a Boiler/Blast Furnace Fuel; and Task 4 -- Analysis of Data and Preparation of Final Report. Task 1 has been completed while work continued on Task 2.

  15. Development of mild gasification process

    SciTech Connect

    Chu, C.I.C.; Williams, S.W.

    1989-01-01

    Under a previous contract with Morgantown Energy Technology Center (METC), Department of Energy (DOE) Contract No. AC21-84MC21108, UCC Research Corporation (UCCRC) built and tested a 1500 lb/day Mild Gasification Process Development Unit (MGU). The MGU, as tested under the previous contract, is shown in Figure 1. Testing completed under the previous contract showed that good quality hydrocarbon liquids and good quality char can be produced in the MGU. However, the MGU is not optimized. The primary objectives of the current project are to optimize the MGU and determine the suitability of char for several commercial applications. The program consists of four tasks: Task 1 -- Test Plan; Task 2 -- Optimization of Mild Gasification Process; Task 3 -- Evaluation of Char and Char/Coal Blends as a Boiler/Blast Furnace Fuel; and Task 4 -- Analysis of Data and Preparation of Final Report. Task 1 has been completed while work continued on Task 2.

  16. Development process of automotive microsensors

    NASA Astrophysics Data System (ADS)

    Tang, William C.

    1995-05-01

    The phased product development approach can be applied advantageously to develop and manufacture automotive microsensors. The phased approach involves a multifunctional team from innovation to development to eventual production and maintenance phases. The key advantage of this approach is the shortened development cycle and fast product introduction, while minimizing waste of resources and lowering risk of product failure. When applied to the product cycles of automotive sensors based on micromachining technology, this approach elucidates several critical considerations. In particular, since industrial application of micromachining technology is still at the infant stage, standards and design rules are not firmly established. Therefore, several important activities must be initiated simultaneously from the start of the innovation phase, which proves to be crucial to the prudent decision of technology alternatives and sensor system configuration. The use of a multifunctional team, as mandated in the phased approach, enables coherent development and optimization of the sense element, the fabrication technology, the packaging approach, the interface circuit configuration, and design features that allow efficient test and assembly flow. Also, with intermediate milestones within each phase, risk assessment and necessary midcourse adjustment to technology trade-offs can be both timely and accurate. Accelerometers, one of the most developed micromachined sensors, serve as representative examples that illustrate how the phased approach can benefit the commercialization of the newly established and rapidly expanding field of micromechanics.

  17. Development of an Integrated Multi-Contaminant Removal Process Applied to Warm Syngas Cleanup for Coal-Based Advanced Gasification Systems

    SciTech Connect

    Meyer, Howard

    2010-11-30

    This project met the objective to further the development of an integrated multi-contaminant removal process in which H2S, NH3, HCl and heavy metals including Hg, As, Se and Cd present in the coal-derived syngas can be removed to specified levels in a single/integrated process step. The process supports the mission and goals of the Department of Energy's Gasification Technologies Program, namely to enhance the performance of gasification systems, thus enabling U.S. industry to improve the competitiveness of gasification-based processes. The gasification program will reduce equipment costs, improve process environmental performance, and increase process reliability and flexibility. Two sulfur conversion concepts were tested in the laboratory under this project, i.e., the solvent-based, high-pressure University of California Sulfur Recovery Process High Pressure (UCSRP-HP) and the catalytic-based, direct oxidation (DO) section of the CrystaSulf-DO process. Each process required a polishing unit to meet the ultra-clean sulfur content goals of <50 ppbv (parts per billion by volume) as may be necessary for fuel cells or chemical production applications. UCSRP-HP was also tested for the removal of trace, non-sulfur contaminants, including ammonia, hydrogen chloride, and heavy metals. A bench-scale unit was commissioned and limited testing was performed with simulated syngas. Aspen-Plus®-based computer simulation models were prepared and the economics of the UCSRP-HP and CrystaSulf-DO processes were evaluated for a nominal 500 MWe, coal-based, IGCC power plant with carbon capture. This report covers the progress on the UCSRP-HP technology development and the CrystaSulf-DO technology.

  18. Development of a parallel detection and processing system using a multidetector array for wave field restoration in scanning transmission electron microscopy.

    PubMed

    Taya, Masaki; Matsutani, Takaomi; Ikuta, Takashi; Saito, Hidekazu; Ogai, Keiko; Harada, Yoshihito; Tanaka, Takeo; Takai, Yoshizo

    2007-08-01

    A parallel image detection and image processing system for scanning transmission electron microscopy was developed using a multidetector array consisting of a multianode photomultiplier tube arranged in an 8 x 8 square array. The system enables 64 images to be taken simultaneously from different scattering directions with a scanning time of 2.6 s. Using the 64 images, phase and amplitude contrast images of gold particles on an amorphous carbon thin film could be separately reconstructed by applying respective 8-shaped bandpass Fourier filters to each image and multiplying by the phase and amplitude reconstructing factors.
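
    The reconstruction described above rests on Fourier-domain filtering of the individual detector images followed by a weighted sum. The sketch below shows that generic pattern only: a plain annular band-pass mask stands in for the authors' 8-shaped filters, and the complex weights are made up rather than the paper's reconstructing factors.

      # Generic sketch of Fourier-filtered recombination of multi-detector STEM
      # images: each detector image is band-pass filtered in the Fourier domain
      # and the filtered images are summed with per-detector complex weights.
      # The annular mask and the weights here are placeholders, not the paper's
      # 8-shaped filters or reconstruction factors.

      import numpy as np

      def bandpass_mask(shape, r_low, r_high):
          ny, nx = shape
          fy = np.fft.fftfreq(ny)[:, None]
          fx = np.fft.fftfreq(nx)[None, :]
          r = np.sqrt(fx**2 + fy**2)
          return (r >= r_low) & (r <= r_high)

      def reconstruct(images, weights, r_low=0.02, r_high=0.25):
          mask = bandpass_mask(images[0].shape, r_low, r_high)
          acc = np.zeros(images[0].shape, dtype=complex)
          for img, w in zip(images, weights):
              acc += w * np.fft.ifft2(np.fft.fft2(img) * mask)
          return np.angle(acc), np.abs(acc)     # phase-like and amplitude-like maps

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          detector_images = [rng.normal(size=(64, 64)) for _ in range(64)]   # stand-ins
          weights = np.exp(2j * np.pi * np.arange(64) / 64)                  # made-up factors
          phase, amplitude = reconstruct(detector_images, weights)
          print(phase.shape, amplitude.shape)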

  19. Development of a parallel detection and processing system using a multidetector array for wave field restoration in scanning transmission electron microscopy

    SciTech Connect

    Taya, Masaki; Matsutani, Takaomi; Ikuta, Takashi; Saito, Hidekazu; Ogai, Keiko; Harada, Yoshihito; Tanaka, Takeo; Takai, Yoshizo

    2007-08-15

    A parallel image detection and image processing system for scanning transmission electron microscopy was developed using a multidetector array consisting of a multianode photomultiplier tube arranged in an 8x8 square array. The system enables 64 images to be taken simultaneously from different scattering directions with a scanning time of 2.6 s. Using the 64 images, phase and amplitude contrast images of gold particles on an amorphous carbon thin film could be separately reconstructed by applying respective 8-shaped bandpass Fourier filters to each image and multiplying by the phase and amplitude reconstructing factors.

  20. The Process of Systemic Change

    ERIC Educational Resources Information Center

    Duffy, Francis M.; Reigeluth, Charles M.; Solomon, Monica; Caine, Geoffrey; Carr-Chellman, Alison A.; Almeida, Luis; Frick, Theodore; Thompson, Kenneth; Koh, Joyce; Ryan, Christopher D.; DeMars, Shane

    2006-01-01

    This paper presents several brief papers about the process of systemic change. These are: (1) Step-Up-To-Excellence: A Protocol for Navigating Whole-System Change in School Districts by Francis M. Duffy; (2) The Guidance System for Transforming Education by Charles M. Reigeluth; (3) The Schlechty Center For Leadership In School Reform by Monica…

  2. Development of a coal-fired combustion system for industrial process heating applications. Phase 3 final report, November 1992--December 1994

    SciTech Connect

    1995-09-26

    A three phase research and development program has resulted in the development and commercialization of a Cyclone Melting System (CMS{trademark}), capable of being fueled by pulverized coal, natural gas, and other solid, gaseous, or liquid fuels, for the vitrification of industrial wastes. The Phase 3 research effort focused on the development of a process heater system to be used for producing value added glass products from the vitrification of boiler/incinerator ashes and industrial wastes. The primary objective of the Phase 3 project was to develop and integrate all the system components, from fuel through total system controls, and then test the complete system in order to evaluate its potential for successful commercialization. The demonstration test consisted of one test run with a duration of 105 hours, approximately one-half (46 hours) performed with coal as the primary fuel source (70% to 100%), the other half with natural gas. Approximately 50 hours of melting operation were performed vitrifying approximately 50,000 lbs of coal-fired utility boiler flyash/dolomite mixture, producing a fully-reacted vitrified product.

  3. Development of a coal-fired combustion system for industrial processing heating applications: Appendix A. Phase 3 final report, November 1992--December 1994

    SciTech Connect

    1995-09-26

    A three phase research and development program has resulted in the development and commercialization of a Cyclone Melting System (CMS{trademark}), capable of being fueled by pulverized coal, natural gas, and other solid, gaseous, or liquid fuels, for the vitrification of industrial wastes. The Phase 3 research effort focused on the development of a process heater system to be used for producing value added glass products from the vitrification of boiler/incinerator ashes and industrial wastes. The primary objective of the Phase 3 project was to develop and integrate all the system components, from fuel through total system controls, and then test the complete system in order to evaluate its potential for successful commercialization. The demonstration test consisted of one test run with a duration of 105 hours, approximately one-half (46 hours) performed with coal as the primary fuel source (70% to 100%), the other half with natural gas. Approximately 50 hours of melting operation were performed vitrifying approximately 50,000 lbs of coal-fired utility boiler flyash/dolomite mixture, producing a fully-reacted vitrified product. Appendix A contains 89 figures containing the data from the demonstration tests undertaken under Phase 3.

  4. A Study to Develop an Industrial-Scale, Computer-Controlled High Magnetic Field Processing (HMFP) System to Assist in Commercializing the Novel, Enabling HMFP Manufacturing Technology

    SciTech Connect

    Ludtka, Gail Mackiewicz-; Chourey, Aashish

    2010-08-01

    As the original magnet designer and manufacturer of ORNL's 9T, 5-inch ID bore magnet, American Magnetics Inc. (AMI) has collaborated with ORNL's Materials Processing Group, and this partnership has been instrumental in the development of our unique thermo-magnetic facilities and expertise. Consequently, AMI and ORNL have realized that the commercial implementation of the High Magnetic Field Processing (HMFP) technology will require the evolution of robust, automated superconducting (SC) magnet systems that will be cost-effective and easy to operate in an industrial environment. The goal of this project and CRADA is to significantly expedite the timeline for implementing this revolutionary and pervasive cross-cutting technology for future US-produced industrial components. The successful completion of this project is anticipated to significantly assist in the timely commercialization and licensing of our HMFP intellectual property for a broad spectrum of industries and to open up a new market for AMI. One notable outcome of this project is that the ThermoMagnetic Processing Technology won a prestigious 2009 R&D 100 Award, which recognizes our TMP Technology as one of the top 100 innovative US technologies in 2009. By successfully establishing the design requirements for a commercial-scale magnetic processing system, this project effort has accomplished a key first step in facilitating the building and demonstration of a superconducting magnetic processing coil, enabling the transition of the High Magnetic Field Processing Technology beyond a laboratory novelty into a commercially viable and industrially scalable manufacturing technology.

  5. Process Development for Nanostructured Photovoltaics

    SciTech Connect

    Elam, Jeffrey W.

    2015-01-01

    Photovoltaic manufacturing is an emerging industry that promises a carbon-free, nearly limitless source of energy for our nation. However, the high-temperature manufacturing processes used for conventional silicon-based photovoltaics are extremely energy-intensive and expensive. This high cost imposes a critical barrier to the widespread implementation of photovoltaic technology. Argonne National Laboratory and its partners recently invented new methods for manufacturing nanostructured photovoltaic devices that allow dramatic savings in materials, process energy, and cost. These methods are based on atomic layer deposition, a thin film synthesis technique that has been commercialized for the mass production of semiconductor microelectronics. The goal of this project was to develop these low-cost fabrication methods for the high efficiency production of nanostructured photovoltaics, and to demonstrate these methods in solar cell manufacturing. We achieved this goal in two ways: 1) we demonstrated the benefits of these coatings in the laboratory by scaling-up the fabrication of low-cost dye sensitized solar cells; 2) we used our coating technology to reduce the manufacturing cost of solar cells under development by our industrial partners.

  6. Launch processing system concept to reality

    NASA Technical Reports Server (NTRS)

    Bailey, W. W.

    1985-01-01

    The Launch Processing System represents Kennedy Space Center's role in providing a major integrated hardware and software system for the test, checkout and launch of a new space vehicle. Past programs considered the active flight vehicle to ground interfaces as part of the flight systems and therefore the related ground system was provided by the Development Center. The major steps taken to transform the Launch Processing System from a concept to reality with the successful launches of the Shuttle Programs Space Transportation System are addressed.

  7. Development of a continuous roll-to-roll processing system for mass production of plastic optical film

    NASA Astrophysics Data System (ADS)

    Chang, Chih-Yuan; Tsai, Meng-Hsun

    2015-12-01

    This paper reports a highly effective method for the mass production of large-area plastic optical films with a microlens array pattern based on a continuous roll-to-roll film extrusion and roller embossing process. In this study, a thin steel mold with a micro-circular hole array pattern is fabricated by photolithography and a wet chemical etching process. The thin steel mold was then wrapped onto a metal cylinder to form an embossing roller mold. During the roll-to-roll process operation, a thermoplastic raw material (polycarbonate grains) was put into the barrel of the plastic extruder with a flat T-die. Then, the molten polymer film was extruded and immediately pressed against the surface of the embossing roller mold. Under the proper processing conditions, the molten polymer will just partially fill the micro-circular holes of the mold and due to surface tension form a convex lens surface. A continuous plastic optical film with a microlens array pattern was obtained. Experiments are carried out to investigate the effect of plastic microlens formation on the roll-to-roll process. Finally, the geometrical and optical properties of the fabricated plastic optical film were measured and proved satisfactory. This technique shows great potential for the mass production of large-area plastic optical films with a microlens array pattern.

  8. Control of Total Ownership Costs of DoD Acquisition Development Programs Through Integrated Systems Engineering Processes and Metrics

    DTIC Science & Technology

    2011-04-30

    Airborne, Maritime and Fixed Station Joint Tactical Radio System. Acquisition Research: Creating Synergy for Informed Change. Program...perspective to cost estimation and control and may be enriched by enhancing with system engineering activities that are also focused and similar areas ...Mission and requirements definition, Functional analysis, Alternative synthesis, and Evaluation, trade-off, and selection. This methodology is

  9. TECHNOLOGY DEVELOPMENT AND DEPLOYMENT OF SYSTEMS FOR THE RETRIEVAL AND PROCESSING OF REMOTE-HANDLED SLUDGE FROM HANFORD K-WEST FUEL STORAGE BASIN

    SciTech Connect

    RAYMOND RE

    2011-12-27

    In 2011, significant progress was made in developing and deploying technologies to remove, transport, and interim store remote-handled sludge from the 105-K West Fuel Storage Basin on the Hanford Site in south-central Washington State. The sludge in the 105-K West Basin is an accumulation of degraded spent nuclear fuel and other debris that collected during long-term underwater storage of the spent fuel. In 2010, an innovative, remotely operated retrieval system was used to successfully retrieve over 99.7% of the radioactive sludge from 10 submerged temporary storage containers in the K West Basin. In 2011, a full-scale prototype facility was completed for use in technology development, design qualification testing, and operator training on systems used to retrieve, transport, and store highly radioactive K Basin sludge. In this facility, three separate systems for characterizing, retrieving, pretreating, and processing remote-handled sludge were developed. Two of these systems were successfully deployed in 2011. One of these systems was used to pretreat knockout pot sludge as part of the 105-K West Basin cleanup. Knockout pot sludge contains pieces of degraded uranium fuel ranging in size from 600 µm to 6350 µm mixed with pieces of inert material, such as aluminum wire and graphite, in the same size range. The 2011 pretreatment campaign successfully removed most of the inert material from the sludge stream and significantly reduced the remaining volume of knockout pot product material. Removing the inert material significantly minimized the waste stream and reduced costs by reducing the number of transportation and storage containers. Removing the inert material also improved worker safety by reducing the number of remote-handled shipments. Also in 2011, technology development and final design were completed on the system to remove knockout pot material from the basin and transport the material to an onsite facility for interim storage. This system is

  10. Development of a portable hyperspectral imaging system for monitoring the efficacy of sanitation procedures in food processing facilities

    USDA-ARS?s Scientific Manuscript database

    Cleaning and sanitation in food processing facilities is a critical step in reducing the risk of transfer of pathogenic organisms to food consumed by the public. Current methods to check the effectiveness of sanitation procedures rely on visual observation and sub-sampling tests such as ATP biolumin...

  11. The Beady Eye of the Professional Development Appraisal System: A Foucauldian Cross-Case Analysis of the Teacher Evaluation Process

    ERIC Educational Resources Information Center

    Torres, Dalia

    2012-01-01

    The purpose of this deconstructive case study was to conduct a Foucauldian power/knowledge analysis constructed from the perceptions of three teachers at an intermediate school in South Texas regarding the role of the teacher evaluation process and its influence on instructional practices. Using Foucault's (1977a) work on power/knowledge, of…

  12. The Development of a System of Study Credits in Ukraine: The Case of Policy Layering in the Bologna Process

    ERIC Educational Resources Information Center

    Kushnir, Iryna

    2017-01-01

    The Bologna Process is an intergovernmental initiative aimed to make higher education degrees compatible in Europe. Previous research into the implementation of the Bologna objectives (or action lines) views the influence of the context as a challenge. This article suggests a different approach for analysing the implementation of the Bologna…

  13. Cascade Distillation System Development

    NASA Technical Reports Server (NTRS)

    Callahan, Michael R.; Sargushingh, Miriam; Shull, Sarah

    2014-01-01

    NASA's Advanced Exploration Systems (AES) Life Support System (LSS) Project is chartered with developing advanced life support systems that will enable NASA human exploration beyond low Earth orbit (LEO). The goal of AES is to increase the affordability of long-duration life support missions, and to reduce the risk associated with integrating and infusing new enabling technologies required to ensure mission success. Because of the robust nature of distillation systems, the AES LSS Project is pursuing development of the Cascade Distillation Subsystem (CDS) as part of its technology portfolio. Currently, the system is being developed into a flight forward Generation 2.0 design.

  14. Open source clinical portals: a model for healthcare information systems to support care processes and feed clinical research. An Italian case of design, development, reuse, and exploitation.

    PubMed

    Locatelli, Paolo; Baj, Emanuele; Restifo, Nicola; Origgi, Gianni; Bragagia, Silvia

    2011-01-01

    Open source is a still largely unexploited opportunity for healthcare organizations and technology providers to answer a growing demand for innovation and to combine economic benefits with a new way of managing hospital information systems. This chapter will present the case of the web enterprise clinical portal developed in Italy by Niguarda Hospital in Milan with the support of Fondazione Politecnico di Milano, to enable a paperless environment for clinical and administrative activities in the ward. This also represents a rare case of open source technology reuse in the healthcare sector, as the system's porting is now taking place at Besta Neurological Institute in Milan. This institute is customizing the portal to feed researchers with structured clinical data collected in its portal's patient records, so that they can be analyzed, e.g., through business intelligence tools. Both organizational and clinical advantages are investigated, from process monitoring, to semantic data structuring, to recognition of common patterns in care processes.

  15. High-throughput downstream process development for cell-based products using aqueous two-phase systems (ATPS) - A case study.

    PubMed

    Zimmermann, Sarah; Scheeder, Christian; Zimmermann, Philipp K; Bogsnes, Are; Hansson, Mattias; Staby, Arne; Hubbuch, Jürgen

    2017-02-01

    The availability of preparative-scale downstream processing strategies for cell-based products presents a critical juncture between fundamental research and clinical development. Aqueous two-phase systems (ATPS) present a gentle, scalable, label-free, and cost-effective method for cell purification, and are thus a promising tool for downstream processing of cell-based therapeutics. Here, the application of a previously developed robotic screening platform that enables high-throughput cell partitioning analysis in ATPS is reported. In the present case study, a purification strategy for two model cell lines was designed based on high-throughput screening (HTS) data and countercurrent distribution (CCD) modeling, and the CCD model was validated experimentally. The data show excellent congruence between the CCD model and the experimental results, indicating that CCD models in combination with HTS data are a powerful tool in downstream process development. Finally, the authors show that while cell cycle phase significantly influences cell partitioning, cell type specific differences in surface properties are the main driving force in charge-dependent separation of HL-60 and L929 cells. In order to design a highly robust purification process it is, however, advisable to maintain constant growth conditions.
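
    Countercurrent distribution modeling of this kind follows the classical binomial picture: after n transfers, a population that moves to the mobile (top) phase with probability p at each step spreads over tubes 0..n with binomial weights. The sketch below illustrates that CCD model; the partition ratios and tube pool are hypothetical, not the measured HL-60 or L929 values.

      # Minimal countercurrent distribution (CCD) sketch: after n transfers a
      # population whose fraction p partitions to the mobile (top) phase each
      # step is spread over tubes k = 0..n with binomial weights. Partition
      # ratios below are hypothetical, not measured HL-60 / L929 values.

      from math import comb

      def ccd_profile(n_transfers, p):
          return [comb(n_transfers, k) * p**k * (1 - p)**(n_transfers - k)
                  for k in range(n_transfers + 1)]

      def pooled_yield_purity(profile_a, profile_b, tubes):
          a = sum(profile_a[k] for k in tubes)
          b = sum(profile_b[k] for k in tubes)
          return a, a / (a + b)            # yield of A and purity of A in the pool

      if __name__ == "__main__":
          n = 30
          cells_a = ccd_profile(n, p=0.70)   # assumed partition ratio of cell type A
          cells_b = ccd_profile(n, p=0.30)   # assumed partition ratio of cell type B
          pool = range(18, n + 1)            # collect the high-numbered tubes
          y, pur = pooled_yield_purity(cells_a, cells_b, pool)
          print(f"yield of A = {y:.1%}, purity of A = {pur:.1%}")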

  16. Development of data processing, interpretation and analysis system for the remote sensing of trace atmospheric gas species

    NASA Technical Reports Server (NTRS)

    Casas, Joseph C.; Saylor, Mary S.; Kindle, Earl C.

    1987-01-01

    The major emphasis is on the advancement of remote sensing technology. In particular, the gas filter correlation radiometer (GFCR) technique was applied to the measurement of trace gas species, such as carbon monoxide (CO), from airborne and Earth orbiting platforms. Through a series of low altitude aircraft flights, high altitude aircraft flights, and orbiting space platform flights, data were collected and analyzed, culminating in the first global map of carbon monoxide concentration in the middle troposphere and stratosphere. The four major areas of this remote sensing program, known as the Measurement of Air Pollution from Satellites (MAPS) experiment, are: (1) data acquisition, (2) data processing, analysis, and interpretation algorithms, (3) data display techniques, and (4) information processing.

  17. Organizations and Information Processing: A Field Study of Research and Development Units within the United States Air Force Systems Command.

    DTIC Science & Technology

    1984-08-01

    uncertainty (Downey, Hellriegel and Slocum, 1975). Downey and Slocum (1975) conceptualize uncertainty as a psychological state in which the sources of...Child, 1972; Osborn and Hunt, 1974; Downey, Hellriegel, and Slocum, 1975; Huber, O'Connel and Cummings, 1975; Schmidt and Cummings, 1975; Leifer and...uncertainty and the need for external information processing, including: Duncan, 1972; Child, 1972; Osborn and Hunt, 1974; Hellriegel and Slocum, 1975; Huber

  18. Issues in expert system development

    SciTech Connect

    Baer, C.L.

    1988-03-01

    The explicit representation of domain knowledge, its separation from the processes which manipulate it, and the representation formalisms particular to artificial intelligence allow expert systems to solve problems which are characterized by high combinatorial complexity or which are sufficiently ill defined as to not have reasonable software engineering solutions. The expert system approach to problem-solving differs radically from its conventional system development counterpart. This paper defines the expert system and introduces the production system architecture. The relative strengths and weaknesses of expert system and software engineering approaches to problem solving are discussed. Also addressed are criteria for identifying problems amenable to expert system solution and some justification for system development.

  19. System Leaders Using Assessment for Learning as Both the Change and the Change Process: Developing Theory from Practice

    ERIC Educational Resources Information Center

    Davies, Anne; Busick, Kathy; Herbst, Sandra; Sherman, Ann

    2014-01-01

    Many schools and school systems have been deliberately working towards full implementation of Assessment for Learning for more than a decade, yet success has been elusive. Using a leader's implementation of Assessment for Learning in one school as an illustration, this article examines eight positional leaders' experiences as they implemented both…

  1. Hybrid systems process mixed wastes

    SciTech Connect

    Chertow, M.R.

    1989-10-01

    Some technologies, developed recently in Europe, combine several processes to separate and reuse materials from solid waste. These plants have in common, generally, that they are reasonably small, have a composting component for the organic portion, and often have a refuse-derived fuel component for combustible waste. Many European communities also have very effective drop-off center programs for recyclables such as bottles and cans. By maintaining the integrity of several different fractions of the waste, there is less to landfill and less to burn. The importance of these hybrid systems is that they introduce in one plant an approach that encompasses the key concept of today's solid waste planning: recover as much as possible and landfill as little as possible. The plants also introduce various risks, particularly the challenge of finding secure markets. There are a number of companies offering various combinations of materials recovery, composting, and waste combustion. Four examples are included: multiple materials recovery and refuse-derived fuel production in Eden Prairie, Minnesota; multiple materials recovery, composting and refuse-derived fuel production in Perugia, Italy; composting, refuse-derived fuel, and gasification in Tolmezzo, Italy; and a front-end system on a mass burning waste-to-energy plant in Neuchatel, Switzerland.

  2. Advanced System for Process Engineering

    SciTech Connect

    Williams, K. E.; Saus, L. S.; Regenhardt, P. A.

    1992-02-01

    ASPEN (Advanced System for Process Engineering) is a state-of-the-art process simulator and economic evaluation package which was designed for use in engineering fossil energy conversion processes. ASPEN can represent multiphase streams including solids, and handle complex substances such as coal. The system can perform steady state material and energy balances, determine equipment size and cost, and carry out preliminary economic evaluations. It is supported by a comprehensive physical property system for computation of major properties such as enthalpy, entropy, free energy, molar volume, equilibrium ratio, fugacity coefficient, viscosity, thermal conductivity, and diffusion coefficient for specified phase conditions: vapor, liquid, or solid. The properties may be computed for pure components, mixtures, or components in a mixture, as appropriate. The ASPEN Input Language is oriented towards process engineers.
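
    The kind of steady-state equilibrium calculation such a simulator performs can be illustrated with a minimal Raoult's-law isothermal flash solved with the Rachford-Rice equation. The sketch below is a generic textbook calculation, not ASPEN code or its property system; the Antoine constants are common literature values for benzene and toluene.

      # Minimal isothermal flash sketch in the spirit of a process simulator:
      # K-values from Raoult's law with Antoine vapor pressures, vapor fraction
      # from the Rachford-Rice equation solved by bisection. Antoine constants
      # (log10 P[mmHg] = A - B/(C + T[degC])) are common literature values.

      ANTOINE = {"benzene": (6.90565, 1211.033, 220.790),
                 "toluene": (6.95464, 1344.800, 219.480)}

      def k_value(component, t_c, p_mmhg):
          a, b, c = ANTOINE[component]
          p_sat = 10 ** (a - b / (c + t_c))
          return p_sat / p_mmhg

      def rachford_rice(z, k, lo=1e-9, hi=1 - 1e-9, tol=1e-10):
          def g(beta):            # sum of z_i (K_i - 1) / (1 + beta (K_i - 1))
              return sum(zi * (ki - 1) / (1 + beta * (ki - 1)) for zi, ki in zip(z, k))
          for _ in range(200):    # bisection on the vapor fraction beta
              mid = 0.5 * (lo + hi)
              if g(lo) * g(mid) <= 0:
                  hi = mid
              else:
                  lo = mid
              if hi - lo < tol:
                  break
          return 0.5 * (lo + hi)

      if __name__ == "__main__":
          z = [0.5, 0.5]                                     # equimolar feed
          k = [k_value("benzene", 95.0, 760.0), k_value("toluene", 95.0, 760.0)]
          beta = rachford_rice(z, k)
          x = [zi / (1 + beta * (ki - 1)) for zi, ki in zip(z, k)]
          y = [ki * xi for ki, xi in zip(k, x)]
          print(f"K-values = {k}, vapor fraction = {beta:.3f}")
          print("liquid x =", [round(v, 3) for v in x], " vapor y =", [round(v, 3) for v in y])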

  3. Solar-Cell-Junction Processing System

    NASA Technical Reports Server (NTRS)

    Bunker, S. N.; Armini, A. J.

    1986-01-01

    The system under development reduces equipment costs. The processing system will produce solar-cell junctions on 4 in. (10.2 cm) round silicon wafers at a rate of 10⁷ per year. The system includes a non-mass-analyzed ion implanter, a microcomputer-controlled, pulsed-electron-beam annealer, and a wafer-transport system with vacuum interlock. These features eliminate the large, expensive magnet and the plates, circuitry, and power source otherwise needed for scanning.

  4. Module Experimental Process System Development Unit (MEPSDU). Quarterly report No. 1, November 26, 1980-February 28, 1981

    SciTech Connect

    Rose, C. M.

    1981-01-01

    Technical work during the first quarter of the program was directed toward completing all design and documentation tasks associated with the MEPSDU Preliminary Design Review Data Package submittal. Highlights of this effort consisted of: (1) preparation of a preliminary specification for MEPSDU input sheet material (dendritic web silicon); (2) preparation of a MEPSDU module preliminary design layout drawing and all associated detail drawings; (3) analysis of the performance of the MEPSDU module over a range of operating conditions; (4) definition of all steps in the baseline process sequence; (5) initiation of equipment specifications for all long lead time items in the MEPSDU; (6) initiation of preliminary design work on the cassette unload element and interconnect feed element of the automated cell interconnect station; and (7) preparation of the preliminary quality assurance plan. (WHK)

  5. An ecological vegetation-activated sludge process (V-ASP) for decentralized wastewater treatment: system development, treatment performance, and mathematical modeling.

    PubMed

    Yuan, Jiajia; Dong, Wenyi; Sun, Feiyun; Li, Pu; Zhao, Ke

    2016-05-01

    An environment-friendly decentralized wastewater treatment process that combines an activated sludge process (ASP) with wetland vegetation, named the vegetation-activated sludge process (V-ASP), was developed for decentralized wastewater treatment. The long-term experimental results showed that the vegetation sequencing batch reactor (V-SBR) process had consistently stable and higher removal efficiencies of organic substances and nutrients from domestic wastewater compared with a traditional sequencing batch reactor (SBR). The vegetation allocated into the V-SBR system could not only remove nutrients through vegetation transpiration but also provide a large surface area that enhances microorganism activity. This high transpiration ratio improved nutrient removal from wastewater mainly by enhancing flux, accelerating oxygen and substrate transport, and stimulating vegetation respiration. A mathematical model based on ASM2d was successfully established by incorporating the specific function of the vegetation to simulate system performance. The simulation results on the influence of operational parameters on V-ASP treatment effectiveness demonstrated that V-SBR had a high resistance to seasonal temperature fluctuations and influent loading shocks.
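
    A heavily simplified sketch of how a vegetation term can be appended to an activated-sludge substrate balance is given below. It is not the ASM2d model used by the authors; it is a toy Monod-growth balance with an added first-order vegetation uptake term, and every rate constant is a hypothetical placeholder.

      # Toy substrate/biomass balance in the spirit of (but far simpler than)
      # ASM2d, with an extra first-order vegetation-uptake term added to the
      # substrate equation. All parameter values are hypothetical placeholders.

      def simulate(hours=48.0, dt=0.05):
          S, X = 200.0, 1500.0        # substrate and biomass, mg/L (assumed)
          mu_max, Ks, Y, b = 0.25, 20.0, 0.6, 0.01   # 1/h, mg/L, -, 1/h (assumed)
          k_veg = 0.02                # vegetation uptake rate constant, 1/h (assumed)
          t = 0.0
          while t < hours:
              growth = mu_max * S / (Ks + S) * X      # Monod growth of biomass
              dS = -growth / Y - k_veg * S            # consumption plus vegetation uptake
              dX = growth - b * X                     # growth minus decay
              S = max(S + dS * dt, 0.0)
              X = max(X + dX * dt, 0.0)
              t += dt
          return S, X

      if __name__ == "__main__":
          S, X = simulate()
          print(f"after 48 h: substrate = {S:.1f} mg/L, biomass = {X:.0f} mg/L")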

  6. Low-Cost Solar Array Project. Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process. Quarterly progress report, October-December 1980

    SciTech Connect

    Not Available

    1980-01-01

    Progress is reported on the engineering design, fabrication, assembly, operation, economic analysis, and process support R and D for an Experimental Process System Development Unit (EPSDU) for producing semiconductor-grade silicon using the silane-to-silicon process. Most of the process related equipment has been ordered and is being fabricated. Equipment and building foundations have been completed at the EPSDU site, and all the steel was erected for the gantry. The switch gear/control building and the melter building will be completed during the next quarter. The data collection system design is progressing. Various computer programs are being written which will be used to convert electrical, pneumatic and other raw signals into engineering values. The free-space reactor development work was completed with a final 12-hour run in which the free-space reactor PDU ran flawlessly. Also, the quality control method development task was completed. Slim rods were grown from seed silicon rods for subsequent float zone operation and impurity characterization. An excellent quality epitaxial film was deposited on a silicon wafer. Both undoped and doped films were deposited and the resistivity of the films has been measured. (WHK)

  7. Development of a Versatile Laser Ultrasonic System and Application to On-Line Measurement for Process Control of Wall Thickness and Eccentrictiy of Steel Seamless Mechanical Tubing

    SciTech Connect

    Kisner, R.A.; Kercel, S.W.; Damiano, B.; Bingham, P.R.; Gee, T.F.; Tucker, R.W.; Moore, M.R.; Hileman, M.; Emery, M.; Lenarduzzi, R.; Hardy, J.E.; Weaver, K.; Crutcher, R.; Kolarik, R.V., II; Vandervaart, R.H.

    2002-04-24

    Researchers at the Timken Company conceived a project to develop an on-line instrument for wall thickness measurement of steel seamless mechanical tubing based on laser ultrasonic technology. The instrument, which has been installed and tested at a piercing mill, provides data on tube eccentricity and concentricity. Such measurements permit fine-tuning of manufacturing processes to eliminate excess material in the tube wall and therefore provide a more precisely dimensioned product for their customers. The resulting process energy savings are substantial, as is lowered environmental burden. The expected savings are $85.8 million per year in seamless mechanical tube piercing alone. Applied across the industry, this measurement has a potential of reducing energy consumption by 6 x 10¹² BTU per year, greenhouse gas emissions by 0.3 million metric tons carbon equivalent per year, and toxic waste by 0.255 million pounds per year. The principal technical contributors to the project were the Timken Company, Industrial Materials Institute (IMI, a contractor to Timken), and Oak Ridge National Laboratory (ORNL). Timken provided mill access as well as process and metallurgical understanding. Timken researchers had previously developed fundamental ultrasonic analysis methods on which this project is based. IMI developed and fabricated the laser ultrasonic generation and receiver systems. ORNL developed Bayesian and wavelet based real-time signal processing, spread-spectrum wireless communication, and explored feature extraction and pattern recognition methods. The resulting instrument has successfully measured production tubes at one of Timken's piercing mills. This report concentrates on ORNL's contribution through the CRADA mechanism. The three components of ORNL's contribution were met with mixed success. The real-time signal-processing task accomplished its goal of improvement in detecting time of flight information with a minimum of false data. The signal processing
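
    The core measurement arithmetic is simple: wall thickness equals sound velocity times the round-trip echo time of flight divided by two, and eccentricity can be estimated from the spread of thickness readings around the circumference. The sketch below uses a typical longitudinal velocity for steel and made-up time-of-flight readings; it is an illustration, not the project's signal-processing code.

      # Sketch: convert laser-ultrasonic time-of-flight readings taken around a
      # tube's circumference into wall thickness and an eccentricity estimate.
      # The sound velocity is a typical value for longitudinal waves in steel;
      # the TOF readings are made up for illustration.

      V_STEEL_M_S = 5900.0                      # approx. longitudinal velocity in steel

      def wall_thickness_mm(tof_us):
          # round-trip echo between outer and inner surface: thickness = v * t / 2
          return V_STEEL_M_S * (tof_us * 1e-6) / 2.0 * 1000.0

      def eccentricity(thicknesses_mm):
          t_min, t_max = min(thicknesses_mm), max(thicknesses_mm)
          return (t_max - t_min) / (t_max + t_min)     # one common definition

      if __name__ == "__main__":
          tof_readings_us = [3.05, 3.12, 3.20, 3.31, 3.24, 3.15, 3.08, 3.02]
          walls = [wall_thickness_mm(t) for t in tof_readings_us]
          print("wall thickness (mm):", [round(w, 2) for w in walls])
          print(f"eccentricity estimate: {eccentricity(walls):.3f}")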

  8. Parallel processing spacecraft communication system

    NASA Technical Reports Server (NTRS)

    Bolotin, Gary S. (Inventor); Donaldson, James A. (Inventor); Luong, Huy H. (Inventor); Wood, Steven H. (Inventor)

    1998-01-01

    An uplink controlling assembly speeds data processing using a special parallel codeblock technique. A correct start sequence initiates processing of a frame. Two possible start sequences can be used, and the one which is used determines whether data polarity is inverted or non-inverted. Processing continues until uncorrectable errors are found. The frame ends by intentionally sending a block with an uncorrectable error. Each of the codeblocks in the frame has a channel ID. Each channel ID can be separately processed in parallel. This obviates the problem of waiting for error correction processing. If that channel number is zero, however, it indicates that the frame of data represents a critical command only. That data is handled in a special way, independent of the software. Otherwise, the processed data is further handled using special double buffering techniques to avoid problems from overrun. When overrun does occur, the system takes action to lose only the oldest data.
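
    The per-channel parallel handling and double buffering described above can be sketched as generic demultiplexing logic. The buffer size, field names, and overrun policy below are assumptions for illustration, not the patented implementation.

      # Generic sketch of routing codeblocks by channel ID into per-channel
      # double buffers: one buffer fills while the other is handed off for
      # processing, and the oldest pending buffer is dropped on overrun.
      # Channel 0 is treated as a critical-command channel handled immediately.
      # Field names and sizes are hypothetical, not the patented design.

      from collections import defaultdict

      BUFFER_BLOCKS = 4          # codeblocks per buffer (assumed)

      class ChannelBuffers:
          def __init__(self):
              self.filling = defaultdict(list)     # buffer currently being filled
              self.ready = {}                      # buffer awaiting processing

          def accept(self, channel_id, codeblock):
              if channel_id == 0:
                  return ("critical", codeblock)   # bypass normal buffering
              buf = self.filling[channel_id]
              buf.append(codeblock)
              if len(buf) == BUFFER_BLOCKS:        # swap buffers when full
                  if channel_id in self.ready:     # overrun: drop the oldest buffer
                      print(f"channel {channel_id}: overrun, dropping oldest buffer")
                  self.ready[channel_id] = buf
                  self.filling[channel_id] = []
              return None

          def take_ready(self, channel_id):
              return self.ready.pop(channel_id, None)

      if __name__ == "__main__":
          demux = ChannelBuffers()
          for i in range(10):
              demux.accept(channel_id=1, codeblock=f"blk{i}")
          print("ready buffer for channel 1:", demux.take_ready(1))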

  9. Software Development to Assist in the Processing and Analysis of Data Obtained Using Fiber Bragg Grating Interrogation Systems

    NASA Technical Reports Server (NTRS)

    Hicks, Rebecca

    2009-01-01

    A fiber Bragg grating is a portion of a core of a fiber optic strand that has been treated to affect the way light travels through the strand. Light within a certain narrow range of wavelengths will be reflected along the fiber by the grating, while light outside that range will pass through the grating mostly undisturbed. Since the range of wavelengths that can penetrate the grating depends on the grating itself as well as temperature and mechanical strain, fiber Bragg gratings can be used as temperature and strain sensors. This capability, along with the light-weight nature of the fiber optic strands in which the gratings reside, make fiber optic sensors an ideal candidate for flight testing and monitoring in which temperature and wing strain are factors. The purpose of this project is to research the availability of software capable of processing massive amounts of data in both real-time and post-flight settings, and to produce software segments that can be integrated to assist in the task as well.
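
    The physics behind such sensing is the Bragg condition λ_B = 2·n_eff·Λ, whose relative shift is approximately (1 − p_e)·ε + (α + ξ)·ΔT for strain ε and temperature change ΔT. The sketch below inverts that relation using typical textbook sensitivities for silica fiber near 1550 nm (about 1.2 pm per microstrain and 10 pm per °C); these are illustrative figures, not calibration constants for any particular interrogation system.

      # Sketch: recover strain from a fiber Bragg grating wavelength shift once
      # the temperature contribution (e.g., from an unstrained reference grating)
      # is removed. Sensitivities are typical textbook values for silica fiber
      # near 1550 nm and are illustrative only, not calibration constants.

      STRAIN_SENS_PM_PER_UE = 1.2     # pm of Bragg shift per microstrain (typical)
      TEMP_SENS_PM_PER_C = 10.0       # pm of Bragg shift per degree C (typical)

      def strain_microstrain(total_shift_pm, delta_t_c):
          thermal_shift = TEMP_SENS_PM_PER_C * delta_t_c
          return (total_shift_pm - thermal_shift) / STRAIN_SENS_PM_PER_UE

      if __name__ == "__main__":
          # Hypothetical reading: 350 pm total shift with a 5 degC temperature rise
          eps = strain_microstrain(total_shift_pm=350.0, delta_t_c=5.0)
          print(f"estimated mechanical strain: {eps:.0f} microstrain")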

  10. Power Systems Development Facility

    SciTech Connect

    Southern Company Services

    2009-01-31

    In support of technology development to utilize coal for efficient, affordable, and environmentally clean power generation, the Power Systems Development Facility (PSDF), located in Wilsonville, Alabama, has routinely demonstrated gasification technologies using various types of coals. The PSDF is an engineering scale demonstration of key features of advanced coal-fired power systems, including a Transport Gasifier, a hot gas particulate control device, advanced syngas cleanup systems, and high-pressure solids handling systems. This final report summarizes the results of the technology development work conducted at the PSDF through January 31, 2009. Twenty-one major gasification test campaigns were completed, for a total of more than 11,000 hours of gasification operation. This operational experience has led to significant advancements in gasification technologies.

  11. TMT approach to observatory software development process

    NASA Astrophysics Data System (ADS)

    Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder

    2016-07-01

    The purpose of the Observatory Software System (OSW) is to integrate all software and hardware components of the Thirty Meter Telescope (TMT) to enable observations and data capture; thus it is a complex software system that is defined by four principal software subsystems: Common Software (CSW), Executive Software (ESW), Data Management System (DMS) and Science Operations Support System (SOSS), all of which have interdependencies with the observatory control systems and data acquisition systems. Therefore, the software development process and plan must consider dependencies to other subsystems, manage architecture, interfaces and design, manage software scope and complexity, and standardize and optimize use of resources and tools. Additionally, the TMT Observatory Software will largely be developed in India through TMT's workshare relationship with the India TMT Coordination Centre (ITCC) and use of Indian software industry vendors, which adds complexity and challenges to the software development process, communication and coordination of activities and priorities as well as measuring performance and managing quality and risk. The software project management challenge for the TMT OSW is thus a multi-faceted technical, managerial, communications and interpersonal relations challenge. The approach TMT is using to manage this multifaceted challenge is a combination of establishing an effective geographically distributed software team (Integrated Product Team) with strong project management and technical leadership provided by the TMT Project Office (PO) and the ITCC partner to manage plans, process, performance, risk and quality, and to facilitate effective communications; establishing an effective cross-functional software management team composed of stakeholders, OSW leadership and ITCC leadership to manage dependencies and software release plans, technical complexities and change to approved interfaces, architecture, design and tool set, and to facilitate

  12. Chemiluminescence development after initiation of Maillard reaction in aqueous solutions of glycine and glucose: nonlinearity of the process and cooperative properties of the reaction system

    NASA Astrophysics Data System (ADS)

    Voeikov, Vladimir L.; Naletov, Vladimir I.

    1998-06-01

    Nonenzymatic glycation of free or peptide bound amino acids (Maillard reaction, MR) plays an important role in aging, diabetic complications and atherosclerosis. MR taking place at high temperatures is accompanied by chemiluminescence (CL). Here, the kinetics of CL development in MR proceeding in model systems at room temperature have been analyzed for the first time. Brief heating of glycine and D-glucose solutions to t greater than 93 degrees Celsius results in their browning and the appearance of fluorescent properties. In solutions rapidly cooled down to 20 degrees Celsius, a wave of CL developed. It reached maximum intensity around 40 min after the reaction mixture was heated and cooled down. The elevation of CL intensity was accompanied by a certain decoloration of the solution. The appearance of light-absorbing substances and the development of CL depended critically upon the preincubation temperature (greater than or equal to 93 degrees Celsius), initial pH (greater than or equal to 11.2), sample volume (greater than or equal to 0.5 ml), and reagent concentrations. The dependence of total counts accumulation on system volume above the critical volume was non-monotonic. After reaching maximum values, CL began to decline, though only a small part of the glucose and glycine had been consumed. Brief heating of such solutions to the critical temperature resulted in the emergence of a new CL wave. This procedure could be repeated in one and the same reaction system several times. The whole CL kinetic curve was best fitted by a lognormal distribution. The macrokinetic properties of the process are characteristic of chain reactions with delayed branching. The results also imply that self-organization occurs in this system, and that the course of the process strongly depends upon boundary conditions and periodic interference in its course.
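
    A sketch of fitting a chemiluminescence intensity-vs-time curve with a lognormal-shaped function, in the spirit of the lognormal fit reported above. The data points below are synthetic placeholders, not measurements from the paper.

      import numpy as np
      from scipy.optimize import curve_fit

      def lognormal_wave(t, amplitude, mu, sigma):
          """Lognormal-shaped emission wave; t in minutes, t > 0."""
          return amplitude / (t * sigma * np.sqrt(2 * np.pi)) * np.exp(-(np.log(t) - mu) ** 2 / (2 * sigma ** 2))

      t = np.linspace(1, 120, 60)                               # minutes after heating/cooling
      synthetic = lognormal_wave(t, 5000, np.log(40), 0.5)      # peak near 40 min, like the abstract
      noisy = synthetic + np.random.default_rng(0).normal(0, 5, t.size)

      params, _ = curve_fit(lognormal_wave, t, noisy, p0=(1000, np.log(30), 1.0))
      amplitude, mu, sigma = params
      print(f"fitted peak near t = {np.exp(mu - sigma**2):.1f} min")   # mode of a lognormal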

  13. A Study to Develop an Industrial-Scale, Computer-Controlled High Magnetic Field Processing (HMFP) System to Assist in Commercializing the Novel, Enabling HMFP Manufacturing Technology

    SciTech Connect

    Ludtka, G. M.; Chourey, A.

    2010-05-12

    As the original magnet designer and manufacturer of ORNL’s 9T, 5-inch ID bore magnet, American Magnetics Inc. (AMI) has collaborated with ORNL’s Materials Processing Group, and this partnership has been instrumental in the development of our unique thermo-magnetic facilities and expertise. Consequently, AMI and ORNL have realized that the commercial implementation of the High Magnetic Field Processing (HMFP) technology will require the evolution of robust, automated superconducting (SC) magnet systems that will be cost-effective and easy to operate in an industrial environment. The goal of this project and CRADA is to significantly expedite the timeline for implementing this revolutionary and pervasive cross-cutting technology for future US produced industrial components. The successful completion of this project is anticipated to significantly assist in the timely commercialization and licensing of our HMFP intellectual property for a broad spectrum of industries; and to open up a new market for AMI. One notable outcome of this project is that the ThermoMagnetic Processing Technology won a prestigious 2009 R&D 100 Award. This award acknowledges and recognizes our TMP Technology as one of the top 100 innovative US technologies in 2009. By successfully establishing the design requirements for a commercial scale magnetic processing system, this project effort has accomplished a key first step in facilitating the building and demonstration of a superconducting magnetic processing coil, enabling the transition of the High Magnetic Field Processing Technology beyond a laboratory novelty into a commercially viable and industrially scalable Manufacturing Technology.

  14. Advanced Information Processing System (AIPS)

    NASA Technical Reports Server (NTRS)

    Pitts, Felix L.

    1993-01-01

    Advanced Information Processing System (AIPS) is a computer systems philosophy, a set of validated hardware building blocks, and a set of validated services as embodied in system software. The goal of AIPS is to provide the knowledge base which will allow achievement of validated fault-tolerant distributed computer system architectures, suitable for a broad range of applications, having failure probability requirements of 10E-9 at 10 hours. A background and description are given, followed by program accomplishments, the current focus, applications, technology transfer, FY92 accomplishments, and funding.

  15. Development and Testing of the Advanced CHP System Utilizing the Off-Gas from the Innovative Green Coke Calcining Process in Fluidized Bed

    SciTech Connect

    Chudnovsky, Yaroslav; Kozlov, Aleksandr

    2013-08-15

    Green petroleum coke (GPC) is an oil refining byproduct that can be used directly as a solid fuel or as a feedstock for the production of calcined petroleum coke. GPC contains a high amount of volatiles and sulfur. During the calcination process, the GPC is heated to remove the volatiles and sulfur to produce purified calcined coke, which is used in the production of graphite, electrodes, metal carburizers, and other carbon products. Currently, more than 80% of calcined coke is produced in rotary kilns or rotary hearth furnaces. These technologies provide partial heat utilization of the calcined coke to increase efficiency of the calcination process, but they also share some operating disadvantages. However, coke calcination in an electrothermal fluidized bed (EFB) opens up a number of potential benefits for production enhancement, while reducing the capital and operating costs. The increased usage of heavy crude oil in recent years has resulted in higher sulfur content in green coke produced by the oil refining process, which requires a significant increase in the calcination temperature and residence time. The calorific value of the process off-gas is quite substantial and can be effectively utilized as an “opportunity fuel” for combined heat and power (CHP) production to complement the energy demand. Heat recovered from the product cooling can also contribute to the overall economics of the calcination process. Preliminary estimates indicated a decrease in energy consumption by 35-50% as well as a proportional decrease in greenhouse gas emissions. As such, the efficiency improvement of coke calcination systems is attracting close attention from researchers and engineers throughout the world. The developed technology is intended to accomplish the following objectives: - Reduce the energy and carbon intensity of the calcined coke production process. - Increase utilization of opportunity fuels such as industrial waste off-gas from the novel

  16. Remote systems development

    NASA Technical Reports Server (NTRS)

    Olsen, R.; Schaefer, O.; Hussey, J.

    1992-01-01

    Potential space missions of the nineties and the next century require that we look at the broad category of remote systems as an important means to achieve cost-effective operations, exploration and colonization objectives. This paper addresses such missions, which can use remote systems technology as the basis for identifying required capabilities which must be provided. The relationship of the space-based tasks to similar tasks required for terrestrial applications is discussed. The development status of the required technology is assessed and major issues which must be addressed to meet future requirements are identified. This includes the proper mix of humans and machines, from pure teleoperation to full autonomy; the degree of worksite compatibility for a robotic system; and the required design parameters, such as degrees-of-freedom. Methods for resolution are discussed including analysis, graphical simulation and the use of laboratory test beds. Grumman experience in the application of these techniques to a variety of design issues is presented, utilizing the Telerobotics Development Laboratory which includes a 17-DOF robot system, a variety of sensing elements, Deneb/IRIS graphics workstations and control stations. The use of task/worksite mockups, remote system development test beds and graphical analysis are discussed with examples of typical results such as estimates of task times, task feasibility and resulting recommendations for design changes. The relationship of this experience and lessons-learned to future development of remote systems is also discussed.

  17. Architecture for Survivable System Processing (ASSP)

    NASA Technical Reports Server (NTRS)

    Wood, Richard J.

    1991-01-01

    The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High technology developments in hardware, software, and networking models address technology challenges of long processor life times, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP) and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture and are being developed to apply new technology to practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, this program provides for regular interactions with standardization working groups, e.g., the International Standards Organization (ISO), American National Standards Institute (ANSI), Society of Automotive Engineers (SAE), and Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.

  18. Architecture for Survivable System Processing (ASSP)

    NASA Astrophysics Data System (ADS)

    Wood, Richard J.

    1991-11-01

    The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High technology developments in hardware, software, and networking models address technology challenges of long processor life times, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP) and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture and are being developed to apply new technology to practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, this program provides for regular interactions with standardization working groups, e.g., the International Standards Organization (ISO), American National Standards Institute (ANSI), Society of Automotive Engineers (SAE), and Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.

  19. Development of high reliability and high processability thermosets for electronic packaging applications based on ternary systems of benzoxazine, epoxy and phenolic resins

    NASA Astrophysics Data System (ADS)

    Rimdusit, Sarawut

    We have developed new polymeric systems based on the ternary mixture of benzoxazine, epoxy, and phenolic novolac resins. Low melt viscosity resins render void free specimens with minimal processing steps. The material properties show a wide range of desirable reliability and processability which are highly dependent on the composition of the monomers in the mixture. Fourier transform mechanical spectroscopy techniques (FTMS) are utilized as a powerful tool to study the sol-gel transition of covalently bonded polymeric networks. The gelation of the ternary mixture shows an Arrhenius-type behavior and the gel time can be well-predicted by the Arrhenius equation. The synergism in the glass transition temperature of these ternary systems is also reported. The molecular rigidity from benzoxazine and the improved crosslink density from epoxy contribute to the synergistic behavior. The mechanical relaxation spectra of the fully cured ternary systems in the temperature range of -140°C to 350°C show four types of relaxation transitions, i.e., gamma, beta, alpha1, and alpha2 transitions. Thermal conductivity of the molding compounds based on these ternary mixtures exhibits a very high value of about 27 W/mK with aggregate-type boron nitride filler and a value of about 8.6 W/mK with flake-like crystal boron nitride filler, compared at the same filler loading of 68% by volume. The presence of epoxy resin in the ternary systems is found to provide improvement in high-temperature adhesion. The curing kinetics based on dynamic DSC results of this ternary system show nth order kinetics with an overall reaction order of 1.5 and an activation energy of 111 kJ/mol, whereas that of the gelation process is 75 kJ/mol. The thermal degradation process of this resin is of the deceleratory type, with an activation energy of 185 kJ/mol. The choice of resin used for the study can provide a maximum Tg of about 220°C in its fully cured specimen. The system has a potential use as high performance electronic
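
    A hedged illustration of the Arrhenius-type gel-time prediction described above, t_gel proportional to exp(Ea/RT). The 75 kJ/mol activation energy is taken from the abstract; the reference gel time and cure temperatures are assumed for illustration only.

      import math

      R_GAS  = 8.314      # J/(mol K)
      EA_GEL = 75e3       # J/mol, gelation activation energy reported in the abstract

      def gel_time(temp_c, t_ref_min, temp_ref_c):
          """Scale a measured gel time at one cure temperature to another via Arrhenius behavior."""
          t1 = temp_ref_c + 273.15
          t2 = temp_c + 273.15
          return t_ref_min * math.exp(EA_GEL / R_GAS * (1.0 / t2 - 1.0 / t1))

      # Example (assumed data): if gelation takes 30 min at 160 C, estimate the gel time at 180 C.
      print(f"{gel_time(180, t_ref_min=30, temp_ref_c=160):.1f} min")   # about 12 min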

  20. XCPU2 process management system

    SciTech Connect

    Ionkov, Latchesar; Van Hensbergen, Eric

    2009-01-01

    Xcpu2 is a new process management system that allows users to specify a custom file system for a running job. Most cluster management systems enforce a single software distribution running on all nodes. Xcpu2 allows programs running on the cluster to work in an environment identical to the user's desktop, using the same versions of the libraries and tools the user installed locally, and accessing the configuration files in the same places they are located on the desktop. Xcpu2 builds on our earlier work with the Xcpu system. Like Xcpu, Xcpu2's process management interface is represented as a set of files exported by a 9P file server. It supports heterogeneous clusters and multiple head nodes. Unlike Xcpu, it uses a pull instead of a push model. In this paper we describe the Xcpu2 clustering model, its operation and how the per-job filesystem configuration can be used to solve some of the common problems when running a cluster.

  1. Heatpipe power system development

    SciTech Connect

    Houts, M.G.; Poston, D.I.

    1998-12-31

    This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of the project was to develop a design approach that could enable the development of near-term, low-cost, space fission-power systems. Sixteen desired attributes were identified for such systems and detailed analyses were performed to verify that they are feasible. Preliminary design work was performed on one concept, the Heatpipe Power system (HPS). As a direct result of this project, funding was obtained from the National Aeronautics and Space Administration to build and test an HPS module. The module tests went well, and they now have funding to build a bimodal module.

  2. A Study to Determine the Optimal Strategic Planning Process for Controlling and Coordinating the In-House Development of an Integrated Computer- Supported Hospital Information System

    DTIC Science & Technology

    1982-05-01

    This study examines Strategic Planning concepts and how they relate to the development of Hospital Information Systems. The author recommends that... Strategic Planning methods be utilized in the development of Hospital Information Systems, and provides guidance on how to do so. Keywords: Theses...Integrated information systems; Hospital administration; Computer networks; Information exchange; Health care; Strategic planning ; Information systems.

  3. Parallel processing and expert systems

    NASA Technical Reports Server (NTRS)

    Lau, Sonie; Yan, Jerry C.

    1991-01-01

    Whether it be monitoring the thermal subsystem of Space Station Freedom, or controlling the navigation of the autonomous rover on Mars, NASA missions in the 1990s cannot enjoy an increased level of autonomy without the efficient implementation of expert systems. Merely increasing the computational speed of uniprocessors may not be able to guarantee that real-time demands are met for larger systems. Speedup via parallel processing must be pursued alongside the optimization of sequential implementations. Prototypes of parallel expert systems have been built at universities and industrial laboratories in the U.S. and Japan. The state-of-the-art research in progress related to parallel execution of expert systems is surveyed. The survey discusses multiprocessors for expert systems, parallel languages for symbolic computations, and mapping expert systems to multiprocessors. Results to date indicate that the parallelism achieved for these systems is small. The main reasons are (1) the body of knowledge applicable in any given situation and the amount of computation executed by each rule firing are small, (2) dividing the problem solving process into relatively independent partitions is difficult, and (3) implementation decisions that enable expert systems to be incrementally refined hamper compile-time optimization. In order to obtain greater speedups, data parallelism and application parallelism must be exploited.

  4. Expert Systems Development Methodology

    DTIC Science & Technology

    1989-07-28

    two volumes. Volume 1 is the Development Methodology and Volume 2 is an Evaluation Methodology containing methods for evaluation, validation and...system are written in an English-like language which almost anyone can understand. Thus programming in rule based systems can become "programming for...computers and others have little understanding about how computers work. The knowledge engineer must therefore be willing and able to teach the expert

  5. POWER SYSTEMS DEVELOPMENT FACILITY

    SciTech Connect

    Unknown

    2002-05-01

    This report discusses test campaign GCT3 of the Halliburton KBR transport reactor train with a Siemens Westinghouse Power Corporation (Siemens Westinghouse) particle filter system at the Power Systems Development Facility (PSDF) located in Wilsonville, Alabama. The transport reactor is an advanced circulating fluidized-bed reactor designed to operate as either a combustor or a gasifier using one of two possible particulate control devices (PCDs). The transport reactor was operated as a pressurized gasifier during GCT3. GCT3 was planned as a 250-hour test run to commission the loop seal and continue the characterization of the limits of operational parameter variations using a blend of several Powder River Basin coals and Bucyrus limestone from Ohio. The primary test objectives were: (1) Loop Seal Commissioning--Evaluate the operational stability of the loop seal with sand and limestone as a bed material at different solids circulation rates and establish a maximum solids circulation rate through the loop seal with the inert bed. (2) Loop Seal Operations--Evaluate the loop seal operational stability during coal feed operations and establish maximum solids circulation rate. Secondary objectives included the continuation of reactor characterization, including: (1) Operational Stability--Characterize the reactor loop and PCD operations with short-term tests by varying coal feed, air/coal ratio, riser velocity, solids circulation rate, system pressure, and air distribution. (2) Reactor Operations--Study the devolatilization and tar cracking effects from transient conditions during transition from start-up burner to coal. Evaluate the effect of process operations on heat release, heat transfer, and accelerated fuel particle heat-up rates. Study the effect of changes in reactor conditions on transient temperature profiles, pressure balance, and product gas composition. (3) Effects of Reactor Conditions on Syngas Composition--Evaluate the effect of air distribution, steam

  6. Grounding Development in Cognitive Processes.

    ERIC Educational Resources Information Center

    Samuelson, Larissa K.; Smith, Linda B.

    2000-01-01

    Argues that the operating characteristics of perceiving and remembering provide a foundation for progress on detailing the processes through which knowledge is realized in real-time tasks and in detailing the processes of developmental change. Includes three examples to illustrate how forming developmental hypotheses in terms of perceiving and…

  7. PMIS: System Description. PMIS Project. Planning & Management Information System. A Project To Develop a Data Processing System for Support of the Planning and Management Needs of Local School Districts.

    ERIC Educational Resources Information Center

    Council of the Great City Schools, Washington, DC.

    PMIS (Planning and Management Information System) is an information system that supports the decisionmaking process of executive management in local school districts. The system is designed around a comprehensive, longitudinal, and interrelated data base. It utilizes a powerful real-time, interactive data management system for strategic planning;…

  8. LANL receiver system development

    SciTech Connect

    Laubscher, B.; Cooke, B.; Cafferty, M.; Olivas, N.

    1997-08-01

    The CALIOPE receiver system development at LANL is the story of two technologies. The first of these technologies consists of off-the-shelf mercury-cadmium-telluride (MCT) detectors and amplifiers. The vendor for this system is Kolmar Technologies. This system was fielded in the Tan Trailer I (TTI) in 1995 and will be referred to in this paper as GEN I. The second system consists of a MCT detector procured from Santa Barbara Research Center (SBRC) and an amplifier designed and built by LANL. This system was fielded in the Tan Trailer II (TTII) system at the NTS tests in 1996 and will be referred to as GEN II. The LANL CALIOPE experimental plan for 1996 was to improve the lidar system by progressing to a higher rep rate laser to perform many shots in a much shorter period of time. In keeping with this plan, the receiver team set a goal of developing a detector system that was background limited for the projected 100 nanosecond (ns) laser pulse. A set of detailed simulations of the DIAL lidar experiment was performed. From these runs, parameters such as optimal detector size, field of view of the receiver system, nominal laser return power, etc. were extracted. With this information, detector physics and amplifier electronic models were developed to obtain the required specifications for each of these components. These derived specs indicated that a substantial improvement over commercially available, off-the-shelf, amplifier and detector technologies would be needed to obtain the goals. To determine if the original GEN I detector was usable, the authors performed tests on a 100 micron square detector at cryogenic temperatures. The results of this test and others convinced them that an advanced detector was required. Eventually, a suitable detector was identified and a number of these single element detectors were procured from SBRC. These single element detectors were witness for the detector arrays built for another DOE project.

  9. Series Bosch System Development

    NASA Technical Reports Server (NTRS)

    Abney, Morgan B.; Evans, Christopher; Mansell, Matt; Swickrath, Michael

    2012-01-01

    State-of-the-art (SOA) carbon dioxide (CO2) reduction technology for the International Space Station produces methane as a byproduct. This methane is subsequently vented overboard. The associated loss of hydrogen ultimately reduces the mass of oxygen that can be recovered from CO2 in a closed-loop life support system. As an alternative to SOA CO2 reduction technology, NASA is exploring a Series-Bosch system capable of reducing CO2 with hydrogen to form water and solid carbon. This results in 100% theoretical recovery of oxygen from metabolic CO2. In the past, Bosch-based technology did not trade favorably against SOA technology due to a high power demand, low reaction efficiencies, concerns with carbon containment, and large resupply requirements necessary to replace expended catalyst cartridges. An alternative approach to Bosch technology, labeled "Series-Bosch," employs a new system design with optimized multi-stage reactors and a membrane-based separation and recycle capability. Multi-physics modeling of the first stage reactor, along with chemical process modeling of the integrated system, has resulted in a design with potential to trade significantly better than previous Bosch technology. The modeling process and resulting system architecture selection are discussed.
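
    Back-of-the-envelope stoichiometry for the overall Bosch reaction described above, CO2 + 2 H2 -> C(s) + 2 H2O, which is why the approach allows full theoretical oxygen recovery. Molar masses are standard values; the figures are illustrative, not results from the NASA study.

      M_CO2, M_H2O, M_O2, M_C = 44.01, 18.02, 32.00, 12.01   # g/mol

      def bosch_products_per_kg_co2(kg_co2):
          mol_co2 = kg_co2 * 1000.0 / M_CO2
          water_kg  = 2 * mol_co2 * M_H2O / 1000.0   # 2 mol H2O per mol CO2
          carbon_kg = mol_co2 * M_C / 1000.0         # solid carbon byproduct
          oxygen_kg = mol_co2 * M_O2 / 1000.0        # O2 recoverable by electrolyzing that water
          return water_kg, carbon_kg, oxygen_kg

      water, carbon, oxygen = bosch_products_per_kg_co2(1.0)
      print(f"per kg CO2: {water:.2f} kg H2O, {carbon:.2f} kg C, {oxygen:.2f} kg O2 recoverable")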

  10. A plasma process monitor/control system

    SciTech Connect

    Stevenson, J.O.; Ward, P.P.; Smith, M.L.; Markle, R.J.

    1997-08-01

    Sandia National Laboratories has developed a system to monitor plasma processes for control of industrial applications. The system is designed to act as a fully automated, stand-alone process monitor during printed wiring board and semiconductor production runs. The monitor routinely performs data collection, analysis, process identification, and error detection/correction without the need for human intervention. The monitor can also be used in research mode to allow process engineers to gather additional information about plasma processes. The plasma monitor can perform real-time control of support systems known to influence plasma behavior. The monitor can also signal personnel to modify plasma parameters when the system is operating outside of desired specifications and requires human assistance. A notification protocol can be selected for conditions detected in the plasma process. The Plasma Process Monitor/Control System consists of a computer running software developed by Sandia National Laboratories, a commercially available spectrophotometer equipped with a charge-coupled device camera, an input/output device, and a fiber optic cable.
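
    A hypothetical sketch of the "operating outside of desired specifications" check: compare selected emission-line intensities against control limits and flag the run for operator attention. The line wavelengths, limits, and notification hook are placeholders, not Sandia's actual settings.

      SPEC_LIMITS = {            # wavelength (nm) -> (lower, upper) intensity bounds (assumed)
          656.3: (1200.0, 1800.0),
          777.2: (300.0, 650.0),
      }

      def check_spectrum(line_intensities, notify):
          """line_intensities: dict of wavelength -> measured intensity from the spectrometer."""
          out_of_spec = []
          for wavelength, (low, high) in SPEC_LIMITS.items():
              value = line_intensities.get(wavelength)
              if value is None or not (low <= value <= high):
                  out_of_spec.append((wavelength, value))
          if out_of_spec:
              notify(f"plasma out of spec at lines: {out_of_spec}")   # notification protocol hook
          return not out_of_spec

      ok = check_spectrum({656.3: 1950.0, 777.2: 500.0}, notify=print)
      print("within spec" if ok else "operator intervention requested")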

  11. Advanced PPA Reactor and Process Development

    NASA Technical Reports Server (NTRS)

    Wheeler, Raymond; Aske, James; Abney, Morgan B.; Miller, Lee A.; Greenwood, Zachary

    2012-01-01

    Design and development of a second generation Plasma Pyrolysis Assembly (PPA) reactor is currently underway as part of NASA's Atmosphere Revitalization Resource Recovery effort. By recovering up to 75% of the hydrogen currently lost as methane in the Sabatier reactor effluent, the PPA helps to minimize life support resupply costs for extended duration missions. To date, second generation PPA development has demonstrated significant technology advancements over the first generation device by doubling the methane processing rate while, at the same time, more than halving the required power. One development area of particular interest to NASA system engineers is fouling of the PPA reactor with carbonaceous products. As a mitigation plan, NASA MSFC has explored the feasibility of using an oxidative plasma based upon metabolic CO2 to regenerate the reactor window and gas inlet ports. The results and implications of this testing are addressed along with the advanced PPA reactor development work.

  12. Advanced information processing system: Local system services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Alger, Linda; Whittredge, Roy; Stasiowski, Peter

    1989-01-01

    The Advanced Information Processing System (AIPS) is a multi-computer architecture composed of hardware and software building blocks that can be configured to meet a broad range of application requirements. The hardware building blocks are fault-tolerant, general-purpose computers, fault- and damage-tolerant networks (both computer and input/output), and interfaces between the networks and the computers. The software building blocks are the major software functions: local system services, input/output system services, inter-computer system services, and the system manager. The foundation of the local system services is an operating system with the functions required for a traditional real-time multi-tasking computer, such as task scheduling, inter-task communication, memory management, interrupt handling, and time maintenance. Resting on this foundation are the redundancy management functions necessary in a redundant computer and the status reporting functions required for an operator interface. The functional requirements, functional design and detailed specifications for all the local system services are documented.

  13. Process Accountability in Curriculum Development.

    ERIC Educational Resources Information Center

    Gooler, Dennis D.; Grotelueschen, Arden

    This paper urges the curriculum developer to assume the accountability for his decisions necessitated by the actual ways our society functions. The curriculum developer is encouraged to recognize that he is a salesman with a commodity (the curriculum). He is urged to realize that if he cannot market the package to the customers (the various…

  14. Trauma system development.

    PubMed

    Lendrum, R A; Lockey, D J

    2013-01-01

    The word 'trauma' describes the disease entity resulting from physical injury. Trauma is one of the leading causes of death worldwide and deaths due to injury look set to increase. As early as the 1970s, it became evident that centralisation of resources and expertise could reduce the mortality rate from serious injury and that organisation of trauma care delivery into formal systems could improve outcome further. Internationally, trauma systems have evolved in various forms, with widespread reports of mortality and functional outcome benefits when major trauma management is delivered in this way. The management of major trauma in England is currently undergoing significant change. The London Trauma System began operating in April 2010 and others throughout England became operational this year. Similar systems exist internationally and continue to be developed. Anaesthetists have been and continue to be involved with all levels of trauma care delivery, from the provision of pre-hospital trauma and retrieval teams, through to chronic pain management and rehabilitation of patients back into society. This review examines the international development of major trauma care delivery and the components of a modern trauma system.

  15. Atmospheric and Oceanographic Information Processing System (AOIPS) system description

    NASA Technical Reports Server (NTRS)

    Bracken, P. A.; Dalton, J. T.; Billingsley, J. B.; Quann, J. J.

    1977-01-01

    The development of hardware and software for an interactive, minicomputer based processing and display system for atmospheric and oceanographic information extraction and image data analysis is described. The major applications of the system are discussed as well as enhancements planned for the future.

  16. Skylab materials processing facility experiment developer's report

    NASA Technical Reports Server (NTRS)

    Parks, P. G.

    1975-01-01

    The development of the Skylab M512 Materials Processing Facility is traced from the design of a portable, self-contained electron beam welding system for terrestrial applications to the highly complex experiment system ultimately developed for three Skylab missions. The M512 experiment facility was designed to support six in-space experiments intended to explore the advantages of manufacturing materials in the near-zero-gravity environment of Earth orbit. Detailed descriptions of the M512 facility and related experiment hardware are provided, with discussions of hardware verification and man-machine interfaces included. An analysis of the operation of the facility and experiments during the three Skylab missions is presented, including discussions of the hardware performance, anomalies, and data returned to earth.

  17. Internal insulation system development

    NASA Technical Reports Server (NTRS)

    Gille, J. P.

    1973-01-01

    The development of an internal insulation system for cryogenic liquids is described. The insulation system is based on a gas layer concept in which capillary or surface tension effects are used to maintain a stable gas layer within a cellular core structure between the tank wall and the contained cryogen. In this work, a 1.8 meter diameter tank was insulated and tested with liquid hydrogen. Ability to withstand cycling of the aluminum tank wall to 450 K was a design and test condition.

  18. ERIPS: Earth Resource Interactive Processing System

    NASA Technical Reports Server (NTRS)

    Quinn, M. J.

    1975-01-01

    The ERIPS is an interactive computer system used in the analysis of remotely sensed data. It consists of a set of software programs which are executed on an IBM System/360 Model 75J computer under the direction of a trained analyst. The software was a derivative of the Purdue LARSYS program and has evolved to include an extensive pattern recognition system and a number of manipulative, preprocessing routines which prepare the imagery for the pattern recognition application. The original purpose of the system was to analyze remotely sensed data, to develop and perfect techniques to process the data, and to determine the feasibility of applying the data to significant earth resources problems. The system developed into a production system. Error recovery and multi-jobbing capabilities were added to the system.

  19. Deciphering the Role of Tectonic and Climatic Processes on the Landscape Development of the Patagonian Andes Along the Liquiñe-Ofqui Fault System, Chile

    NASA Astrophysics Data System (ADS)

    Buscher, J.; Morata, D.; Arancibia, G.; Cembrano, J. M.

    2016-12-01

    Transpressional plate boundaries often exhibit a correlation between plate obliquity and crustal deformation, but establishing spatial and temporal constraints on this relationship is challenging. The presence of continuous rugged topography along many transpressional fault zones as well as along-fault translation of crustal blocks can obscure the link between plate boundary geometry and mountain belt development. The Liquiñe-Ofqui fault system in the Patagonian Andes is an intra-arc dextral-reverse fault zone linked to oblique plate convergence between the Nazca and South America plates that represents a model setting for studying transpressional landscape development. The topography along the Liquiñe-Ofqui fault system is characterized by glacially and fluvially carved rocks of the Patagonian batholith interspersed by a chain of volcanoes that extends subparallel to the fault zone. Available structural and low-temperature thermochronometry data from the region suggest that both transpressional exhumation and glacial erosion have contributed to the long-term development of the orogen (Cembrano et al., 2002; Thomson, 2002; Thomson et al., 2010). Of particular interest is a near-field locus of young cooling ages thought to reflect shear heating along the fault zone (Thomson, 2002) or focused glacial erosion (Thomson et al., 2010; Herman and Brandon, 2015). To help quantify the topographic response to tectonic and climatic processes along the fault zone, we have evaluated first-order topographic features (gross distribution of elevation, relief and slope) and conducted river profile analyses (stream length-gradient, normalized channel steepness and stream convexity indices) using SRTM digital elevation data for comparison with low-temperature thermochronometry data. Preliminary results suggest that the distribution of topographic and river profile features varies with location along the Liquiñe-Ofqui fault system.
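
    An illustrative computation of the stream length-gradient (SL) index mentioned in the river profile analyses above, SL = (dH/dL) * L, where L is distance from the drainage divide along the channel. The profile below is synthetic; the actual analyses use SRTM-derived channel profiles.

      import numpy as np

      def sl_index(distance_m, elevation_m):
          """Return SL at segment midpoints for a channel profile ordered from the divide downstream."""
          dH = -np.diff(elevation_m)                    # elevation drop per segment (positive downstream)
          dL = np.diff(distance_m)
          L_mid = (distance_m[:-1] + distance_m[1:]) / 2.0
          return (dH / dL) * L_mid

      distance  = np.array([1000., 2000., 4000., 8000., 16000.])   # m from divide (synthetic)
      elevation = np.array([1800., 1500., 1250., 1000.,   800.])   # m (synthetic)
      print(np.round(sl_index(distance, elevation), 1))            # higher SL flags steeper reaches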

  20. Parallel processing and expert systems

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Lau, Sonie

    1991-01-01

    Whether it be monitoring the thermal subsystem of Space Station Freedom, or controlling the navigation of the autonomous rover on Mars, NASA missions in the 90's cannot enjoy an increased level of autonomy without the efficient use of expert systems. Merely increasing the computational speed of uniprocessors may not be able to guarantee that real time demands are met for large expert systems. Speed-up via parallel processing must be pursued alongside the optimization of sequential implementations. Prototypes of parallel expert systems have been built at universities and industrial labs in the U.S. and Japan. The state-of-the-art research in progress related to parallel execution of expert systems was surveyed. The survey is divided into three major sections: (1) multiprocessors for parallel expert systems; (2) parallel languages for symbolic computations; and (3) measurements of parallelism of expert system. Results to date indicate that the parallelism achieved for these systems is small. In order to obtain greater speed-ups, data parallelism and application parallelism must be exploited.

  1. Development of the auditory system.

    PubMed

    Litovsky, Ruth

    2015-01-01

    Auditory development involves changes in the peripheral and central nervous system along the auditory pathways, and these occur naturally, and in response to stimulation. Human development occurs along a trajectory that can last decades, and is studied using behavioral psychophysics, as well as physiologic measurements with neural imaging. The auditory system constructs a perceptual space that takes information from objects and groups, segregates sounds, and provides meaning and access to communication tools such as language. Auditory signals are processed in a series of analysis stages, from peripheral to central. Coding of information has been studied for features of sound, including frequency, intensity, loudness, and location, in quiet and in the presence of maskers. In the latter case, the ability of the auditory system to perform an analysis of the scene becomes highly relevant. While some basic abilities are well developed at birth, there is a clear prolonged maturation of auditory development well into the teenage years. Maturation involves auditory pathways. However, non-auditory changes (attention, memory, cognition) play an important role in auditory development. The ability of the auditory system to adapt in response to novel stimuli is a key feature of development throughout the nervous system, known as neural plasticity.

  2. Development of the auditory system

    PubMed Central

    Litovsky, Ruth

    2015-01-01

    Auditory development involves changes in the peripheral and central nervous system along the auditory pathways, and these occur naturally, and in response to stimulation. Human development occurs along a trajectory that can last decades, and is studied using behavioral psychophysics, as well as physiologic measurements with neural imaging. The auditory system constructs a perceptual space that takes information from objects and groups, segregates sounds, and provides meaning and access to communication tools such as language. Auditory signals are processed in a series of analysis stages, from peripheral to central. Coding of information has been studied for features of sound, including frequency, intensity, loudness, and location, in quiet and in the presence of maskers. In the latter case, the ability of the auditory system to perform an analysis of the scene becomes highly relevant. While some basic abilities are well developed at birth, there is a clear prolonged maturation of auditory development well into the teenage years. Maturation involves auditory pathways. However, non-auditory changes (attention, memory, cognition) play an important role in auditory development. The ability of the auditory system to adapt in response to novel stimuli is a key feature of development throughout the nervous system, known as neural plasticity. PMID:25726262

  3. Microarray Genomic Systems Development

    DTIC Science & Technology

    2008-06-01

    Microarray Genomic Systems Development; Lam, V.; Crichton, M.; Dickinson Laing, T.; Mah, D.C.; Canada West Biosciences Inc.; DRDC Suffield CR 2009-145; Defence R&D Canada – Suffield; June 2008. Introduction: Conventional

  4. Expert System Development Methodology (ESDM)

    NASA Technical Reports Server (NTRS)

    Sary, Charisse; Gilstrap, Lewey; Hull, Larry G.

    1990-01-01

    The Expert System Development Methodology (ESDM) provides an approach to developing expert system software. Because of the uncertainty associated with this process, an element of risk is involved. ESDM is designed to address the issue of risk and to acquire the information needed for this purpose in an evolutionary manner. ESDM presents a life cycle in which a prototype evolves through five stages of development. Each stage consists of five steps, leading to a prototype for that stage. Development may proceed to a conventional development methodology (CDM) at any time if enough has been learned about the problem to write requirements. ESDM produces requirements so that a product may be built with a CDM. ESDM is considered preliminary because it has not yet been applied to actual projects. It has been retrospectively evaluated by comparing the methods used in two ongoing expert system development projects that did not explicitly choose to use this methodology but which provided useful insights into actual expert system development practices and problems.

  5. Restructure Staff Development for Systemic Change

    ERIC Educational Resources Information Center

    Kelly, Thomas F.

    2012-01-01

    This paper presents a systems approach based on the work of W. Edwards Deming to system wide, high impact staff development. Deming has pointed out the significance of structure in systems. By restructuring the process of staff development we can bring about cost effective improvement of the whole system. We can improve student achievement while…

  6. Development of a comprehensive weld process model

    SciTech Connect

    Radhakrishnan, B.; Zacharia, T.; Paul, A.

    1997-05-01

    This cooperative research and development agreement (CRADA) between Concurrent Technologies Corporation (CTC) and Lockheed Martin Energy Systems (LMES) combines CTC's expertise in the welding area and that of LMES to develop computer models and simulation software for welding processes. This development is of significant impact to the industry, including materials producers and fabricators. The main thrust of the research effort was to develop a comprehensive welding simulation methodology. A substantial amount of work has been done by several researchers to numerically model several welding processes. The primary drawback of most of the existing models is the lack of sound linkages between the mechanistic aspects (e.g., heat transfer, fluid flow, and residual stress) and the metallurgical aspects (e.g., microstructure development and control). A comprehensive numerical model which can be used to elucidate the effect of welding parameters/conditions on the temperature distribution, weld pool shape and size, solidification behavior, and microstructure development, as well as stresses and distortion, does not exist. It was therefore imperative to develop a comprehensive model which would predict all of the above phenomena during welding. The CRADA built upon an already existing three-dimensional (3-D) welding simulation model developed by LMES, which is capable of predicting weld pool shape and the temperature history in 3-D single-pass welds. However, the model does not account for multipass welds, microstructural evolution, distortion and residual stresses. Additionally, the model requires large resources of computing time, which limits its use for practical applications. To overcome this, CTC and LMES have developed through this CRADA the comprehensive welding simulation model described above.
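
    For orientation only (not the CRADA model): the classical Rosenthal thick-plate solution for a moving point heat source is the simplest analytical estimate of the welding temperature field that full 3-D simulations like the one described above go far beyond. Heat input, travel speed, and material properties below are generic steel values, assumed for illustration.

      import math

      def rosenthal_temperature(x_m, y_m, z_m, q_w=3000.0, speed_m_s=0.005,
                                k_w_mk=30.0, alpha_m2_s=8e-6, t0_c=25.0):
          """Quasi-steady temperature at (x, y, z) relative to the heat source (x along travel)."""
          r = math.sqrt(x_m**2 + y_m**2 + z_m**2)
          if r == 0:
              return float("inf")                    # solution is singular at the source itself
          return t0_c + q_w / (2.0 * math.pi * k_w_mk * r) * math.exp(-speed_m_s * (r + x_m) / (2.0 * alpha_m2_s))

      # Estimated temperature 10 mm behind the heat source on the plate surface:
      print(f"{rosenthal_temperature(-0.010, 0.0, 0.0):.0f} C")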

  7. Optimizing and developing a continuous separation system for the wet process separation of aluminum and polyethylene in aseptic composite packaging waste.

    PubMed

    Yan, Dahai; Peng, Zheng; Liu, Yuqiang; Li, Li; Huang, Qifei; Xie, Minghui; Wang, Qi

    2015-01-01

    The consumption of milk in China is increasing as living standards rapidly improve, and huge amounts of aseptic composite milk packaging waste are being generated. Aseptic composite packaging is composed of paper, polyethylene, and aluminum. It is difficult to separate the polyethylene and aluminum, so most of the waste is currently sent to landfill or incinerated with other municipal solid waste, meaning that enormous amounts of resources are wasted. A wet process technique for separating the aluminum and polyethylene from the composite materials after the paper had been removed from the original packaging waste was studied. The separation efficiency achieved using different separation reagents was compared, different separation mechanisms were explored, and the impacts of a range of parameters, such as the reagent concentration, temperature, and liquid-solid ratio, on the separation time and aluminum loss ratio were studied. Methanoic acid was found to be the optimal separation reagent, and the suitable conditions were a reagent concentration of 2-4 mol/L, a temperature of 60-80°C, and a liquid-solid ratio of 30 L/kg. These conditions allowed aluminum and polyethylene to be separated in less than 30 min, with an aluminum loss ratio of less than 3%. A mass balance was produced for the aluminum-polyethylene separation system, and a control technique was developed to keep the ion concentrations in the reaction system stable. This allowed a continuous industrial-scale process for separating aluminum and polyethylene to be developed, and a demonstration facility with a capacity of 50 t/d was built. The demonstration facility gave polyethylene and aluminum recovery rates of more than 98% and more than 72%, respectively. Separating 1 t of aluminum-polyethylene composite packaging material gave a profit of 1769 Yuan, meaning that an effective method for recycling aseptic composite packaging waste was achieved. Copyright © 2014 Elsevier Ltd. All rights reserved.
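
    A simple mass-balance sketch for the separation step, using the recovery rates reported above (>98% polyethylene, >72% aluminum). The 75/25 PE/Al split of the paper-free composite is an assumed illustrative composition, not a figure from the study.

      def separation_yield(feed_kg, pe_fraction=0.75, pe_recovery=0.98, al_recovery=0.72):
          pe_in = feed_kg * pe_fraction
          al_in = feed_kg * (1.0 - pe_fraction)
          pe_out = pe_in * pe_recovery
          al_out = al_in * al_recovery
          residue = feed_kg - pe_out - al_out        # unrecovered solids reporting to waste
          return pe_out, al_out, residue

      pe, al, residue = separation_yield(1000.0)     # one tonne of paper-free composite (assumed feed)
      print(f"PE recovered: {pe:.0f} kg, Al recovered: {al:.0f} kg, residue: {residue:.0f} kg")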

  8. Development of the Selective Hydrophobic Coagulation process

    SciTech Connect

    Yoon, R.H.; Luttrell, G.H.

    1992-01-01

    A novel technique for selectively coagulating and separating coal from dispersed mineral matter has been developed at Virginia Tech. The process, Selective Hydrophobic Coagulation (SHC), has been studied since 1986 under the sponsorship of the US Department of Energy (Contracts AC22-86PC91221 and AC22-90PC90174). The SHC process differs from oil agglomeration, shear or polymer flocculation, and electrolytic coagulation processes in that it does not require reagents or additives to induce the formation of coagula. In most cases, simple pH control is all that is required to (1) induce the coagulation of coal particles and (2) effectively disperse particles of mineral matter. If the coal is oxidized, a small dosage of reagents can be used to enhance the process. During the quarter, the Anutech Mark IV surface force apparatus was used to generate surface force-distance data for the mica/dodecylamine hydrochloride system (Task 2.1.1). Work to characterize the hydrophobicity of this system and the mica/DDOA

  9. Chemical production processes and systems

    DOEpatents

    Holladay, Johnathan E.; Muzatko, Danielle S.; White, James F.; Zacher, Alan H.

    2014-06-17

    Hydrogenolysis systems are provided that can include a reactor housing an Ru-comprising hydrogenolysis catalyst and wherein the contents of the reactor are maintained at a neutral or acidic pH. Reactant reservoirs within the system can include a polyhydric alcohol compound and a base, wherein a weight ratio of the base to the compound is less than 0.05. Systems also include the product reservoir comprising a hydrogenolyzed polyhydric alcohol compound and salts of organic acids, and wherein the moles of base are substantially equivalent to the moles of salts of organic acids. Processes are provided that can include an Ru-comprising catalyst within a mixture having a neutral or acidic pH. A weight ratio of the base to the compound can be between 0.01 and 0.05 during exposure.

  10. Chemical production processes and systems

    DOEpatents

    Holladay, Johnathan E; Muzatko, Danielle S; White, James F; Zacher, Alan H

    2015-04-21

    Hydrogenolysis systems are provided that can include a reactor housing an Ru-comprising hydrogenolysis catalyst and wherein the contents of the reactor are maintained at a neutral or acidic pH. Reactant reservoirs within the system can include a polyhydric alcohol compound and a base, wherein a weight ratio of the base to the compound is less than 0.05. Systems also include the product reservoir comprising a hydrogenolyzed polyhydric alcohol compound and salts of organic acids, and wherein the moles of base are substantially equivalent to the moles of salts of organic acids. Processes are provided that can include an Ru-comprising catalyst within a mixture having a neutral or acidic pH. A weight ratio of the base to the compound can be between 0.01 and 0.05 during exposure.

  11. NDMAS System and Process Description

    SciTech Connect

    Larry Hull

    2012-10-01

    Experimental data generated by the Very High Temperature Reactor Program need to be more available to users in the form of data tables on Web pages that can be downloaded to Excel or in delimited text formats that can be used directly for input to analysis and simulation codes, statistical packages, and graphics software. One solution that can provide current and future researchers with direct access to the data they need, while complying with records management requirements, is the Nuclear Data Management and Analysis System (NDMAS). This report describes the NDMAS system and its components, defines roles and responsibilities, describes the functions the system performs, describes the internal processes the NDMAS team uses to carry out the mission, and describes the hardware and software used to meet Very High Temperature Reactor Program needs.
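
    A hedged sketch of the downstream use the report describes: pulling a delimited-text export into an analysis tool. The file name, separator, and column names are hypothetical, not NDMAS schema.

      import pandas as pd

      def load_ndmas_export(path):
          """Load a tab-delimited NDMAS-style export and return qualified records only."""
          df = pd.read_csv(path, sep="\t", parse_dates=["timestamp"])     # column names assumed
          return df[df["qual_flag"] == "VALID"]                           # keep rows that passed qualification

      # Example usage (hypothetical file):
      # data = load_ndmas_export("agr1_thermocouples.txt")
      # print(data.groupby("sensor_id")["value"].describe())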

  12. Development of lysozyme-combined antibacterial system to reduce sulfur dioxide and to stabilize Italian Riesling ice wine during aging process

    PubMed Central

    Chen, Kai; Han, Shun-yu; Zhang, Bo; Li, Min; Sheng, Wen-jun

    2015-01-01

    For the purpose of SO2 reduction and stabilizing ice wine, a new antibacterial technique was developed and verified in order to reduce the content of sulfur dioxide (SO2) and simultaneously maintain protein stability during the ice wine aging process. Hazardous bacterial strain (lactic acid bacteria, LAB) and protein stability of Italian Riesling ice wine were evaluated in terms of different amounts of lysozyme, SO2, polyphenols, and wine pH by single-factor experiments. Subsequently, a quadratic rotation-orthogonal composite design with four variables was conducted to establish the multiple linear regression model that demonstrated the influence of different treatments on the synthesis score between LAB inhibition and protein stability of ice wine. The results showed that the synthesis score can be influenced by lysozyme and SO2 concentrations at an extremely significant level (P < 0.01). Furthermore, the lysozyme-combined antibacterial system, which is specially designed for ice wine aging, was optimized step by step by response surface methodology and ridge analysis. As a result, the optimal proportions should be controlled in ice wine as follows: 179.31 mg L−1 lysozyme, 177.14 mg L−1 SO2, 0.60 g L−1 polyphenols, and 4.01 ice wine pH. Based on this system, the normalized synthesis score between LAB inhibition and protein stability can reach the highest point of 0.920. Finally, verification and comparison experiments indicated that the lysozyme-combined antibacterial system, a practical and prospective method to reduce SO2 concentration and effectively prevent contamination from hazardous LAB, can be used to stabilize ice wine during the aging process. PMID:26405531

  13. Development of lysozyme-combined antibacterial system to reduce sulfur dioxide and to stabilize Italian Riesling ice wine during aging process.

    PubMed

    Chen, Kai; Han, Shun-Yu; Zhang, Bo; Li, Min; Sheng, Wen-Jun

    2015-09-01

    For the purpose of SO2 reduction and stabilizing ice wine, a new antibacterial technique was developed and verified in order to reduce the content of sulfur dioxide (SO2) and simultaneously maintain protein stability during the ice wine aging process. Hazardous bacterial strain (lactic acid bacteria, LAB) and protein stability of Italian Riesling ice wine were evaluated in terms of different amounts of lysozyme, SO2, polyphenols, and wine pH by single-factor experiments. Subsequently, a quadratic rotation-orthogonal composite design with four variables was conducted to establish the multiple linear regression model that demonstrated the influence of different treatments on the synthesis score between LAB inhibition and protein stability of ice wine. The results showed that the synthesis score can be influenced by lysozyme and SO2 concentrations at an extremely significant level (P < 0.01). Furthermore, the lysozyme-combined antibacterial system, which is specially designed for ice wine aging, was optimized step by step by response surface methodology and ridge analysis. As a result, the optimal proportions should be controlled in ice wine as follows: 179.31 mg L(-1) lysozyme, 177.14 mg L(-1) SO2, 0.60 g L(-1) polyphenols, and 4.01 ice wine pH. Based on this system, the normalized synthesis score between LAB inhibition and protein stability can reach the highest point of 0.920. Finally, verification and comparison experiments indicated that the lysozyme-combined antibacterial system, a practical and prospective method to reduce SO2 concentration and effectively prevent contamination from hazardous LAB, can be used to stabilize ice wine during the aging process.
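
    A minimal sketch of fitting a second-order response surface, in the spirit of the quadratic composite design described above. For brevity only the two significant factors (lysozyme and SO2) are included, and the design points and scores are synthetic, not the study's data.

      import numpy as np

      def quadratic_design_matrix(x1, x2):
          """Columns: 1, x1, x2, x1*x2, x1^2, x2^2."""
          return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

      rng = np.random.default_rng(1)
      lysozyme = rng.uniform(100, 250, 30)          # mg/L, synthetic design points
      so2      = rng.uniform(100, 250, 30)          # mg/L
      true_surface = 0.9 - 2e-6 * (lysozyme - 180)**2 - 2e-6 * (so2 - 177)**2
      score = true_surface + rng.normal(0, 0.005, 30)   # synthetic "synthesis score" response

      X = quadratic_design_matrix(lysozyme, so2)
      coeffs, *_ = np.linalg.lstsq(X, score, rcond=None)

      # Stationary point of the fitted surface (candidate optimum):
      b1, b2, b12, b11, b22 = coeffs[1], coeffs[2], coeffs[3], coeffs[4], coeffs[5]
      A = np.array([[2 * b11, b12], [b12, 2 * b22]])
      opt = np.linalg.solve(A, -np.array([b1, b2]))
      print(f"fitted optimum: lysozyme ~ {opt[0]:.0f} mg/L, SO2 ~ {opt[1]:.0f} mg/L")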

  14. Moral Development: The Process and the Pattern.

    ERIC Educational Resources Information Center

    Stonehouse, Cathy

    1979-01-01

    Examines contributions made by both Piaget and Kohlberg to understanding the process and pattern of moral development. Kohlberg's theory is based on Piaget's study of cognitive development that provided an understanding of the development process and the factors that cause the development of moral reasoning. Some implications for teachers are…

  15. Requirement Development Process and Tools

    NASA Technical Reports Server (NTRS)

    Bayt, Robert

    2017-01-01

Requirements capture the system-level capabilities in a set of complete, necessary, clear, attainable, traceable, and verifiable statements of need. Requirements should not be unduly restrictive, but should set limits that eliminate items outside the boundaries drawn, encourage competition (or alternatives), and capture the source of and rationale for each requirement. If it is not needed by the customer, it is not a requirement. Requirements also establish the verification methods that will lead to product acceptance; these must be reproducible assessment methods.

  16. Process Developed for Forming Urethane Ice Models

    NASA Technical Reports Server (NTRS)

    Vannuyen, Thomas

    1998-01-01

    A new process for forming ice shapes on an aircraft wing was developed at the NASA Lewis Research Center. The innovative concept was formed by Lewis' Icing Research Tunnel (IRT) team, and the hardware was manufactured by Lewis' Manufacturing Engineering Division. This work was completed to increase our understanding of the stability and control of aircraft during icing conditions. This project will also enhance our evaluation of true aerodynamic wind tunnel effects on aircraft. In addition, it can be used as a design tool for evaluating ice protection systems.

  17. Traffic camera system development

    NASA Astrophysics Data System (ADS)

    Hori, Toshi

    1997-04-01

The intelligent transportation system has generated a strong need for the development of intelligent camera systems to meet the requirements of sophisticated applications such as electronic toll collection (ETC), traffic violation detection, and automatic parking lot control. In order to achieve the highest levels of detection accuracy, these cameras must have high-speed electronic shutters, high resolution, high frame rate, and communication capabilities. A progressive scan interline transfer CCD camera, with its high-speed electronic shutter and resolution capabilities, provides the basic functions to meet the requirements of a traffic camera system. Unlike most industrial video imaging applications, traffic cameras must deal with harsh environmental conditions and an extremely wide range of light. Optical character recognition is a critical function of a modern traffic camera system, with detection and accuracy heavily dependent on camera performance. In order to operate under demanding conditions, communication and functional optimization are implemented to control cameras from a roadside computer. The camera operates with a shutter speed faster than 1/2000 sec to capture highway traffic both day and night. Consequently, camera gain, pedestal level, shutter speed, and gamma functions are controlled by a look-up table containing various parameters based on environmental conditions, particularly lighting. Lighting conditions are studied carefully to focus only on the critical license plate surface. A unique light sensor permits accurate reading under a variety of conditions, such as a sunny day, evening, twilight, and storms. These camera systems are being deployed successfully in major ETC projects throughout the world.
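
    The look-up-table control described above can be pictured with a short sketch. This is a hypothetical illustration only: the bin edges, parameter names, and values below are invented and do not come from the deployed system.

```python
# Hypothetical look-up table mapping a measured light level to camera settings.
# Bins and values are illustrative only; the real system's table is not public.
from bisect import bisect_right

# Upper edges of ambient-light bins (arbitrary units from the roadside light sensor).
LIGHT_BIN_EDGES = [50, 200, 1000, 5000]
# One settings row per bin: night, twilight, overcast, daylight, bright sun.
SETTINGS = [
    {"gain_db": 18, "pedestal": 40, "shutter": "1/2000",  "gamma": 0.45},
    {"gain_db": 12, "pedestal": 30, "shutter": "1/2000",  "gamma": 0.45},
    {"gain_db": 6,  "pedestal": 20, "shutter": "1/4000",  "gamma": 0.60},
    {"gain_db": 0,  "pedestal": 10, "shutter": "1/8000",  "gamma": 0.80},
    {"gain_db": 0,  "pedestal": 10, "shutter": "1/10000", "gamma": 1.00},
]

def camera_settings(light_level: float) -> dict:
    """Return the settings row for the measured light level."""
    return SETTINGS[bisect_right(LIGHT_BIN_EDGES, light_level)]

if __name__ == "__main__":
    for level in (10, 300, 20000):
        print(level, camera_settings(level))
```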

  18. Materials Processing Research and Development

    DTIC Science & Technology

    2010-08-01

(Fragmentary abstract excerpt.) Recoverable topics include modeling of microstructural evolution; development of gamma and beta-gamma titanium alloys toward rolled sheets for thermal protection applications; and work on hydrostatic stress published in Metallurgical and Materials Transactions A by Nicolaou, Miller, and Semiatin, with comparisons to values for Ti-6242S measured by Porter and John and for Ti-6-4 reported by Chan in Mater. Trans., 2008.

  19. Materials Processing Research and Development

    DTIC Science & Technology

    2001-11-01

(Fragmentary abstract excerpt.) Recoverable topics include the possibility of developing a surface-pressure sensor for forging dies using an ultrasonic probe, with supporting work conducted by UES (report authors Douglas R. Barker and Robert L. Goetz), and a report section titled "A Criterion for Intergranular Fracture During Hot Working of a Near Gamma ...".

  20. Deficiency tracking system, conceptual business process requirements

    SciTech Connect

    Hermanson, M.L.

    1997-04-18

The purpose of this document is to describe the conceptual business process requirements of a single, site-wide, consolidated, automated deficiency management tracking, trending, and reporting system. This description will be used as the basis for determining the automated system acquisition strategy, including the further definition of specific requirements, a "make or buy" determination, and the development of specific software design details.

  1. FLIPS: Friendly Lisp Image Processing System

    NASA Astrophysics Data System (ADS)

    Gee, Shirley J.

    1991-08-01

    The Friendly Lisp Image Processing System (FLIPS) is the interface to Advanced Target Detection (ATD), a multi-resolutional image analysis system developed by Hughes in conjunction with the Hughes Research Laboratories. Both menu- and graphics-driven, FLIPS enhances system usability by supporting the interactive nature of research and development. Although much progress has been made, fully automated image understanding technology that is both robust and reliable is not a reality. In situations where highly accurate results are required, skilled human analysts must still verify the findings of these systems. Furthermore, the systems often require processing times several orders of magnitude greater than that needed by veteran personnel to analyze the same image. The purpose of FLIPS is to facilitate the ability of an image analyst to take statistical measurements on digital imagery in a timely fashion, a capability critical in research environments where a large percentage of time is expended in algorithm development. In many cases, this entails minor modifications or code tinkering. Without a well-developed man-machine interface, throughput is unduly constricted. FLIPS provides mechanisms which support rapid prototyping for ATD. This paper examines the ATD/FLIPS system. The philosophy of ATD in addressing image understanding problems is described, and the capabilities of FLIPS are discussed, along with a description of the interaction between ATD and FLIPS. Finally, an overview of current plans for the system is outlined.

  2. Propellant injection systems and processes

    NASA Technical Reports Server (NTRS)

    Ito, Jackson I.

    1995-01-01

    The previous 'Art of Injector Design' is maturing and merging with the more systematic 'Science of Combustion Device Analysis.' This technology can be based upon observation, correlation, experimentation and ultimately analytical modeling based upon basic engineering principles. This methodology is more systematic and far superior to the historical injector design process of 'Trial and Error' or blindly 'Copying Past Successes.' The benefit of such an approach is to be able to rank candidate design concepts for relative probability of success or technical risk in all the important combustion device design requirements and combustion process development risk categories before committing to an engine development program. Even if a single analytical design concept cannot be developed to predict satisfying all requirements simultaneously, a series of risk mitigation key enabling technologies can be identified for early resolution. Lower cost subscale or laboratory experimentation to demonstrate proof of principle, critical instrumentation requirements, and design discriminating test plans can be developed based on the physical insight provided by these analyses.

  3. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
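
    The idea of a process sensitivity index that spans both competing process models and their parameters can be sketched with a toy Monte Carlo calculation. The code below is a simplified illustration, not the estimator of Dai et al.: the model forms, weights, priors, and the stand-in system model are all invented, and the index is approximated by simple binning.

```python
# Toy Monte Carlo sketch of a variance-based process sensitivity index that mixes
# competing process models with their parameters. Simplified illustration only;
# model forms, weights, and priors are invented.
import numpy as np

rng = np.random.default_rng(1)
N = 20000

def sample_recharge(n):
    """Recharge process: two competing models (equal weight), each with its own parameters."""
    model = rng.integers(0, 2, n)
    return np.where(model == 0,
                    rng.normal(200.0, 20.0, n),   # model A: linear precip-to-recharge
                    rng.lognormal(5.2, 0.3, n))   # model B: nonlinear conversion

def sample_conductivity(n):
    """Geology process: two competing hydraulic-conductivity models."""
    model = rng.integers(0, 2, n)
    return np.where(model == 0,
                    rng.lognormal(0.0, 0.5, n),
                    rng.lognormal(0.5, 0.8, n))

def system_output(recharge, conductivity):
    """Stand-in system model, e.g., a head or concentration metric."""
    return recharge / (1.0 + conductivity)

# First-order ("process") index for recharge, Sobol-style:
# S_recharge ~ Var_R( E[Y | R] ) / Var(Y), estimated by binning on the recharge sample.
R, K = sample_recharge(N), sample_conductivity(N)
Y = system_output(R, K)

bins = np.quantile(R, np.linspace(0, 1, 51))
idx = np.clip(np.digitize(R, bins) - 1, 0, 49)
cond_mean = np.array([Y[idx == b].mean() for b in range(50)])
S_recharge = cond_mean.var() / Y.var()
print(f"approximate process sensitivity index for recharge: {S_recharge:.2f}")
```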

  4. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    NASA Astrophysics Data System (ADS)

    Dai, Heng; Ye, Ming; Walker, Anthony P.; Chen, Xingyuan

    2017-04-01

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. For demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  5. Pyro-processing Technology Development in Japan

    SciTech Connect

    Inoue, Tadashi; Koyama, Tadafumi; Myochin, Munetaka; Arai, Yasuo

    2007-07-01

The metal fuel cycle with pyro-processing technology has potential advantages distinct from the oxide fuel cycle with aqueous processing. In addition to the advantages of the metal fuel fast reactor, such as achieving a high breeding ratio over 1.3, pyro-processing with metal electrorefining requires no additional process to separate minor actinides and uses no organic solvent that degrades under radiation and acid. The 'Feasibility Study on Commercialized Fast Reactor (FR) Cycle Systems' in Japan selected the metal fuel fast reactor fuel cycle with metal electrorefining as the sub-system for future development. CRIEPI has been involved in R and D of pyro-processing technology with metal electrorefining since the 1980s, followed by JAERI, which aims to apply the technology to the treatment of spent nitride fuel targets for ADS, and by JNC, since merged into JAEA; a wider collaboration then started among CRIEPI/JAERI/JNC. Serial process verification starting with MOX pellets has produced U-Pu alloy, after reduction, electrorefining, and distillation, at the facility installed at Tokai, JAEA. Metal fuel fabrication has started from the stage of U-Pu alloy production from a UO2 and PuO2 mixture by electrochemical reduction, and has now succeeded in producing a 30 cm U-Pu-Zr fuel slug by injection casting at Oarai, JAEA. The alloys are scheduled to be irradiated in the JOYO fast reactor core. The development of engineering models of the electrorefiner and the electrochemical reduction device has been successfully conducted using UO2 at kg scale. In addition to domestic R and D, pyro-processing verification with genuine material is proceeding in the joint study of CRIEPI/ITU. TRU is extracted into cadmium from chloride prepared from HLLW through denitration by reductive extraction in a caisson installed in a hot cell, and electrorefining using PHENIX-irradiated metal fuel with minor actinides is then scheduled. Thus, the R and D on pyro-processing

  6. Dynamic security assessment processing system

    NASA Astrophysics Data System (ADS)

    Tang, Lei

The architecture of a dynamic security assessment processing system (DSAPS) is proposed to address online dynamic security assessment (DSA), with the focus of the dissertation on low-probability, high-consequence events. DSAPS upgrades current online DSA functions and adds new functions to fit into the modern power grid. Trajectory sensitivity analysis is introduced and its applications in power systems are reviewed. An index is presented to assess transient voltage dips quantitatively using trajectory sensitivities. Then the framework of an anticipatory computing system (ACS) for cascading defense is presented as an important function of DSAPS. ACS addresses various security problems and the uncertainties in cascading outages. Corrective control design is automated to mitigate system stress in cascading progressions. The corrective controls introduced in the dissertation include corrective security-constrained optimal power flow, a two-stage load control for severe under-frequency conditions, and transient-stability-constrained optimal power flow for cascading outages. With state-of-the-art computing facilities to perform high-speed, extended-term time-domain simulation and optimization for large-scale systems, DSAPS/ACS efficiently addresses online DSA for low-probability, high-consequence events, which are not addressed by today's industrial practice. Human intervention in the computationally burdensome analysis is reduced.
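
    The quantitative assessment of transient voltage dips can be illustrated, in a much simplified form, by a severity index computed from a simulated bus-voltage trajectory. The depth-below-threshold integral below is a generic choice for illustration only; the dissertation's index is built on trajectory sensitivities, which are not reproduced here, and the threshold and trajectory are invented.

```python
# Simplified transient voltage-dip severity index from a simulated bus-voltage trajectory.
# Generic illustration only; the dissertation's trajectory-sensitivity-based index differs.
import numpy as np

def dip_severity(t, v, v_min=0.8):
    """Integrate how far and how long the voltage stays below v_min (per-unit)."""
    deficit = np.clip(v_min - v, 0.0, None)
    # Trapezoidal integration of the deficit over time.
    return float(np.sum(0.5 * (deficit[1:] + deficit[:-1]) * np.diff(t)))

# Synthetic post-fault trajectory: dip to about 0.7 pu, recovering over roughly one second.
t = np.linspace(0.0, 5.0, 1001)
v = 1.0 - 0.3 * np.exp(-2.0 * t)
print(f"dip severity index: {dip_severity(t, v):.4f} pu*s")
```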

  7. A systems process of reinforcement.

    PubMed

    Sudakov, K V

    1997-01-01

Functional systems theory was used to consider the process of reinforcement, i.e., the action on the body of reinforcing factors, the results of behavior that satisfy the body's original needs. The systems process of reinforcement includes reverse afferentation entering the CNS from receptors acted upon by various parameters of the desired results, mechanisms for comparing reverse afferentation with the apparatus that accepts the results of the action, and the corresponding emotional component. A tight interaction between reinforcement and the dominant motivation is generated on the basis of the hologram principle. Reinforcement forms an apparatus for predicting a desired result, i.e., a result-of-action acceptor. Reinforcement produces significant changes in the activities of individual neurons in the various brain structures involved in dominant motivation, transforming their spike activity from a burst pattern to regular discharges; there are also molecular changes in neuron properties. After preliminary reinforcement, the corresponding motivation induces the ribosomal system of neurons to start synthesizing special effector molecules, which organize molecular engrams of the acceptor of the action's result. Sensory mechanisms of reinforcement are considered, with particular reference to the informational role of emotions.

  8. Performance Monitoring of Distributed Data Processing Systems

    NASA Technical Reports Server (NTRS)

    Ojha, Anand K.

    2000-01-01

Test and checkout systems are essential components in ensuring safety and reliability of aircraft and related systems for space missions. A variety of systems, developed over several years, are in use at NASA/KSC. Many of these systems are configured as distributed data processing systems with the functionality spread over several multiprocessor nodes interconnected through networks. To be cost-effective, a system should take the least amount of resources and perform a given testing task in the least amount of time. There are two aspects of performance evaluation: monitoring and benchmarking. While monitoring is valuable to system administrators in operating and maintaining a system, benchmarking is important in designing and upgrading computer-based systems. These two aspects of performance evaluation are the foci of this project. This paper first discusses various issues related to software, hardware, and hybrid performance monitoring as applicable to distributed systems, and specifically to the TCMS (Test Control and Monitoring System). Next, a comparison of several probing instructions is made to show that the hybrid monitoring technique developed by NIST (the National Institute of Standards and Technology) is the least intrusive and takes only one-fourth of the time taken by software monitoring probes. In the rest of the paper, issues related to benchmarking a distributed system are discussed, and finally a prescription for developing a micro-benchmark for the TCMS is provided.
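
    The intrusiveness of software monitoring probes, which motivates the hybrid approach mentioned above, can be demonstrated with a tiny sketch: a timing probe wrapped around a function and a rough measurement of the overhead it adds. The probe, workload, and measurement loop are invented for illustration and are unrelated to the TCMS or NIST instrumentation.

```python
# Minimal software-monitoring probe: a decorator that timestamps calls.
# Illustrates why software probes are intrusive (each event costs extra work),
# which is the overhead that hybrid hardware/software monitors largely avoid.
import time
from functools import wraps

TRACE = []  # collected (event, timestamp) records

def probe(fn):
    @wraps(fn)
    def wrapper(*args, **kwargs):
        TRACE.append((fn.__name__ + ":enter", time.perf_counter_ns()))
        try:
            return fn(*args, **kwargs)
        finally:
            TRACE.append((fn.__name__ + ":exit", time.perf_counter_ns()))
    return wrapper

def work(n):
    return sum(i * i for i in range(n))

instrumented = probe(work)

def timeit(fn, *args, reps=200):
    start = time.perf_counter()
    for _ in range(reps):
        fn(*args)
    return (time.perf_counter() - start) / reps

bare = timeit(work, 10_000)
inst = timeit(instrumented, 10_000)
print(f"bare: {bare*1e6:.1f} us, instrumented: {inst*1e6:.1f} us, "
      f"overhead: {(inst/bare - 1)*100:.1f}%")
```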

  9. A fuzzy classifier system for process control

    NASA Technical Reports Server (NTRS)

    Karr, C. L.; Phillips, J. C.

    1994-01-01

A fuzzy classifier system that discovers rules for controlling a mathematical model of a pH titration system was developed by researchers at the U.S. Bureau of Mines (USBM). Fuzzy classifier systems successfully combine the strengths of learning classifier systems and fuzzy logic controllers. Learning classifier systems resemble familiar production rule-based systems, but they represent their IF-THEN rules by strings of characters rather than in the traditional linguistic terms. Fuzzy logic is a tool that allows for the incorporation of abstract concepts into rule-based systems, thereby allowing the rules to resemble the familiar 'rules of thumb' commonly used by humans when solving difficult process control and reasoning problems. Like learning classifier systems, fuzzy classifier systems employ a genetic algorithm to explore and sample new rules for manipulating the problem environment. Like fuzzy logic controllers, fuzzy classifier systems encapsulate knowledge in the form of production rules. The results presented in this paper demonstrate the ability of fuzzy classifier systems to generate a fuzzy logic-based process control system.
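
    The combination of string-encoded rules, fuzzy membership, and genetic-algorithm operators can be sketched in a few lines. The encoding, membership functions, and mutation operator below are invented for illustration; the USBM system's actual representation and pH model are not reproduced.

```python
# Toy sketch of a fuzzy classifier system rule: the rule is a character string
# (as in learning classifier systems), its condition/action refer to fuzzy sets,
# and a GA-style mutation perturbs the string. All details are illustrative.
import random

# Fuzzy sets for an error signal: Negative, Zero, Positive (triangular memberships).
def membership(label, x):
    tri = {"N": (-2.0, -1.0, 0.0), "Z": (-1.0, 0.0, 1.0), "P": (0.0, 1.0, 2.0)}
    a, b, c = tri[label]
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# A rule string "NZ:P" reads: IF error is N AND d(error)/dt is Z THEN valve is P.
def fire(rule, error, d_error):
    cond, action = rule.split(":")
    strength = min(membership(cond[0], error), membership(cond[1], d_error))
    return action, strength

def mutate(rule, rate=0.2):
    """GA-style mutation: randomly replace condition/action symbols."""
    symbols = list(rule)
    for i, ch in enumerate(symbols):
        if ch != ":" and random.random() < rate:
            symbols[i] = random.choice("NZP")
    return "".join(symbols)

rule = "NZ:P"
print(fire(rule, -0.4, 0.1))   # (action, firing strength)
print(mutate(rule))
```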

  10. Dissection of Bacterial Wilt on Medicago truncatula Revealed Two Type III Secretion System Effectors Acting on Root Infection Process and Disease Development

    PubMed Central

    Turner, Marie; Jauneau, Alain; Genin, Stéphane; Tavella, Marie-José; Vailleau, Fabienne; Gentzbittel, Laurent; Jardinaud, Marie-Françoise

    2009-01-01

    Ralstonia solanacearum is the causal agent of the devastating bacterial wilt disease, which colonizes susceptible Medicago truncatula via the intact root tip. Infection involves four steps: appearance of root tip symptoms, root tip cortical cell invasion, vessel colonization, and foliar wilting. We examined this pathosystem by in vitro inoculation of intact roots of susceptible or resistant M. truncatula with the pathogenic strain GMI1000. The infection process was type III secretion system dependent and required two type III effectors, Gala7 and AvrA, which were shown to be involved at different stages of infection. Both effectors were involved in development of root tip symptoms, and Gala7 was the main determinant for bacterial invasion of cortical cells. Vessel invasion depended on the host genetic background and was never observed in the resistant line. The invasion of the root tip vasculature in the susceptible line caused foliar wilting. The avrA mutant showed reduced aggressiveness in all steps of the infection process, suggesting a global role in R. solanacearum pathogenicity. The roles of these two effectors in subsequent stages were studied using an assay that bypassed the penetration step; with this assay, the avrA mutant showed no effect compared with the GMI1000 strain, indicating that AvrA is important in early stages of infection. However, later disease symptoms were reduced in the gala7 mutant, indicating a key role in later stages of infection. PMID:19493968

  11. Development of a prototype chest digital tomosynthesis (CDT) R/F system with fast image reconstruction using graphics processing unit (GPU) programming

    NASA Astrophysics Data System (ADS)

    Choi, Sunghoon; Lee, Seungwan; Lee, Haenghwa; Lee, Donghoon; Choi, Seungyeon; Shin, Jungwook; Seo, Chang-Woo; Kim, Hee-Joung

    2017-03-01

    Digital tomosynthesis offers the advantage of low radiation doses compared to conventional computed tomography (CT) by utilizing small numbers of projections ( 80) acquired over a limited angular range. It produces 3D volumetric data, although there are artifacts due to incomplete sampling. Based upon these characteristics, we developed a prototype digital tomosynthesis R/F system for applications in chest imaging. Our prototype chest digital tomosynthesis (CDT) R/F system contains an X-ray tube with high power R/F pulse generator, flat-panel detector, R/F table, electromechanical radiographic subsystems including a precise motor controller, and a reconstruction server. For image reconstruction, users select between analytic and iterative reconstruction methods. Our reconstructed images of Catphan700 and LUNGMAN phantoms clearly and rapidly described the internal structures of phantoms using graphics processing unit (GPU) programming. Contrast-to-noise ratio (CNR) values of the CTP682 module of Catphan700 were higher in images using a simultaneous algebraic reconstruction technique (SART) than in those using filtered back-projection (FBP) for all materials by factors of 2.60, 3.78, 5.50, 2.30, 3.70, and 2.52 for air, lung foam, low density polyethylene (LDPE), Delrin® (acetal homopolymer resin), bone 50% (hydroxyapatite), and Teflon, respectively. Total elapsed times for producing 3D volume were 2.92 s and 86.29 s on average for FBP and SART (20 iterations), respectively. The times required for reconstruction were clinically feasible. Moreover, the total radiation dose from our system (5.68 mGy) was lower than that of conventional chest CT scan. Consequently, our prototype tomosynthesis R/F system represents an important advance in digital tomosynthesis applications.
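
    The contrast-to-noise ratio comparison above can be made concrete with the usual region-of-interest definition, CNR = |mean_roi - mean_background| / std_background. The snippet below uses synthetic pixel values only, and the paper's exact ROI definition may differ.

```python
# Contrast-to-noise ratio (CNR) from region-of-interest statistics.
# Synthetic values only; illustrates why lower background noise (as in SART
# reconstructions) raises CNR for the same contrast.
import numpy as np

def cnr(roi: np.ndarray, background: np.ndarray) -> float:
    return abs(roi.mean() - background.mean()) / background.std(ddof=1)

rng = np.random.default_rng(2)
# Simulated reconstructions of a bright insert: SART suppresses noise more than FBP.
background_fbp = rng.normal(100.0, 12.0, 5000)
background_sart = rng.normal(100.0, 5.0, 5000)
insert_fbp = rng.normal(160.0, 12.0, 500)
insert_sart = rng.normal(160.0, 5.0, 500)

print(f"CNR (FBP):  {cnr(insert_fbp, background_fbp):.1f}")
print(f"CNR (SART): {cnr(insert_sart, background_sart):.1f}")
print(f"ratio SART/FBP: {cnr(insert_sart, background_sart)/cnr(insert_fbp, background_fbp):.2f}")
```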

  12. Managing the Software Development Process

    NASA Technical Reports Server (NTRS)

    Lubelczky, Jeffrey T.; Parra, Amy

    1999-01-01

The goal of any software development project is to produce a product that is delivered on time, within the allocated budget, and with the capabilities expected by the customer; unfortunately, this goal is rarely achieved. However, a properly managed project in a mature software engineering environment can consistently achieve this goal. In this paper we provide an introduction to three project success factors: a properly managed project, a competent project manager, and a mature software engineering environment. We also present an overview of the benefits of a mature software engineering environment, based on 24 years of data from the Software Engineering Lab, and suggest some first steps that an organization can take to begin benefiting from this environment. Because the depth and breadth of software engineering exceed the scope of this paper, various references are cited with the goal of raising awareness and encouraging further investigation into software engineering and project management practices.

  13. Managing the Software Development Process

    NASA Astrophysics Data System (ADS)

    Lubelczyk, J.; Parra, A.

The goal of any software development project is to produce a product that is delivered on time, within the allocated budget, and with the capabilities expected by the customer; unfortunately, this goal is rarely achieved. However, a properly managed project in a mature software engineering environment can consistently achieve this goal. In this paper we provide an introduction to three project success factors: a properly managed project, a competent project manager, and a mature software engineering environment. We also present an overview of the benefits of a mature software engineering environment, based on 24 years of data from the Software Engineering Lab, and suggest some first steps that an organization can take to begin benefiting from this environment. Because the depth and breadth of software engineering exceed the scope of this paper, various references are cited with the goal of raising awareness and encouraging further investigation into software engineering and project management practices.

  14. Telemedicine optoelectronic biomedical data processing system

    NASA Astrophysics Data System (ADS)

    Prosolovska, Vita V.

    2010-08-01

The telemedicine optoelectronic biomedical data processing system was created to share medical information for the oversight of health rights and for timely, rapid response to crises. The system includes the following main blocks: a bioprocessor, an analog-digital converter for biomedical images, an optoelectronic module for image processing, an optoelectronic module for parallel recording and storage of biomedical images, and a matrix screen display for biomedical images. The rated temporal characteristics of the blocks are defined by the triggering optoelectronic couple in the analog-digital converters and by the imaging time of the matrix screen. The hardware of the developed matrix screen is based on integrated optoelectronic couples produced by selective epitaxy.

  15. [The systems process of reinforcement].

    PubMed

    Sudakov, K V

    1996-01-01

The process of reinforcement is considered in the context of the general theory of functional systems as an important part of the organization of behavioral acts, closely interacting with the dominant motivation. It is shown that reinforcement substantially changes the activities of individual neurons in different brain structures involved in dominant motivation. After preliminary reinforcement, under the influence of the corresponding motivation, the ribosomal apparatus of neurons begins to synthesize special molecular engrams of the action acceptor. The sensory mechanisms of reinforcement, and especially the role of emotions, are considered in detail in the paper.

  16. Development of a Sample Processing System (SPS) for the in situ search of organic compounds on Mars : application to the Mars Organic Molecule Analyzer (MOMA) experiment

    NASA Astrophysics Data System (ADS)

    Buch, A.; Sternberg, R.; Garnier, C.; Fressinet, C.; Szopa, C.; El Bekri, J.; Coll, P.; Rodier, C.; Raulin, F.; Goesmann, F.

    2008-09-01

The search for past or present signs of life is one of the primary goals of future Mars exploration missions. With this aim, the Mars Organic Molecule Analyzer (MOMA) module of the forthcoming ExoMars 2013 European space mission is designed for the in situ analysis, in the Martian soil, of organic molecules of exobiological interest such as amino acids, carboxylic acids, nucleobases, and polycyclic aromatic hydrocarbons (PAHs). In the frame of the MOMA experiment, we have been developing a Sample Processing System (SPS) compatible with gas chromatography (GC) analysis. The main goal of the SPS is to allow the extraction and gas chromatographic separation of refractory organic compounds from a solid matrix at trace level under space-compatible operating conditions. The SPS is a mini-reactor containing the solid sample (~500 mg) that is able to increase (or decrease) the internal temperature from 20 to 500 °C within 13 s. The extraction step is performed by thermodesorption, with the best extraction yield obtained at 300 °C for 10 to 20 min. It should be noted that the temperature can be increased up to 500 °C without a significant loss of efficiency if the heating run time is kept below 3 min. After thermodesorption, chemical derivatization of the extracted compounds is performed directly on the soil with a mixture of MTBSTFA and DMF [Buch et al.]. By decreasing the polarity of the target molecules, this step allows their volatilization at a temperature below 250 °C without chemical degradation. Once derivatized, the targeted volatile molecules are transferred through a heated transfer line into the gas chromatograph coupled with a mass spectrometer for detection. The SPS is a "one step/one pot" sample preparation system that should allow the MOMA experiment to detect refractory molecules adsorbed in the Martian soil at a detection limit below the ppb level. A. Buch, R. Sternberg, C. Szopa, C. Freissinet, C. Garnier, J. El Bekri

  17. Advanced Development Strategies for Biopharmaceutical Cell Culture Processes.

    PubMed

    Zalai, Denes; Golabgir, Aydin; Wechselberger, Patrick; Putics, Akos; Herwig, Christoph

    2015-01-01

    The shift from empirical to science-based process development is considered to be a key factor to increase bioprocess performance and to reduce time to market for biopharmaceutical products in the near future. In the last decade, expanding knowledge in systems biology and bioprocess technology has delivered the foundation of the scientific understanding of relationships between process input parameters and process output features. Based on this knowledge, advanced process development approaches can be applied to maximize process performance and to generate process understanding. This review focuses on tools which enable the integration of physiological knowledge into cell culture process development. As a structured approach, the availability and the proposed benefit of the application of these tools are discussed for the subsequent stages of process development. The ultimate aim is to deliver a comprehensive overview of the current role of physiological understanding during cell culture process development from clone selection to the scale-up of advanced control strategies for ensuring process robustness.

  18. Tank Waste Remediation System optimized processing strategy

    SciTech Connect

    Slaathaug, E.J.; Boldt, A.L.; Boomer, K.D.; Galbraith, J.D.; Leach, C.E.; Waldo, T.L.

    1996-03-01

    This report provides an alternative strategy evolved from the current Hanford Site Tank Waste Remediation System (TWRS) programmatic baseline for accomplishing the treatment and disposal of the Hanford Site tank wastes. This optimized processing strategy performs the major elements of the TWRS Program, but modifies the deployment of selected treatment technologies to reduce the program cost. The present program for development of waste retrieval, pretreatment, and vitrification technologies continues, but the optimized processing strategy reuses a single facility to accomplish the separations/low-activity waste (LAW) vitrification and the high-level waste (HLW) vitrification processes sequentially, thereby eliminating the need for a separate HLW vitrification facility.

  19. Development of a high-precision image-processing automatic measurement system for MRI visceral fat images acquired using a binomial RF-excitation pulse.

    PubMed

    Nakai, Ryusuke; Azuma, Takashi; Kishimoto, Taizou; Hirata, Tazuko; Takizawa, Osamu; Hyon, Suong-Hyu; Tsutsumi, Sadami

    2010-05-01

Development of a rapid and accurate method for visceral fat measurement is an important task, given the recent increase in the number of patients with metabolic syndrome. In this study, we optimized the Fast Low Angle Shot (FLASH) sequence using a binomial radiofrequency excitation pulse, for which the acquisition time is short, and measured changes in the amount of visceral fat in subjects after a period of wearing clothes with a fat-reducing effect during walking. We solved the reproducibility problem associated with the number of slices, and developed automatic measurement software for high-precision separation and extraction of abdominal visceral fat images. This software was developed using intensity correction based on the coil position, derivation of a threshold by histogram analysis, and fat separation by template matching of abdominal images. The cross-sectional area of a single slice varies with every acquisition due to visceral organ movement, but the relative error largely converged for seven slices. The measured amount of abdominal fat tended to be consistent with changes in the body fat and waist circumference of the subjects. The correlation coefficients between automatic extraction using the measurement software and manual extraction were 0.9978 for subcutaneous fat and 0.9972 for visceral fat, showing very strong positive correlations. The consistency rates were 0.9502 ± 0.0167 for subcutaneous fat and 0.9395 ± 0.0147 for visceral fat, and the shapes of the regions were also extracted very accurately. These results show that the magnetic resonance imaging acquisition method and image processing system developed in this study are beneficial for the measurement of abdominal visceral fat. Therefore, this method may have a major role in the future diagnosis of metabolic syndrome.
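
    The "threshold by histogram analysis" step can be sketched with a generic histogram-based method such as Otsu's threshold on a synthetic bimodal intensity distribution. This stands in for the idea only; the study's actual thresholding algorithm, intensity model, and coil-position correction are not reproduced.

```python
# Sketch of deriving a fat/non-fat intensity threshold by histogram analysis
# (Otsu's method on a synthetic bimodal histogram). Illustrative only.
import numpy as np

def otsu_threshold(values, nbins=256):
    hist, edges = np.histogram(values, bins=nbins)
    p = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    best_t, best_var = centers[0], -1.0
    for k in range(1, nbins):
        w0, w1 = p[:k].sum(), p[k:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (p[:k] * centers[:k]).sum() / w0
        m1 = (p[k:] * centers[k:]).sum() / w1
        between = w0 * w1 * (m0 - m1) ** 2   # between-class variance
        if between > best_var:
            best_var, best_t = between, centers[k]
    return best_t

rng = np.random.default_rng(3)
# Synthetic MR intensities: water-dominated tissue vs. bright fat signal.
tissue = rng.normal(80, 15, 20000)
fat = rng.normal(200, 20, 8000)
t = otsu_threshold(np.concatenate([tissue, fat]))
print(f"derived fat threshold: {t:.1f}")
```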

  20. Spacelab output processing system architectural study

    NASA Technical Reports Server (NTRS)

    1977-01-01

Two different system architectures are presented, derived from two different data flows within the Spacelab Output Processing System. The major difference between the architectures is the position of the decommutation function: the first architecture performs decommutation in the latter half of the system, while the second performs that function in the front end of the system. For examination purposes, the system was divided into five stand-alone subsystems: Work Assembler, Mass Storage System, Output Processor, Peripheral Pool, and Resource Monitor. The workload of each subsystem was estimated independently of the specific devices to be used. Candidate devices were surveyed from a wide sampling of off-the-shelf devices. Analytical expressions were developed to quantify the projected workload in conjunction with typical devices that would adequately handle the subsystem tasks. All of the study efforts were then directed toward preparing performance and cost curves for each architecture's subsystems.

  1. Development of novel microencapsulation processes

    NASA Astrophysics Data System (ADS)

    Yin, Weisi

    of polymer solution suspended in water or from a spray. Hollow PS particles were obtained by swelling PS latex with solvent, freezing in liquid nitrogen, and drying in vacuum. It is shown that the particle morphology is due to phase separation in the polymer emulsion droplets upon freezing in liquid nitrogen, and that morphological changes are driven largely by lowering interfacial free energy. The dried hollow particles were resuspended in a dispersing media and exposed to a plasticizer, which imparts mobility to polymer chains, to close the surface opening and form microcapsules surrounding an aqueous core. The interfacial free energy difference between the hydrophobic inside and hydrophilic outside surfaces is the major driving force for closing the hole on the surface. A controlled release biodegradable vehicle for drug was made by encapsulating procaine hydrochloride, a water-soluble drug, into the core of poly(DL-lactide) (PLA) microcapsules, which were made by the freeze-drying and subsequent closing process. The encapsulation efficiency is affected by the hollow particle morphology, amount of closing agent, exposure time, surfactant, and method of dispersing the hollow particles in water. Controlled release of procaine hydrochloride from the microcapsules into phosphate buffer was observed. The use of benign solvents dimethyl carbonate in spray/freeze-drying and CO2 for closing would eliminate concerns of residual harmful solvent in the product. The ease of separation of CO2 from the drug solution may also enable recycling of the drug solution to increase the overall encapsulation efficiency using these novel hollow particles.

  2. Power Systems Development Facility

    SciTech Connect

    2003-07-01

    This report discusses Test Campaign TC12 of the Kellogg Brown & Root, Inc. (KBR) Transport Gasifier train with a Siemens Westinghouse Power Corporation (SW) particle filter system at the Power Systems Development Facility (PSDF) located in Wilsonville, Alabama. The Transport Gasifier is an advanced circulating fluidized-bed reactor designed to operate as either a combustor or a gasifier using a particulate control device (PCD). While operating as a gasifier, either air or oxygen can be used as the oxidant. Test run TC12 began on May 16, 2003, with the startup of the main air compressor and the lighting of the gasifier start-up burner. The Transport Gasifier operated until May 24, 2003, when a scheduled outage occurred to allow maintenance crews to install the fuel cell test unit and modify the gas clean-up system. On June 18, 2003, the test run resumed when operations relit the start-up burner, and testing continued until the scheduled end of the run on July 14, 2003. TC12 had a total of 733 hours using Powder River Basin (PRB) subbituminous coal. Over the course of the entire test run, gasifier temperatures varied between 1,675 and 1,850 F at pressures from 130 to 210 psig.

  3. Development of the selective hydrophobic coagulation process

    SciTech Connect

    Yoon, R.H.; Luttrell, G.H.

    1992-01-01

A novel technique for selectively coagulating and separating coal from dispersed mineral matter has been developed at Virginia Tech. The process, Selective Hydrophobic Coagulation (SHC), has been studied since 1986 under the sponsorship of the US Department of Energy. The SHC process differs from oil agglomeration, shear or polymer flocculation, and electrolytic coagulation processes in that it does not require reagents or additives to induce the formation of coagula. In most cases, simple pH control is all that is required to (i) induce the coagulation of coal particles and (ii) effectively disperse particles of mineral matter. If the coal is oxidized, a small dosage of reagents can be used to enhance the process. The technical work program was initiated on July 1, 1992. Force-distance curves were generated for DDOA Br-coated mica surfaces in water and used to calculate hydrophobicity constants and decay lengths for this system, and a new device for the measurement of water contact angles, similar to the Wilhelmy plate balance, has been built. Samples (225 kg) of Pittsburgh No. 8 and Elkhorn No. 3 seam coals were obtained; a static mixer test facility for the study of coagula growth was set up and was undergoing shakedown tests at the end of the quarter; a bench-scale lamella thickener was being constructed; and preliminary coagula/mineral separation tests were being conducted in a bench-scale continuous drum filter.

  4. Advanced Dewatering Systems Development

    SciTech Connect

    R.H. Yoon; G.H. Luttrell

    2008-07-31

    A new fine coal dewatering technology has been developed and tested in the present work. The work was funded by the Solid Fuels and Feedstocks Grand Challenge PRDA. The objective of this program was to 'develop innovative technical approaches to ensure a continued supply of environmentally sound solid fuels for existing and future combustion systems with minimal incremental fuel cost.' Specifically, this solicitation is aimed at developing technologies that can (i) improve the efficiency or economics of the recovery of carbon when beneficiating fine coal from both current production and existing coal slurry impoundments and (ii) assist in the greater utilization of coal fines by improving the handling characteristics of fine coal via dewatering and/or reconstitution. The results of the test work conducted during Phase I of the current project demonstrated that the new dewatering technologies can substantially reduce the moisture from fine coal, while the test work conducted during Phase II successfully demonstrated the commercial viability of this technology. It is believed that availability of such efficient and affordable dewatering technology is essential to meeting the DOE's objectives.

  5. Features, Events, and Processes: System Level

    SciTech Connect

    D. McGregor

    2004-10-15

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760).

  6. Vision Systems Illuminate Industrial Processes

    NASA Technical Reports Server (NTRS)

    2013-01-01

When NASA designs a spacecraft to undertake a new mission, innovation does not stop after the design phase. In many cases, these spacecraft are firsts of their kind, requiring not only remarkable imagination and expertise in their conception but new technologies and methods for their manufacture. In the realm of manufacturing, NASA has from necessity worked on the cutting edge, seeking new techniques and materials for creating unprecedented structures, as well as capabilities for reducing the cost and increasing the efficiency of existing manufacturing technologies. From friction stir welding enhancements (Spinoff 2009) to thermoset composites (Spinoff 2011), NASA's innovations in manufacturing have often transferred to the public in ways that enable the expansion of the Nation's industrial productivity. NASA has long pursued ways of improving upon and ensuring quality results from manufacturing processes ranging from arc welding to thermal coating applications. But many of these processes generate blinding light (hence the need for special eyewear during welding) that obscures the process while it is happening, making it difficult to monitor and evaluate. In the 1980s, NASA partnered with a company to develop technology to address this issue. Today, that collaboration has spawned multiple commercial products that not only support effective manufacturing for private industry but also may support NASA in the use of an exciting, rapidly growing field of manufacturing ideal for long-duration space missions.

  7. SIT-5 system development.

    NASA Technical Reports Server (NTRS)

    Hyman, J., Jr.

    1972-01-01

    A 5-cm structurally integrated ion thruster (SIT-5) has been developed for attitude control and stationkeeping of synchronous satellites. With two-dimension thrust-vectoring grids, a first generation unit has demonstrated a thrust of 0.56 mlb at a beam voltage of 1200 V, total mass efficiency of 64%, and electrical efficiency of 46.8%. Structural integrity is demonstrated with a dielectric-coated grid for shock (30 G), sinusoidal (9 G), and random (19.9 G rms) accelerations. System envelope is 31.8 cm long by 13.9 cm flange bolt circle, with a mass of 8.5 kg, including 6.2 kg mercury propellant. Characteristics of a second-generation unit indicate significant performance gains.

  8. Hybrid Sulfur Thermochemical Process Development Annual Report

    SciTech Connect

    Summers, William A.; Buckner, Melvin R.

    2005-07-21

The Hybrid Sulfur (HyS) Thermochemical Process is a means of producing hydrogen via water-splitting through a combination of chemical reactions and electrochemistry. Energy is supplied to the system as high temperature heat (approximately 900 C) and electricity. Advanced nuclear reactors (Generation IV) or central solar receivers can be the source of the primary energy. Large-scale hydrogen production based on this process could be a major contributor to meeting the needs of a hydrogen economy. This project's objectives include optimization of the HyS process design, analysis of technical issues and concerns, creation of a development plan, and laboratory-scale proof-of-concept testing. The key component of the HyS Process is the SO2-depolarized electrolyzer (SDE). Studies were performed that showed that an electrolyzer operating in the range of 500-600 mV per cell can lead to an overall HyS cycle efficiency in excess of 50%, which is superior to all other currently proposed thermochemical cycles. Economic analysis indicated hydrogen production costs of approximately $1.60 per kilogram for a mature nuclear hydrogen production plant. However, in order to meet commercialization goals, the electrolyzer should be capable of operating at high current density, have a long operating lifetime, and have an acceptable capital cost. The use of proton-exchange-membrane (PEM) technology, which leverages work for the development of PEM fuel cells, was selected as the most promising route to meeting these goals. The major accomplishments of this project were the design and construction of a suitable electrolyzer test facility and the proof-of-concept testing of a PEM-based SDE.
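
    The appeal of a 500-600 mV cell can be seen from the arithmetic relating cell voltage to electrical energy per kilogram of hydrogen. The back-of-the-envelope sketch below compares an SO2-depolarized cell with a conventional water electrolyzer at roughly 1.8 V; it deliberately ignores the separate high-temperature heat input that the HyS cycle requires for sulfuric acid decomposition, so it is not a full efficiency calculation.

```python
# Back-of-the-envelope electrical energy per kilogram of hydrogen vs. cell voltage.
# Illustrates why a 500-600 mV SO2-depolarized electrolyzer is attractive compared
# with ~1.8 V conventional water electrolysis (electrical input only; the HyS
# cycle's high-temperature heat input is not included).
F = 96485.0          # Faraday constant, C/mol
M_H2 = 2.016e-3      # molar mass of H2, kg/mol
ELECTRONS_PER_H2 = 2

def kwh_per_kg(cell_voltage: float) -> float:
    joules_per_mol = ELECTRONS_PER_H2 * F * cell_voltage
    return joules_per_mol / M_H2 / 3.6e6   # J/kg -> kWh/kg

for v in (0.5, 0.6, 1.8):
    print(f"{v:.1f} V  ->  {kwh_per_kg(v):5.1f} kWh per kg H2 (electrical only)")
```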

  9. Power Systems Development Facility

    SciTech Connect

    Southern Company Services

    2004-04-30

    This report discusses Test Campaign TC15 of the Kellogg Brown & Root, Inc. (KBR) Transport Gasifier train with a Siemens Power Generation, Inc. (SPG) particle filter system at the Power Systems Development Facility (PSDF) located in Wilsonville, Alabama. The Transport Gasifier is an advanced circulating fluidized-bed reactor designed to operate as either a combustor or gasifier using a particulate control device (PCD). While operating as a gasifier, either air or oxygen can be used as the oxidant. Test run TC15 began on April 19, 2004, with the startup of the main air compressor and the lighting of the gasifier startup burner. The Transport Gasifier was shutdown on April 29, 2004, accumulating 200 hours of operation using Powder River Basin (PRB) subbituminous coal. About 91 hours of the test run occurred during oxygen-blown operations. Another 6 hours of the test run was in enriched-air mode. The remainder of the test run, approximately 103 hours, took place during air-blown operations. The highest operating temperature in the gasifier mixing zone mostly varied from 1,800 to 1,850 F. The gasifier exit pressure ran between 200 and 230 psig during air-blown operations and between 110 and 150 psig in oxygen-enhanced air operations.

  10. Ground data systems resource allocation process

    NASA Technical Reports Server (NTRS)

    Berner, Carol A.; Durham, Ralph; Reilly, Norman B.

    1989-01-01

    The Ground Data Systems Resource Allocation Process at the Jet Propulsion Laboratory provides medium- and long-range planning for the use of Deep Space Network and Mission Control and Computing Center resources in support of NASA's deep space missions and Earth-based science. Resources consist of radio antenna complexes and associated data processing and control computer networks. A semi-automated system was developed that allows operations personnel to interactively generate, edit, and revise allocation plans spanning periods of up to ten years (as opposed to only two or three weeks under the manual system) based on the relative merit of mission events. It also enhances scientific data return. A software system known as the Resource Allocation and Planning Helper (RALPH) merges the conventional methods of operations research, rule-based knowledge engineering, and advanced data base structures. RALPH employs a generic, highly modular architecture capable of solving a wide variety of scheduling and resource sequencing problems. The rule-based RALPH system has saved significant labor in resource allocation. Its successful use affirms the importance of establishing and applying event priorities based on scientific merit, and the benefit of continuity in planning provided by knowledge-based engineering. The RALPH system exhibits a strong potential for minimizing development cycles of resource and payload planning systems throughout NASA and the private sector.
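
    The central idea of allocating antenna time by scientific merit can be pictured with a tiny greedy sketch. The request data, merit scores, and policy below are invented for illustration; RALPH's actual rule base, optimization methods, and ten-year planning horizon are far richer.

```python
# Toy sketch of priority-driven resource allocation: requests for antenna time are
# granted in order of merit, subject to non-overlap on each antenna. Illustrative only.
from dataclasses import dataclass

@dataclass
class Request:
    mission: str
    antenna: str
    start: float   # hours
    end: float
    merit: int     # higher = more important

requests = [
    Request("Voyager",  "DSS-14", 0, 8,  merit=9),
    Request("Magellan", "DSS-14", 6, 12, merit=7),
    Request("EarthSci", "DSS-14", 10, 14, merit=5),
    Request("Magellan", "DSS-43", 6, 12, merit=7),
]

def allocate(reqs):
    granted, busy = [], {}   # busy: antenna -> list of (start, end) slots
    for r in sorted(reqs, key=lambda r: -r.merit):
        slots = busy.setdefault(r.antenna, [])
        if all(r.end <= s or r.start >= e for s, e in slots):
            slots.append((r.start, r.end))
            granted.append(r)
    return granted

for r in allocate(requests):
    print(f"granted {r.mission:9s} on {r.antenna} [{r.start}-{r.end} h]")
```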

  11. Color Image Processing and Object Tracking System

    NASA Technical Reports Server (NTRS)

    Klimek, Robert B.; Wright, Ted W.; Sielken, Robert S.

    1996-01-01

    This report describes a personal computer based system for automatic and semiautomatic tracking of objects on film or video tape, developed to meet the needs of the Microgravity Combustion and Fluids Science Research Programs at the NASA Lewis Research Center. The system consists of individual hardware components working under computer control to achieve a high degree of automation. The most important hardware components include 16-mm and 35-mm film transports, a high resolution digital camera mounted on a x-y-z micro-positioning stage, an S-VHS tapedeck, an Hi8 tapedeck, video laserdisk, and a framegrabber. All of the image input devices are remotely controlled by a computer. Software was developed to integrate the overall operation of the system including device frame incrementation, grabbing of image frames, image processing of the object's neighborhood, locating the position of the object being tracked, and storing the coordinates in a file. This process is performed repeatedly until the last frame is reached. Several different tracking methods are supported. To illustrate the process, two representative applications of the system are described. These applications represent typical uses of the system and include tracking the propagation of a flame front and tracking the movement of a liquid-gas interface with extremely poor visibility.
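
    One generic tracking step of the kind described above (locating the object in each new frame within a neighborhood of its last position) can be sketched with a threshold-and-centroid update. The frame data and parameters are synthetic, and the NASA system supports several tracking methods that this sketch does not reproduce.

```python
# Sketch of one generic tracking approach: threshold the neighborhood around the
# last known position and take the intensity centroid as the new position.
import numpy as np

def track_centroid(frame, last_xy, half_win=10, thresh=128):
    x0, y0 = last_xy
    ys = slice(max(y0 - half_win, 0), y0 + half_win + 1)
    xs = slice(max(x0 - half_win, 0), x0 + half_win + 1)
    win = frame[ys, xs]
    mask = win > thresh
    if not mask.any():
        return last_xy                      # object lost: keep previous position
    yy, xx = np.nonzero(mask)
    return (xs.start + int(round(xx.mean())), ys.start + int(round(yy.mean())))

# Synthetic 64x64 frame with a bright blob centered near (30, 22).
frame = np.zeros((64, 64), dtype=np.uint8)
frame[20:25, 28:33] = 200
print(track_centroid(frame, last_xy=(28, 20)))
```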

  12. Development of the LICADO coal cleaning process

    SciTech Connect

    Not Available

    1990-07-31

Development of the liquid carbon dioxide process for the cleaning of coal was performed in batch, variable-volume (semi-continuous), and continuous tests. Continuous operation at feed rates up to 4.5 kg/hr (10 lb/hr) was achieved with the Continuous System. Coals tested included Upper Freeport, Pittsburgh, Illinois No. 6, and Middle Kittanning seams. Results showed that the ash and pyrite rejections agreed closely with washability data for each coal at the particle size tested (-200 mesh). A 0.91 metric ton (1-ton) per hour Proof-of-Concept Plant was conceptually designed. A 181 metric ton (200 ton) per hour plant and a 45 metric ton (50 ton) per hour plant were sized sufficiently to estimate costs for economic analyses. The processing costs for the 181 metric ton (200 ton) per hour and 45 metric ton (50 ton) per hour plants were estimated to be $18.96 per metric ton ($17.20 per ton) and $11.47 per metric ton ($10.40 per ton), respectively. The costs for the 45 metric ton per hour plant are lower because it is assumed to be a fines recovery plant, which does not require a grinding circuit or a complex waste handling system.

  13. ASI-Volcanic Risk System (SRV): a pilot project to develop EO data processing modules and products for volcanic activity monitoring, first results.

    NASA Astrophysics Data System (ADS)

    Silvestri, M.; Musacchio, M.; Buongiorno, M. F.; Dini, L.

    2009-04-01

The project called Sistema Rischio Vulcanico (SRV) is funded by the Italian Space Agency (ASI) in the frame of the National Space Plan 2003-2005 under the Earth Observations section for natural risks management. The SRV Project is coordinated by the Istituto Nazionale di Geofisica e Vulcanologia (INGV), which is responsible at the national level for volcanic monitoring. The project philosophy is to implement, in incremental versions, specific modules that allow geophysical parameters suitable for volcanic risk management to be processed, stored, and visualized through Web GIS tools. ASI-SRV is devoted to the development of an integrated system based on Earth Observation (EO) data to respond to specific needs of the Italian Civil Protection Department (DPC) and to improve the monitoring of Italian active volcanoes during all risk phases (pre-crisis, crisis, and post-crisis). The ASI-SRV system provides support to risk managers during the different volcanic activity phases, and its results are addressed to the DPC. SRV provides the capability to import many different EO data into the system, maintains a repository where the acquired data are stored, and generates selected volcanic products. The processing modules for EO optical sensor data are based on procedures jointly developed by INGV and the University of Modena. These procedures allow the estimation of a number of parameters, such as surface thermal properties and gas, aerosol, and ash emissions, and the characterization of volcanic products in terms of composition and geometry. For the analysis of surface thermal characteristics, the available algorithms allow information to be extracted during the prevention phase and during the warning and crisis phases. In the prevention phase, the thermal analysis is directed at identifying temperature variations on volcanic structures that may indicate a change in the state of volcanic activity. At the moment the only sensor that

  14. Ultrasound process tomography system for hydrocyclones

    PubMed

    Schlaberg; Podd; Hoyle

    2000-03-01

    The implementation of a laboratory-based ultrasound tomography system to an industrial process application is not straightforward. In the present work, a tomography system with 16 transducers has been applied to an industrial 50 mm hydrocyclone to visualize its air-core size and position. Hydrocyclones are used to separate fine particles from a slurry. The efficiency of the separation process depends on the size of the air core within the cyclone. If the core is too large due to spigot wear, there will be a detrimental effect on the slurry throughput. Conversely, if the throughput is increased to an extent where the air core becomes unstable or disappears, the particle separation will no longer take place, and the processed batches may become contaminated. Ultrasound tomography presents a very good tool with which to visualize the size, position and movement of the air core and monitor its behaviour under varying input parameters. Ultimately, it could be used within this application both to control the input flow rate depending on the air core size and to detect spigot wear. This paper describes the development of an ultrasonic tomography system applied to an instrumented hydrocyclone. Time-of-flight data are captured by a dedicated acquisition system that pre-processes the information using a DSP and transfers the results to a PC via a fast serial link. The hardware of the tomography system is described, and cursory results are presented in the form of reconstructed images of the air core within the hydrocyclone.
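
    A toy transmission-mode calculation gives a feel for why time-of-flight data reveal the air core: sound travels far more slowly in air than in the slurry, so a path crossing the core shows an excess time of flight from which the chord length through the core can be estimated. The speeds and geometry below are assumed (water-like slurry, straight-ray propagation), and a real system reconstructs the core from many such paths and, in practice, often relies on attenuation or reflection at the air interface rather than pure TOF.

```python
# Toy transmission-mode calculation for a path crossing the hydrocyclone air core.
# Assumed speeds and straight-ray geometry; illustrative only.
C_SLURRY = 1480.0   # m/s, water-like slurry (assumed)
C_AIR = 343.0       # m/s

def chord_through_core(path_length_m, measured_tof_s):
    """Estimate the air-core chord length on a straight transducer-to-transducer path."""
    tof_slurry_only = path_length_m / C_SLURRY
    excess = measured_tof_s - tof_slurry_only
    # Each metre of air adds (1/C_AIR - 1/C_SLURRY) seconds relative to slurry.
    return max(excess, 0.0) / (1.0 / C_AIR - 1.0 / C_SLURRY)

# 50 mm hydrocyclone: a diametral path of 0.05 m with a 10 mm air core.
true_chord = 0.010
tof = (0.05 - true_chord) / C_SLURRY + true_chord / C_AIR
print(f"estimated chord: {chord_through_core(0.05, tof)*1000:.1f} mm")
```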

  15. Development of the Concise Data Processing Assessment

    ERIC Educational Resources Information Center

    Day, James; Bonn, Doug

    2011-01-01

    The Concise Data Processing Assessment (CDPA) was developed to probe student abilities related to the nature of measurement and uncertainty and to handling data. The diagnostic is a ten question, multiple-choice test that can be used as both a pre-test and post-test. A key component of the development process was interviews with students, which…

  16. Nicotine-induced plasticity during development: modulation of the cholinergic system and long-term consequences for circuits involved in attention and sensory processing

    PubMed Central

    Heath, Christopher J.; Picciotto, Marina R.

    2009-01-01

    Summary Despite a great deal of progress, more than 10% of pregnant women in the USA smoke. Epidemiological studies have demonstrated correlations between developmental tobacco smoke exposure and sensory processing deficits, as well as a number of neuropsychiatric conditions, including attention deficit hyperactivity disorder. Significantly, data from animal models of developmental nicotine exposure have suggested that the nicotine in tobacco contributes significantly to the effects of developmental smoke exposure. Consequently, we hypothesize that nicotinic acetylcholine receptors (nAChRs) are critical for setting and refining the strength of corticothalamic-thalamocortical loops during critical periods of development and that disruption of this process by developmental nicotine exposure can result in long-lasting dysregulation of sensory processing. The ability of nAChR activation to modulate synaptic plasticity is likely to underlie the effects of both endogenous cholinergic signaling and pharmacologically-administered nicotine to alter cellular, physiological and behavioral processes during critical periods of development. PMID:18692078

  17. ASRM test report: Autoclave cure process development

    NASA Technical Reports Server (NTRS)

    Nachbar, D. L.; Mitchell, Suzanne

    1992-01-01

    ASRM insulated segments will be autoclave cured following insulation pre-form installation and strip wind operations. Following competitive bidding, Aerojet ASRM Division (AAD) Purchase Order 100142 was awarded to American Fuel Cell and Coated Fabrics Company, Inc. (Amfuel), Magnolia, AR, for subcontracted insulation autoclave cure process development. Autoclave cure process development test requirements were included in Task 3 of TM05514, Manufacturing Process Development Specification for Integrated Insulation Characterization and Stripwind Process Development. The test objective was to establish autoclave cure process parameters for ASRM insulated segments. Six tasks were completed to: (1) evaluate cure parameters that control acceptable vulcanization of ASRM Kevlar-filled EPDM insulation material; (2) identify first and second order impact parameters on the autoclave cure process; and (3) evaluate insulation material flow-out characteristics to support pre-form configuration design.

  18. ASRM test report: Autoclave cure process development

    NASA Astrophysics Data System (ADS)

    Nachbar, D. L.; Mitchell, Suzanne

    1992-05-01

    ASRM insulated segments will be autoclave cured following insulation pre-form installation and strip wind operations. Following competitive bidding, Aerojet ASRM Division (AAD) Purchase Order 100142 was awarded to American Fuel Cell and Coated Fabrics Company, Inc. (Amfuel), Magnolia, AR, for subcontracted insulation autoclave cure process development. Autoclave cure process development test requirements were included in Task 3 of TM05514, Manufacturing Process Development Specification for Integrated Insulation Characterization and Stripwind Process Development. The test objective was to establish autoclave cure process parameters for ASRM insulated segments. Six tasks were completed to: (1) evaluate cure parameters that control acceptable vulcanization of ASRM Kevlar-filled EPDM insulation material; (2) identify first and second order impact parameters on the autoclave cure process; and (3) evaluate insulation material flow-out characteristics to support pre-form configuration design.

  19. Monitoring a coordinated exchange process in a four-component biological interaction system: development of a time-resolved terbium-based one-donor/three-acceptor multicolor FRET system.

    PubMed

    Kim, Sung Hoon; Gunther, Jillian R; Katzenellenbogen, John A

    2010-04-07

    Hormonal regulation of cellular function involves the binding of small molecules with receptors that then coordinate subsequent interactions with other signal transduction proteins. These dynamic, multicomponent processes are difficult to track in cells and even in reconstituted in vitro systems, and most methods can monitor only two-component interactions, often with limited capacity to follow dynamic changes. Through a judicious choice of three organic acceptor fluorophores paired with a terbium donor fluorophore, we have developed the first example of a one-donor/three-acceptor multicolor time-resolved fluorescence energy transfer (TR-FRET) system, and we have exemplified its use by monitoring a ligand-regulated protein-protein exchange process in a four-component biological system. By careful quantification of the emission from each of the three acceptors at the four channels for terbium donor emission, we demonstrate that any of these donor channels can be used to estimate the magnitude of the three FRET signals in this terbium-donor triple-acceptor system with minimal bleedthrough. Using this three-channel terbium-based, TR-FRET assay system, we show in one experiment that the addition of a fluorescein-labeled estrogen agonist displaces a SNAPFL-labeled antiestrogen from the ligand binding pocket of a terbium-labeled estrogen receptor, at the same time causing a Cy5-labeled coactivator to be recruited to the estrogen receptor. This experiment demonstrates the power of a four-color TR-FRET experiment, and it shows that the overall process of estrogen receptor ligand exchange and coactivator binding is a dynamic but precisely coordinated process.
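
    One way to picture the bleedthrough correction implied by the careful quantification of each acceptor's emission is linear spectral unmixing. The matrix entries and channel names in the sketch below are made-up placeholders; in practice the reference values would come from single-labeled control samples, and the sketch is not the authors' calibration procedure.

      import numpy as np

      # Rows: detection channels; columns: acceptors (fluorescein, SNAPFL, Cy5).
      # Entry [c, a] = relative signal of acceptor a seen in channel c (assumed values).
      bleedthrough = np.array([
          [1.00, 0.15, 0.01],   # "fluorescein" channel
          [0.20, 1.00, 0.05],   # "SNAPFL" channel
          [0.02, 0.10, 1.00],   # "Cy5" channel
      ])

      def unmix(measured):
          """Least-squares estimate of the per-acceptor FRET signals from the
          raw channel readings (non-negativity not enforced in this sketch)."""
          signals, *_ = np.linalg.lstsq(bleedthrough, measured, rcond=None)
          return signals

      # Example: true signals (2.0, 0.5, 1.2) pushed through the bleedthrough matrix.
      true = np.array([2.0, 0.5, 1.2])
      measured = bleedthrough @ true
      print(unmix(measured))   # recovers approximately [2.0, 0.5, 1.2]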

  20. Monitoring a Coordinated Exchange Process in a Four-Component Biological Interaction System: Development of a Time-Resolved Terbium-Based One Donor/Three-Acceptor Multi-Color FRET System

    PubMed Central

    Kim, Sung Hoon; Gunther, Jillian R.; Katzenellenbogen, John A.

    2010-01-01

    Hormonal regulation of cellular function involves the binding of small molecules with receptors that then coordinate subsequent interactions with other signal transduction proteins. These dynamic, multi-component processes are difficult to track in cells and even in reconstituted in vitro systems, and most methods can monitor only two-component interactions, often with limited capacity to follow dynamic changes. Through a judicious choice of three organic acceptor fluorophores paired with a terbium donor fluorophore, we have developed the first example of a one-donor/three-acceptor multi-color time-resolved fluorescence energy transfer (TR-FRET) system, and we have exemplified its use by monitoring a ligand-regulated protein-protein exchange process in a four-component biological system. By careful quantification of the emission from each of the three acceptors at the four channels for terbium donor emission, we demonstrate that any of these donor channels can be used to estimate the magnitude of the three FRET signals in this terbium donor triple-acceptor system with minimal bleedthrough. Using this three-channel terbium-based, TR-FRET assay system, we show in one experiment that the addition of a fluorescein-labeled estrogen agonist displaces a SNAPFL-labeled antiestrogen from the ligand binding pocket of a terbium-labeled estrogen receptor, at the same time causing a Cy5-labeled coactivator to be recruited to the estrogen receptor. This experiment demonstrates the power of a four-color TR-FRET experiment, and it shows that the overall process of estrogen receptor ligand exchange and coactivator binding is a dynamic but precisely coordinated process. PMID:20230029

  1. Advanced systems for shuttle launch processing

    NASA Technical Reports Server (NTRS)

    Perez, Rafael A.

    1995-01-01

    Four advanced technologies that could be used in a new shuttle launch processing center are described. The latest methods for high-capacity data storage technology, disk arrays and magneto-optical disks, are described and their advantages and disadvantages compared. A 3-D protein-based optical memory, now being researched, is also described as a possible future technology for data storage. An overview of neural network technology is presented together with several commercial software development options now available for neural network applications. The feasibility of Asynchronous Data Transfer technology as the networking technology to integrate video, voice, and data in a new launch processing center is also considered. Different applications of expert system technology at KSC are enumerated together with a number of commercial expert system development packages presently available.

  2. A versatile scalable PET processing system

    SciTech Connect

    H. Dong, A. Weisenberger, J. McKisson, Xi Wenze, C. Cuevas, J. Wilson, L. Zukerman

    2011-06-01

    Positron Emission Tomography (PET) historically has major clinical and preclinical applications in oncology, neurology, and cardiovascular disease. Recently, in a new direction, an application-specific PET system is being developed at Thomas Jefferson National Accelerator Facility (Jefferson Lab) in collaboration with Duke University, the University of Maryland at Baltimore (UMAB), and West Virginia University (WVU), targeted at plant eco-physiology research. The new plant imaging PET system is versatile and scalable such that it can adapt to several plant imaging needs - imaging many important plant organs including leaves, roots, and stems. The mechanical arrangement of the detectors is designed to accommodate the unpredictable and random spatial distribution of the plant organs without requiring that the plant be disturbed. Prototyping such a system requires a new data acquisition system (DAQ) and data processing system which are adaptable to the requirements of these unique and versatile detectors.
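
    As a hedged illustration of one core task such a DAQ and processing chain must perform, the sketch below sorts energy-qualified single events into coincidence pairs within a fixed timing window. The event format, the energy gate, and the 4 ns window are assumptions and are not details of the Jefferson Lab design.

      from dataclasses import dataclass

      @dataclass
      class Single:
          time_ns: float     # detection time stamp
          detector: int      # detector module that fired
          energy_kev: float  # deposited energy

      def sort_coincidences(singles, window_ns=4.0, e_low=425.0, e_high=650.0):
          """Pair energy-qualified singles from different detectors whose time
          stamps differ by less than the coincidence window."""
          qualified = sorted((s for s in singles if e_low <= s.energy_kev <= e_high),
                             key=lambda s: s.time_ns)
          pairs = []
          for i, first in enumerate(qualified):
              for second in qualified[i + 1:]:
                  if second.time_ns - first.time_ns > window_ns:
                      break
                  if second.detector != first.detector:
                      pairs.append((first, second))
          return pairs

      # Example: two 511 keV events 1.5 ns apart plus one low-energy scatter event.
      events = [Single(10.0, 0, 511.0), Single(11.5, 7, 505.0), Single(60.0, 3, 300.0)]
      print(len(sort_coincidences(events)))   # -> 1 coincidence pair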

  3. EUV mask process specifics and development challenges

    NASA Astrophysics Data System (ADS)

    Nesladek, Pavel

    2014-07-01

    EUV lithography is currently the favorite and most promising candidate among the next-generation lithography (NGL) technologies. A decade ago, NGL was expected to be introduced at the 45 nm technology node. With the introduction of 193 nm immersion lithography, double/triple patterning, and further techniques, the capabilities of 193 nm lithography have been greatly extended, so it is expected to be used successfully, depending on the business decisions of end users, down to 10 nm logic. Subsequent technology nodes will require EUV or an alternative technology such as DSA. Manufacturing, and especially process development, for EUV technology requires a significant number of unique processes, in several cases performed on dedicated tools. Currently, several of these tools, e.g. the EUV AIMS or an actinic reflectometer, are not yet available on site. Process development is therefore done using external services and tools, with an impact on the single unit process development timeline and uncertainty in the estimation of process performance. Compromises in process development, caused by assumptions about similarities between optical and EUV masks made in experiment planning and by the omission of tests, are further sources of challenges to unit process development. Increased defect risk and uncertainty in process qualification are just two examples that can impact mask quality and process development. The aim of this paper is to identify critical aspects of EUV mask manufacturing with respect to defects on the mask, with a focus on mask cleaning and defect repair, and to discuss the impact of the EUV-specific requirements on the experiments needed.

  4. Model systems for life processes on Mars

    NASA Technical Reports Server (NTRS)

    Mitz, M. A.

    1974-01-01

    In the evolution of life forms, nonphotosynthetic mechanisms are developed. The question remains whether a total life system could evolve which is not dependent upon photosynthesis. In trying to visualize life on other planets, the photosynthetic process poses problems. On Mars, the high intensity of light at the surface is a concern, and alternative mechanisms need to be defined and analyzed. In the search for alternative mechanisms, several different areas may be identified. These involve activated inorganic compounds in the atmosphere, such as the products of photodissociation of carbon dioxide, and the organic material which may be created by natural phenomena. In addition, a life system based on the pressure of the atmospheric constituents, such as carbon dioxide, is a possibility. These considerations may be important for the understanding of evolutionary processes of life on another planet. Model systems which depend on these alternative mechanisms are defined and related to presently planned and future planetary missions.

  5. Process and control systems for composites manufacturing

    NASA Technical Reports Server (NTRS)

    Tsiang, T. H.; Wanamaker, John L.

    1992-01-01

    Precise control of composite material processing would not only improve part quality, but would also directly reduce the overall manufacturing cost. The development and incorporation of sensors will help to generate real-time information on material processing relationships and equipment characteristics. In the present work, thermocouple, pressure transducer, and dielectrometer technologies were investigated. The monitoring sensors were integrated with the computerized control system in three non-autoclave fabrication techniques: hot-press, self-contained tool (self-heating and pressurizing), and pressure vessel. The sensors were implemented in the parts and tools.
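
    A minimal sketch of closing the loop between a monitoring sensor and a computerized controller, in the spirit of the architecture described above, is given below. The first-order plant model, the PI gains, and the anti-windup clamp are illustrative assumptions rather than the hot-press control actually implemented in this work.

      import random

      def read_thermocouple(true_temp_c):
          # Simulated thermocouple reading with a little measurement noise (assumption).
          return true_temp_c + random.gauss(0.0, 0.3)

      def cure_control(setpoint_c=177.0, steps=300, dt_s=1.0):
          temp = 25.0                 # tool and part start at room temperature
          integral = 0.0
          kp, ki = 0.8, 0.05          # assumed PI gains
          for _ in range(steps):
              error = setpoint_c - read_thermocouple(temp)
              integral = max(-20.0, min(20.0, integral + error * dt_s))  # anti-windup clamp
              heater = max(0.0, min(1.0, kp * error + ki * integral))    # heater duty cycle, 0..1
              # crude first-order thermal response of the tooling (assumed plant model)
              temp += dt_s * (2.0 * heater - 0.01 * (temp - 25.0))
          return temp

      print(f"temperature after soak: {cure_control():.1f} C")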

  6. Supervisory development system

    SciTech Connect

    Arthur, P.L.; Norlach, D.L.

    1985-03-01

    The Supervisory Development System (SDS) consists of a series of training inputs which are designed to meet the training needs of a newly appointed manufacturing supervisor. Each training component has been carefully designed to ensure that a new supervisor receives training which is job related and coincides with growth on the job. The SDS is initiated with the appointment of the new supervisor and extends to eighteen months after appointment. Mobil's Marketing and Refining Division's U.S. operations are headquartered in Fairfax, Virginia. The Manufacturing function has five refineries located in Beaumont, Texas; Ferndale, Washington; Joliet, Illinois; Paulsboro, New Jersey; and Torrance, California. New first-line supervisors are appointed at a rate of about seven per year in one refinery and up to fifteen or twenty per year in others. First-line supervisors in Mobil's refineries are similar to those found in other refineries. To the hourly-rate or blue-collar employee, the first-level supervisor represents the company. These supervisors are responsible for providing work direction, improving performance, and operating efficiently within a safe environment.

  7. Genesis Eco Systems, Inc. soil washing process

    SciTech Connect

    Cena, R.J.

    1994-10-11

    The Genesis soil washing system is an integrated system of modular design allowing for maximum material handling capabilities, with optimized use of space for site mobility. The Surfactant Activated Bio-enhanced Remediation Equipment-Generation 1 (SABRE-1, Patent Applied For) modification was developed specifically for removing petroleum byproducts from contaminated soils. Scientifically formulated surfactants, introduced by high pressure spray nozzles, displace the contaminant from the surface of the soil particles into the process solution. Once the contaminant is dispersed into the liquid fraction of the process, it is either mechanically removed, chemically oxidized, or biologically oxidized. The contaminated process water is pumped through the Genesis Biosep (Patent Applied For) filtration system where the fines portion is flocculated, and the contaminant-rich liquid portion is combined with an activated mixture of nutrients and carefully selected bacteria to decompose the hydrocarbon fraction. The treated soil and dewatered fines are transferred to a bermed stockpile where bioremediation continues during drying. The process water is reclaimed, filtered, and recycled within the system.

  8. Process for Selecting System Level Assessments for Human System Technologies

    NASA Technical Reports Server (NTRS)

    Watts, James; Park, John

    2006-01-01

    The integration of the many life support systems necessary to construct a stable habitat is difficult. The correct identification of the appropriate technologies and corresponding interfaces is an exhaustive process. Once technologies are selected, secondary issues such as mechanical and electrical interfaces must be addressed. The required analytical and testing work must be approached in a piecewise fashion to achieve timely results. A repeatable process has been developed to identify and prioritize system-level assessments and testing needs. This Assessment Selection Process has been defined to assess cross-cutting integration issues at the system or component level. Assessments are used to identify risks, encourage future actions to mitigate risks, or spur further studies.

  9. Advanced alarm systems: Display and processing issues

    SciTech Connect

    O'Hara, J.M.; Wachtel, J.; Perensky, J.

    1995-05-01

    This paper describes a research program sponsored by the US Nuclear Regulatory Commission to address the human factors engineering (HFE) deficiencies associated with nuclear power plant alarm systems. The overall objective of the study is to develop HFE review guidance for alarm systems. In support of this objective, human performance issues needing additional research were identified. Among the important issues were alarm processing strategies and alarm display techniques. This paper will discuss these issues and briefly describe our current research plan to address them.

  10. High-throughput process development: I. Process chromatography.

    PubMed

    Rathore, Anurag S; Bhambure, Rahul

    2014-01-01

    Chromatographic separation serves as "a workhorse" for downstream process development and plays a key role in the removal of product-related, host cell-related, and process-related impurities. Complex and poorly characterized raw materials and feed material, low feed concentration, product instability, and poor mechanistic understanding of the processes are some of the critical challenges faced during development of a chromatographic step. Traditional process development is performed as a trial-and-error-based evaluation and often leads to a suboptimal process. A high-throughput process development (HTPD) platform involves an integration of miniaturization, automation, and parallelization and provides a systematic approach for time- and resource-efficient chromatography process development. Creation of such platforms requires integration of mechanistic knowledge of the process with various statistical tools for data analysis. The relevance of such a platform is high in view of the constraints with respect to time and resources that the biopharma industry faces today. This protocol describes the steps involved in performing HTPD of a process chromatography step. It describes the operation of a commercially available device (PreDictor™ plates from GE Healthcare). This device is available in a 96-well format with a 2 or 6 μL well size. We also discuss the challenges that one faces when performing such experiments as well as possible solutions to alleviate them. Besides describing the operation of the device, the protocol also presents an approach for statistical analysis of the data that is gathered from such a platform. A case study involving use of the protocol for examining ion-exchange chromatography of granulocyte colony-stimulating factor (GCSF), a therapeutic product, is briefly discussed. This is intended to demonstrate the usefulness of this protocol in generating data that is representative of the data obtained at the traditional lab scale. The agreement in the
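
    As an illustration of the kind of statistical treatment that plate-based screening data receive, the sketch below fits a simple quadratic response surface to a synthetic 96-well binding-capacity screen and reports the best screened condition. The pH/salt grid, the synthetic data, and the fit are assumptions for illustration and do not reproduce the published protocol or the PreDictor workflow.

      import numpy as np

      # Assumed screening grid: 8 pH levels x 12 NaCl levels -> 96 wells.
      ph = np.linspace(4.5, 8.0, 8)
      salt_mM = np.linspace(0.0, 550.0, 12)
      PH, SALT = np.meshgrid(ph, salt_mM, indexing="ij")

      # Synthetic "measured" binding capacity (arbitrary units), peaked near pH 5.5, 50 mM.
      capacity = 60.0 * np.exp(-((PH - 5.5) / 0.8) ** 2 - ((SALT - 50.0) / 150.0) ** 2)
      capacity += np.random.normal(0.0, 1.0, capacity.shape)   # assay noise

      # Fit a quadratic response surface z ~ 1, x, y, x^2, y^2, xy by least squares.
      x, y, z = PH.ravel(), SALT.ravel() / 100.0, capacity.ravel()
      A = np.column_stack([np.ones_like(x), x, y, x * x, y * y, x * y])
      coef, *_ = np.linalg.lstsq(A, z, rcond=None)

      best = np.unravel_index(np.argmax(capacity), capacity.shape)
      print(f"best screened condition: pH {PH[best]:.1f}, {SALT[best]:.0f} mM NaCl")
      print("response-surface coefficients:", np.round(coef, 2))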

  11. Development of the selective coagulation process

    SciTech Connect

    Yoon, R.H.; Luttrell, G.H.

    1991-01-01

    The aim of this project is to develop an economical method for producing low-sulfur and low-ash coals using the selective hydrophobic coagulation (SHC) process. This work has been divided into three tasks: (1) project planning and sample acquisition; (2) studies of the fundamental mechanism(s) of the selective coagulation process and the parameters that affect the process of separating coal from both the ash-forming minerals and pyritic sulfur; and (3) bench-scale process development test work to establish the best possible method(s) of separating the hydrophobic coagula from the dispersed mineral matter.

  12. Robot development for nuclear material processing

    SciTech Connect

    Pedrotti, L.R.; Armantrout, G.A.; Allen, D.C.; Sievers, R.H. Sr.

    1991-07-01

    The Department of Energy is seeking to modernize its special nuclear material (SNM) production facilities and concurrently reduce radiation exposures and the process and incidental radioactive waste generated. As part of this program, a Lawrence Livermore National Laboratory (LLNL)-led team is developing and adapting generic and specific applications of commercial robotic technologies to SNM pyrochemical processing and other operations. A working gantry robot within a sealed processing glove box and a telerobot control test bed are manifestations of this effort. This paper describes the development challenges and progress in adapting processing, robotic, and nuclear safety technologies to the application. 3 figs.

  13. Development of a Comprehensive Weld Process Model

    SciTech Connect

    Radhakrishnan, B.; Zacharia, T.

    1997-05-01

    This cooperative research and development agreement (CRADA) between Concurrent Technologies Corporation (CTC) and Lockheed Martin Energy Systems (LMES) combines CTC's expertise in the welding area with that of LMES to develop computer models and simulation software for welding processes. This development is of significant impact to industry, including materials producers and fabricators. The main thrust of the research effort was to develop a comprehensive welding simulation methodology. A substantial amount of work has been done by several researchers to numerically model various welding processes. The primary drawback of most of the existing models is the lack of sound linkages between the mechanistic aspects (e.g., heat transfer, fluid flow, and residual stress) and the metallurgical aspects (e.g., microstructure development and control). A comprehensive numerical model which can be used to elucidate the effect of welding parameters/conditions on the temperature distribution, weld pool shape and size, solidification behavior, and microstructure development, as well as stresses and distortion, does not exist. It was therefore imperative to develop a comprehensive model which would predict all of the above phenomena during welding. The CRADA built upon an existing three-dimensional (3-D) welding simulation model developed by LMES, which is capable of predicting weld pool shape and temperature history in 3-D single-pass welds. However, that model does not account for multipass welds, microstructural evolution, distortion, or residual stresses. Additionally, it requires large amounts of computing time, which limits its use for practical applications. To overcome this, CTC and LMES have developed through this CRADA the comprehensive welding simulation model described above. The following technical tasks have been accomplished as part of the CRADA. 1. The LMES welding code has been ported to the Intel Paragon parallel computer at ORNL
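
    For the heat-transfer side of such a model, a classical reference point is the Rosenthal thick-plate solution for a quasi-steady moving point heat source. The short sketch below evaluates it with assumed, steel-like property values; it is only a closed-form illustration, not the 3-D numerical code developed under the CRADA.

      import numpy as np

      def rosenthal_temperature(xi, y, z, q_w=3000.0, v=5e-3, k=30.0, alpha=8e-6, t0=298.0):
          """Quasi-steady temperature (K) at (xi, y, z) in the frame moving with the
          source: xi is the distance ahead of the source along the travel direction,
          q_w the absorbed power (W), v the travel speed (m/s), k the thermal
          conductivity (W/m-K), alpha the diffusivity (m^2/s), t0 the ambient (K)."""
          r = np.maximum(np.sqrt(xi**2 + y**2 + z**2), 1e-6)  # avoid the point-source singularity
          return t0 + (q_w / (2.0 * np.pi * k * r)) * np.exp(-v * (xi + r) / (2.0 * alpha))

      # Example: centerline temperature 10 mm behind the source, on the surface.
      print(f"{rosenthal_temperature(-10e-3, 0.0, 0.0):.0f} K")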

  14. Intelligent systems for KSC ground processing

    NASA Technical Reports Server (NTRS)

    Heard, Astrid E.

    1992-01-01

    The ground processing and launch of Shuttle vehicles and their payloads is the primary task of Kennedy Space Center. It is a process which is largely manual and contains little inherent automation. Business is conducted today much as it was during previous NASA programs such as Apollo. In light of new programs and decreasing budgets, NASA must find more cost effective ways in which to do business while retaining the quality and safety of activities. Advanced technologies including artificial intelligence could cut manpower and processing time. This paper is an overview of the research and development in AI technology at KSC with descriptions of the systems which have been implemented, as well as a few under development which are promising additions to ground processing software. Projects discussed cover many facets of ground processing activities, including computer sustaining engineering, subsystem monitor and diagnosis tools and launch team assistants. The deployed AI applications have proven an effectiveness which has helped to demonstrate the benefits of utilizing intelligent software in the ground processing task.

  15. Intelligent pumping system developed

    SciTech Connect

    Not Available

    1983-06-01

    The oil field's first intelligent rod pumping system designed specifically to reduce the cost of pumping oil wells now is a reality. As a plus benefit, the system (called Liftronic) is compact and quiet. The new system combines an efficient mechanical design with a computer control system to reduce pumping costs. The unit stands less than 8 ft high, or approx. one-fourth the height of a comparable beam unit. It also mounts directly on the wellhead. The entire system can be concealed behind a fence or enclosed within a small building to make it a more attractive neighbor in residential, commercial, or recreational areas. It is useful also for agricultural areas where overhead irrigation systems restrict the use of many oil field pumping systems.

  16. Aviation System Analysis Capability Executive Assistant Development

    NASA Technical Reports Server (NTRS)

    Roberts, Eileen; Villani, James A.; Anderson, Kevin; Book, Paul

    1999-01-01

    In this technical document, we describe the development of the Aviation System Analysis Capability (ASAC) Executive Assistant (EA) Proof of Concept (POC) and Beta version. We describe the genesis and role of the ASAC system, discuss the objectives of the ASAC system and provide an overview of components and models in the ASAC system, and describe the design process and the results of the ASAC EA POC and Beta system development. We also describe the evaluation process and results for applicable COTS software. The document has seven chapters, a bibliography, and two appendices.

  17. Systems analysis for the development of small resource recovery systems: research and development needs. Final report

    SciTech Connect

    Crnkovich, P G; Helmstetter, A J

    1980-10-01

    The technologies that should be developed to make small-scale solid waste processing facilities attractive and viable for small municipalities with solid waste between 50 and 250 tons per day are identified. Research and development needs for refuse derived fuel systems, thermal systems, and biological processes are listed. Selected research and development needs discussed for mechanical processing systems are: develop data bank for low-cost, low-energy shredder options; develop performance data for shredders applied after separation; develop data bank for trommel performance; and identification and evaluation of low-cost materials separation equipment. Selected research and development needs discussed for thermal systems are: emission levels from solid waste-to-energy systems; determination of the theoretical efficiencies for thermal processing systems; boiler erosion/corrosion evaluation for systems firing refuse derived fuel; optimization of feed and ash handling systems; refractory life and maintenance requirements; development of 5- to 20-TPD systems; and optimization studies of control systems for small modular incinerators. Selected research and development needs discussed for biological processing systems are: optimum design and operation to maximize gas recovery rates and investigate process configuration alternatives for anaerobic digesters.

  18. Development of a School Leadership Evaluation System

    ERIC Educational Resources Information Center

    Orlando, Nik

    2014-01-01

    This action research study examined the effectiveness of the process implemented by Partnerships to Uplift Communities (PUC) Schools Charter Management Organization to develop their school leader evaluation system in collaboration with current PUC school leaders. The development of the leadership evaluation system included the collective voices of…

  19. Development of a School Leadership Evaluation System

    ERIC Educational Resources Information Center

    Orlando, Nik

    2014-01-01

    This action research study examined the effectiveness of the process implemented by Partnerships to Uplift Communities (PUC) Schools Charter Management Organization to develop their school leader evaluation system in collaboration with current PUC school leaders. The development of the leadership evaluation system included the collective voices of…

  20. Scaleup of IGT MILDGAS Process to a process development unit

    SciTech Connect

    Campbell, J.A.L.; Longanbach, J.; Johnson, R.; Underwood, K.; Mead, J.; Carty, R.H.

    1992-12-31

    The MILDGAS process is capable of processing both eastern caking and western non-caking coals to yield a slate of liquid and solid products. The liquids can be processed to produce: feedstocks for chemicals; pitch for use as a binder for electrodes in the aluminum industry; and fuels. Depending on the feed coal characteristics and the operating conditions, the char can be used as an improved fuel for power generation or can be used to make form coke for steel-making blast furnaces or for foundry cupola operations. The specific objectives of the program are to: design, construct, and operate a 24-tons/day adiabatic process development unit (PDU) to obtain process performance data suitable for design scaleup; obtain large batches of coal-derived co-products for industrial evaluation; prepare a detailed design of a demonstration unit; and develop technical and economic plans for commercialization of the MILDGAS process. In this paper, the authors present the process design of the PDU facility, a description of the expected product distribution and the project test plan to be implemented in the program.