Sample records for processing system consists

  1. Seeing the System through the End Users' Eyes: Shadow Expert Technique for Evaluating the Consistency of a Learning Management System

    NASA Astrophysics Data System (ADS)

    Holzinger, Andreas; Stickel, Christian; Fassold, Markus; Ebner, Martin

    Interface consistency is an important basic concept in web design and has an effect on the performance and satisfaction of end users. Consistency also has significant effects on the learning performance of both expert and novice end users. Consequently, the evaluation of consistency within an e-learning system, and the ensuing eradication of irritating discrepancies in the user interface redesign, is a big issue. In this paper, we report on our experiences with the Shadow Expert Technique (SET) during the evaluation of the consistency of the user interface of a large university learning management system. The main objective of this new usability evaluation method is to understand the interaction processes of end users with a specific system interface. Two teams of usability experts worked independently from each other in order to maximize the objectivity of the results. The outcome of the SET method is a list of recommended changes to improve the user interaction processes and hence facilitate high consistency.

  2. Modeling and Advanced Control for Sustainable Process Systems

    EPA Science Inventory

    This book chapter introduces a novel process systems engineering framework that integrates process control with sustainability assessment tools for the simultaneous evaluation and optimization of process operations. The implemented control strategy consists of a biologically-insp...

  3. Process migration in UNIX environments

    NASA Technical Reports Server (NTRS)

    Lu, Chin; Liu, J. W. S.

    1988-01-01

    To support process migration in UNIX environments, the main problem is how to encapsulate the location-dependent features of the system in such a way that a host-independent virtual environment is maintained by the migration handlers on behalf of each migrated process. An object-oriented approach is used to describe the interaction between a process and its environment. More specifically, environmental objects are introduced into UNIX systems to carry out the user-environment interaction. The implementation of the migration handlers is based on both the state consistency criterion and the property consistency criterion.
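
    The environmental-object idea resembles a proxy that hides location-dependent state behind a host-independent interface. A rough Python sketch of the idea; all names are invented for illustration and this is not the authors' UNIX implementation:

      # Rough sketch: the process holds a host-independent handle, and the
      # migration handler rebinds the location-dependent part on each host.
      class EnvFile:
          def __init__(self, virtual_path):
              self.virtual_path = virtual_path   # host-independent name
              self._fh = None                    # location-dependent state

          def bind(self, host_root):
              # Called by the migration handler on the host the process runs on.
              self._fh = open(host_root + self.virtual_path, "a+")

          def write(self, data):
              self._fh.write(data)               # process code never sees host paths

      def migrate(env_objects, dest_root):
          # Migration handler: rebind every environmental object on arrival.
          for obj in env_objects:
              if obj._fh:
                  obj._fh.close()
              obj.bind(dest_root)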

  4. Natural resources information system.

    NASA Technical Reports Server (NTRS)

    Leachtenauer, J. C.; Woll, A. M.

    1972-01-01

    A computer-based Natural Resources Information System was developed for the Bureaus of Indian Affairs and Land Management. The system stores, processes, and displays data useful to the land manager in the decision-making process. Emphasis is placed on the use of remote sensing as a data source. Data input consists of maps, imagery overlays, and on-site data. Maps and overlays are entered using a digitizer and stored as irregular polygons, lines, and points. Processing functions include set intersection, union, and difference, as well as area, length, and value computations. Data output consists of computer tabulations and overlays prepared on a drum plotter.
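
    The overlay operations described (set intersection, union, and difference, plus area and length computations) map directly onto modern vector-GIS primitives. A minimal sketch using the shapely library, which is of course not part of the original 1972 system; the polygons stand in for digitized map overlays:

      # Overlay operations from the abstract, expressed with shapely.
      from shapely.geometry import Polygon

      timber = Polygon([(0, 0), (4, 0), (4, 3), (0, 3)])    # hypothetical timber-cover overlay
      grazing = Polygon([(2, 1), (6, 1), (6, 4), (2, 4)])   # hypothetical grazing-lease overlay

      both = timber.intersection(grazing)    # land in both classes
      either = timber.union(grazing)         # land in either class
      only_timber = timber.difference(grazing)

      print(both.area)                # area computation on the overlay result
      print(either.boundary.length)   # length computation on its boundary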

  5. Sewage Treatment

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Stennis Space Center's aquaculture research program has led to an attractive wastewater treatment for private homes. The system consists of a septic tank or tanks for initial sewage processing and a natural secondary treatment facility for further processing of septic tanks' effluent, consisting of a narrow trench, which contains marsh plants and rocks, providing a place for microorganisms. Plants and microorganisms absorb and digest, thus cleansing partially processed wastewater. No odors are evident and cleaned effluent may be discharged into streams or drainage canals. The system is useful in rural areas, costs about $1,900, and requires less maintenance than mechanical systems.

  6. The Raid distributed database system

    NASA Technical Reports Server (NTRS)

    Bhargava, Bharat; Riedl, John

    1989-01-01

    Raid, a robust and adaptable distributed database system for transaction processing (TP), is described. Raid is a message-passing system, with server processes on each site to manage concurrent processing, consistent replicated copies during site failures, and atomic distributed commitment. A high-level layered communications package provides a clean location-independent interface between servers. The latest design of the package delivers messages via shared memory in a configuration with several servers linked into a single process. Raid provides the infrastructure to investigate various methods for supporting reliable distributed TP. Measurements on TP and server CPU time are presented, along with data from experiments on communications software, consistent replicated copy control during site failures, and concurrent distributed checkpointing. A software tool for evaluating the implementation of TP algorithms in an operating-system kernel is proposed.

  7. Attitude determination of a high altitude balloon system. Part 2: Development of the parameter determination process

    NASA Technical Reports Server (NTRS)

    Nigro, N. J.; Elkouh, A. F.

    1975-01-01

    The attitude of the balloon system is determined as a function of time if: (a) a method for simulating the motion of the system is available, and (b) the initial state is known. The initial state is obtained by fitting the system motion (as measured by sensors) to the corresponding output predicted by the mathematical model. In the case of the LACATE experiment the sensors consisted of three orthogonally oriented rate gyros and a magnetometer, all mounted on the research platform. The initial state was obtained by fitting the angular velocity components measured with the gyros to the corresponding values obtained from the solution of the math model. A block diagram illustrating the attitude determination process employed for the LACATE experiment is shown. The process consists of three essential parts: a process for simulating the balloon system, an instrumentation system for measuring the output, and a parameter estimation process for systematically and efficiently solving for the initial state. Results are presented and discussed.
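
    The fitting step can be read as a nonlinear least-squares problem: choose the initial state so that the simulated angular velocities match the gyro measurements. A minimal sketch under that reading; the dynamics function below is a placeholder, not the LACATE equations of motion:

      # Estimate an initial state by fitting simulated output to measurements.
      import numpy as np
      from scipy.optimize import least_squares

      t = np.linspace(0.0, 10.0, 200)

      def simulate(x0, t):
          # Placeholder dynamics: damped oscillation parameterized by the
          # "initial state" x0 = (amplitude, frequency, damping).
          a, w, c = x0
          return a * np.exp(-c * t) * np.cos(w * t)

      measured = simulate([1.0, 2.0, 0.1], t) + 0.01 * np.random.randn(t.size)

      def residuals(x0):
          return simulate(x0, t) - measured   # model-minus-measurement residual

      fit = least_squares(residuals, x0=[0.5, 1.5, 0.05])
      print(fit.x)   # estimated initial state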

  8. The embedded operating system project

    NASA Technical Reports Server (NTRS)

    Campbell, R. H.

    1985-01-01

    The design and construction of embedded operating systems for real-time advanced aerospace applications was investigated. The applications require reliable operating system support that must accommodate computer networks. Problems that arise in the construction of such operating systems are reported, including reconfiguration, consistency and recovery in a distributed system, and the issues of real-time processing. A thesis that provides theoretical foundations for the use of atomic actions to support fault tolerance and data consistency in real-time object-based systems is included. The following items are addressed: (1) atomic actions and fault-tolerance issues; (2) operating system structure; (3) program development; (4) a reliable compiler for Path Pascal; and (5) mediators, a mechanism for scheduling distributed system processes.

  9. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes attracts increasing attention. It can ensure safety and provide products of consistent quality. The design of a simulation system for batch process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system can provide an effective platform for research on batch process fault diagnosis.

  10. Integrated Information Support System (IISS). Volume 8. User Interface Subsystem. Part 3. User Interface Services Product Specification.

    DTIC Science & Technology

    1985-11-01

    User Interface that consists of a set of callable execution-time routines available to an application program for form processing. IISS Function Screen... provisions for test consist of the normal testing techniques that are accomplished during the construction process. They consist of design and code... application presents a form to the user which must be filled in with information for processing by that application. The application then

  11. Development of integrated control system for smart factory in the injection molding process

    NASA Astrophysics Data System (ADS)

    Chung, M. J.; Kim, C. Y.

    2018-03-01

    In this study, we propose an integrated control system for the automation of the injection molding process, as required for the construction of a smart factory. The injection molding process consists of heating, tool close, injection, cooling, tool open, and take-out. A take-out robot controller, an image processing module, and a process data acquisition interface module were developed and assembled into the integrated control system. By adopting the integrated control system, the injection molding process can be simplified and the cost of constructing a smart factory can be reduced.

  12. Low level image processing techniques using the pipeline image processing engine in the flight telerobotic servicer

    NASA Technical Reports Server (NTRS)

    Nashman, Marilyn; Chaconas, Karen J.

    1988-01-01

    The sensory processing system for the NASA/NBS Standard Reference Model (NASREM) for telerobotic control is described. This control system architecture was adopted by NASA for the Flight Telerobotic Servicer. The control system is hierarchically designed and consists of three parallel systems: task decomposition, world modeling, and sensory processing. The Sensory Processing System is examined, and in particular the image processing hardware and software used to extract features at low levels of sensory processing are described, for tasks representative of those envisioned for the Space Station such as assembly and maintenance.

  13. [Dual process in large number estimation under uncertainty].

    PubMed

    Matsumuro, Miki; Miwa, Kazuhisa; Terai, Hitoshi; Yamada, Kento

    2016-08-01

    According to dual process theory, there are two systems in the mind: an intuitive and automatic System 1 and a logical and effortful System 2. While many previous studies about number estimation have focused on simple heuristics and automatic processes, the deliberative System 2 process has not been sufficiently studied. This study focused on the System 2 process for large number estimation. First, we described an estimation process based on participants' verbal reports. The task, corresponding to the problem-solving process, consisted of creating subgoals, retrieving values, and applying operations. Second, we investigated the influence of such deliberative System 2 processing on intuitive System 1 estimation, using anchoring effects. The results of the experiment showed that the System 2 process could mitigate anchoring effects.

  14. Reprocessing system with nuclide separation based on chromatography in hydrochloric acid solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suzuki, Tatsuya; Tachibana, Yu; Koyama, Shi-ichi

    2013-07-01

    We have proposed a reprocessing system with nuclide separation processes based on the chromatographic technique in the hydrochloric acid solution system. Our proposed system consists of the dissolution process, the reprocessing process, the minor actinide separation process, and nuclide separation processes. In the reprocessing and separation processes, a pyridine resin is used as the main separation medium. It was confirmed that dissolution in the hydrochloric acid solution is easily achieved by plasma voloxidation and by the addition of hydrogen peroxide to the hydrochloric acid solution.

  15. Aligning grammatical theories and language processing models.

    PubMed

    Lewis, Shevaun; Phillips, Colin

    2015-02-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second, how should we relate grammatical theories and language processing models to each other?

  16. Development of the Diagnostic Expert System for Tea Processing

    NASA Astrophysics Data System (ADS)

    Yoshitomi, Hitoshi; Yamaguchi, Yuichi

    A diagnostic expert system for tea processing, which can infer the cause of defects in the processed tea, was developed to contribute to the improvement of tea processing. The system, which consists of several programs, can be used over the Internet. The inference engine at the core of the system adopts a production system, an approach well established in artificial intelligence, and is coded in Prolog as an artificial-intelligence-oriented language. At present, 176 inference rules have been registered in the system. The system's inferences will improve as more rules are added.
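
    A production system of this kind is, at its core, a rule base plus forward chaining. A toy Python sketch of the mechanism; the rules and symptom names are invented for illustration (the real system holds 176 Prolog rules):

      # Toy forward-chaining production system in the spirit of the
      # diagnostic engine described above. Facts and rules are invented.
      rules = [
          ({"burnt smell", "high drying temperature"}, "over-firing"),
          ({"dull leaf color", "slow rolling"}, "insufficient rolling pressure"),
          ({"over-firing"}, "reduce final drying temperature"),
      ]

      def infer(facts):
          facts = set(facts)
          changed = True
          while changed:            # keep firing rules until nothing new is derived
              changed = False
              for conditions, conclusion in rules:
                  if conditions <= facts and conclusion not in facts:
                      facts.add(conclusion)
                      changed = True
          return facts

      print(infer({"burnt smell", "high drying temperature"}))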

  17. LANDSAT-D data format control book. Volume 6: (Products)

    NASA Technical Reports Server (NTRS)

    Kabat, F.

    1981-01-01

    Four basic product types are generated from the raw thematic mapper (TM) and multispectral scanner (MSS) payload data by the NASA GSFC LANDSAT 4 data management system: (1) unprocessed data (raw sensor data); (2) partially processed data, which consists of radiometrically corrected sensor data with geometric correction information appended; (3) fully processed data, which consists of radiometrically and geometrically corrected sensor data; and (4) inventory data which consists of summary information about product types 2 and 3. High density digital recorder formatting and the radiometric correction process are described. Geometric correction information is included.

  18. Consistent Correlations for Parameterised Boolean Equation Systems with Applications in Correctness Proofs for Manipulations

    NASA Astrophysics Data System (ADS)

    Willemse, Tim A. C.

    We introduce the concept of consistent correlations for parameterised Boolean equation systems (PBESs), motivated largely by the laborious proofs of correctness required for most manipulations in this setting. Consistent correlations focus on relating the equations that occur in PBESs, rather than their solutions. For a fragment of PBESs, consistent correlations are shown to coincide with a recently introduced form of bisimulation. Finally, we show that bisimilarity on processes induces consistent correlations on PBESs encoding model checking problems. We apply our theory to two example manipulations from the literature.

  19. Aligning Grammatical Theories and Language Processing Models

    ERIC Educational Resources Information Center

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  20. Information system life-cycle and documentation standards, volume 1

    NASA Technical Reports Server (NTRS)

    Callender, E. David; Steinbacher, Jody

    1989-01-01

    The Software Management and Assurance Program (SMAP) Information System Life-Cycle and Documentation Standards Document describes the Version 4 standard information system life-cycle in terms of processes, products, and reviews. The description of the products includes detailed documentation standards. The standards in this document set can be applied to the life-cycle, i.e., to each phase in the system's development, and to the documentation of all NASA information systems. This provides consistency across the agency as well as visibility into the completeness of the information recorded. An information system is software-intensive, but consists of any combination of software, hardware, and operational procedures required to process, store, or transmit data. This document defines a standard life-cycle model and content for associated documentation.

  1. Advanced Manufacturing Systems in Food Processing and Packaging Industry

    NASA Astrophysics Data System (ADS)

    Shafie Sani, Mohd; Aziz, Faieza Abdul

    2013-06-01

    In this paper, several advanced manufacturing systems in the food processing and packaging industry are reviewed, including biodegradable smart packaging and nanocomposites, and advanced automation control systems consisting of fieldbus technology, distributed control systems, and food safety inspection features. The main purpose of current technology in the food processing and packaging industry is discussed, with efficiency of the plant process, productivity, quality, and safety as the major concerns. These applications were chosen because they are robust, flexible, reconfigurable, preserve the quality of the food, and are efficient.

  2. CO 2-scrubbing and methanation as purification system for PEFC

    NASA Astrophysics Data System (ADS)

    Ledjeff-Hey, K.; Roes, J.; Wolters, R.

    Hydrogen is usually produced by steam reforming of natural gas in large-scale processes. The reformate consists of hydrogen, carbon dioxide, carbon monoxide, and residues of hydrocarbons. Since the anode catalyst of a polymer electrolyte membrane fuel cell (PEFC) is usually based on platinum, which is easily poisoned by carbon monoxide, the conditioned feed gas should contain less than 100 ppmv CO, and preferably, less than 10 ppmv. Depending on the design and operating conditions of the hydrogen production process, the CO content of a typical reformate gas, even after the CO shift reactor may be in the range of 0.2-1.0 vol.%; this is far higher than a PEFC can tolerate. A CO management system is required to lower the CO concentration to acceptable levels. In many cases, the CO purification system consists of a combination of physical or chemical processes to achieve the necessary reduction in CO content. A promising alternative for hydrogen purification is a combined process consisting of a carbon dioxide scrubber with subsequent methanation to reduce the carbon monoxide content to an acceptable level of less than 10 ppmv.
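
    For reference, the underlying methanation stoichiometry (textbook chemistry, not quoted from the paper):

      \mathrm{CO} + 3\,\mathrm{H_2} \rightarrow \mathrm{CH_4} + \mathrm{H_2O},
      \qquad \Delta H^{\circ} \approx -206\ \mathrm{kJ\,mol^{-1}}

      \mathrm{CO_2} + 4\,\mathrm{H_2} \rightarrow \mathrm{CH_4} + 2\,\mathrm{H_2O},
      \qquad \Delta H^{\circ} \approx -165\ \mathrm{kJ\,mol^{-1}}

    Scrubbing the CO2 first keeps the second, hydrogen-hungry reaction from consuming the product stream, which is why the combined scrubber-plus-methanation arrangement is attractive.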

  3. NTP comparison process

    NASA Technical Reports Server (NTRS)

    Corban, Robert

    1993-01-01

    The systems engineering process for the concept definition phase of the program involves requirements definition, system definition, and consistent concept definition. The requirements definition process involves obtaining a complete understanding of the system requirements based on customer needs, mission scenarios, and nuclear thermal propulsion (NTP) operating characteristics. A system functional analysis is performed to provide comprehensive traceability and verification of top-level requirements down to detailed system specifications, and provides significant insight into the measures of system effectiveness to be utilized in system evaluation. The second key element in the process is the definition of system concepts to meet the requirements. This part of the process involves engine system and reactor contractor teams developing alternative NTP system concepts that can be evaluated against specific attributes, as well as a reference configuration against which to compare system benefits and merits. Quality function deployment (QFD), as an excellent tool within Total Quality Management (TQM) techniques, can provide the required structure and a link to the voice of the customer in establishing critical system qualities and their relationships. The third element of the process is the consistent performance comparison. The comparison process involves validating developed concept data and quantifying system merits through analysis, computer modeling, simulation, and rapid prototyping of the proposed high-risk NTP subsystems. The maximum possible amount of quantitative data will be developed and/or validated for use in the QFD evaluation matrix. If, upon evaluation, a new concept or its associated subsystems are determined to have substantial merit, those features will be incorporated into the reference configuration for subsequent system definition and comparison efforts.

  4. The default-mode, ego-functions and free-energy: a neurobiological account of Freudian ideas

    PubMed Central

    Friston, K. J.

    2010-01-01

    This article explores the notion that Freudian constructs may have neurobiological substrates. Specifically, we propose that Freud’s descriptions of the primary and secondary processes are consistent with self-organized activity in hierarchical cortical systems and that his descriptions of the ego are consistent with the functions of the default-mode and its reciprocal exchanges with subordinate brain systems. This neurobiological account rests on a view of the brain as a hierarchical inference or Helmholtz machine. In this view, large-scale intrinsic networks occupy supraordinate levels of hierarchical brain systems that try to optimize their representation of the sensorium. This optimization has been formulated as minimizing a free-energy; a process that is formally similar to the treatment of energy in Freudian formulations. We substantiate this synthesis by showing that Freud’s descriptions of the primary process are consistent with the phenomenology and neurophysiology of rapid eye movement sleep, the early and acute psychotic state, the aura of temporal lobe epilepsy and hallucinogenic drug states. PMID:20194141

  5. Control of Prose Processing via Instructional and Typographical Cues.

    ERIC Educational Resources Information Center

    Glynn, Shawn M.; Di Vesta, Francis J.

    1979-01-01

    College students studied text about an imaginary solar system. Two cuing systems were manipulated to induce a single or double set of cues consistent with one or two sets of text propositions, or no target propositions were specified. Cuing systems guided construction and implementation of prose-processing decision criteria. (Author/RD)

  6. Log-Based Recovery in Asynchronous Distributed Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Kane, Kenneth Paul

    1989-01-01

    A log-based mechanism is described for restoring consistent states to replicated data objects after failures. This report focuses on preserving a causal form of consistency based on the notion of virtual time. Causal consistency has been shown to apply to a variety of applications, including distributed simulation, task decomposition, and mail delivery systems. Several mechanisms have been proposed for implementing causally consistent recovery, most notably those of Strom and Yemini, and Johnson and Zwaenepoel. The mechanism proposed here differs from these in two major respects. First, a roll-forward style of recovery is implemented. A functioning process is never required to roll back its state in order to achieve consistency with a recovering process. Second, the mechanism does not require any explicit information about the causal dependencies between updates. Instead, all necessary dependency information is inferred from the orders in which updates are logged by the object servers. This basic recovery technique appears to be applicable to forms of consistency other than causal consistency. In particular, it is shown how the recovery technique can be modified to support an atomic form of consistency (grouping consistency). By combining grouping consistency with causal consistency, it may even be possible to implement serializable consistency within this mechanism.
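
    The central idea, that dependency order can be recovered from the logs themselves, can be illustrated in a few lines: replay updates in logged sequence order, so a recovering replica rolls forward without explicit dependency metadata. A toy Python illustration only, not the authors' protocol:

      # Toy roll-forward recovery: rebuild replica state by replaying
      # updates in the order the object servers logged them. Dependency
      # information is implicit in the log order.
      import heapq

      def recover(logs):
          # logs: list of per-server logs, each a sorted list of
          # (sequence_number, key, value) tuples.
          state = {}
          for _, key, value in heapq.merge(*logs):
              state[key] = value        # roll forward; never roll back
          return state

      log_a = [(1, "x", 10), (3, "y", 7)]
      log_b = [(2, "x", 12)]
      print(recover([log_a, log_b]))    # {'x': 12, 'y': 7}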

  7. Adaptive weld control for high-integrity welding applications

    NASA Technical Reports Server (NTRS)

    Powell, Bradley W.

    1993-01-01

    An advanced adaptive control weld system for high-integrity welding applications is presented. The system consists of a state-of-the-art weld control subsystem, motion control subsystem, and sensor subsystem which closes the loop on the process. The adaptive control subsystem (ACS), which is required to totally close the loop on weld process control, consists of a multiprocessor system, data acquisition hardware, and three welding sensors which provide measurements from all areas around the torch in real time. The ACS acquires all 'measurables' and feeds offset trims back into the weld control and motion control subsystems to modify the 'controllables' in order to maintain a previously defined weld quality.

  8. TARGET's role in knowledge acquisition, engineering, validation, and documentation

    NASA Technical Reports Server (NTRS)

    Levi, Keith R.

    1994-01-01

    We investigate the use of the TARGET task analysis tool in the development of rule-based expert systems. We found TARGET to be very helpful in the knowledge acquisition process. It enabled us to perform knowledge acquisition with one knowledge engineer rather than two. In addition, it improved communication between the domain expert and the knowledge engineer. We also found it to be useful for both the rule development and refinement phases of the knowledge engineering process. Using the network in these phases required us to develop guidelines that enabled us to easily translate the network into production rules. A significant requirement for TARGET remaining useful throughout the knowledge engineering process was the need to carefully maintain consistency between the network and the rule representations. Maintaining consistency not only benefited the knowledge engineering process, but also had significant payoffs in the areas of validation of the expert system and documentation of the knowledge in the system.

  9. FAA Directives System

    DOT National Transportation Integrated Search

    1992-08-26

    Consistent with the Federal Aviation Administration's mission to foster a safe, secure, and efficient aviation system is the need for an effective and efficient process for communicating policy and procedures. The FAA Directives System provide...

  10. A KPI framework for process-based benchmarking of hospital information systems.

    PubMed

    Jahn, Franziska; Winter, Alfred

    2011-01-01

    Benchmarking is a major topic for monitoring, directing and elucidating the performance of hospital information systems (HIS). Current approaches neglect the outcome of the processes that are supported by the HIS and their contribution to the hospital's strategic goals. We suggest to benchmark HIS based on clinical documentation processes and their outcome. A framework consisting of a general process model and outcome criteria for clinical documentation processes is introduced.

  11. Self-Consistency of the Theory of Elementary Stage Rates of Reversible Processes and the Equilibrium Distribution of Reaction Mixture Components

    NASA Astrophysics Data System (ADS)

    Tovbin, Yu. K.

    2018-06-01

    An analysis is presented of one of the key concepts of physical chemistry of condensed phases: the theory self-consistency in describing the rates of elementary stages of reversible processes and the equilibrium distribution of components in a reaction mixture. It posits that by equating the rates of forward and backward reactions, we must obtain the same equation for the equilibrium distribution of reaction mixture components, which follows directly from deducing the equation in equilibrium theory. Ideal reaction systems always have this property, since the theory is of a one-particle character. Problems arise in considering interparticle interactions responsible for the nonideal behavior of real systems. The Eyring and Temkin approaches to describing nonideal reaction systems are compared. Conditions for the self-consistency of the theory for mono- and bimolecular processes in different types of interparticle potentials, the degree of deviation from the equilibrium state, allowing for the internal motions of molecules in condensed phases, and the electronic polarization of the reagent environment are considered within the lattice gas model. The inapplicability of the concept of an activated complex coefficient for reaching self-consistency is demonstrated. It is also shown that one-particle approximations for considering intermolecular interactions do not provide a theory of self-consistency for condensed phases. We must at a minimum consider short-range order correlations.
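
    For orientation, the self-consistency requirement for an elementary reversible step can be stated in standard detailed-balance form (added here, not quoted from the paper): equating forward and backward rates at equilibrium must reproduce the equilibrium constant,

      r_f = k_f \prod_i a_i^{\nu_i}, \qquad
      r_b = k_b \prod_j a_j^{\mu_j}, \qquad
      r_f = r_b \;\Longrightarrow\;
      K = \frac{k_f}{k_b} = \frac{\prod_j a_j^{\mu_j}}{\prod_i a_i^{\nu_i}},

    where the a_i are reactant activities (stoichiometric coefficients \nu_i) and the a_j are product activities (\mu_j). In a nonideal system the interaction corrections must enter the forward rate, the backward rate, and the activities consistently; otherwise the recovered K disagrees with the equilibrium-theory result, which is precisely the failure mode the abstract attributes to one-particle approximations.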

  12. [Evaluation of Educational Effect of Problem-Posing System in Nursing Processing Study].

    PubMed

    Tsuji, Keiko; Takano, Yasuomi; Yamakawa, Hiroto; Kaneko, Daisuke; Takai, Kiyako; Kodama, Hiromi; Hagiwara, Tomoko; Komatsugawa, Hiroshi

    2015-09-01

    The study of the nursing process is generally difficult, because nursing college students must both understand knowledge and utilize it. We have developed an integrated system to understand, utilize, and share knowledge. We added a problem-posing function to this system, and expected that students would deepen their understanding of the nursing process through the new system. This system consists of four steps: create a problem, create an answer input section, create a hint, and verification. Nursing students created problems related to the nursing process with this system. When we gave a lecture on the nursing process for second-year students of A university, we tried to use the problem-creating function of this system. We evaluated the effect by the number of created problems and their contents, that is, whether the contents corresponded to the learning stage of the lecture or not. We also evaluated the correlation between these measures and the regular examination and report scores. We found: 1. a weak correlation between the number of created problems and the report score (r=0.27); 2. significant differences between the regular examination and report scores of students who created problems corresponding to the learning stage and those of students who created problems not corresponding to it (P<0.05). From these results, problem-posing is suggested to be effective for fixing and utilizing knowledge in lectures on nursing process theory.

  13. Cogeneration Technology Alternatives Study (CTAS). Volume 3: Industrial processes

    NASA Technical Reports Server (NTRS)

    Palmer, W. B.; Gerlaugh, H. E.; Priestley, R. R.

    1980-01-01

    Cogenerating electric power and process heat in single energy conversion systems, rather than separately in utility plants and in process boilers, is examined in terms of cost savings. Various advanced energy conversion systems are examined and compared with each other and with current technology systems for their savings in fuel energy, costs, and emissions in individual plants and on a national level. About fifty industrial processes from the target energy-consuming sectors were used as a basis for matching a similar number of energy conversion systems that are considered candidates and that could be made available in the 1985 to 2000 time period. The sectors considered included food, textiles, lumber, paper, chemicals, petroleum, glass, and primary metals. The energy conversion systems included steam and gas turbines, diesels, thermionics, Stirling engines, closed-cycle and steam-injected gas turbines, and fuel cells. Fuels considered were coal, both coal- and petroleum-based residual and distillate liquid fuels, and low-Btu gas obtained through the on-site gasification of coal. An attempt was made to use consistent assumptions and a consistent set of ground rules specified by NASA for determining performance and cost. Data and narrative descriptions of the industrial processes are given.

  14. Steam atmosphere dryer project: System development and field test. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1999-02-01

    The objective of this project was to develop and demonstrate the use of a superheated steam atmosphere dryer as a highly improved alternative to conventional hot-air drying systems, the present industrial standard method for drying various wet feedstocks. The development program plan consisted of three major activities. The first was engineering analysis and testing of a small-scale laboratory superheated steam dryer. This dryer provided the basic engineering heat transfer data necessary to design a large-scale system. The second major activity consisted of the design, fabrication, and laboratory checkout testing of the field-site prototype superheated steam dryer system. The third major activity consisted of the installation and testing of the complete 250-lb/hr evaporation rate dryer and a 30-kW cogeneration system in conjunction with an anaerobic digester facility at the Village of Bergen, NY. Feedstock for the digester was waste residue from a nearby commercial food processing plant. The superheated steam dryer system was placed into operation in August 1996 and operated successfully through March 1997. During this period, the dryer processed all the material from the digester to a powdered consistency usable as a high-nitrogen-based fertilizer.

  15. An intelligent factory-wide optimal operation system for continuous production process

    NASA Astrophysics Data System (ADS)

    Ding, Jinliang; Chai, Tianyou; Wang, Hongfeng; Wang, Junwei; Zheng, Xiuping

    2016-03-01

    In this study, a novel intelligent factory-wide operation system for a continuous production process is designed to optimise the entire production process, which consists of multiple units; furthermore, this system is developed using process operational data to avoid the complexity of mathematical modelling of the continuous production process. The data-driven approach aims to specify the structure of the optimal operation system; in particular, the operational data of the process are used to formulate each part of the system. In this context, the domain knowledge of process engineers is utilised, and a closed-loop dynamic optimisation strategy, which combines feedback, performance prediction, feed-forward, and dynamic tuning schemes into a framework, is employed. The effectiveness of the proposed system has been verified using industrial experimental results.

  16. Spacelab Data Processing Facility (SLDPF) quality assurance expert systems development

    NASA Technical Reports Server (NTRS)

    Kelly, Angelita C.; Basile, Lisa; Ames, Troy; Watson, Janice; Dallam, William

    1987-01-01

    Spacelab Data Processing Facility (SLDPF) expert system prototypes were developed to assist in the quality assurance of Spacelab and/or Attached Shuttle Payload (ASP) processed telemetry data. The SLDPF functions include the capturing, quality monitoring, processing, accounting, and forwarding of mission data to various user facilities. Prototypes for the two SLDPF functional elements, the Spacelab Output Processing System and the Spacelab Input Processing Element, are described. The prototypes have produced beneficial results including an increase in analyst productivity, a decrease in the burden of tedious analyses, the consistent evaluation of data, and the providing of concise historical records.

  17. Spacelab Data Processing Facility (SLDPF) quality assurance expert systems development

    NASA Technical Reports Server (NTRS)

    Kelly, Angelita C.; Basile, Lisa; Ames, Troy; Watson, Janice; Dallam, William

    1987-01-01

    Spacelab Data Processing Facility (SLDPF) expert system prototypes have been developed to assist in the quality assurance of Spacelab and/or Attached Shuttle Payload (ASP) processed telemetry data. SLDPF functions include the capturing, quality monitoring, processing, accounting, and forwarding of mission data to various user facilities. Prototypes for the two SLDPF functional elements, the Spacelab Output Processing System and the Spacelab Input Processing Element, are described. The prototypes have produced beneficial results including an increase in analyst productivity, a decrease in the burden of tedious analyses, the consistent evaluation of data, and the providing of concise historical records.

  18. DE-CERTS: A Decision Support System for a Comparative Evaluation Method for Risk Management Methodologies and Tools

    DTIC Science & Technology

    1991-09-01

    III. The Analytic Hierarchy Process: A. Introduction; B. The AHP Process... Implementation of CERTS using AHP: 1. Consistency; 2. User Interface... the proposed technique into a Decision Support System. Expert Choice implements the Analytic Hierarchy Process (AHP), an approach to multi-criteria

  19. Development of a robust space power system decision model

    NASA Astrophysics Data System (ADS)

    Chew, Gilbert; Pelaccio, Dennis G.; Jacobs, Mark; Stancati, Michael; Cataldo, Robert

    2001-02-01

    NASA continues to evaluate power systems to support human exploration of the Moon and Mars. The system(s) would address all power needs of surface bases and on-board power for space transfer vehicles. Prior studies have examined both solar and nuclear-based alternatives with respect to individual issues such as sizing or cost. What has not been addressed is a comprehensive look at the risks and benefits of the options that could serve as the analytical framework to support a system choice that best serves the needs of the exploration program. This paper describes the SAIC-developed Space Power System Decision Model, which uses a formal Two-step Analytical Hierarchy Process (TAHP) methodology to clearly distinguish candidate power systems in terms of benefits, safety, and risk. TAHP is a decision-making process based on the Analytical Hierarchy Process, which employs a hierarchic approach of structuring decision factors by weights and relatively ranks system design options on a consistent basis. This decision process also includes a level of data gathering and organization that produces a consistent, well-documented assessment, from which the capability of each power system option to meet top-level goals can be prioritized. The model defined in this effort focuses on the comparative assessment of candidate power system options for Mars surface applications. This paper describes the principles of this approach, the assessment criteria and weighting procedures, and the tools to capture and assess the expert knowledge associated with space power system evaluation.
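
    AHP's core computation is compact: priorities are the principal eigenvector of a pairwise-comparison matrix, and a consistency ratio flags incoherent judgments. A generic Python sketch with numpy; the matrix values below are invented, not taken from the study:

      # Generic AHP step: derive criterion weights from a pairwise-comparison
      # matrix via its principal eigenvector, then check judgment consistency.
      import numpy as np

      # Invented 3x3 comparison of benefits vs. safety vs. risk.
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      vals, vecs = np.linalg.eig(A)
      k = np.argmax(vals.real)              # principal eigenvalue
      w = np.abs(vecs[:, k].real)
      w /= w.sum()                          # normalized priority weights

      n = A.shape[0]
      ci = (vals.real[k] - n) / (n - 1)     # consistency index
      cr = ci / 0.58                        # Saaty's random index for n=3
      print(w, cr)                          # judgments acceptable if cr < 0.1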

  20. Programmable partitioning for high-performance coherence domains in a multiprocessor system

    DOEpatents

    Blumrich, Matthias A [Ridgefield, CT; Salapura, Valentina [Chappaqua, NY

    2011-01-25

    A multiprocessor computing system and a method of logically partitioning a multiprocessor computing system are disclosed. The multiprocessor computing system comprises a multitude of processing units, and a multitude of snoop units. Each of the processing units includes a local cache, and the snoop units are provided for supporting cache coherency in the multiprocessor system. Each of the snoop units is connected to a respective one of the processing units and to all of the other snoop units. The multiprocessor computing system further includes a partitioning system for using the snoop units to partition the multitude of processing units into a plurality of independent, memory-consistent, adjustable-size processing groups. Preferably, when the processor units are partitioned into these processing groups, the partitioning system also configures the snoop units to maintain cache coherency within each of said groups.

  1. Single-chip microcomputer for image processing in the photonic measuring system

    NASA Astrophysics Data System (ADS)

    Smoleva, Olga S.; Ljul, Natalia Y.

    2002-04-01

    The non-contact measuring system has been designed for rail-track parameter control on the Moscow Metro. It detects several significant parameters: rail-track width, rail-track height, gage, rail-slums, crosslevel, pickets, and car speed. The system consists of three subsystems: a non-contact system for rail-track width, height, and gage inspection; a non-contact system for rail-slums inspection; and a subsystem for crosslevel, speed, and picket detection. Data from the subsystems is transferred to a pre-processing unit. To process the data received from the subsystems, the single-chip signal processor ADSP-2185 is used, as it provides the required processing speed. After the data is processed, it is sent to a PC, which processes it further and outputs it in readable form.

  2. In-Process Thermal Imaging of the Electron Beam Freeform Fabrication Process

    NASA Technical Reports Server (NTRS)

    Taminger, Karen M.; Domack, Christopher S.; Zalameda, Joseph N.; Taminger, Brian L.; Hafley, Robert A.; Burke, Eric R.

    2016-01-01

    Researchers at NASA Langley Research Center have been developing the Electron Beam Freeform Fabrication (EBF3) metal additive manufacturing process for the past 15 years. In this process, an electron beam is used as a heat source to create a small molten pool on a substrate into which wire is fed. The electron beam and wire feed assembly are translated with respect to the substrate to follow a predetermined tool path. This process is repeated in a layer-wise fashion to fabricate metal structural components. In-process imaging has been integrated into the EBF3 system using a near-infrared (NIR) camera. The images are processed to provide thermal and spatial measurements that have been incorporated into a closed-loop control system to maintain consistent thermal conditions throughout the build. Other information in the thermal images is being used to assess quality in real time by detecting flaws in prior layers of the deposit. NIR camera incorporation into the system has improved the consistency of the deposited material and provides the potential for real-time flaw detection which, ultimately, could lead to the manufacture of better, more reliable components using this additive manufacturing process.
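
    The closed-loop use of the NIR measurements can be illustrated with a bare-bones proportional-integral loop that trims beam power toward a melt-pool temperature setpoint. This is an illustrative controller only, not NASA's implementation; the setpoint, gains, and stand-in plant model are all invented:

      # Illustrative PI loop: trim electron-beam power so the measured
      # melt-pool temperature tracks a setpoint. All numbers are invented.
      setpoint = 1900.0             # target melt-pool temperature, K (assumed)
      kp, ki = 0.8, 0.2             # invented controller gains
      base_power, integral = 500.0, 0.0
      temperature = 1700.0          # initial measured temperature, K
      dt = 0.1                      # control interval, s (assumed)

      for _ in range(200):
          error = setpoint - temperature
          integral += error * dt
          power = base_power + kp * error + ki * integral
          # Stand-in plant: temperature relaxes toward a power-dependent level.
          temperature += 0.05 * (2.0 * power - temperature)

      print(round(temperature, 1), round(power, 1))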

  3. Video Guidance, Landing, and Imaging system (VGLIS) for space missions

    NASA Technical Reports Server (NTRS)

    Schappell, R. T.; Knickerbocker, R. L.; Tietz, J. C.; Grant, C.; Flemming, J. C.

    1975-01-01

    The feasibility of an autonomous video guidance system that is capable of observing a planetary surface during terminal descent and selecting the most acceptable landing site was demonstrated. The system was breadboarded and "flown" on a physical simulator consisting of a control panel and monitor, a dynamic simulator, and a PDP-9 computer. The breadboard VGLIS consisted of an image dissector camera and the appropriate processing logic. Results are reported.

  4. EOS MLS Science Data Processing System: A Description of Architecture and Capabilities

    NASA Technical Reports Server (NTRS)

    Cuddy, David T.; Echeverri, Mark D.; Wagner, Paul A.; Hanzel, Audrey T.; Fuller, Ryan A.

    2006-01-01

    This paper describes the architecture and capabilities of the Science Data Processing System (SDPS) for the EOS MLS. The SDPS consists of two major components: the Science Computing Facility and the Science Investigator-led Processing System. The Science Computing Facility provides the facilities for the EOS MLS Science Team to perform the functions of scientific algorithm development, processing software development, quality control of data products, and scientific analyses. The Science Investigator-led Processing System processes and reprocesses the science data for the entire mission and delivers the data products to the Science Computing Facility and to the Goddard Space Flight Center Earth Science Distributed Active Archive Center, which archives and distributes the standard science products.

  5. METHOD OF SEPARATING FROTHS FROM LIQUIDS

    DOEpatents

    Monet, G.P.

    1958-01-21

    A method for separating solids and precipitates from liquids is described. The method is particularly adapted for, and valuable in, processing highly radioactive solutions. It consists, in essence, in employing the principles of froth flotation to effect the separation of approximately 99% of the solids present. An apparatus, consisting of a system of pipes, valves, and vessels, for carrying out the process of this patent is also described.

  6. Asynchronous Processing of a Constellation of Geostationary and Polar-Orbiting Satellites for Fire Detection and Smoke Estimation

    NASA Astrophysics Data System (ADS)

    Hyer, E. J.; Peterson, D. A.; Curtis, C. A.; Schmidt, C. C.; Hoffman, J.; Prins, E. M.

    2014-12-01

    The Fire Locating and Monitoring of Burning Emissions (FLAMBE) system converts satellite observations of thermally anomalous pixels into spatially and temporally continuous estimates of smoke release from open biomass burning. This system currently processes data from a constellation of 5 geostationary and 2 polar-orbiting sensors. Additional sensors, including NPP VIIRS and the imager on the Korea COMS-1 geostationary satellite, will soon be added. This constellation experiences schedule changes and outages of various durations, making the set of available scenes for fire detection highly variable on an hourly and daily basis. Adding to the complexity, the latency of the satellite data is variable between and within sensors. FLAMBE shares with many fire detection systems the goal of detecting as many fires as possible as early as possible, but the FLAMBE system must also produce a consistent estimate of smoke production with minimal artifacts from the changing constellation. To achieve this, NRL has developed a system of asynchronous processing and cross-calibration that permits satellite data to be used as it arrives, while preserving the consistency of the smoke emission estimates. This talk describes the asynchronous data ingest methodology, including latency statistics for the constellation. We also provide an overview and show results from the system we have developed to normalize multi-sensor fire detection for consistency.
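
    Asynchronous ingest with variable latency reduces to processing each scene as it arrives while binning detections into fixed time slots, so late data updates a past bin rather than shifting the estimate. A schematic Python sketch, not the FLAMBE code; the sensor names and calibration factors are invented:

      # Schematic asynchronous ingest: scenes arrive in any order and with
      # any latency; detections accumulate into fixed hourly bins so the
      # smoke-emission time series stays consistent.
      from collections import defaultdict

      CALIBRATION = {"GOES": 1.0, "MODIS": 0.9}   # invented per-sensor factors
      hourly_emission = defaultdict(float)        # hour -> emission estimate

      def ingest(scene):
          # scene: {"obs_hour": int, "sensor": str, "detections": [{"frp": float}]}
          factor = CALIBRATION.get(scene["sensor"], 1.0)
          for det in scene["detections"]:
              # Cross-calibrated fire radiative power goes into the bin for
              # the observation time, regardless of arrival order or latency.
              hourly_emission[scene["obs_hour"]] += factor * det["frp"]

      # Scenes may arrive out of order; a late scene updates a past bin.
      ingest({"obs_hour": 14, "sensor": "MODIS", "detections": [{"frp": 120.0}]})
      ingest({"obs_hour": 13, "sensor": "GOES",  "detections": [{"frp": 80.0}]})
      print(dict(hourly_emission))                # {14: 108.0, 13: 80.0}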

  7. Description of a dual fail-operational redundant strapdown inertial measurement unit for integrated avionics systems research

    NASA Technical Reports Server (NTRS)

    Bryant, W. H.; Morrell, F. R.

    1981-01-01

    Attention is given to a redundant strapdown inertial measurement unit for integrated avionics. The system consists of four two-degree-of-freedom tuned-rotor gyros and four two-degree-of-freedom accelerometers in a skewed and separable semi-octahedral array. The unit is coupled through instrument electronics to two flight computers which compensate sensor errors. The flight computers are interfaced to the microprocessors and run failure detection, isolation, redundancy management, and flight control/navigation algorithms. The unit provides dual fail-operational performance and has data processing frequencies consistent with integrated avionics concepts presently planned.

  8. Thermally stable, low resistance contact systems for use with shallow junction p(+) nn(+) and n(+)pp(+) InP solar cells

    NASA Technical Reports Server (NTRS)

    Weizer, V. G.; Fatemi, N. S.; Hoffman, R. W.

    1995-01-01

    Two contact systems for use on shallow junction InP solar cells are described. The feature shared by these two contact systems is the absence of the metallurgical intermixing that normally takes place between the semiconductor and the contact metallization during the sintering process. The n(+)pp(+) cell contact system, consisting of a combination of Au and Ge, not only exhibits very low resistance in the as-fabricated state, but also yields post-sinter contact resistivity values of 10(exp -7) ohm-sq cm, with effectively no metal-InP interdiffusion. The p(+)nn(+) cell contact system, consisting of a combination of Ag and Zn, permits low resistance ohmic contact to be made directly to a shallow junction p/n InP device without harming the device itself during the contacting process.

  9. STP Position Paper: Recommended Best Practices for Sampling, Processing and Analysis of the Peripheral Nervous System (Nerves and Somatic and Autonomic Ganglia) during Nonclinical Toxicity Studies

    EPA Science Inventory

    These Society of Toxicologic Pathology “best” practice recommendations should ensure consistent sampling, processing, and evaluation of the peripheral nervous system (PNS). For toxicity studies where neurotoxicity is not anticipated (Situation 1), PNS evaluation may be limited...

  10. Balanced diets in food systems: emerging trends and challenges for human health.

    PubMed

    Sammugam, Lakhsmi; Pasupuleti, Visweswara Rao

    2018-04-25

    Processed foods, generally known as raw foods modified by innovative processing technologies, have altered constituents such as natural enzymes, fatty acids, micronutrients, macronutrients, and vitamins. In contrast to fresh and unprocessed foods, processed foods are guaranteed to be safer, imperishable, and long lasting, and to retain a high level of nutrient bioactivity. Currently, evolution in food processing technologies is necessary to address food security and safety, nutrition demand and availability, and other global challenges in the food system. In this scenario, this review presents information on two food processing technologies, their effects on foods before and after processing, and the impact of the food products on human health. It is also very well established that understanding the type and structure of the foods to be processed can assist food processing industries in the development of novel food products. In connection with this fact, the present article also discusses emerging trends and possible modifications in food processing technologies that combine conventional and modern techniques to obtain suitable nutritional and safety qualities in food.

  11. Increased Reliability of Gas Turbine Components by Robust Coatings Manufacturing

    NASA Astrophysics Data System (ADS)

    Sharma, A.; Dudykevych, T.; Sansom, D.; Subramanian, R.

    2017-08-01

    The expanding operational windows of advanced gas turbine components demand increasing performance capability from protective coating systems. This demand has led to the development of novel multi-functional, multi-material coating system architectures in recent years. In addition, the increasing dependency of components exposed to extreme environments on protective coatings results in more severe penalties in case of a coating system failure. This emphasizes that the reliability and consistency of protective coating systems are as important as their superior performance. By means of examples, this paper describes the effects of scatter in material properties resulting from manufacturing variations on coating life predictions. A strong foundation in process-property-performance correlations, as well as regular monitoring and control of the coating process, is essential for a robust and well-controlled coating process. Proprietary and/or commercially available diagnostic tools can help in achieving these goals, but their usage in industrial settings is still limited. Various key contributors to process variability are briefly discussed, along with the limitations of existing process and product control methods. Other aspects that are important for product reliability and consistency in serial manufacturing, as well as advanced testing methodologies to simplify and enhance product inspection and improve objectivity, are briefly described.

  12. Development of expert systems for modeling of technological process of pressure casting on the basis of artificial intelligence

    NASA Astrophysics Data System (ADS)

    Gavarieva, K. N.; Simonova, L. A.; Pankratov, D. L.; Gavariev, R. V.

    2017-09-01

    This article considers the main component of an expert system for the pressure die casting process, which consists of algorithms united into logical models. The characteristics of the system, which present data on the condition of the controlled object, are described. A number of logically interconnected steps that increase the quality of the resulting castings are developed.

  13. Heat pump processes induced by laser radiation

    NASA Technical Reports Server (NTRS)

    Garbuny, M.; Henningsen, T.

    1980-01-01

    A carbon dioxide laser system was constructed for the demonstration of heat pump processes induced by laser radiation. The system consisted of a frequency doubling stage, a gas reaction cell with its vacuum and high purity gas supply system, and provisions to measure the temperature changes by pressure or, alternatively, by density changes. The theoretical considerations for the choice of designs and components are discussed.

  14. Linking consistency with object/thread semantics - An approach to robust computation

    NASA Technical Reports Server (NTRS)

    Chen, Raymond C.; Dasgupta, Partha

    1989-01-01

    This paper presents an object/thread based paradigm that links data consistency with object/thread semantics. The paradigm can be used to achieve a wide range of consistency semantics from strict atomic transactions to standard process semantics. The paradigm supports three types of data consistency. Object programmers indicate the type of consistency desired on a per-operation basis and the system performs automatic concurrency control and recovery management to ensure that those consistency requirements are met. This allows programmers to customize consistency and recovery on a per-application basis without having to supply complicated, custom recovery management schemes. The paradigm allows robust and nonrobust computation to operate concurrently on the same data in a well defined manner. The operating system needs to support only one vehicle of computation - the thread.
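
    The per-operation consistency declarations can be sketched as a decorator that tags each object operation with the consistency type it needs and applies matching concurrency control and recovery. A loose Python illustration of the idea, not the paper's system:

      # Loose illustration: operations declare the consistency they need,
      # and the wrapper supplies (trivial) concurrency control and recovery.
      import threading
      from functools import wraps

      def consistency(kind):
          # kind: "atomic" (lock plus undo-on-failure) or "none" (plain call).
          def deco(fn):
              lock = threading.Lock()
              @wraps(fn)
              def wrapper(obj, *args, **kwargs):
                  if kind == "none":
                      return fn(obj, *args, **kwargs)
                  with lock:                         # serialize atomic operations
                      snapshot = dict(obj.__dict__)  # undo log: copy object state
                      try:
                          return fn(obj, *args, **kwargs)
                      except Exception:
                          obj.__dict__.clear()
                          obj.__dict__.update(snapshot)  # recover old state
                          raise
              return wrapper
          return deco

      class Account:
          def __init__(self):
              self.balance = 0
          @consistency("atomic")
          def withdraw(self, n):
              self.balance -= n
              if self.balance < 0:
                  raise ValueError("overdraft")      # aborts; balance rolls back

      a = Account()
      try:
          a.withdraw(5)
      except ValueError:
          pass
      print(a.balance)    # 0: the failed withdrawal was rolled back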

  15. COMPUTERIZED RISK AND BIOACCUMULATION SYSTEM (VERSION 1.0)

    EPA Science Inventory

    CRABS is a combination of a rule-based expert system and more traditional procedural programming techniques. Rule-based expert systems attempt to emulate the decision making process of human experts within a clearly defined subject area. Expert systems consist of an "inference engi...

  16. ARES - A New Airborne Reflective Emissive Spectrometer

    DTIC Science & Technology

    2005-10-01

    Information and Management System (DIMS), an automated processing environment with robot archive interface as established for the handling of satellite data...consisting of geocoded ground reflectance data. All described processing steps will be integrated in the automated processing environment DIMS to assure a

  17. Providing security for automated process control systems at hydropower engineering facilities

    NASA Astrophysics Data System (ADS)

    Vasiliev, Y. S.; Zegzhda, P. D.; Zegzhda, D. P.

    2016-12-01

    This article suggests the concept of a cyberphysical system to manage computer security of automated process control systems at hydropower engineering facilities. According to the authors, this system consists of a set of information processing tools and computer-controlled physical devices. Examples of cyber attacks on power engineering facilities are provided, and a strategy of improving cybersecurity of hydropower engineering systems is suggested. The architecture of the multilevel protection of the automated process control system (APCS) of power engineering facilities is given, including security systems, control systems, access control, encryption, secure virtual private network of subsystems for monitoring and analysis of security events. The distinctive aspect of the approach is consideration of interrelations and cyber threats, arising when SCADA is integrated with the unified enterprise information system.

  18. Combined Acquisition/Processing For Data Reduction

    NASA Astrophysics Data System (ADS)

    Kruger, Robert A.

    1982-01-01

    Digital image processing systems necessarily consist of three components: acquisition, storage/retrieval, and processing. The acquisition component requires the greatest data handling rates. By coupling the acquisition with some online hardwired processing, data rates and capacities for short term storage can be reduced. Furthermore, long term storage requirements can be reduced further by appropriate processing and editing of image data contained in short term memory. The net result could be reduced performance requirements for mass storage, processing and communication systems. Reduced amounts of data also should speed later data analysis and diagnostic decision making.
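
    A minimal sketch of the coupling idea, assuming a toy detector and an invented region-of-interest reduction rule: only the reduced record, not the full frame, reaches short-term storage.

    ```python
    import numpy as np

    def acquire_frames(n_frames, shape=(256, 256)):
        rng = np.random.default_rng(0)
        for _ in range(n_frames):
            yield rng.normal(100.0, 5.0, shape)   # raw frame from the detector

    def online_reduce(frame, roi=(slice(96, 160), slice(96, 160))):
        # keep only a region of interest plus summary statistics,
        # instead of the full raw frame
        return {"roi": frame[roi].astype(np.float32),
                "mean": float(frame.mean()),
                "max": float(frame.max())}

    short_term_store = [online_reduce(f) for f in acquire_frames(10)]
    raw_bytes = 10 * 256 * 256 * 8
    kept_bytes = sum(r["roi"].nbytes for r in short_term_store)
    print(f"stored {kept_bytes} of {raw_bytes} bytes "
          f"({100 * kept_bytes / raw_bytes:.1f}%)")
    ```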

  19. Multilevel photonic modules for millimeter-wave phased-array antennas

    NASA Astrophysics Data System (ADS)

    Paolella, Arthur C.; Bauerle, Athena; Joshi, Abhay M.; Wright, James G.; Coryell, Louis A.

    2000-09-01

    Millimeter wave phased array systems have antenna element sizes and spacings similar to MMIC chip dimensions by virtue of the operating wavelength. Designing modules with traditional planar packaging techniques is therefore difficult. An advantageous way to maintain a small module footprint compatible with Ka-Band and higher frequency systems is to take advantage of two leading edge technologies: opto-electronic integrated circuits (OEICs) and multilevel packaging technology. Under a Phase II SBIR these technologies are combined to form photonic modules for optically controlled millimeter wave phased array antennas. The proposed module, consisting of an OEIC integrated with a planar antenna array, will operate in the 40 GHz region. The OEIC consists of an InP based dual-depletion PIN photodetector and distributed amplifier. The multi-level module will be fabricated using an enhanced thick film circuit process. Since the modules are batch fabricated using standard commercial processes, they have the potential to be low cost while maintaining high performance, impacting both military and commercial communications systems.

  20. Development of patient collation system by kinetic analysis for chest dynamic radiogram with flat panel detector

    NASA Astrophysics Data System (ADS)

    Tsuchiya, Yuichiro; Kodera, Yoshie

    2006-03-01

    In the picture archiving and communication system (PACS) environment, it is important that all images be stored in the correct location. However, if information such as the patient's name or identification number has been entered incorrectly, it is difficult to notice the error. The present study was performed to develop a system that collates patients automatically for dynamic radiogram examinations by kinetic analysis, and to evaluate the performance of the system. Dynamic chest radiographs during respiration were obtained using a modified flat panel detector system. The computer algorithm developed in this study consisted of two main procedures: kinetic map image processing and collation processing. Kinetic map processing is a new algorithm to visualize movement in dynamic radiography; it performs direction classification of optical flows and an intensity-density transformation. Collation processing consisted of analysis with an artificial neural network (ANN) and discrimination based on Mahalanobis' generalized distance; these procedures were performed to evaluate the similarity of image pairs from the same person. Finally, we investigated the performance of our system using eight healthy volunteers' radiographs. Performance was measured as sensitivity and specificity, both of which were 100% for our system. This result indicates that our system has excellent performance for patient recognition. Our system will be useful in PACS management for dynamic chest radiography.

  1. Representational Issues in Systemic Functional Grammar and Systemic Grammar and Functional Unification Grammar. ISI Reprint Series.

    ERIC Educational Resources Information Center

    Matthiessen, Christian; Kasper, Robert

    Consisting of two separate papers, "Representational Issues in Systemic Functional Grammar," by Christian Matthiessen and "Systemic Grammar and Functional Unification Grammar," by Robert Kasper, this document deals with systemic aspects of natural language processing and linguistic theory and with computational applications of…

  2. System approach to modeling of industrial technologies

    NASA Astrophysics Data System (ADS)

    Toropov, V. S.; Toropov, E. S.

    2018-03-01

    The authors presented a system of methods for modeling and improving industrial technologies. The system consists of an information part and a software part. The information part is structured information about industrial technologies; this structure follows a template with several essential categories used to improve the technological process and eliminate weaknesses in the process chain. The base category is the physical effect that takes place when the technological process proceeds. The software part of the system can apply various methods of creative search to the content stored in the information part. These methods pay particular attention to energy transformations in the technological process. Application of the system will allow a systematic approach to improving technologies and obtaining new technical solutions.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washiya, Tadahiro; Komaki, Jun; Funasaka, Hideyuki

    Japan Atomic Energy Agency (JAEA) has been developing the new aqueous reprocessing system named 'NEXT' (New Extraction system for TRU recovery)1-2, which provides many advantages such as waste volume reduction, cost savings through advanced components, and simplification of process operation. Advanced head-end systems in the 'NEXT' process consist of a fuel disassembly system, a fuel shearing system and a continuous dissolver system. We developed a reliable fuel disassembly system with an innovative procedure, and the short-length shearing system and continuous dissolver system can provide highly concentrated dissolution adapted to the uranium crystallization process. We have carried out experimental studies and fabrication of engineering-scale test devices to confirm the systems' performance. In this paper, research and development of advanced head-end systems are described. (authors)

  4. Modeling and Simulation Roadmap to Enhance Electrical Energy Security of U.S. Naval Bases

    DTIC Science & Technology

    2012-03-01

    evaluating power system architectures and technologies and, therefore, can become a valuable tool for the implementation of the described plan for Navy...a well validated and consistent process for evaluating power system architectures and technologies and, therefore, can be a valuable tool for the...process for evaluating power system architectures and component technologies is needed to support the development and implementation of these new

  5. A multiprocessing architecture for real-time monitoring

    NASA Technical Reports Server (NTRS)

    Schmidt, James L.; Kao, Simon M.; Read, Jackson Y.; Weitzenkamp, Scott M.; Laffey, Thomas J.

    1988-01-01

    A multitasking architecture for performing real-time monitoring and analysis using knowledge-based problem solving techniques is described. To handle asynchronous inputs and perform in real time, the system consists of three or more distributed processes which run concurrently and communicate via a message passing scheme. The Data Management Process acquires, compresses, and routes the incoming sensor data to other processes. The Inference Process consists of a high performance inference engine that performs a real-time analysis on the state and health of the physical system. The I/O Process receives sensor data from the Data Management Process and status messages and recommendations from the Inference Process, updates its graphical displays in real time, and acts as the interface to the console operator. The distributed architecture has been interfaced to an actual spacecraft (NASA's Hubble Space Telescope) and is able to process the incoming telemetry in real-time (i.e., several hundred data changes per second). The system is being used in two locations for different purposes: (1) in Sunnyvale, California at the Space Telescope Test Control Center it is used in the preflight testing of the vehicle; and (2) in Greenbelt, Maryland at NASA/Goddard it is being used on an experimental basis in flight operations for health and safety monitoring.
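
    A rough sketch of that three-process split (illustrative only, not the deployed code): a data management process compresses and routes samples, an inference process applies a toy health rule, and an I/O process stands in for the operator display; the processes communicate by message-passing queues.

    ```python
    import multiprocessing as mp

    def data_management(raw_q, inf_q, io_q):
        while (sample := raw_q.get()) is not None:
            compressed = round(sample, 1)        # toy "compression"
            inf_q.put(compressed)                # route to inference
            io_q.put(("data", compressed))       # route to displays
        inf_q.put(None)

    def inference(inf_q, io_q):
        while (value := inf_q.get()) is not None:
            if value > 9.0:                      # toy health/safety rule
                io_q.put(("alert", f"telemetry high: {value}"))
        io_q.put(None)

    def io_process(io_q):
        while (msg := io_q.get()) is not None:
            print(msg)                           # stands in for the operator display

    if __name__ == "__main__":
        raw_q, inf_q, io_q = mp.Queue(), mp.Queue(), mp.Queue()
        procs = [mp.Process(target=data_management, args=(raw_q, inf_q, io_q)),
                 mp.Process(target=inference, args=(inf_q, io_q)),
                 mp.Process(target=io_process, args=(io_q,))]
        for p in procs:
            p.start()
        for x in [1.23, 9.87, 4.56]:             # asynchronous telemetry samples
            raw_q.put(x)
        raw_q.put(None)                          # shut down the pipeline
        for p in procs:
            p.join()
    ```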

  6. Sensor fault detection and isolation system for a condensation process.

    PubMed

    Castro, M A López; Escobar, R F; Torres, L; Aguilar, J F Gómez; Hernández, J A; Olivares-Peregrino, V H

    2016-11-01

    This article presents the design of a sensor Fault Detection and Isolation (FDI) system for a condensation process based on a nonlinear model. The condenser is modeled by dynamic and thermodynamic equations. For this work, the dynamic equations are described by three pairs of differential equations which represent the energy balance between the fluids. The thermodynamic equations consist of algebraic heat transfer equations and empirical equations that allow for the estimation of heat transfer coefficients. The FDI system consists of a bank of two nonlinear high-gain observers, used to detect, estimate, and isolate a fault in either outlet temperature sensor. The main contributions of this work were the experimental validation of the condenser nonlinear model and the FDI system. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
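
    The core residual idea behind such a scheme can be sketched as follows, with an invented first-order temperature model standing in for the condenser equations and an open-loop estimate standing in for the high-gain observer: a residual above threshold flags the corresponding sensor as faulty.

    ```python
    import numpy as np

    def simulate(n=200, fault_at=120, bias=3.0):
        k = np.arange(n)
        t_true = 40 + 5 * (1 - np.exp(-k / 50))              # outlet temperature
        meas = t_true + np.random.default_rng(1).normal(0, 0.1, n)
        meas[fault_at:] += bias                              # additive sensor fault
        return meas

    def observer(n, t0=40.0):
        # open-loop model-based estimate of the same temperature; in the
        # paper a nonlinear high-gain observer plays this role
        est = np.empty(n)
        est[0] = t0
        for k in range(1, n):
            est[k] = est[k - 1] + 0.02 * (45.0 - est[k - 1])  # first-order model
        return est

    meas = simulate()
    residual = np.abs(meas - observer(len(meas)))
    alarms = np.flatnonzero(residual > 1.0)                  # detection threshold
    print("sensor fault flagged at sample", alarms[0] if alarms.size else None)
    ```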

  7. ISRU System Model Tool: From Excavation to Oxygen Production

    NASA Technical Reports Server (NTRS)

    Santiago-Maldonado, Edgardo; Linne, Diane L.

    2007-01-01

    In the late 80's, conceptual designs for an in situ oxygen production plant were documented in a study by Eagle Engineering [1]. In the "Summary of Findings" of this study, it is clearly pointed out that: "reported process mass and power estimates lack a consistent basis to allow comparison." The study goes on to say: "A study to produce a set of process mass, power, and volume requirements on a consistent basis is recommended." Today, approximately twenty years later, as humans plan to return to the moon and venture beyond, the need for flexible up-to-date models of the oxygen extraction production process has become even more clear. Multiple processes for the production of oxygen from lunar regolith are being investigated by NASA, academia, and industry. Three processes that have shown technical merit are molten regolith electrolysis, hydrogen reduction, and carbothermal reduction. These processes have been selected by NASA as the basis for the development of the ISRU System Model Tool (ISMT). In working to develop up-to-date system models for these processes, NASA hopes to accomplish the following: (1) help in the evaluation process to select the most cost-effective and efficient process for further prototype development, (2) identify key parameters, (3) optimize the excavation and oxygen production processes, and (4) provide estimates on energy and power requirements, mass and volume of the system, oxygen production rate, mass of regolith required, mass of consumables, and other important parameters. Also, as confidence and high fidelity are achieved with each component's model, new techniques and processes can be introduced and analyzed at a fraction of the cost of traditional hardware development and test approaches. A first generation ISRU System Model Tool has been used to provide inputs to the Lunar Architecture Team studies.

  8. Network model of chemical-sensing system inspired by mouse taste buds.

    PubMed

    Tateno, Katsumi; Igarashi, Jun; Ohtubo, Yoshitaka; Nakada, Kazuki; Miki, Tsutomu; Yoshii, Kiyonori

    2011-07-01

    Taste buds endure extreme changes in temperature, pH, osmolarity, and so on. Even though taste bud cells are replaced within a short span, they contribute to consistent taste reception. Each taste bud consists of about 50 cells whose networks are assumed to process taste information, at least preliminarily. In this article, we describe a neural network model inspired by the taste bud cells of mice. It consists of two layers. In the first layer, the chemical stimulus is transduced into an irregular spike train. The synchronization of the output impulses is induced by the irregular spike train at the second layer. These results show that the intensity of the chemical stimulus is encoded as the degree of synchronization of the output impulses. The present algorithms for signal processing result in a robust chemical-sensing system.
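
    A minimal sketch of the two-layer scheme, with invented parameters: layer 1 transduces stimulus intensity into an irregular (Poisson-like) spike train, and a small layer-2 population of leaky integrators driven by that common train fires more synchronously as the intensity rises.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def layer1_spikes(rate_hz, n_steps=2000, dt=1e-3):
        # irregular (Poisson-like) spike train with rate set by the stimulus
        return rng.random(n_steps) < rate_hz * dt

    def layer2_population(drive, n_cells=10, thresh=1.0, dt=1e-3, tau=0.02, kick=0.5):
        v = rng.random(n_cells)                          # random initial states
        spikes = np.zeros((len(drive), n_cells), dtype=bool)
        for t, s in enumerate(drive):
            v += dt * (-v / tau) + (kick if s else 0.0)  # leaky integration + common input
            fired = v >= thresh
            spikes[t] = fired
            v[fired] = 0.0                               # reset after firing
        return spikes

    for rate in (20.0, 80.0):                            # stimulus intensities
        out = layer2_population(layer1_spikes(rate))
        sync = (out.sum(axis=1) >= 5).mean()             # crude synchrony index
        print(f"input rate {rate:5.1f} Hz -> synchrony index {sync:.4f}")
    ```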

  9. Intelligent system of coordination and control for manufacturing

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2016-08-01

    This paper aims to shape an intelligent monitoring and control system that optimizes the material and information flows of the company. The paper presents a model for an intelligent real-time tracking and control system. The production system proposed for simulation analysis provides the ability to track and control the process in real time. Simulation models help in understanding the influence of changes in system structure, the influence of commands on the general condition of the manufacturing process, and the influence of process conditions on the behavior of some system parameters. The practical character of the work consists in the tracking and real-time control of the technological process. It is based on modular systems analyzed using mathematical models, graphic-analytical sizing, configuration, optimization and simulation.

  10. Common Badging and Access Control System (CBACS)

    NASA Technical Reports Server (NTRS)

    Baldridge, Tim

    2005-01-01

    The goals of the project are to: achieve high business value through a common badging and access control system that integrates with smart cards; provide physical (versus logical) deployment of smart cards initially; provide a common, consistent, and reliable environment into which to release the smart card; give the opportunity to develop agency-wide consistent processes, practices, and policies; enable enterprise data capture and management; and promote data validation prior to SC issuance.

  11. Materials And Processes Technical Information System (MAPTIS) LDEF materials database

    NASA Technical Reports Server (NTRS)

    Davis, John M.; Strickland, John W.

    1992-01-01

    The Materials and Processes Technical Information System (MAPTIS) is a collection of materials data which was computerized and is available to engineers in the aerospace community involved in the design and development of spacecraft and related hardware. Consisting of various database segments, MAPTIS provides the user with information such as material properties, test data derived from tests specifically conducted for qualification of materials for use in space, verification and control, project management, material information, and various administrative requirements. A recent addition to the project management segment consists of materials data derived from the LDEF flight. This tremendous quantity of data consists of both pre-flight and post-flight data in such diverse areas as optical/thermal, mechanical and electrical properties, atomic concentration surface analysis data, as well as general data such as sample placement on the satellite, A-O flux, equivalent sun hours, etc. Each data point is referenced to the primary investigator(s) and the published paper from which the data was taken. The MAPTIS system is envisioned to become the central location for all LDEF materials data. This paper comprises a general overview of the MAPTIS system and the types of data contained within, as well as the specific LDEF data element and the data contained in that segment.

  12. Toward Interpreting Failure in Sintered-Silver Interconnection Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wereszczak, Andrew A; Waters, Shirley B

    2016-01-01

    The mechanical strength and subsequent reliability of a sintered-silver interconnection system is a function of numerous independent parameters. That system is still undergoing process development. Most of those parameters (e.g., choice of plating) are arguably and unfortunately taken for granted and are independent of the silver's cohesive strength. To explore such effects, shear strength testing and failure analyses were completed on a simple, mock sintered-silver interconnection system consisting of bonding two DBC ceramic substrates. Silver and gold platings were part of the test matrix, as were pre-drying strategies and the consideration of stencil-printing vs. screen-printing. Shear strength of sintered-silver interconnect systems was found to be insensitive to the choice of plating, drying practice, and printing method, provided careful and consistent processing of the sintered-silver is practiced. But if the service stress in sintered-silver interconnect systems is anticipated to exceed ~ 60 MPa, then the system will likely fail.

  13. Managing IT service management implementation complexity: from the perspective of the Warfield Version of systems science

    NASA Astrophysics Data System (ADS)

    Wan, Jiangping; Jones, James D.

    2013-11-01

    The Warfield version of systems science supports a wide variety of application areas, and is useful to practitioners who use the work program of complexity (WPOC) tool. In this article, WPOC is applied to information technology service management (ITSM) for managing the complexity of projects. In discussing the application of WPOC to ITSM, we discuss several steps of WPOC. The discovery step of WPOC consists of a description process and a diagnosis process. During the description process, 52 risk factors are identified, which are then narrowed to 20 key risk factors. All of this is done by interviews and surveys. Root risk factors (the most basic risk factors) consist of 11 kinds of common 'mindbugs' which are selected from an interpretive structural model. This is achieved by empirical analysis of 25 kinds of mindbugs. (A lesser aim of this research is to affirm that these mindbugs developed from a Western mindset have corresponding relevance in a completely different culture: the People's Republic of China.) During the diagnosis process, the relationships among the root risk factors in the implementation of the ITSM project are identified. The resolution step of WPOC consists of a design process and an implementation process. During the design process, issues related to the ITSM application are compared to both e-Government operation and maintenance, and software process improvement. The ITSM knowledge support structure is also designed at this time. During the implementation process, 10 keys to the successful implementation of ITSM projects are identified.

  14. Data acquisition and processing system for the HT-6M tokamak fusion experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shu, Y.T.; Liu, G.C.; Pang, J.Q.

    1987-08-01

    This paper describes a high-speed data acquisition and processing system which has been successfully operated on the HT-6M tokamak fusion experimental device. The system collects, archives and analyzes up to 512 kilobytes of data from each shot of the experiment. A shot lasts 50-150 milliseconds and occurs every 5-10 minutes. The system consists of two PDP-11/24 computer systems. One PDP-11/24 is used for real-time data taking and on-line data analysis. It is based upon five CAMAC crates organized into a parallel branch. Another PDP-11/24 is used for off-line data processing. Both data acquisition software RSX-DAS and data processing software RSX-DAP have modular, multi-tasking and concurrent processing features.

  15. Central Data Processing System (CDPS) user's manual: Solar heating and cooling program

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The software and data base management system required to assess the performance of solar heating and cooling systems installed at multiple sites is presented. The instrumentation data associated with these systems is collected, processed, and presented in a form which supports continuity of performance evaluation across all applications. The CDPS consisted of three major elements: communication interface computer, central data processing computer, and performance evaluation data base. Users of the performance data base were identified, and procedures for operation and guidelines for software maintenance were outlined. The manual also defined the output capabilities of the CDPS in support of external users of the system.

  16. Multi-mission space science data processing systems - Past, present, and future

    NASA Technical Reports Server (NTRS)

    Stallings, William H.

    1990-01-01

    Packetized telemetry that is consistent with the international Consultative Committee for Space Data Systems (CCSDS) has been baselined for future NASA missions such as Space Station Freedom. Some experiences from past and present multimission systems are examined, including current experiences in implementing a CCSDS standard packetized data processing system, relative to the effectiveness of the multimission approach in lowering life cycle cost and the complexity of meeting new mission needs. It is shown that the continued effort toward standardization of telemetry and processing support will permit the development of multimission systems needed to meet the increased requirements of future NASA missions.

  17. Appendices to the model description document for a computer program for the emulation/simulation of a space station environmental control and life support system

    NASA Technical Reports Server (NTRS)

    Yanosy, James L.

    1988-01-01

    A Model Description Document for the Emulation Simulation Computer Model was already published. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem which operated with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation simulation combination in the design, development, and test of a piece of ARS hardware, SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from test. Slight changes were also made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is to create two system simulations using these models. The first system presented consists of one air and one water processing system. The second consists of a potential air revitalization system.

  18. Fabrication Process for Large Size Mold and Alignment Method for Nanoimprint System

    NASA Astrophysics Data System (ADS)

    Ishibashi, Kentaro; Kokubo, Mitsunori; Goto, Hiroshi; Mizuno, Jun; Shoji, Shuichi

    Nanoimprint technology is considered one of the mass-production methods for displays for cellular phones and notebook computers, with Anti-Reflection Structures (ARS) patterns and so on. In this case, a large-size mold with nanometer-order patterns is very important. Here, we describe the fabrication process for a large-size mold and the alignment method for a UV nanoimprint system. We developed an original mold fabrication process using nanoimprint and etching techniques. In a 66 × 45 mm2 area, 200 nm period seamless patterns were formed using this process. Because the accuracy of pattern connection depends on the alignment method, we also constructed an original alignment system consisting of a CCD-camera system, an X-Y-θ table, a moiré-fringe method, and an image processing system. The accuracy of this alignment system was within 20 nm.

  19. Energy Models

    EPA Science Inventory

    Energy models characterize the energy system, its evolution, and its interactions with the broader economy. The energy system consists of primary resources, including both fossil fuels and renewables; power plants, refineries, and other technologies to process and convert these r...

  20. Summary of the First Network-Centric Sensing Community Workshop, ’Netted Sensors: A Government, Industry and Academia Dialogue’

    DTIC Science & Technology

    2006-04-01

    and Scalability, (2) Sensors and Platforms, (3) Distributed Computing and Processing , (4) Information Management, (5) Fusion and Resource Management...use of the deployed system. 3.3 Distributed Computing and Processing Session The Distributed Computing and Processing Session consisted of three

  1. Experimental approaches to well controlled studies of thin-film nucleation and growth.

    NASA Technical Reports Server (NTRS)

    Poppa, H.; Moorhead, R. D.; Heinemann, K.

    1972-01-01

    Particular features and the performance of two experimental systems are described for quantitative studies of thin-film nucleation and growth processes including epitaxial depositions. System I consists of a modified LEED-Auger instrument combined with high-resolution electron microscopy. System II is a UHV electron microscope adapted for in-situ deposition studies. The two systems complement each other ideally, and the combined use of both can result in a comprehensive investigation of vapor deposition processes not obtainable with any other known method.

  2. Cold Vacuum Drying facility civil structural system design description (SYS 06)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PITKOFF, C.C.

    This document describes the Cold Vacuum Drying (CVD) Facility civil-structural system. This system consists of the facility structure, including the administrative and process areas. The system's primary purpose is to provide a facility to house the CVD process and personnel and to provide a tertiary level of containment. The document provides a description of the facility and demonstrates how the design meets the various requirements imposed by the safety analysis report and the design requirements document.

  3. Radar Unix: a complete package for GPR data processing

    NASA Astrophysics Data System (ADS)

    Grandjean, Gilles; Durand, Herve

    1999-03-01

    A complete package for ground penetrating radar data interpretation, including data processing, forward modeling and a case history database consultation, is presented. Running on a Unix operating system, its architecture consists of a graphical user interface generating batch files transmitted to a library of processing routines. This design allows better software maintenance and gives the user the possibility to run processing or modeling batch files directly, deferred in time. A case history database is available; it consists of a hypertext document which can be consulted using a standard HTML browser. All the software specifications are presented through a realistic example.
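
    A minimal sketch of the batch-file design under stated assumptions (the directive names and routines below are invented, not the actual package): the same text file a GUI would generate can be run by hand, deferred in time, and is dispatched to a library of processing routines.

    ```python
    import numpy as np

    def dewow(trace, window=10):
        # subtract a running mean to remove low-frequency drift
        window = int(window)
        kernel = np.ones(window) / window
        return trace - np.convolve(trace, kernel, mode="same")

    def gain(trace, g=2.0):
        return trace * g                                      # simple amplitude gain

    ROUTINES = {"dewow": dewow, "gain": gain}                 # the routine library

    def run_batch(batch_text, trace):
        for line in batch_text.strip().splitlines():
            name, *args = line.split()
            trace = ROUTINES[name](trace, *map(float, args))  # dispatch a directive
        return trace

    trace = np.sin(np.linspace(0, 20, 500)) + 0.5             # toy GPR trace
    batch = """
    dewow 15
    gain 3.0
    """
    print(run_batch(batch, trace)[:5])
    ```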

  4. Advanced Land Imager Assessment System

    NASA Technical Reports Server (NTRS)

    Chander, Gyanesh; Choate, Mike; Christopherson, Jon; Hollaren, Doug; Morfitt, Ron; Nelson, Jim; Nelson, Shar; Storey, James; Helder, Dennis; Ruggles, Tim; hide

    2008-01-01

    The Advanced Land Imager Assessment System (ALIAS) supports radiometric and geometric image processing for the Advanced Land Imager (ALI) instrument onboard NASA's Earth Observing-1 (EO-1) satellite. ALIAS consists of two processing subsystems for radiometric and geometric processing of the ALI's multispectral imagery. The radiometric processing subsystem characterizes and corrects, where possible, radiometric qualities including: coherent, impulse, and random noise; signal-to-noise ratios (SNRs); detector operability; gain; bias; saturation levels; striping and banding; and the stability of detector performance. The geometric processing subsystem and analysis capabilities support sensor alignment calibrations, sensor chip assembly (SCA)-to-SCA alignments and band-to-band alignment; and perform geodetic accuracy assessments, modulation transfer function (MTF) characterizations, and image-to-image characterizations. ALIAS also characterizes and corrects band-to-band registration, and performs systematic precision and terrain correction of ALI images. This system can geometrically correct, and automatically mosaic, the SCA image strips into a seamless, map-projected image. This system provides a large database, which enables bulk trending for all ALI image data and significant instrument telemetry. Bulk trending consists of two functions: Housekeeping Processing and Bulk Radiometric Processing. The Housekeeping function pulls telemetry and temperature information from the instrument housekeeping files and writes this information to a database for trending. The Bulk Radiometric Processing function writes statistical information from the dark data acquired before and after the Earth imagery and the lamp data to the database for trending. This allows for multi-scene statistical analyses.

  5. Exploiting Virtual Synchrony in Distributed Systems

    DTIC Science & Technology

    1987-02-01

    for distributed systems yield the best performance relative to the level of synchronization guaranteed by the primitive. A programmer could then... synchronization facility. Semaphores: replicated binary and general semaphores. Monitors: monitor lock, condition variables and signals. Deadlock detection...We describe applications of a new software abstraction called the virtually synchronous process group. Such a group consists of a set of processes

  6. Automatic Processing of Metallurgical Abstracts for the Purpose of Information Retrieval. Final Report.

    ERIC Educational Resources Information Center

    Melton, Jessica S.

    Objectives of this project were to develop and test a method for automatically processing the text of abstracts for a document retrieval system. The test corpus consisted of 768 abstracts from the metallurgical section of Chemical Abstracts (CA). The system, based on a subject indexing rational, had two components: (1) a stored dictionary of words…

  7. Use of artificial intelligence in the production of high quality minced meat

    NASA Astrophysics Data System (ADS)

    Kapovsky, B. R.; Pchelkina, V. A.; Plyasheshnik, P. I.; Dydykin, A. S.; Lazarev, A. A.

    2017-09-01

    A design for an automatic line for minced meat production according to a new production technology based on an innovative meat milling method is proposed. This method achieves the necessary degree of raw material comminution at the raw material preparation stage, which intensifies production by making traditional meat mass comminution equipment unnecessary. To ensure consistent quality of the product obtained, on-line automatic control of the technological process for minced meat production is envisaged. This system has been developed using artificial intelligence methods and technologies. The system is trainable during the operation process, adapts to changes in processed raw material characteristics and to external impacts that affect the system operation, and manufactures meat shavings with minimal dispersion of the typical particle size. The control system includes equipment for express analysis of the chemical composition of the minced meat and its temperature after comminution. In this case, the minced meat production process can be controlled strictly as a function of time, which excludes subjective factors in assessing the degree of finished product readiness. This will allow finished meat products with consistent, targeted high quality to be produced.

  8. Homogeneous (Cu, Ni)6Sn5 intermetallic compound joints rapidly formed in asymmetrical Ni/Sn/Cu system using ultrasound-induced transient liquid phase soldering process.

    PubMed

    Li, Z L; Dong, H J; Song, X G; Zhao, H Y; Tian, H; Liu, J H; Feng, J C; Yan, J C

    2018-04-01

    Homogeneous (Cu, Ni)6Sn5 intermetallic compound (IMC) joints were rapidly formed in an asymmetrical Ni/Sn/Cu system by an ultrasound-induced transient liquid phase (TLP) soldering process. In the traditional TLP soldering process, the intermetallic joints formed in the Ni/Sn/Cu system consisted of major (Cu, Ni)6Sn5 and minor Cu3Sn IMCs, and the grain morphology of the (Cu, Ni)6Sn5 IMCs exhibited fine rounded, needlelike and coarse rounded shapes from the Ni side to the Cu side, highly in accordance with the Ni concentration gradient across the joints. However, in the ultrasound-induced TLP soldering process, the intermetallic joints formed in the Ni/Sn/Cu system consisted only of (Cu, Ni)6Sn5 IMCs, which exhibited a uniform grain morphology of rounded shape with a remarkably narrowed Ni concentration gradient. The ultrasound-induced homogeneous intermetallic joints exhibited higher shear strength (61.6 MPa) than the traditional heterogeneous intermetallic joints (49.8 MPa). Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Idea of Identification of Copper Ore with the Use of Process Analyser Technology Sensors

    NASA Astrophysics Data System (ADS)

    Jurdziak, Leszek; Kaszuba, Damian; Kawalec, Witold; Król, Robert

    2016-10-01

    The Polish resources of copper ore exploited by the KGHM S.A. underground mines are considered among the most complex in the world and, consequently, the most difficult to process. The ore consists of three lithology forms, dolomites, shales and sandstones, in varying proportions, which has a significant impact on the effectiveness of the grinding and flotation processes. The lithological composition of the ore is generally recognised in-situ, but after being mined it is blended on its long way from various mining fields to the processing plant by the complex transportation system consisting of belt conveyors with numerous switching points, ore bunkers and shafts. Identification of the lithological composition of the ore being supplied to the processing plant should improve the adjustment of the ore processing machinery, aiming to decrease the specific processing (mainly grinding) energy consumption as well as increase the metal recovery. The novel idea of Process Analyser Technology (PAT) sensors - information-carrying pellets, dropped into the transported or processed bulk material, which can be read directly when needed - is investigated for various applications within the DISIRE project (a part of the SPIRE initiative, acting under the Horizon2020 framework program) and is adopted here for annotating the transported copper ore for the needs of ore processing plant control. The identification of the lithological composition of ore blended on its way to the processing plant can be achieved by an information system consisting of pellets that keep the information about the original location of the portions of conveyed ore, the digital geological database keeping the data of in-situ lithology, and the simulation models of the transportation system, necessary to evaluate the composition of the blended ore. The assumptions of the proposed solution and the plan of necessary in-situ tests (with the special respect to harsh environment of

  10. System design package for the solar heating and cooling central data processing system

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The central data processing system provides the resources required to assess the performance of solar heating and cooling systems installed at remote sites. These sites consist of residential, commercial, government, and educational types of buildings, and the solar heating and cooling systems can be hot-water, space heating, cooling, and combinations of these. The instrumentation data associated with these systems will vary according to the application and must be collected, processed, and presented in a form which supports continuity of performance evaluation across all applications. Overall software system requirements were established for use in the central integration facility which transforms raw data collected at remote sites into performance evaluation information for assessing the performance of solar heating and cooling systems.

  11. Image acquisition system for traffic monitoring applications

    NASA Astrophysics Data System (ADS)

    Auty, Glen; Corke, Peter I.; Dunn, Paul; Jensen, Murray; Macintyre, Ian B.; Mills, Dennis C.; Nguyen, Hao; Simons, Ben

    1995-03-01

    An imaging system for monitoring traffic on multilane highways is discussed. The system, named Safe-T-Cam, is capable of operating 24 hours per day in all but extreme weather conditions and can capture still images of vehicles traveling up to 160 km/hr. Systems operating at different remote locations are networked to allow transmission of images and data to a control center. A remote site facility comprises a vehicle detection and classification module (VCDM), an image acquisition module (IAM) and a license plate recognition module (LPRM). The remote site is connected to the central site by an ISDN communications network. The remote site system is discussed in this paper. The VCDM consists of a video camera, a specialized exposure control unit to maintain consistent image characteristics, and a 'real-time' image processing system that processes 50 images per second. The VCDM can detect and classify vehicles (e.g. cars from trucks). The vehicle class is used to determine what data should be recorded. The VCDM uses a vehicle tracking technique to allow optimum triggering of the high resolution camera of the IAM. The IAM camera combines the features necessary to operate consistently in the harsh environment encountered when imaging a vehicle 'head-on' in both day and night conditions. The image clarity obtained is ideally suited for automatic location and recognition of the vehicle license plate. This paper discusses the camera geometry, sensor characteristics and the image processing methods which permit consistent vehicle segmentation from a cluttered background, allowing object oriented pattern recognition to be used for vehicle classification. The capture of high resolution images and the image characteristics required for the LPRM's automatic reading of vehicle license plates are also discussed. The results of field tests presented demonstrate that the vision based Safe-T-Cam system, currently installed on open highways, is capable of producing automatic classification of vehicle class and recording of vehicle numberplates with a success rate around 90 percent over a period of 24 hours.

  12. Global Positioning System data collection, processing, and analysis conducted by the U.S. Geological Survey Earthquake Hazards Program

    USGS Publications Warehouse

    Murray, Jessica R.; Svarc, Jerry L.

    2017-01-01

    The U.S. Geological Survey Earthquake Science Center collects and processes Global Positioning System (GPS) data throughout the western United States to measure crustal deformation related to earthquakes and tectonic processes as part of a long‐term program of research and monitoring. Here, we outline data collection procedures and present the GPS dataset built through repeated temporary deployments since 1992. This dataset consists of observations at ∼1950 locations. In addition, this article details our data processing and analysis procedures, which consist of the following. We process the raw data collected through temporary deployments, in addition to data from continuously operating western U.S. GPS stations operated by multiple agencies, using the GIPSY software package to obtain position time series. Subsequently, we align the positions to a common reference frame, determine the optimal parameters for a temporally correlated noise model, and apply this noise model when carrying out time‐series analysis to derive deformation measures, including constant interseismic velocities, coseismic offsets, and transient postseismic motion.
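
    The deformation-measure step can be illustrated with a small synthetic example (the numbers and the white-noise assumption here are ours; the actual analysis uses a temporally correlated noise model and reference-frame alignment): fit a constant interseismic velocity plus a Heaviside coseismic offset to daily positions by least squares.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    t = np.arange(0, 10, 1 / 365.25)                   # decimal years, daily
    eq_time = 6.2                                      # known earthquake epoch
    truth_v, truth_offset = 12.0, 25.0                 # mm/yr, mm
    pos = truth_v * t + truth_offset * (t >= eq_time) + rng.normal(0, 2.0, t.size)

    # design matrix: intercept, velocity, Heaviside coseismic offset
    G = np.column_stack([np.ones_like(t), t, (t >= eq_time).astype(float)])
    m, *_ = np.linalg.lstsq(G, pos, rcond=None)
    print(f"velocity = {m[1]:.2f} mm/yr, coseismic offset = {m[2]:.2f} mm")
    ```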

  13. On the definition of the concepts thinking, consciousness, and conscience.

    PubMed Central

    Monin, A S

    1992-01-01

    A complex system (CS) is defined as a set of elements, with connections between them, singled out of the environment, capable of getting information from the environment, capable of making decisions (i.e., of choosing between alternatives), and having purposefulness (i.e., an urge towards preferable states or other goals). Thinking is a process that takes place (or which can take place) in some of the CS and consists of (i) receiving information from the environment (and from itself), (ii) memorizing the information, (iii) the subconscious, and (iv) consciousness. Life is a process that takes place in some CS and consists of functions i and ii, as well as (v) reproduction with passing of hereditary information to progeny, and (vi) oriented energy and matter exchange with the environment sufficient for the maintenance of all life processes. Memory is a complex of processes of placing information in memory banks, keeping it there, and producing it according to prescriptions available in the system or to inquiries arising in it. Consciousness is a process of realization by the thinking CS of some set of algorithms consisting of the comparison of its knowledge, intentions, decisions, and actions with reality--i.e., with accumulated and continuously received internal and external information. Conscience is a realization of an algorithm of good and evil pattern recognition. PMID:1631060

  14. Simulating optoelectronic systems for remote sensing with SENSOR

    NASA Astrophysics Data System (ADS)

    Boerner, Anko

    2003-04-01

    The consistent end-to-end simulation of airborne and spaceborne remote sensing systems is an important task and sometimes the only way for the adaptation and optimization of a sensor and its observation conditions, the choice and test of algorithms for data processing, error estimation and the evaluation of the capabilities of the whole sensor system. The presented software simulator SENSOR (Software ENvironment for the Simulation of Optical Remote sensing systems) includes a full model of the sensor hardware, the observed scene, and the atmosphere in between. It allows the simulation of a wide range of optoelectronic systems for remote sensing. The simulator consists of three parts. The first part describes the geometrical relations between scene, sun, and the remote sensing system using a ray tracing algorithm. The second part of the simulation environment considers the radiometry. It calculates the at-sensor radiance using a pre-calculated multidimensional lookup-table taking the atmospheric influence on the radiation into account. Part three consists of an optical and an electronic sensor model for the generation of digital images. Using SENSOR for an optimization requires the additional application of task-specific data processing algorithms. The principle of the end-to-end-simulation approach is explained, all relevant concepts of SENSOR are discussed, and examples of its use are given. The verification of SENSOR is demonstrated.
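
    As a sketch of the radiometric stage alone, with invented table values: at-sensor radiance is interpolated from a pre-calculated lookup table over (ground reflectance, viewing angle) rather than computed per pixel by a radiative-transfer code.

    ```python
    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    reflectance = np.linspace(0.0, 1.0, 11)           # table axes
    view_angle = np.linspace(0.0, 40.0, 5)            # degrees
    R, A = np.meshgrid(reflectance, view_angle, indexing="ij")
    radiance_table = 80.0 * R * np.cos(np.radians(A)) + 10.0   # toy atmosphere

    lut = RegularGridInterpolator((reflectance, view_angle), radiance_table)
    pixels = np.array([[0.23, 12.0], [0.57, 33.0]])   # (reflectance, angle) pairs
    print(lut(pixels))                                # at-sensor radiance per pixel
    ```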

  15. Complexity Theory

    USGS Publications Warehouse

    Lee, William H K.

    2016-01-01

    A complex system consists of many interacting parts, generates new collective behavior through self organization, and adaptively evolves through time. Many theories have been developed to study complex systems, including chaos, fractals, cellular automata, self organization, stochastic processes, turbulence, and genetic algorithms.

  16. Evidence for global processing of complex visual displays

    NASA Technical Reports Server (NTRS)

    Munson, Robert C.; Horst, Richard L.

    1986-01-01

    'Polar graphic' displays, in which changes in system status are represented by distortions in the form of a geometric figure, were presented to subjects, and reaction time (RT) to discriminate system status was recorded. Of interest was the extent to which reaction time showed evidence of global processing of these displays as the number of nodes and difficulty of discrimination were varied. When discrimination of system status was easy, RT showed no increase with increasing number of nodes, providing evidence of global processing. When discrimination was difficult, systematic differences in RT as a function of the number of nodes suggested the invocation of other (local) processes, although the data were not consistent with a node-by-node search process.

  17. Satellite orbit considerations for a global change technology architecture trade study

    NASA Technical Reports Server (NTRS)

    Harrison, Edwin F.; Gibson, Gary G.; Suttles, John T.; Buglia, James J.; Taback, Israel

    1991-01-01

    A study was conducted to determine satellite orbits for earth observation missions aimed at obtaining data for assessing global climate change. A multisatellite system is required to meet the scientific requirements for temporal coverage over the globe. The best system consists of four sun-synchronous satellites equally spaced in local time of equatorial crossing. This system can obtain data every three hours for all regions. Several other satellite systems consisting of combinations of sun-synchronous orbits and either the Space Station Freedom or a mid-altitude equatorial satellite can provide three to six hour temporal coverage, which is sufficient for measuring many of the parameters required for the global change monitoring mission. Geosynchronous satellites are required to study atmospheric and surface processes involving variations on the order of a few minutes to an hour. One or two geosynchronous satellites can be relocated in longitude to study processes over selected regions of earth.

  18. Advanced optical manufacturing digital integrated system

    NASA Astrophysics Data System (ADS)

    Tao, Yizheng; Li, Xinglan; Li, Wei; Tang, Dingyong

    2012-10-01

    The development of advanced optical manufacturing technology must keep pace with modern scientific and technological development, and the problems of low efficiency, low yield of finished product, and poor repeatability and consistency in manufacturing large, high-precision optical components need to be solved. Applying a business-driven approach and the Rational Unified Process method, this paper investigates the advanced optical manufacturing process flow and the requirements of an Advanced Optical Manufacturing Integrated System, and puts forward its architecture and key technologies. The optical-component-centered, manufacturing-process-driven design of the Advanced Optical Manufacturing Digital Integrated System is presented. The results show that the system works effectively, realizing dynamic planning of the manufacturing process and information integration and improving production efficiency.

  19. Bioregenerative technologies for waste processing and resource recovery in advanced space life support system

    NASA Technical Reports Server (NTRS)

    Chamberland, Dennis

    1991-01-01

    The Controlled Ecological Life Support System (CELSS) for producing oxygen, water, and food in space will require an interactive facility to process and return wastes as resources to the system. This paper examines the bioregenerative technologies for waste processing and resource recovery considered for a CELSS Resource Recovery system. The components of this system consist of a series of biological reactors to treat the liquid and solid material fractions, in which the aerobic and anaerobic reactors are combined in a block called the Combined Reactor Equipment (CORE) block. The CORE block accepts the human wastes, kitchen wastes, inedible refractory plant materials, grey waters from the CELSS system, and aquaculture solids and processes these materials in either aerobic or anaerobic reactors depending on the desired product and the rates required by the integrated system.

  20. Analysis of x-ray hand images for bone age assessment

    NASA Astrophysics Data System (ADS)

    Serrat, Joan; Vitria, Jordi M.; Villanueva, Juan J.

    1990-09-01

    In this paper we describe a model-based system for the assessment of skeletal maturity on hand radiographs by the TW2 method. The problem consists in classifying a set of bones appearing in an image into one of several stages described in an atlas. A first approach, consisting of independent pre-processing, segmentation, and classification phases, is also presented. However, it is only well suited for well-contrasted, low-noise images without superimposed bones, where edge detection by zero crossings of second directional derivatives is able to extract all bone contours, perhaps with small gaps and few false edges on the background. Hence, the use of all available knowledge about the problem domain is needed to build a rather general system. We have designed a rule-based system to narrow down the range of possible stages for each bone and to guide the analysis process. It calls procedures written in conventional languages for matching stage models against the image and extracting features needed in the classification process.

  1. 7 CFR 274.4 - Reconciliation and reporting.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... basis and consist of: (1) Information on how the system operates relative to its performance standards..., shall be submitted by each State agency operating an issuance system. The report shall be prepared at... reconciliation process. The EBT system shall provide reports and documentation pertaining to the following: (1...

  2. NASA information sciences and human factors program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Data Systems Program consists of research and technology devoted to controlling, processing, storing, manipulating, and analyzing space-derived data. The objectives of the program are to provide the technology advancements needed to enable affordable utilization of space-derived data, to increase substantially the capability for future missions of on-board processing and recording and to provide high-speed, high-volume computational systems that are anticipated for missions such as the evolutionary Space Station and Earth Observing System.

  3. Enhancing Safety of Artificially Ventilated Patients Using Ambient Process Analysis.

    PubMed

    Lins, Christian; Gerka, Alexander; Lüpkes, Christian; Röhrig, Rainer; Hein, Andreas

    2018-01-01

    In this paper, we present an approach for enhancing the safety of artificially ventilated patients using ambient process analysis. We propose to use an analysis system consisting of low-cost ambient sensors such as a power sensor, an RGB-D sensor, a passage detector, and a matrix infrared temperature sensor to reduce risks for artificially ventilated patients in both home and clinical environments. We describe the system concept and our implementation and show how the system can contribute to patient safety.

  4. Micro Autonomous Systems Research: Systems Engineering Processes for Micro-Autonomous Systems

    DTIC Science & Technology

    2015-09-30

    detailed geometry that can be sent directly to an automated manufacturing process like a 3D printer. Like traditional design, 3D CAD files were...force REF is stationed. The REF operates mobile manufacturing labs equipped with 3D printers, laser cutters, and CNC mills. The REF takes the...to verify that the 3D printer was capable of printing airfoil sections with sufficient accuracy and consistency while also providing an airfoil that

  5. Process consistency in models: The importance of system signatures, expert knowledge, and process complexity

    NASA Astrophysics Data System (ADS)

    Hrachowitz, M.; Fovet, O.; Ruiz, L.; Euser, T.; Gharari, S.; Nijzink, R.; Freer, J.; Savenije, H. H. G.; Gascuel-Odoux, C.

    2014-09-01

    Hydrological models frequently suffer from limited predictive power despite adequate calibration performances. This can indicate insufficient representations of the underlying processes. Thus, ways are sought to increase model consistency while satisfying the contrasting priorities of increased model complexity and limited equifinality. In this study, the value of a systematic use of hydrological signatures and expert knowledge for increasing model consistency was tested. It was found that a simple conceptual model, constrained by four calibration objective functions, was able to adequately reproduce the hydrograph in the calibration period. The model, however, could not reproduce a suite of hydrological signatures, indicating a lack of model consistency. Subsequently, testing 11 models, model complexity was increased in a stepwise way and counter-balanced by "prior constraints," inferred from expert knowledge to ensure a model which behaves well with respect to the modeler's perception of the system. We showed that, in spite of unchanged calibration performance, the most complex model setup exhibited increased performance in the independent test period and skill to better reproduce all tested signatures, indicating a better system representation. The results suggest that a model may be inadequate despite good performance with respect to multiple calibration objectives and that increasing model complexity, if counter-balanced by prior constraints, can significantly increase predictive performance of a model and its skill to reproduce hydrological signatures. The results strongly illustrate the need to balance automated model calibration with a more expert-knowledge-driven strategy of constraining models.
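
    Two simple examples of such signatures, computed here on a synthetic hydrograph with invented parameters (real studies use observed discharge and a larger suite): the flow duration curve and a one-parameter baseflow index.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    # synthetic daily discharge: random rainfall pulses convolved with a recession
    q = np.convolve(rng.exponential(1.0, 400), np.exp(-np.arange(30) / 7), "same")

    def flow_duration_curve(q):
        sorted_q = np.sort(q)[::-1]
        exceedance = np.arange(1, q.size + 1) / (q.size + 1)   # exceedance probability
        return exceedance, sorted_q

    def baseflow_index(q, alpha=0.95):
        baseflow = np.empty_like(q)
        baseflow[0] = q[0]
        for i in range(1, q.size):
            # one-parameter low-pass filter separation of baseflow
            baseflow[i] = min(alpha * baseflow[i - 1] + (1 - alpha) * q[i], q[i])
        return baseflow.sum() / q.sum()

    p, fdc = flow_duration_curve(q)
    print("Q5/Q95 ratio:", fdc[int(0.05 * q.size)] / fdc[int(0.95 * q.size)])
    print("baseflow index:", round(baseflow_index(q), 3))
    ```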

  6. Rhetorical Consequences of the Computer Society: Expert Systems and Human Communication.

    ERIC Educational Resources Information Center

    Skopec, Eric Wm.

    Expert systems are computer programs that solve selected problems by modelling domain-specific behaviors of human experts. These computer programs typically consist of an input/output system that feeds data into the computer and retrieves advice, an inference system using the reasoning and heuristic processes of human experts, and a knowledge…

  7. MEMS-based system and image processing strategy for epiretinal prosthesis.

    PubMed

    Xia, Peng; Hu, Jie; Qi, Jin; Gu, Chaochen; Peng, Yinghong

    2015-01-01

    Retinal prostheses have the potential to restore some level of visual function to patients suffering from retinal degeneration. In this paper, an epiretinal approach with active stimulation devices is presented. The MEMS-based processing system consists of an external micro-camera, an information processor, an implanted electrical stimulator and a microelectrode array. An image processing strategy combining image clustering and enhancement techniques was proposed and evaluated by psychophysical experiments. The results indicated that the image processing strategy improved visual performance compared with directly merging pixels to low resolution. The image processing methods assist the epiretinal prosthesis in vision restoration.
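
    A rough sketch of the two ingredients of such a strategy, with an invented cluster count and electrode-grid size: quantize the gray levels (a simple stand-in for the clustering step), stretch the contrast, and downsample to the electrode-array resolution.

    ```python
    import numpy as np

    def cluster_gray_levels(img, k=4):
        # quantize to k levels at percentile boundaries (simple clustering)
        edges = np.percentile(img, np.linspace(0, 100, k + 1)[1:-1])
        return np.digitize(img, edges).astype(float) / (k - 1)

    def enhance(img):
        lo, hi = img.min(), img.max()
        return (img - lo) / (hi - lo + 1e-9)          # full-range contrast stretch

    def to_electrode_grid(img, grid=(16, 16)):
        # block-average down to the electrode-array resolution
        h, w = img.shape
        bh, bw = h // grid[0], w // grid[1]
        return img[:bh * grid[0], :bw * grid[1]] \
            .reshape(grid[0], bh, grid[1], bw).mean(axis=(1, 3))

    scene = np.add.outer(np.linspace(0, 1, 64), np.linspace(0, 1, 64))  # toy image
    stimulus = to_electrode_grid(enhance(cluster_gray_levels(scene)))
    print(stimulus.shape, stimulus.min(), stimulus.max())
    ```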

  8. Inferring Instantaneous, Multivariate and Nonlinear Sensitivities for the Analysis of Feedback Processes in a Dynamical System: Lorenz Model Case Study

    NASA Technical Reports Server (NTRS)

    Aires, Filipe; Rossow, William B.; Hansen, James E. (Technical Monitor)

    2001-01-01

    A new approach is presented for the analysis of feedback processes in a nonlinear dynamical system by observing its variations. The new methodology consists of statistical estimates of the sensitivities between all pairs of variables in the system based on a neural network modeling of the dynamical system. The model can then be used to estimate the instantaneous, multivariate and nonlinear sensitivities, which are shown to be essential for the analysis of the feedbacks processes involved in the dynamical system. The method is described and tested on synthetic data from the low-order Lorenz circulation model where the correct sensitivities can be evaluated analytically.
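
    A toy version of the idea on the Lorenz model, with a linear one-step predictor standing in for the paper's neural network (so it recovers only averaged, not state-dependent, sensitivities): fit the predictor to a simulated trajectory and read the inter-variable sensitivities from its Jacobian.

    ```python
    import numpy as np

    def lorenz_step(x, dt=0.01, s=10.0, r=28.0, b=8 / 3):
        dx = np.array([s * (x[1] - x[0]),
                       x[0] * (r - x[2]) - x[1],
                       x[0] * x[1] - b * x[2]])
        return x + dt * dx                     # forward-Euler integration

    rng = np.random.default_rng(3)
    traj = [rng.normal(0, 1, 3) + np.array([1.0, 1.0, 20.0])]
    for _ in range(5000):
        traj.append(lorenz_step(traj[-1]))
    X = np.array(traj[500:-1])                 # discard the transient
    Y = np.array(traj[501:])

    # least-squares one-step model Y ~ X @ A.T + c
    G = np.column_stack([X, np.ones(len(X))])
    coef, *_ = np.linalg.lstsq(G, Y, rcond=None)
    A = coef[:3].T                             # Jacobian of the fitted map
    sens = (A - np.eye(3)) / 0.01              # back out dXdot_i/dX_j sensitivities
    print(np.round(sens, 2))                   # compare with the analytic Jacobian
    ```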

  9. IDC Re-Engineering Phase 2 System Specification Document Version 1.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Satpathi, Meara Allena; Burns, John F.; Harris, James M.

    This document contains the system specifications derived to satisfy the system requirements found in the IDC System Requirements Document for the IDC Re-Engineering Phase 2 project. This System Specification Document (SSD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide data but does include requirements for the dissemination of radionuclide data and products.

  10. The application of digital signal processing techniques to a teleoperator radar system

    NASA Technical Reports Server (NTRS)

    Pujol, A.

    1982-01-01

    A digital signal processing system was studied for the determination of the spectral frequency distribution of echo signals from a teleoperator radar system. The system consisted of a sample-and-hold circuit, an analog-to-digital converter, a digital filter, and a Fast Fourier Transform. The system is interfaced to a 16-bit microprocessor, which is programmed to control the complete digital signal processing chain. The digital filtering and Fast Fourier Transform functions are implemented by an S2815 digital filter/utility peripheral chip and an S2814A Fast Fourier Transform chip. The S2815 initially simulates a low-pass Butterworth filter, with later expansion to synthesize complete filter circuits (bandpass and highpass).
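
    The filter-then-transform chain described here reduces, in modern numerical terms, to a few lines. A hedged sketch of the spectral estimation step (window, FFT, power spectrum); the function and parameter names are invented:

        import numpy as np

        def echo_spectrum(samples, fs):
            """Power spectrum of a digitized echo: a Hann window limits
            spectral leakage (standing in for the filtering stage), then
            an FFT yields the spectral frequency distribution."""
            w = np.hanning(len(samples))
            spec = np.fft.rfft(samples * w)
            freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
            return freqs, (np.abs(spec) ** 2) / len(samples)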

  11. Open source cardiology electronic health record development for DIGICARDIAC implementation

    NASA Astrophysics Data System (ADS)

    Dugarte, Nelson; Medina, Rubén; Huiracocha, Lourdes; Rojas, Rubén

    2015-12-01

    This article presents the development of a Cardiology Electronic Health Record (CEHR) system. The software consists of a structured algorithm designed under Health Level-7 (HL7) international standards. The novelty of the system is the integration of high-resolution ECG (HRECG) signal acquisition and processing tools, patient information management tools and telecardiology tools. The acquisition tools manage and control the DIGICARDIAC electrocardiograph functions. The processing tools support HRECG signal analysis, searching for patterns indicative of cardiovascular pathologies. The incorporation of telecardiology tools allows the system to communicate with other health care centers, decreasing access time to patient information. The CEHR system was developed entirely with open source software. Preliminary results of process validation demonstrated the system's efficiency.

  12. Concept of Operations for the Next Generation Air Transportation System. Version 3.0

    DTIC Science & Technology

    2010-01-01

    its operations, and establishes SMS requirements, responsibilities, and accountabilities • Safety Risk Management (SRM). The formal process within...the SMS that consists of describing the system; identifying the hazards; and assessing, analyzing, and mitigating the risk. The SRM process is...number of aircraft through the terminal airspace during peak traffic periods. Each of these features contributes to an environment that supports

  13. Analysis of electromagnetic interference from power system processing and transmission components for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Barber, Peter W.; Demerdash, Nabeel A. O.; Wang, R.; Hurysz, B.; Luo, Z.

    1991-01-01

    The goal is to analyze the potential effects of electromagnetic interference (EMI) originating from power system processing and transmission components for Space Station Freedom. The approach consists of four steps: (1) develop analytical tools (models and computer programs); (2) conduct parameterization studies; (3) predict the global space station EMI environment; and (4) provide a basis for modification of EMI standards.

  14. Bibliography On Multiprocessors And Distributed Processing

    NASA Technical Reports Server (NTRS)

    Miya, Eugene N.

    1988-01-01

    The Multiprocessor and Distributed Processing Bibliography package consists of a large machine-readable bibliographic data base which, in addition to usual keyword searches, is used for producing citations, indexes, and cross-references. The data base contains UNIX(R) "refer"-formatted ASCII data and can be implemented on any computer running the UNIX(R) operating system; it is easily convertible to other operating systems. It requires approximately one megabyte of secondary storage. The bibliography was compiled in 1985.

  15. Field Artillery Ammunition Processing System (FAAPS) concept evaluation study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kring, C.T.; Babcock, S.M.; Watkin, D.C.

    1992-06-01

    The Field Artillery Ammunition Processing System (FAAPS) is an initiative to introduce a palletized load system (PLS) that is transportable with an automated ammunition processing and storage system for use on the battlefield. System proponents have targeted a 20% increase in the ammunition processing rate over the current operation while simultaneously reducing the total number of assigned field artillery battalion personnel by 30. The overall objective of the FAAPS Project is the development and demonstration of an improved process to accomplish these goals. The initial phase of the FAAPS Project and the subject of this study is the FAAPS concept evaluation. The concept evaluation consists of (1) identifying assumptions and requirements, (2) documenting the process flow, (3) identifying and evaluating technologies available to accomplish the necessary ammunition processing and storage operations, and (4) presenting alternative concepts with associated costs, processing rates, and manpower requirements for accomplishing the operation. This study provides insight into the achievability of the desired objectives.

  16. Field Artillery Ammunition Processing System (FAAPS) concept evaluation study. Ammunition Logistics Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kring, C.T.; Babcock, S.M.; Watkin, D.C.

    1992-06-01

    The Field Artillery Ammunition Processing System (FAAPS) is an initiative to introduce a palletized load system (PLS) that is transportable with an automated ammunition processing and storage system for use on the battlefield. System proponents have targeted a 20% increase in the ammunition processing rate over the current operation while simultaneously reducing the total number of assigned field artillery battalion personnel by 30. The overall objective of the FAAPS Project is the development and demonstration of an improved process to accomplish these goals. The initial phase of the FAAPS Project and the subject of this study is the FAAPS concept evaluation. The concept evaluation consists of (1) identifying assumptions and requirements, (2) documenting the process flow, (3) identifying and evaluating technologies available to accomplish the necessary ammunition processing and storage operations, and (4) presenting alternative concepts with associated costs, processing rates, and manpower requirements for accomplishing the operation. This study provides insight into the achievability of the desired objectives.

  17. A Study of Consistency in Design Selection and the Rank Ordering of Alternatives Using a Value Driven Design Approach

    NASA Astrophysics Data System (ADS)

    Subramanian, Tenkasi R.

    In the current day, with rapid advances in technology, engineering design is growing in complexity. Engineers now have to deal with design problems that are large, complex, and involve multi-level decision analyses. With the increase in complexity and size of systems, production and development costs tend to overshoot the allocated budget and resources, often resulting in project delays and cancellations. This is particularly true for aerospace systems. Value Driven Design (VDD) proves to be a means to strengthen the design process and help counter such trends. VDD is a novel framework for optimization that puts stakeholder preferences at the forefront of the design process, capturing their true preferences in order to present system alternatives consistent with the stakeholder's expectations. Traditional systems engineering techniques promote communication of stakeholder preferences in the form of requirements, which confines the design space by imposing additional constraints on it. This results in a design that does not capture the true preferences of the stakeholder. Value Driven Design provides an alternate approach wherein a value function is created that corresponds to the true preferences of the stakeholder. The applicability of VDD is broad, but it is imperative to first explore its feasibility to ensure the development of an efficient, robust and elegant system design. The key to understanding the usability of VDD is to investigate the formation, propagation and use of a value function. This research investigates the use of rank correlation metrics to ensure consistent rank ordering of design alternatives while investigating the fidelity of the value function, as well as the impact of design uncertainties on rank ordering. A satellite design system consisting of a satellite, ground station and launch vehicle is used to demonstrate the use of the metrics to aid decision support during the design process.
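
    The rank-correlation check at the heart of this study can be illustrated directly. The scores below are invented, and Kendall's tau and Spearman's rho are standard rank-correlation choices rather than necessarily the thesis's exact metrics:

        from scipy.stats import kendalltau, spearmanr

        # Hypothetical value scores for five design alternatives under the
        # nominal value function and under a perturbed (uncertain) one.
        nominal   = [4.2, 3.9, 5.1, 2.8, 4.7]
        perturbed = [4.0, 4.1, 5.3, 2.6, 4.5]

        tau, _ = kendalltau(nominal, perturbed)
        rho, _ = spearmanr(nominal, perturbed)
        print(f"Kendall tau = {tau:.2f}, Spearman rho = {rho:.2f}")

    Values near 1 indicate that the rank ordering of alternatives is stable under the perturbation, i.e. that design selection is consistent.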

  18. System model the processing of heterogeneous sensory information in robotized complex

    NASA Astrophysics Data System (ADS)

    Nikolaev, V.; Titov, V.; Syryamkin, V.

    2018-05-01

    The scope and types of robotic systems consisting of subsystems of the form "heterogeneous sensor data processing subsystem" are analyzed. On the basis of queueing theory, a model is developed that takes into account the uneven intensity of the information flow from the sensors to the information processing subsystem. An analytical solution is derived to assess the relationship between subsystem performance and uneven flows, and the solution is examined over the range of parameter values of practical interest.
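
    As a baseline for this kind of queueing analysis, the steady-state M/M/1 formulas relate arrival and service rates to subsystem performance. The sketch assumes Poisson arrivals, which is exactly the assumption the paper's uneven-flow model relaxes; the rates are arbitrary:

        def mm1_stats(lam, mu):
            """Steady-state M/M/1 figures for a sensor-data processing
            subsystem: utilization, mean number in system, mean sojourn
            time. Requires lam < mu for stability."""
            assert lam < mu, "queue is unstable"
            rho = lam / mu
            return {"utilization": rho,
                    "mean_in_system": rho / (1.0 - rho),
                    "mean_sojourn": 1.0 / (mu - lam)}

        print(mm1_stats(lam=80.0, mu=100.0))  # e.g. 80 msg/s arriving, 100 msg/s served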

  19. Lightweight Hyperspectral Mapping System and a Novel Photogrammetric Processing Chain for UAV-based Sensing

    NASA Astrophysics Data System (ADS)

    Suomalainen, Juha; Franke, Jappe; Anders, Niels; Iqbal, Shahzad; Wenting, Philip; Becker, Rolf; Kooistra, Lammert

    2014-05-01

    We have developed a lightweight Hyperspectral Mapping System (HYMSY) and a novel processing chain for UAV-based mapping. The HYMSY consists of a custom pushbroom spectrometer (range 450-950nm, FWHM 9nm, ~20 lines/s, 328 pixels/line), a consumer camera (collecting a 16MPix raw image every 2 seconds), a GPS-Inertial Navigation System (GPS-INS), and synchronization and data storage units. The weight of the system at take-off is 2.0kg, allowing us to mount it on a relatively small octocopter. The novel processing chain exploits photogrammetry in the georectification of the hyperspectral data. In the first stage, the photos are processed in photogrammetric software, producing a high-resolution RGB orthomosaic, a Digital Surface Model (DSM), and photogrammetric UAV/camera position and attitude at the moment of each photo. These photogrammetric camera positions are then used to enhance the internal accuracy of the GPS-INS data. The enhanced GPS-INS data are then used to project the hyperspectral data over the photogrammetric DSM, producing a georectified end product. The presented photogrammetric processing chain allows fully automated georectification of hyperspectral data using a compact GPS-INS unit, while still producing higher georeferencing accuracy in UAV use than would be possible with the traditional processing method. During 2013, we operated HYMSY on 150+ octocopter flights at 60+ sites or days. On a typical flight we produced, for a 2-10ha area, an RGB orthomosaic at 1-5cm resolution, a DSM at 5-10cm resolution, and a hyperspectral datacube at 10-50cm resolution. The targets have mostly been vegetated, including potatoes, wheat, sugar beets, onions, tulips, coral reefs, and heathlands. In this poster we present the Hyperspectral Mapping System and the photogrammetric processing chain with some of our first mapping results.

  20. The embedded operating system project

    NASA Technical Reports Server (NTRS)

    Campbell, R. H.

    1984-01-01

    This progress report describes research towards the design and construction of embedded operating systems for real-time advanced aerospace applications. The applications concerned require reliable operating system support that must accommodate networks of computers. The report addresses the problems of constructing such operating systems, the communications media, reconfiguration, consistency and recovery in a distributed system, and the issues of real-time processing. A discussion is included on suitable theoretical foundations for the use of atomic actions to support fault tolerance and data consistency in real-time object-based systems. In particular, this report addresses: atomic actions, fault tolerance, operating system structure, program development, reliability and availability, and networking issues. This document reports the status of various experiments designed and conducted to investigate embedded operating system design issues.

  1. In-situ quality monitoring during laser brazing

    NASA Astrophysics Data System (ADS)

    Ungers, Michael; Fecker, Daniel; Frank, Sascha; Donst, Dmitri; Märgner, Volker; Abels, Peter; Kaierle, Stefan

    Laser brazing of zinc-coated steel is a widely established manufacturing process in the automotive sector, where high quality requirements must be fulfilled. The strength, impermeability and surface appearance of the joint are particularly important for judging its quality. The development of an on-line quality control system is highly desired by the industry. This paper presents recent work on the development of such a system, which consists of two cameras operating in different spectral ranges. For the evaluation of the system, seam imperfections were created artificially during experiments. Finally, image processing algorithms for monitoring process parameters based on the captured images are presented.

  2. Consonance in Information System Projects: A Relationship Marketing Perspective

    ERIC Educational Resources Information Center

    Lin, Pei-Ying

    2010-01-01

    Different stakeholders in the information system project usually have different perceptions and expectations of the projects. There is seldom consistency in the stakeholders' evaluations of the project outcome. Thus the outcomes of information system projects are usually disappointing to one or more stakeholders. Consonance is a process that can…

  3. A demonstration of expert systems applications in transportation engineering : volume III, evaluation of the prototype expert system TRANZ.

    DOT National Transportation Integrated Search

    1990-01-01

    The validation and evaluation of an expert system for traffic control in highway work zones (TRANZ) is described. The stages in the evaluation process consisted of the following: revisit the experts, selectively distribute copies of TRANZ with docume...

  4. Robotic Welding and Inspection System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    H. B. Smartt; D. P. Pace; E. D. Larsen

    2008-06-01

    This paper presents a robotic system for GTA welding of lids on cylindrical vessels. The system consists of an articulated robot arm, a rotating positioner, end effectors for welding, grinding, ultrasonic and eddy current inspection. Features include weld viewing cameras, modular software, and text-based procedural files for process and motion trajectories.

  5. System Description and Status Report: California Education Information System.

    ERIC Educational Resources Information Center

    California State Dept. of Education, Sacramento.

    The California Education Information System (CEIS) consists of two subsystems of computer programs designed to process business and pupil data for local school districts. Creating and maintaining records concerning the students in the schools, the pupil subsystem provides for a central repository of school district identification information and a…

  6. Digital processing of mesoscale analysis and space sensor data

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.; Karitani, S.

    1985-01-01

    The mesoscale analysis and space sensor (MASS) data base management and analysis system is presented. The system was implemented on a research computer system that provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite-derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The software, data base management and display capabilities of the research computer system, which together provide a very effective interactive research tool for the digital processing of mesoscale analysis and space sensor data, are described.

  7. Soil food web properties explain ecosystem services across European land use systems.

    PubMed

    de Vries, Franciska T; Thébault, Elisa; Liiri, Mira; Birkhofer, Klaus; Tsiafouli, Maria A; Bjørnlund, Lisa; Bracht Jørgensen, Helene; Brady, Mark Vincent; Christensen, Søren; de Ruiter, Peter C; d'Hertefeldt, Tina; Frouz, Jan; Hedlund, Katarina; Hemerik, Lia; Hol, W H Gera; Hotes, Stefan; Mortimer, Simon R; Setälä, Heikki; Sgardelis, Stefanos P; Uteseny, Karoline; van der Putten, Wim H; Wolters, Volkmar; Bardgett, Richard D

    2013-08-27

    Intensive land use reduces the diversity and abundance of many soil biota, with consequences for the processes that they govern and the ecosystem services that these processes underpin. Relationships between soil biota and ecosystem processes have mostly been found in laboratory experiments and rarely are found in the field. Here, we quantified, across four countries of contrasting climatic and soil conditions in Europe, how differences in soil food web composition resulting from land use systems (intensive wheat rotation, extensive rotation, and permanent grassland) influence the functioning of soils and the ecosystem services that they deliver. Intensive wheat rotation consistently reduced the biomass of all components of the soil food web across all countries. Soil food web properties strongly and consistently predicted processes of C and N cycling across land use systems and geographic locations, and they were a better predictor of these processes than land use. Processes of carbon loss increased with soil food web properties that correlated with soil C content, such as earthworm biomass and fungal/bacterial energy channel ratio, and were greatest in permanent grassland. In contrast, processes of N cycling were explained by soil food web properties independent of land use, such as arbuscular mycorrhizal fungi and bacterial channel biomass. Our quantification of the contribution of soil organisms to processes of C and N cycling across land use systems and geographic locations shows that soil biota need to be included in C and N cycling models and highlights the need to map and conserve soil biodiversity across the world.

  8. Soil food web properties explain ecosystem services across European land use systems

    PubMed Central

    de Vries, Franciska T.; Thébault, Elisa; Liiri, Mira; Birkhofer, Klaus; Tsiafouli, Maria A.; Bjørnlund, Lisa; Bracht Jørgensen, Helene; Brady, Mark Vincent; Christensen, Søren; de Ruiter, Peter C.; d’Hertefeldt, Tina; Frouz, Jan; Hedlund, Katarina; Hemerik, Lia; Hol, W. H. Gera; Hotes, Stefan; Mortimer, Simon R.; Setälä, Heikki; Sgardelis, Stefanos P.; Uteseny, Karoline; van der Putten, Wim H.; Wolters, Volkmar; Bardgett, Richard D.

    2013-01-01

    Intensive land use reduces the diversity and abundance of many soil biota, with consequences for the processes that they govern and the ecosystem services that these processes underpin. Relationships between soil biota and ecosystem processes have mostly been found in laboratory experiments and rarely are found in the field. Here, we quantified, across four countries of contrasting climatic and soil conditions in Europe, how differences in soil food web composition resulting from land use systems (intensive wheat rotation, extensive rotation, and permanent grassland) influence the functioning of soils and the ecosystem services that they deliver. Intensive wheat rotation consistently reduced the biomass of all components of the soil food web across all countries. Soil food web properties strongly and consistently predicted processes of C and N cycling across land use systems and geographic locations, and they were a better predictor of these processes than land use. Processes of carbon loss increased with soil food web properties that correlated with soil C content, such as earthworm biomass and fungal/bacterial energy channel ratio, and were greatest in permanent grassland. In contrast, processes of N cycling were explained by soil food web properties independent of land use, such as arbuscular mycorrhizal fungi and bacterial channel biomass. Our quantification of the contribution of soil organisms to processes of C and N cycling across land use systems and geographic locations shows that soil biota need to be included in C and N cycling models and highlights the need to map and conserve soil biodiversity across the world. PMID:23940339

  9. Poster - Thur Eve - 05: Safety systems and failure modes and effects analysis for a magnetic resonance image guided radiation therapy system.

    PubMed

    Lamey, M; Carlone, M; Alasti, H; Bissonnette, J P; Borg, J; Breen, S; Coolens, C; Heaton, R; Islam, M; van Proojen, M; Sharpe, M; Stanescu, T; Jaffray, D

    2012-07-01

    An online Magnetic Resonance guided Radiation Therapy (MRgRT) system is under development. The system comprises an MRI with the capability of traveling between and into HDR brachytherapy and external beam radiation therapy vaults. The system will provide on-line MR images immediately prior to radiation therapy; these will be registered to a planning image and used for image guidance. To promote system safety, we performed a failure modes and effects analysis. A process tree of the facility function was developed. Using the process tree and an initial design of the facility as guidelines, possible failure modes were identified, and root causes were identified for each. Severity, detectability and occurrence scores were then assigned to each possible failure. Finally, suggestions were developed to reduce the possibility of an event. The process tree consists of nine main inputs, each with 5-10 sub-inputs; tertiary inputs were also defined. The process tree ensures that the overall safety of the system has been considered. Several possible failure modes were identified, relevant to the design, construction, commissioning and operating phases of the facility. The utility of the analysis can be seen in that it has spawned projects prior to installation and has led to suggestions in the design of the facility. © 2012 American Association of Physicists in Medicine.
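
    The scoring step of an FMEA reduces to a small calculation: each failure mode's severity, occurrence, and detectability scores multiply into a risk priority number (RPN) used to rank mitigation effort. The failure modes and scores below are invented for illustration, not taken from the study:

        # Hypothetical failure modes for an MR-guided therapy workflow (scores 1-10).
        modes = [
            {"mode": "stale image registration used for guidance", "S": 8, "O": 3, "D": 4},
            {"mode": "magnet transit begun with beam enabled",     "S": 10, "O": 1, "D": 2},
        ]
        for m in modes:
            m["RPN"] = m["S"] * m["O"] * m["D"]   # risk priority number
        for m in sorted(modes, key=lambda m: -m["RPN"]):
            print(f'{m["RPN"]:4d}  {m["mode"]}')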

  10. Signal Processing Expert Code (SPEC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ames, H.S.

    1985-12-01

    The purpose of this paper is to describe a prototype expert system called SPEC, which was developed to demonstrate the utility of providing an intelligent interface for users of SIG, a general purpose signal processing code. The expert system is written in NIL, runs on a VAX 11/750, and consists of a backward-chaining inference engine and an English-like parser. The inference engine uses knowledge encoded as rules about the formats of SIG commands and about how to perform frequency analyses using SIG. The system demonstrated that expert systems can be used to control existing codes.

  11. Atmospheric Boundary Layer Modeling for Combined Meteorology and Air Quality Systems

    EPA Science Inventory

    Atmospheric Eulerian grid models for mesoscale and larger applications require sub-grid models for turbulent vertical exchange processes, particularly within the Planetary Boundary Layer (PBL). In combined meteorology and air quality modeling systems consistent PBL modeling of wi...

  12. RIKEN Integrated Sequence Analysis (RISA) System—384-Format Sequencing Pipeline with 384 Multicapillary Sequencer

    PubMed Central

    Shibata, Kazuhiro; Itoh, Masayoshi; Aizawa, Katsunori; Nagaoka, Sumiharu; Sasaki, Nobuya; Carninci, Piero; Konno, Hideaki; Akiyama, Junichi; Nishi, Katsuo; Kitsunai, Tokuji; Tashiro, Hideo; Itoh, Mari; Sumi, Noriko; Ishii, Yoshiyuki; Nakamura, Shin; Hazama, Makoto; Nishine, Tsutomu; Harada, Akira; Yamamoto, Rintaro; Matsumoto, Hiroyuki; Sakaguchi, Sumito; Ikegami, Takashi; Kashiwagi, Katsuya; Fujiwake, Syuji; Inoue, Kouji; Togawa, Yoshiyuki; Izawa, Masaki; Ohara, Eiji; Watahiki, Masanori; Yoneda, Yuko; Ishikawa, Tomokazu; Ozawa, Kaori; Tanaka, Takumi; Matsuura, Shuji; Kawai, Jun; Okazaki, Yasushi; Muramatsu, Masami; Inoue, Yorinao; Kira, Akira; Hayashizaki, Yoshihide

    2000-01-01

    The RIKEN high-throughput 384-format sequencing pipeline (RISA system), including a 384-multicapillary sequencer (the so-called RISA sequencer), was developed for the RIKEN mouse encyclopedia project. The RISA system consists of colony picking, template preparation, the sequencing reaction, and the sequencing process. A novel high-throughput 384-format capillary sequencer system (RISA sequencer system) was developed for the sequencing process. This system consists of a 384-multicapillary auto sequencer (RISA sequencer), a 384-multicapillary array assembler (CAS), and a 384-multicapillary casting device. The RISA sequencer can simultaneously analyze 384 independent sequencing products. The optical system is a scanning system, chosen after careful comparison with an image detection system for the simultaneous detection of the 384-capillary array. This scanning system can be used with any fluorescent-labeled sequencing reaction (chain termination reaction), including transcriptional sequencing based on RNA polymerase, which was originally developed by us, and cycle sequencing based on thermostable DNA polymerase. For long-read sequencing, 380 out of 384 sequences (99.2%) were successfully analyzed and the average read length, with more than 99% accuracy, was 654.4 bp. A single RISA sequencer can analyze 216 kb with >99% accuracy in 2.7 h (90 kb/h). For short-read sequencing to cluster 3′-end and 5′-end sequences by reading 350 bp, 384 samples can be analyzed in 1.5 h. We have also developed a RISA inoculator, a RISA filtrator and densitometer, and a RISA plasmid preparator, which can handle a throughput of 40,000 samples in 17.5 h, as well as a high-throughput RISA thermal cycler with four 384-well sites. The combination of these technologies allowed us to construct the RISA system consisting of 16 RISA sequencers, which can process 50,000 DNA samples per day. One haploid genome shotgun sequence of a higher organism, such as human, mouse, rat, domestic animals, and plants, can be revealed by seven RISA systems within one month. PMID:11076861

  13. Process of producing liquid hydrocarbon fuels from biomass

    DOEpatents

    Kuester, James L.

    1987-07-07

    A continuous thermochemical indirect liquefaction process to convert various biomass materials into diesel-type transportation fuels which fuels are compatible with current engine designs and distribution systems comprising feeding said biomass into a circulating solid fluidized bed gasification system to produce a synthesis gas containing olefins, hydrogen and carbon monoxide and thereafter introducing the synthesis gas into a catalytic liquefaction system to convert the synthesis gas into liquid hydrocarbon fuel consisting essentially of C7-C17 paraffinic hydrocarbons having cetane indices of 50+.

  14. Parallel machine architecture for production rule systems

    DOEpatents

    Allen, Jr., John D.; Butler, Philip L.

    1989-01-01

    A parallel processing system for production rule programs utilizes a host processor for storing production rule right-hand sides (RHS) and a plurality of rule processors for storing left-hand sides (LHS). The rule processors operate in parallel in the Recognize phase of the system's Recognize-Act cycle to match their respective LHSs against a stored list of working memory elements (WME) in order to find a self-consistent set of WMEs. The list of WMEs is dynamically varied during the Act phase of the system, in which the host executes or fires rule RHSs for those rules for which a self-consistent set has been found by the rule processors. The host transmits instructions for creating or deleting working memory elements as dictated by the rule firings until the rule processors are unable to find any further self-consistent working memory element sets, at which time the production rule system is halted.

  15. Integrating policy-based management and SLA performance monitoring

    NASA Astrophysics Data System (ADS)

    Liu, Tzong-Jye; Lin, Chin-Yi; Chang, Shu-Hsin; Yen, Meng-Tzu

    2001-10-01

    A policy-based management system provides configuration capabilities that let system administrators focus on the requirements of customers. A service level agreement (SLA) performance monitoring mechanism helps administrators verify the correctness of policies. However, it is difficult for a device to process policies directly, because policies are management-level concepts. This paper proposes a mechanism to decompose a policy into rules that can be efficiently processed by a device. The device can then process the rules and collect performance statistics efficiently, and the policy-based management system can gather these statistics and report SLA performance monitoring information to the system administrator. The proposed policy-based management system thus satisfies both the policy configuration and SLA performance monitoring requirements. A policy consists of a condition part and an action part. The condition part is a Boolean expression over a source host IP group, a destination host IP group, etc.; the action part contains the parameters of services. We say that an address group is compact if it consists only of a range of IP addresses that can be denoted by a pair comprising an IP address and a corresponding IP mask. If the condition part of a policy consists only of compact address groups, we say that the policy is a rule. Since a device can efficiently process a compact address group while a system administrator prefers to define an arbitrary range of IP addresses, the policy-based management system has to translate policies into rules and bridge the gaps between them. The proposed system builds the relationships between VPNs and policies, and between policies and rules. Since the system administrator wants to monitor the performance of VPNs and policies, the proposed system downloads the relationships among VPNs, policies and rules to the SNMP agents. The SNMP agents build the management information base (MIB) of all VPNs, policies and rules according to the relationships obtained from the management server. Thus, the proposed policy-based management system can obtain all performance monitoring information for VPNs and policies from the agents. The proposed policy-based manager achieves two goals: a) it provides a management environment in which the system administrator configures the network considering only policy requirement issues, and b) it lets the device simply process packets and collect the required performance information. Together, these make the proposed management system satisfy both user and device requirements.
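
    The policy-to-rule translation described above is essentially range-to-CIDR decomposition, for which the Python standard library has a direct equivalent. A sketch, with an arbitrary example range:

        import ipaddress

        def policy_range_to_rules(first, last):
            """Decompose an administrator-friendly IP range into 'compact'
            address groups (address/mask pairs) that a device can match."""
            nets = ipaddress.summarize_address_range(
                ipaddress.ip_address(first), ipaddress.ip_address(last))
            return [str(n) for n in nets]

        print(policy_range_to_rules("10.0.0.3", "10.0.0.40"))
        # ['10.0.0.3/32', '10.0.0.4/30', '10.0.0.8/29',
        #  '10.0.0.16/28', '10.0.0.32/29', '10.0.0.40/32']

    Each resulting prefix is one device-processable rule; the management system keeps the mapping from the original policy to this rule set so that per-rule statistics can be aggregated back into per-policy reports.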

  16. IRB Process Improvements: A Machine Learning Analysis.

    PubMed

    Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A

    2017-06-01

    Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on initial identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process including type of IRB review to be conducted, whether a protocol falls under Veteran's Administration purview and specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.

  17. Optical Waveguide Solar Energy System for Lunar Materials Processing

    NASA Technical Reports Server (NTRS)

    Nakamura, T.; Case, J. A.; Senior, C. L.

    1997-01-01

    This paper discusses results of our work on the development of the Optical Waveguide (OW) Solar Energy System for Lunar Materials Processing. In the OW system, solar radiation is collected by a concentrator, which transfers the concentrated solar radiation to the OW transmission line consisting of low-loss optical fibers. The OW line transmits the solar radiation to the thermal reactor of the lunar materials processing plant. The features of the OW system are: (1) highly concentrated solar radiation (up to 10^4 suns) can be transmitted via flexible OW lines directly into the thermal reactor for materials processing; (2) the solar radiation intensity or spectrum can be tailored to specific materials processing steps; (3) solar energy can be delivered to locations, or the interiors of enclosures, that would not otherwise have access to solar energy; and (4) the system can be modularized and can be easily transported to and deployed at the lunar base.

  18. Simulation of APEX data: the SENSOR approach

    NASA Astrophysics Data System (ADS)

    Boerner, Anko; Schaepman, Michael E.; Schlaepfer, Daniel; Wiest, Lorenz; Reulke, Ralf

    1999-10-01

    The consistent simulation of airborne and spaceborne hyperspectral data is an important task and sometimes the only way for the adaptation and optimization of a sensor and its observing conditions, the choice and test of algorithms for data processing, error estimations and the evaluation of the capabilities of the whole sensor system. The integration of three approaches is suggested for the data simulation of APEX (Airborne Prism Experiment): (1) a spectrally consistent approach (e.g. using AVIRIS data), (2) a geometrically consistent approach (e.g. using CASI data), and (3) an end-to-end simulation of the sensor system. In this paper, the last approach is discussed in detail. Such a technique should be used if there is no simple deterministic relation between input and output parameters. The simulation environment SENSOR (Software Environment for the Simulation of Optical Remote Sensing Systems) presented here includes a full model of the sensor system, the observed object and the atmosphere. The simulator consists of three parts. The first part describes the geometrical relations between object, sun, and sensor using a ray tracing algorithm. The second part of the simulation environment considers the radiometry. It calculates the at-sensor-radiance using a pre-calculated multidimensional lookup-table for the atmospheric boundary conditions and bidirectional reflectances. Part three consists of an optical and an electronic sensor model for the generation of digital images. Application-specific algorithms for data processing must be considered additionally. The benefit of using an end-to-end simulation approach is demonstrated, an example of a simulated APEX data cube is given, and preliminary steps of evaluation of SENSOR are carried out.

  19. The Vanderbilt Professional Nursing Practice Program, part 3: managing an advancement process.

    PubMed

    Steaban, Robin; Fudge, Mitzie; Leutgens, Wendy; Wells, Nancy

    2003-11-01

    Consistency of performance standards across multiple clinical settings is an essential component of a credible advancement system. Our advancement process incorporates a central committee, composed of nurses from all clinical settings within the institution, to ensure consistency of performance in inpatient, outpatient, and procedural settings. An analysis of nurses advanced during the first 18 months of the program indicates that performance standards are applicable to nurses in all clinical settings. The first article (September 2003) in this 3-part series described the foundation for and the philosophical background of the Vanderbilt Professional Nursing Practice Program (VPNPP), the career advancement program underway at Vanderbilt University Medical Center. Part 2 described the development of the evaluation tools used in the VPNPP, the implementation and management of this new system, program evaluation, and improvements since the program's inception. The purpose of this article is to review the advancement process, review the roles of those involved in the process, and to describe outcomes and lessons learned.

  20. Washington Community Colleges Factbook. Addendum B: A Description of the Community College Management Information System.

    ERIC Educational Resources Information Center

    Meier, Terre; Bundy, Larry

    The Management Information System (MIS) of the Washington State system of community colleges was designed to be responsive to legislative and district requests for information and to enhance the State Board's capabilities to manage the community college system and integrate its budgeting and planning processes. The MIS consists of seven…

  1. Cogeneration Technology Alternatives Study (CTAS). Volume 1: Summary report

    NASA Technical Reports Server (NTRS)

    Gerlaugh, H. E.; Hall, E. W.; Brown, D. H.; Priestley, R. R.; Knightly, W. F.

    1980-01-01

    Large savings can be made in industry by cogenerating electric power and process heat in single energy conversion systems rather than separately in utility plants and in process boilers. About fifty industrial processes from the largest energy consuming sectors were used as a basis for matching a similar number of energy conversion systems that are considered as candidates which can be made available by the 1985 to 2000 time period. The sectors considered included food, textiles, lumber, paper, chemicals, petroleum, glass, and primary metals. The energy conversion systems included steam and gas turbines, diesels, thermionics, stirling, closed-cycle and steam injected gas turbines, and fuel cells. Fuels considered were coal, both coal and petroleum-based residual and distillate liquid fuels, and low Btu gas obtained through the on-site gasification of coal. An attempt was made to use consistent assumptions and a consistent set of ground rules for determining performance and cost in individual plants and on a national level. It was found that: (1) atmospheric and pressurized fluidized bed steam turbine systems were the most attractive of the direct coal-fired systems; and (2) open-cycle gas turbines with heat recovery steam generators and combined-cycles with NO(x) emission reduction and moderately increased firing temperatures were the most attractive of the coal-derived liquid-fired systems.

  2. System and method for integrating hazard-based decision making tools and processes

    DOEpatents

    Hodgin, C Reed [Westminster, CO

    2012-03-20

    A system and method for inputting, analyzing, and disseminating information necessary for identified decision-makers to respond to emergency situations. This system and method provides consistency and integration among multiple groups, and may be used for both initial consequence-based decisions and follow-on consequence-based decisions. The system and method in a preferred embodiment also provides tools for accessing and manipulating information that are appropriate for each decision-maker, in order to achieve more reasoned and timely consequence-based decisions. The invention includes processes for designing and implementing a system or method for responding to emergency situations.

  3. Quantum Darwinism: Entanglement, branches, and the emergent classicality of redundantly stored quantum information

    NASA Astrophysics Data System (ADS)

    Blume-Kohout, Robin; Zurek, Wojciech H.

    2006-06-01

    We lay a comprehensive foundation for the study of redundant information storage in decoherence processes. Redundancy has been proposed as a prerequisite for objectivity, the defining property of classical objects. We consider two ensembles of states for a model universe consisting of one system and many environments: the first consisting of arbitrary states, and the second consisting of “singly branching” states consistent with a simple decoherence model. Typical states from the random ensemble do not store information about the system redundantly, but information stored in branching states has a redundancy proportional to the environment’s size. We compute the specific redundancy for a wide range of model universes, and fit the results to a simple first-principles theory. Our results show that the presence of redundancy divides information about the system into three parts: classical (redundant); purely quantum; and the borderline, undifferentiated or “nonredundant,” information.

  4. A Process-Centered Tool for Evaluating Patient Safety Performance and Guiding Strategic Improvement

    DTIC Science & Technology

    2005-01-01

    next patient safety steps in individual health care organizations. The low priority given to Category 3 (Focus on patients, other customers, and...presents a patient safety applicator tool for implementing and assessing patient safety systems in health care institutions. The applicator tool consists...the survey rounds. The study addressed three research questions: 1. What critical processes should be included in health care patient safety systems

  5. The Methodology of Diagnosing Group and Intergroup Relations in Organizations.

    DTIC Science & Technology

    1980-06-01

    The aim of organizational diagnosis is to produce learning about the system for its members. Diagnosis is a process consisting of three phases: entry...result, organizational diagnosis is a self-correcting process that permits the activities of subsequent phases to build upon the accomplishments of earlier... organizational diagnosis is shaped by the condition of the system being studied. The effects of underbounded and overbounded organizations influence what

  6. SPALAX new generation: New process design for a more efficient xenon production system for the CTBT noble gas network.

    PubMed

    Topin, Sylvain; Greau, Claire; Deliere, Ludovic; Hovesepian, Alexandre; Taffary, Thomas; Le Petit, Gilbert; Douysset, Guilhem; Moulin, Christophe

    2015-11-01

    The SPALAX (Système de Prélèvement Automatique en Ligne avec l'Analyse du Xénon) is one of the systems used in the International Monitoring System of the Comprehensive Nuclear Test Ban Treaty (CTBT) to detect radioactive xenon releases following a nuclear explosion. Approximately 10 years after the industrialization of the first system, the CEA has developed the SPALAX New Generation, SPALAX-NG, with the aim of increasing the global sensitivity and reducing the overall size of the system. A major breakthrough has been obtained by improving the sampling stage and the purification/concentration stage. The sampling stage evolution consists of increasing the sampling capacity and improving the gas treatment efficiency across new permeation membranes, leading to an increase in the xenon production capacity by a factor of 2-3. The purification/concentration stage evolution consists of using a new adsorbent Ag@ZSM-5 (or Ag-PZ2-25) with a much larger xenon retention capacity than activated charcoal, enabling a significant reduction in the overall size of this stage. The energy consumption of the system is similar to that of the current SPALAX system. The SPALAX-NG process is able to produce samples of almost 7 cm(3) of xenon every 12 h, making it the most productive xenon process among the IMS systems. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. A Recommendation System to Facilitate Business Process Modeling.

    PubMed

    Deng, Shuiguang; Wang, Dongjing; Li, Ying; Cao, Bin; Yin, Jianwei; Wu, Zhaohui; Zhou, Mengchu

    2017-06-01

    This paper presents a system that utilizes process recommendation technology to help design new business processes from scratch in an efficient and accurate way. The proposed system consists of two phases: 1) offline mining and 2) online recommendation. At the first phase, it mines relations among activity nodes from existing processes in repository, and then stores the extracted relations as patterns in a database. At the second phase, it compares the new process under construction with the premined patterns, and recommends proper activity nodes of the most matching patterns to help build a new process. Specifically, there are three different online recommendation strategies in this system. Experiments on both real and synthetic datasets are conducted to compare the proposed approaches with the other state-of-the-art ones, and the results show that the proposed approaches outperform them in terms of accuracy and efficiency.
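
    A drastically simplified version of the two phases — mining directly-follows relations offline, then recommending the most frequent successor online — might look like the sketch below. The repository content is invented, and the paper's pattern matching is far richer than this frequency count:

        from collections import Counter, defaultdict

        def mine_patterns(processes):
            """Offline phase: count directly-follows relations between
            activity nodes across an existing process repository."""
            follows = defaultdict(Counter)
            for proc in processes:
                for a, b in zip(proc, proc[1:]):
                    follows[a][b] += 1
            return follows

        def recommend(follows, partial, k=2):
            """Online phase: suggest the k activities that most often
            followed the last node of the process under construction."""
            return [act for act, _ in follows[partial[-1]].most_common(k)]

        repo = [["receive", "check", "approve", "archive"],
                ["receive", "check", "reject"],
                ["receive", "check", "approve", "notify"]]
        print(recommend(mine_patterns(repo), ["receive", "check"]))  # ['approve', 'reject']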

  8. LETTER TO THE EDITOR: Thermally activated processes in magnetic systems consisting of rigid dipoles: equivalence of the Ito and Stratonovich stochastic calculus

    NASA Astrophysics Data System (ADS)

    Berkov, D. V.; Gorn, N. L.

    2002-04-01

    We demonstrate that the Ito and the Stratonovich stochastic calculus lead to identical results when applied to the stochastic dynamics of magnetic systems consisting of dipoles of constant magnitude, despite the multiplicative noise appearing in the corresponding Langevin equations. The immediate consequence of this statement is that any numerical method used for the solution of these equations will lead to physically correct results.
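
    The setting is the stochastic Landau-Lifshitz (Langevin) dynamics of a fixed-magnitude moment, where the thermal field enters through a cross product and the noise is therefore multiplicative. Schematically (the notation is illustrative, not the letter's exact form):

        \frac{d\mathbf{m}}{dt}
          = -\gamma\,\mathbf{m}\times\bigl[\mathbf{H}_\mathrm{eff} + \mathbf{H}_\mathrm{fl}(t)\bigr]
            - \frac{\gamma\lambda}{|\mathbf{m}|}\,
              \mathbf{m}\times\bigl(\mathbf{m}\times\mathbf{H}_\mathrm{eff}\bigr),
        \qquad
        \langle H_{\mathrm{fl},i}(t)\,H_{\mathrm{fl},j}(t')\rangle
          = 2D\,\delta_{ij}\,\delta(t - t')

    Because the cross-product structure conserves the moment magnitude, the two interpretations of this multiplicative noise yield the same physical averages, which is the letter's central statement.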

  9. Comparative Effects of Antihistamines on Aircrew Mission Effectiveness under Sustained Operations

    DTIC Science & Technology

    1992-06-01

    measures consist mainly of process measures. Process measures are measures of activities used to accomplish the mission and produce the final results...They include task completion times and response variability, and information processing rates as they relate to unique task assignment. Performance...contains process measures that assess the individual contributions of hardware/software and human components to overall system performance. Measures

  10. NASA'S Earth Science Data Stewardship Activities

    NASA Technical Reports Server (NTRS)

    Lowe, Dawn R.; Murphy, Kevin J.; Ramapriyan, Hampapuram

    2015-01-01

    NASA has been collecting Earth observation data for over 50 years using instruments on board satellites, aircraft and ground-based systems. With the inception of the Earth Observing System (EOS) Program in 1990, NASA established the Earth Science Data and Information System (ESDIS) Project and initiated development of the Earth Observing System Data and Information System (EOSDIS). A set of Distributed Active Archive Centers (DAACs) was established at locations based on science discipline expertise. Today, EOSDIS consists of 12 DAACs and 12 Science Investigator-led Processing Systems (SIPS), processing data from the EOS missions, as well as the Suomi National Polar Orbiting Partnership mission, and other satellite and airborne missions. The DAACs archive and distribute the vast majority of data from NASA's Earth science missions, with data holdings exceeding 12 petabytes. The data held by EOSDIS are available to all users consistent with NASA's free and open data policy, which has been in effect since 1990. The EOSDIS archives consist of raw instrument data counts (level 0 data), as well as higher level standard products (e.g., geophysical parameters, products mapped to standard spatio-temporal grids, results of Earth system models using multi-instrument observations, and long time series of Earth System Data Records resulting from multiple satellite observations of a given type of phenomenon). EOSDIS data stewardship responsibilities include ensuring that the data and information content are reliable, of high quality, easily accessible, and usable for as long as they are considered to be of value.

  11. What systems participants know about access and service entry and why managers should listen.

    PubMed

    Duncombe, Rohena

    2017-08-01

    Objective The present study looked at the views of people directly involved in the entry process for community health counselling using the frame of the health access literature. The concurrence of system participants' views with the access literature highlights access issues, particularly for people who are vulnerable or disadvantaged. The paper privileges the voices of the system participants, inviting local health services to consider using participatory design to improve access at the entry point. Methods People involved in the entry process for community health counselling explored the question, 'What, for you, are the features of a good intake system?' They also commented on themes identified during pilot interviews. These were thematically analysed for each participant group by the researcher to develop a voice for each stakeholder group. Results People accessing the service could be vulnerable and the entry process failed to take that into account. People directly involved in the counselling service entry system, system participants, consisted of: professionals referring in, people seeking services and reception staff taking first enquiries. They shared substantially the same concerns as each other. The responses from these system participants are consistent with the international literature on access and entry into health services. Conclusion Participatory service design could improve primary healthcare service entry at the local level. Canvassing the experiences of system participants is important for delivering services to those who have the least access and, in that way, could contribute to health equity. What is known about the topic? People with the highest health needs receive the fewest services. Health inequality is increasing. What does this paper add? System participants can provide advice consistent with the academic research literature that is useful for improving service entry at the local level. What are the implications for practitioners? Participatory design can inform policy makers and service providers. Entry systems could acknowledge the potential vulnerability or disadvantage of people approaching the service.

  12. 15 CFR 911.3 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... CONCERNING USE OF THE NOAA SPACE-BASED DATA COLLECTION SYSTEMS § 911.3 Definitions. For purposes of this part... data from fixed and moving platforms and provides platform location data. This system consists of... Data Processing and Distribution for the National Environmental Satellite, Data, and Information...

  13. Quantum-Carnot engine for particle confined to cubic potential

    NASA Astrophysics Data System (ADS)

    Sutantyo, Trengginas Eka P.; Belfaqih, Idrus H.; Prayitno, T. B.

    2015-09-01

    The Carnot cycle consists of isothermal and adiabatic processes, which are reversible. Using an analogy in quantum mechanics, these processes can be explained by replacing the variables of the classical process with those of a quantum system. The quantum system considered in this paper is a particle moving under the influence of a cubic potential, restricted to the states of two energy levels. Finally, the efficiency of the system is expressed as a function of the ratio between the initial well width and the farthest wall position during expansion. Furthermore, the system efficiency is considered for both 1D and 2D cases; the resulting efficiencies differ due to the degeneracy of energy levels and the degrees of freedom of the system.
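
    For comparison, in the widely cited infinite-square-well version of the quantum Carnot engine (where the energy levels scale as E_n ∝ n²/L² and the expectation value of energy plays the role of temperature), the efficiency takes a Carnot-like form. This benchmark is not the paper's cubic-potential result, which follows the same construction with a different spectrum:

        \eta \;=\; 1 - \frac{E_C}{E_H} \;=\; 1 - \left(\frac{L_1}{L_2}\right)^{2}

    where L_1 is the initial well width and L_2 is the farthest wall position reached during the expansion.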

  14. The Public Health Information Network (PHIN) Preparedness Initiative

    PubMed Central

    Loonsk, John W.; McGarvey, Sunanda R.; Conn, Laura A.; Johnson, Jennifer

    2006-01-01

    The Public Health Information Network (PHIN) Preparedness initiative strives to implement, on an accelerated pace, a consistent national network of information systems that will support public health in being prepared for public health emergencies. Using the principles and practices of the broader PHIN initiative, PHIN Preparedness concentrates in the short term on ensuring that all public health jurisdictions have, or have access to, systems to accomplish known preparedness functions. The PHIN Preparedness initiative defines functional requirements, technical standards and specifications, and a process to achieve consistency and interconnectedness of preparedness systems across public health. PMID:16221945

  15. Early Identification System: Year Two. Research Report 80-15.

    ERIC Educational Resources Information Center

    Stennett, R. G.; Earl, L. M.

    During the academic year 1978-79, school teams implemented a newly developed early identification system in all kindergarten and grade one classes in London, Ontario schools. After analysis and revision of the system, the internal consistency and concurrent validity of the process and a test of its short-term predictive validity were investigated.…

  16. Influence of wetland type, hydrology, and wetland destruction on aquatic communities within wetland reservoir subirrigation systems in northwestern Ohio

    USDA-ARS?s Scientific Manuscript database

    Establishment of an agricultural water recycling system known as the wetland reservoir subirrigation system (WRSIS) results in the creation of two different types of wetlands adjacent to agricultural fields. Each WRSIS consists of one treatment wetland designed to process agricultural contaminants (...

  17. Differences in Fish, Amphibian, and Reptile Communities Within Wetlands Created by an Agricultural Water Recycling System in Northwestern Ohio

    USDA-ARS?s Scientific Manuscript database

    Establishment of a water recycling system known as the wetland-reservoir subirrigation system (WRSIS) results in the creation of wetlands adjacent to agricultural fields. Each WRSIS consists of one wetland designed to process agricultural chemicals (WRSIS wetlands) and one wetland to store subirriga...

  18. Automating Space Station operations planning

    NASA Technical Reports Server (NTRS)

    Ziemer, Kathleen A.

    1989-01-01

    The development and implementation of the operations planning processes for the Space Station are discussed. A three level planning process, consisting of strategic, tactical, and execution level planning, is being developed. The integration of the planning procedures into a tactical planning system is examined and the planning phases are illustrated.

  19. Arduino-based automation of a DNA extraction system.

    PubMed

    Kim, Kyung-Won; Lee, Mi-So; Ryu, Mun-Ho; Kim, Jong-Won

    2015-01-01

    There have been many studies on detecting infectious diseases with molecular genetic methods. This study presents an automation process for a DNA extraction system based on microfluidics and magnetic beads, which is part of a portable molecular genetic test system. The DNA extraction system consists of a cartridge with chambers, syringes, four linear stepper actuators, and a rotary stepper actuator. The actuators perform the sequence of steps in the DNA extraction process, such as transporting, mixing, and washing the gene specimen, magnetic beads, and reagent solutions. The proposed automation system consists of a PC-based host application and an Arduino-based controller. The host application compiles a G-code sequence file and interfaces with the controller to execute the compiled sequence. The controller executes stepper motor axis motion, time delays, and input-output manipulation. It drives the stepper motors with an open library, which provides a smooth linear acceleration profile. The controller also provides a homing sequence to establish each motor's reference position, and hard-limit checking to prevent over-travel. The proposed system was implemented and its functionality was verified, particularly its positioning accuracy and velocity profile.
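
    The host-side flow described, compiling a G-code-like sequence file into steps the controller executes one at a time, can be sketched as below. The three opcodes and their meanings (G1 move, G4 dwell, M42 set pin) are hypothetical stand-ins, since the paper's actual command dialect is not given:

    ```python
    # Sketch of a host-side sequence compiler: G-code-like lines become
    # (opcode, args) steps for a microcontroller to execute in order.

    def compile_sequence(text):
        steps = []
        for line in text.splitlines():
            line = line.split(";")[0].strip()   # strip comments and blanks
            if not line:
                continue
            code, *args = line.split()
            params = {a[0]: float(a[1:]) for a in args}
            if code == "G1":                    # move axis A to position P
                steps.append(("MOVE", int(params["A"]), params["P"]))
            elif code == "G4":                  # dwell for S seconds
                steps.append(("DELAY", params["S"]))
            elif code == "M42":                 # set output pin P to value V
                steps.append(("SET_PIN", int(params["P"]), int(params["V"])))
            else:
                raise ValueError(f"unknown opcode: {code}")
        return steps

    seq = compile_sequence("""
    G1 A0 P12.5   ; advance syringe actuator 0 to 12.5 mm
    G4 S2.0       ; wait for mixing
    M42 P3 V1     ; energize magnet
    """)
    print(seq)
    ```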

  20. A 45° saw-dicing process applied to a glass substrate for wafer-level optical splitter fabrication for optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Maciel, M. J.; Costa, C. G.; Silva, M. F.; Gonçalves, S. B.; Peixoto, A. C.; Ribeiro, A. Fernando; Wolffenbuttel, R. F.; Correia, J. H.

    2016-08-01

    This paper reports on the development of a technology for the wafer-level fabrication of an optical Michelson interferometer, which is an essential component in a micro opto-electromechanical system (MOEMS) for a miniaturized optical coherence tomography (OCT) system. The MOEMS consists of a titanium dioxide/silicon dioxide dielectric beam splitter and chromium/gold micro-mirrors. These optical components are deposited on 45° tilted surfaces to allow the horizontal/vertical separation of the incident beam in the final micro-integrated system. The fabrication process consists of 45° saw dicing of a glass substrate and the subsequent deposition of dielectric multilayers and metal layers. The 45° saw dicing is fully characterized in this paper, which also includes an analysis of the roughness. The optimum process results in surfaces with a roughness of 19.76 nm (rms). The saw dicing process for a high-quality final surface is a compromise between the dicing blade's grit size (#1200) and the cutting speed (0.3 mm s-1). The proposed wafer-level fabrication allows rapid and low-cost processing, high compactness, and the possibility of wafer-level alignment/assembly with other optical micro components for OCT integrated imaging.

  1. Automatic Welding System of Aluminum Pipe by Monitoring Backside Image of Molten Pool Using Vision Sensor

    NASA Astrophysics Data System (ADS)

    Baskoro, Ario Sunar; Kabutomori, Masashi; Suga, Yasuo

    An automatic welding system using Tungsten Inert Gas (TIG) welding with a vision sensor for the welding of aluminum pipe was constructed. This research studies the intelligent welding process of aluminum alloy pipe 6063S-T5 in a fixed position with a moving welding torch and an AC welding machine. The monitoring system consists of a vision sensor using a charge-coupled device (CCD) camera to monitor the backside image of the molten pool. The captured image was processed to recognize the edge of the molten pool by an image processing algorithm. A neural network model for welding speed control was constructed to perform the process automatically. The experimental results show the effectiveness of the control system, confirmed by good detection of the molten pool and sound welds.
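
    A minimal sketch of the edge-recognition step on a synthetic frame; the authors' actual algorithm is not specified in the abstract, so simple thresholding plus boundary extraction stands in for it:

    ```python
    import numpy as np

    # Threshold the bright molten-pool region in a synthetic grayscale frame
    # and mark boundary pixels (pool pixels with a non-pool 4-neighbour).

    h, w = 64, 64
    yy, xx = np.mgrid[0:h, 0:w]
    frame = 255.0 * np.exp(-(((yy - 32) / 10.0) ** 2 + ((xx - 32) / 16.0) ** 2))

    pool = frame > 0.5 * frame.max()            # bright region = molten pool
    padded = np.pad(pool, 1, constant_values=False)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    edge = pool & ~interior                     # boundary of the pool region

    print("pool pixels:", pool.sum(), "edge pixels:", edge.sum())
    ```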

  2. Intertrial auditory neural stability supports beat synchronization in preschoolers

    PubMed Central

    Carr, Kali Woodruff; Tierney, Adam; White-Schwoch, Travis; Kraus, Nina

    2016-01-01

    The ability to synchronize motor movements along with an auditory beat places stringent demands on the temporal processing and sensorimotor integration capabilities of the nervous system. Links between millisecond-level precision of auditory processing and the consistency of sensorimotor beat synchronization implicate fine auditory neural timing as a mechanism for forming stable internal representations of, and behavioral reactions to, sound. Here, for the first time, we demonstrate a systematic relationship between consistency of beat synchronization and trial-by-trial stability of subcortical speech processing in preschoolers (ages 3 and 4 years old). We conclude that beat synchronization might provide a useful window into millisecond-level neural precision for encoding sound in early childhood, when speech processing is especially important for language acquisition and development. PMID:26760457
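
    One common way to quantify the trial-by-trial neural stability the abstract refers to is to correlate the averages of two random halves of the single-trial responses. The split-half approach below is an assumption for illustration, not necessarily the authors' exact measure, and the data are synthetic:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    t = np.linspace(0, 0.05, 500)                    # 50 ms epoch
    signal = np.sin(2 * np.pi * 100 * t)             # 100 Hz "neural" response
    trials = signal + rng.normal(0, 1.0, (200, t.size))

    # Split trials into two random halves and correlate their averages;
    # higher correlation means more stable trial-by-trial encoding.
    half = trials.shape[0] // 2
    idx = rng.permutation(trials.shape[0])
    a = trials[idx[:half]].mean(axis=0)
    b = trials[idx[half:]].mean(axis=0)

    stability = np.corrcoef(a, b)[0, 1]
    print(f"inter-trial stability r = {stability:.3f}")
    ```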

  3. Earth Sciences Data and Information System (ESDIS) program planning and evaluation methodology development

    NASA Technical Reports Server (NTRS)

    Dickinson, William B.

    1995-01-01

    An Earth Sciences Data and Information System (ESDIS) Project Management Plan (PMP) is prepared. An ESDIS Project Systems Engineering Management Plan (SEMP) consistent with the developed PMP is also prepared. ESDIS and related EOS program requirements developments, management and analysis processes are evaluated. Opportunities to improve the effectiveness of these processes and program/project responsiveness to requirements are identified. Overall ESDIS cost estimation processes are evaluated, and recommendations to improve cost estimating and modeling techniques are developed. ESDIS schedules and scheduling tools are evaluated. Risk assessment, risk mitigation strategies and approaches, and use of risk information in management decision-making are addressed.

  4. Goal Based Testing: A Risk Informed Process

    NASA Technical Reports Server (NTRS)

    Everline, Chester; Smith, Clayton; Distefano, Sal; Goldin, Natalie

    2014-01-01

    A process for life demonstration testing is developed, which can reduce the number of resources required by conventional sampling theory while still maintaining the same degree of rigor and confidence level. This process incorporates state-of-the-art probabilistic thinking and is consistent with existing NASA guidance documentation. This view of life testing changes the paradigm of testing a system for many hours to show confidence that a system will last for the required number of years to one that focuses efforts and resources on exploring how the system can fail at end-of-life and building confidence that the failure mechanisms are understood and well mitigated.
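
    For context on the conventional sampling-theory baseline the abstract contrasts, the classical zero-failure (success-run) relation n = ln(1 - C)/ln(R) gives the number of test units needed to demonstrate reliability R at confidence C. The sketch below is my illustration of that relation, not taken from the report:

    ```python
    import math

    def success_run_units(reliability, confidence):
        """Zero-failure test units needed to demonstrate a reliability at a
        given confidence (success-run theorem: n = ln(1 - C) / ln(R))."""
        return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

    # Demonstrating 0.99 reliability at 90% confidence needs ~230 units,
    # the kind of resource burden a risk-informed process tries to avoid.
    print(success_run_units(0.99, 0.90))
    ```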

  5. Parallel processing of real-time dynamic systems simulation on OSCAR (Optimally SCheduled Advanced multiprocessoR)

    NASA Technical Reports Server (NTRS)

    Kasahara, Hironori; Honda, Hiroki; Narita, Seinosuke

    1989-01-01

    Parallel processing of real-time dynamic systems simulation on a multiprocessor system named OSCAR is presented. In the simulation of dynamic systems, generally, the same calculations are repeated at every time step. However, the Do-all and Do-across techniques cannot be applied to parallel processing of the simulation, since there exist data dependencies from the end of one iteration to the beginning of the next, and furthermore data input and data output are required every sampling period. Therefore, parallelism inside the calculation required for a single time step, i.e., a large basic block consisting of arithmetic assignment statements, must be used. In the proposed method, near-fine-grain tasks, each of which consists of one or more floating point operations, are generated to extract the parallelism from the calculation and are assigned to processors by optimal static scheduling at compile time, in order to reduce the large run-time overhead caused by the use of near-fine-grain tasks. The practicality of the scheme is demonstrated on OSCAR (Optimally SCheduled Advanced multiprocessoR), which has been developed to exploit the advantageous features of static scheduling algorithms to the maximum extent.
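
    A toy list-scheduling sketch in the spirit of the compile-time assignment described; OSCAR's actual heuristics are more sophisticated, and this only illustrates assigning precedence-constrained tasks to the processor that lets them finish earliest:

    ```python
    tasks = {"a": 2, "b": 3, "c": 2, "d": 1}        # task -> duration
    deps = {"c": ["a", "b"], "d": ["a"]}            # task -> predecessors
    n_proc = 2

    proc_free = [0.0] * n_proc                      # next free time per CPU
    finish = {}                                     # task -> finish time
    schedule = []

    # For this toy graph the ready-count order respects dependencies;
    # a real scheduler would use a proper topological sort.
    for t in sorted(tasks, key=lambda t: len(deps.get(t, []))):
        ready = max((finish[p] for p in deps.get(t, [])), default=0.0)
        p = min(range(n_proc), key=lambda i: max(proc_free[i], ready))
        start = max(proc_free[p], ready)
        finish[t] = start + tasks[t]
        proc_free[p] = finish[t]
        schedule.append((t, p, start, finish[t]))

    for t, p, s, f in schedule:
        print(f"task {t} on CPU{p}: {s:.0f}-{f:.0f}")
    ```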

  6. Neutron radiographic viewing system

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The design, development and application of a neutron radiographic viewing system for use in nondestructive testing applications is considered. The system consists of a SEC vidicon camera, neutron image intensifier system, disc recorder, and TV readout. Neutron bombardment of the subject is recorded by an image converter and passed through an optical system into the SEC vidicon. The vidicon output may be stored, or processed for visual readout.

  7. Integration process of fermentation and liquid biphasic flotation for lipase separation from Burkholderia cepacia.

    PubMed

    Sankaran, Revathy; Show, Pau Loke; Lee, Sze Ying; Yap, Yee Jiun; Ling, Tau Chuan

    2018-02-01

    Liquid Biphasic Flotation (LBF) is an advanced recovery method that has been effectively applied to biomolecule extraction. The objective of this investigation is to integrate the fermentation and extraction of lipase from Burkholderia cepacia using a flotation system. An initial study compared bacterial growth and lipase production in flotation and shaker systems. The results show quicker bacterial growth and higher lipase yield in the flotation system. The integrated process for lipase separation showed high efficiency, reaching 92.29%, with a yield of 95.73%. Upscaling of the flotation system gave results consistent with the lab scale: 89.53% efficiency and 93.82% yield. Combining the upstream and downstream processes in a single system accelerates product formation, improves product yield, and facilitates downstream processing. This integrated system demonstrated its potential for biomolecule fermentation and separation, possibly opening new opportunities for industrial production. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Analysis of electromagnetic interference from power system processing and transmission components for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Barber, Peter W.; Demerdash, Nabeel A. O.; Hurysz, B.; Luo, Z.; Denny, Hugh W.; Millard, David P.; Herkert, R.; Wang, R.

    1992-01-01

    The goal of this research project was to analyze the potential effects of electromagnetic interference (EMI) originating from power system processing and transmission components for Space Station Freedom. The approach consists of four steps: (1) developing analytical tools (models and computer programs); (2) conducting parameterization (what if?) studies; (3) predicting the global space station EMI environment; and (4) providing a basis for modification of EMI standards.

  9. The Role of Water Chemistry in Marine Aquarium Design: A Model System for a General Chemistry Class

    ERIC Educational Resources Information Center

    Keaffaber, Jeffrey J.; Palma, Ramiro; Williams, Kathryn R.

    2008-01-01

    Water chemistry is central to aquarium design, and it provides many potential applications for discussion in undergraduate chemistry and engineering courses. Marine aquaria and their life support systems feature many chemical processes. A life support system consists of the entire recirculation system, as well as the habitat tank and all ancillary…

  10. A Conceptual Level Design for a Static Scheduler for Hard Real-Time Systems

    DTIC Science & Technology

    1988-03-01

    The design of hard real-time systems is gaining a great deal of attention in the software engineering field as more and more real-world processes are ... for these hard real-time systems. PSDL, as an executable design language, is supported by an execution support system consisting of a static scheduler, dynamic scheduler, and translator.

  11. From Noise to Order: The Psychological Development of Knowledge and Phenocopy in Biology

    ERIC Educational Resources Information Center

    Piaget, Jean

    1975-01-01

    Shows that one of the most general processes in the development of cognitive structures consists in replacing exogenous knowledge by endogenous reconstructions that reconstitute the same forms but incorporate them into systems whose internal composition is a pre-requisite. Biologically equivalent process is discussed. (Author/AM)

  12. Develop and Implement an Integrated Enterprise Information System for a Computer-Integrated Apparel Enterprise (CIAE).

    DTIC Science & Technology

    1998-01-24

    the Apparel Manufacturing Architecture (AMA), a generic architecture for an apparel enterprise. ARN-AIMS consists of three modules: Order Processing, Order Tracking, and Shipping & Invoicing. The Order Processing Module is designed to facilitate the entry of customer orders for stock and special...

  13. ECO LOGIC INTERNATIONAL GAS-PHASE CHEMICAL REDUCTION PROCESS - THE REACTOR SYSTEM - APPLICATIONS ANALYSIS REPORT

    EPA Science Inventory

    The ELI Eco Logic International Inc. (Eco Logic) process thermally separates organics, then chemically reduces them in a hydrogen atmosphere, converting them to a reformed gas that consists of light hydrocarbons and water. A scrubber treats the reformed gas to remove hydrogen chl...

  14. A Split-Attention Effect in Multimedia Learning: Evidence for Dual Processing Systems in Working Memory.

    ERIC Educational Resources Information Center

    Mayer, Richard E.; Moreno, Roxana

    1998-01-01

    Multimedia learners (n=146 college students) were able to integrate words and computer-presented pictures more easily when the words were presented aurally rather than visually. This split-attention effect is consistent with a dual-processing model of working memory. (SLD)

  15. Understanding the Refugee Experience: Foundations of a Better Resettlement System.

    ERIC Educational Resources Information Center

    Stein, Barry N.

    1981-01-01

    Suggests the need for understanding refugees' characteristics, experiences, resettlement needs, and their patterns of behavior during the resettlement process. Examines consistencies which have been observed in refugee experiences and the process of adaptation to a new society. Treats specifically the conditions of Soviet Jewish and Indochinese…

  16. Graphic Arts: Book Three. The Press and Related Processes.

    ERIC Educational Resources Information Center

    Farajollahi, Karim; And Others

    The third of a three-volume set of instructional materials for a graphic arts course, this manual consists of nine instructional units dealing with presses and related processes. Covered in the units are basic press fundamentals, offset press systems, offset press operating procedures, offset inks and dampening chemistry, preventive maintenance…

  17. Evaluation: Boundary Identification in the Non-Linear Special Education System.

    ERIC Educational Resources Information Center

    Yacobacci, Patricia M.

    The evaluation process within special education, as in general education, most often becomes one of data collection consisting of formal and informal tests given by the school psychologist and the classroom instructor. Influences of the complex environment on the educational process are often ignored. Evaluation factors include mainstreaming,…

  18. The Swedish system for compensation of patient injuries.

    PubMed

    Johansson, Henry

    2010-05-01

    Since 1975 Sweden has had a patient insurance system to compensate patients for health-related injuries. The system was initially based on a voluntary patient insurance solution, but in 1997 it was replaced by the Patient Insurance Act. The current Act covers both physical and mental injuries. Although about 9,000-10,000 cases are processed in Sweden annually, compensation is paid in barely half of these cases. In the Swedish patient injury claim processing system, the Patient Claims Panel is the authority that plays an important role in ensuring fair and consistent application of the Act.

  19. The Digital Data Acquisition System for the Russian VLBI Network of New Generation

    NASA Technical Reports Server (NTRS)

    Fedotov, Leonid; Nosov, Eugeny; Grenkov, Sergey; Marshalov, Dmitry

    2010-01-01

    The system consists of several identical channels of 1024 MHz bandwidth each. In each channel, the RF band is frequency-translated to the intermediate frequency range of 1-2 GHz. Each channel consists of two parts: the digitizer and a Mark 5C recorder. The digitizer is placed on the antenna close to the corresponding Low-Noise Amplifier output and consists of an analog frequency converter, an ADC, and an FPGA-based digital signal processing device. The digitizer uses sub-sampling at a frequency of 2048 MHz. To produce narrow-band channels and to interface with existing data acquisition systems, polyphase filtering in the FPGA can be used. Digital signals are re-quantized to 2 bits in the FPGA and transferred to an input of the Mark 5C through a fiber line. The breadboard model of the digitizer is being tested, and the data acquisition system is being designed.
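
    The 2-bit requantization step can be sketched as follows; the threshold placement at roughly ±1 sigma is the usual VLBI choice, assumed here since the abstract does not state it:

    ```python
    import numpy as np

    # Map IF samples to 4 levels (2-bit codes) with thresholds at 0 and
    # +/- ~0.98 sigma, the spacing commonly used in VLBI to minimize
    # quantization loss.

    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, 1_000_000)             # unit-variance IF samples

    v = 0.98 * x.std()                              # threshold magnitude
    levels = np.digitize(x, [-v, 0.0, v])           # 0..3 -> 2-bit codes

    counts = np.bincount(levels, minlength=4) / x.size
    print("occupancy of the four 2-bit states:", counts.round(3))
    ```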

  20. Miss-distance indicator for tank main guns

    NASA Astrophysics Data System (ADS)

    Bornstein, Jonathan A.; Hillis, David B.

    1996-06-01

    Tank main gun systems must possess extremely high levels of accuracy to perform successfully in battle. Under some circumstances, the first round fired in an engagement may miss the intended target, and it becomes necessary to rapidly correct fire. A breadboard automatic miss-distance indicator system was previously developed to assist in this process. The system, which would be mounted on a 'wingman' tank, consists of a charge-coupled device (CCD) camera and computer-based image-processing system, coupled with a separate infrared sensor to detect muzzle flash. For the system to be successfully employed with current-generation tanks, it must be reliable, relatively low cost, and able to respond rapidly enough to maintain current firing rates. Recently, the original indicator system was developed further in an effort to meet these goals. Efforts have focused primarily upon enhanced image-processing algorithms, both to improve system reliability and to reduce processing requirements. Intelligent application of newly refined trajectory models has permitted examination of reduced areas of interest and enhanced rejection of false alarms, significantly improving system performance.

  1. Passive fire building protection system evaluation (case study: millennium ict centre)

    NASA Astrophysics Data System (ADS)

    Rahman, Vinky; Stephanie

    2018-03-01

    A passive fire protection system is a system built into the building design, in terms of both architecture and structure. It usually consists of structural protection that protects the building's structure, prevents the spread of fire, and facilitates the evacuation process in case of fire. Millennium ICT Center is the largest electronics shopping center in Medan, Indonesia. As a public building that accommodates crowds, it needs a fire protection system that meets the standards. Therefore, the purpose of this study is to evaluate the passive fire protection system of the Millennium ICT Center building. The study documented the existing conditions of the building through direct observation at the research location. The collected data were then weighted using the AHP (Analytical Hierarchy Process) method to obtain a reliability value for the passive fire protection system. The results showed that the building has several passive fire protection components, but some still do not meet the requirements.
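
    A minimal sketch of the AHP weighting step: a pairwise comparison matrix is reduced to priority weights via its principal eigenvector, with a consistency check. The three criteria and the judgment values below are hypothetical, not taken from the paper:

    ```python
    import numpy as np

    A = np.array([[1.0, 3.0, 5.0],      # structural protection vs others
                  [1/3, 1.0, 2.0],      # compartmentalization
                  [1/5, 1/2, 1.0]])     # evacuation-route design

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                        # priority weights

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)    # consistency index
    cr = ci / 0.58                          # random index for n = 3
    # CR < 0.1 is the usual rule of thumb for coherent judgments.
    print("weights:", w.round(3), "consistency ratio:", round(cr, 3))
    ```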

  2. AVIRIS and TIMS data processing and distribution at the land processes distributed active archive center

    NASA Technical Reports Server (NTRS)

    Mah, G. R.; Myers, J.

    1993-01-01

    The U.S. Government has initiated the Global Change Research Program, a systematic study of the Earth as a complete system. NASA's contribution to the Global Change Research Program is the Earth Observing System (EOS), a series of orbital sensor platforms and an associated data processing and distribution system. The EOS Data and Information System (EOSDIS) is the archiving, production, and distribution system for data collected by the EOS space segment and uses a multilayer architecture for processing, archiving, and distributing EOS data. The first layer consists of the spacecraft ground stations and processing facilities that receive the raw data from the orbiting platforms and then separate the data by individual sensors. The second layer consists of Distributed Active Archive Centers (DAAC) that process, distribute, and archive the sensor data. The third layer consists of a user science processing network. The EOSDIS is being developed in a phased implementation. The initial phase, Version 0, is a prototype of the operational system. Version 0 activities are based upon existing systems and are designed to provide an EOSDIS-like capability for information management and distribution. An important science support task is the creation of simulated data sets for EOS instruments from precursor aircraft or satellite data. The Land Processes DAAC, at the EROS Data Center (EDC), is responsible for archiving and processing EOS precursor data from airborne instruments such as the Thermal Infrared Multispectral Scanner (TIMS), the Thematic Mapper Simulator (TMS), and the Airborne Visible and Infrared Imaging Spectrometer (AVIRIS). AVIRIS, TIMS, and TMS are flown by the NASA-Ames Research Center (ARC) on an ER-2. The ER-2 flies at 65000 feet and can carry up to three sensors simultaneously. Most jointly collected data sets are somewhat boresighted and roughly registered. The instrument data are being used to construct data sets that simulate the spectral and spatial characteristics of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument scheduled to be flown on the first EOS-AM spacecraft. The ASTER is designed to acquire 14 channels of land science data in the visible and near-IR (VNIR), shortwave-IR (SWIR), and thermal-IR (TIR) regions from 0.52 micron to 11.65 micron at high spatial resolutions of 15 m to 90 m. Stereo data will also be acquired in the VNIR region in a single band. The AVIRIS and TMS cover the ASTER VNIR and SWIR bands, and the TIMS covers the TIR bands. Simulated ASTER data sets have been generated over Death Valley, California, Cuprite, Nevada, and the Drum Mountains, Utah, using a combination of AVIRIS, TIMS, and TMS data, and existing digital elevation models (DEM) for the topographic information.

  3. A Model-Based Approach to Developing Your Mission Operations System

    NASA Technical Reports Server (NTRS)

    Smith, Robert R.; Schimmels, Kathryn A.; Lock, Patricia D; Valerio, Charlene P.

    2014-01-01

    Model-Based System Engineering (MBSE) is an increasingly popular methodology for designing complex engineering systems. As the use of MBSE has grown, it has begun to be applied to systems that are less hardware-based and more people- and process-based. We describe our approach to incorporating MBSE as a way to streamline development, and how to build a model consisting of core resources, such as requirements and interfaces, that can be adapted and used by new and upcoming projects. By comparing traditional Mission Operations System (MOS) system engineering with an MOS designed via a model, we will demonstrate the benefits to be obtained by incorporating MBSE in system engineering design processes.

  4. Dual processing and diagnostic errors.

    PubMed

    Norman, Geoff

    2009-09-01

    In this paper, I review evidence from two theories in psychology relevant to diagnosis and diagnostic errors. "Dual Process" theories of thinking, frequently mentioned with respect to diagnostic error, propose that categorization decisions can be made with either a fast, unconscious, contextual process called System 1 or a slow, analytical, conscious, and conceptual process called System 2. Exemplar theories of categorization propose that many category decisions in everyday life are made by unconscious matching to a particular example in memory, and these examples remain available and retrievable individually. I then review studies of clinical reasoning based on these theories, and show that the two processes are equally effective; System 1, despite its reliance on idiosyncratic, individual experience, is no more prone to cognitive bias or diagnostic error than System 2. Further, I review evidence that instructions directed at encouraging the clinician to explicitly use both strategies can lead to consistent reduction in error rates.

  5. Development of a data acquisition system using a RISC/UNIX workstation

    NASA Astrophysics Data System (ADS)

    Takeuchi, Y.; Tanimori, T.; Yasu, Y.

    1993-05-01

    We have developed a compact data acquisition system on RISC/UNIX workstations. A SUN SPARCstation IPC was used, in which the extension bus "SBus" was linked to a VMEbus. The transfer rate achieved was better than 7 Mbyte/s between the VMEbus and the SUN. A device driver for CAMAC was developed in order to realize an interruptive feature in UNIX. In addition, list processing has been incorporated in order to keep the high priority of the data handling process in UNIX. The successful development of both the device driver and list processing has made it possible to realize a good real-time feature on the RISC/UNIX system. Based on this architecture, a portable and versatile data taking system has been developed, which consists of a graphical user interface, I/O handler, user analysis process, process manager, and a CAMAC device driver.

  6. Self-consistent hybrid functionals for solids: a fully-automated implementation

    NASA Astrophysics Data System (ADS)

    Erba, A.

    2017-08-01

    A fully-automated algorithm for the determination of the system-specific optimal fraction of exact exchange in self-consistent hybrid functionals of density-functional theory is illustrated, as implemented in the public Crystal program. The exchange fraction of this new class of functionals is self-consistently updated in proportion to the inverse of the dielectric response of the system within an iterative procedure (Skone et al 2014 Phys. Rev. B 89, 195112). Each iteration of the present scheme, in turn, implies convergence of a self-consistent-field (SCF) and a coupled-perturbed-Hartree-Fock/Kohn-Sham (CPHF/KS) procedure. The present implementation, besides improving the user-friendliness of self-consistent hybrids, exploits the unperturbed and electric-field-perturbed density matrices from previous iterations as guesses for subsequent SCF and CPHF/KS iterations, which is documented to reduce the overall computational cost of the whole process by a factor of 2.
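
    The self-consistent loop can be caricatured as a scalar fixed-point iteration. The dielectric response below is a made-up stand-in for the expensive CPHF/KS step, so this is only a sketch of the update rule alpha = 1/eps_inf(alpha):

    ```python
    def eps_inf(alpha):
        # Hypothetical monotone response: a larger exchange fraction opens
        # the gap and lowers the dielectric constant.
        return 6.0 - 3.0 * alpha

    alpha, tol = 0.25, 1e-8                 # typical starting guess
    for it in range(100):
        new = 1.0 / eps_inf(alpha)          # alpha <- 1 / eps_inf(alpha)
        if abs(new - alpha) < tol:
            break
        alpha = new
    print(f"converged alpha = {alpha:.6f} after {it} iterations")
    ```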

  7. IDC Re-Engineering Phase 2 System Requirements Document Version 1.4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, James M.; Burns, John F.; Satpathi, Meara Allena

    This System Requirements Document (SRD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The IDC applies, on a routine basis, automatic processing methods and interactive analysis to raw International Monitoring System (IMS) data in order to produce, archive, and distribute standard IDC products on behalf of all States Parties. The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide data, but includes requirements for the dissemination of radionuclide data and products.

  8. IDC Re-Engineering Phase 2 System Requirements Document V1.3.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, James M.; Burns, John F.; Satpathi, Meara Allena

    2015-12-01

    This System Requirements Document (SRD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The IDC applies, on a routine basis, automatic processing methods and interactive analysis to raw International Monitoring System (IMS) data in order to produce, archive, and distribute standard IDC products on behalf of all States Parties. The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide data, but includes requirements for the dissemination of radionuclide data and products.

  9. Data Acquisition and Processing System for Airborne Wind Profiling with a Pulsed, 2-Micron, Coherent-Detection, Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, J. Y.; Koch, G. J.; Kavaya, M. J.

    2010-01-01

    A data acquisition and signal processing system is being developed for a 2-micron airborne wind-profiling coherent Doppler lidar system. This lidar, called the Doppler Aerosol Wind Lidar (DAWN), is based on a Ho:Tm:LuLiF laser transmitter and a 15-cm diameter telescope. It is being packaged for flights onboard the NASA DC-8, with the first flights in the summer of 2010 in support of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The data acquisition and processing system is housed in a compact PCI chassis and consists of four components: a digitizer, a digital signal processing (DSP) module, a video controller, and a serial port controller. The data acquisition and processing software (DAPS) is also being developed to control the system, including real-time data analysis and display. The system detects an external 10 Hz trigger pulse, initiates the data acquisition and processing sequence, and displays selected wind profile parameters such as Doppler shift, power distribution, and wind directions and velocities. The Doppler shift created by aircraft motion is measured by an inertial navigation/GPS sensor and fed to the signal processing system for real-time removal of aircraft effects from wind measurements. A general overview of the system and the DAPS as well as the coherent Doppler lidar system is presented in this paper.
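
    The retrieval described, converting a Doppler shift to line-of-sight velocity and removing platform motion, follows v = lambda * delta_f / 2. A sketch with illustrative numbers, not DAWN flight data:

    ```python
    WAVELENGTH = 2.05e-6                    # m, 2-micron transmitter

    def los_wind(doppler_shift_hz, aircraft_los_velocity):
        """Line-of-sight wind speed (m/s) with platform motion removed."""
        v_total = 0.5 * WAVELENGTH * doppler_shift_hz   # v = lambda*df/2
        return v_total - aircraft_los_velocity          # subtract INS/GPS term

    # 20 MHz measured shift, of which 12.3 m/s is due to aircraft motion:
    print(f"{los_wind(20e6, 12.3):.2f} m/s")
    ```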

  10. Online thesis guidance management information system

    NASA Astrophysics Data System (ADS)

    Nasution, T. H.; Pratama, F.; Tanjung, K.; Siregar, I.; Amalia, A.

    2018-03-01

    The development of internet technology in education is still not fully exploited, especially in the thesis guidance process between students and lecturers. The difficulties lecturers face in helping students during thesis guidance are the limited communication time and the compatibility of schedules between students and lecturers. To solve this problem, we designed an online thesis guidance management information system that helps students and lecturers carry out the thesis tutoring process anytime, anywhere. The system consists of a web-based admin app for usage management and an Android-based app for students and lecturers.

  11. Process of producing liquid hydrocarbon fuels from biomass

    DOEpatents

    Kuester, J.L.

    1987-07-07

    A continuous thermochemical indirect liquefaction process is described for converting various biomass materials into diesel-type transportation fuels compatible with current engine designs and distribution systems, comprising feeding said biomass into a circulating-solid fluidized-bed gasification system to produce a synthesis gas containing olefins, hydrogen, and carbon monoxide, and thereafter introducing the synthesis gas into a catalytic liquefaction system to convert it into liquid hydrocarbon fuel consisting essentially of C7-C17 paraffinic hydrocarbons having cetane indices of 50+. 1 fig.

  12. Loran-C digital word generator for use with a KIM-1 microprocessor system

    NASA Technical Reports Server (NTRS)

    Nickum, J. D.

    1977-01-01

    The problem of translating the time of occurrence of received Loran-C pulses into a time referenced to a particular period of occurrence is addressed and applied to the design of a digital word generator for a Loran-C sensor processor package. The digital information from this word generator is processed in a KIM-1 microprocessor system, which is based on the MOS 6502 CPU. The final system will constitute a complete time-difference sensor processor for determining position information using Loran-C charts. The system consists of the KIM-1 microprocessor module, a 4K RAM memory board, a user interface, and the Loran-C word generator.

  13. Interlibrary Lending with Computerized Union Catalogues.

    ERIC Educational Resources Information Center

    Lehmann, Klaus-Dieter

    Interlibrary loans in the Federal Republic of Germany are facilitated by applying techniques of data processing and computer output microfilm (COM) to the union catalogs of the national library system. The German library system consists of two national libraries, four central specialized libraries of technology, medicine, agriculture, and…

  14. Business Information Systems. Occupational Competency Analysis Profile.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Vocational Instructional Materials Lab.

    This Occupational Competency Analysis Profile (OCAP) for business information systems is an employer-verified competency list that evolved from a modified DACUM (Developing a Curriculum) job analysis process involving business, industry, labor, and community agency representatives throughout Ohio. The competency list consists of 10 units: (1) data…

  15. Intelligent manipulation technique for multi-branch robotic systems

    NASA Technical Reports Server (NTRS)

    Chen, Alexander Y. K.; Chen, Eugene Y. S.

    1990-01-01

    New analytical development in kinematics planning is reported. The INtelligent KInematics Planner (INKIP) consists of the kinematics spline theory and the adaptive logic annealing process. Also, a novel framework of robot learning mechanism is introduced. The FUzzy LOgic Self Organized Neural Networks (FULOSONN) integrates fuzzy logic in commands, control, searching, and reasoning, the embedded expert system for nominal robotics knowledge implementation, and the self organized neural networks for the dynamic knowledge evolutionary process. Progress on the mechanical construction of SRA Advanced Robotic System (SRAARS) and the real time robot vision system is also reported. A decision was made to incorporate the Local Area Network (LAN) technology in the overall communication system.

  16. Low Vision Enhancement System

    NASA Technical Reports Server (NTRS)

    1995-01-01

    NASA's Technology Transfer Office at Stennis Space Center worked with the Johns Hopkins Wilmer Eye Institute in Baltimore, Md., to incorporate NASA software originally developed by NASA to process satellite images into the Low Vision Enhancement System (LVES). The LVES, referred to as 'ELVIS' by its users, is a portable image processing system that could make it possible to improve a person's vision by enhancing and altering images to compensate for impaired eyesight. The system consists of two orientation cameras, a zoom camera, and a video projection system. The headset and hand-held control weigh about two pounds each. Pictured is Jacob Webb, the first Mississippian to use the LVES.

  17. Fully integrated carbon nanotube composite thin film strain sensors on flexible substrates for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Burton, A. R.; Lynch, J. P.; Kurata, M.; Law, K. H.

    2017-09-01

    Multifunctional thin film materials have opened many opportunities for novel sensing strategies for structural health monitoring. While past work has established methods of optimizing multifunctional materials to exhibit sensing properties, comparatively less work has focused on their integration into fully functional sensing systems capable of being deployed in the field. This study focuses on the advancement of a scalable fabrication process for the integration of multifunctional thin films into a fully integrated sensing system. This is achieved through the development of an optimized fabrication process that can create a broad range of sensing systems using multifunctional materials. A layer-by-layer deposited multifunctional composite consisting of single-walled carbon nanotubes (SWNT) in a polyvinyl alcohol and polysodium-4-styrene sulfonate matrix is incorporated with a lithography process to produce a fully integrated sensing system deposited on a flexible substrate. To illustrate the process, a strain sensing platform consisting of a patterned SWNT-composite thin film as a strain-sensitive element within an amplified Wheatstone bridge sensing circuit is presented. Strain sensing is selected because it presents many of the design and processing challenges that are core to patterning multifunctional thin film materials into sensing systems. Strain sensors fabricated on a flexible polyimide substrate are experimentally tested under cyclic loading using standard four-point bending coupons and a partial-scale steel frame assembly under lateral loading. The study reveals that the fabrication process is highly repeatable, producing fully integrated strain sensors with linearity and sensitivity exceeding 0.99 and 5 V/ε, respectively. The thin film strain sensors are robust and capable of strain measurements beyond 3000 με.

  18. SENSOR: a tool for the simulation of hyperspectral remote sensing systems

    NASA Astrophysics Data System (ADS)

    Börner, Anko; Wiest, Lorenz; Keller, Peter; Reulke, Ralf; Richter, Rolf; Schaepman, Michael; Schläpfer, Daniel

    The consistent end-to-end simulation of airborne and spaceborne earth remote sensing systems is an important task, and sometimes the only way for the adaptation and optimisation of a sensor and its observation conditions, the choice and test of algorithms for data processing, error estimation and the evaluation of the capabilities of the whole sensor system. The presented software simulator SENSOR (Software Environment for the Simulation of Optical Remote sensing systems) includes a full model of the sensor hardware, the observed scene, and the atmosphere in between. The simulator consists of three parts. The first part describes the geometrical relations between scene, sun, and the remote sensing system using a ray-tracing algorithm. The second part of the simulation environment considers the radiometry. It calculates the at-sensor radiance using a pre-calculated multidimensional lookup-table taking the atmospheric influence on the radiation into account. The third part consists of an optical and an electronic sensor model for the generation of digital images. Using SENSOR for an optimisation requires the additional application of task-specific data processing algorithms. The principle of the end-to-end-simulation approach is explained, all relevant concepts of SENSOR are discussed, and first examples of its use are given. The verification of SENSOR is demonstrated. This work is closely related to the Airborne PRISM Experiment (APEX), an airborne imaging spectrometer funded by the European Space Agency.

  19. TEX-SIS FOLLOW-UP: Student Follow-up Management Information System. Data Processing Manual.

    ERIC Educational Resources Information Center

    Tarrant County Junior Coll. District, Ft. Worth, TX.

    Project FOLLOW-UP was conducted to develop, test, and validate a statewide management information system for follow-up of Texas public junior and community college students. The result of this project was a student information system (TEX-SIS) consisting of seven subsystems: (1) Student's Educational Intent, (2) Nonreturning Student Follow-up, (3)…

  20. Understanding the Perceived Usefulness and the Ease of Use of a Hospital Information System: the case of a French University Hospital.

    PubMed

    Ologeanu-Taddei, R; Morquin, D; Bourret, R

    2015-01-01

    The goal of this study was to examine the perceived usefulness and the perceived ease of use of a Hospital Information System (HIS) for the care staff. We administered a questionnaire composed of open-ended and closed questions. The results show that perceived usefulness and ease of use are correlated with medical occupation. Content analysis of the open questions highlights three factors influencing these constructs: ergonomics, errors in the documenting process, and insufficient compatibility with the medical department or the occupation. While the results are consistent with the literature, they show that medical occupations do not all report the same low ratings of perceived usefulness and ease of use. The main explanation lies in the medical risk involved in the prescription process for anesthesiologists, surgeons, and physicians.

  1. The UARS and open data system concept and analysis study. Executive summary

    NASA Technical Reports Server (NTRS)

    Mittal, M.; Nebb, J.; Woodward, H.

    1983-01-01

    Alternative concepts for a common design for the UARS and OPEN Central Data Handling Facility (CDHF) are offered. The designs are consistent with requirements shared by UARS and OPEN and the data storage and data processing demands of these missions. Because more detailed information is available for UARS, the design approach was to size the system and to select components for a UARS CDHF, but in a manner that does not optimize the CDHF at the expense of OPEN. Costs for alternative implementations of the UARS designs are presented showing that the system design does not restrict the implementation to a single manufacturer. Processing demands on the alternative UARS CDHF implementations are discussed. With this information at hand together with estimates for OPEN processing demands, it is shown that any shortfall in system capability for OPEN support can be remedied by either component upgrades or array processing attachments rather than a system redesign.

  2. Empirical modeling for intelligent, real-time manufacture control

    NASA Technical Reports Server (NTRS)

    Xu, Xiaoshu

    1994-01-01

    Artificial neural systems (ANS), also known as neural networks, are an attempt to develop computer systems that emulate the neural reasoning behavior of biological neural systems (e.g. the human brain). As such, they are loosely based on biological neural networks. The ANS consists of a series of nodes (neurons) and weighted connections (axons) that, when presented with a specific input pattern, can associate specific output patterns. It is essentially a highly complex, nonlinear, mathematical relationship or transform. These constructs have two significant properties that have proven useful to the authors in signal processing and process modeling: noise tolerance and complex pattern recognition. Specifically, the authors have developed a new network learning algorithm that has resulted in the successful application of ANS's to high speed signal processing and to developing models of highly complex processes. Two of the applications, the Weld Bead Geometry Control System and the Welding Penetration Monitoring System, are discussed in the body of this paper.
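
    A minimal sketch of the structure described: nodes joined by weighted connections that map an input pattern to an output pattern through a nonlinear transform. This generic two-layer network illustrates the concept only; it is not the authors' specific learning algorithm:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    W1 = rng.normal(0, 0.5, (4, 3))         # input (3) -> hidden (4) weights
    W2 = rng.normal(0, 0.5, (2, 4))         # hidden (4) -> output (2) weights

    def forward(x):
        h = np.tanh(W1 @ x)                 # hidden-node activations
        return np.tanh(W2 @ h)              # associated output pattern

    print(forward(np.array([0.2, -0.7, 1.0])))
    ```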

  3. All-IP-Ethernet architecture for real-time sensor-fusion processing

    NASA Astrophysics Data System (ADS)

    Hiraki, Kei; Inaba, Mary; Tezuka, Hiroshi; Tomari, Hisanobu; Koizumi, Kenichi; Kondo, Shuya

    2016-03-01

    Serendipter is a device that distinguishes and selects very rare particles and cells from a huge population. We are currently designing and constructing an information processing system for a Serendipter. The information processing system for Serendipter is a kind of sensor-fusion system, but with much more demanding requirements. To fulfill these requirements, we adopt an all-IP architecture: the all-IP-Ethernet-based data processing system consists of (1) sensors/detectors that output data directly as IP-Ethernet packet streams, (2) aggregation into single Ethernet/TCP/IP streams by an L2 100 Gbps Ethernet switch, and (3) an FPGA board with a 100 Gbps Ethernet interface connected to the switch and a Xeon-based server. Circuits in the FPGA include a 100 Gbps Ethernet MAC, buffers and preprocessing, and real-time deep learning circuits using multi-layer neural networks. The proposed all-IP architecture solves existing problems in constructing large-scale sensor-fusion systems.

  4. A Spiking Neural Network System for Robust Sequence Recognition.

    PubMed

    Yu, Qiang; Yan, Rui; Tang, Huajin; Tan, Kay Chen; Li, Haizhou

    2016-03-01

    This paper proposes a biologically plausible network architecture with spiking neurons for sequence recognition. This architecture is a unified and consistent system with functional parts of sensory encoding, learning, and decoding. This is the first systematic model attempting to reveal the neural mechanisms considering both the upstream and the downstream neurons together. The whole system is a consistent temporal framework, where the precise timing of spikes is employed for information processing and cognitive computing. Experimental results show that the system is competent to perform the sequence recognition, being robust to noisy sensory inputs and invariant to changes in the intervals between input stimuli within a certain range. The classification ability of the temporal learning rule used in the system is investigated through two benchmark tasks that outperform the other two widely used learning rules for classification. The results also demonstrate the computational power of spiking neurons over perceptrons for processing spatiotemporal patterns. In summary, the system provides a general way with spiking neurons to encode external stimuli into spatiotemporal spikes, to learn the encoded spike patterns with temporal learning rules, and to decode the sequence order with downstream neurons. The system structure would be beneficial for developments in both hardware and software.
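
    A sketch of the kind of spiking unit such a system builds on: a leaky integrate-and-fire neuron whose precise spike times carry the information. The paper's actual encoding and temporal learning rules are more elaborate; this shows only the spike-timing substrate:

    ```python
    import numpy as np

    dt, tau, v_th, v_reset = 1e-3, 20e-3, 1.0, 0.0
    t = np.arange(0.0, 0.5, dt)
    current = 1.2 * (t > 0.1)               # step input at 100 ms

    v, spikes = 0.0, []
    for ti, i_in in zip(t, current):
        v += dt / tau * (-v + i_in)         # leaky integration of input
        if v >= v_th:                       # threshold crossing -> spike
            spikes.append(ti)
            v = v_reset

    print(f"{len(spikes)} spikes, first at {spikes[0]*1e3:.1f} ms")
    ```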

  5. Consistent Chemical Mechanism from Collaborative Data Processing

    DOE PAGES

    Slavinskaya, Nadezda; Starcke, Jan-Hendrik; Abbasi, Mehdi; ...

    2016-04-01

    The numerical tool of the Process Informatics Model (PrIMe) is a mathematically rigorous and numerically efficient approach for the analysis and optimization of chemical systems. It handles heterogeneous data and is scalable to a large number of parameters. The Bound-to-Bound Data Collaboration module of the automated data-centric infrastructure of PrIMe was used for systematic uncertainty and data consistency analyses of the H2/CO reaction model (73/17) and 94 experimental targets (ignition delay times). An empirical rule for evaluating the shock tube experimental data is proposed. The initial results demonstrate clear benefits of the PrIMe methods for evaluating kinetic data quality and data consistency and for developing predictive kinetic models.

  6. Back to the Future: Consistency-Based Trajectory Tracking

    NASA Technical Reports Server (NTRS)

    Kurien, James; Nayak, P. Pandurand; Norvig, Peter (Technical Monitor)

    2000-01-01

    Given a model of a physical process and a sequence of commands and observations received over time, the task of an autonomous controller is to determine the likely states of the process and the actions required to move the process to a desired configuration. We introduce a representation and algorithms for incrementally generating approximate belief states for a restricted but relevant class of partially observable Markov decision processes with very large state spaces. The algorithm presented incrementally generates, rather than revises, an approximate belief state at any point by abstracting and summarizing segments of the likely trajectories of the process. This enables applications to efficiently maintain a partial belief state when it remains consistent with observations and revisit past assumptions about the process' evolution when the belief state is ruled out. The system presented has been implemented and results on examples from the domain of spacecraft control are presented.
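
    The belief-state tracking being approximated is, at bottom, a discrete Bayes filter over hidden process states, where each command/observation pair refines the distribution. The toy valve model below (states, transition and observation probabilities) is hypothetical:

    ```python
    import numpy as np

    states = ["open", "closed", "stuck"]
    T = np.array([[0.90, 0.05, 0.05],       # P(s' | s, cmd="open")
                  [0.85, 0.05, 0.10],
                  [0.00, 0.00, 1.00]])
    O = np.array([0.90, 0.10, 0.30])        # P(obs="flow" | s')

    belief = np.array([0.4, 0.5, 0.1])      # prior over states

    belief = T.T @ belief                   # predict after the "open" command
    belief *= O                             # condition on observing flow
    belief /= belief.sum()                  # renormalize

    for s, b in zip(states, belief):
        print(f"P({s}) = {b:.3f}")
    ```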

  7. Photoacoustic CO2 sensor system: design and potential for miniaturization and integration in silicon

    NASA Astrophysics Data System (ADS)

    Huber, J.; Wöllenstein, J.

    2015-05-01

    The detection of CO2 indoors has a large impact on today's sensor market. The ambient room climate is important for human health and wellbeing. The CO2 concentration is a main indicator of indoor climate and correlates with the number of persons inside a room. People in Europe spend more than 90% of their time indoors. This leads to a high demand for miniaturized and energy-efficient CO2 sensors. To realize small and energy-efficient mass-market sensors, we are developing novel miniaturized photoacoustic sensor systems with a design optimized for real-time and selective CO2 detection. The sensor system consists of two chambers, a measurement chamber and a detection chamber. The detection chamber contains an integrated pressure sensor under a special gas atmosphere; as the pressure sensor we use a commercially available cell phone microphone. We describe a possible miniaturization process for the developed system by considering the integration of all sensor parts. The system is manufactured with precision mechanics and IR-optical sapphire windows as optical connections. During the miniaturization process the sapphire windows are replaced by Si chips with a special IR anti-reflection coating. The developed system is characterized in detail with gas measurements and optical transmission investigations. The results of the characterization process offer high potential for further miniaturization and mass-market applications.

  8. Maturation of Structural Health Management Systems for Solid Rocket Motors

    NASA Technical Reports Server (NTRS)

    Quing, Xinlin; Beard, Shawn; Zhang, Chang

    2011-01-01

    Concepts of an autonomous and automated space-compliant diagnostic system were developed for condition-based maintenance (CBM) of rocket motors for space exploration vehicles. The diagnostic system will provide real-time information on the integrity of critical structures on launch vehicles, improve their performance, and greatly increase crew safety while decreasing inspection costs. Using the SMART Layer technology as a basis, detailed procedures and calibration techniques for implementation of the diagnostic system were developed. The diagnostic system is a distributed system consisting of a sensor network, local data loggers, and a host central processor. The system detects external impacts to the structure; its major functions include estimating the impact location, the impact force, and the structural damage at the impacted location. The system comprises a large-area sensor network and dedicated local data loggers with signal processing and data analysis software to allow real-time, in situ monitoring and long-term tracking of the structural integrity of solid rocket motors. Specifically, the system provides easy installation of large sensor networks, onboard operation under harsh environments and loading, inspection of inaccessible areas without disassembly, detection of impact events and impact damage in real time, and monitoring of a large area with local data processing to reduce wiring.

  9. Computer simulation of the human respiratory system for educational purposes.

    PubMed

    Botsis, Taxiarhis; Halkiotis, Stelios-Chris; Kourlaba, Georgia

    2004-01-01

    The main objective of this study was the development of a computer simulation system of the human respiratory system for educating nursing students. The approach was based on existing mathematical models and on specific functions we constructed. Appropriate software packages were used for the development of this educational tool, according to the special demands of the process. The system, called ReSim (Respiratory Simulation), consists of two parts: the first deals with pulmonary volumes, and the second represents the mechanical behavior of the lungs. The target group evaluated ReSim. The outcomes of the evaluation process were positive and helped us identify the system characteristics that needed improvement. Our basic conclusion is that the extended use of such systems supports the educational process and offers new potential for learning.

  10. Sensitivity of measurement-based purification processes to inner interactions

    NASA Astrophysics Data System (ADS)

    Militello, Benedetto; Napoli, Anna

    2018-02-01

    The sensitivity of a repeated measurement-based purification scheme to additional undesired couplings is analyzed, focusing on the very simple and archetypical system consisting of two two-level systems interacting with a repeatedly measured one. Several regimes are considered and in the strong coupling limit (i.e., when the coupling constant of the undesired interaction is very large) the occurrence of a quantum Zeno effect is proven to dramatically jeopardize the efficiency of the purification process.

  11. Partnering for Quality under the Workforce Investment Act: A Tool Kit for One-Stop System Building. Module 5: Building a Process for Continuous Improvement. Training Manual with Participant Workbook.

    ERIC Educational Resources Information Center

    Kogan, Deborah; Koller, Vinz; Kozumplik, Richalene; Lawrence, Mary Ann

    This document is part of a five-module training package to help employment and training service providers comply with the Workforce Investment Act (WIA) of 1998 and develop a one-stop training and employment services system. It consists of the participant workbook, trainer manual, and activity worksheets for a module on building a process for…

  12. Molecular forms of C-type natriuretic peptide in cerebrospinal fluid and plasma reflect differential processing in brain and pituitary tissues.

    PubMed

    Wilson, Michele O; Barrell, Graham K; Prickett, Timothy C R; Espiner, Eric A

    2018-01-01

    C-type natriuretic peptide (CNP) is a paracrine growth factor widely expressed within tissues of the central nervous system. Consistent with this is the high concentration of CNP in cerebrospinal fluid (CSF), exceeding levels in the systemic circulation. CNP abundance is high in hypothalamus and especially enriched in pituitary tissue where - in contrast to hypothalamus - processing to CNP-22 is minimal. Recently we have shown that dexamethasone acutely raises CNP peptides throughout the brain as well as in CSF and plasma. Postulating that molecular forms of CNP would differ in central tissues compared to forms in pituitary and plasma, we have characterized the molecular forms of CNP in tissues (hypothalamus, anterior and posterior pituitary gland) and associated fluids (CSF and plasma) using size-exclusion high performance liquid chromatography (SE-HPLC) and radioimmunoassay in control (saline-treated) and dexamethasone-treated adult sheep. Three immunoreactive-CNP components were identified which were consistent with proCNP (1-103), CNP-53 and CNP-22, but the presence and proportions of these different fragments differed among tissues. Peaks consistent with CNP-53 were the dominant form in all tissues and fluids. Peaks consistent with proCNP, conspicuous in hypothalamic extracts, were negligible in CSF whereas proportions of low molecular weight immunoreactivity (IR) consistent with CNP-22 were similar in hypothalamus, posterior pituitary gland and CSF. In contrast, in both plasma and the anterior pituitary gland, proportions of higher molecular weight IR, consistent with CNP-53 and proCNP, predominated, and low molecular weight IR consistent with CNP-22 was very low. After dexamethasone, proCNP like material - but not other forms - was increased in all samples except CSF, consistent with increased synthesis and secretion. In conclusion, immunoreactive forms of CNP in central tissues differ from those identified in anterior pituitary tissue and plasma - suggesting that the anterior pituitary gland may contribute to systemic levels of CNP in some physiological settings. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Microphysics in Multi-scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2012-01-01

    Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional-scale model (the NASA-unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and the explicit cloud-radiation and cloud-land surface interactive processes are applied across this multi-scale modeling system. The modeling system has been coupled with a multi-satellite simulator that uses NASA high-resolution satellite data to identify the strengths and weaknesses of the cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the microphysics development and its performance within the multi-scale modeling system will be presented.

  14. Representing Information in Patient Reports Using Natural Language Processing and the Extensible Markup Language

    PubMed Central

    Friedman, Carol; Hripcsak, George; Shagina, Lyuda; Liu, Hongfang

    1999-01-01

    Objective: To design a document model that provides reliable and efficient access to clinical information in patient reports for a broad range of clinical applications, and to implement an automated method using natural language processing that maps textual reports to a form consistent with the model. Methods: A document model that encodes structured clinical information in patient reports while retaining the original contents was designed using the extensible markup language (XML), and a document type definition (DTD) was created. An existing natural language processor (NLP) was modified to generate output consistent with the model. Two hundred reports were processed using the modified NLP system, and the XML output that was generated was validated using an XML validating parser. Results: The modified NLP system successfully processed all 200 reports. The output of one report was invalid, and 199 reports were valid XML forms consistent with the DTD. Conclusions: Natural language processing can be used to automatically create an enriched document that contains a structured component whose elements are linked to portions of the original textual report. This integrated document model provides a representation where documents containing specific information can be accurately and efficiently retrieved by querying the structured components. If manual review of the documents is desired, the salient information in the original reports can also be identified and highlighted. Using an XML model of tagging provides an additional benefit in that software tools that manipulate XML documents are readily available. PMID:9925230
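    As a rough illustration of the document model described above (a structured component whose elements link back to spans of the original text), the sketch below builds a minimal XML report of that shape in Python. The tag names, attributes, and the sample sentence are hypothetical, not the authors' DTD.

```python
import xml.etree.ElementTree as ET

# Minimal sketch of an XML document model that retains the original report
# text and adds a structured component whose elements point back into it.
# Tags, attributes, and the sample sentence are hypothetical.
report_text = "Chest radiograph shows mild cardiomegaly. No pleural effusion."

def span(phrase):
    # character offsets of the phrase within the original report text
    i = report_text.find(phrase)
    return {"start": str(i), "end": str(i + len(phrase))}

doc = ET.Element("report")
ET.SubElement(doc, "original_text").text = report_text
structured = ET.SubElement(doc, "structured")
ET.SubElement(structured, "finding",
              {"concept": "cardiomegaly", "certainty": "present",
               **span("cardiomegaly")})
ET.SubElement(structured, "finding",
              {"concept": "pleural effusion", "certainty": "absent",
               **span("pleural effusion")})

# Query the structured component, then recover the linked source span.
for f in doc.iter("finding"):
    s, e = int(f.get("start")), int(f.get("end"))
    print(f.get("concept"), "->", report_text[s:e])
```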

  15. Execution Of Systems Integration Principles During Systems Engineering Design

    DTIC Science & Technology

    2016-09-01

    This thesis discusses integration failures observed in DOD and non-DOD systems, such as inadequate stakeholder analysis, incomplete problem space and design ... design, development, test and deployment of a system. A lifecycle structure consists of phases within a methodology or process model. There are many ... investigate design decisions without the need to commit to physical forms; "experimental investigation using a model yields design or operational ..."

  16. SMART Rotor Development and Wind Tunnel Test

    DTIC Science & Technology

    2009-09-01

    ... amplifier and control system, and data acquisition, processing, and display systems. Boeing's LRTS (Fig. 2: system integration test at whirl tower) consists of a sled structure that ... demonstrated. Finally, the reliability of the flap actuation system was successfully proven in more than 60 hours of wind tunnel testing.

  17. Washington's Community and Technical Colleges' Student Achievement Initiative: Lessons Learned since the 2012 Revision and Considerations for New Allocation Model. Research Report 16-1

    ERIC Educational Resources Information Center

    Washington State Board for Community and Technical Colleges, 2016

    2016-01-01

    In January 2012, a system-wide task force came together for a nearly year-long process of revising the community and technical college system's performance-based funding (PBF) system, the Student Achievement Initiative. This review was consistent with national experts' recommendations for continuous evaluation of PBF systems to ensure overall…

  18. The Doctoral Portfolio: Centerpiece of a Comprehensive System of Evaluation

    ERIC Educational Resources Information Center

    Cobia, Debra C.; Carney, Jamie S.; Buckhalt, Joseph A.; Middleton, Renee A.; Shannon, David M.; Trippany, Robyn; Kunkel, Elizabeth

    2005-01-01

    The authors describe the process used to revise a traditional doctoral student evaluation system from one that consisted of written comprehensive and final oral examinations to one that features portfolio development. Student competence, expected student outcomes in each competency area, procedures for portfolio development, and documents and…

  19. DEMONSTRATION BULLETIN: X*TRAX MODEL 200 THERMAL DESORPTION SYSTEMS - CHEMICAL WASTE MANAGEMENT, INC.

    EPA Science Inventory

    The X*TRAX™ Model 200 Thermal Desorption System developed by Chemical Waste Management, Inc. (CWM), is a low-temperature process designed to separate organic contaminants from soils, sludges, and other solid media. The X*TRAX™ Model 200 is fully transportable and consists of thre...

  20. An introduction to the Marshall information retrieval and display system

    NASA Technical Reports Server (NTRS)

    1974-01-01

    An on-line terminal oriented data storage and retrieval system is presented which allows a user to extract and process information from stored data bases. The use of on-line terminals for extracting and displaying data from the data bases provides a fast and responsive method for obtaining needed information. The system consists of general purpose computer programs that provide the overall capabilities of the total system. The system can process any number of data files via a Dictionary (one for each file) which describes the data format to the system. New files may be added to the system at any time, and reprogramming is not required. Illustrations of the system are shown, and sample inquiries and responses are given.
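    The dictionary-driven design described here, in which a Dictionary describes each file's format so new files need no reprogramming, can be sketched as follows; the field names and fixed-width layout are invented for illustration.

```python
# Dictionary-driven file access sketch: a per-file "Dictionary" describes
# the record layout, so one generic routine reads any file and no
# reprogramming is needed for new files. Fields and layout are invented.
dictionary = [              # (field name, start column, width, type)
    ("part_no",   0, 6, str),
    ("quantity",  6, 5, int),
    ("location", 11, 8, str),
]

records = [
    "A10042  150WAREHSE1",
    "B20077    9WAREHSE2",
]

def parse(line, dictionary):
    return {name: typ(line[start:start + width].strip())
            for name, start, width, typ in dictionary}

# A sample inquiry: list records with quantity below 100.
for line in records:
    rec = parse(line, dictionary)
    if rec["quantity"] < 100:
        print(rec)
```

    Adding a new file type then amounts to supplying a new dictionary, not new code, which is the property the abstract highlights.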

  1. It Takes a Village to Design a Course: Embedding a Librarian in Course Design

    ERIC Educational Resources Information Center

    Mudd, Alex; Summey, Terri; Upson, Matt

    2015-01-01

    Often associated with online learning, instructional design is a process utilized in efficiently designing training and instruction to help ensure effectiveness. Typically, the instructional systems design (ISD) process uses a team-based approach, consisting of an instructor, a facilitator, a designer and a subject matter expert. Although library…

  2. High Power, High Energy Density Lithium-Ion Batteries

    DTIC Science & Technology

    2010-11-29

    cells and to provide affordable lithium-ion battery packs for the combat and tactical vehicle systems. - To address the manufacturing processes that will ... reduce cost of lithium-ion battery packs by one half through the improvement of manufacturing processes to enhance production consistency and increase the production yield of high power lithium-ion cells.

  3. Composite Materials for Maxillofacial Prostheses.

    DTIC Science & Technology

    1980-08-01

    projected composite systems are elastomeric-shelled, liquid-filled microcapsules. Experiments continued on the interfacial polymerization process with ... filled microcapsules. Experiments continued on the interfacial polymerization process, with spherical, sealed capsules achieved. Needs identified are ... consists of liquid-filled, elastomeric-shelled microcapsules held together to form a deformable mass; this is to simulate the semi-liquid cellular structure

  4. Generic E-Assessment Process Development Based on Reverse Engineering

    ERIC Educational Resources Information Center

    Hajjej, Fahima; Hlaoui, Yousra Bendaly; Ben Ayed, Leila Jemni

    2017-01-01

    The e-assessment, as an important part of any e-learning system, faces the same challenges and problems such as problems related to portability, reusability, adaptability, integration and interoperability. Therefore, we need an approach aiming to generate a general process of the e-assessment. The present study consists of the development of a…

  5. Role of filament annealing in the kinetics and thermodynamics of nucleated polymerization.

    PubMed

    Michaels, Thomas C T; Knowles, Tuomas P J

    2014-06-07

    The formation of nanoscale protein filaments from soluble precursor molecules through nucleated polymerization is a common form of supra-molecular assembly phenomenon. This process underlies the generation of a range of both functional and pathological structures in nature. Filament breakage has emerged as a key process controlling the kinetics of the growth reaction since it increases the number of filament ends in the system that can act as growth sites. In order to ensure microscopic reversibility, however, the inverse process of fragmentation, end-to-end annealing of filaments, is a necessary component of a consistent description of such systems. Here, we combine Smoluchowski kinetics with nucleated polymerization models to generate a master equation description of protein fibrillization, where filamentous structures can undergo end-to-end association, in addition to elongation, fragmentation, and nucleation processes. We obtain self-consistent closed-form expressions for the growth kinetics and discuss the key physics that emerges from considering filament fusion relative to current fragmentation only models. Furthermore, we study the key time scales that describe relaxation to equilibrium.
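    A minimal numerical sketch of the kind of model discussed, moment equations for nucleated polymerization with fragmentation and end-to-end annealing, is given below. The equations are a standard two-moment closure, not the authors' full master equation, and all rate constants are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two-moment sketch of nucleated polymerization with fragmentation and
# end-to-end annealing. P = filament number concentration, M = polymer
# mass concentration, free monomer m = m_tot - M. Nucleation and
# fragmentation create filaments; annealing fuses two into one.
# Rate constants are illustrative, not fitted to any real protein.
k_n, k_plus, k_minus, k_a = 1e-4, 5e4, 2e-6, 1e4
n_c, m_tot = 2, 5e-6

def rhs(t, y):
    P, M = y
    m = max(m_tot - M, 0.0)
    dP = (k_n * m**n_c                        # primary nucleation
          + k_minus * (M - (2 * n_c - 1) * P) # fragmentation adds filaments
          - k_a * P**2)                       # annealing removes one per fusion
    dM = 2 * k_plus * m * P + n_c * k_n * m**n_c
    return [dP, dM]

sol = solve_ivp(rhs, (0.0, 3600.0), [0.0, 0.0], method="LSODA",
                dense_output=True)
for ti in (0, 900, 1800, 2700, 3600):
    P, M = sol.sol(ti)
    print(f"t={ti:5d} s  filaments={P:.3e}  mass fraction={M / m_tot:.3f}")
```

    Setting k_a to zero recovers a fragmentation-only model, which is the comparison the paper draws when discussing microscopic reversibility.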

  6. Expert system and process optimization techniques for real-time monitoring and control of plasma processes

    NASA Astrophysics Data System (ADS)

    Cheng, Jie; Qian, Zhaogang; Irani, Keki B.; Etemad, Hossein; Elta, Michael E.

    1991-03-01

    To meet the ever-increasing demands of the rapidly growing semiconductor manufacturing industry, it is critical to have a comprehensive methodology integrating techniques for process optimization, real-time monitoring, and adaptive process control. To this end, we have accomplished an integrated knowledge-based approach combining the latest expert system technology, machine learning methods, and traditional statistical process control (SPC) techniques. This knowledge-based approach is advantageous in that it makes it possible for the task of process optimization and adaptive control to be performed consistently and predictably. Furthermore, this approach can be used to construct high-level, qualitative descriptions of processes and thus make process behavior easy to monitor, predict, and control. Two software packages, RIST (Rule Induction and Statistical Testing) and KARSM (Knowledge Acquisition from Response Surface Methodology), have been developed and incorporated with two commercially available packages: G2 (a real-time expert system) and ULTRAMAX (a tool for sequential process optimization).

  7. Assessing Technology in the Absence of Proof: Trust Based on the Interplay of Others' Opinions and the Interaction Process.

    PubMed

    de Vries, Peter W; van den Berg, Stéphanie M; Midden, Cees

    2015-12-01

    The present research addresses the question of how trust in systems is formed when unequivocal information about system accuracy and reliability is absent, and focuses on the interaction of indirect information (others' evaluations) and direct (experiential) information stemming from the interaction process. Trust in decision-supporting technology, such as route planners, is important for satisfactory user interactions. Little is known, however, about trust formation in the absence of outcome feedback, that is, when users have not yet had the opportunity to verify actual outcomes. Three experiments manipulated others' evaluations ("endorsement cues") and various forms of experience-based information ("process feedback") in interactions with a route planner and measured resulting trust using rating scales and credits staked on the outcome. Subsequently, an overall analysis was conducted. Study 1 showed that the effectiveness of endorsement cues on trust is moderated by mere process feedback. In Study 2, consistent (i.e., nonrandom) process feedback overruled the effect of endorsement cues on trust, whereas inconsistent process feedback did not. Study 3 showed that although the effects of consistent and inconsistent process feedback largely remained regardless of face validity, process feedback with high face validity caused higher trust than process feedback with low face validity. An overall analysis confirmed these findings. Experiential information impacts trust even if outcome feedback is not available and, moreover, overrules indirect trust cues, depending on the nature of the former. Designing systems so that they allow novice users to make inferences about their inner workings may foster initial trust.

  8. Multi-Center Implementation of NPR 7123.1A: A Collaborative Effort

    NASA Technical Reports Server (NTRS)

    Hall, Phillip B.; McNelis, Nancy B.

    2011-01-01

    Collaboration efforts between MSFC and GRC Engineering Directorates to implement the NASA Systems Engineering (SE) Engine have expanded over the past year to include other NASA Centers. Sharing information on designing, developing, and deploying SE processes has sparked further interest based on the realization that there is relative consistency in implementing SE processes at the institutional level. This presentation will provide a status on the ongoing multi-center collaboration and provide insight into how these NPR 7123.1A SE-aligned directives are being implemented and managed to better support the needs of NASA programs and projects. NPR 7123.1A, NASA Systems Engineering Processes and Requirements, was released on March 26, 2007 to clearly articulate and establish the requirements on the implementing organization for performing, supporting, and evaluating SE activities. In early 2009, MSFC and GRC Engineering Directorates undertook a collaborative opportunity to share their research and work associated with developing, updating and revising their SE process policy to comply and align with NPR 7123.1A. The goal is to develop instructions, checklists, templates, and procedures for each of the 17 SE process requirements so that systems engineers will be in a position to define work that is process-driven. Greater efficiency and more effective technical management will be achieved due to consistency and repeatability of SE process implementation across and throughout each of the NASA centers. An added benefit will be to encourage NASA centers to pursue and collaborate on joint projects as a result of using common or similar processes, methods, tools, and techniques.

  9. Systems engineering and integration processes involved with manned mission operations

    NASA Technical Reports Server (NTRS)

    Kranz, Eugene F.; Kraft, Christopher C.

    1993-01-01

    This paper will discuss three mission operations functions that are illustrative of the key principles of operations SE&I and of the processes and products involved. The flight systems process was selected to illustrate the role of the systems product line in developing the depth and cross disciplinary skills needed for SE&I and providing the foundation for dialogue between participating elements. FDDD was selected to illustrate the need for a structured process to assure that SE&I provides complete and accurate results that consistently support program needs. The flight director's role in mission operations was selected to illustrate the complexity of the risk/gain tradeoffs involved in the development of the flight techniques and flight rules process as well as the absolute importance of the leadership role in developing the technical, operational, and political trades.

  10. Cosmic non-TEM radiation and synthetic feed array sensor system in ASIC mixed signal technology

    NASA Astrophysics Data System (ADS)

    Centureli, F.; Scotti, G.; Tommasino, P.; Trifiletti, A.; Romano, F.; Cimmino, R.; Saitto, A.

    2014-08-01

    The paper deals with the opportunity to introduce a "not strictly TEM waves" synthetic detection method (NTSM), consisting of three-axis digital beam processing (3ADBP), to enhance the performance of radio telescope and sensor systems. Current radio telescopes generally use the classic 3D "TEM waves" approximation detection method, which consists of a linear tomography process (single- or dual-axis beam-forming processing) neglecting the small z component. The synthetic feed-array three-axis sensor system is an innovative technique using synthetic detection of the generic "not strictly TEM" wave radiation coming from the cosmos, which also processes the longitudinal component of angular momentum. The simultaneous extraction from the radiation of both the linear and quadratic information components may reduce the complexity of reconstructing the Early Universe at the different requested scales. This next-order approximation detection of the observed cosmological processes may improve the efficacy of the statistical numerical models used to elaborate the acquired information. The present work focuses on detection of such waves at carrier frequencies in the bands ranging from LF to MMW. The work shows in further detail the new generation of online programmable and reconfigurable mixed-signal ASIC technology that made the innovative synthetic sensor possible. Furthermore, the paper shows the ability of such a technique to increase the performance of radio telescope array antennas.

  11. Non-equilibrium synergistic effects in atmospheric pressure plasmas.

    PubMed

    Guo, Heng; Zhang, Xiao-Ning; Chen, Jian; Li, He-Ping; Ostrikov, Kostya Ken

    2018-03-19

    Non-equilibrium is one of the important features of an atmospheric gas discharge plasma. It involves complicated physical-chemical processes and plays a key role in many practical plasma processing applications. In this report, a novel complete non-equilibrium model is developed to reveal the non-equilibrium synergistic effects in atmospheric-pressure low-temperature plasmas (AP-LTPs). It combines a thermal-chemical non-equilibrium fluid model for the quasi-neutral plasma region and a simplified sheath model for the electrode sheath region. The free-burning argon arc is selected as a model system because both the electrical-thermal-chemical equilibrium and non-equilibrium regions are involved simultaneously in this arc plasma system. The modeling results indicate for the first time that it is the strong and synergistic interactions among the mass, momentum and energy transfer processes that determine the self-consistent non-equilibrium characteristics of the AP-LTPs. An energy transfer process related to the non-uniform spatial distributions of the electron-to-heavy-particle temperature ratio has also been discovered for the first time. It has a significant influence on self-consistently predicting the transition region between the "hot" and "cold" equilibrium regions of an AP-LTP system. The modeling results should provide instructive guidance for predicting and possibly controlling the non-equilibrium particle-energy transport processes in various AP-LTPs in the future.

  12. Effect of local structures on crystallization in deeply undercooled metallic glass-forming liquids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, S. Q.; Li, M. Z., E-mail: maozhili@ruc.edu.cn; Wu, Z. W.

    2016-04-21

    The crystallization mechanism in deeply undercooled ZrCu metallic glass-forming liquids was investigated via molecular dynamics simulations. It was found that the crystallization process is mainly controlled by the growth of crystal nuclei formed by the BCC-like atomic clusters, consistent with experimental speculations. The crystallization rate is found to relate to the number of growing crystal nuclei in the crystallization process. The crystallization rate in systems with more crystal nuclei is significantly hindered by the larger surface fractions of crystal nuclei and their different crystalline orientations. It is further revealed that in the crystallization in deeply undercooled regions, the BCC-like crystal nuclei are formed from the inside of the precursors formed by the FCC-like atomic clusters, and grow at the expense of the precursors. Meanwhile, the precursors expand at the expense of the outside atomic clusters. This process is consistent with the so-called Ostwald step rule. The atomic structures of metallic glasses are found to have a significant impact on the subsequent crystallization process. In the Zr85Cu15 system, the stronger spatial correlation of Cu atoms could hinder the crystallization processes in deeply undercooled regions.

  13. Major technological innovations introduced in the large antennas of the Deep Space Network

    NASA Technical Reports Server (NTRS)

    Imbriale, W. A.

    2002-01-01

    The NASA Deep Space Network (DSN) is the largest and most sensitive scientific, telecommunications and radio navigation network in the world. Its principal responsibilities are to provide communications, tracking, and science services to most of the world's spacecraft that travel beyond low Earth orbit. The network consists of three Deep Space Communications Complexes. Each of the three complexes consists of multiple large antennas equipped with ultra sensitive receiving systems. A centralized Signal Processing Center (SPC) remotely controls the antennas, generates and transmits spacecraft commands, and receives and processes the spacecraft telemetry.

  14. Integration Processes Compared: Cortical Differences for Consistency Evaluation and Passive Comprehension in Local and Global Coherence.

    PubMed

    Egidi, Giovanna; Caramazza, Alfonso

    2016-10-01

    This research studies the neural systems underlying two integration processes that take place during natural discourse comprehension: consistency evaluation and passive comprehension. Evaluation was operationalized with a consistency judgment task and passive comprehension with a passive listening task. Using fMRI, the experiment examined the integration of incoming sentences with more recent, local context and with more distal, global context in these two tasks. The stimuli were stories in which we manipulated the consistency of the endings with the local context and the relevance of the global context for the integration of the endings. A whole-brain analysis revealed several differences between the two tasks. Two networks previously associated with semantic processing and attention orienting showed more activation during the judgment than the passive listening task. A network previously associated with episodic memory retrieval and construction of mental scenes showed greater activity when global context was relevant, but only during the judgment task. This suggests that evaluation, more than passive listening, triggers the reinstantiation of global context and the construction of a rich mental model for the story. Finally, a network previously linked to fluent updating of a knowledge base showed greater activity for locally consistent endings than inconsistent ones, but only during passive listening, suggesting a mode of comprehension that relies on a local scope approach to language processing. Taken together, these results show that consistency evaluation and passive comprehension weigh differently on distal and local information and are implemented, in part, by different brain networks.

  15. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation.

    PubMed

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin

    2017-04-01

    Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO Standard Business Process Model and Notation (BPMN) 2.X, a system-independent and interdisciplinarily accepted graphical process control notation is provided, allowing process analysis, while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example, concerning the real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of the business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life science specialized LESs, the reduction of numerous different interfaces between BPMSs and subsystems, and the simplification of complex process modeling. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated by the presented BPMS-LES approach.

  16. The automated system for technological process of spacecraft's waveguide paths soldering

    NASA Astrophysics Data System (ADS)

    Tynchenko, V. S.; Murygin, A. V.; Emilova, O. A.; Bocharov, A. N.; Laptenok, V. D.

    2016-11-01

    The paper solves the problem of automated process control of the soldering of spacecraft waveguide paths by means of induction heating. The peculiarities of the induction soldering process are analyzed, and the need to automate the information-control system is identified. The developed automated system controls the product heating process by varying the power supplied to the inductor, on the basis of information about the soldering-zone temperature, stabilizing the temperature in a narrow range above the melting point of the solder but below the melting point of the waveguide. Automating the soldering process in this way improves the quality of the waveguides and eliminates burn-throughs. The article shows a block diagram of the software system, consisting of five modules, and describes its main algorithm. The operation of the automated waveguide-path soldering system is also described, explaining the basic functions and limitations of the system. The developed software allows configuration of the measurement equipment, setting and changing of the soldering process parameters, and viewing of graphs of the temperatures recorded by the system. Results of experimental studies are shown that prove the high quality of the soldering process control and the system's applicability to automation tasks.
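    The temperature-band control idea can be sketched as a simple loop: a proportional controller varies inductor power so a first-order thermal model settles between the solder and waveguide melting points. The plant model, gains, and temperatures below are invented, not the paper's actual controller.

```python
# Temperature-band control sketch: a clamped proportional controller varies
# inductor power so a first-order thermal model settles above the solder
# melting point but below the waveguide melting point. All constants
# (materials, gains, plant model) are invented for illustration.
T_SOLDER_MELT = 220.0      # degC, hypothetical solder
T_WAVEGUIDE_MAX = 660.0    # degC, hypothetical waveguide limit
T_TARGET = 240.0           # setpoint inside the allowed band

def simulate(seconds=300.0, dt=0.1):
    T = 20.0                                 # start at ambient, inductor off
    kp, heat_gain, cooling = 40.0, 0.02, 0.05
    power = 0.0
    for _ in range(int(seconds / dt)):
        power = min(max(kp * (T_TARGET - T), 0.0), 1000.0)    # clamped P-control
        T += (heat_gain * power - cooling * (T - 20.0)) * dt  # thermal model
        if T >= T_WAVEGUIDE_MAX:                              # safety cut-out
            raise RuntimeError("overheat: risk of burn-through")
    return T, power

T_final, P_final = simulate()
assert T_SOLDER_MELT < T_final < T_WAVEGUIDE_MAX
print(f"settled at {T_final:.1f} degC with {P_final:.0f} W to the inductor")
```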

  17. Percolation on bipartite scale-free networks

    NASA Astrophysics Data System (ADS)

    Hooyberghs, H.; Van Schaeybroeck, B.; Indekeu, J. O.

    2010-08-01

    Recent studies introduced biased (degree-dependent) edge percolation as a model for failures in real-life systems. In this work, such a process is applied to networks consisting of two types of nodes with edges running only between nodes of unlike type. Such bipartite graphs appear in many social networks, for instance in affiliation networks and in sexual-contact networks, in which both types of nodes show the scale-free characteristic for the degree distribution. During the depreciation process, an edge between nodes with degrees k and q is retained with a probability proportional to (kq)^(-α), where α is positive so that links between hubs are more prone to failure. The removal process is studied analytically by introducing a generating-functions theory. We deduce exact self-consistent equations describing the system at a macroscopic level and discuss the percolation transition. Critical exponents are obtained by exploiting the Fortuin-Kasteleyn construction, which provides a link between our model and a limit of the Potts model.
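    A small simulation sketch of this biased retention rule is given below; it uses a plain random bipartite graph rather than a scale-free one, and the retention probability is clipped to [0, 1] with an assumed overall scale c, so it illustrates only the rule, not the paper's analytical treatment.

```python
import random
import networkx as nx
from networkx.algorithms import bipartite

# Sketch of biased (degree-dependent) bond percolation on a bipartite
# graph: an edge between nodes of degrees k and q is retained with
# probability ~ (k*q)**(-alpha), so hub-hub links fail preferentially.
# A plain random bipartite graph stands in for the scale-free networks
# studied in the paper; c is an assumed overall retention scale.
random.seed(1)
G = bipartite.random_graph(500, 500, 0.01, seed=1)

alpha, c = 0.5, 2.0
deg = dict(G.degree())

H = nx.Graph()
H.add_nodes_from(G)
for u, v in G.edges():
    p = min(1.0, c * (deg[u] * deg[v]) ** (-alpha))
    if random.random() < p:
        H.add_edge(u, v)

giant = max(nx.connected_components(H), key=len)
print(f"kept {H.number_of_edges()}/{G.number_of_edges()} edges; "
      f"giant component covers {len(giant) / G.number_of_nodes():.1%} of nodes")
```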

  18. Thermodynamic analysis of a thermal storage unit under the influence of nano-particles added to the phase change material and/or the working fluid

    NASA Astrophysics Data System (ADS)

    Abolghasemi, Mehran; Keshavarz, Ali; Mehrabian, Mozaffar Ali

    2012-11-01

    The thermal storage unit consists of two concentric cylinders where the working fluid flows through the internal cylinder and the annulus is filled with a phase change material. The system carries out a cyclic operation; each cycle consists of two processes. In the charging process the hot working fluid enters the internal cylinder and transfers heat to the phase change material. In the discharging process the cold working fluid enters the internal cylinder and absorbs heat from the phase change material. The differential equations governing the heat transfer between the two media are solved numerically. The numerical results are compared with the experimental results available in the literature. The performance of an energy storage unit is directly related to the thermal conductivity of nano-particles. The energy consumption of a residential unit whose energy is supplied by a thermal storage system can be reduced by 43 % when using nano-particles.

  19. Analytical redundancy and the design of robust failure detection systems

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.; Willsky, A. S.

    1984-01-01

    The Failure Detection and Identification (FDI) process is viewed as consisting of two stages: residual generation and decision making. It is argued that a robust FDI system can be achieved by designing a robust residual generation process. Analytical redundancy, the basis for residual generation, is characterized in terms of a parity space. Using the concept of parity relations, residuals can be generated in a number of ways, and the design of a robust residual generation process can be formulated as a minimax optimization problem. An example is included to illustrate this design methodology. Previously announced in STAR as N83-20653
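    The parity-space construction can be sketched numerically: with measurement model y = Hx, any matrix whose rows span the left null space of H maps fault-free measurements to near-zero residuals regardless of the state. The three-sensor example below is invented for illustration.

```python
import numpy as np

# Parity-space sketch: with y = H x (+ noise/fault), rows of the left null
# space of H give residuals r = V y that vanish for fault-free data no
# matter what the state x is. Three redundant sensors of one quantity.
np.random.seed(0)
H = np.array([[1.0], [1.0], [1.0]])

U, s, _ = np.linalg.svd(H)
rank = int((s > 1e-10).sum())
V_parity = U[:, rank:].T          # 2 x 3 parity matrix, V_parity @ H ~ 0

x = 5.0
y_ok = H[:, 0] * x + 0.01 * np.random.randn(3)
y_fault = y_ok.copy()
y_fault[1] += 1.0                 # bias fault on sensor 2

for name, y in (("fault-free", y_ok), ("sensor-2 fault", y_fault)):
    r = V_parity @ y
    print(f"{name}: residual norm = {np.linalg.norm(r):.3f}")
```

    A decision stage would then threshold the residual norm, which is where the robustness trade-offs the paper formulates as a minimax problem enter.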

  20. The effectiveness of removing precursors of chlorinated organic substances in pilot water treatment plant

    NASA Astrophysics Data System (ADS)

    Wolska, Małgorzata; Szerzyna, Sławomir; Machi, Justyna; Mołczan, Marek; Adamski, Wojciech; Wiśniewski, Jacek

    2017-11-01

    The presence of organic substances in water intended for consumption can be hazardous to human health due to the potential formation of disinfection by-products (TOX). The study was carried out in a pilot surface water treatment system consisting of coagulation, sedimentation, filtration, ozonation, adsorption and disinfection. Due to continuous operation of the system and the ability to adjust process parameters, it was possible not only to assess the effectiveness of the individual water treatment processes in removing TOX precursors, but also to identify the factors influencing the course of the unit processes.

  1. Simulation Environment Synchronizing Real Equipment for Manufacturing Cell

    NASA Astrophysics Data System (ADS)

    Inukai, Toshihiro; Hibino, Hironori; Fukuda, Yoshiro

    Recently, manufacturing industries face various problems, such as shorter product life cycles and more diversified customer needs. In this situation, it is very important to reduce the lead-time of manufacturing system construction. At the manufacturing system implementation stage, it is important to rapidly create and evaluate facility control programs for a manufacturing cell, such as ladder programs for programmable logic controllers (PLCs). However, methods to evaluate facility control programs by mixing and synchronizing real equipment with virtual factory models on computers, before the manufacturing systems are implemented, have not been developed. This difficulty is caused by the complexity of manufacturing systems composed of a great variety of equipment, and it has prevented precise and rapid support of the manufacturing engineering process. In this paper, a manufacturing engineering environment (MEE) to support manufacturing engineering processes using simulation technologies is proposed. MEE consists of a manufacturing cell simulation environment (MCSE) and a distributed simulation environment (DSE). MCSE, which consists of a manufacturing cell simulator and a soft-wiring system, is described in detail. MCSE realizes the creation and evaluation of facility control programs using virtual factory models on computers before manufacturing systems are implemented.

  2. Automation in the Space Station module power management and distribution Breadboard

    NASA Technical Reports Server (NTRS)

    Walls, Bryan; Lollar, Louis F.

    1990-01-01

    The Space Station Module Power Management and Distribution (SSM/PMAD) Breadboard, located at NASA's Marshall Space Flight Center (MSFC) in Huntsville, Alabama, models the power distribution within a Space Station Freedom Habitation or Laboratory module. Originally designed for 20 kHz ac power, the system is now being converted to high voltage dc power with power levels on a par with those expected for a space station module. In addition to the power distribution hardware, the system includes computer control through a hierarchy of processes. The lowest level process consists of fast, simple (from a computing standpoint) switchgear, capable of quickly safing the system. The next level consists of local load center processors called Lowest Level Processors (LLP's). These LLP's execute load scheduling, perform redundant switching, and shed loads which use more than scheduled power. The level above the LLP's contains a Communication and Algorithmic Controller (CAC) which coordinates communications with the highest level. Finally, at this highest level, three cooperating Artificial Intelligence (AI) systems manage load prioritization, load scheduling, load shedding, and fault recovery and management. The system provides an excellent venue for developing and examining advanced automation techniques. The current system and the plans for its future are examined.
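    A toy sketch of the LLP behavior described above, shedding loads that exceed scheduled power using a greedy priority rule, is shown below; the load names, powers, and priorities are invented, and the real breadboard's scheduling logic is certainly richer.

```python
# Sketch of priority-based load shedding: when demand exceeds the scheduled
# power budget, keep the most critical loads that fit and shed the rest.
# Loads, powers, and priorities are invented for illustration.
loads = [  # (name, power in W, priority: higher = more critical)
    ("life_support", 1200, 10),
    ("experiment_A",  800,  4),
    ("lighting",      300,  6),
    ("experiment_B",  500,  3),
]
budget = 2000

def shed(loads, budget):
    kept, used = [], 0
    for name, power, prio in sorted(loads, key=lambda l: -l[2]):
        if used + power <= budget:
            kept.append(name)
            used += power
        else:
            print(f"shedding {name} ({power} W, priority {prio})")
    return kept, used

kept, used = shed(loads, budget)
print("active loads:", kept, f"({used} W of {budget} W)")
```

    Note the greedy rule may keep a smaller low-priority load after shedding a larger one, which is one reason scheduling in the real system is handled by the higher-level AI layer.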

  3. Total Quality Management of Information System for Quality Assessment of Pesantren Using Fuzzy-SERVQUAL

    NASA Astrophysics Data System (ADS)

    Faizah, Arbiati; Syafei, Wahyul Amien; Isnanto, R. Rizal

    2018-02-01

    This research proposed a model combining the Total Quality Management (TQM) approach and the fuzzy Service Quality (SERVQUAL) method to assess service quality. TQM was implemented as quality management oriented to customer satisfaction by involving all stakeholders. The SERVQUAL model was used to measure service quality based on five dimensions: tangibles, reliability, responsiveness, assurance, and empathy. Fuzzy set theory was used to accommodate the subjectivity and ambiguity of quality assessment. Input data consisted of indicator data and quality assessment aspects. The input data were then processed into service quality assessment questionnaires for the Pesantren, using the fuzzy method to obtain service quality scores. This process consisted of the following steps: inputting dimension and questionnaire data into the database system; filling in the questionnaire through the system; and having the system calculate fuzzification, defuzzification, the gap between the quality expected and received by service receivers, and the rating of each dimension, which shows the quality refinement priority. The rating of each quality dimension was then displayed on the system dashboard to let users see the information. From the system that was built, it could be seen that the tangibles dimension had the highest gap, -0.399; thus it needs to be prioritized and should receive evaluation and refinement action soon.
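    A compact sketch of the Fuzzy-SERVQUAL gap computation follows: Likert answers are mapped to triangular fuzzy numbers, averaged, defuzzified by the centroid, and the gap is perception minus expectation per dimension. The fuzzy scale and sample responses are assumptions, not the study's data (the sample is arranged so that tangibles shows the largest gap, as the abstract reports).

```python
import numpy as np

# Fuzzy-SERVQUAL gap sketch: Likert responses become triangular fuzzy
# numbers (a, b, c), are averaged per dimension, defuzzified by the
# centroid (a + b + c) / 3, and gap = perception - expectation.
# The fuzzy scale and sample answers are invented for illustration.
TFN = {1: (0, 0, 2.5), 2: (0, 2.5, 5), 3: (2.5, 5, 7.5),
       4: (5, 7.5, 10), 5: (7.5, 10, 10)}

def fuzzy_score(likert_answers):
    a, b, c = np.array([TFN[x] for x in likert_answers], dtype=float).mean(axis=0)
    return (a + b + c) / 3.0      # centroid defuzzification

dimensions = {                     # (perceived, expected) sample answers
    "tangibles":      ([3, 2, 3, 2], [5, 4, 5, 5]),
    "reliability":    ([4, 4, 3, 4], [5, 4, 4, 5]),
    "responsiveness": ([4, 3, 4, 4], [4, 4, 5, 4]),
}

for dim, (perceived, expected) in dimensions.items():
    gap = fuzzy_score(perceived) - fuzzy_score(expected)
    print(f"{dim:15s} gap = {gap:+.3f}")
```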

  4. Payload/GSE/data system interface: Users guide for the VPF (Vertical Processing Facility)

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The payload/GSE/data system interface users guide for the Vertical Processing Facility is presented. The purpose of the document is threefold. First, the simulated Payload and Ground Support Equipment (GSE) Data System Interface, also known as the payload T-0 (T-Zero) System, is described. This simulated system is located with the Cargo Integration Test Equipment (CITE) in the Vertical Processing Facility (VPF) in the KSC Industrial Area. The actual Payload T-0 System consists of the Orbiter, Mobile Launch Platforms (MLPs), and Launch Complex (LC) 39A and B; this is referred to as the Pad Payload T-0 System (refer to KSC-DL-116 for the Pad Payload T-0 System description). Second, information is provided to the payload customer on the differences between this simulated system and the actual system. Third, a reference guide to the VPF Payload T-0 System is provided for both KSC and payload customer personnel.

  5. Methods of information geometry in computational system biology (consistency between chemical and biological evolution).

    PubMed

    Astakhov, Vadim

    2009-01-01

    Interest in simulation of large-scale metabolic networks, species development, and the genesis of various diseases requires new simulation techniques to accommodate the high complexity of realistic biological networks. Information geometry and topological formalisms are proposed to analyze information processes. We analyze the complexity of large-scale biological networks as well as the transition of system functionality due to modifications in system architecture, system environment, and system components. A dynamic core model is developed; the term dynamic core is used to define a set of causally related network functions. Delocalization of the dynamic core model provides a mathematical formalism to analyze the migration of specific functions in biosystems that undergo structural transitions induced by the environment; the term delocalization is used to describe these processes of migration. We constructed a holographic model with self-poietic dynamic cores which preserves functional properties under those transitions. Topological constraints such as Ricci flow and Pfaff dimension were found for statistical manifolds which represent biological networks. These constraints can provide insight into the processes of degeneration and recovery which take place in large-scale networks. We suggest that therapies which are able to effectively implement the estimated constraints will successfully adjust biological systems and recover altered functionality. Also, we mathematically formulate the hypothesis that there is a direct consistency between chemical and biological evolution: any set of causal relations within a biological network has its dual reimplementation in the chemistry of the system environment.

  6. Parallel processing approach to transform-based image coding

    NASA Astrophysics Data System (ADS)

    Normile, James O.; Wright, Dan; Chu, Ken; Yeh, Chia L.

    1991-06-01

    This paper describes a flexible parallel processing architecture designed for use in real-time video processing. The system consists of floating-point DSP processors connected to each other via fast serial links; each processor has access to a globally shared memory. A multiple-bus architecture in combination with a dual-ported memory allows communication with a host control processor. The system has been applied to prototyping of video compression and decompression algorithms. The decomposition of transform-based algorithms for decompression into a form suitable for parallel processing is described. A technique for automatic load balancing among the processors is developed and discussed, and results are presented with image statistics and data rates. Finally, techniques for accelerating the system throughput are analyzed and results from the application of one such modification are described.

  7. An Information System Development Method Combining Business Process Modeling with Executable Modeling and its Evaluation by Prototyping

    NASA Astrophysics Data System (ADS)

    Okawa, Tsutomu; Kaminishi, Tsukasa; Hirabayashi, Syuichi; Suzuki, Ryo; Mitsui, Hiroyasu; Koizumi, Hisao

    Business activities in the enterprise are so closely related to the information system that they are difficult to carry out without it. A system design technique that carefully considers the business process and enables quick system development is therefore requested; in addition, demands on development cost are more severe than before. To cope with this situation, the modeling technology named BPM (Business Process Management/Modeling) is drawing attention and becoming important as a key technology. BPM is a technology to model business activities as business processes and visualize them to improve business efficiency. However, a general methodology to develop an information system using the analysis results of BPM does not exist, and few development cases have been reported. This paper proposes an information system development method combining business process modeling with executable modeling. We describe a guideline to support development consistency and efficiency, and a framework enabling development of the information system from the model. We have prototyped an information system with the proposed method, and our experience has shown that the methodology is valuable.

  8. Development of a compact and cost effective multi-input digital signal processing system

    NASA Astrophysics Data System (ADS)

    Darvish-Molla, Sahar; Chin, Kenrick; Prestwich, William V.; Byun, Soo Hyun

    2018-01-01

    A prototype digital signal processing system (DSP) was developed using a microcontroller interfaced with a 12-bit sampling ADC, which offers a considerably inexpensive solution for processing multiple detectors with high throughput. After digitization of the incoming pulses, a simple algorithm was employed for pulse-height analysis in order to maximize the output counting rate. Moreover, an algorithm aiming at real-time pulse pile-up deconvolution was implemented. The system was tested using a NaI(Tl) detector, in comparison with a traditional analogue system and a commercial digital system, for a variety of count rates. The performance of the prototype system was consistently superior to the analogue and commercial digital systems up to an input count rate of 61 kcps, while at higher input rates it was slightly inferior to the commercial digital system but still superior to the analogue system. Considering overall cost, size and flexibility, this custom-made multi-input digital signal processing system (MMI-DSP) was the best reliable choice for 2D microdosimetric data collection, or for any measurement in which simultaneous multi-input data collection is required.
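    A minimal sketch of digital pulse-height analysis with pile-up rejection is shown below: pulses are detected above a threshold, and any pulse with a neighbour inside the resolving time is discarded. The waveform, sampling rate, and resolving time are synthetic; the MMI-DSP's real-time deconvolution algorithm is not reproduced here.

```python
import numpy as np
from scipy.signal import find_peaks

# Sketch of pulse-height analysis with pile-up rejection: detect pulses
# above a threshold, then discard any pulse whose neighbour falls inside
# the resolving time. Waveform and constants are synthetic.
fs = 10e6                          # 10 MS/s sampling (hypothetical)
resolve = int(10e-6 * fs)          # 10 us resolving time, in samples

def pulse(n, t0, amp, tau=20.0):
    t = np.arange(n) - t0
    return np.where(t >= 0, amp * (t / tau) * np.exp(1 - t / tau), 0.0)

n = 2000
wave = pulse(n, 200, 1.0) + pulse(n, 900, 0.6) + pulse(n, 960, 0.8)  # last two pile up
wave += 0.01 * np.random.default_rng(0).standard_normal(n)

peaks, _ = find_peaks(wave, height=0.1, distance=10)
accepted = [p for p in peaks
            if all(abs(p - q) > resolve for q in peaks if q != p)]
print("detected pulse positions:", list(peaks))
print("kept after pile-up rejection:", accepted,
      "heights:", [round(float(wave[p]), 2) for p in accepted])
```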

  9. Proposal of digital interface for the system of the air conditioner's remote control: analysis of the system of feedback.

    PubMed

    da Silva de Queiroz Pierre, Raisa; Kawada, Tarô Arthur Tavares; Fontes, André Guimarães

    2012-01-01

    Objective: to develop a proposal for a digital interface for the remote control system that functions as a support system during manipulation of air conditioners, suited to users in general, based on ergonomic parameters, with the aim of reducing the problems faced by the user and improving the process. Method: 20 people answered a questionnaire at both qualitative and quantitative levels. The linear method consists of a sequence of steps in which the input of each step depends on the output of the previous one, although the steps are independent. The feedback process, when necessary, must occur within each step separately.

  10. Obstacles encountered in the development of the low vision enhancement system.

    PubMed

    Massof, R W; Rickman, D L

    1992-01-01

    The Johns Hopkins Wilmer Eye Institute and the NASA Stennis Space Center are collaborating on the development of a new high technology low vision aid called the Low Vision Enhancement System (LVES). The LVES consists of a binocular head-mounted video display system, video cameras mounted on the head-mounted display, and real-time video image processing in a system package that is battery powered and portable. Through a phased development approach, several generations of the LVES can be made available to the patient in a timely fashion. This paper describes the LVES project with major emphasis on technical problems encountered or anticipated during the development process.

  11. The MIDAS processor. [Multivariate Interactive Digital Analysis System for multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.; Gordon, M. F.; Mclaughlin, R. H.; Marshall, R. E.

    1975-01-01

    The MIDAS (Multivariate Interactive Digital Analysis System) processor is a high-speed processor designed to process multispectral scanner data (from Landsat, EOS, aircraft, etc.) quickly and cost-effectively to meet the requirements of users of remote sensor data, especially from very large areas. MIDAS consists of a fast multipipeline preprocessor and classifier, an interactive color display and color printer, and a medium scale computer system for analysis and control. The system is designed to process data having as many as 16 spectral bands per picture element at rates of 200,000 picture elements per second into as many as 17 classes using a maximum likelihood decision rule.
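    The maximum-likelihood decision rule mentioned here can be sketched with per-class Gaussian statistics: each pixel is assigned to the class with the highest log-likelihood. The class means and covariances below are synthetic stand-ins for training statistics.

```python
import numpy as np

# Gaussian maximum-likelihood decision rule for multispectral pixels:
# assign each pixel to the class with the highest log-likelihood.
# Class statistics (4 spectral bands) are synthetic stand-ins.
def log_likelihood(x, mean, cov):
    d = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ np.linalg.inv(cov) @ d + logdet)  # constant dropped

classes = {
    "water":      (np.array([30.0, 25.0, 15.0, 5.0]),  np.eye(4) * 16.0),
    "vegetation": (np.array([40.0, 35.0, 30.0, 90.0]), np.eye(4) * 25.0),
}

pixel = np.array([38.0, 33.0, 28.0, 80.0])
scores = {name: log_likelihood(pixel, m, C) for name, (m, C) in classes.items()}
print("assigned class:", max(scores, key=scores.get))
```

    In a hardware pipeline like MIDAS, the inverse covariances and log-determinants would be precomputed per class so each pixel costs only a handful of multiply-accumulates per band.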

  12. Classification of hydrogeologic areas and hydrogeologic flow systems in the basin and range physiographic province, southwestern United States

    USGS Publications Warehouse

    Anning, David W.; Konieczki, Alice D.

    2005-01-01

    The hydrogeology of the Basin and Range Physiographic Province in parts of Arizona, California, New Mexico, Utah, and most of Nevada was classified at basin and larger scales to facilitate information transfer and to provide a synthesis of results from many previous hydrologic investigations. A conceptual model for the spatial hierarchy of the hydrogeology was developed for the Basin and Range Physiographic Province and consists, in order of increasing spatial scale, of hydrogeologic components, hydrogeologic areas, hydrogeologic flow systems, and hydrogeologic regions. This hierarchy formed a framework for hydrogeologic classification. Hydrogeologic areas consist of coincident ground-water and surface-water basins and were delineated on the basis of existing sets of basin boundaries that were used in past investigations by State and Federal government agencies. Within the study area, 344 hydrogeologic areas were identified and delineated. This set of basins not only provides a framework for the classification developed in this report, but also has value for regional and subregional purposes of inventory, study, analysis, and planning throughout the Basin and Range Physiographic Province. The fact that nearly all of the province is delineated by the hydrogeologic areas makes this set well suited to support regional-scale investigations. Hydrogeologic areas are conceptualized as a control volume consisting of three hydrogeologic components: the soils and streams, basin fill, and consolidated rocks. The soils and streams hydrogeologic component consists of all surface-water bodies and soils extending to the bottom of the plant root zone. The basin-fill hydrogeologic component consists of unconsolidated and semiconsolidated sediment deposited in the structural basin. The consolidated-rocks hydrogeologic component consists of the crystalline and sedimentary rocks that form the mountain blocks and basement rock of the structural basin. Hydrogeologic areas were classified into 19 groups through a cluster analysis of 8 characteristics of each area's hydrologic system. Six characteristics represented the inflows and outflows of water through the soils and streams, basin fill, and consolidated rocks, and can be used to determine the hydrogeologic area's position in a hydrogeologic flow system. Source-, link-, and sink-type hydrogeologic areas have outflow but not inflow, inflow and outflow, and inflow but not outflow, respectively, through one or more of the three hydrogeologic components. Isolated hydrogeologic areas have no inflow or outflow through any of the three hydrogeologic components. The remaining two characteristics are indexes that represent natural recharge and discharge processes and anthropogenic recharge and discharge processes occurring in the hydrogeologic area. Of the 19 groups of hydrogeologic areas, 1 consisted of predominantly isolated-type hydrogeologic areas, 7 consisted of source-type hydrogeologic areas, 9 consisted of link-type hydrogeologic areas, and 2 consisted of sink-type hydrogeologic areas. Groups comprising the source-, link-, and sink-type hydrogeologic areas can be distinguished between each other on the basis of the hydrogeologic component(s) through which interbasin flow occurs, as well as typical values for the two indexes. 
Conceptual models of the hydrologic systems of a representative hydrogeologic area for each group were developed to help distinguish groups and to synthesize the variation in hydrogeologic systems in the Basin and Range Physiographic Province. Hydrogeologic flow systems consist of either a single isolated hydrogeologic area or a series of multiple hydrogeologic areas that are hydraulically connected through interbasin flows. A total of 54 hydrogeologic flow systems were identified and classified into 9 groups. One group consisted of single isolated hydrogeologic areas. The remaining eight groups consisted of multiple hydrogeologic areas and were distinguished o

  13. Elixir - how to handle 2 trillion pixels

    NASA Astrophysics Data System (ADS)

    Magnier, Eugene A.; Cuillandre, Jean-Charles

    2002-12-01

    The Elixir system at CFHT provides automatic data quality assurance and calibration for the wide-field mosaic imager camera CFH12K. Elixir consists of a variety of tools, including: a real-time analysis suite which runs at the telescope to provide quick feedback to the observers; a detailed analysis of the calibration data; and an automated pipeline for processing data to be distributed to observers. To date, 2.4 × 10^12 night-time sky pixels from CFH12K have been processed by the Elixir system.

  14. System for verifiable CT radiation dose optimization based on image quality. part II. process control system.

    PubMed

    Larson, David B; Malarik, Remo J; Hall, Seth M; Podberesky, Daniel J

    2013-10-01

    To evaluate the effect of an automated computed tomography (CT) radiation dose optimization and process control system on the consistency of estimated image noise and size-specific dose estimates (SSDEs) of radiation in CT examinations of the chest, abdomen, and pelvis. This quality improvement project was determined not to constitute human subject research. An automated system was developed to analyze each examination immediately after completion, and to report individual axial-image-level and study-level summary data for patient size, image noise, and SSDE. The system acquired data for 4 months beginning October 1, 2011. Protocol changes were made by using parameters recommended by the prediction application, and 3 months of additional data were acquired. Preimplementation and postimplementation mean image noise and SSDE were compared by using unpaired t tests and F tests. Common-cause variation was differentiated from special-cause variation by using a statistical process control individual chart. A total of 817 CT examinations, 490 acquired before and 327 acquired after the initial protocol changes, were included in the study. Mean patient age and water-equivalent diameter were 12.0 years and 23.0 cm, respectively. The difference between actual and target noise increased from -1.4 to 0.3 HU (P < .01) and the standard deviation decreased from 3.9 to 1.6 HU (P < .01). Mean SSDE decreased from 11.9 to 7.5 mGy, a 37% reduction (P < .01). The process control chart identified several special causes of variation. Implementation of an automated CT radiation dose optimization system led to verifiable simultaneous decrease in image noise variation and SSDE. The automated nature of the system provides the opportunity for consistent CT radiation dose optimization on a broad scale.
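    The individuals (XmR) control chart used here to separate common-cause from special-cause variation can be sketched in a few lines: the limits are the mean plus or minus 2.66 times the mean moving range. The SSDE values below are made up; only the chart arithmetic follows the standard method.

```python
import numpy as np

# Individuals (XmR) control chart sketch: limits are the mean +/- 2.66
# times the mean moving range; points outside signal special causes.
# The SSDE values are made up for illustration.
ssde = np.array([11.8, 12.1, 11.5, 12.4, 11.9, 12.0, 16.2, 11.7, 12.2, 11.6])

center = ssde.mean()
mr_bar = np.abs(np.diff(ssde)).mean()        # average moving range
ucl, lcl = center + 2.66 * mr_bar, center - 2.66 * mr_bar

print(f"CL={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}")
for i, x in enumerate(ssde):
    if not (lcl <= x <= ucl):
        print(f"exam {i}: SSDE {x} mGy -> special-cause signal")
```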

  15. Integrated circuits for volumetric ultrasound imaging with 2-D CMUT arrays.

    PubMed

    Bhuyan, Anshuman; Choe, Jung Woo; Lee, Byung Chul; Wygant, Ira O; Nikoozadeh, Amin; Oralkan, Ömer; Khuri-Yakub, Butrus T

    2013-12-01

    Real-time volumetric ultrasound imaging systems require transmit and receive circuitry to generate ultrasound beams and process received echo signals. The complexity of building such a system is high due to the requirement that the front-end electronics be very close to the transducer. A large number of elements also need to be interfaced to the back-end system, and image processing of a large dataset could affect the imaging volume rate. In this work, we present a 3-D imaging system using capacitive micromachined ultrasonic transducer (CMUT) technology that addresses many of the challenges in building such a system. We demonstrate two approaches to integrating the transducer and the front-end electronics. The transducer is a 5-MHz CMUT array with an 8 mm × 8 mm aperture size. The aperture consists of 1024 elements (32 × 32) with an element pitch of 250 μm. An integrated circuit (IC) consists of a transmit beamformer and receive circuitry to improve the noise performance of the overall system. The assembly was interfaced with an FPGA and a back-end system (comprising a data acquisition system and a PC). The FPGA provided the digital I/O signals for the IC, and the back-end system was used to process the received RF echo data (from the IC) and reconstruct the volume image using a phased-array imaging approach. Imaging experiments were performed using wire and spring targets, a ventricle model, and a human prostate. Real-time volumetric images were captured at 5 volumes per second and are presented in this paper.
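    As a sketch of the transmit-beamforming arithmetic such a front-end performs, the code below computes focusing delays for a 32 × 32, 250 μm pitch aperture: elements farther from the focal point fire earlier so all wavefronts arrive together. The focal point and sound speed are assumptions; this is the textbook delay formula, not the IC's implementation.

```python
import numpy as np

# Focusing-delay sketch for a 32 x 32, 250 um pitch phased array: elements
# farther from the focal point fire earlier so all wavefronts arrive at
# the focus simultaneously. Focal point and sound speed are assumed.
c = 1540.0                            # m/s, speed of sound in tissue
pitch = 250e-6
xs = (np.arange(32) - 15.5) * pitch   # element coordinates, centred aperture
X, Y = np.meshgrid(xs, xs)

focus = np.array([2e-3, 0.0, 20e-3])  # (x, y, z) focal point in metres
dist = np.sqrt((X - focus[0])**2 + (Y - focus[1])**2 + focus[2]**2)
delays = (dist.max() - dist) / c      # farthest element fires at t = 0

print(f"{delays.size} elements, max transmit delay {delays.max() * 1e9:.0f} ns")
```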

  16. High resolution imaging of a subsonic projectile using automated mirrors with large aperture

    NASA Astrophysics Data System (ADS)

    Tateno, Y.; Ishii, M.; Oku, H.

    2017-02-01

    Visual tracking of high-speed projectiles is required for studying the aerodynamics around such objects. One solution to this problem is a tracking method based on the so-called 1 ms Auto Pan-Tilt (1ms-APT) system that we proposed in previous work, which consists of rotational mirrors and a high-speed image processing system. However, the images obtained with that system did not have high enough resolution for detailed measurement of the projectiles because of the size of the mirrors. In this study, we propose a new system consisting of enlarged mirrors for tracking high-speed projectiles so as to achieve higher-resolution imaging, and we confirmed the effectiveness of the system via an experiment in which a projectile flying at subsonic speed was tracked.

  17. Athena: Providing Insight into the History of the Universe

    NASA Technical Reports Server (NTRS)

    Murphy, Gloria A.

    2010-01-01

    The American Institute of Aeronautics and Astronautics has provided a Request for Proposal which calls for a manned mission to a Near-Earth Object. It is the goal of Team COLBERT to respond to this request by providing a reusable system that can serve as a solid stepping stone for future manned trips to Mars and beyond. Although Team COLBERT consists only of students in Aerospace Engineering, achieving this feat requires the team to employ Systems Engineering. Tools and processes from Systems Engineering provide quantitative and semi-quantitative means for making design decisions and evaluating items such as budgets and schedules. This paper will provide an in-depth look at some of the Systems Engineering processes employed and will step through the design process of a Human Asteroid Exploration System.

  18. Development of an LSI for Tactile Sensor Systems on the Whole-Body of Robots

    NASA Astrophysics Data System (ADS)

    Muroyama, Masanori; Makihata, Mitsutoshi; Nakano, Yoshihiro; Matsuzaki, Sakae; Yamada, Hitoshi; Yamaguchi, Ui; Nakayama, Takahiro; Nonomura, Yutaka; Fujiyoshi, Motohiro; Tanaka, Shuji; Esashi, Masayoshi

    We have developed a network-type tactile sensor system, which realizes high-density tactile sensing on the whole body of nursing and communication robots. The system consists of three kinds of nodes: host, relay and sensor nodes. The roles of the sensor node are to sense forces, encode the sensing data, and transmit the encoded data on serial channels via interrupt handling. The relay nodes and the host handle the large volume of encoded sensing data from the sensor nodes. A sensor node consists of a capacitive MEMS force sensor and a signal processing/transmission LSI. In this paper, details of an LSI for the sensor node are described. We designed experimental sensor node LSI chips in a commercial 0.18 µm standard CMOS process. The LSIs were supplied at wafer level for MEMS post-processing. The LSI chip area is 2.4 mm × 2.4 mm, which includes logic, CF converter and memory circuits. The maximum clock frequency of the chip with a large capacitive load is 10 MHz. Measured power consumption at a 10 MHz clock is 2.23 mW. Experimental results indicate that the size, response time, sensor sensitivity and power consumption are all sufficient for practical tactile sensor systems.

  19. Strategies for converting to a DBMS environment

    NASA Technical Reports Server (NTRS)

    Durban, D. M.

    1984-01-01

    The conversion to data base management system (DBMS) processing techniques consists of three different strategies, one for each of the major stages in the development process. Each strategy was chosen for its approach to bringing about a smooth, evolutionary transition from one mode of operation to the next. The initial strategy, for the indoctrination stage, consisted of: (1) providing maximum access to current administrative data as soon as possible; (2) selecting and developing small prototype systems; (3) establishing a user information center as a central focal point for user training and assistance; and (4) developing a training program for programmers, management and ad hoc users in DBMS application and utilization. Security, the role of the data dictionary, data base tuning and capacity planning, and the development of a change of attitude in an automated office are issues meriting consideration.

  20. Development of Rene' 41 honeycomb structure as an integral cryogenic tankage/fuselage concept for future space transportation systems

    NASA Technical Reports Server (NTRS)

    Shideler, J. J.; Swegle, A. R.; Fields, R. A.

    1982-01-01

    The status of the structural development of an integral cryogenic-tankage/hot-fuselage concept for future space transportation systems (STS) is discussed. The concept consists of a honeycomb sandwich structure which serves the combined functions of containment of cryogenic fuel, support of vehicle loads, and thermal protection from an entry heating environment. The inner face sheet is exposed to a cryogenic (LH2) temperature of -423 F during boost; and the outer face sheet, which is slotted to reduce thermal stress, is exposed to a maximum temperature of 1400 F during a high altitude, gliding entry. A fabrication process for a Rene' 41 honeycomb sandwich panel with a core density less than 1 percent was developed which is consistent with desirable heat treatment processes for high strength.

  1. NASA's Earth Science Data Systems

    NASA Technical Reports Server (NTRS)

    Ramapriyan, H. K.

    2015-01-01

    NASA's Earth Science Data Systems (ESDS) Program has evolved over the last two decades, and currently has several core and community components. Core components provide the basic operational capabilities to process, archive, manage and distribute data from NASA missions. Community components provide a path for peer-reviewed research in Earth Science Informatics to feed into the evolution of the core components. The Earth Observing System Data and Information System (EOSDIS) is a core component consisting of twelve Distributed Active Archive Centers (DAACs) and eight Science Investigator-led Processing Systems spread across the U.S. The presentation covers how the ESDS Program continues to evolve and benefits from as well as contributes to advances in Earth Science Informatics.

  2. GET: A generic electronics system for TPCs and nuclear physics instrumentation

    NASA Astrophysics Data System (ADS)

    Pollacco, E. C.; Grinyer, G. F.; Abu-Nimeh, F.; Ahn, T.; Anvar, S.; Arokiaraj, A.; Ayyad, Y.; Baba, H.; Babo, M.; Baron, P.; Bazin, D.; Beceiro-Novo, S.; Belkhiria, C.; Blaizot, M.; Blank, B.; Bradt, J.; Cardella, G.; Carpenter, L.; Ceruti, S.; De Filippo, E.; Delagnes, E.; De Luca, S.; De Witte, H.; Druillole, F.; Duclos, B.; Favela, F.; Fritsch, A.; Giovinazzo, J.; Gueye, C.; Isobe, T.; Hellmuth, P.; Huss, C.; Lachacinski, B.; Laffoley, A. T.; Lebertre, G.; Legeard, L.; Lynch, W. G.; Marchi, T.; Martina, L.; Maugeais, C.; Mittig, W.; Nalpas, L.; Pagano, E. V.; Pancin, J.; Poleshchuk, O.; Pedroza, J. L.; Pibernat, J.; Primault, S.; Raabe, R.; Raine, B.; Rebii, A.; Renaud, M.; Roger, T.; Roussel-Chomaz, P.; Russotto, P.; Saccà, G.; Saillant, F.; Sizun, P.; Suzuki, D.; Swartz, J. A.; Tizon, A.; Usher, N.; Wittwer, G.; Yang, J. C.

    2018-04-01

    General Electronics for TPCs (GET) is a generic, reconfigurable and comprehensive electronics and data-acquisition system for nuclear physics instrumentation of up to 33792 channels. The system consists of a custom-designed ASIC for signal processing, front-end cards that each house 4 ASIC chips and digitize the data in parallel through 12-bit ADCs, concentration boards to read and process the digital data from up to 16 ASICs, a 3-level trigger and master clock module to trigger the system and synchronize the data, as well as all of the associated firmware, communication and data-acquisition software. An overview of the system, including its specifications and measured performance, is presented.

  3. Electrophysiological evidence of sublexical phonological access in character processing by L2 Chinese learners of L1 alphabetic scripts.

    PubMed

    Yum, Yen Na; Law, Sam-Po; Mo, Kwan Nok; Lau, Dustin; Su, I-Fan; Shum, Mark S K

    2016-04-01

    While Chinese character reading relies more on addressed phonology relative to alphabetic scripts, skilled Chinese readers also access sublexical phonological units during recognition of phonograms. However, sublexical orthography-to-phonology mapping has not been found among beginning second language (L2) Chinese learners. This study investigated character reading in more advanced Chinese learners whose native writing system is alphabetic. Phonological regularity and consistency were examined in behavioral responses and event-related potentials (ERPs) in lexical decision and delayed naming tasks. Participants were 18 native English speakers who acquired written Chinese after age 5 years and reached grade 4 Chinese reading level. Behaviorally, regular characters were named more accurately than irregular characters, but consistency had no effect. Similar to native Chinese readers, regularity effects emerged early with regular characters eliciting a greater N170 than irregular characters. Regular characters also elicited greater frontal P200 and smaller N400 than irregular characters in phonograms of low consistency. Additionally, regular-consistent characters and irregular-inconsistent characters had more negative amplitudes than irregular-consistent characters in the N400 and LPC time windows. The overall pattern of brain activities revealed distinct regularity and consistency effects in both tasks. Although orthographic neighbors are activated in character processing of L2 Chinese readers, the timing of their impact seems delayed compared with native Chinese readers. The time courses of regularity and consistency effects across ERP components suggest both assimilation and accommodation of the reading network in learning to read a typologically distinct second orthographic system.

  4. Improving operating room turnover time: a systems based approach.

    PubMed

    Bhatt, Ankeet S; Carlson, Grant W; Deckers, Peter J

    2014-12-01

    Operating room (OR) turnover time (TT) has a broad and significant impact on hospital administrators, providers, staff and patients. Our objective was to identify current problems in TT management and implement a consistent, reproducible process to reduce average TT and process variability. Initial observations of TT were made to document the existing process at a 511-bed, 24-OR academic medical center. Three control groups, including one consisting of Orthopedic and Vascular Surgery, were used to limit potential confounders such as case acuity/duration and equipment needs. A redesigned process based on observed issues, focusing on a horizontally structured, systems-based approach, has three major interventions: developing consistent criteria for OR readiness, utilizing parallel processing for patient and room readiness, and enhancing perioperative communication. Process redesign was implemented in Orthopedics and Vascular Surgery. Comparisons of the mean and standard deviation of TT were made using an independent 2-tailed t-test. Using all surgical specialties as controls (n = 237), mean TT (hh:mm:ss) was reduced by 0:20:48 (95% CI, 0:10:46-0:30:50), from 0:44:23 to 0:23:35, a 46.9% reduction. Standard deviation of TT was reduced by 0:10:32, from 0:16:24 to 0:05:52, and the frequency of TT ≥ 30 min was reduced from 72.5% to 11.7%; P < 0.001 for each. Using the Vascular and Orthopedic surgical specialties as controls (n = 13), mean TT was reduced by 0:15:16 (95% CI, 0:07:18-0:23:14), from 0:38:51 to 0:23:35, a 39.4% reduction. Standard deviation of TT was reduced by 0:08:47, from 0:14:39 to 0:05:52, and the frequency of TT ≥ 30 min was reduced from 69.2% to 11.7%; P < 0.001 for each. Reductions in mean TT present major efficiency, quality improvement, and cost-reduction opportunities. An OR redesign process focusing on parallel processing and enhanced communication resulted in a greater than 35% reduction in TT. A systems-based focus should drive OR TT design.
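
    For readers who want to reproduce this style of comparison, the sketch below runs the same kind of independent two-tailed t-test on turnover times; the numbers are made-up placeholders, not the study's data.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical turnover times (minutes) before and after a redesign;
    # the real study compared n = 237 cases across all specialties.
    before = np.array([44.4, 52.1, 38.7, 61.0, 47.5, 39.9])
    after  = np.array([23.4, 25.1, 22.0, 24.8, 21.7, 23.9])

    t, p = stats.ttest_ind(before, after)      # independent 2-tailed t-test
    print(f"mean reduction: {before.mean() - after.mean():.1f} min, p = {p:.4f}")
    print(f"SD before/after: {before.std(ddof=1):.1f} / {after.std(ddof=1):.1f} min")
    ```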

  5. Towards an integrated optofluidic system for highly sensitive detection of antibiotics in seawater incorporating bimodal waveguide photonic biosensors and complex, active microfluidics

    NASA Astrophysics Data System (ADS)

    Szydzik, C.; Gavela, A. F.; Roccisano, J.; Herranz de Andrés, S.; Mitchell, A.; Lechuga, L. M.

    2016-12-01

    We present recent results on the realisation and demonstration of an integrated optofluidic lab-on-a-chip measurement system. The system consists of an integrated on-chip automated microfluidic fluid handling subsystem, coupled with bimodal nano-interferometer waveguide technology, and is applied in the context of detection of antibiotics in seawater. The bimodal waveguide (BMWG) is a highly sensitive label-free biosensor. Integration of complex microfluidic systems with bimodal waveguide technology enables on-chip sample handling and fluid processing capabilities and allows for significant automation of experimental processes. The on-chip fluid-handling subsystem is realised through the integration of pneumatically actuated elastomer pumps and valves, enabling high temporal resolution sample and reagent delivery and facilitating multiplexed detection processes.

  6. Engineering of mechanical manufacturing from the cradle to cradle

    NASA Astrophysics Data System (ADS)

    Peralta, M. E.; Aguayo, F.; Lama, J. R.

    2012-04-01

    The sustainability of manufacturing processes lies in industrial planning and productive activity. Industrial plants are characterized by the management of resources (inputs and outputs) and by processing and conversion operations, which are usually organized as a linear system. Good planning will optimize manufacturing and promote the quality of the industrial system. Cradle to Cradle is a new paradigm for engineering and sustainable manufacturing that integrates projects (industrial parks, manufacturing plants, systems and products) in a framework consistent with the environment, adapted to society and technology, and economically viable. To carry this out, we implement the paradigm in the MGE2 (Genomic Model of Eco-innovation and Eco-design), a methodology for designing and developing products and manufacturing systems with a cradle-to-cradle approach.

  7. Digital ultrasonics signal processing: Flaw data post processing use and description

    NASA Technical Reports Server (NTRS)

    Buel, V. E.

    1981-01-01

    A modular system composed of two sets of tasks which interpret the flaw data and allow compensation of the data for transducer characteristics is described. The hardware configuration consists of two main units. A DEC LSI-11 processor, running under the RT-11 single-job, version 2C-02 operating system, controls the scanner hardware and the ultrasonic unit. A DEC PDP-11/45 processor, also running under the RT-11, version 2C-02, operating system, stores, processes and displays the flaw data. The software developed, the Ultrasonics Evaluation System, is divided into two categories: transducer characterization and flaw classification. Each category is divided further into two functional tasks: a data acquisition task and a postprocessing task. The flaw characterization task collects data, compresses it, and writes it to a disk file. The data is then processed by the flaw classification postprocessing task. The use and operation of the flaw data postprocessor are described.

  8. Measuring the impact of final demand on global production system based on Markov process

    NASA Astrophysics Data System (ADS)

    Xing, Lizhi; Guan, Jun; Wu, Shan

    2018-07-01

    The input-output table is a comprehensive and detailed description of a national economic system, consisting of supply and demand information among the various industrial sectors. Complex network theory, a method for measuring the structure of complex systems, can depict the structural properties of social and economic systems and reveal the complicated relationships between the inner hierarchies and the external macroeconomic functions. This paper measured the degree of globalization of industrial sectors on the global value chain. Firstly, it constructed inter-country input-output network models to reproduce the topological structure of the global economic system. Secondly, it regarded the propagation of intermediate goods on the global value chain as a Markov process and introduced counting first passage betweenness to quantify the added processing amount when global final demand stimulates this production system. Thirdly, it analyzed the features of globalization at both the global and country-sector levels.
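
    The abstract leaves the propagation model implicit; a common way to picture it is final demand diffusing stage by stage through the input-output coefficient matrix (the Leontief series), sketched below with a toy three-sector matrix. The numbers are illustrative, and "counting first passage betweenness" itself is the paper's own construction, not reproduced here.

    ```python
    import numpy as np

    # Toy 3-sector input-output coefficient matrix A: A[i, j] is the input
    # from sector i needed per unit of sector j's output (invented numbers).
    A = np.array([[0.10, 0.30, 0.05],
                  [0.20, 0.10, 0.25],
                  [0.05, 0.15, 0.10]])
    demand = np.array([100.0, 50.0, 80.0])     # final demand vector

    # Total output via the Leontief series x = (I - A)^-1 d = sum_k A^k d,
    # i.e. final demand propagating step by step through the production system.
    x, term = np.zeros(3), demand.copy()
    for _ in range(50):                        # truncated series; converges here
        x += term
        term = A @ term

    # Series result agrees with the direct linear solve.
    print(x, np.linalg.solve(np.eye(3) - A, demand))
    ```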

  9. Eight microprocessor-based instrument data systems in the Galileo Orbiter spacecraft

    NASA Technical Reports Server (NTRS)

    Barry, R. C.

    1980-01-01

    Each instrument data system consists of a microprocessor, 3K bytes of read-only memory and 3K bytes of random-access memory. It interfaces with the spacecraft data bus through an isolated user interface with a direct memory access bus adaptor, and/or through parallel data lines from instrument devices such as registers, buffers, analog-to-digital converters, multiplexers, and solid state sensors. These data systems support the spacecraft hardware and software communication protocol, decode and process instrument commands, generate continuous instrument operating modes, control the instrument mechanisms, and acquire, process, format, and output instrument science data.

  10. NASA Automatic Information Security Handbook

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This handbook details the Automated Information Security (AIS) management process for NASA. Automated information system security is becoming an increasingly important issue for all NASA managers; rapid advancements in computer and network technologies and the demanding nature of space exploration and space research have made NASA increasingly dependent on automated systems to store, process, and transmit vast amounts of mission support information, hence the need for AIS systems and management. This handbook provides the consistent policies, procedures, and guidance to assure that an aggressive and effective AIS program is developed, implemented, and sustained at all NASA organizations and NASA support contractors.

  11. Control of microstructure in soldered, brazed, welded, plated, cast or vapor deposited manufactured components

    DOEpatents

    Ripley, Edward B.; Hallman, Russell L.

    2015-11-10

    Disclosed are methods and systems for controlling the microstructure of a soldered, brazed, welded, plated, cast, or vapor deposited manufactured component. The systems typically use relatively weak magnetic fields of either constant or varying flux to affect material properties within a manufactured component, typically without modifying the alloy, changing the chemical composition of materials, or altering the time, temperature, or transformation parameters of a manufacturing process. Such systems and processes may be used even with components consisting only of materials that are conventionally characterized as being uninfluenced by magnetic forces.

  12. A methodology for automatic intensity-modulated radiation treatment planning for lung cancer

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodong; Li, Xiaoqiang; Quan, Enzhuo M.; Pan, Xiaoning; Li, Yupeng

    2011-07-01

    In intensity-modulated radiotherapy (IMRT), the quality of the treatment plan, which is highly dependent upon the treatment planner's level of experience, greatly affects the potential benefits of the radiotherapy (RT). Furthermore, the planning process is complicated and requires a great deal of iteration, and is often the most time-consuming aspect of the RT process. In this paper, we describe a methodology to automate the IMRT planning process in lung cancer cases, the goal being to improve the quality and consistency of treatment planning. This methodology (1) automatically sets beam angles based on a beam angle automation algorithm, (2) judiciously designs the planning structures, which were shown to be effective for all the lung cancer cases we studied, and (3) automatically adjusts the objectives of the objective function based on a parameter automation algorithm. We compared treatment plans created in this system (mdaccAutoPlan) based on the overall methodology with plans from a clinical trial of IMRT for lung cancer run at our institution. The 'autoplans' were consistently better, or no worse, than the plans produced by experienced medical dosimetrists in terms of tumor coverage and normal tissue sparing. We conclude that the mdaccAutoPlan system can potentially improve the quality and consistency of treatment planning for lung cancer.
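
    The parameter automation algorithm is not spelled out in the abstract; a generic outer-loop heuristic of the kind often used for this purpose, raising the penalty weight of any violated objective between optimization runs, is sketched below. The objective names, dose goals and step factor are invented for illustration, not taken from the paper.

    ```python
    def adjust_weights(objectives, achieved, weights, step=1.5):
        """One outer-loop pass of a generic weight-adjustment heuristic.

        objectives : dict name -> dose goal (Gy)
        achieved   : dict name -> dose achieved by the last optimization (Gy)
        weights    : dict name -> current penalty weight
        """
        for name, goal in objectives.items():
            if achieved[name] > goal:          # objective violated: penalize harder
                weights[name] *= step
        return weights

    weights = adjust_weights({"cord_max": 45.0, "lung_mean": 20.0},
                             {"cord_max": 47.2, "lung_mean": 18.1},
                             {"cord_max": 1.0, "lung_mean": 1.0})
    print(weights)   # {'cord_max': 1.5, 'lung_mean': 1.0}
    ```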

  13. Technical and Economical Aspects of Current Thermal Barrier Coating Systems for Gas Turbine Engines by Thermal Spray and EBPVD: A Review

    NASA Astrophysics Data System (ADS)

    Feuerstein, Albert; Knapp, James; Taylor, Thomas; Ashary, Adil; Bolcavage, Ann; Hitchman, Neil

    2008-06-01

    The most advanced thermal barrier coating (TBC) systems for aircraft engine and power generation hot section components consist of electron beam physical vapor deposition (EBPVD) applied yttria-stabilized zirconia and platinum modified diffusion aluminide bond coating. Thermally sprayed ceramic and MCrAlY bond coatings, however, are still used extensively for combustors and power generation blades and vanes. This article highlights the key features of plasma spray and HVOF, diffusion aluminizing, and EBPVD coating processes. The coating characteristics of thermally sprayed MCrAlY bond coat as well as low density and dense vertically cracked (DVC) Zircoat TBC are described. Essential features of a typical EBPVD TBC coating system, consisting of a diffusion aluminide and a columnar TBC, are also presented. The major coating cost elements such as material, equipment and processing are explained for the different technologies, with a performance and cost comparison given for selected examples.

  14. Intercommunications in Real Time, Redundant, Distributed Computer System

    NASA Technical Reports Server (NTRS)

    Zanger, H.

    1980-01-01

    An investigation into the applicability of fiber optic communication techniques to real time avionic control systems, in particular the total automatic flight control system used for the VSTOL aircraft is presented. The system consists of spatially distributed microprocessors. The overall control function is partitioned to yield a unidirectional data flow between the processing elements (PE). System reliability is enhanced by the use of triple redundancy. Some general overall system specifications are listed here to provide the necessary background for the requirements of the communications system.

  15. Project Planning and Reporting

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The Project Planning Analysis and Reporting System (PPARS) is an automated aid for monitoring and scheduling activities within a project. The PPARS system consists of the PPARS Batch Program, five preprocessor programs, and two postprocessor programs. The PPARS Batch Program is a full CPM (Critical Path Method) scheduling program with resource capabilities; it can process networks with up to 10,000 activities.
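
    As a pointer to what the CPM core of such a program computes, here is a minimal forward/backward pass over a four-activity toy network; resource handling and 10,000-activity scale are of course beyond this sketch.

    ```python
    # Minimal critical-path method pass over an activity network
    # (illustrative toy data; not PPARS itself).
    durations = {"A": 3, "B": 2, "C": 4, "D": 1}
    preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

    # Forward pass: earliest start/finish times.
    ES, EF = {}, {}
    for act in ["A", "B", "C", "D"]:               # topological order
        ES[act] = max((EF[p] for p in preds[act]), default=0)
        EF[act] = ES[act] + durations[act]

    # Backward pass: latest start/finish times.
    project_end = max(EF.values())
    succs = {a: [b for b in preds if a in preds[b]] for a in preds}
    LF, LS = {}, {}
    for act in ["D", "C", "B", "A"]:               # reverse topological order
        LF[act] = min((LS[s] for s in succs[act]), default=project_end)
        LS[act] = LF[act] - durations[act]

    # Activities with zero float form the critical path.
    print([a for a in preds if ES[a] == LS[a]])    # ['A', 'C', 'D']
    ```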

  16. Real-Life Challenges in Using Augmentative and Alternative Communication by Persons with Amyotrophic Lateral Sclerosis

    ERIC Educational Resources Information Center

    Ray, Jayanti

    2015-01-01

    Given the linguistic and cognitive demands of communication, adult Augmentative and Alternative Communication (AAC) users with acquired communication disorders may have difficulty using AAC systems consistently and effectively in "real-life" situations. The process of recommending AAC systems and strategies is an area of exploration,…

  17. A Study of the Effectiveness of Sensory Integration Therapy on Neuro-Physiological Development

    ERIC Educational Resources Information Center

    Reynolds, Christopher; Reynolds, Kathleen Sheena

    2010-01-01

    Background: Sensory integration theory proposes that because there is plasticity within the central nervous system (the brain is moldable) and because the brain consists of systems that are hierarchically organised, it is possible to stimulate and improve neuro-physiological processing and integration and thereby increase learning capacity.…

  18. On board processor development for NASA's spaceborne imaging radar with system-on-chip technology

    NASA Technical Reports Server (NTRS)

    Fang, Wai-Chi

    2004-01-01

    This paper reports the preliminary results of a study of an on-board spaceborne SAR processor. It consists of a processing requirement analysis, functional specifications, and an implementation with system-on-chip technology. Finally, a minimum version of this on-board processor, designed for performance evaluation and partial demonstration, is illustrated.

  19. Standardized Review and Approval Process for High-Cost Medication Use Promotes Value-Based Care in a Large Academic Medical System.

    PubMed

    Durvasula, Raghu; Kelly, Janet; Schleyer, Anneliese; Anawalt, Bradley D; Somani, Shabir; Dellit, Timothy H

    2018-04-01

    As healthcare costs rise and reimbursements decrease, healthcare organization leadership and clinical providers must collaborate to provide high-value healthcare. Medications are a key driver of the increasing cost of healthcare, largely as a result of the proliferation of expensive specialty drugs, including biologic agents. Such medications contribute significantly to the inpatient diagnosis-related group payment system, often with minimal or unproved benefit over less-expensive therapies. To describe a systematic review process to reduce non-evidence-based inpatient use of high-cost medications across a large multihospital academic health system. We created a Pharmacy & Therapeutics subcommittee consisting of clinicians, pharmacists, and an ethics representative. This committee developed a standardized process for a timely review (<48 hours) and approval of high-cost medications based on their clinical effectiveness, safety, and appropriateness. The engagement of clinical experts in the development of the consensus-based guidelines for the use of specific medications facilitated the clinicians' acceptance of the review process. Over a 2-year period, a total of 85 patient-specific requests underwent formal review. All reviews were conducted within 48 hours. This review process has reduced the non-evidence-based use of specialty medications and has resulted in a pharmacy savings of $491,000 in fiscal year 2016, with almost 80% of the savings occurring in the last 2 quarters, because our process has matured. The creation of a collaborative review process to ensure consistent, evidence-based utilization of high-cost medications provides value-based care, while minimizing unnecessary practice variation and reducing the cost of inpatient care.

  20. Social-aware Event Handling within the FallRisk Project.

    PubMed

    De Backere, Femke; Van den Bergh, Jan; Coppers, Sven; Elprama, Shirley; Nelis, Jelle; Verstichel, Stijn; Jacobs, An; Coninx, Karin; Ongenae, Femke; De Turck, Filip

    2017-01-09

    With the rise of the Internet of Things, wearables and smartphones are moving to the foreground. Ambient Assisted Living solutions are, for example, created to facilitate ageing in place. One example of such systems is the fall detection system. Currently, there exists a wide variety of fall detection systems using different methodologies and technologies. However, these systems often do not take into account the fall handling process, which starts after a fall is identified, or they reduce this process to sending a notification. The FallRisk system delivers an accurate analysis of incidents occurring in the home of the older adult using several sensors and smart devices. Moreover, the input from these devices can be used to create a social-aware event handling process, which leads to assisting the older adult as soon as possible and in the best possible way. The FallRisk system consists of several components, located in different places. When an incident is identified by the FallRisk system, the event handling process is followed to assess the fall incident and select the most appropriate caregiver, based on input from the caregivers' smartphones. In this process, availability and location are automatically taken into account. The event handling process was evaluated during a decision-tree workshop to verify whether current-day practices reflect the requirements of all the stakeholders. Other knowledge uncovered during this workshop can be taken into account to further improve the process. FallRisk offers a way to detect fall incidents accurately and uses context information to assign the incident to the most appropriate caregiver. This way, the consequences of the fall are minimized and help is on location as fast as possible. It could be concluded that the current guidelines on fall handling reflect the needs of the stakeholders. However, current technological evolutions, such as the uptake of wearables and smartphones, enable the improvement of these guidelines, for example by automatically ordering the caregivers based on their location and availability.
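
    A minimal sketch of the kind of social-aware caregiver selection the paper describes: available caregivers are preferred and, among those, the nearest is notified first. The field names, geometry and scoring are assumptions for illustration, not the FallRisk implementation.

    ```python
    def rank_caregivers(caregivers, incident_xy):
        """Order caregivers for notification: available first, nearest first."""
        def distance(c):
            dx, dy = c["x"] - incident_xy[0], c["y"] - incident_xy[1]
            return (dx * dx + dy * dy) ** 0.5
        # Sort key: unavailable caregivers (True) sort after available ones (False).
        return sorted(caregivers, key=lambda c: (not c["available"], distance(c)))

    caregivers = [
        {"name": "nurse_a", "available": True,  "x": 2.0, "y": 1.0},
        {"name": "nurse_b", "available": False, "x": 0.1, "y": 0.2},
        {"name": "family",  "available": True,  "x": 5.0, "y": 4.0},
    ]
    print([c["name"] for c in rank_caregivers(caregivers, (0.0, 0.0))])
    # ['nurse_a', 'family', 'nurse_b']
    ```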

  1. 40 CFR 60.5415 - How do I demonstrate continuous compliance with the standards for my gas well affected facility...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... plants? 60.5415 Section 60.5415 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR..., and unavoidable failure of air pollution control equipment, process equipment, or a process to operate... control systems were kept in operation if at all possible, consistent with safety and good air pollution...

  2. A Systems Approach to the Design and Operation of Effective Marketing Programs in Community Colleges.

    ERIC Educational Resources Information Center

    Scigliano, John A.

    1983-01-01

    Presents a research-based marketing model consisting of an environmental scanning process, a series of marketing audits, and an information-processing scheme. Views the essential elements of college marketing as information flow; high-level, long-term commitment; diverse strategies; innovation; and a broad view of marketing. Includes a marketing…

  3. Hydrological hysteresis and its value for assessing process consistency in catchment conceptual models

    Treesearch

    O. Fovet; L. Ruiz; M. Hrachowitz; M. Faucheux; C. Gascuel-Odoux

    2015-01-01

    While most hydrological models reproduce the general flow dynamics, they frequently fail to adequately mimic system-internal processes. In particular, the relationship between storage and discharge, which often follows annual hysteretic patterns in shallow hard-rock aquifers, is rarely considered in modelling studies. One main reason is that catchment storage is...

  4. What the Instructors and Administrators of Russia's Higher Educational Institutions Think about the Bologna Process

    ERIC Educational Resources Information Center

    Aref'ev, A. L.

    2009-01-01

    The increasing integration of national educational systems, in particular in Europe, is giving rise to conflict among traditional forms of instruction, curricula, pedagogical norms and values, and firmly established standards of education. The center of this conflict now, which was catalyzed by Russia's joining the Bologna process, consists of the…

  5. Study on intelligent processing system of man-machine interactive garment frame model

    NASA Astrophysics Data System (ADS)

    Chen, Shuwang; Yin, Xiaowei; Chang, Ruijiang; Pan, Peiyun; Wang, Xuedi; Shi, Shuze; Wei, Zhongqian

    2018-05-01

    A man-machine interactive garment frame model intelligent processing system is studied in this paper. The system consists of several sensor devices, a voice processing module, mechanical moving parts, and a centralized data acquisition device. The sensor devices collect information on the environmental changes caused by a body approaching the garment frame model; the data acquisition device collects the information registered by the sensor devices; the voice processing module performs speaker-independent speech recognition to achieve human-machine interaction; and the mechanical moving parts make the corresponding mechanical responses to the information processed by the data acquisition device. The sensor devices have a one-way connection to the data acquisition device, the data acquisition device has a two-way connection with the voice processing module, and the data acquisition device has a one-way connection to the mechanical moving parts. The intelligent processing system can judge whether it needs to interact with the customer, realizing man-machine interaction in place of the current rigid frame model.

  6. Interplanetary dust in the transmission electron microscope - Diverse materials from the early solar system

    NASA Technical Reports Server (NTRS)

    Fraundorf, P.

    1981-01-01

    An analytical electron microscope study of dispersed interplanetary dust aggregates collected in the earth's stratosphere shows that, in spite of their similarities, the aggregates exhibit significant differences in composition, internal morphology, and mineralogy. Of 11 chondritic particles examined, two consist mostly of a noncrystalline chondritic material with an atomic S/Fe ratio equal to or greater than 2 in places, one consists of submicron metal and reduced silicate 'microchondrules' and sulfide grains embedded in a carbonaceous matrix, and another consists of submicron magnetic-decorated unequilibrated silicate and sulfide grains with thick low-Z coatings. Although the particles are unmetamorphosed by criteria commonly applied for chondritic meteorites, the presence of reduced chemistries and the ubiquity of mafic, instead of hydrated, silicates confirm that they are not simply C1 or C2 chondrite matrix material. The observations indicate that portions of some particles have not been significantly altered by thermal or radiation processes since their assembly, and that the particles probably contain fine debris from diverse processes in the early solar system.

  7. Microvax-based data management and reduction system for the regional planetary image facilities

    NASA Technical Reports Server (NTRS)

    Arvidson, R.; Guinness, E.; Slavney, S.; Weiss, B.

    1987-01-01

    Presented is a progress report for the Regional Planetary Image Facilities (RPIF) prototype image data management and reduction system being jointly implemented by Washington University and the USGS, Flagstaff. The system will consist of a MicroVAX with a high capacity (approx 300 megabyte) disk drive, a compact disk player, an image display buffer, a videodisk player, USGS image processing software, and SYSTEM 1032 - a commercial relational database management package. The USGS, Flagstaff, will transfer their image processing software including radiometric and geometric calibration routines, to the MicroVAX environment. Washington University will have primary responsibility for developing the database management aspects of the system and for integrating the various aspects into a working system.

  8. Testing of Environmentally Preferable Aluminum Pretreatments and Coating Systems for Use on Space Shuttle Solid Rocket Boosters (SRB)

    NASA Technical Reports Server (NTRS)

    Clayton, C.; Raley, R.; Zook, L.

    2001-01-01

    The solid rocket booster (SRB) has historically used a chromate conversion coating prior to protective finish application. After conversion coating, an organic paint system consisting of a chromated epoxy primer and polyurethane topcoat is applied. An overall systems approach was selected to reduce waste generation from the coatings application and removal processes. While the most obvious waste reduction opportunity involved elimination of the chromate conversion coating, several other coating system configurations were explored in an attempt to reduce the total waste. This paper will briefly discuss the use of a systems view to reduce waste generation from the coating process and present the results of the qualification testing of nonchromated aluminum pretreatments and alternate coating systems configurations.

  9. Re-Engineering of the Hubble Space Telescope (HST) to Reduce Operational Costs

    NASA Technical Reports Server (NTRS)

    Garvis, Michael; Dougherty, Andrew; Whittier, Wallace

    1996-01-01

    Satellite telemetry processing onboard the Hubble Space Telescope (HST) is carried out using dedicated software and hardware. The current ground system is expensive to operate and maintain. The mandate to reduce satellite ground system operations and maintenance costs by the year 2000 led NASA to upgrade the command and control systems in order to improve data processing capabilities, reduce the experience levels required of operators, and increase system standardization. As a result, a command and control system product development team was formed to redesign and develop the HST ground system. The command and control system ground system development consists of six elements. The results of the prototyping phase carried out for three of these elements are presented: the front-end processor, the middleware, and the graphical user interface.

  10. RPP-PRT-58489, Revision 1, One Systems Consistent Safety Analysis Methodologies Report. 24590-WTP-RPT-MGT-15-014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gupta, Mukesh; Niemi, Belinda; Paik, Ingle

    2015-09-02

    In 2012, One System Nuclear Safety performed a comparison of the safety bases for the Tank Farms Operations Contractor (TOC) and Hanford Tank Waste Treatment and Immobilization Plant (WTP) (RPP-RPT-53222 / 24590-WTP-RPT-MGT-12-018, “One System Report of Comparative Evaluation of Safety Bases for Hanford Waste Treatment and Immobilization Plant Project and Tank Operations Contract”), and identified 25 recommendations that required further evaluation for consensus disposition. This report documents ten NSSC approved consistent methodologies and guides and the results of the additional evaluation process using a new set of evaluation criteria developed for the evaluation of the new methodologies.

  11. A dual-systems perspective on addiction: contributions from neuroimaging and cognitive training.

    PubMed

    McClure, Samuel M; Bickel, Warren K

    2014-10-01

    Dual-systems theories explain lapses in self-control in terms of a conflict between automatic and deliberative modes of behavioral control. Numerous studies have now tested whether the brain areas that control behavior are organized in a manner consistent with dual-systems models. Brain regions directly associated with the mesolimbic dopamine system, the nucleus accumbens and ventromedial prefrontal cortex in particular, capture some of the features assumed by automatic processing. Regions in the lateral prefrontal cortex are more closely linked to deliberative processing and the exertion of self-control in the suppression of impulses. While identifying these regions crudely supports dual-systems theories, important modifications to what constitutes automatic and deliberative behavioral control are also suggested. Experiments have identified various means by which automatic processes may be sculpted. Additional work decomposes deliberative processes into component functions such as generalized working memory, reappraisal of emotional stimuli, and prospection. The importance of deconstructing dual-systems models into specific cognitive processes is clear for understanding and treating addiction. We discuss intervention possibilities suggested by recent research, and focus in particular on cognitive training approaches to bolster deliberative control processes that may aid quit attempts. © 2014 New York Academy of Sciences.

  12. A dual-systems perspective on addiction: contributions from neuroimaging and cognitive training

    PubMed Central

    McClure, Samuel M.; Bickel, Warren K.

    2014-01-01

    Dual-systems theories explain lapses in self-control in terms of a conflict between automatic and deliberative modes of behavioral control. Numerous studies have now tested whether the brain areas that control behavior are organized in a manner consistent with dual-systems models. Brain regions directly associated with the mesolimbic dopamine system, the nucleus accumbens (NAcc) and ventromedial prefrontal cortex (vmPFC) in particular, capture some of the features assumed by automatic processing. Regions in the lateral prefrontal cortex (lPFC) are more closely linked to deliberative processing and the exertion of self-control in the suppression of impulses. While identifying these regions crudely supports dual-system theories, important modifications to what constitutes automatic and deliberative behavioral control are also suggested. Experiments have identified various means by which automatic processes may be sculpted. Additional work decomposes deliberative processes into component functions such as generalized working memory, reappraisal of emotional stimuli, and prospection. The importance of deconstructing dual-systems models into specific cognitive processes is clear for understanding and treating addiction. We discuss intervention possibilities suggested by recent research, and focus in particular on cognitive training approaches to bolster deliberative control processes that may aid quit attempts. PMID:25336389

  13. Use of computer systems and process information for blast furnace operations at U. S. Steel, Gary Works

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sherman, G.J.; Zmierski, M.L.

    1994-09-01

    US Steel Iron Producing Div. consists of four operating blast furnaces ranging in process control capabilities from 1950's and 1960's era hardware to state of the art technology. The oldest control system consists of a large number of panels containing numerous relays, indicating lights, selector switches, push buttons, analog controllers, strip chart recorders and annunciators. In contrast, the state of the art control system utilizes remote I/O, two sets of redundant PLC's, redundant charge director computer, redundant distributed control system, high resolution video-graphic display system and supervisory computer for real-time data acquisition. Process data are collected and archived on two DEC VAX computers, one for No. 13 blast furnace and the other for the three south end furnaces. Historical trending, data analysis and reporting are available to iron producing personnel through terminals and PC's connected directly to the systems, dial-up modems and various network configurations. These two machines are part of the iron producing network which allows them to pass and receive information from each other as well as numerous other sources throughout the division. This configuration allows personnel to access most pertinent furnace information from a single source. The basic objective of the control systems is to charge raw materials to the top of the furnace at aim weights and sequence, while maintaining blast conditions at the bottom of the furnace at required temperature, pressure and composition. Control changes by the operators are primarily supervisory based on review of system generated plots and tables.

  14. System enhancements of Mesoscale Analysis and Space Sensor (MASS) computer system

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.; Karitani, S.

    1985-01-01

    The interactive information processing for the Mesoscale Analysis and Space Sensor (MASS) program is reported. The development and implementation of new spaceborne remote sensing technology to observe and measure atmospheric processes is described. The space measurements and conventional observational data are processed together to gain an improved understanding of the mesoscale structure and dynamical evolution of the atmosphere relative to cloud development and precipitation processes. A research computer system consisting of three primary computers (HP-1000F, Perkin-Elmer 3250, and Harris/6) was developed, which provides a wide range of capabilities for processing and displaying interactively large volumes of remote sensing data. The development of a MASS data base management and analysis system on the HP-1000F computer, and the extension of these capabilities by integration with the Perkin-Elmer and Harris/6 computers using MSFC's Apple III microcomputer workstations, are described. The objectives are to design hardware enhancements for computer integration and to provide data conversion and transfer between machines.

  15. Electrochemical noise measurements of sustained microbially influenced pitting corrosion in a laboratory flow loop system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Y.; Frank, J.R.; St. Martin, E.J.

    Because of the chaotic nature of the corrosion process and the complexity of the electrochemical noise signals that are generated, there is no generally accepted method of measuring and interpreting these signals that allows the consistent detection and identification of sustained localized pitting (SLP) as compared to general corrosion. The authors have reexamined electrochemical noise analysis (ENA) of localized corrosion using different hardware, signal collection, and signal processing designs than those used in conventional ENA techniques. The new data acquisition system was designed to identify and monitor the progress of SLP by analyzing the power spectral density (PSD) of the trend of the corrosion current noise level (CNL) and potential noise level (PNL). Each CNL and PNL data point was calculated from the root-mean-square value of the ac components of current and potential fluctuation signals, which were measured simultaneously during a short time period. The PSD analysis results consistently demonstrated that the trends of PNL and CNL contain information that can be used to differentiate between SLP and general corrosion mechanisms. The degree of linear slope in the low-frequency portion of the PSD analysis was correlated with the SLP process. Laboratory metal coupons as well as commercial corrosion probes were tested to ensure the reproducibility and consistency of the results. The on-line monitoring capability of this new ENA method was evaluated in a bench-scale flow-loop system, which simulated microbially influenced corrosion (MIC) activity. The conditions in the test flow-loop system were controlled by the addition of microbes and different substrates to favor accelerated corrosion. The ENA results demonstrated that this in-situ corrosion monitoring system could effectively identify SLP corrosion associated with MIC, compared to a more uniform general corrosion mechanism. A reduction in SLP activity could be clearly detected by the ENA monitoring system when a corrosion inhibitor was added into one of the test loops during the corrosion testing.

  16. Electrochemical noise measurements of sustained microbially influenced pitting corrosion in a laboratory flow loop system.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Y. J.

    Because of the chaotic nature of the corrosion process and the complexity of the electrochemical noise signals that are generated, there is no generally accepted method of measuring and interpreting these signals that allows the consistent detection and identification of sustained localized pitting (SLP) as compared to general corrosion. We have reexamined electrochemical noise analysis (ENA) of localized corrosion using different hardware, signal collection, and signal processing designs than those used in conventional ENA techniques. The new data acquisition system was designed to identify and monitor the progress of SLP by analyzing the power spectral density (PSD) of the trend of the corrosion current noise level (CNL) and potential noise level (PNL). Each CNL and PNL data point was calculated from the root-mean-square value of the ac components of current and potential fluctuation signals, which were measured simultaneously during a short time period. The PSD analysis results consistently demonstrated that the trends of PNL and CNL contain information that can be used to differentiate between SLP and general corrosion mechanisms. The degree of linear slope in the low-frequency portion of the PSD analysis was correlated with the SLP process. Laboratory metal coupons as well as commercial corrosion probes were tested to ensure the reproducibility and consistency of the results. The on-line monitoring capability of this new ENA method was evaluated in a bench-scale flow-loop system, which simulated microbially influenced corrosion (MIC) activity. The conditions in the test flow-loop system were controlled by the addition of microbes and different substrates to favor accelerated corrosion. The ENA results demonstrated that this in-situ corrosion monitoring system could effectively identify SLP corrosion associated with MIC, compared to a more uniform general corrosion mechanism. A reduction in SLP activity could be clearly detected by the ENA monitoring system when a corrosion inhibitor was added into one of the test loops during the corrosion testing.
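
    A minimal sketch of the noise-level-trend computation both records describe, assuming evenly sampled current data: the RMS of the AC component is taken over short windows, and the PSD of the resulting trend is estimated with Welch's method. The sampling rate, window length and synthetic signal are placeholders.

    ```python
    import numpy as np
    from scipy import signal

    rng = np.random.default_rng(0)
    fs = 10.0                                  # samples per second (assumed)
    current = rng.normal(0, 1e-6, 360_000)     # placeholder corrosion-current record

    # Noise level trend: RMS of the AC component over short windows,
    # mirroring the CNL/PNL construction described in the abstracts.
    win = 600
    segs = current[: len(current) // win * win].reshape(-1, win)
    cnl = np.sqrt(((segs - segs.mean(axis=1, keepdims=True)) ** 2).mean(axis=1))

    # PSD of the CNL trend; a pronounced low-frequency slope would be the
    # signature the papers correlate with sustained localized pitting.
    f, psd = signal.welch(cnl, fs=fs / win, nperseg=128)
    slope = np.polyfit(np.log10(f[1:20]), np.log10(psd[1:20]), 1)[0]
    print(f"low-frequency PSD slope: {slope:.2f}")
    ```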

  17. Process-based quality for thermal spray via feedback control

    NASA Astrophysics Data System (ADS)

    Dykhuizen, R. C.; Neiser, R. A.

    2006-09-01

    Quality control of a thermal spray system manufacturing process is difficult due to the many input variables that need to be controlled. Great care must be taken to ensure that the process remains constant to obtain a consistent quality of the parts. Control is greatly complicated by the fact that measurement of particle velocities and temperatures is a noisy stochastic process. This article illustrates the application of quality control concepts to a wire flame spray process. A central feature of the real-time control system is an automatic feedback control scheme that provides fine adjustments to ensure that uncontrolled variations are accommodated. It is shown how the control vectors can be constructed from simple process maps to independently control particle velocity and temperature. This control scheme is shown to perform well in a real production environment. We also demonstrate that slight variations in the feed wire curvature can greatly influence the process. Finally, the geometry of the spray system and sensor must remain constant for the best reproducibility.
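
    The article's control vectors built from process maps can be pictured as inverting a locally linearized sensitivity matrix, so that velocity and temperature corrections do not disturb one another. The sketch below shows that decoupling step with invented sensitivities, not the paper's measured maps.

    ```python
    import numpy as np

    # Locally linearized process map: rows are (particle velocity, temperature),
    # columns are two actuators (e.g., air pressure, fuel/oxygen ratio).
    # Numbers are illustrative, not measured sensitivities from the article.
    J = np.array([[  8.0,  -1.0],    # d(velocity)/d(inputs)  [m/s per unit]
                  [-20.0,  60.0]])   # d(temperature)/d(inputs) [K per unit]

    def control_step(v_err, T_err, gain=0.5):
        """Feedback correction decoupling velocity and temperature:
        solve J @ du = -[v_err, T_err] so each error is driven down
        without disturbing the other measured quantity."""
        du = np.linalg.solve(J, -np.array([v_err, T_err]))
        return gain * du

    print(control_step(v_err=5.0, T_err=-30.0))   # input adjustments per cycle
    ```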

  18. Software Suite to Support In-Flight Characterization of Remote Sensing Systems

    NASA Technical Reports Server (NTRS)

    Stanley, Thomas; Holekamp, Kara; Gasser, Gerald; Tabor, Wes; Vaughan, Ronald; Ryan, Robert; Pagnutti, Mary; Blonski, Slawomir; Kenton, Ross

    2014-01-01

    A characterization software suite was developed to facilitate NASA's in-flight characterization of commercial remote sensing systems. Characterization of aerial and satellite systems requires knowledge of ground characteristics, or ground truth. This information is typically obtained with instruments taking measurements prior to or during a remote sensing system overpass. Acquired ground-truth data, which can consist of hundreds of measurements with different data formats, must be processed before it can be used in the characterization. Accurate in-flight characterization of remote sensing systems relies on multiple field data acquisitions that are efficiently processed, with minimal error. To address the need for timely, reproducible ground-truth data, a characterization software suite was developed to automate the data processing methods. The characterization software suite is engineering code, requiring some prior knowledge and expertise to run. The suite consists of component scripts for each of the three main in-flight characterization types: radiometric, geometric, and spatial. The component scripts for the radiometric characterization operate primarily by reading the raw data acquired by the field instruments, combining it with other applicable information, and then reducing it to a format that is appropriate for input into MODTRAN (MODerate resolution atmospheric TRANsmission), an Air Force Research Laboratory-developed radiative transport code used to predict at-sensor measurements. The geometric scripts operate by comparing identified target locations from the remote sensing image to known target locations, producing circular error statistics defined by the Federal Geographic Data Committee Standards. The spatial scripts analyze a target edge within the image, and produce estimates of Relative Edge Response and the value of the Modulation Transfer Function at the Nyquist frequency. The software suite enables rapid, efficient, automated processing of ground truth data, which has been used to provide reproducible characterizations on a number of commercial remote sensing systems. Overall, this characterization software suite improves the reliability of ground-truth data processing techniques that are required for remote sensing system in-flight characterizations.
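
    As an illustration of what the geometric scripts report, the sketch below computes a percentile circular error from image-derived versus surveyed target positions. It is a simplified stand-in for the FGDC-standard statistics the suite actually produces; the coordinates are invented.

    ```python
    import numpy as np

    def circular_error(measured, truth, pct=90.0):
        """Circular error at a given percentile (e.g., CE90) from image-derived
        target locations vs surveyed ground truth. Arrays are (n, 2) in metres.
        A simplified percentile statistic, not the full FGDC computation."""
        radial = np.linalg.norm(np.asarray(measured) - np.asarray(truth), axis=1)
        return np.percentile(radial, pct)

    truth    = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
    measured = truth + np.array([[1.2, -0.5], [-0.8, 0.9], [0.4, 1.6], [-1.1, -0.3]])
    print(f"CE90 = {circular_error(measured, truth):.2f} m")
    ```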

  19. Lithofacies identification using multiple adaptive resonance theory neural networks and group decision expert system

    USGS Publications Warehouse

Chang, H.-C.; Kopaska-Merkel, D. C.; Chen, H.-C.; Durrans, S. Rocky

    2000-01-01

    Lithofacies identification supplies qualitative information about rocks. Lithofacies represent rock textures and are important components of hydrocarbon reservoir description. Traditional techniques of lithofacies identification from core data are costly, and different geologists may provide different interpretations. In this paper, we present a low-cost intelligent system consisting of three adaptive resonance theory neural networks and a rule-based expert system to consistently and objectively identify lithofacies from well-log data. The input data are altered into different forms representing different perspectives of observation of lithofacies. Each form of input is processed by a different adaptive resonance theory neural network. Among these three adaptive resonance theory neural networks, one neural network processes the raw continuous data, another processes categorical data, and the third processes fuzzy-set data. Outputs from these three networks are then combined by the expert system using fuzzy inference to determine to which facies the input data should be assigned. Rules are prioritized to emphasize the importance of firing order. This new approach combines the learning ability of neural networks, the adaptability of fuzzy logic, and the expertise of geologists to infer facies of the rocks. This approach is applied to the Appleton Field, an oil field located in Escambia County, Alabama. The hybrid intelligence system predicts lithofacies identity from log data with 87.6% accuracy. This prediction is more accurate than those of the single adaptive resonance theory networks, 79.3%, 68.0% and 66.0% using raw, fuzzy-set, and categorical data, respectively, and than that of an error-backpropagation neural network, 57.3%.
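
    A toy sketch of combining the three network outputs into one facies decision: accept a facies when at least two networks agree, otherwise defer to the single most confident network. The prioritized fuzzy rules of the published expert system are more elaborate and are not reproduced here; all names and values are invented.

    ```python
    def combine_facies(raw_pred, cat_pred, fuzzy_pred, confidences):
        """Group-decision rule over the three ART network outputs."""
        votes = [raw_pred, cat_pred, fuzzy_pred]
        for facies in set(votes):
            if votes.count(facies) >= 2:              # majority agreement
                return facies
        return max(confidences, key=confidences.get)  # fall back on confidence

    print(combine_facies("packstone", "grainstone", "packstone",
                         {"packstone": 0.72, "grainstone": 0.55, "mudstone": 0.30}))
    # -> packstone
    ```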

  20. Graphical Language for Data Processing

    NASA Technical Reports Server (NTRS)

    Alphonso, Keith

    2011-01-01

    A graphical language for processing data allows processing elements to be connected with virtual wires that represent data flows between processing modules. The processing of complex data, such as lidar data, requires many different algorithms to be applied. The purpose of this innovation is to automate the processing of complex data, such as LIDAR, without the need for complex scripting and programming languages. The system consists of a set of user-interface components that allow the user to drag and drop various algorithmic and processing components onto a process graph. By working graphically, the user can completely visualize the process flow and create complex diagrams. This innovation supports the nesting of graphs, such that a graph can be included in another graph as a single step for processing. In addition to the user interface components, the system includes a set of .NET classes that represent the graph internally. These classes provide the internal system representation of the graphical user interface. The system includes a graph execution component that reads the internal representation of the graph (as described above) and executes that graph. The execution of the graph follows the interpreted model of execution in that each node is traversed and executed from the original internal representation. In addition, there are components that allow external code elements, such as algorithms, to be easily integrated into the system, thus making the system infinitely expandable.
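
    A minimal interpreted dataflow executor in the spirit of this innovation: each node is a function, wires name its upstream inputs, and the graph is traversed from its internal representation at run time. This is an illustrative Python sketch, not the .NET implementation the record describes.

    ```python
    # Minimal interpreted dataflow-graph executor.
    def run_graph(nodes, wires, sources):
        """nodes   : dict name -> callable taking the wired-in values
        wires   : dict name -> list of upstream node names (inputs, in order)
        sources : dict name -> constant values for source nodes
        Executes every node in dependency order; returns all outputs."""
        results = dict(sources)
        def evaluate(name):
            if name not in results:                     # memoized traversal
                args = [evaluate(up) for up in wires[name]]
                results[name] = nodes[name](*args)
            return results[name]
        for name in nodes:
            evaluate(name)
        return results

    nodes = {
        "threshold": lambda pts: [p for p in pts if p > 0.5],   # filter stage
        "count":     lambda pts: len(pts),                      # reduce stage
    }
    wires = {"threshold": ["lidar"], "count": ["threshold"]}
    print(run_graph(nodes, wires, {"lidar": [0.2, 0.9, 0.7]})["count"])  # 2
    ```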

  1. Concreteness effects in semantic processing: ERP evidence supporting dual-coding theory.

    PubMed

    Kounios, J; Holcomb, P J

    1994-07-01

    Dual-coding theory argues that processing advantages for concrete over abstract (verbal) stimuli result from the operation of 2 systems (i.e., imaginal and verbal) for concrete stimuli, rather than just 1 (for abstract stimuli). These verbal and imaginal systems have been linked with the left and right hemispheres of the brain, respectively. Context-availability theory argues that concreteness effects result from processing differences in a single system. The merits of these theories were investigated by examining the topographic distribution of event-related brain potentials in 2 experiments (lexical decision and concrete-abstract classification). The results were most consistent with dual-coding theory. In particular, different scalp distributions of an N400-like negativity were elicited by concrete and abstract words.

  2. Method for sequentially processing a multi-level interconnect circuit in a vacuum chamber

    NASA Technical Reports Server (NTRS)

    Routh, D. E.; Sharma, G. C. (Inventor)

    1984-01-01

    An apparatus is disclosed which includes a vacuum system having a vacuum chamber in which wafers are processed on rotating turntables. The vacuum chamber is provided with an RF sputtering system and a dc magnetron sputtering system. A gas inlet introduces various gases to the vacuum chamber and creates various gas plasmas during the sputtering steps. The rotating turntables ensure that the respective wafers are present under the sputtering guns for an average amount of time such that consistency in sputtering and deposition is achieved. By continuous and sequential processing of the wafers in a common vacuum chamber without removal, the adverse effects of exposure to atmospheric conditions are eliminated, providing higher quality circuit contacts and functional devices.

  3. Microscopic theory for the time irreversibility and the entropy production

    NASA Astrophysics Data System (ADS)

    Chun, Hyun-Myung; Noh, Jae Dong

    2018-02-01

    In stochastic thermodynamics, the entropy production of a thermodynamic system is defined by the irreversibility measured by the logarithm of the ratio of the path probabilities in the forward and reverse processes. We derive the relation between the irreversibility and the entropy production starting from the deterministic equations of motion of the whole system consisting of a physical system and a surrounding thermal environment. The derivation assumes the Markov approximation that the environmental degrees of freedom equilibrate instantaneously. Our approach provides a guideline for the choice of the proper reverse process to a given forward process, especially when there exists a velocity-dependent force. We demonstrate our idea with an example of a charged particle in the presence of a time-varying magnetic field.
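    In the standard notation of stochastic thermodynamics (the symbols here are assumed, since the abstract does not display the formula), the defining relation reads:

```latex
\Delta S_{\mathrm{tot}}
  = k_{\mathrm{B}}\,
    \ln \frac{\mathcal{P}\!\left[x(t)\right]}
             {\tilde{\mathcal{P}}\!\left[\tilde{x}(t)\right]} ,
```

    where \(\mathcal{P}[x(t)]\) is the probability of a path in the forward process and \(\tilde{\mathcal{P}}[\tilde{x}(t)]\) that of the time-reversed path in the reverse process.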

  4. System configured for applying a modifying agent to a non-equidimensional substrate

    DOEpatents

    Janikowski, Stuart K.; Argyle, Mark D.; Fox, Robert V.; Propp, W. Alan; Toth, William J.; Ginosar, Daniel M.; Allen, Charles A.; Miller, David L. [Idaho Falls, ID]

    2007-07-10

    The present invention is related to systems and methods for modifying various non-equidimensional substrates with modifying agents. The system comprises a processing chamber configured for passing the non-equidimensional substrate therethrough, wherein the processing chamber is further configured to accept a treatment mixture into the chamber during movement of the non-equidimensional substrate through the processing chamber. The treatment mixture can comprise the modifying agent in a carrier medium, wherein the carrier medium is selected from the group consisting of a supercritical fluid, a near-critical fluid, a superheated fluid, a superheated liquid, and a liquefied gas. Thus, the modifying agent can be applied to the non-equidimensional substrate upon contact between the treatment mixture and the non-equidimensional substrate.

  5. System configured for applying a modifying agent to a non-equidimensional substrate

    DOEpatents

    Janikowski, Stuart K.; Toth, William J.; Ginosar, Daniel M.; Allen, Charles A.; Argyle, Mark D.; Fox, Robert V.; Propp, W. Alan; Miller, David L.

    2003-09-23

    The present invention is related to systems and methods for modifying various non-equidimensional substrates with modifying agents. The system comprises a processing chamber configured for passing the non-equidimensional substrate therethrough, wherein the processing chamber is further configured to accept a treatment mixture into the chamber during movement of the non-equidimensional substrate through the processing chamber. The treatment mixture can comprise the modifying agent in a carrier medium, wherein the carrier medium is selected from the group consisting of a supercritical fluid, a near-critical fluid, a superheated fluid, a superheated liquid, and a liquefied gas. Thus, the modifying agent can be applied to the non-equidimensional substrate upon contact between the treatment mixture and the non-equidimensional substrate.

  6. Appendices to the user's manual for a computer program for the emulation/simulation of a space station environmental control and life support system

    NASA Technical Reports Server (NTRS)

    Yanosy, James L.

    1988-01-01

    A user's manual for the emulation/simulation computer model was published previously. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem operating with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation/simulation combination in the design, development, and test of a piece of ARS hardware - SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from testing. In addition, slight changes were made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is the creation of two system simulations using these models. The first system consists of one air and one water processing system, the second of a potential Space Station air revitalization system.

  7. Evidence for supernova injection into the solar nebula and the decoupling of r-process nucleosynthesis

    PubMed Central

    Brennecka, Gregory A.; Borg, Lars E.; Wadhwa, Meenakshi

    2013-01-01

    The isotopic composition of our Solar System reflects the blending of materials derived from numerous past nucleosynthetic events, each characterized by a distinct isotopic signature. We show that the isotopic compositions of elements spanning a large mass range in the earliest formed solids in our Solar System, calcium–aluminum-rich inclusions (CAIs), are uniform, and yet distinct from the average Solar System composition. Relative to younger objects in the Solar System, CAIs contain positive r-process anomalies in isotopes A < 140 and negative r-process anomalies in isotopes A > 140. This fundamental difference in the isotopic character of CAIs around mass 140 necessitates (i) the existence of multiple sources for r-process nucleosynthesis and (ii) the injection of supernova material into a reservoir untapped by CAIs. A scenario of late supernova injection into the protoplanetary disk is consistent with formation of our Solar System in an active star-forming region of the galaxy. PMID:24101483

  8. Microscopic heat engine and control of work fluctuations

    NASA Astrophysics Data System (ADS)

    Xiao, Gaoyang

    In this thesis, we study novel behaviors of microscopic work and heat in systems involving few degrees of freedom. We first report that a quantum Carnot cycle should consist of two isothermal processes and two mechanical adiabatic processes if we want to maximize its heat-to-work conversion efficiency. We then find that the efficiency can be further optimized, and that it is generally system specific, lower than the Carnot efficiency, and dependent upon the temperatures of both the cold and hot reservoirs. We then move on to study the fluctuations of microscopic work. We find a principle of minimal work fluctuations related to the Jarzynski equality. In brief, an adiabatic process without energy level crossing yields the minimal fluctuations in exponential work, given a thermally isolated system initially prepared at thermal equilibrium. Finally, we investigate an optimal control approach to suppress the work fluctuations and accelerate the adiabatic processes. This optimal control approach can apply to a wide variety of systems, even when we do not have full knowledge of the systems.
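    For reference, the Jarzynski equality to which the minimal-work-fluctuation principle is tied can be written in its standard form (not quoted from the thesis):

```latex
\left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F},
\qquad \beta = \frac{1}{k_{\mathrm{B}} T},
```

    where \(W\) is the work performed in a single realization and \(\Delta F\) the free-energy difference; since the mean of \(e^{-\beta W}\) is fixed by the equality, the principle concerns which protocols minimize its fluctuations about that mean.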

  9. Quantum chemical methods for the investigation of photoinitiated processes in biological systems: theory and applications.

    PubMed

    Dreuw, Andreas

    2006-11-13

    With the advent of modern computers and advances in the development of efficient quantum chemical computer codes, the meaningful computation of large molecular systems at a quantum mechanical level became feasible. Recent experimental effort to understand photoinitiated processes in biological systems, for instance photosynthesis or vision, at a molecular level also triggered theoretical investigations in this field. In this Minireview, standard quantum chemical methods are presented that are applicable and recently used for the calculation of excited states of photoinitiated processes in biological molecular systems. These methods comprise configuration interaction singles, the complete active space self-consistent field method, and time-dependent density functional theory and its variants. Semiempirical approaches are also covered. Their basic theoretical concepts and mathematical equations are briefly outlined, and their properties and limitations are discussed. Recent successful applications of the methods to photoinitiated processes in biological systems are described and theoretical tools for the analysis of excited states are presented.

  10. Evidence for supernova injection into the solar nebula and the decoupling of r-process nucleosynthesis.

    PubMed

    Brennecka, Gregory A; Borg, Lars E; Wadhwa, Meenakshi

    2013-10-22

    The isotopic composition of our Solar System reflects the blending of materials derived from numerous past nucleosynthetic events, each characterized by a distinct isotopic signature. We show that the isotopic compositions of elements spanning a large mass range in the earliest formed solids in our Solar System, calcium-aluminum-rich inclusions (CAIs), are uniform, and yet distinct from the average Solar System composition. Relative to younger objects in the Solar System, CAIs contain positive r-process anomalies in isotopes A < 140 and negative r-process anomalies in isotopes A > 140. This fundamental difference in the isotopic character of CAIs around mass 140 necessitates (i) the existence of multiple sources for r-process nucleosynthesis and (ii) the injection of supernova material into a reservoir untapped by CAIs. A scenario of late supernova injection into the protoplanetary disk is consistent with formation of our Solar System in an active star-forming region of the galaxy.

  11. Defining the cortical visual systems: "what", "where", and "how"

    NASA Technical Reports Server (NTRS)

    Creem, S. H.; Proffitt, D. R.; Kaiser, M. K. (Principal Investigator)

    2001-01-01

    The visual system historically has been defined as consisting of at least two broad subsystems subserving object and spatial vision. These visual processing streams have been organized both structurally as two distinct pathways in the brain, and functionally for the types of tasks that they mediate. The classic definition by Ungerleider and Mishkin labeled a ventral "what" stream to process object information and a dorsal "where" stream to process spatial information. More recently, Goodale and Milner redefined the two visual systems with a focus on the different ways in which visual information is transformed for different goals. They relabeled the dorsal stream as a "how" system for transforming visual information using an egocentric frame of reference in preparation for direct action. This paper reviews recent research from psychophysics, neurophysiology, neuropsychology and neuroimaging to define the roles of the ventral and dorsal visual processing streams. We discuss a possible solution that allows for both "where" and "how" systems that are functionally and structurally organized within the posterior parietal lobe.

  12. Innovative vitrification for soil remediation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jetta, N.W.; Patten, J.S.; Hart, J.G.

    1995-12-01

    The objective of this DOE demonstration program is to validate the performance and operation of the Vortec Cyclone Melting System (CMS™) for the processing of LLW-contaminated soils found at DOE sites. This DOE vitrification demonstration project has successfully progressed through the first two phases. Phase 1 consisted of pilot-scale testing with surrogate wastes and the conceptual design of a process plant operating at a generic DOE site. The objective of Phase 2, which is scheduled to be completed by the end of FY 95, is to develop a definitive process plant design for the treatment of wastes at a specific DOE facility. During Phase 2, a site-specific design was developed for the processing of LLW soils and muds containing TSCA organics and RCRA metal contaminants. Phase 3 will consist of a full-scale demonstration at the DOE gaseous diffusion plant located in Paducah, KY. Several DOE sites were evaluated for potential application of the technology. Paducah was selected for the demonstration program because of its urgent waste remediation needs as well as its strong management and cost-sharing financial support for the project. During Phase 2, the basic vitrification process design was modified to meet the specific needs of the new waste streams available at Paducah. The system design developed for Paducah has significantly enhanced the processing capabilities of the Vortec vitrification process. The overall system design now includes the capability to shred entire drums and drum packs containing mud, concrete, plastics and PCBs as well as bulk waste materials. This enhanced processing capability will substantially expand the total DOE waste remediation applications of the technology.

  13. Surface Composition Influence on Internal Gas Flow at Large Knudsen Numbers

    DTIC Science & Technology

    2000-07-09

    situated in an ultra high vacuum system. The system is supplied with means of gas phase, surface CP585, Rarefied Gas Dynamics: 22nd International...control and gas flow measuring system. The experimental procedure consists of a few stages. The first stage includes a surface preparation process at...solid body system, Proceedings 20th Int. Symp. Rarefied Gas Dynamics, Peking University Press, Beijing, China, 1997, pp. 387-391. 3. Lord, R.G

  14. [Computerized system validation of clinical researches].

    PubMed

    Yan, Charles; Chen, Feng; Xia, Jia-lai; Zheng, Qing-shan; Liu, Daniel

    2015-11-01

    Validation is a documented process that provides a high degree of assurance that the computer system does exactly and consistently what it is designed to do, in a controlled manner, throughout its life cycle. The validation process begins with the system proposal/requirements definition, and continues through application and maintenance until system retirement and retention of the e-records based on regulatory rules. The objective is to clearly demonstrate that each application of information technology fulfills its purpose. Computer system validation (CSV) is essential in clinical studies according to the GCP standard, meeting the product's pre-determined attributes of specifications, quality, safety and traceability. This paper describes how to perform the validation process and determine the relevant stakeholders within an organization in the light of validation SOPs. Although specific accountabilities in the implementation of the validation process might be outsourced, the ultimate responsibility for the CSV remains with the business process owner, the sponsor. In order to show that compliance of the system validation has been properly attained, it is essential to set up comprehensive validation procedures and maintain adequate documentation as well as training records. The quality of the system validation should be controlled using both QC and QA means.

  15. Process for producing peracids from aliphatic hydroxy carboxylic acids

    DOEpatents

    Chum, H.L.; Palasz, P.D.; Ratcliff, M.A.

    1984-12-20

    A process is described for producing peracids from lactic acid-containing solutions derived from biomass processing systems. It consists of adjusting the pH of the solution to about 8 to 9 and removing alkaline residue fractions therefrom to form a solution comprised substantially of lower aliphatic hydroxy acids. The solution is oxidized to produce volatile lower aliphatic aldehydes. The aldehydes are removed as they are generated and converted to peracids.

  16. Formation of Superhard Chromium Carbide Crystal Microrods in Ni-Cr-C Systems

    NASA Astrophysics Data System (ADS)

    Val'chuk, V. P.; Zmienko, D. S.; Kolesov, V. V.; Chernozatonskii, L. A.

    2018-04-01

    Ni-Cr-C materials with a high hardness determined by the presence of regions consisting of Cr3C2 microrods with a record microhardness reaching 3200 kg/mm2 have been obtained. Their self-organization in a powder consisting of Ni, Cr, and carbon microparticles with a high weight percentage occurs in the process of its sintering at a temperature of 1300°C and the subsequent sharp cooling of the resulting alloy. A model has been proposed for the process of formation of such crystal microrods whose characteristics have been determined by hardness measurement, electron microscopy, and microchemical and X-ray diffraction analyses.

  17. Optoelectronic Reservoir Computing

    PubMed Central

    Paquot, Y.; Duport, F.; Smerieri, A.; Dambre, J.; Schrauwen, B.; Haelterman, M.; Massar, S.

    2012-01-01

    Reservoir computing is a recently introduced, highly efficient bio-inspired approach for processing time-dependent data. The basic scheme of reservoir computing consists of a nonlinear recurrent dynamical system coupled to a single input layer and a single output layer. Within these constraints many implementations are possible. Here we report an optoelectronic implementation of reservoir computing based on a recently proposed architecture consisting of a single nonlinear node and a delay line. Our implementation is sufficiently fast for real-time information processing. We illustrate its performance on tasks of practical importance such as nonlinear channel equalization and speech recognition, and obtain results comparable to state-of-the-art digital implementations. PMID:22371825
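    The architecture lends itself to a compact software analogue. The sketch below is conceptual, not the reported optoelectronic hardware: a fixed random mask distributes the input over virtual nodes, a sine nonlinearity loosely mimics an optical transfer function, and only the linear readout is trained.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50                         # virtual nodes along the delay line
mask = rng.uniform(-1, 1, N)   # fixed random input mask
alpha, beta = 0.9, 0.5         # feedback and input scaling (illustrative)

def reservoir_states(u):
    """Map a 1-D input sequence to a (len(u), N) matrix of node states."""
    x, states = np.zeros(N), []
    for ut in u:
        # Masked input plus delayed feedback through a sine nonlinearity.
        x = np.sin(alpha * x + beta * mask * ut)
        states.append(x.copy())
    return np.array(states)

# Toy task: reproduce the previous input sample from the reservoir state.
u = rng.uniform(-1, 1, 500)
target = np.roll(u, 1)
X = reservoir_states(u)
# Ridge-regression readout: the only trained part of the scheme.
w = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ target)
print("train MSE:", np.mean((X @ w - target) ** 2))
```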

  18. From pattern to process: The strategy of the Earth Observing System: Volume 2: EOS Science Steering Committee report

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The Earth Observing System (EOS) represents a new approach to the study of the Earth. It consists of remotely sensed and correlative in situ observations designed to address important, interrelated global-scale processes. There is an urgent need to study the Earth as a complete, integrated system in order to understand and predict changes caused by human activities and natural processes. The EOS approach is based on an information system concept and designed to provide a long-term study of the Earth using a variety of measurement methods from both operational and research satellite payloads and continuing ground-based Earth science studies. The EOS concept builds on the foundation of the earlier, single-discipline space missions designed for relatively short observation periods. Continued progress in our understanding of the Earth as a system will come from EOS observations spanning several decades using a variety of contemporaneous measurements.

  19. Geochemistry of water in the Fort Union Formation of the northern Powder River basin, southeastern Montana

    USGS Publications Warehouse

    Lee, Roger W.

    1980-01-01

    Shallow water in the coal-bearing Fort Union Formation of southeastern Montana was investigated to provide a better understanding of its geochemistry. Springs, wells less than 200 feet deep, and wells greater than 200 feet deep were observed to have different water qualities. Overall, the ground water exists as two systems: a mosaic of shallow, chemically dynamic, and localized recharge-discharge cells superimposed on a deeper, chemically static regional system. Water chemistry is highly variable in the shallow system, whereas sodium and bicarbonate waters characterize the deeper system. Within the shallow system, springs and wells less than 200 feet deep show predominantly sodium and sulfate enrichment processes from recharge to discharge. These processes are consistent with the observed aquifer mineralogy and aqueous chemistry. However, intermittent mixing with downward-moving recharge waters or upward-moving deeper waters, and bacterially catalyzed sulfate reduction, may cause apparent reversals in these processes. (USGS)

  20. Geochemistry of water in the Fort Union formation of the northern Powder River basin, southeastern Montana

    USGS Publications Warehouse

    Lee, Roger W.

    1981-01-01

    Shallow water in the coal-bearing Paleocene Fort Union Formation of southeastern Montana was investigated to provide a better understanding of its geochemistry. Springs, wells less than 200 feet deep, and wells greater than 200 feet deep were observed to have different water qualities. Overall, the ground water exists as two systems: a mosaic of shallow, chemically dynamic, and localized recharge-discharge cells superimposed on a deeper, chemically static regional system. Water chemistry is highly variable in the shallow system, whereas waters containing sodium and bicarbonate characterize the deeper system. Within the shallow system, springs and wells less than 200 feet deep show predominantly sodium and sulfate enrichment processes from recharge to discharge. These processes are consistent with the observed aquifer mineralogy and aqueous chemistry. However, intermittent mixing with downward-moving recharge waters or upward-moving deeper waters, and bacterially catalyzed sulfate reduction, may cause apparent reversals in these processes.

  1. Field-scale electrolysis/ceramic membrane system for the treatment of sewage from decentralized small communities.

    PubMed

    Son, Dong-Jin; Kim, Woo-Yeol; Yun, Chan-Young; Kim, Dae-Gun; Chang, Duk; Sunwoo, Young; Hong, Ki-Ho

    2017-07-05

    An electrolysis process using copper electrodes and a ceramic membrane with pore sizes of 0.1-0.2 μm was combined into a system for the treatment of sewage from decentralized small communities. The system was operated under an HRT of 0.1 hour, a voltage of 24 V, and a TMP of 0.05 MPa. The system showed average removals of organics, nitrogen, phosphorus, and solids of up to 80%, 52%, 92%, and 100%, respectively. Removal of organics and nitrogen increased dramatically in proportion to the influent loading. Phosphorus and solids were effectively eliminated by both electro-coagulation and membrane filtration. The residual particulate constituents could also be removed successfully by the membrane process. A system composed of an electrolysis process with a ceramic membrane would be a compact, reliable, and flexible option for the treatment of sewage from decentralized small communities.

  2. Vessel-Mounted ADCP Data Calibration and Correction

    NASA Astrophysics Data System (ADS)

    de Andrade, A. F.; Barreira, L. M.; Violante-Carvalho, N.

    2013-05-01

    A set of scripts for vessel-mounted ADCP (Acoustic Doppler Current Profiler) data processing is presented. The need for corrections to the data measured by a ship-mounted ADCP, and the complexity of installing, running, and understanding the tasks performed by currently available processing systems, were the main motivations for developing a system that is more practical to operate, open source, and more manageable for the user. The proposed processing system consists of a set of scripts developed in the MATLAB programming language. The system is able to read the binary files provided by the data acquisition program VMDAS (Vessel Mounted Data Acquisition System), proprietary to Teledyne RD Instruments, calculate calibration factors, correct the data, and visualize them after correction. To use the new system, it is only necessary that the ADCP data collected with VMDAS be placed in a processing directory and that MATLAB be installed on the user's computer. The algorithms were extensively tested with ADCP data obtained during the Oceano Sul III (Southern Ocean III - OSIII) cruise, conducted by the Brazilian Navy aboard the R/V "Antares" from March 26th to May 10th, 2007, in the oceanic region between the states of São Paulo and Rio Grande do Sul. To read the data, the function rdradcp.m, developed by Rich Pawlowicz and available on his website (http://www.eos.ubc.ca/~rich/#RDADCP), was used. To calculate the calibration factors, the alignment error (α) and the sensitivity error (β) in Water Tracking and Bottom Tracking modes, equations deduced by Joyce (1998), Pollard & Read (1989) and Trump & Marmorino (1996) were implemented in MATLAB. To validate the calibration factors obtained with the new processing system, the parameters were compared with the factors provided by CODAS (Common Ocean Data Access System, available at http://currents.soest.hawaii.edu/docs/doc/index.html), a post-processing program. For the same data, the factors provided by both systems were similar. Thereafter, the values obtained were used to correct the data; finally, matrices of corrected data were saved and can be plotted. The volume transport of the Brazil Current (BC) calculated from data corrected by the two systems also proved very close, confirming the quality of the system's corrections.
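    The calibration step itself is compact. The sketch below assumes the conventional complex-velocity form of the alignment (α) and sensitivity (β) correction used in the literature cited above; the factors and velocities are made up for illustration, and none of this is the authors' code.

```python
import numpy as np

def calibrate(u, v, alpha_deg, beta):
    """Rotate by the misalignment angle and rescale by (1 + beta).

    u, v: measured east/north velocity components (m/s).
    """
    w = (u + 1j * v) * (1.0 + beta) * np.exp(1j * np.deg2rad(alpha_deg))
    return w.real, w.imag

u = np.array([0.50, 0.42])   # hypothetical east velocities (m/s)
v = np.array([0.10, 0.05])   # hypothetical north velocities (m/s)
u_c, v_c = calibrate(u, v, alpha_deg=1.5, beta=0.004)
print(u_c, v_c)
```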

  3. Near-optimal integration of facial form and motion.

    PubMed

    Dobs, Katharina; Ma, Wei Ji; Reddy, Leila

    2017-09-08

    Human perception consists of the continuous integration of sensory cues pertaining to the same object. While it has been fairly well shown that humans use an optimal strategy when integrating low-level cues proportional to their relative reliability, the integration processes underlying high-level perception are much less understood. Here we investigate cue integration in a complex high-level perceptual system, the human face processing system. We tested cue integration of facial form and motion in an identity categorization task and found that an optimal model could successfully predict subjects' identity choices. Our results suggest that optimal cue integration may be implemented across different levels of the visual processing hierarchy.
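    Under the standard maximum-likelihood model of cue combination (assumed here; the paper's task-specific model may differ), independent Gaussian estimates from form (\(s_f\)) and motion (\(s_m\)) are weighted by their inverse variances:

```latex
\hat{s} = w_f\, s_f + w_m\, s_m,
\qquad w_i = \frac{1/\sigma_i^{2}}{1/\sigma_f^{2} + 1/\sigma_m^{2}},
\qquad \sigma_{\hat{s}}^{2} = \frac{\sigma_f^{2}\,\sigma_m^{2}}{\sigma_f^{2} + \sigma_m^{2}} ,
```

    so the combined estimate is never less reliable than the better single cue.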

  4. Low-cost, high-speed back-end processing system for high-frequency ultrasound B-mode imaging.

    PubMed

    Chang, Jin Ho; Sun, Lei; Yen, Jesse T; Shung, K Kirk

    2009-07-01

    For real-time visualization of the mouse heart (6 to 13 beats per second), a back-end processing system involving high-speed signal processing functions to form and display images has been developed. This back-end system was designed with new signal processing algorithms to achieve a frame rate of more than 400 images per second. These algorithms were implemented in a simple and cost-effective manner with a single field-programmable gate array (FPGA) and software programs written in C++. The operating speed of the back-end system was investigated by recording the time required for transferring an image to a personal computer. Experimental results showed that the back-end system is capable of producing 433 images per second. To evaluate the imaging performance of the back-end system, a complete imaging system was built. This imaging system, which consisted of a recently reported high-speed mechanical sector scanner assembled with the back-end system, was tested by imaging a wire phantom, a pig eye (in vitro), and a mouse heart (in vivo). It was shown that this system is capable of providing high spatial resolution images with fast temporal resolution.

  5. Low-Cost, High-Speed Back-End Processing System for High-Frequency Ultrasound B-Mode Imaging

    PubMed Central

    Chang, Jin Ho; Sun, Lei; Yen, Jesse T.; Shung, K. Kirk

    2009-01-01

    For real-time visualization of the mouse heart (6 to 13 beats per second), a back-end processing system involving high-speed signal processing functions to form and display images has been developed. This back-end system was designed with new signal processing algorithms to achieve a frame rate of more than 400 images per second. These algorithms were implemented in a simple and cost-effective manner with a single field-programmable gate array (FPGA) and software programs written in C++. The operating speed of the back-end system was investigated by recording the time required for transferring an image to a personal computer. Experimental results showed that the back-end system is capable of producing 433 images per second. To evaluate the imaging performance of the back-end system, a complete imaging system was built. This imaging system, which consisted of a recently reported high-speed mechanical sector scanner assembled with the back-end system, was tested by imaging a wire phantom, a pig eye (in vitro), and a mouse heart (in vivo). It was shown that this system is capable of providing high spatial resolution images with fast temporal resolution. PMID:19574160

  6. Robotic Processing Of Rocket-Engine Nozzles

    NASA Technical Reports Server (NTRS)

    Gilbert, Jeffrey L.; Maslakowski, John E.; Gutow, David A.; Deily, David C.

    1994-01-01

    Automated manufacturing cell containing computer-controlled robotic processing system developed to implement some important related steps in fabrication of rocket-engine nozzles. Performs several tedious and repetitive fabrication, measurement, adjustment, and inspection processes and subprocesses now performed manually. Offers advantages of reduced processing time, greater consistency, excellent collection of data, objective inspections, greater productivity, and simplified fixturing. Also affords flexibility: by making suitable changes in hardware and software, possible to modify process and subprocesses. Flexibility makes work cell adaptable to fabrication of heat exchangers and other items structured similarly to rocket nozzles.

  7. Integrated information systems for electronic chemotherapy medication administration.

    PubMed

    Levy, Mia A; Giuse, Dario A; Eck, Carol; Holder, Gwen; Lippard, Giles; Cartwright, Julia; Rudge, Nancy K

    2011-07-01

    Chemotherapy administration is a highly complex and distributed task in both the inpatient and outpatient infusion center settings. The American Society of Clinical Oncology and the Oncology Nursing Society (ASCO/ONS) have developed standards that specify procedures and documentation requirements for safe chemotherapy administration. Yet paper-based approaches to medication administration have several disadvantages and do not provide any decision support for patient safety checks. Electronic medication administration that includes bar coding technology may provide additional safety checks, enable consistent documentation structure, and have additional downstream benefits. We describe the specialized configuration of clinical informatics systems for electronic chemotherapy medication administration. The system integrates the patient registration system, the inpatient order entry system, the pharmacy information system, the nursing documentation system, and the electronic health record. We describe the process of deploying this infrastructure in the adult and pediatric inpatient oncology, hematology, and bone marrow transplant wards at Vanderbilt University Medical Center. We have successfully adapted the system for the oncology-specific documentation requirements detailed in the ASCO/ONS guidelines for chemotherapy administration. However, several limitations remain with regard to recording the day of treatment and dose number. Overall, the configured systems facilitate compliance with the ASCO/ONS guidelines and improve the consistency of documentation and multidisciplinary team communication. Our success has prompted us to deploy this infrastructure in our outpatient chemotherapy infusion centers, a process that is currently underway and that will require a few unique considerations.

  8. Multicriterion problem of allocation of resources in the heterogeneous distributed information processing systems

    NASA Astrophysics Data System (ADS)

    Antamoshkin, O. A.; Kilochitskaya, T. R.; Ontuzheva, G. A.; Stupina, A. A.; Tynchenko, V. S.

    2018-05-01

    This study reviews the problem of allocation of resources in heterogeneous distributed information processing systems, which may be formalized as a multicriterion multi-index problem with linear constraints of the transport type. Algorithms for solving this problem involve a search for the entire set of Pareto-optimal solutions. For some classes of hierarchical systems, it is possible to significantly speed up the procedure of verifying a system of linear algebraic inequalities for consistency, owing to their reducibility to flow models or to the application of other solution schemes (for strongly connected structures) that take into account the specifics of the hierarchies under consideration.
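    In generic form (symbols assumed, since the abstract gives no equations), such a multicriterion problem with transport-type linear constraints can be written as:

```latex
\min_{x \in X} \bigl( f_1(x), \ldots, f_k(x) \bigr),
\qquad
X = \Bigl\{\, x \ge 0 \;:\; \sum_{j} x_{ij} = a_i, \;\; \sum_{i} x_{ij} = b_j \,\Bigr\},
```

    with the solution understood as the set of Pareto-optimal points of \(X\).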

  9. Arc-Jet Thrustor Development

    NASA Technical Reports Server (NTRS)

    Curran, F. M.; Hamley, J. A.; Gruber, R. P.; Sankovic, J. M.; Haag, T. W.; Marren, W. E.; Sarmiento, C. J.; Carney, L.

    1993-01-01

    Two flight-type 1.4-kW hydrazine arcjet systems developed and tested under Lewis program. Each consists of thrustor, gas generator, and power-processing unit. Performance significantly improved. Technology transferred to user community, and first commercial flight anticipated in 1993.

  10. Fabrication and performance of pressure-sensing device consisting of electret film and organic semiconductor

    NASA Astrophysics Data System (ADS)

    Kodzasa, Takehito; Nobeshima, Daiki; Kuribara, Kazunori; Uemura, Sei; Yoshida, Manabu

    2017-04-01

    We propose a new concept for a pressure-sensitive device that consists of an organic electret film and an organic semiconductor. This device exhibits high sensitivity and selectivity against various types of pressure. The sensing mechanism originates from a modulation of the electric conductivity of the organic semiconductor film, induced by the interaction between the semiconductor film and the charged electret film placed face to face. Because the device can be prepared by an all-printing, simple lamination process without high-precision positional adjustment between printing steps, complex sensor arrays are expected to be fabricated with roll-to-roll manufacturing systems. This simple structure also makes the device suitable for highly flexible device array sheets for Internet of Things (IoT) or wearable sensing systems.

  11. Smart signal processing for an evolving electric grid

    NASA Astrophysics Data System (ADS)

    Silva, Leandro Rodrigues Manso; Duque, Calos Augusto; Ribeiro, Paulo F.

    2015-12-01

    Electric grids are interconnected complex systems consisting of generation, transmission, distribution, and active loads, recently called prosumers as they produce and consume electric energy. Additionally, these encompass a vast array of equipment such as machines, power transformers, capacitor banks, power electronic devices, motors, etc. that are continuously evolving in their demand characteristics. Given these conditions, signal processing is becoming an essential assessment tool to enable the engineer and researcher to understand, plan, design, and operate the complex and smart electronic grid of the future. This paper focuses on recent developments associated with signal processing applied to power system analysis in terms of characterization and diagnostics. The following techniques are reviewed and their characteristics and applications discussed: active power system monitoring, sparse representation of power system signal, real-time resampling, and time-frequency (i.e., wavelets) applied to power fluctuations.

  12. Out of Place, Out of Mind: Schema-Driven False Memory Effects for Object-Location Bindings

    ERIC Educational Resources Information Center

    Lew, Adina R.; Howe, Mark L.

    2017-01-01

    Events consist of diverse elements, each processed in specialized neocortical networks, with temporal lobe memory systems binding these elements to form coherent event memories. We provide a novel theoretical analysis of an unexplored consequence of the independence of memory systems for elements and their bindings, one that raises the paradoxical…

  13. The Interactivity Effect in Multimedia Learning

    ERIC Educational Resources Information Center

    Evans, Chris; Gibbons, Nicola J.

    2007-01-01

    The aim of this study was to determine whether the addition of interactivity to a computer-based learning package enhances the learning process. A sample of 33 (22 male and 11 female) undergraduates on a Business and Management degree used a multimedia system to learn about the operation of a bicycle pump. The system consisted of a labelled…

  14. Evaluation of a UV/Ozone Treatment Process for Removal of MTBE in Groundwater Supplies in New Mexico

    EPA Science Inventory

    EPA’s Office of Research and Development is funding pilot-scale studies on MTBE contaminated groundwater using UV/ozone treatment technology (254 nm UV, 5.8 mg/L ozone). The pilot-scale treatment system consists of a GW well pump, a feed tank, a pretreatment system (water soften...

  15. Evaluation of a UV/Ozone Treatment Process for Removal of MTBE in Groundwater Supplies in New Mexico

    EPA Science Inventory

    EPA’s Office of Research and Development is funding pilot-scale studies on MTBE contaminated groundwater using UV/ozone treatment technology (254 nm UV, 5.8 mg/L ozone). The pilot-scale treatment system consists of a GW well pump, a feed tank, a pretreatment system (water softene...

  16. The Development of a Test System for the Evaluation of Reverse Osmosis Water Purification Membranes

    DTIC Science & Technology

    1984-06-01

    processes consist of high rate filtration followed by the reverse osmosis system. Under the present concept there will be two units: one will produce 600...of the National Toxicology Program, National Institutes of Health. No official data has been released on the teratogenicity of DMMP.

  17. Improving Assessment Processes in Higher Education: Student and Teacher Perceptions of the Effectiveness of a Rubric Embedded in a LMS

    ERIC Educational Resources Information Center

    Atkinson, Doug; Lim, Siew Leng

    2013-01-01

    Students and teachers play different roles and thus have different perceptions about the effectiveness of assessment including structure, feedback, consistency, fairness and efficiency. In an undergraduate Business Information Systems course, a rubric was designed and semi-automated through a learning management system (LMS) to provide formative…

  18. 75 FR 54657 - University of Florida; University of Florida Training Reactor; Environmental Assessment and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-08

    ... operation of the UFTR to routinely provide teaching, research, and services to numerous institutions for a... confinement. The Nuclear Reactor Building and its annex, the Nuclear Sciences Center, are located in an area... primary system consisting of a 200-gallon coolant storage tank, a heat removal system, and a processing...

  19. Process and system - A dual definition, revisited with consequences in metrology

    NASA Astrophysics Data System (ADS)

    Ruhm, K. H.

    2010-07-01

    Let us assert that metrology could be easier, scientifically as well as technologically, if we intentionally made an explicit distinction between two outstanding domains: the given, really existing domain of processes, and the merely virtually existing domain of systems, the latter of which is designed and used by the human mind. The abstract domain of models, by which we map the manifold reality of processes, is itself part of the domain of systems. Models support comprehension and communication, although they are normally extreme simplifications of the properties and behaviour of a concrete reality. Systems and signals thus represent processes and quantities, which are described by means of Signal and System Theory as well as by Stochastics and Statistics. The following presentation of this new, demanding and somewhat irritating definition of the terms process and system as a dual pair is unusual indeed, but it opens the door widely to a better and more consistent discussion and understanding of manifold scientific tools in many areas. Metrology [4] is one of the important fields of concern, for many reasons: one group of the soft and hard links between the domain of processes and the domain of systems is realised by the concepts of measurement science on the one hand and by the instrumental tools of measurement technology on the other.

  20. A broadband multimedia TeleLearning system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Ruiping; Karmouch, A.

    1996-12-31

    In this paper we discuss a broadband multimedia TeleLearning system under development in the Multimedia Information Research Laboratory at the University of Ottawa. The system aims at providing a seamless environment for TeleLearning using the latest telecommunication and multimedia information processing technology. It basically consists of a media production center, a courseware author site, a courseware database, a courseware user site, and an on-line facilitator site. All these components are distributed over an ATM network and work together to offer a multimedia interactive courseware service. An MHEG-based model is exploited in designing the system architecture to achieve real-time, interactive, and reusable information interchange across heterogeneous platforms. The system architecture, courseware processing strategies, and courseware document models are presented.

  1. REPHLEX II: An information management system for the ARS Water Data Base

    NASA Astrophysics Data System (ADS)

    Thurman, Jane L.

    1993-08-01

    The REPHLEX II computer system is an on-line information management system which allows scientists, engineers, and other researchers to retrieve data from the ARS Water Data Base using asynchronous communications. The system features two phone lines handling baud rates from 300 to 2400, customized menus to facilitate browsing, help screens, direct access to information and data files, electronic mail processing, file transfers using the XMODEM protocol, and log-in procedures which capture information on new users, process passwords, and log activity for a permanent audit trail. The primary data base on the REPHLEX II system is the ARS Water Data Base which consists of rainfall and runoff data from experimental agricultural watersheds located in the United States.

  2. Upper and lower bounds for semi-Markov reliability models of reconfigurable systems

    NASA Technical Reports Server (NTRS)

    White, A. L.

    1984-01-01

    This paper determines the information required about system recovery to compute the reliability of a class of reconfigurable systems. Upper and lower bounds are derived for these systems. The class consists of those systems that satisfy five assumptions: the components fail independently at a low constant rate, fault occurrence and system reconfiguration are independent processes, the reliability model is semi-Markov, the recovery functions which describe system configuration have small means and variances, and the system is well designed. The bounds are easy to compute, and examples are included.
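    For orientation, the first assumption fixes the component-level model (standard reliability theory, not a result of the paper): a component with constant failure rate \(\lambda\) survives to time \(t\) with probability

```latex
R(t) = e^{-\lambda t} \approx 1 - \lambda t \quad (\lambda t \ll 1),
```

    and the derived upper and lower bounds then account for the effect of the recovery behavior during reconfiguration on the reliability of the system as a whole.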

  3. The swiss army knife of job submission tools: grid-control

    NASA Astrophysics Data System (ADS)

    Stober, F.; Fischer, M.; Schleper, P.; Stadie, H.; Garbers, C.; Lange, J.; Kovalchuk, N.

    2017-10-01

    grid-control is a lightweight and highly portable open source submission tool that supports all common workflows in high energy physics (HEP). It has been used by a sizeable number of HEP analyses to process tasks that sometimes consist of up to 100k jobs. grid-control is built around a powerful plugin and configuration system that allows users to easily specify all aspects of the desired workflow. Job submission to a wide range of local or remote batch systems or grid middleware is supported. Tasks can be conveniently specified through the parameter space that will be processed, which can consist of any number of variables and data sources with complex dependencies on each other. Dataset information is processed through a configurable pipeline of dataset filters, partition plugins and partition filters. The partition plugins can take the number of files, the size of the work units, metadata, or combinations thereof into account. All changes to the input datasets or variables are propagated through the processing pipeline and can transparently trigger adjustments to the parameter space and the job submission. While the core functionality is completely experiment independent, full integration with the CMS computing environment is provided by a small set of plugins.

  4. Ecosystem Services and Climate Change Considerations for ...

    EPA Pesticide Factsheets

    Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and land-use change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease, in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technological basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework "iemWatersheds" has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM), which automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES), which manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE), which provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water

  5. New Zealand doctors' attitudes towards the complaints and disciplinary process.

    PubMed

    Cunningham, Wayne

    2004-07-23

    To examine attitudes held by doctors in New Zealand towards the complaints and disciplinary process, a questionnaire was sent to New Zealand doctors randomly selected to include vocationally registered general practitioners, vocationally registered hospital-based specialists, and general registrants. The 598 respondents (33.6% having ever and 66.4% having never received a medical complaint) indicated that New Zealand doctors strongly support society's right to complain, lay input, a sense of completion, and appropriate advice being provided to the complaints process. Doctors also support society's notions of rights and responsibilities, and believe that the medical profession is capable of self-regulation. Fifty percent of doctors do not believe that complaints are a useful tool to improve medical practice. Doctors' attitudes diverge about how they believe society interacts with the profession through the complaints process. They are divided in their opinion as to whether complaints are warranted, whether complainants are normal people, and whether complaints are judged by appropriate standards. Doctors' attitudes towards the complaints and disciplinary system fall on a continuum between consistent and divergent. Their attitudes are consistent with notions of professionalism, but suggest that using the complaints system to improve the delivery of medical care may be problematic.

  6. Process description language: an experiment in robust programming for manufacturing systems

    NASA Astrophysics Data System (ADS)

    Spooner, Natalie R.; Creak, G. Alan

    1998-10-01

    Maintaining stable, robust, and consistent software is difficult in the face of the increasing rate of change of customers' preferences, materials, manufacturing techniques, computer equipment, and other characteristic features of manufacturing systems. It is argued that software is commonly difficult to keep up to date because many of the implications of these changing features for software details are obscure. A possible solution is to use a software generation system in which the transformation of system properties into system software is made explicit. The proposed generation system stores the system properties, such as machine properties, product properties, and information on manufacturing techniques, in databases. As a result, this information, on which system control is based, can also be made available to other programs. In particular, artificial intelligence programs, such as fault diagnosis programs, can benefit from using the same information as the control system, rather than a separate database which must be developed and maintained separately to ensure consistency. Experience in developing a simplified model of such a system is presented.

  7. Vision Algorithms to Determine Shape and Distance for Manipulation of Unmodeled Objects

    NASA Technical Reports Server (NTRS)

    Montes, Leticia; Bowers, David; Lumia, Ron

    1998-01-01

    This paper discusses the development of a robotic system for general use in an unstructured environment, illustrated through pick and place of randomly positioned, un-modeled objects. There are many applications for this project, including rock collection for the Mars Surveyor Program. The system is demonstrated with a Puma560 robot, Barrett hand, Cognex vision system, and Cimetrix simulation and control, all running on a PC. The demonstration consists of two processes: vision and robotics. The vision process determines the size and location of the unknown objects. The robotics process consists of moving the robot to the object, configuring the hand based on the information from the vision system, and then performing the pick/place operation, as sketched below. This work enhances, and is a part of, the Low Cost Virtual Collaborative Environment, which provides remote simulation and control of equipment.
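    A schematic of that flow, with hypothetical stand-ins for the Cognex, Puma560, and Barrett interfaces (none of these names or calls come from the paper):

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Hypothetical vision-system output for one unmodeled object."""
    x: float       # centroid in the robot frame (m)
    y: float
    width: float   # estimated width (m), drives the hand preshape

def plan_grasp(det: Detection, max_aperture: float = 0.15):
    """Choose a hand aperture from the vision system's size estimate."""
    aperture = min(det.width * 1.2, max_aperture)  # 20% clearance margin
    return {"goal": (det.x, det.y), "aperture": aperture}

def pick_and_place(detections, place_at=(0.60, 0.00)):
    for det in detections:
        grasp = plan_grasp(det)
        # Real system: move to goal, preshape hand, close, move, release.
        print(f"pick at {grasp['goal']}, aperture {grasp['aperture']:.3f} m, "
              f"place at {place_at}")

pick_and_place([Detection(0.30, 0.12, 0.05), Detection(0.45, -0.08, 0.09)])
```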

  8. The MSFC Collaborative Engineering Process for Preliminary Design and Concept Definition Studies

    NASA Technical Reports Server (NTRS)

    Mulqueen, Jack; Jones, David; Hopkins, Randy

    2011-01-01

    This paper describes a collaborative engineering process developed by the Marshall Space Flight Center's Advanced Concepts Office for performing rapid preliminary design and mission concept definition studies for potential future NASA missions. The process has been developed and demonstrated for a broad range of mission studies, including human space exploration missions, space transportation system studies, and in-space science missions. The paper describes the design team structure and the specialized analytical tools that have been developed to enable a uniquely rapid design process. The collaborative engineering process consists of an integrated analysis approach for mission definition, vehicle definition, and system engineering. The relevance of the collaborative process elements to the standard NASA NPR 7120.1 system engineering process will be demonstrated. The study definition process flow for each study discipline will be outlined, beginning with the study planning process, followed by definition of ground rules and assumptions, definition of study trades, and mission and subsystem analyses leading to a standardized set of mission concept study products. The flexibility of the collaborative engineering design process to accommodate a wide range of study objectives, from technology definition and requirements definition to preliminary design studies, will be addressed. The paper will also describe the applicability of the collaborative engineering process to an integrated systems analysis approach for evaluating the functional requirements of evolving system technologies and capabilities needed to meet the needs of future NASA programs.

  9. NASA Systems Engineering Handbook

    NASA Technical Reports Server (NTRS)

    Hirshorn, Steven R.; Voss, Linda D.; Bromley, Linda K.

    2017-01-01

    The update of this handbook continues the methodology of the previous revision: a top-down compatibility with higher level Agency policy and a bottom-up infusion of guidance from the NASA practitioners in the field. This approach provides the opportunity to obtain best practices from across NASA and bridge the information to the established NASA systems engineering processes and to communicate principles of good practice as well as alternative approaches rather than specify a particular way to accomplish a task. The result embodied in this handbook is a top-level implementation approach on the practice of systems engineering unique to NASA. Material used for updating this handbook has been drawn from many sources, including NPRs, Center systems engineering handbooks and processes, other Agency best practices, and external systems engineering textbooks and guides. This handbook consists of six chapters: (1) an introduction, (2) a systems engineering fundamentals discussion, (3) the NASA program project life cycles, (4) systems engineering processes to get from a concept to a design, (5) systems engineering processes to get from a design to a final product, and (6) crosscutting management processes in systems engineering. The chapters are supplemented by appendices that provide outlines, examples, and further information to illustrate topics in the chapters. The handbook makes extensive use of boxes and figures to define, refine, illustrate, and extend concepts in the chapters.

  10. Optimization of the production process using virtual model of a workspace

    NASA Astrophysics Data System (ADS)

    Monica, Z.

    2015-11-01

    Optimization of the production process is an element of the design cycle consisting of problem definition, modelling, simulation, optimization and implementation. Without the use of simulation techniques, the only thing that can be achieved is a larger or smaller improvement of the process, not optimization (i.e., the best result obtainable for the conditions under which the process operates). Optimization generally consists of management actions that ultimately bring savings in time, resources, and raw materials and improve the performance of a specific process, whether a service or a manufacturing process. It generates savings by improving processes and increasing their efficiency, and consists primarily of organizational activities that require very little investment or rely solely on changing the organization of work. Modern companies operating in a market economy show a significant increase in interest in modern methods of production and service management. This trend is due to high competitiveness among companies: those that want to succeed are forced to continually modify the way they manage and to respond flexibly to changing demand. Modern methods of production management not only imply a stable position for the company in its sector, but also improve health and safety within the company and contribute to the implementation of more efficient rules for the standardization of work. This is why the paper presents the application of an environment such as Siemens NX to create a virtual model of a production system and to simulate and optimize its operation. The analyzed system is a robotized workcell consisting of machine tools, industrial robots, conveyors, auxiliary equipment and buffers. The control program realizing the main task in the virtual workcell can be defined within the environment. Using this tool, it is possible to optimize both the object trajectory and the cooperation process.

  11. High speed real-time wavefront processing system for a solid-state laser system

    NASA Astrophysics Data System (ADS)

    Liu, Yuan; Yang, Ping; Chen, Shanqiu; Ma, Lifang; Xu, Bing

    2008-03-01

    A high speed real-time wavefront processing system for a solid-state laser beam cleanup system has been built. The system consists of a Core 2 Industrial PC (IPC) running Linux and the real-time Linux (RT-Linux) operating system (OS), a PCI image grabber, and a D/A card. More often than not, the phase aberrations of the output beam from solid-state lasers vary rapidly with intracavity thermal effects and environmental influences. To compensate for the phase aberrations of solid-state lasers successfully, a high speed real-time wavefront processing system is presented. Compared to former systems, this system improves processing speed considerably. In the new system, the acquisition of image data, the output of control voltage data and the implementation of the reconstructor control algorithm are treated as real-time tasks in kernel space, while the display of wavefront information and man-machine interaction are treated as non-real-time tasks in user space. Parallel processing of the real-time tasks in Symmetric Multi-Processing (SMP) mode is the main strategy for improving speed, as illustrated below. In this paper, the performance and efficiency of this wavefront processing system are analyzed. Open-loop experimental results show that the sampling frequency of this system is up to 3300 Hz, and that the system copes well with phase aberrations from solid-state lasers.
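
    The task partitioning described above can be mocked in a few lines. The sketch below is not the authors' RT-Linux kernel-space code; it is a plain Python stand-in, with a made-up frame size and reconstructor matrix, showing acquisition, reconstruction, and voltage output running as separate pipelined processes.

```python
# Schematic (non-real-time) mock of the acquisition/reconstruction/output split.
import multiprocessing as mp
import numpy as np

def acquire(frames_q, n_frames=100):
    for _ in range(n_frames):
        frames_q.put(np.random.rand(64))   # stand-in for a grabbed sensor frame
    frames_q.put(None)                     # sentinel: end of stream

def reconstruct(frames_q, volts_q):
    R = np.random.rand(32, 64)             # stand-in reconstructor matrix
    while (frame := frames_q.get()) is not None:
        volts_q.put(R @ frame)             # slopes -> actuator voltages
    volts_q.put(None)

def output(volts_q):
    count = 0
    while volts_q.get() is not None:       # stand-in for the D/A card write
        count += 1
    print(f"wrote {count} voltage vectors")

if __name__ == "__main__":
    frames_q, volts_q = mp.Queue(), mp.Queue()
    procs = [mp.Process(target=acquire, args=(frames_q,)),
             mp.Process(target=reconstruct, args=(frames_q, volts_q)),
             mp.Process(target=output, args=(volts_q,))]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```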

  12. [Process orientation as a tool of strategic approaches to corporate governance and integrated management systems].

    PubMed

    Sens, Brigitte

    2010-01-01

    The concept of general process orientation as an instrument of organisation development is the core principle of quality management philosophy, i.e. the learning organisation. Accordingly, prestigious quality awards and certification systems focus on process configuration and continual improvement. In German health care organisations, particularly in hospitals, this general process orientation has not been widely implemented yet - despite enormous change dynamics and the requirements of both quality and economic efficiency of health care processes. But based on a consistent process architecture that considers key processes as well as management and support processes, the strategy of excellent health service provision including quality, safety and transparency can be realised in daily operative work. The core elements of quality (e.g., evidence-based medicine), patient safety and risk management, environmental management, health and safety at work can be embedded in daily health care processes as an integrated management system (the "all in one system" principle). Sustainable advantages and benefits for patients, staff, and the organisation will result: stable, high-quality, efficient, and indicator-based health care processes. Hospitals with their broad variety of complex health care procedures should now exploit the full potential of total process orientation. Copyright © 2010. Published by Elsevier GmbH.

  13. Study on photochemical analysis system (VLES) for EUV lithography

    NASA Astrophysics Data System (ADS)

    Sekiguchi, A.; Kono, Y.; Kadoi, M.; Minami, Y.; Kozawa, T.; Tagawa, S.; Gustafson, D.; Blackborow, P.

    2007-03-01

    A system for photochemical analysis of EUV lithography processes has been developed. The system consists of three units: (1) an exposure unit that uses the Z-Pinch (Energetiq Tech.) EUV light source (DPP) to carry out a flood exposure, (2) a measurement system, RDA (Litho Tech Japan), for the development rate of photoresists, and (3) a simulation unit that utilizes PROLITH (KLA-Tencor) to calculate resist profiles and process latitude using the measured development rate data. With this system, preliminary evaluation of the performance of EUV lithography can be performed without any lithography tool (stepper or scanner system) capable of imaging and alignment. Profiles for a 32 nm line-and-space pattern are simulated for an EUV resist (Posi-2 resist by TOK) using VLES, which has sensitivity at the 13.5 nm wavelength. The simulation successfully predicts the resist behavior. Thus it is confirmed that the system enables efficient evaluation of the performance of EUV lithography processes.

  14. Using task analysis to understand the Data System Operations Team

    NASA Technical Reports Server (NTRS)

    Holder, Barbara E.

    1994-01-01

    The Data Systems Operations Team (DSOT) currently monitors the Multimission Ground Data System (MGDS) at JPL. The MGDS currently supports five spacecraft and within the next five years, it will support ten spacecraft simultaneously. The ground processing element of the MGDS consists of a distributed UNIX-based system of over 40 nodes and 100 processes. The MGDS system provides operators with little or no information about the system's end-to-end processing status or end-to-end configuration. The lack of system visibility has become a critical issue in the daily operation of the MGDS. A task analysis was conducted to determine what kinds of tools were needed to provide DSOT with useful status information and to prioritize the tool development. The analysis provided the formality and structure needed to get the right information exchange between development and operations. How even a small task analysis can improve developer-operator communications is described, and the challenges associated with conducting a task analysis in a real-time mission operations environment are examined.

  15. Cognitive load disrupts implicit theory-of-mind processing.

    PubMed

    Schneider, Dana; Lam, Rebecca; Bayliss, Andrew P; Dux, Paul E

    2012-08-01

    Eye movements in Sally-Anne false-belief tasks appear to reflect the ability to implicitly monitor the mental states of other individuals (theory of mind, or ToM). It has recently been proposed that an early-developing, efficient, and automatically operating ToM system subserves this ability. Surprisingly absent from the literature, however, is an empirical test of the influence of domain-general executive processing resources on this implicit ToM system. In the study reported here, a dual-task method was employed to investigate the impact of executive load on eye movements in an implicit Sally-Anne false-belief task. Under no-load conditions, adult participants displayed eye movement behavior consistent with implicit belief processing, whereas evidence for belief processing was absent for participants under cognitive load. These findings indicate that the cognitive system responsible for implicitly tracking beliefs draws at least minimally on executive processing resources. Thus, even the most low-level processing of beliefs appears to reflect a capacity-limited operation.

  16. Agile based "Semi-"Automated Data ingest process : ORNL DAAC example

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Hook, L.; Wei, Y.; Wright, D.

    2015-12-01

    The ORNL DAAC archives and publishes data and information relevant to biogeochemical, ecological, and environmental processes. The data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. The ORNL DAAC ingest team curates diverse data sets from multiple data providers simultaneously. To streamline the ingest process, the data set submission process at the ORNL DAAC has recently been updated to use an agile process, and a semi-automated workflow system has been developed to provide a consistent data provider experience and to create a uniform data product. The goals of the semi-automated agile ingest process are to: (1) provide the ability to track a data set from acceptance to publication, (2) automate steps that can be automated to improve efficiency and reduce redundancy, (3) update the legacy ingest infrastructure, and (4) provide a centralized system to manage the various aspects of ingest (a state-tracking sketch follows below). This talk will cover the agile methodology, workflow, and tools developed through this system.
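
    A minimal sketch of the state tracking such a workflow system provides, assuming hypothetical state names rather than the ORNL DAAC's actual ingest stages: each data set advances through named states, and illegal jumps are rejected.

```python
# Toy ingest state machine (state names are illustrative).
from enum import Enum, auto

class IngestState(Enum):
    ACCEPTED = auto()
    FORMAT_CHECKED = auto()
    DOCUMENTED = auto()
    ARCHIVED = auto()
    PUBLISHED = auto()

TRANSITIONS = {
    IngestState.ACCEPTED: {IngestState.FORMAT_CHECKED},
    IngestState.FORMAT_CHECKED: {IngestState.DOCUMENTED},
    IngestState.DOCUMENTED: {IngestState.ARCHIVED},
    IngestState.ARCHIVED: {IngestState.PUBLISHED},
    IngestState.PUBLISHED: set(),
}

class DataSet:
    def __init__(self, name):
        self.name, self.state = name, IngestState.ACCEPTED

    def advance(self, new_state):
        if new_state not in TRANSITIONS[self.state]:
            raise ValueError(f"{self.name}: {self.state.name} -> "
                             f"{new_state.name} not allowed")
        self.state = new_state

ds = DataSet("example_dataset")        # placeholder data set name
ds.advance(IngestState.FORMAT_CHECKED)
print(ds.name, "is now", ds.state.name)
```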

  17. A real time mobile-based face recognition with fisherface methods

    NASA Astrophysics Data System (ADS)

    Arisandi, D.; Syahputra, M. F.; Putri, I. L.; Purnamawati, S.; Rahmat, R. F.; Sari, P. P.

    2018-03-01

    Face recognition is a research field in computer vision that studies how to learn faces and determine the identity of a face from a picture sent to the system. By utilizing face recognition technology, learning the identities of fellow students at a university becomes simpler: a student no longer needs to browse the student directory on the university's server site to look for a person with certain facial traits. To reach this goal, the face recognition application uses an image processing pipeline consisting of two phases, a pre-processing phase and a recognition phase. In the pre-processing phase, the system converts the input image into the best possible image for the recognition phase; the purpose is to reduce noise and increase the signal in the image. For the recognition phase, we use the Fisherface method, chosen because it copes well with limited training data (a sketch follows below). In our experiments, the accuracy of face recognition using Fisherface is 90%.
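
    The Fisherface method is classically implemented as PCA for dimensionality reduction followed by Fisher's linear discriminant. A minimal scikit-learn sketch on synthetic stand-in "face" vectors (a real system would use aligned, cropped grayscale face images):

```python
# Fisherface-style pipeline: PCA followed by linear discriminant analysis.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_people, per_person, dim = 5, 20, 256
X = np.vstack([rng.normal(loc=i, scale=2.0, size=(per_person, dim))
               for i in range(n_people)])          # stand-in face vectors
y = np.repeat(np.arange(n_people), per_person)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
fisherface = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
fisherface.fit(X_tr, y_tr)
print("recognition accuracy:", fisherface.score(X_te, y_te))
```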

  18. Ergonomics action research II: a framework for integrating HF into work system design.

    PubMed

    Neumann, W P; Village, J

    2012-01-01

    This paper presents a conceptual framework that can support efforts to integrate human factors (HF) into the work system design process, where improved and cost-effective application of HF is possible. The framework advocates strategies of broad stakeholder participation, linking of performance and health goals, and process focussed change tools that can help practitioners engage in improvements to embed HF into a firm's work system design process. Recommended tools include business process mapping of the design process, implementing design criteria, using cognitive mapping to connect to managers' strategic goals, tactical use of training and adopting virtual HF (VHF) tools to support the integration effort. Consistent with organisational change research, the framework provides guidance but does not suggest a strict set of steps. This allows more adaptability for the practitioner who must navigate within a particular organisational context to secure support for embedding HF into the design process for improved operator wellbeing and system performance. There has been little scientific literature about how a practitioner might integrate HF into a company's work system design process. This paper proposes a framework for this effort by presenting a coherent conceptual framework, process tools, design tools and procedural advice that can be adapted for a target organisation.

  19. A KPI-based process monitoring and fault detection framework for large-scale processes.

    PubMed

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang

    2017-05-01

    Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches are not developed with a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrument variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
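
    The static, least-squares flavor of the framework can be sketched as follows: regress the KPI on process variables from normal operation, then flag samples whose prediction residual exceeds a limit set from the training residuals. The synthetic data, 3-sigma limit, and additive fault below are illustrative assumptions, not the paper's exact test statistic.

```python
# Least-squares KPI model with residual-based fault detection (toy data).
import numpy as np

rng = np.random.default_rng(1)
n, p = 500, 6
X = rng.normal(size=(n, p))                       # normal-operation process data
theta_true = rng.normal(size=p)
y = X @ theta_true + 0.1 * rng.normal(size=n)     # KPI measurements

theta, *_ = np.linalg.lstsq(X, y, rcond=None)     # least-squares KPI model
resid = y - X @ theta
threshold = 3.0 * resid.std()                     # simple 3-sigma alarm limit

x_new = rng.normal(size=p)
y_new = x_new @ theta_true + 1.5                  # additive fault on the KPI
print("fault detected:", bool(abs(y_new - x_new @ theta) > threshold))
```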

  20. When static media promote active learning: annotated illustrations versus narrated animations in multimedia instruction.

    PubMed

    Mayer, Richard E; Hegarty, Mary; Mayer, Sarah; Campbell, Julie

    2005-12-01

    In 4 experiments, students received a lesson consisting of computer-based animation and narration or a lesson consisting of paper-based static diagrams and text. The lessons used the same words and graphics in the paper-based and computer-based versions to explain the process of lightning formation (Experiment 1), how a toilet tank works (Experiment 2), how ocean waves work (Experiment 3), and how a car's braking system works (Experiment 4). On subsequent retention and transfer tests, the paper group performed significantly better than the computer group on 4 of 8 comparisons, and there was no significant difference on the rest. These results support the static media hypothesis, in which static illustrations with printed text reduce extraneous processing and promote germane processing as compared with narrated animations.

  1. Sensitivity Analysis of Algan/GAN High Electron Mobility Transistors to Process Variation

    DTIC Science & Technology

    2008-02-01

    delivery system gas panel including both hydride and alkyl delivery modules and the vent/valve configurations [14... Reactor Gas Delivery Systems: A basic schematic diagram of an MOCVD reactor delivery gas panel is shown in Figure 13. The reactor gas delivery... system, or gas panel, consists of a network of stainless steel tubing, automatic valves and electronic mass flow controllers (MFC). There are separate

  2. Technical Standards for Command and Control Information Systems (CCISs) and Information Technology

    DTIC Science & Technology

    1994-02-01

    formatting, transmitting, receiving, and processing imagery and imagery-related information. The NITFS is in essence the suite of individual standards... also known as Limited Operational Capability-Europe) and the German Joint Analysis System Military Intelligence (JASMIN). Among the approaches being... in essence, the other systems utilize a one-level address space where addressing consists of identifying the fire support unit. However, AFATDS utilizes a two

  3. Automating Rule Strengths in Expert Systems.

    DTIC Science & Technology

    1987-05-01

    systems were designed in an incremental, iterative way. One of the most easily identifiable phases in this process, sometimes called tuning, consists... attenuators. The designer of the knowledge-based system must determine (synthesize) or adjust (refine, if estimates of the values are given) these... values. We consider two ways in which the designer can learn the values. We call the first model of learning the complete case and the second model the

  4. Development of megasonic cleaning for silicon wafers

    NASA Technical Reports Server (NTRS)

    Mayer, A.

    1980-01-01

    A cleaning and drying system for processing at least 2500 three in. diameter wafers per hour was developed with a reduction in process cost. The system consists of an ammonia hydrogen peroxide bath in which both surfaces of 3/32 in. spaced, ion implanted wafers are cleaned in quartz carriers moved on a belt past two pairs of megasonic transducers. The wafers are dried in the novel room temperature, high velocity air dryer in the same carriers used for annealing. A new laser scanner was used effectively to monitor the cleaning ability on a sampling basis.

  5. Application of importance sampling to the computation of large deviations in nonequilibrium processes.

    PubMed

    Kundu, Anupam; Sabhapandit, Sanjib; Dhar, Abhishek

    2011-03-01

    We present an algorithm for finding the probabilities of rare events in nonequilibrium processes. The algorithm consists of evolving the system with a modified dynamics for which the required event occurs more frequently. By keeping track of the relative weight of phase-space trajectories generated by the modified and the original dynamics one can obtain the required probabilities. The algorithm is tested on two model systems of steady-state particle and heat transport where we find a huge improvement from direct simulation methods.
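
    The reweighting idea can be illustrated on a toy rare event with Gaussian steps (a simpler setting than the transport models studied in the paper): simulate a tilted dynamics under which the event is common, then correct each trajectory by the likelihood ratio of the original to the modified dynamics.

```python
# Importance sampling of a rare event P(S > a) via exponentially tilted steps.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
n_steps, a, n_traj = 50, 25.0, 100_000
shift = a / n_steps        # tilt each step so the rare event becomes typical

steps = rng.normal(loc=shift, size=(n_traj, n_steps))   # modified dynamics
S = steps.sum(axis=1)
# Likelihood ratio of the original (mean 0) to the modified (mean `shift`) law:
log_w = -shift * S + 0.5 * n_steps * shift**2
estimate = np.mean((S > a) * np.exp(log_w))

exact = norm.sf(a / np.sqrt(n_steps))   # S ~ N(0, n_steps) under the original law
print(f"importance sampling: {estimate:.3e}   exact: {exact:.3e}")
```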

  6. Characterizing the Processes for Navigating Internet Health Information Using Real-Time Observations: A Mixed-Methods Approach.

    PubMed

    Perez, Susan L; Paterniti, Debora A; Wilson, Machelle; Bell, Robert A; Chan, Man Shan; Villareal, Chloe C; Nguyen, Hien Huy; Kravitz, Richard L

    2015-07-20

    Little is known about the processes people use to find health-related information on the Internet or the individual characteristics that shape selection of information-seeking approaches. Our aim was to describe the processes by which users navigate the Internet for information about a hypothetical acute illness and to identify individual characteristics predictive of their information-seeking strategies. Study participants were recruited from public settings and agencies. Interested individuals were screened for eligibility using an online questionnaire. Participants listened to one of two clinical scenarios—consistent with influenza or bacterial meningitis—and then conducted an Internet search. Screen-capture video software captured Internet search mouse clicks and keystrokes. Each step of the search was coded as hypothesis testing (etiology), evidence gathering (symptoms), or action/treatment seeking (behavior). The coded steps were used to form a step-by-step pattern of each participant's information-seeking process. A total of 78 Internet health information seekers ranging from 21-35 years of age and who experienced barriers to accessing health care services participated. We identified 27 unique patterns of information seeking, which were grouped into four overarching classifications based on the number of steps taken during the search, whether a pattern consisted of developing a hypothesis and exploring symptoms before ending the search or searching an action/treatment, and whether a pattern ended with action/treatment seeking. Applying dual-processing theory, we categorized the four overarching pattern classifications as either System 1 (41%, 32/78), unconscious, rapid, automatic, and high capacity processing; or System 2 (59%, 46/78), conscious, slow, and deliberative processing. Using multivariate regression, we found that System 2 processing was associated with higher education and younger age. We identified and classified two approaches to processing Internet health information. System 2 processing, a methodical approach, most resembles the strategies for information processing that have been found in other studies to be associated with higher-quality decisions. We conclude that the quality of Internet health-information seeking could be improved through consumer education on methodical Internet navigation strategies and the incorporation of decision aids into health information websites.

  7. Characterizing the Processes for Navigating Internet Health Information Using Real-Time Observations: A Mixed-Methods Approach

    PubMed Central

    Paterniti, Debora A; Wilson, Machelle; Bell, Robert A; Chan, Man Shan; Villareal, Chloe C; Nguyen, Hien Huy; Kravitz, Richard L

    2015-01-01

    Background Little is known about the processes people use to find health-related information on the Internet or the individual characteristics that shape selection of information-seeking approaches. Objective Our aim was to describe the processes by which users navigate the Internet for information about a hypothetical acute illness and to identify individual characteristics predictive of their information-seeking strategies. Methods Study participants were recruited from public settings and agencies. Interested individuals were screened for eligibility using an online questionnaire. Participants listened to one of two clinical scenarios—consistent with influenza or bacterial meningitis—and then conducted an Internet search. Screen-capture video software captured Internet search mouse clicks and keystrokes. Each step of the search was coded as hypothesis testing (etiology), evidence gathering (symptoms), or action/treatment seeking (behavior). The coded steps were used to form a step-by-step pattern of each participant’s information-seeking process. A total of 78 Internet health information seekers ranging from 21-35 years of age and who experienced barriers to accessing health care services participated. Results We identified 27 unique patterns of information seeking, which were grouped into four overarching classifications based on the number of steps taken during the search, whether a pattern consisted of developing a hypothesis and exploring symptoms before ending the search or searching an action/treatment, and whether a pattern ended with action/treatment seeking. Applying dual-processing theory, we categorized the four overarching pattern classifications as either System 1 (41%, 32/78), unconscious, rapid, automatic, and high capacity processing; or System 2 (59%, 46/78), conscious, slow, and deliberative processing. Using multivariate regression, we found that System 2 processing was associated with higher education and younger age. Conclusions We identified and classified two approaches to processing Internet health information. System 2 processing, a methodical approach, most resembles the strategies for information processing that have been found in other studies to be associated with higher-quality decisions. We conclude that the quality of Internet health-information seeking could be improved through consumer education on methodical Internet navigation strategies and the incorporation of decision aids into health information websites. PMID:26194787

  8. An Attachable Electromagnetic Energy Harvester Driven Wireless Sensing System Demonstrating Milling-Processes and Cutter-Wear/Breakage-Condition Monitoring.

    PubMed

    Chung, Tien-Kan; Yeh, Po-Chen; Lee, Hao; Lin, Cheng-Mao; Tseng, Chia-Yung; Lo, Wen-Tuan; Wang, Chieh-Min; Wang, Wen-Chin; Tu, Chi-Jen; Tasi, Pei-Yuan; Chang, Jui-Wen

    2016-02-23

    An attachable electromagnetic-energy-harvester driven wireless vibration-sensing system for monitoring milling processes and cutter-wear/breakage conditions is demonstrated. The system includes an electromagnetic energy harvester, three single-axis Micro Electro-Mechanical Systems (MEMS) accelerometers, a wireless chip module, and corresponding circuits. The harvester, consisting of magnets with a coil, uses electromagnetic induction to harness the mechanical energy produced by the rotating spindle in milling processes and convert it to electrical output. The electrical output is rectified by the rectification circuit to power the accelerometers and wireless chip module. The harvester, circuits, accelerometers, and wireless chip are integrated as an energy-harvester driven wireless vibration-sensing system; this completes a self-powered wireless vibration sensing system. For system testing, a numerically controlled machining tool with various milling processes is used. According to the test results, the system is fully self-powered and able to successfully sense vibration in the milling processes. Furthermore, by analyzing the vibration signals (i.e., the electrical outputs of the accelerometers), criteria are successfully established for accurate real-time monitoring of the milling processes and cutter conditions (such as cutter-wear conditions and cutter-breakage occurrence). Based on these results, our approach can be applied to most milling and other machining tools in factories to realize smarter machining technologies. A sketch of one simple signal-analysis criterion follows below.
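
    One simple instance of such vibration-signal analysis is band-energy thresholding, sketched here with a made-up sampling rate, tooth-passing band, and alarm ratio; the paper's actual criteria are not reproduced.

```python
# Band-energy wear flag on simulated accelerometer signals.
import numpy as np

fs = 10_000                                 # sampling rate (Hz), assumed
t = np.arange(0, 1.0, 1.0 / fs)
tooth_freq = 400.0                          # hypothetical tooth-passing frequency

def band_energy(signal, lo, hi):
    spec = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, 1.0 / fs)
    return spec[(freqs >= lo) & (freqs <= hi)].sum()

rng = np.random.default_rng(3)
noise = lambda: 0.1 * rng.normal(size=t.size)
healthy = np.sin(2 * np.pi * tooth_freq * t) + noise()     # baseline recording
worn = 2.5 * np.sin(2 * np.pi * tooth_freq * t) + noise()  # elevated vibration

ratio = band_energy(worn, 350, 450) / band_energy(healthy, 350, 450)
print("wear flagged:", ratio > 2.0)         # simple ratio criterion
```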

  9. An Attachable Electromagnetic Energy Harvester Driven Wireless Sensing System Demonstrating Milling-Processes and Cutter-Wear/Breakage-Condition Monitoring

    PubMed Central

    Chung, Tien-Kan; Yeh, Po-Chen; Lee, Hao; Lin, Cheng-Mao; Tseng, Chia-Yung; Lo, Wen-Tuan; Wang, Chieh-Min; Wang, Wen-Chin; Tu, Chi-Jen; Tasi, Pei-Yuan; Chang, Jui-Wen

    2016-01-01

    An attachable electromagnetic-energy-harvester driven wireless vibration-sensing system for monitoring milling processes and cutter-wear/breakage conditions is demonstrated. The system includes an electromagnetic energy harvester, three single-axis Micro Electro-Mechanical Systems (MEMS) accelerometers, a wireless chip module, and corresponding circuits. The harvester, consisting of magnets with a coil, uses electromagnetic induction to harness the mechanical energy produced by the rotating spindle in milling processes and convert it to electrical output. The electrical output is rectified by the rectification circuit to power the accelerometers and wireless chip module. The harvester, circuits, accelerometers, and wireless chip are integrated as an energy-harvester driven wireless vibration-sensing system; this completes a self-powered wireless vibration sensing system. For system testing, a numerically controlled machining tool with various milling processes is used. According to the test results, the system is fully self-powered and able to successfully sense vibration in the milling processes. Furthermore, by analyzing the vibration signals (i.e., the electrical outputs of the accelerometers), criteria are successfully established for accurate real-time monitoring of the milling processes and cutter conditions (such as cutter-wear conditions and cutter-breakage occurrence). Based on these results, our approach can be applied to most milling and other machining tools in factories to realize smarter machining technologies. PMID:26907297

  10. Plans for the development of EOS SAR systems using the Alaska SAR facility. [Earth Observing System (EOS)

    NASA Technical Reports Server (NTRS)

    Carsey, F. D.; Weeks, W.

    1988-01-01

    The Alaska SAR Facility (ASF) program for the acquisition and processing of data from the ESA ERS-1, the NASDA ERS-1, and Radarsat, and for carrying out a program of science investigations using the data, is introduced. Agreements for data acquisition and analysis are in place, except for the agreement between NASA and Radarsat, which is in negotiation. The ASF baseline system, consisting of the Receiving Ground System, the SAR Processor System and the Archive and Operations System, passed critical design review and is fully in the implementation phase. Augmentations to the baseline system, for geophysical processing and for processing of J-ERS-1 optical data, are in the design and implementation phase. The ASF provides a very effective vehicle with which to prepare for the Earth Observing System (EOS): it will aid the development of systems and technologies for handling the data volumes produced by the systems of the next decades, and it will also supply some of the data types that will be produced by EOS.

  11. Multi-channel automotive night vision system

    NASA Astrophysics Data System (ADS)

    Lu, Gang; Wang, Li-jun; Zhang, Yi

    2013-09-01

    A four-channel automotive night vision system is designed and developed. It consists of four active near-infrared cameras and a multi-channel image processing and display unit; the cameras are placed at the front, left, right and rear of the automobile. The system uses a near-infrared laser light source whose beam is collimated; the source contains a thermoelectric cooler (TEC), can be synchronized with camera focusing, and has automatic light intensity adjustment, which together ensure image quality. The composition of the system is described in detail; on this basis, beam collimation, the LD driving and LD temperature control of the near-infrared laser light source, and the four-channel image processing and display are discussed. The system can be used for driver assistance, blind spot information (BLIS), parking assistance and car alarm applications, day and night.

  12. Autonomous characterization of plastic-bonded explosives

    NASA Astrophysics Data System (ADS)

    Linder, Kim Dalton; DeRego, Paul; Gomez, Antonio; Baumgart, Chris

    2006-08-01

    Plastic-Bonded Explosives (PBXs) are a newer generation of explosive compositions developed at Los Alamos National Laboratory (LANL). Understanding the micromechanical behavior of these materials is critical: the size of the crystal particles and the porosity within the PBX influence its shock sensitivity. Current methods to characterize the prominent structural features include manual examination by scientists and attempts to use commercially available image processing packages. Both methods are time-consuming and tedious. LANL personnel, recognizing this as a manually intensive process, have worked with the Kansas City Plant / Kirtland Operations to develop a system which utilizes image processing and pattern recognition techniques to characterize PBX material. System hardware consists of a CCD camera, a zoom lens, a two-dimensional motorized stage, and coaxial, cross-polarized light. System integration of this hardware with the custom software is at the core of the machine vision system. Fundamental processing steps involve capturing images of the PBX specimen and extracting void, crystal, and binder regions. For crystal extraction, a quadtree decomposition segmentation technique is employed (sketched below). Benefits of this system include: (1) reduction of the overall characterization time; (2) a process which is quantifiable and repeatable; (3) utilization of personnel for intelligent review rather than manual processing; and (4) significantly enhanced characterization accuracy.
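
    Quadtree decomposition itself is compact: recursively split an image block into four quadrants until each block is homogeneous (low intensity variance) or minimally small. A minimal sketch with illustrative thresholds and a synthetic image:

```python
# Quadtree decomposition of an image into homogeneous blocks.
import numpy as np

def quadtree(img, x0, y0, size, var_thresh=0.01, min_size=4, leaves=None):
    if leaves is None:
        leaves = []
    block = img[y0:y0 + size, x0:x0 + size]
    if size <= min_size or block.var() <= var_thresh:
        leaves.append((x0, y0, size, float(block.mean())))  # homogeneous leaf
    else:
        half = size // 2
        for dx, dy in [(0, 0), (half, 0), (0, half), (half, half)]:
            quadtree(img, x0 + dx, y0 + dy, half, var_thresh, min_size, leaves)
    return leaves

rng = np.random.default_rng(4)
img = np.zeros((64, 64))
img[16:40, 16:40] = 1.0                  # a bright "crystal" region
img += 0.02 * rng.normal(size=img.shape)
leaves = quadtree(img, 0, 0, 64)
print(f"{len(leaves)} homogeneous blocks found")
```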

  13. Eigensystem realization algorithm user's guide forVAX/VMS computers: Version 931216

    NASA Technical Reports Server (NTRS)

    Pappa, Richard S.

    1994-01-01

    The eigensystem realization algorithm (ERA) is a multiple-input, multiple-output, time domain technique for structural modal identification and minimum-order system realization. Modal identification is the process of calculating structural eigenvalues and eigenvectors (natural vibration frequencies, damping, mode shapes, and modal masses) from experimental data. System realization is the process of constructing state-space dynamic models for modern control design. This user's guide documents VAX/VMS-based FORTRAN software developed by the author since 1984 in conjunction with many applications. It consists of a main ERA program and 66 pre- and post-processors. The software provides complete modal identification capabilities and most system realization capabilities.
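
    The core ERA computation can be sketched in a few lines: assemble block Hankel matrices from impulse-response (Markov parameter) samples, take an SVD, and form a reduced-order state-space realization (A, B, C). The single-input, single-output impulse response below is synthetic, and the sketch omits the accuracy indicators and pre-/post-processing the actual software provides.

```python
# Textbook-style core of ERA on a synthetic SISO impulse response.
import numpy as np

# One lightly damped 5 Hz mode sampled at dt = 0.01 s.
dt, n = 0.01, 200
k = np.arange(n)
h = np.exp(-0.5 * k * dt) * np.sin(2 * np.pi * 5.0 * k * dt)

r = 80                                       # Hankel matrix block size
H0 = np.array([[h[i + j] for j in range(r)] for i in range(r)])
H1 = np.array([[h[i + j + 1] for j in range(r)] for i in range(r)])

U, s, Vt = np.linalg.svd(H0)
order = 2                                    # keep the dominant singular pair
Ur, Vr = U[:, :order], Vt[:order, :].T
S_half = np.diag(np.sqrt(s[:order]))
S_half_inv = np.diag(1.0 / np.sqrt(s[:order]))

A = S_half_inv @ Ur.T @ H1 @ Vr @ S_half_inv   # identified state matrix
B = (S_half @ Vr.T)[:, :1]                     # input matrix (first column)
C = (Ur @ S_half)[:1, :]                       # output matrix (first row)

freqs = np.abs(np.angle(np.linalg.eigvals(A))) / (2 * np.pi * dt)
print("identified modal frequency (Hz):", np.round(freqs, 3))   # ~[5. 5.]
```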

  14. Modeling and simulation of enzymatic gluconic acid production using immobilized enzyme and CSTR-PFTR circulation reaction system.

    PubMed

    Li, Can; Lin, Jianqun; Gao, Ling; Lin, Huibin; Lin, Jianqiang

    2018-04-01

    Gluconic acid was produced using an immobilized enzyme in a continuous stirred tank reactor-plug flow tubular reactor (CSTR-PFTR) circulation reaction system. The production system consists of a continuous stirred tank reactor (CSTR) for pH control and liquid storage and a plug flow tubular reactor (PFTR) filled with immobilized glucose oxidase (GOD) for gluconic acid production. A mathematical model is developed for this production system and the enzymatic reaction process is simulated; the pH inhibition effect on GOD is modeled using a bell-type curve (a sketch follows below). Gluconic acid can be efficiently produced using the reaction system, and the mathematical model developed for this system simulates and predicts the process well.
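
    A minimal sketch of the modeling idea, with illustrative kinetic constants and bell-curve parameters rather than the fitted values from the paper: enzyme activity is scaled by a bell-type pH factor inside a Michaelis-Menten rate, and the mass balance is integrated over time.

```python
# Bell-type pH inhibition inside a simple enzymatic reaction model.
import numpy as np
from scipy.integrate import solve_ivp

def ph_factor(ph, ph_opt=5.5, width=1.2):
    """Bell-type pH curve: activity peaks at ph_opt (illustrative values)."""
    return np.exp(-((ph - ph_opt) / width) ** 2)

def rates(t, y, vmax=2.0, km=5.0, ph=5.0):
    glucose, acid = y
    v = vmax * ph_factor(ph) * glucose / (km + glucose)  # Michaelis-Menten rate
    return [-v, v]            # glucose consumed, gluconic acid formed

sol = solve_ivp(rates, (0.0, 10.0), y0=[50.0, 0.0])
print(f"after 10 h: glucose = {sol.y[0, -1]:.1f}, "
      f"gluconic acid = {sol.y[1, -1]:.1f}")
```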

  15. Within-subject neural reactivity to reward and threat is inverted in young adolescents.

    PubMed

    Thomason, M E; Marusak, H A

    2017-07-01

    As children mature, they become increasingly independent and less reliant on caregiver support. Changes in brain systems are likely to stimulate and guide this process. One mechanistic hypothesis suggests that changes in neural systems that process reward and threat support the increase in exploratory behavior observed in the transition to adolescence. This study examines the basic tenets of this hypothesis by performing functional magnetic resonance imaging (fMRI) during well-established reward and threat processing tasks in 40 children and adolescents, aged 9-15 years. fMRI responses in the striatum and amygdala are fit to a model predicting that striatal reward and amygdala threat-responses will be unrelated in younger participants (aged 9-12 years), while older participants (aged 13-15 years) will differentially engage these structures. Our data are consistent with this model. Activity in the striatum and amygdala are comparable in younger children, but in older children, they are inversely related; those more responsive to reward show a reduced threat-response. Analyses testing age as a continuous variable yield consistent results. In addition, the proportion of threat to reward-response relates to self-reported approach behavior in older but not younger youth, exposing behavioral relevance in the relative level of activity in these structures. Results are consistent with the notion that both individual and developmental differences drive reward-seeking behavior in adolescence. While these response patterns may serve adaptive functions in the shift to independence, skew in these systems may relate to increased rates of emotional psychopathology and risk-taking observed in adolescence.

  16. Ecological Factors in Human Development.

    PubMed

    Cross, William E

    2017-05-01

    Urie Bronfenbrenner (1992) helped developmental psychologists comprehend and define "context" as a rich, thick multidimensional construct. His ecological systems theory consists of five layers, and within each layer are developmental processes unique to each layer. The four articles in this section limit the exploration of context to the three innermost systems: the individual plus micro- and macrolayers. Rather than examine both the physical features and processes, the articles tend to focus solely on processes associated with a niche. Processes explored include social identity development, social network dynamics, peer influences, and school-based friendship patterns. The works tend to extend the generalization of extant theory to the developmental experience of various minority group experiences. © 2017 The Authors. Child Development © 2017 Society for Research in Child Development, Inc.

  17. Avionics Architecture Standards as an Approach to Obsolescence Management

    DTIC Science & Technology

    2000-10-01

    ...and goals is one method of achieving the necessary critical mass of skilled and... system. The term System Architecture refers to a consistent set of such... Processing Module (GPM), Mass Memory Module (MMM) and Power Conversion Module (PCM). ...executed on the modules within an ASAAC system will be stored in a central location, the Mass Memory Module (MMM). Therefore, if modules are to be... * MOS - Module Support Layer to Operating System. The purpose of the MOS

  18. Practical experience with full-scale structured sheet media (SSM) integrated fixed-film activated sludge (IFAS) systems for nitrification.

    PubMed

    Li, Hua; Zhu, Jia; Flamming, James J; O'Connell, Jack; Shrader, Michael

    2015-01-01

    Many wastewater treatment plants in the USA, which were originally designed as secondary treatment systems with no or partial nitrification requirements, are facing increased flows, loads, and more stringent ammonia discharge limits. Plant expansion is often not cost-effective due to either high construction costs or lack of land. Under these circumstances, integrated fixed-film activated sludge (IFAS) systems using both suspended growth and biofilms that grow attached to a fixed plastic structured sheet media are found to be a viable solution for solving the challenges. Multiple plants have been retrofitted with such IFAS systems in the past few years. The system has proven to be efficient and reliable in achieving not only consistent nitrification, but also enhanced bio-chemical oxygen demand removal and sludge settling characteristics. This paper presents long-term practical experiences with the IFAS system design, operation and maintenance, and performance for three full-scale plants with distinct processes; that is, a trickling filter/solids contact process, a conventional plug flow activated sludge process and an extended aeration process.

  19. Implementing the space shuttle data processing system with the space generic open avionics architecture

    NASA Technical Reports Server (NTRS)

    Wray, Richard B.; Stovall, John R.

    1993-01-01

    This paper presents an overview of the application of the Space Generic Open Avionics Architecture (SGOAA) to the Space Shuttle Data Processing System (DPS) architecture design. This application has been performed to validate the SGOAA and its potential use in flight critical systems. The paper summarizes key elements of the Space Shuttle avionics architecture, data processing system requirements and software architecture as currently implemented. It then summarizes the SGOAA architecture and describes a tailoring of the SGOAA to the Space Shuttle. The SGOAA consists of a generic system architecture for the entities in spacecraft avionics, a generic processing external and internal hardware architecture, a six-class model of interfaces, and functional subsystem architectures for data services and operations control capabilities. It has been proposed as an avionics architecture standard to the National Aeronautics and Space Administration (NASA), through its Strategic Avionics Technology Working Group, and is being considered by the Society of Automotive Engineers (SAE) as an SAE Avionics Standard. This architecture was developed for the Flight Data Systems Division of JSC by the Lockheed Engineering and Sciences Company, Houston, Texas.

  20. Automated assembly of fast-axis collimation (FAC) lenses for diode laser bar modules

    NASA Astrophysics Data System (ADS)

    Miesner, Jörn; Timmermann, Andre; Meinschien, Jens; Neumann, Bernhard; Wright, Steve; Tekin, Tolga; Schröder, Henning; Westphalen, Thomas; Frischkorn, Felix

    2009-02-01

    Laser diodes and diode laser bars are key components in high power semiconductor lasers and solid state laser systems. During manufacture, the assembly of the fast axis collimation (FAC) lens is a crucial step. The goal of our activities is to design an automated assembly system for high volume production. In this paper the results of an intermediate milestone are reported: a demonstration system was designed, realized and tested to prove the feasibility of all of the system components and process features. The demonstration system consists of a high precision handling system, metrology for process feedback, a powerful digital image processing system and tooling for glue dispensing, UV curing and laser operation. The system components, as well as their interaction with each other, were tested in an experimental system in order to glean design knowledge for the fully automated assembly system. The adjustment of the FAC lens is performed by a series of predefined steps monitored by two cameras concurrently imaging the far-field and near-field intensity distributions. Feedback from these cameras, processed by a powerful and efficient image processing algorithm, controls a five-axis precision motion system to optimize the fast-axis collimation of the laser beam (a feedback-loop sketch follows below). Automated cementing of the FAC to the diode bar completes the process. The presentation will show the system concept, the adjustment algorithm, and experimental results. A critical discussion of the results will close the talk.
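
    The camera-in-the-loop adjustment can be sketched as a coordinate search that maximizes a beam-quality metric. Everything below (the mock optics, the metric, and the step schedule) is a hypothetical stand-in for the real five-axis system and its far-field/near-field image scoring.

```python
# Coordinate-search alignment loop against a mock beam-quality metric.
import numpy as np

optimum = np.array([0.12, -0.05, 0.30])   # hidden best lens pose (mock)

def beam_quality(pose):
    # Mock metric peaking at `optimum`; a real system would score the
    # camera images of the far-field/near-field intensity distributions.
    return np.exp(-np.sum((pose - optimum) ** 2))

def align(pose, step=0.1, shrink=0.5, iters=40):
    for _ in range(iters):
        improved = False
        for axis in range(pose.size):
            for delta in (+step, -step):
                trial = pose.copy()
                trial[axis] += delta
                if beam_quality(trial) > beam_quality(pose):
                    pose, improved = trial, True
        if not improved:
            step *= shrink                # refine the search grid
    return pose

print("aligned pose:", align(np.zeros(3)).round(3))
```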

  1. Standardized Review and Approval Process for High-Cost Medication Use Promotes Value-Based Care in a Large Academic Medical System

    PubMed Central

    Durvasula, Raghu; Kelly, Janet; Schleyer, Anneliese; Anawalt, Bradley D.; Somani, Shabir; Dellit, Timothy H.

    2018-01-01

    Background As healthcare costs rise and reimbursements decrease, healthcare organization leadership and clinical providers must collaborate to provide high-value healthcare. Medications are a key driver of the increasing cost of healthcare, largely as a result of the proliferation of expensive specialty drugs, including biologic agents. Such medications contribute significantly to the inpatient diagnosis-related group payment system, often with minimal or unproved benefit over less-expensive therapies. Objective To describe a systematic review process to reduce non–evidence-based inpatient use of high-cost medications across a large multihospital academic health system. Methods We created a Pharmacy & Therapeutics subcommittee consisting of clinicians, pharmacists, and an ethics representative. This committee developed a standardized process for a timely review (<48 hours) and approval of high-cost medications based on their clinical effectiveness, safety, and appropriateness. The engagement of clinical experts in the development of the consensus-based guidelines for the use of specific medications facilitated the clinicians' acceptance of the review process. Results Over a 2-year period, a total of 85 patient-specific requests underwent formal review. All reviews were conducted within 48 hours. This review process has reduced the non–evidence-based use of specialty medications and has resulted in a pharmacy savings of $491,000 in fiscal year 2016, with almost 80% of the savings occurring in the last 2 quarters, because our process has matured. Conclusion The creation of a collaborative review process to ensure consistent, evidence-based utilization of high-cost medications provides value-based care, while minimizing unnecessary practice variation and reducing the cost of inpatient care.

  2. Space Reclamation for Uncoordinated Checkpointing in Message-Passing Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Wang, Yi-Min

    1993-01-01

    Checkpointing and rollback recovery are techniques that can provide efficient recovery from transient process failures. In a message-passing system, the rollback of a message sender may cause the rollback of the corresponding receiver, and the system needs to roll back to a consistent set of checkpoints called recovery line. If the processes are allowed to take uncoordinated checkpoints, the above rollback propagation may result in the domino effect which prevents recovery line progression. Traditionally, only obsolete checkpoints before the global recovery line can be discarded, and the necessary and sufficient condition for identifying all garbage checkpoints has remained an open problem. A necessary and sufficient condition for achieving optimal garbage collection is derived and it is proved that the number of useful checkpoints is bounded by N(N+1)/2, where N is the number of processes. The approach is based on the maximum-sized antichain model of consistent global checkpoints and the technique of recovery line transformation and decomposition. It is also shown that, for systems requiring message logging to record in-transit messages, the same approach can be used to achieve optimal message log reclamation. As a final topic, a unifying framework is described by considering checkpoint coordination and exploiting piecewise determinism as mechanisms for bounding rollback propagation, and the applicability of the optimal garbage collection algorithm to domino-free recovery protocols is demonstrated.
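
    The rollback-propagation mechanism behind recovery lines can be sketched on a toy dependency structure. The encoding below, in which a dependency forces a receiver to roll back to an earlier checkpoint, is a simplification for illustration; the paper's maximum-sized antichain machinery is not reproduced.

```python
# Toy rollback propagation to a consistent set of checkpoints.
# deps[(p, i)] = set of (q, j): if P restarts at checkpoint i or later,
# Q must roll back to some checkpoint <= j (an orphan-message constraint).
deps = {
    ("P0", 2): {("P1", 1)},
    ("P1", 1): {("P2", 0)},
}
latest = {"P0": 2, "P1": 2, "P2": 1}   # most recent checkpoint per process

def recovery_line(latest, deps):
    line = dict(latest)
    changed = True
    while changed:                      # propagate rollbacks to a fixpoint
        changed = False
        for (p, i), forced in deps.items():
            if line[p] >= i:            # p restarts at or after i
                for q, j in forced:
                    if line[q] > j:
                        line[q] = j     # q must roll back further
                        changed = True
    return line

print(recovery_line(latest, deps))      # {'P0': 2, 'P1': 1, 'P2': 0}
```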

  3. Operational seasonal and interannual predictions of ocean conditions

    NASA Technical Reports Server (NTRS)

    Leetmaa, Ants

    1992-01-01

    Dr. Leetmaa described current work at the U.S. National Meteorological Center (NMC) on coupled systems leading to a seasonal prediction system. He described the way in which ocean thermal data are quality controlled and used in a four-dimensional data assimilation system. This consists of a statistical interpolation scheme, a primitive equation ocean general circulation model, and the atmospheric fluxes that are required to force this. This whole process generates dynamically consistent thermohaline and velocity fields for the ocean. Currently, routine weekly analyses are performed for the Atlantic and Pacific oceans. These analyses are used for ocean climate diagnostics and as initial conditions for coupled forecast models. Specific examples of output products were shown for both the Pacific and the Atlantic Ocean.

  4. Installation and management of the SPS and LEP control system computers

    NASA Astrophysics Data System (ADS)

    Bland, Alastair

    1994-12-01

    Control of the CERN SPS and LEP accelerators and service equipment on the two CERN main sites is performed via workstations, file servers, Process Control Assemblies (PCAs) and Device Stub Controllers (DSCs). This paper describes the methods and tools that have been developed to manage the file servers, PCAs and DSCs since the LEP startup in 1989. There are five operational DECstation 5000s used as file servers and boot servers for the PCAs and DSCs. The PCAs consist of 90 SCO Xenix 386 PCs, 40 LynxOS 486 PCs and more than 40 older NORD 100s. The DSCs consist of 90 OS-968030 VME crates and 10 LynxOS 68030 VME crates. In addition there are over 100 development systems. The controls group is responsible for installing the computers, starting all the user processes and ensuring that the computers and the processes run correctly. The operators in the SPS/LEP control room and the Services control room have a Motif-based X window program which gives them, in real time, the state of all the computers and allows them to solve problems or reboot them.

  5. Observing Consistency in Online Communication Patterns for User Re-Identification.

    PubMed

    Adeyemi, Ikuesan Richard; Razak, Shukor Abd; Salleh, Mazleena; Venter, Hein S

    2016-01-01

    Comprehension of the statistical and structural mechanisms governing human dynamics in online interaction plays a pivotal role in online user identification, online profile development, and recommender systems. However, building a characteristic model of human dynamics on the Internet involves a complete analysis of the variations in human activity patterns, which is a complex process. This complexity is inherent in human dynamics and has not been extensively studied to reveal the structural composition of human behavior. A typical method of anatomizing such a complex system is viewing all independent interconnectivity that constitutes the complexity. An examination of the various dimensions of human communication pattern in online interactions is presented in this paper. The study employed reliable server-side web data from 31 known users to explore characteristics of human-driven communications. Various machine-learning techniques were explored. The results revealed that each individual exhibited a relatively consistent, unique behavioral signature and that the logistic regression model and model tree can be used to accurately distinguish online users. These results are applicable to one-to-one online user identification processes, insider misuse investigation processes, and online profiling in various areas.
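
    The classification step can be sketched with scikit-learn: synthetic per-session behavioral feature vectors, one characteristic signature per user, and a logistic regression scored by cross-validation. The features and data are stand-ins for the server-side web data used in the study.

```python
# Logistic-regression user re-identification on synthetic behavioral features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n_users, sessions_per_user, n_features = 4, 60, 5
# Each user gets a characteristic feature signature plus session noise.
signatures = rng.normal(size=(n_users, n_features))
X = np.vstack([sig + 0.5 * rng.normal(size=(sessions_per_user, n_features))
               for sig in signatures])
y = np.repeat(np.arange(n_users), sessions_per_user)

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print(f"re-identification accuracy: {scores.mean():.2f}")
```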

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jee, S.S.; DiMasi, E.; Kasinath, R.K.

    Bone is a hierarchically structured composite, and this structure imparts unique mechanical properties and bioresorptive potential. These properties are primarily influenced by the underlying nanostructure of bone, which consists of nanocrystals of hydroxyapatite embedded and uniaxially aligned within collagen fibrils. There is also a small fraction of non-collagenous proteins in bone, and these are thought to play an important role in bone's formation. In our in vitro model system of bone formation, polyanionic peptides are used to mimic the role of the non-collagenous proteins. In our prior studies, we have shown that intrafibrillar mineralization can be achieved in synthetic reconstituted collagen sponges using a polymer-induced liquid-precursor (PILP) mineralization process. This led to a nanostructured arrangement of hydroxyapatite crystals within the individual fibrils which closely mimics that of bone. This report demonstrates that biogenic collagen scaffolds obtained from turkey tendon, which consist of densely packed and oriented collagen fibrils, can also be mineralized by the PILP process. Synchrotron X-ray diffraction studies show that the mineralization process leads to a high degree of crystallographic orientation at the macroscale, thus emulating that found in the biological system of naturally mineralizing turkey tendon.

  7. The Architecture of Personality

    ERIC Educational Resources Information Center

    Cervone, Daniel

    2004-01-01

    This article presents a theoretical framework for analyzing psychological systems that contribute to the variability, consistency, and cross-situational coherence of personality functioning. In the proposed knowledge-and-appraisal personality architecture (KAPA), personality structures and processes are delineated by combining 2 principles:…

  8. [Real-time detection and processing of medical signals under windows using Lcard analog interfaces].

    PubMed

    Kuz'min, A A; Belozerov, A E; Pronin, T V

    2008-01-01

    Multipurpose modular software for an analog interface based on Lcard 761 is considered. Algorithms for pipeline processing of medical signals under Windows with dynamic control of computational resources are suggested. The software consists of user-friendly, completable, and modifiable modules. The module hierarchy is based on object-oriented inheritance principles, which make it possible to construct various real-time systems for long-term detection, processing, and imaging of multichannel medical signals.

  9. Stochastic thermodynamics for Ising chain and symmetric exclusion process.

    PubMed

    Toral, R; Van den Broeck, C; Escaff, D; Lindenberg, Katja

    2017-03-01

    We verify the finite-time fluctuation theorem for a linear Ising chain in contact with heat reservoirs at its ends. Analytic results are derived for a chain consisting of two spins. The system can be mapped onto a model for particle transport, namely, the symmetric exclusion process in contact with thermal and particle reservoirs. We modify the symmetric exclusion process to represent a thermal engine and reproduce universal features of the efficiency at maximum power.
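
    The mapped particle-transport picture can also be simulated directly. The sketch below runs a Gillespie simulation of a two-site exclusion process coupled to particle reservoirs (with illustrative rates, not the paper's analytic two-spin treatment) and compares ln P(Q)/P(-Q) against the affinity predicted by the fluctuation relation, which the statistics should approach at long times.

```python
# Gillespie simulation of a boundary-driven two-site exclusion process.
import numpy as np

rng = np.random.default_rng(6)
alpha, gamma = 0.6, 0.4          # left reservoir: entry / exit rates
beta, delta = 0.6, 0.4           # right reservoir: exit / entry rates

def net_charge(T=20.0):
    """One trajectory; returns net particles injected from the left reservoir."""
    n1 = n2 = 0
    t, q = 0.0, 0
    while True:
        moves = []               # (rate, new_n1, new_n2, dq)
        if n1 == 0: moves.append((alpha, 1, n2, +1))
        if n1 == 1: moves.append((gamma, 0, n2, -1))
        if n1 == 1 and n2 == 0: moves.append((1.0, 0, 1, 0))   # hop right
        if n1 == 0 and n2 == 1: moves.append((1.0, 1, 0, 0))   # hop left
        if n2 == 1: moves.append((beta, n1, 0, 0))
        if n2 == 0: moves.append((delta, n1, 1, 0))
        rates = np.array([m[0] for m in moves])
        t += rng.exponential(1.0 / rates.sum())
        if t > T:
            return q
        _, n1, n2, dq = moves[rng.choice(len(moves), p=rates / rates.sum())]
        q += dq

Q = np.array([net_charge() for _ in range(5000)])
affinity = np.log(alpha * beta / (gamma * delta))
vals, counts = np.unique(Q, return_counts=True)
for q in vals[(vals > 0) & np.isin(-vals, vals)][:3]:
    lhs = np.log(counts[vals == q][0] / counts[vals == -q][0])
    print(f"Q={q}: ln P(Q)/P(-Q) = {lhs:.2f}, affinity*Q = {affinity * q:.2f}")
```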

  10. Collaborative Project: Improving the Representation of Coastal and Estuarine Processes in Earth System Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryan, Frank; Dennis, John; MacCready, Parker

    This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation.

  11. Final Report Collaborative Project: Improving the Representation of Coastal and Estuarine Processes in Earth System Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryan, Frank; Dennis, John; MacCready, Parker

    This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation.

  12. Toward a formal verification of a floating-point coprocessor and its composition with a central processing unit

    NASA Technical Reports Server (NTRS)

    Pan, Jing; Levitt, Karl N.; Cohen, Gerald C.

    1991-01-01

    Discussed here is work to formally specify and verify a floating-point coprocessor based on the MC68881. The HOL verification system developed at Cambridge University was used. The coprocessor consists of two independent units: the bus interface unit, used to communicate with the CPU, and the arithmetic processing unit, used to perform the actual calculation. Reasoning about the interaction and synchronization among processes using higher-order logic is demonstrated.

  13. Parallel asynchronous systems and image processing algorithms

    NASA Technical Reports Server (NTRS)

    Coon, D. D.; Perera, A. G. U.

    1989-01-01

    A new hardware approach to implementation of image processing algorithms is described. The approach is based on silicon devices which would permit an independent analog processing channel to be dedicated to every pixel. A laminar architecture consisting of a stack of planar arrays of the device would form a two-dimensional array processor with a 2-D array of inputs located directly behind a focal plane detector array. A 2-D image data stream would propagate in neuronlike asynchronous pulse-coded form through the laminar processor. Such systems would integrate image acquisition and image processing. Acquisition and processing would be performed concurrently, as in natural vision systems. The research is aimed at implementation of algorithms, such as the intensity-dependent summation algorithm and pyramid processing structures, which are motivated by the operation of natural vision systems. Implementation of natural vision algorithms would benefit from the use of neuronlike information coding and the laminar, 2-D parallel, vision-system-type architecture. Besides providing a neural network framework for implementation of natural vision algorithms, a 2-D parallel approach could eliminate the serial bottleneck of conventional processing systems. Conversion to serial format would occur only after the raw intensity data has been substantially processed. An interesting challenge arises from the fact that the mathematical formulation of natural vision algorithms does not specify the means of implementation, so that hardware implementation poses intriguing questions involving vision science.

  14. A novel anchoring system for use in a nonfusion scoliosis correction device.

    PubMed

    Wessels, Martijn; Homminga, Jasper J; Hekman, Edsko E G; Verkerke, Gijsbertus J

    2014-11-01

    Insertion of a pedicle screw in the mid- and high thoracic regions has a serious risk of facet joint damage. Because flexible implant systems require intact facet joints, we developed an enhanced fixation that is less destructive to spinal structures. The XSFIX is a posterior fixation system that uses cables that are attached to the transverse processes of a vertebra. To determine whether a fixation to the transverse process using the XSFIX is strong enough to withstand the loads applied by the XSLATOR (a novel, highly flexible nonfusion implant system) and thus, whether it is a suitable alternative for pedicle screw fixation. The strength of a novel fixation system using transverse process cables was determined and compared with the strength of a similar fixation using polyaxial pedicle screws on different vertebral levels. Each of the 58 vertebrae, isolated from four adult human cadavers, was instrumented with either a pedicle screw anchor (PSA) system or a prototype of the XSFIX. The PSA consisted of two polyaxial pedicle screws and a 5 mm diameter rod. The XSFIX prototype consisted of two bodies that were fixed to the transverse processes, interconnected with a similar rod. Each fixation system was subjected to a lateral or an axial torque. The PSA demonstrated fixation strength in lateral loading and torsion higher than required for use in the XSLATOR. The XSFIX demonstrated high enough fixation strength (in both lateral loading and torsion), only in the high and midthoracic regions (T10-T12). This experiment showed that the fixation strength of XSFIX is sufficient for use with the XSLATOR only in mid- and high thoracic regions. For the low thoracic and lumbar region, the PSA is a more rigid fixation. Because the performance of the new fixation system appears to be favorable in the high and midthoracic regions, a clinical study is the next challenge. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. Stereomotion is processed by the third-order motion system: reply to comment on Three-systems theory of human visual motion perception: review and update

    NASA Astrophysics Data System (ADS)

    Lu, Zhong-Lin; Sperling, George

    2002-10-01

    Two theories are considered to account for the perception of motion of depth-defined objects in random-dot stereograms (stereomotion). In the Lu-Sperling three-motion-systems theory [J. Opt. Soc. Am. A 18, 2331 (2001)], stereomotion is perceived by the third-order motion system, which detects the motion of areas defined as figure (versus ground) in a salience map. Alternatively, in his comment [J. Opt. Soc. Am. A 19, 2142 (2002)], Patterson proposes a low-level motion-energy system dedicated to stereo depth. The critical difference between these theories is the preprocessing (figure-ground assignment based on depth and other cues versus simply stereo depth) rather than the motion-detection algorithm itself (because the motion-extraction algorithm for third-order motion is undetermined). Furthermore, the ability of observers to perceive motion in alternating feature displays in which stereo depth alternates with other features such as texture orientation indicates that the third-order motion system can perceive stereomotion. This reduces the stereomotion question to: Is it third-order alone or third-order plus dedicated depth-motion processing? Two new experiments intended to support the dedicated depth-motion processing theory are shown here to be perfectly accounted for by third-order motion, as are many older experiments that have previously been shown to be consistent with third-order motion. Cyclopean and rivalry images are shown to be a likely confound in stereomotion studies, rivalry motion being as strong as stereomotion. The phase dependence of superimposed same-direction stereomotion stimuli, rivalry stimuli, and isoluminant color stimuli indicates that these stimuli are processed in the same (third-order) motion system. The phase-dependence paradigm [Lu and Sperling, Vision Res. 35, 2697 (1995)] ultimately can resolve the question of which types of signals share a single motion detector. All the evidence accumulated so far is consistent with the three-motion-systems theory. © 2002 Optical Society of America

  16. Preparation, characterization and dissolution of passive oxide film on the 400 series stainless steel surfaces

    NASA Astrophysics Data System (ADS)

    Sathyaseelan, V. S.; Rufus, A. L.; Chandramohan, P.; Subramanian, H.; Velmurugan, S.

    2015-12-01

    Full system decontamination of the Primary Heat Transport (PHT) system of Pressurised Heavy Water Reactors (PHWRs) resulted in low decontamination factors (DF) on stainless steel (SS) surfaces. Hence, studies were carried out with 403 SS and 410 SS, which are the materials of construction of the "End-Fitting body" and "End-Fitting Liner tubes". Three formulations were evaluated for the dissolution of passive films formed over these alloys, viz., i) a two-step process consisting of oxidation and reduction reactions, ii) Dilute Chemical Decontamination (DCD) and iii) a high temperature process. The two-step and high temperature processes could dissolve the oxide completely, while the DCD process could remove only 60%. Various techniques like XRD, Raman spectroscopy and SEM-EDX were used for assessing the dissolution process. The two-step process is time-consuming and laborious, while the high temperature process is less time-consuming and is recommended for SS decontamination.

  17. State Analysis Database Tool

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert; Bennett, Matthew

    2006-01-01

    The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.
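
    As a notional illustration of the kind of artifacts the methodology stores (the field names below are invented for the example, not the tool's actual schema), states, models, and goals might be captured as simple records:

      from dataclasses import dataclass, field

      @dataclass
      class State:
          """A momentary condition of the evolving system."""
          name: str
          affects: list = field(default_factory=list)  # dependent state names

      @dataclass
      class Goal:
          """A constraint on a state over a time interval."""
          state: str
          constraint: str
          start: float
          end: float

      battery = State("battery_charge", affects=["transmitter_power"])
      goal = Goal("battery_charge", ">= 30%", start=0.0, end=3600.0)
      print(battery, goal, sep="\n")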

  18. Quantum thermodynamics of general quantum processes.

    PubMed

    Binder, Felix; Vinjanampathy, Sai; Modi, Kavan; Goold, John

    2015-03-01

    Accurately describing work extraction from a quantum system is a central objective for the extension of thermodynamics to individual quantum systems. The concepts of work and heat are surprisingly subtle when generalizations are made to arbitrary quantum states. We formulate an operational thermodynamics suitable for application to an open quantum system undergoing quantum evolution under a general quantum process by which we mean a completely positive and trace-preserving map. We derive an operational first law of thermodynamics for such processes and show consistency with the second law. We show that heat, from the first law, is positive when the input state of the map majorizes the output state. Moreover, the change in entropy is also positive for the same majorization condition. This makes a strong connection between the two operational laws of thermodynamics.
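
    Schematically, writing the process as a completely positive, trace-preserving map \Phi acting on a state \rho with Hamiltonian H (notation chosen here for illustration; the paper's exact definitions may differ), the operational first law takes the form

      \Delta U = \mathrm{Tr}[H\,\Phi(\rho)] - \mathrm{Tr}[H\rho] = \langle W \rangle + \langle Q \rangle,

    with the abstract's majorization statement asserting \langle Q \rangle \ge 0 whenever the input \rho majorizes the output \Phi(\rho).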

  19. Theoretical study of production of unique glasses in space. [kinetic relationships describing nucleation and crystallization phenomena

    NASA Technical Reports Server (NTRS)

    Larsen, D. C.; Sievert, J. L.

    1975-01-01

    The potential of producing the glassy form of selected materials in the weightless, containerless nature of space processing is examined through the development of kinetic relationships describing nucleation and crystallization phenomena. Transformation kinetics are applied to a well-characterized system (SiO2), an excellent glass former (B2O3), and a poor glass former (Al2O3) by conventional earth processing methods. Viscosity and entropy of fusion are shown to be the primary materials parameters controlling the glass forming tendency. For multicomponent systems diffusion-controlled kinetics and heterogeneous nucleation effects are considered. An analytical empirical approach is used to analyze the mullite system. Results are consistent with experimentally observed data and indicate the promise of mullite as a future space processing candidate.

  20. Diffusion of small Cu islands on the Ni(111) surface: A self-learning kinetic Monte Carlo study

    NASA Astrophysics Data System (ADS)

    Acharya, Shree Ram; Shah, Syed Islamuddin; Rahman, Talat S.

    2017-08-01

    We elucidate the diffusion kinetics of a heteroepitaxial system consisting of two-dimensional small Cu islands (1-8 atoms) on the Ni(111) surface at 100-600 K using the Self-Learning Kinetic Monte Carlo (SLKMC-II) method. Study of the statics of the system shows that compact CuN (3≤N≤8) clusters made up of triangular units on fcc occupancy sites are the energetically most stable structures of those clusters. Interestingly, we find a correlation between the height of the activation energy barrier (Ea) and the location of the transition state (TS). The Ea of processes for Cu islands on the Ni(111) surface are in general smaller than those of their counterpart Ni islands on the same surface. We find this difference to correlate with the relative strength of the lateral interaction of the island atoms in the two systems. While our database consists of hundreds of possible processes, we identify and discuss the energetics of those that are the most dominant, are rate-limiting, or contribute most to the diffusion of the islands. Since the Ea of single- and multi-atom processes that convert compact island shapes into non-compact ones are larger than those for the collective (concerted) motion of the island (with significantly smaller Ea for their reverse processes), the latter dominates the system kinetics - except for the cases of the dimer, pentamer and octamer. Single-atom short jumps, long-jump dimer shearing, and long-jump corner shearing (via a single atom) are, respectively, the dominant processes in the diffusion of the dimer, pentamer and octamer. Furthermore, single-atom corner-rounding processes are rate-limiting for the pentamer and octamer islands. Comparison of the energetics of selected processes and lateral interactions obtained from semi-empirical interatomic potentials with those from density functional theory shows minor quantitative differences and overall qualitative agreement.
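
    For readers unfamiliar with the kinetic Monte Carlo machinery underlying SLKMC, the following minimal Python sketch (illustrative only; the rates and event list are placeholders, not the paper's self-learned database) shows one rejection-free KMC step: an event is chosen with probability proportional to its Arrhenius rate, and the clock advances by an exponentially distributed waiting time.

      import math, random

      KB = 8.617e-5  # Boltzmann constant in eV/K

      def kmc_step(events, temperature, prefactor=1e12):
          """One rejection-free KMC step. `events` is a list of
          (label, activation_energy_eV) pairs; returns (label, dt)."""
          rates = [prefactor * math.exp(-ea / (KB * temperature))
                   for _, ea in events]
          total = sum(rates)
          # Pick an event with probability proportional to its rate.
          r, acc = random.random() * total, 0.0
          for (label, _), rate in zip(events, rates):
              acc += rate
              if r <= acc:
                  chosen = label
                  break
          # Advance time by an exponentially distributed increment.
          dt = -math.log(1.0 - random.random()) / total
          return chosen, dt

      # Hypothetical event list: concerted island motion vs. single-atom jump.
      events = [("concerted_translation", 0.35), ("single_atom_jump", 0.55)]
      print(kmc_step(events, temperature=300))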

  1. The Experience Factory: Strategy and Practice

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Caldiera, Gianluigi

    1995-01-01

    The quality movement, which in recent years has had a dramatic impact on all industrial sectors, has recently reached the system and software industry. Although some concepts of quality management, originally developed for other product types, can be applied to software, its specificity as a product that is developed and not produced requires a special approach. This paper introduces a quality paradigm specifically tailored to the problems of the systems and software industry. Reuse of products, processes and experiences originating from the system life cycle is seen today as a feasible solution to the problem of developing higher quality systems at a lower cost. In fact, quality improvement is very often achieved by defining and developing an appropriate set of strategic capabilities and core competencies to support them. A strategic capability is, in this context, a corporate goal defined by the business position of the organization and implemented by key business processes. Strategic capabilities are supported by core competencies, which are aggregate technologies tailored to the specific needs of the organization in performing the needed business processes. Core competencies are non-transitional, have a consistent evolution, and are typically fueled by multiple technologies. Their selection and development requires commitment, investment and leadership. The paradigm introduced in this paper for developing core competencies is the Quality Improvement Paradigm, which consists of six steps: (1) Characterize the environment, (2) Set the goals, (3) Choose the process, (4) Execute the process, (5) Analyze the process data, and (6) Package experience. The process must be supported by a goal-oriented approach to measurement and control, and by an organizational infrastructure called the Experience Factory. The Experience Factory is a logical and physical organization distinct from the project organizations it supports. Its goal is the development and support of core competencies through capitalization and reuse of life cycle experience and products. The paper introduces the major concepts of the proposed approach, discusses their relationship with other approaches used in the industry, and presents a case in which those concepts have been successfully applied.

  2. Innovation in managing the referral process at a Canadian pediatric hospital.

    PubMed

    MacGregor, Daune; Parker, Sandra; MacMillan, Sharon; Blais, Irene; Wong, Eugene; Robertson, Chris J; Bruce-Barrett, Cindy

    2009-01-01

    The provision of timely and optimal patient care is a priority in pediatric academic health science centres. Timely access to care is optimized when there is an efficient and consistent referral system in place. In order to improve the patient referral process and, therefore, access to care, an innovative web-based system was developed and implemented. The Ambulatory Referral Management System enables the electronic routing for submission, review, triage and management of all outpatient referrals. The implementation of this system has provided significant metrics that have informed how processes can be improved to increase access to care. Use of the system has improved efficiency in the referral process and has reduced the work associated with the previous paper-based referral system. It has also enhanced communication between the healthcare provider and the patient and family and has improved the security and confidentiality of patient information management. Referral guidelines embedded within the system have helped to ensure that referrals are more complete and that the patient being referred meets the criteria for assessment and treatment in an ambulatory setting. The system calculates and reports on wait times, as well as other measures.

  3. A production-theory-based framework for analysing recycling systems in the e-waste sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmidt, Mario

    2005-07-15

    Modern approaches in the production theory of business and management economics propose that objects (e.g. materials) be divided into good, bad or neutral. In transformation processes such as occur in production or recycling, this makes it possible to distinguish stringently between the economic revenue of a process and the economic and ecological expenditures for it. This approach can be transferred to entire systems of processes in order to determine the system revenue and the system expenditure. Material flow nets or graphs are used for this purpose. In complex material flow systems it becomes possible to calculate not only the costs, but also the direct and indirect environmental impacts of an individual process or a system revenue (for example a product or the elimination of waste) consistently. The approach permits a stringent analysis as well as different analysis perspectives of a material flow system. It is particularly suitable for closed-loop economic systems in which material backflows occur. With the aid of an example developed jointly with Hewlett Packard Europe, the paper outlines how this approach can be employed in the field of e-waste management.

  4. Gaia DR2 documentation Chapter 3: Astrometry

    NASA Astrophysics Data System (ADS)

    Hobbs, D.; Lindegren, L.; Bastian, U.; Klioner, S.; Butkevich, A.; Stephenson, C.; Hernandez, J.; Lammers, U.; Bombrun, A.; Mignard, F.; Altmann, M.; Davidson, M.; de Bruijne, J. H. J.; Fernández-Hernández, J.; Siddiqui, H.; Utrilla Molina, E.

    2018-04-01

    This chapter of the Gaia DR2 documentation describes the models and processing steps used for the astrometric core solution, namely the Astrometric Global Iterative Solution (AGIS). The inputs to this solution rely heavily on the basic observables (or astrometric elementaries) which have been pre-processed and discussed in Chapter 2, the results of which were published in Fabricius et al. (2016). The models consist of reference systems and time scales; assumed linear stellar motion and relativistic light deflection; and fundamental constants and the transformation of coordinate systems. Higher-level inputs such as planetary and solar system ephemerides, Gaia tracking and orbit information, initial quasar catalogues and BAM data are all needed for the processing described here. The astrometric calibration models are outlined, followed by the detailed processing steps which give AGIS its name. We also present a basic quality assessment and validation of the scientific results (for details, see Lindegren et al. 2018).

  5. Architecture For The Optimization Of A Machining Process In Real Time Through Rule-Based Expert System

    NASA Astrophysics Data System (ADS)

    Serrano, Rafael; González, Luis Carlos; Martín, Francisco Jesús

    2009-11-01

    Under the project SENSOR-IA, which received financial funding from the Order of Incentives to the Regional Technology Centers of the Council of Innovation, Science and Enterprise of Andalusia, an architecture for the optimization of a machining process in real time through a rule-based expert system has been developed. The architecture consists of a sensor data acquisition and processing engine (SATD) and a rule-based expert system (SE) which communicates with the SATD. The SE has been designed as an inference engine with an algorithm for effective action, using a modus ponens rule model of goal-oriented rules. The pilot test demonstrated that it is possible to govern the machining process in real time based on rules contained in a SE. The tests have been done with approximated rules. Future work includes an exhaustive collection of data with different tool materials and geometries in a database to extract more precise rules.
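
    A toy forward-chaining engine in the modus ponens spirit described above might look as follows in Python (the rules and facts are invented for illustration; the actual SE's knowledge base is not reproduced here):

      # Each rule is (premises, conclusion): if all premises hold as
      # facts, modus ponens lets us assert the conclusion.
      rules = [
          ({"vibration_high", "temperature_rising"}, "reduce_feed_rate"),
          ({"tool_wear_detected"}, "schedule_tool_change"),
      ]

      def infer(facts, rules):
          facts = set(facts)
          changed = True
          while changed:
              changed = False
              for premises, conclusion in rules:
                  if premises <= facts and conclusion not in facts:
                      facts.add(conclusion)
                      changed = True
          return facts

      # Sensor-derived facts (SATD output) go in; recommended actions come out.
      print(infer({"vibration_high", "temperature_rising"}, rules))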

  6. The ATLAS EventIndex: architecture, design choices, deployment and first operation experience

    NASA Astrophysics Data System (ADS)

    Barberis, D.; Cárdenas Zárate, S. E.; Cranshaw, J.; Favareto, A.; Fernández Casaní, Á.; Gallas, E. J.; Glasman, C.; González de la Hoz, S.; Hřivnáč, J.; Malon, D.; Prokoshin, F.; Salt Cairols, J.; Sánchez, J.; Többicke, R.; Yuan, R.

    2015-12-01

    The EventIndex is the complete catalogue of all ATLAS events, keeping the references to all files that contain a given event in any processing stage. It replaces the TAG database, which had been in use during LHC Run 1. For each event it contains its identifiers, the trigger pattern and the GUIDs of the files containing it. Major use cases are event picking, feeding the Event Service used on some production sites, and technical checks of the completion and consistency of processing campaigns. The system design is highly modular so that its components (data collection system, storage system based on Hadoop, query web service and interfaces to other ATLAS systems) could be developed separately and in parallel during LS1, the first long shutdown of the LHC. The EventIndex is in operation for the start of LHC Run 2. This paper describes the high-level system architecture, the technical design choices and the deployment process and issues. The performance of the data collection and storage systems, as well as the query services, are also reported.
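
    To make the per-event content concrete, here is a notional Python rendering of an EventIndex record (field names and values are illustrative assumptions, not the actual ATLAS schema):

      from dataclasses import dataclass

      @dataclass(frozen=True)
      class EventIndexRecord:
          """Minimal rendering of the per-event content described above."""
          run_number: int
          event_number: int
          trigger_pattern: str     # encoded trigger decision bits
          file_guids: tuple        # GUIDs of the files containing this
                                   # event, one per processing stage

      rec = EventIndexRecord(266904, 1234567, "0x3A5F",
                             ("GUID-RAW-0001", "GUID-AOD-0002"))
      print(rec)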

  7. Stochastic dynamics of coupled active particles in an overdamped limit

    NASA Astrophysics Data System (ADS)

    Ann, Minjung; Lee, Kong-Ju-Bock; Park, Pyeong Jun

    2015-10-01

    We introduce a model for Brownian dynamics of coupled active particles in an overdamped limit. Our system consists of several identical active particles and one passive particle. Each active particle is elastically coupled to the passive particle and there is no direct coupling among the active particles. We investigate the dynamics of the system with respect to the number of active particles, viscous friction, and coupling between the active and passive particles. For this purpose, we consider an intracellular transport process as an application of our model and perform a Brownian dynamics simulation using realistic parameters for processive molecular motors such as kinesin-1. We determine an adequate energy conversion function for molecular motors and study the dynamics of intracellular transport by multiple motors. The results show that the average velocity of the coupled system is not affected by the number of active motors and that the stall force increases linearly as the number of motors increases. Our results are consistent with well-known experimental observations. We also examine the effects of coupling between the motors and the cargo, as well as of the spatial distribution of the motors around the cargo. Our model might provide a physical explanation of the cooperation among active motors in the cellular transport processes.
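
    The model lends itself to a compact Euler-Maruyama integration. The sketch below (with arbitrary illustrative parameters, not the paper's kinesin-1 values) couples N active particles to a single passive cargo by linear springs in the overdamped limit:

      import numpy as np

      rng = np.random.default_rng(0)
      N, steps, dt = 3, 10000, 1e-4                 # motors, steps, step size
      k, gamma, f_active, kT = 0.1, 1.0, 1.0, 0.02  # spring, friction, drive, noise

      x_active = np.zeros(N)  # positions of the active particles (motors)
      x_passive = 0.0         # position of the passive particle (cargo)

      for _ in range(steps):
          # Each motor couples elastically to the cargo, not to other motors.
          spring = -k * (x_active - x_passive)      # force on each motor
          x_active += dt * (f_active + spring) / gamma \
                      + rng.normal(0.0, np.sqrt(2 * kT * dt / gamma), N)
          x_passive += dt * (-spring.sum()) / gamma \
                       + rng.normal(0.0, np.sqrt(2 * kT * dt / gamma))

      print(f"cargo displacement: {x_passive:.3f}")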

  8. Application-ready expedited MODIS data for operational land surface monitoring of vegetation condition

    USGS Publications Warehouse

    Brown, Jesslyn; Howard, Daniel M.; Wylie, Bruce K.; Friesz, Aaron M.; Ji, Lei; Gacke, Carolyn

    2015-01-01

    Monitoring systems benefit from high temporal frequency image data collected from the Moderate Resolution Imaging Spectroradiometer (MODIS) system. Because of near-daily global coverage, MODIS data are beneficial to applications that require timely information about vegetation condition related to drought, flooding, or fire danger. Rapid satellite data streams in operational applications have clear benefits for monitoring vegetation, especially when information can be delivered as fast as changing surface conditions. An “expedited” processing system called “eMODIS” operated by the U.S. Geological Survey provides rapid MODIS surface reflectance data to operational applications in less than 24 h, offering tailored, consistently processed information products that complement standard MODIS products. We assessed eMODIS quality and consistency by comparing them to standard MODIS data. Only land data with known high quality were analyzed in a central U.S. study area. When compared to standard MODIS (MOD/MYD09Q1), the eMODIS Normalized Difference Vegetation Index (NDVI) maintained a strong, significant relationship to standard MODIS NDVI, whether from morning (Terra) or afternoon (Aqua) orbits. The Aqua eMODIS data were more prone to noise than the Terra data, likely due to differences in the internal cloud mask used in MOD/MYD09Q1 or compositing rules. Post-processing temporal smoothing decreased noise in eMODIS data.

  9. A Distributed Data Acquisition System for the Sensor Network of the TAWARA_RTM Project

    NASA Astrophysics Data System (ADS)

    Fontana, Cristiano Lino; Donati, Massimiliano; Cester, Davide; Fanucci, Luca; Iovene, Alessandro; Swiderski, Lukasz; Moretto, Sandra; Moszynski, Marek; Olejnik, Anna; Ruiu, Alessio; Stevanato, Luca; Batsch, Tadeusz; Tintori, Carlo; Lunardon, Marcello

    This paper describes a distributed Data Acquisition System (DAQ) developed for the TAWARA_RTM project (TAp WAter RAdioactivity Real Time Monitor). The aim is to detect the presence of radioactive contaminants in drinking water in order to prevent deliberate or accidental threats. Employing a set of detectors, it is possible to detect alpha, beta and gamma radiation from emitters dissolved in water. The Sensor Network (SN) consists of several heterogeneous nodes controlled by a centralized server. The SN cyber-security is guaranteed in order to protect it from external intrusions and malicious acts. The nodes were installed in different locations along the water treatment processes in the waterworks plant supplying the aqueduct of Warsaw, Poland. Embedded computers control the simpler nodes and are directly connected to the SN. Local PCs (LPCs) control the more complex nodes, which consist of signal digitizers acquiring data from several detectors. The DAQ in the LPC is split into several processes communicating through sockets in a local sub-network. Each process is dedicated to a very simple task (e.g. data acquisition, data analysis, hydraulics management) in order to have a flexible and fault-tolerant system. The main SN and the local DAQ networks are separated by data routers to ensure cyber-security.

  10. Development and Testing of an Experimental Polysensory Instructional System for Teaching Electric Arc Welding Processes. Report No. 24. Final Report.

    ERIC Educational Resources Information Center

    Sergeant, Harold A.

    The population of the study consisted of 15 high school industrial arts students, 10 freshman and sophomore college students, and 10 adults. A polysensory, self-pacing instructional system was developed which included (1) pretests and post tests, (2) a general instruction book, (3) equipment to practice arc welding, (4) programed instruction…

  11. The Most Preferred and Effective Reviewer of L2 Writing among Automated Grading System, Peer Reviewer and Teacher

    ERIC Educational Resources Information Center

    Tsai, Min-Hsiu

    2017-01-01

    Who is the most preferred and deemed the most helpful reviewer in improving student writing? This study exercised a blended teaching method which consists of three currently prevailing reviewers: the automated grading system (AGS, a web-based method), the peer review (a process-oriented approach), and the teacher grading technique (the…

  12. Markov Chains For Testing Redundant Software

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Sjogren, Jon A.

    1990-01-01

    Preliminary design developed for validation experiment that addresses problems unique to assuring extremely high quality of multiple-version programs in process-control software. Approach takes into account inertia of controlled system in sense it takes more than one failure of control program to cause controlled system to fail. Verification procedure consists of two steps: experimentation (numerical simulation) and computation, with Markov model for each step.
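
    The inertia idea lends itself to a small Monte Carlo illustration (a toy model assumed here, not the experiment's actual Markov formulation): the controlled system fails only after k consecutive control-program failures.

      import random

      def system_failure_prob(p_fail, k, horizon, trials=100_000):
          """Estimate the probability that the controlled system fails
          within `horizon` control cycles, where the system fails only
          after k consecutive control-program failures (inertia)."""
          failures = 0
          for _ in range(trials):
              streak = 0
              for _ in range(horizon):
                  streak = streak + 1 if random.random() < p_fail else 0
                  if streak >= k:
                      failures += 1
                      break
          return failures / trials

      # One control failure per 1000 cycles; inertia absorbs single slips.
      print(system_failure_prob(p_fail=1e-3, k=2, horizon=10_000))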

  13. Estimating Nitrogen Loading in the Wabash River Subwatershed Using a GIS Schematic Processing Network in Support of Sustainable Watershed Management Planning

    EPA Science Inventory

    The Wabash River is a tributary of the Ohio River. This river system consists of headwaters and small streams, medium river reaches in the upper Wabash watershed, and large river reaches in the lower Wabash watershed. A large part of the river system is situated in agricultural a...

  14. An Analysis of the United States Air Force Energy Savings Performance Contracts

    DTIC Science & Technology

    2007-12-01

    key element of the ESPC system. Chapter IV uses the standard contracting processes to review the USAF implementations of strategic purchasing with...process and each level facilitates regionalization, which is the current implementation method of strategic purchasing for energy service management...the existing regulations that are inconsistent with the ESPC intent , and 3) to formulate substitute regulations consistent with laws governing Federal

  15. Effects of Reflection Category and Reflection Quality on Learning Outcomes during Web-Based Portfolio Assessment Process: A Case Study of High School Students in Computer Application Course

    ERIC Educational Resources Information Center

    Chou, Pao-Nan; Chang, Chi-Cheng

    2011-01-01

    This study examines the effects of reflection category and reflection quality on learning outcomes during the Web-based portfolio assessment process. Experimental subjects consist of forty-five eighth-grade students in a "Computer Application" course. Through the Web-based portfolio assessment system, these students write reflections, and join…

  16. Task allocation model for minimization of completion time in distributed computer systems

    NASA Astrophysics Data System (ADS)

    Wang, Jai-Ping; Steidley, Carl W.

    1993-08-01

    A task in a distributed computing system consists of a set of related modules. Each of the modules will execute on one of the processors of the system and communicate with some other modules. In addition, precedence relationships may exist among the modules. Task allocation is an essential activity in distributed-software design. This activity is of importance to all phases of the development of a distributed system. This paper establishes task completion-time models and task allocation models for minimizing task completion time. Current work in this area is either at the experimental level or does not consider precedence relationships among modules. The development of mathematical models for the computation of task completion time and task allocation will benefit many real-time computer applications such as radar systems, navigation systems, industrial process control systems, image processing systems, and artificial intelligence oriented systems.
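
    As a toy illustration of such a model (the costs and objective below are invented for the example, and precedence relationships are omitted), an exhaustive search over assignments of three modules to two processors could read:

      from itertools import product

      # exec_cost[m][p]: run time of module m on processor p;
      # comm_cost[(i, j)]: traffic between modules i and j, paid only
      # when they are assigned to different processors.
      exec_cost = [[4, 6], [5, 3], [2, 7]]
      comm_cost = {(0, 1): 2, (1, 2): 4}

      def completion_time(assign):
          load = [0.0, 0.0]
          for m, p in enumerate(assign):
              load[p] += exec_cost[m][p]
          comm = sum(c for (i, j), c in comm_cost.items()
                     if assign[i] != assign[j])
          return max(load) + comm

      best = min(product(range(2), repeat=3), key=completion_time)
      print(best, completion_time(best))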

  17. Digital Beamforming Scatterometer

    NASA Technical Reports Server (NTRS)

    Rincon, Rafael F.; Vega, Manuel; Kman, Luko; Buenfil, Manuel; Geist, Alessandro; Hillard, Larry; Racette, Paul

    2009-01-01

    This paper discusses scatterometer measurements collected with the multi-mode Digital Beamforming Synthetic Aperture Radar (DBSAR) during the SMAP-VEX 2008 campaign. The 2008 SMAP Validation Experiment was conducted to address a number of specific questions related to the soil moisture retrieval algorithms. SMAP-VEX 2008 consisted of a series of aircraft-based flights conducted on the Eastern Shore of Maryland and Delaware in the fall of 2008. Several other instruments participated in the campaign, including the Passive Active L-Band System (PALS), the Marshall Airborne Polarimetric Imaging Radiometer (MAPIR), and the Global Positioning System Reflectometer (GPSR). This campaign was the first SMAP Validation Experiment. DBSAR is a multimode radar system developed at NASA/Goddard Space Flight Center that combines state-of-the-art radar technologies, on-board processing, and advances in signal processing techniques in order to enable new remote sensing capabilities applicable to Earth science and planetary applications [1]. The instrument can be configured to operate in scatterometer, Synthetic Aperture Radar (SAR), or altimeter mode. The system builds upon the L-band Imaging Scatterometer (LIS) developed as part of the RadSTAR program. The radar is a phased array system designed to fly on the NASA P3 aircraft. The instrument consists of a programmable waveform generator, eight transmit/receive (T/R) channels, a microstrip antenna, and a reconfigurable data acquisition and processor system. Each transmit channel incorporates a digital attenuator and digital phase shifter that enable amplitude and phase modulation on transmit. The attenuators, phase shifters, and calibration switches are digitally controlled by the radar control card (RCC) on a pulse-by-pulse basis. The antenna is a corporate-fed microstrip patch array centered at 1.26 GHz with a 20 MHz bandwidth. Although only one feed is used with the present configuration, a provision was made for separate corporate feeds for vertical and horizontal polarization. System upgrades to dual polarization are currently under way. The DBSAR processor is a reconfigurable data acquisition and processor system capable of real-time, high-speed data processing. DBSAR uses an FPGA-based architecture to implement digital down-conversion, in-phase and quadrature (I/Q) demodulation, and subsequent radar-specific algorithms. The core of the processor board consists of an analog-to-digital (A/D) section, three Altera Stratix field programmable gate arrays (FPGAs), an ARM microcontroller, several memory devices, and an Ethernet interface. The processor also interfaces with a navigation board consisting of a GPS and a MEMS gyro. The processor has been configured to operate in scatterometer, Synthetic Aperture Radar (SAR), and altimeter modes. All the modes are based on digital beamforming, a digital process that generates the far-field beam patterns at various scan angles from voltages sampled in the antenna array. This technique allows steering the received beam and controlling its beamwidth and side-lobes. Several beamforming techniques can be implemented, each characterized by unique strengths and weaknesses, and each applicable to different measurement scenarios. In scatterometer mode, the radar is capable of generating a wide beam or scanning a narrow beam on transmit, and of steering the received beam during processing while controlling its beamwidth and side-lobe level. Table I lists some important radar characteristics.
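
    The digital beamforming at the heart of all three modes can be sketched in a few lines of NumPy (a narrowband, uniform-linear-array idealization with invented parameters; this is not DBSAR's processor code):

      import numpy as np

      def steer_beam(element_signals, d, wavelength, theta_deg):
          """Narrowband beamforming for a uniform linear array: apply
          per-element phase weights and sum. `element_signals` is a
          complex (n_elements, n_samples) array."""
          n = element_signals.shape[0]
          k = 2 * np.pi / wavelength
          phase = k * d * np.arange(n) * np.sin(np.deg2rad(theta_deg))
          return np.exp(-1j * phase) @ element_signals

      # Synthetic test: 8 elements at half-wavelength spacing (roughly
      # L-band, 1.26 GHz), plane wave arriving from 20 degrees.
      n, wavelength = 8, 0.238
      d = wavelength / 2
      t = np.linspace(0.0, 1e-5, 256)
      arrival = 2 * np.pi * d / wavelength * np.arange(n) * np.sin(np.deg2rad(20))
      signals = np.exp(1j * (2 * np.pi * 1e5 * t[None, :] + arrival[:, None]))

      # Output power peaks when the beam is steered at the true direction.
      for look in (0, 10, 20, 30):
          y = steer_beam(signals, d, wavelength, look)
          print(look, round(float(np.mean(np.abs(y) ** 2)), 2))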

  18. Multibus-based parallel processor for simulation

    NASA Technical Reports Server (NTRS)

    Ogrady, E. P.; Wang, C.-H.

    1983-01-01

    A Multibus-based parallel processor simulation system is described. The system is intended to serve as a vehicle for gaining hands-on experience, testing system and application software, and evaluating parallel processor performance during development of a larger system based on the horizontal/vertical-bus interprocessor communication mechanism. The prototype system consists of up to seven Intel iSBC 86/12A single-board computers which serve as processing elements, a multiple transmission controller (MTC) designed to support system operation, and an Intel Model 225 Microcomputer Development System which serves as the user interface and input/output processor. All components are interconnected by a Multibus/IEEE 796 bus. An important characteristic of the system is that it provides a mechanism for a processing element to broadcast data to other selected processing elements. This parallel transfer capability is provided through the design of the MTC and a minor modification to the iSBC 86/12A board. The operation of the MTC, the basic hardware-level operation of the system, and pertinent details about the iSBC 86/12A and the Multibus are described.

  19. A web-based computer aided system for liver surgery planning: initial implementation on RayPlus

    NASA Astrophysics Data System (ADS)

    Luo, Ming; Yuan, Rong; Sun, Zhi; Li, Tianhong; Xie, Qingguo

    2016-03-01

    At present, computer-aided systems for liver surgery design and risk evaluation are widely used in clinics all over the world. However, most systems are local applications that run on high-performance workstations, and the images have to be processed offline. Compared with local applications, a web-based system is accessible anywhere and to a range of users, regardless of their processing power or operating system. RayPlus (http://rayplus.life.hust.edu.cn), a B/S platform for medical image processing, was developed to give a jump start on web-based medical image processing. In this paper, we implement a computer-aided system for liver surgery planning on the architecture of RayPlus. The system consists of a series of processing steps applied to CT images, including filtering, segmentation, visualization and analysis. Each processing step is packaged into an executable program and runs on the server side. CT images in DICOM format are processed step by step into an interactive model in the browser, with zero installation and server-side computing. The system supports users in semi-automatically segmenting the liver, intrahepatic vessels and tumor from the pre-processed images. Then, surface and volume models are built to analyze the vessel structure and the relative position between adjacent organs. The results show that the initial implementation meets its first-order objectives satisfactorily and provides an accurate 3D delineation of the liver anatomy. Vessel labeling and resection simulation are planned additions in the future. The system is available on the Internet at the link mentioned above, and an open username for testing is offered.

  20. The Quality Control Algorithms Used in the Creation of NASA Kennedy Space Center Lightning Protection System Towers Meteorological Database

    NASA Technical Reports Server (NTRS)

    Orcutt, John M.; Brenton, James C.

    2016-01-01

    An accurate database of meteorological data is essential for designing any aerospace vehicle and for preparing launch commit criteria. Meteorological instrumentation were recently placed on the three Lightning Protection System (LPS) towers at Kennedy Space Center (KSC) launch complex 39B (LC-39B), which provide a unique meteorological dataset existing at the launch complex over an extensive altitude range. Data records of temperature, dew point, relative humidity, wind speed, and wind direction are produced at 40, 78, 116, and 139 m at each tower. The Marshall Space Flight Center Natural Environments Branch (EV44) received an archive that consists of one-minute averaged measurements for the period of record of January 2011 - April 2015. However, before the received database could be used EV44 needed to remove any erroneous data from within the database through a comprehensive quality control (QC) process. The QC process applied to the LPS towers' meteorological data is similar to other QC processes developed by EV44, which were used in the creation of meteorological databases for other towers at KSC. The QC process utilized in this study has been modified specifically for use with the LPS tower database. The QC process first includes a check of each individual sensor. This check includes removing any unrealistic data and checking the temporal consistency of each variable. Next, data from all three sensors at each height are checked against each other, checked against climatology, and checked for sensors that erroneously report a constant value. Then, a vertical consistency check of each variable at each tower is completed. Last, the upwind sensor at each level is selected to minimize the influence of the towers and other structures at LC-39B on the measurements. The selection process for the upwind sensor implemented a study of tower-induced turbulence. This paper describes in detail the QC process, QC results, and the attributes of the LPS towers meteorological database.
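
    The per-sensor stage of such a QC chain might look like the following Python sketch (the thresholds and window size are illustrative assumptions, not EV44's actual criteria):

      import numpy as np

      def qc_single_sensor(values, lo, hi, max_step, window=5):
          """Flag physically unrealistic values, temporal discontinuities,
          and stretches where a sensor erroneously reports a constant."""
          v = np.asarray(values, dtype=float)
          bad = (v < lo) | (v > hi)                 # range check
          bad[1:] |= np.abs(np.diff(v)) > max_step  # temporal consistency
          for i in range(len(v) - window + 1):      # stuck-sensor check
              if np.ptp(v[i:i + window]) == 0.0:
                  bad[i:i + window] = True
          return bad

      # One-minute temperatures (deg C) with a spike and a stuck stretch.
      temps = [22.1, 22.2, 35.0, 22.3, 24.0, 24.0, 24.0, 24.0, 24.0, 22.5]
      print(qc_single_sensor(temps, lo=-20, hi=45, max_step=5))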

  1. Image velocimetry for clouds with relaxation labeling based on deformation consistency

    NASA Astrophysics Data System (ADS)

    Horinouchi, Takeshi; Murakami, Shin-ya; Kouyama, Toru; Ogohara, Kazunori; Yamazaki, Atsushi; Yamada, Manabu; Watanabe, Shigeto

    2017-08-01

    Correlation-based cloud tracking has been extensively used to measure atmospheric winds, but difficulties remain. In this study, aiming at developing a cloud tracking system for Akatsuki, an artificial satellite orbiting Venus, a formulation is developed for improving the relaxation labeling technique to select appropriate peaks of cross-correlation surfaces, which tend to have multiple peaks. The formulation makes explicit use of a consistency inherent in the type of cross-correlation method in which template sub-images are slid without deformation: if the resultant motion vectors indicate too large a deformation, they contradict the assumption of the method. The deformation consistency is exploited further to develop two post-processes; one clusters the motion vectors into groups within each of which the consistency is perfect, and the other extends the groups using the original candidate lists. These processes are useful to eliminate erroneous vectors, distinguish motion vectors at different altitudes, and detect phase velocities of waves in fluids such as atmospheric gravity waves. As a basis of the relaxation labeling and the post-processes, as well as of uncertainty estimation, the necessity of finding isolated (well-separated) peaks of cross-correlation surfaces is argued, and an algorithm to realize it is presented. All the methods are implemented, and their effectiveness is demonstrated with initial images obtained by the ultraviolet imager onboard Akatsuki. Since the deformation consistency reflects a logical consistency inherent in template matching methods, it should have broad application beyond cloud tracking.
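
    The template-matching step that produces these cross-correlation surfaces (before relaxation labeling selects among candidate peaks) can be sketched as a minimal normalized cross-correlation in Python; this illustration is not the Akatsuki pipeline's implementation:

      import numpy as np

      def match_template(image, template):
          """Slide `template` over `image` without deformation and return
          the normalized cross-correlation surface and its peak."""
          th, tw = template.shape
          ih, iw = image.shape
          t = (template - template.mean()) / (template.std() + 1e-12)
          surface = np.empty((ih - th + 1, iw - tw + 1))
          for i in range(surface.shape[0]):
              for j in range(surface.shape[1]):
                  patch = image[i:i + th, j:j + tw]
                  p = (patch - patch.mean()) / (patch.std() + 1e-12)
                  surface[i, j] = (t * p).mean()
          peak = np.unravel_index(np.argmax(surface), surface.shape)
          return surface, peak

      # Synthetic test: the template is the image patch offset by (3, 5).
      rng = np.random.default_rng(1)
      image = rng.random((40, 40))
      template = image[3:19, 5:21]
      _, peak = match_template(image, template)
      print(peak)  # (3, 5)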

  2. Ion Implantation with in-situ Patterning for IBC Solar Cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graff, John W.

    2014-10-24

    Interdigitated back-side Contact (IBC) solar cells are the highest efficiency silicon solar cells currently on the market. Unfortunately the cost to produce these solar cells is also very high, due to the large number of processing steps required. Varian believes that only the combination of high efficiency and low cost can meet the stated goal of $1/Wp. The core of this program has been to develop an in-situ patterning capability for an ion implantation system capable of producing patterned doped regions for IBC solar cells. Such a patterning-capable ion implanter can reduce the number of process steps required to manufacture IBC cells, and therefore significantly reduce the cost. The present program was organized into three phases. Phase I was to select a patterning approach and determine the patterning requirements for IBC cells. Phase II consists of construction of a Beta ion implantation system containing in-situ patterning capability. Phase III consists of shipping and installation of the ion implant system in a customer factory where it will be tested and proven in a pilot production line.

  3. Viking image processing. [digital stereo imagery and computer mosaicking

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1977-01-01

    The paper discusses the camera systems capable of recording black and white and color imagery developed for the Viking Lander imaging experiment. Each Viking Lander image consisted of a matrix of numbers with 512 rows and an arbitrary number of columns up to a maximum of about 9,000. Various techniques were used in the processing of the Viking Lander images, including: (1) digital geometric transformation, (2) the processing of stereo imagery to produce three-dimensional terrain maps, and (3) computer mosaicking of distinct processed images. A series of Viking Lander images is included.

  4. Power Laws in Stochastic Processes for Social Phenomena: An Introductory Review

    NASA Astrophysics Data System (ADS)

    Kumamoto, Shin-Ichiro; Kamihigashi, Takashi

    2018-03-01

    Many phenomena with power laws have been observed in various fields of the natural and social sciences, and these power laws are often interpreted as the macro behaviors of systems that consist of micro units. In this paper, we review some basic mathematical mechanisms that are known to generate power laws. In particular, we focus on stochastic processes including the Yule process and the Simon process as well as some recent models. The main purpose of this paper is to explain the mathematical details of their mechanisms in a self-contained manner.
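
    As a concrete example of one such mechanism, the following Python sketch simulates a Simon-type urn process (parameters chosen arbitrarily for illustration); the resulting group sizes develop the heavy, power-law-like tail discussed in the review:

      import random
      from collections import Counter

      def simon_process(steps, alpha=0.1, seed=0):
          """Simon's urn scheme: with probability alpha introduce a new
          word; otherwise repeat a past occurrence chosen uniformly,
          which weights words by their current frequency."""
          random.seed(seed)
          occurrences = [0]
          next_word = 1
          for _ in range(steps):
              if random.random() < alpha:
                  occurrences.append(next_word)
                  next_word += 1
              else:
                  occurrences.append(random.choice(occurrences))
          return Counter(occurrences)

      sizes = sorted(simon_process(100_000).values(), reverse=True)
      print(sizes[:10])  # a few very large groups, many small ones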

  5. Development of multichannel analyzer using sound card ADC for nuclear spectroscopy system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibrahim, Maslina Mohd; Yussup, Nolida; Lombigit, Lojius

    This paper describes the development of a Multi-Channel Analyzer (MCA) using a sound card analogue-to-digital converter (ADC) for a nuclear spectroscopy system. The system was divided into a hardware module and a software module. The hardware module consists of a 2” by 2” NaI(Tl) detector, a Pulse Shaping Amplifier (PSA) and a built-in ADC chip readily available in any computer's sound system. The software module is divided into two parts: pre-processing of the raw digital input and the development of the MCA software. A band-pass filter and baseline stabilization and correction were implemented for the pre-processing. For the MCA development, the pulse height analysis method was used to process the signal before displaying it using a histogram technique. The development and test results for using the sound card as an MCA are discussed.
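
    A toy version of the pulse-height-analysis stage might look like this (the pulse shapes, thresholds and channel mapping are invented for illustration, not the system's actual processing):

      import numpy as np

      def pulse_height_spectrum(samples, baseline, threshold,
                                n_channels=1024, full_scale=32768):
          """Subtract the baseline, find local maxima above threshold,
          and histogram the peak heights into MCA channels."""
          v = np.asarray(samples, dtype=float) - baseline
          peaks = [v[i] for i in range(1, len(v) - 1)
                   if v[i] > threshold and v[i] >= v[i - 1] and v[i] > v[i + 1]]
          channels = np.clip((np.array(peaks) / full_scale
                              * n_channels).astype(int), 0, n_channels - 1)
          return np.bincount(channels, minlength=n_channels)

      # Synthetic shaped pulses riding on a flat baseline.
      sig = np.full(5000, 100.0)
      for pos, height in [(500, 8000), (1500, 16000), (3000, 8000)]:
          sig[pos - 3:pos + 4] += height * np.array([.2, .6, .9, 1, .9, .6, .2])
      spectrum = pulse_height_spectrum(sig, baseline=100, threshold=1000)
      print(np.nonzero(spectrum)[0], spectrum[np.nonzero(spectrum)])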

  6. Automatic mine detection based on multiple features

    NASA Astrophysics Data System (ADS)

    Yu, Ssu-Hsin; Gandhe, Avinash; Witten, Thomas R.; Mehra, Raman K.

    2000-08-01

    Recent research sponsored by the Army, Navy and DARPA has significantly advanced the sensor technologies for mine detection. Several innovative sensor systems have been developed and prototypes were built to investigate their performance in practice. Most of the research has been focused on hardware design. However, in order for the systems to be in wide use instead of in limited use by a small group of well-trained experts, an automatic process for mine detection is needed to make the final decision process on mine vs. no mine easier and more straightforward. In this paper, we describe an automatic mine detection process consisting of three stages: (1) signal enhancement, (2) pixel-level mine detection, and (3) object-level mine detection. The final output of the system is a confidence measure that quantifies the presence of a mine. The resulting system was applied to real data collected using radar and acoustic technologies.

  7. Monitoring and tracing of critical software systems: State of the work and project definition

    DTIC Science & Technology

    2008-12-01

    analysis, troubleshooting and debugging. Some of these subsystems already come with ad hoc tracers for events like wireless connections or SCSI disk... SQLite). Additional synthetic events (e.g. states) are added to the database. The database thus consists in contexts (process, CPU, state), event...capability on a [operating] system-by-system basis. Additionally, the mechanics of querying the data in an ad hoc manner outside the boundaries of the

  8. Commercial Digital/ADP Equipment in the Ocean Environment. Volume 2. User Appendices

    DTIC Science & Technology

    1978-12-15

    is that the LINDA system uses a minicomputer with time-sharing system software which allows several terminals to be operated at the same time...Acquisition System (ODAS) consists of sensors, computer hardware and computer software. Certain sensors are interfaced to the computers for real time...on USNS KANE, USNS BENT, and USNS WILKES. Commercial automatic data processing equipment used in ODAS includes: Item Model Computer PDP-9 Tape

  9. A detonation wave in the system liquid-gas bubbles

    NASA Astrophysics Data System (ADS)

    Sychev, A. I.

    1985-06-01

    The shock-wave ignition of a system consisting of a liquid (H2O) and bubbles of an explosive gas mixture (C2H2+2.5O2) is investigated experimentally and analytically. The possibility of the existence of a detonation wave, a supersonic self-sustaining process, in a gas-liquid system is demonstrated. The conditions for the existence of a detonation wave are determined, and the initiation mechanism is analyzed.

  10. Exposure Related Dose Estimating Model

    EPA Science Inventory

    ERDEM is a physiologically based pharmacokinetic (PBPK) modeling system consisting of a general model and an associated front end. An actual model is defined when the user prepares an input command file. Such a command file defines the chemicals, compartments and processes that...

  11. STUDY ON THE RECYCLING SYSTEM OF WASTE PLASTICS AND MIXED PAPER FROM A LONG-TERM PERSPECTIVE

    NASA Astrophysics Data System (ADS)

    Fujii, Minoru; Fujita, Tsuyoshi; Chen, Xudong; Ohnishi, Satoshi; Osako, Masahiro; Moriguchi, Yuichi; Yamaguchi, Naohisa

    Plastics and mixed paper in municipal solid waste are valuable resources with high calorific value. However, the recycling cost to utilize them tends to be expensive. In addition, a recycling system has to be consistent with the reduction of wastes, on which higher priority should be put to lower carbon emissions and save resources in the long term. In this paper, we propose a recycling system (smart recycling system) which consists of a local center and existing facilities in arterial industries. In the local center, waste plastics and mixed paper collected from households are processed on the same line into a form suitable for transportation and handling in a facility of an arterial industry which can utilize those wastes effectively. At the same time, a part of the plastics with high quality is processed into a recycled resin in the center. It was suggested that, by utilizing existing facilities in arterial industries which have sufficient and flexible capacity to accept those wastes, the system can be robust even if the amount of waste generation fluctuates widely. The effects on CO2 reduction and cost of installing the system were calculated, and it was estimated that 3.5 million tons of additional annual CO2 reduction could be achieved in Tokyo and the three surrounding prefectures without considerable increase in cost.

  12. Application of the thermoelectric MEMS microwave power sensor in a power radiation monitoring system

    NASA Astrophysics Data System (ADS)

    Bo, Gao; Jing, Yang; Si, Jiang; Debo, Wang

    2016-08-01

    A power radiation monitoring system based on thermoelectric MEMS microwave power sensors is studied. This monitoring system consists of three modules: a data acquisition module, a data processing and display module, and a data sharing module. It can detect the power radiation in the environment, and the data can be processed and shared. The measured results show that the thermoelectric MEMS microwave power sensor and the power radiation monitoring system both have relatively good linearity. The sensitivity of the thermoelectric MEMS microwave power sensor is about 0.101 mV/mW, and the sensitivity of the monitoring system is about 0.038 V/mW. The voltage gain of the monitoring system is about 380 times, which is relatively consistent with the theoretical value. In addition, a low-frequency and low-power module is adopted in the monitoring system in order to reduce electromagnetic pollution and power consumption, and this work will extend the application of the thermoelectric MEMS microwave power sensor to more areas. Project supported by the National Natural Science Foundation of China (No. 11304158), the Province Natural Science Foundation of Jiangsu (No. BK20140890), the Open Research Fund of the Key Laboratory of MEMS of Ministry of Education, Southeast University (No. 3206005302), and the Scientific Research Foundation of Nanjing University of Posts and Telecommunications (Nos. NY213024, NY215139).

  13. Cryogenic Insulation System

    NASA Technical Reports Server (NTRS)

    Davis, Randall C. (Inventor); Taylor, Allan H. (Inventor); Jackson, L. Robert (Inventor); Mcauliffe, Patrick S. (Inventor)

    1988-01-01

    This invention relates to reusable, low density, high temperature cryogenic foam insulation systems and the process for their manufacture. A pacing technology for liquid hydrogen fueled, high speed aircraft is the development of a fully reusable, flight weight cryogenic insulation system for propellant tank structures. In the invention, cryogenic foam insulation is adhesively bonded to the outer wall of the fuel tank structure. The cryogenic insulation consists of square sheets fabricated from an array of abutting square blocks. Each block consists of a sheet of glass cloth adhesively bonded between two layers of polymethacrylimide foam. Each block is wrapped in a vapor impermeable membrane, such as Kapton(R)/aluminum/Kapton(R), to provide a vapor barrier. Very beneficial results can be obtained by employing the present invention in conjunction with fibrous insulation and an outer aeroshell, a hot fuselage structure with an internal thermal protection system.

  14. Coupling biology and oceanography in models.

    PubMed

    Fennel, W; Neumann, T

    2001-08-01

    The dynamics of marine ecosystems, i.e. the changes of observable chemical-biological quantities in space and time, are driven by biological and physical processes. Predictions of future developments of marine systems need a theoretical framework, i.e. models, solidly based on research and understanding of the different processes involved. The natural way to describe marine systems theoretically seems to be the embedding of chemical-biological models into circulation models. However, while circulation models are relatively advanced the quantitative theoretical description of chemical-biological processes lags behind. This paper discusses some of the approaches and problems in the development of consistent theories and indicates the beneficial potential of the coupling of marine biology and oceanography in models.

  15. Processing and Properties of a Phenolic Composite System

    NASA Technical Reports Server (NTRS)

    Hou, Tan-Hung; Bai, J. M.; Baughman, James M.

    2006-01-01

    Phenolic resin systems generate water as a reaction by-product via condensation reactions during curing at elevated temperatures. In the fabrication of fiber reinforced phenolic resin matrix composites, volatile management is crucial in producing void-free quality laminates. A commercial vacuum-bag moldable phenolic prepreg system was selected for this study. The traditional single-vacuum-bag (SVB) process was unable to manage the volatiles effectively, resulting in inferior voidy laminates. However, a double-vacuum-bag (DVB) process was shown to afford superior volatile management and consistently yielded void-free quality parts. The DVB process cure cycle (temperature/pressure profiles) for the selected composite system was designed, with the vacuum pressure application point carefully selected, to avoid excessive resin squeeze-out and achieve the net shape and target resin content in the final consolidated laminate parts. Laminate consolidation quality was characterized by optical photomicrography of the cross sections and measurements of mechanical properties. A 40% increase in short beam shear strength, 30% greater flexural strength, 10% higher tensile and 18% higher compression strengths were obtained in composite laminates fabricated by the DVB process.

  16. Simplifying the complexity surrounding ICU work processes--identifying the scope for information management in ICU settings.

    PubMed

    Munir, Samina K; Kay, Stephen

    2005-08-01

    A multi-site study, conducted in two English and two Danish intensive care units, investigates the complexity of work processes in intensive care and the implications of this complexity for information management with regard to clinical information systems. Data were collected via observations, shadowing of clinical staff, interviews and questionnaires. The construction of role activity diagrams enabled the capture of critical care work processes. Upon analysing these diagrams, it was found that intensive care work processes consist of 'simplified complexity'; these processes are changed by the introduction of information systems for the everyday use and management of all clinical information. The prevailing notion of complexity surrounding critical care clinical work processes was refuted and found to be misleading; in reality, it is not the work processes that cause the complexity, the complexity is rooted in the way in which clinical information is used and managed. This study emphasises that the potential for clinical information systems that consider integrating all clinical information requirements is not only immense but also very plausible.

  17. Harnessing Disordered-Ensemble Quantum Dynamics for Machine Learning

    NASA Astrophysics Data System (ADS)

    Fujii, Keisuke; Nakajima, Kohei

    2017-08-01

    The quantum computer has amazing potential for fast information processing. However, the realization of a digital quantum computer is still a challenging problem requiring highly accurate controls and key application strategies. Here we propose a platform, quantum reservoir computing, to solve these issues successfully by exploiting the natural quantum dynamics of ensemble systems, which are ubiquitous in laboratories nowadays, for machine learning. This framework enables ensemble quantum systems to universally emulate nonlinear dynamical systems, including classical chaos. A number of numerical experiments show that quantum systems consisting of 5-7 qubits possess computational capabilities comparable to conventional recurrent neural networks of 100-500 nodes. This discovery opens up a paradigm for information processing with artificial intelligence powered by quantum physics.
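
    For intuition about the classical baseline the authors compare against, an echo-state-style recurrent network with a trained linear readout can be sketched as follows (the sizes and the delay task are illustrative choices, not the paper's benchmark):

      import numpy as np

      rng = np.random.default_rng(0)
      N_RES = 100  # reservoir size; only the linear readout is trained

      # Fixed random reservoir, rescaled for echo-state stability.
      W = rng.normal(size=(N_RES, N_RES))
      W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
      W_in = rng.normal(size=N_RES)

      def run_reservoir(u):
          """Drive the reservoir with scalar input sequence u."""
          x = np.zeros(N_RES)
          states = []
          for u_t in u:
              x = np.tanh(W @ x + W_in * u_t)
              states.append(x.copy())
          return np.array(states)

      # Task: reproduce the input from 10 steps earlier (memory task).
      u = rng.uniform(-1, 1, 500)
      X, y = run_reservoir(u)[10:], u[:-10]
      w_out = np.linalg.lstsq(X, y, rcond=None)[0]  # train the readout
      print(f"readout mse: {np.mean((X @ w_out - y) ** 2):.4f}")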

  18. Controlling the digital transfer process

    NASA Astrophysics Data System (ADS)

    Brunner, Felix

    1997-02-01

    The accuracy of today's color management systems fails to satisfy the requirements of the graphic arts market. A first explanation for this is that the color calibration charts on which these systems rely are, for print-technical reasons, subject to color deviations and inconsistencies. A second reason is that colorimetry describes the human visual perception of color differences and has no direct relation to the rendering technology itself of a proofing or printing device. The author explains that only firm process control of the many parameters in offset printing, by means of a system such as EUROSTANDARD System Brunner, can lead to accurate and consistent calibration of scanner, display, proof and print. The same principles hold for the quality management of digital presses.

  19. Life insurance risk assessment using a fuzzy logic expert system

    NASA Technical Reports Server (NTRS)

    Carreno, Luis A.; Steel, Roy A.

    1992-01-01

    In this paper, we present a knowledge based system that combines fuzzy processing with rule-based processing to form an improved decision aid for evaluating risk for life insurance. This application illustrates the use of FuzzyCLIPS to build a knowledge based decision support system possessing fuzzy components to improve user interactions and KBS performance. The results employing FuzzyCLIPS are compared with the results obtained from the solution of the problem using traditional numerical equations. The design of the fuzzy solution consists of a CLIPS rule-based system for some factors combined with fuzzy logic rules for others. This paper describes the problem, proposes a solution, presents the results, and provides a sample output of the software product.
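
    The hybrid crisp-plus-fuzzy rule idea can be illustrated in a few lines of code. This is a hedged sketch, not the paper's FuzzyCLIPS rule base: the membership functions, risk factors, and rule weights below are invented for illustration.

```python
# Sketch of combining crisp rules with fuzzy rules into one risk score,
# in the spirit of the FuzzyCLIPS system described above.
def tri(x, a, b, c):
    """Triangular membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def risk_score(age, cholesterol):
    # Fuzzy memberships for two illustrative factors.
    age_high = tri(age, 40, 70, 100)
    chol_high = tri(cholesterol, 180, 260, 340)
    # Fuzzy rule: IF age is high AND cholesterol is high THEN risk is high.
    fuzzy_risk = min(age_high, chol_high)
    # Crisp rule component, since some factors stay rule-based.
    crisp_risk = 0.3 if age > 65 else 0.1
    return max(fuzzy_risk, crisp_risk)

print(risk_score(age=58, cholesterol=290))  # -> 0.6
```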

  20. The planned Alaska SAR Facility - An overview

    NASA Technical Reports Server (NTRS)

    Carsey, Frank; Weeks, Wilford

    1987-01-01

    The Alaska SAR Facility (ASF) is described in an overview fashion. The facility consists of three major components, a Receiving Ground System, a SAR Processing System and an Analysis and Archiving System; the ASF Program also has a Science Working Team and the requisite management and operations systems. The ASF is now an approved and fully funded activity; detailed requirements and science background are presented for the facility to be implemented for data from the European ERS-1, the Japanese ERS-1 and Radarsat.

  1. A Procedure for Measuring Latencies in Brain-Computer Interfaces

    PubMed Central

    Wilson, J. Adam; Mellinger, Jürgen; Schalk, Gerwin; Williams, Justin

    2011-01-01

    Brain-computer interface (BCI) systems must process neural signals with consistent timing in order to support adequate system performance. Thus, it is important to have the capability to determine whether a particular BCI configuration (i.e., hardware, software) provides adequate timing performance for a particular experiment. This report presents a method of measuring and quantifying different aspects of system timing in several typical BCI experiments across a range of settings, and presents comprehensive measures of expected overall system latency for each experimental configuration. PMID:20403781
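
    The measurement idea reduces to timestamping matched event pairs and summarizing the latency distribution across trials. A minimal sketch under that reading follows; the processing step is a placeholder, not an actual BCI pipeline.

```python
# Timestamp a stimulus/block-arrival event and the corresponding
# processed output, repeat over trials, and summarize the distribution.
import time
import statistics

def process_block(samples):
    return sum(samples) / len(samples)   # stand-in for BCI signal processing

latencies_ms = []
for trial in range(100):
    t0 = time.perf_counter()             # event timestamp
    process_block([0.0] * 512)           # placeholder processing of one block
    latencies_ms.append((time.perf_counter() - t0) * 1000)

print(f"mean {statistics.mean(latencies_ms):.3f} ms, "
      f"max {max(latencies_ms):.3f} ms")
```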

  2. Human Haptic Interaction with Soft Objects: Discriminability, Force Control, and Contact Visualization

    DTIC Science & Technology

    1998-01-01

    consisted of a videomicroscopy system and a tactile stimulator system. Using this setup, real-time images from the contact region as well as [...] were obtained. [Only fragments of the remainder are recoverable: a videomicroscopy system; a tactile stimulator system; a real-time imaging setup; active and passive touch experiments.] Since characterizing the contact process is an important step, a videomicroscopy system was built in this study to visualize the contact region of the fingerpad.

  3. Low-cost data analysis systems for processing multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Whitely, S. L.

    1976-01-01

    The basic hardware and software requirements are described for four low cost analysis systems for computer generated land use maps. The data analysis systems consist of an image display system, a small digital computer, and an output recording device. Software is described together with some of the display and recording devices, and typical costs are cited. Computer requirements are given, and two approaches are described for converting black-white film and electrostatic printer output to inexpensive color output products. Examples of output products are shown.

  4. Formal mechanization of device interactions with a process algebra

    NASA Technical Reports Server (NTRS)

    Schubert, E. Thomas; Levitt, Karl; Cohen, Gerald C.

    1992-01-01

    The principal emphasis is to develop a methodology to formally verify correct synchronization and communication of devices in a composed hardware system. Previous system integration efforts have focused on vertical integration of one layer on top of another. This task examines 'horizontal' integration of peer devices. To formally reason about communication, we mechanize a process algebra in the Higher Order Logic (HOL) theorem proving system. Using this formalization we show how four types of device interactions can be represented and verified to behave as specified. The report also describes the specification of a system consisting of an AVM-1 microprocessor and a memory management unit which were verified in previous work. A proof of correct communication is presented, and the extensions to the system specification to add a direct memory access device are discussed.

  5. A knowledge-based approach to configuration layout, justification, and documentation

    NASA Technical Reports Server (NTRS)

    Craig, F. G.; Cutts, D. E.; Fennel, T. R.; Case, C.; Palmer, J. R.

    1990-01-01

    The design, development, and implementation of a prototype expert system that could aid designers and system engineers in the placement of racks aboard modules on Space Station Freedom are described. This type of problem is relevant to any program with multiple constraints and requirements demanding solutions which minimize usage of limited resources. This process is generally performed by a single, highly experienced engineer who integrates all the diverse mission requirements and limitations, and develops an overall technical solution which meets program and system requirements with minimal cost, weight, volume, power, etc. This system architect performs an intellectual integration process in which the underlying design rationale is often not fully documented. This is a situation which lends itself to an expert system solution for enhanced consistency, thoroughness, documentation, and change assessment capabilities.

  6. A Knowledge-Based Approach to Configuration Layout, Justification, and Documentation

    NASA Technical Reports Server (NTRS)

    Craig, F. G.; Cutts, D. E.; Fennel, T. R.; Case, C. M.; Palmer, J. R.

    1991-01-01

    The design, development, and implementation of a prototype expert system which could aid designers and system engineers in the placement of racks aboard modules on the Space Station Freedom are described. This type of problem is relevant to any program with multiple constraints and requirements demanding solutions which minimize usage of limited resources. This process is generally performed by a single, highly experienced engineer who integrates all the diverse mission requirements and limitations, and develops an overall technical solution which meets program and system requirements with minimal cost, weight, volume, power, etc. This system architect performs an intellectual integration process in which the underlying design rationale is often not fully documented. This is a situation which lends itself to an expert system solution for enhanced consistency, thoroughness, documentation, and change assessment capabilities.

  7. Consistent model driven architecture

    NASA Astrophysics Data System (ADS)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of the MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.

  8. Acoustic Signal Processing in Photorefractive Optical Systems.

    NASA Astrophysics Data System (ADS)

    Zhou, Gan

    This thesis discusses applications of the photorefractive effect in the context of acoustic signal processing. The devices and systems presented here illustrate the ideas and optical principles involved in holographic processing of acoustic information. The interest in optical processing stems from the similarities between holographic optical systems and contemporary models for massively parallel computation, in particular, neural networks. An initial step in acoustic processing is the transformation of acoustic signals into relevant optical forms. A fiber-optic transducer with photorefractive readout transforms acoustic signals into optical images corresponding to their short-time spectrum. The device analyzes complex sound signals and interfaces them with conventional optical correlators. The transducer consists of 130 multimode optical fibers sampling the spectral range of 100 Hz to 5 kHz logarithmically. A physical model of the human cochlea can help us understand some characteristics of human acoustic transduction and signal representation. We construct a life-sized cochlear model using elastic membranes coupled with two fluid-filled chambers, and use a photorefractive novelty filter to investigate its response. The detection sensitivity is determined to be 0.3 angstroms per root Hz at 2 kHz. Qualitative agreement is found between the model response and physiological data. Delay lines map time-domain signals into the space domain and permit holographic processing of temporal information. A parallel optical delay line using dynamic beam coupling in a rotating photorefractive crystal is presented. We experimentally demonstrate a 64-channel device with 0.5 seconds of time-delay and 167 Hz bandwidth. Acoustic signal recognition is described in a photorefractive system implementing the time-delay neural network model. The system consists of a photorefractive optical delay-line and a holographic correlator programmed in a LiNbO_3 crystal. We demonstrate the recognition of synthesized chirps as well as spoken words. A photorefractive ring resonator containing an optical delay line can learn temporal information through self-organization. We experimentally investigate a system that learns by itself and picks out the most frequently presented signals from the input. We also give results demonstrating the separation of two orthogonal temporal signals into two competing ring resonators.

  9. Advanced technologies for maintenance of electrical systems and equipment at the Savannah River Site Defense Waste Processing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Husler, R.O.; Weir, T.J.

    1991-01-01

    An enhanced maintenance program is being established to characterize and monitor cables, components, and process response at the Savannah River Site, Defense Waste Processing Facility. This facility was designed and constructed to immobilize the radioactive waste currently stored in underground storage tanks and is expected to begin operation in 1993. The plant is initiating the program to baseline and monitor instrument and control (I&C) and electrical equipment, remote process equipment, embedded instrument and control cables, and in-cell jumper cables used in the facility. This program is based on the electronic characterization and diagnostic (ECAD) system which was modified to include process response analysis and to meet rigid Department of Energy equipment requirements. The system consists of computer-automated, state-of-the-art electronics. The data that are gathered are stored in a computerized database for analysis, trending, and troubleshooting. It is anticipated that the data which are gathered and trended will aid in life extension for the facility.

  10. Advanced technologies for maintenance of electrical systems and equipment at the Savannah River Site Defense Waste Processing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Husler, R.O.; Weir, T.J.

    1991-12-31

    An enhanced maintenance program is being established to characterize and monitor cables, components, and process response at the Savannah River Site, Defense Waste Processing Facility. This facility was designed and constructed to immobilize the radioactive waste currently stored in underground storage tanks and is expected to begin operation in 1993. The plant is initiating the program to baseline and monitor instrument and control (I&C) and electrical equipment, remote process equipment, embedded instrument and control cables, and in-cell jumper cables used in the facility. This program is based on the electronic characterization and diagnostic (ECAD) system which was modified to include process response analysis and to meet rigid Department of Energy equipment requirements. The system consists of computer-automated, state-of-the-art electronics. The data that are gathered are stored in a computerized database for analysis, trending, and troubleshooting. It is anticipated that the data which are gathered and trended will aid in life extension for the facility.

  11. Multidimensional Interactive Radiology Report and Analysis: standardization of workflow and reporting for renal mass tracking and quantification

    NASA Astrophysics Data System (ADS)

    Hwang, Darryl H.; Ma, Kevin; Yepes, Fernando; Nadamuni, Mridula; Nayyar, Megha; Liu, Brent; Duddalwar, Vinay; Lepore, Natasha

    2015-12-01

    A conventional radiology report primarily consists of a large amount of unstructured text, and lacks clear, concise, consistent and content-rich information. Hence, an area of unmet clinical need consists of developing better ways to communicate radiology findings and information specific to each patient. Here, we design a new workflow and reporting system that combines and integrates advances in engineering technology with those from the medical sciences, the Multidimensional Interactive Radiology Report and Analysis (MIRRA). Until recently, clinical standards have primarily relied on 2D images for the purpose of measurement, but with the advent of 3D processing, many of the manually measured metrics can be automated, leading to better reproducibility and less subjective measurement placement. Hence, we make use of this newly available 3D processing in our workflow. Our pipeline is used here to standardize the labeling, tracking, and quantifying of metrics for renal masses.

  12. Circular analysis in complex stochastic systems

    PubMed Central

    Valleriani, Angelo

    2015-01-01

    Ruling out observations can lead to wrong models. This danger arises inadvertently when one selects observations, experiments, simulations or time series based on their outcome. In stochastic processes, conditioning on the future outcome biases all local transition probabilities and makes them consistent with the selected outcome. This circular self-consistency leads to models that are inconsistent with physical reality. It is also the reason why models built solely on macroscopic observations are prone to this fallacy. PMID:26656656
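
    The bias is easy to reproduce numerically. In the toy simulation below (invented, not from the paper), step probabilities estimated only from walks that reached a chosen outcome drift away from the true dynamics, exactly the circular self-consistency described above.

```python
# Demonstrate the selection fallacy: conditioning a random walk on a
# "successful" future outcome biases the estimated step probability.
import random

p_up = 0.5                   # true probability of stepping +1
random.seed(1)

def walk(n=20):
    steps = [1 if random.random() < p_up else -1 for _ in range(n)]
    return steps, sum(steps)

all_steps, selected_steps = [], []
for _ in range(20000):
    steps, outcome = walk()
    all_steps.extend(steps)
    if outcome > 4:          # keep only walks that reached the outcome
        selected_steps.extend(steps)

est = lambda s: s.count(1) / len(s)
print("unconditioned p_up:", round(est(all_steps), 3))       # ~0.50
print("conditioned p_up:  ", round(est(selected_steps), 3))  # biased upward
```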

  13. Monitoring total mixed rations and feed delivery systems.

    PubMed

    Oelberg, Thomas J; Stone, William

    2014-11-01

    This article is intended to give practitioners a method to evaluate total mixed ration (TMR) consistency and to give them practical solutions to improve TMR consistency that will improve cattle performance and health. Practitioners will learn how to manage the variation in moisture and nutrients that exists in haylage and corn silage piles and in bales of hay, and methods to reduce variation in the TMR mixing and delivery process.

  14. Establishment of a Quantitative Medical Technology Evaluation System and Indicators within Medical Institutions.

    PubMed

    Wu, Suo-Wei; Chen, Tong; Pan, Qi; Wei, Liang-Yu; Wang, Qin; Li, Chao; Song, Jing-Chen; Luo, Ji

    2018-06-05

    The development and application of medical technologies reflect the medical quality and clinical capacity of a hospital. They are also an effective approach to upgrading medical services and core competitiveness among medical institutions. This study aimed to build a quantitative medical technology evaluation system through a questionnaire survey within medical institutions, so as to assess medical technologies more objectively and accurately, promote the management of medical technology quality, and ensure the medical safety of various operations in hospitals. A two-level quantitative medical technology evaluation system was built through a two-round questionnaire survey of chosen experts. The Delphi method was applied in identifying the structure of the evaluation system and its indicators. The experts' judgments on the indicators were used to build the comparison matrix, from which the weight coefficients, maximum eigenvalue (λmax), consistency index (CI), and random consistency ratio (CR) were obtained. The results were verified through consistency tests, and the index weight coefficient of each indicator was calculated through the analytic hierarchy process. Twenty-six experts from different medical fields were involved in the questionnaire survey, 25 of whom responded to both rounds. Altogether, 4 primary indicators (safety, effectiveness, innovativeness, and benefits) and 13 secondary indicators were included in the evaluation system. The matrix was built to obtain the λmax, CI, and CR for each expert in the survey; the index weight coefficients of the primary indicators were 0.33, 0.28, 0.27, and 0.12, respectively, and the index weight coefficients of the secondary indicators were calculated accordingly. As the two-round questionnaire survey and statistical analysis were performed and the credibility of the results was verified through consistency tests, the study established a quantitative medical technology evaluation system model and assessment indicators within medical institutions based on the Delphi method and the analytic hierarchy process. Further verification, adjustment, and optimization of the system and indicators will be performed in follow-up studies.
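
    The consistency quantities named above follow Saaty's standard definitions: for an n x n judgment matrix, CI = (λmax - n)/(n - 1) and CR = CI/RI. A minimal sketch with an invented 3 x 3 judgment matrix shows the computation; the RI values are Saaty's published random-index constants.

```python
# Compute lambda_max, CI, CR, and the weight vector for a pairwise
# comparison matrix, as in the AHP step described above.
import numpy as np

A = np.array([[1,   3,   5  ],
              [1/3, 1,   2  ],
              [1/5, 1/2, 1  ]])              # reciprocal judgment matrix

n = A.shape[0]
eigvals, eigvecs = np.linalg.eig(A)
lam_max = eigvals.real.max()

CI = (lam_max - n) / (n - 1)                 # consistency index
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90}[n]   # random index (Saaty)
CR = CI / RI                                 # random consistency ratio

w = eigvecs[:, eigvals.real.argmax()].real
w = w / w.sum()                              # index weight coefficients
print(f"lambda_max={lam_max:.3f}, CI={CI:.4f}, CR={CR:.4f}, "
      f"weights={w.round(3)}")               # CR < 0.1 means acceptable
```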

  15. Studies of the use of heat from high temperature nuclear sources for hydrogen production processes

    NASA Technical Reports Server (NTRS)

    Farbman, G. H.

    1976-01-01

    Future uses of hydrogen and hydrogen production processes that can meet the demand for hydrogen in the coming decades were considered. To do this, a projection was made of the market for hydrogen through the year 2000. Four hydrogen production processes were selected, from among water electrolysis, fossil based and thermochemical water decomposition systems, and evaluated, using a consistent set of ground rules, in terms of relative performance, economics, resource requirements, and technology status.

  16. Medical diagnosis of atherosclerosis from Carotid Artery Doppler Signals using principal component analysis (PCA), k-NN based weighting pre-processing and Artificial Immune Recognition System (AIRS).

    PubMed

    Latifoğlu, Fatma; Polat, Kemal; Kara, Sadik; Güneş, Salih

    2008-02-01

    In this study, we proposed a new medical diagnosis system based on principal component analysis (PCA), k-NN based weighting pre-processing, and the Artificial Immune Recognition System (AIRS) for diagnosis of atherosclerosis from Carotid Artery Doppler Signals. The suggested system consists of four stages. First, in the feature extraction stage, we obtained the features related to atherosclerosis using Fast Fourier Transform (FFT) modeling and by calculating the maximum frequency envelope of the sonograms. Second, in the dimensionality reduction stage, the 61 features of atherosclerosis disease were reduced to 4 features using PCA. Third, in the pre-processing stage, we weighted these 4 features using different values of k in a new weighting scheme based on k-NN. Finally, in the classification stage, the AIRS classifier was used to classify subjects as healthy or as having atherosclerosis. A classification accuracy of 100% was obtained by the proposed system using 10-fold cross-validation. This success shows that the proposed system is a robust and effective system for the diagnosis of atherosclerosis.
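
    The four-stage pipeline can be sketched with standard tools. In the sketch below, the data are random placeholders, the k-NN weighting is one plausible reading of the pre-processing step (not necessarily the authors' exact scheme), and a k-NN classifier stands in for AIRS, which has no scikit-learn implementation.

```python
# Pipeline shape: 61 features -> PCA(4) -> k-NN based weighting ->
# classifier, evaluated by 10-fold cross-validation.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier, NearestNeighbors
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 61))          # placeholder Doppler-derived features
y = rng.integers(0, 2, size=120)        # healthy vs. atherosclerosis labels

X4 = PCA(n_components=4).fit_transform(X)

# One plausible k-NN based weighting: replace each sample by the mean of
# its k nearest neighbours, smoothing the reduced feature space.
k = 5
nn = NearestNeighbors(n_neighbors=k).fit(X4)
_, idx = nn.kneighbors(X4)
X4w = X4[idx].mean(axis=1)

clf = KNeighborsClassifier(n_neighbors=3)   # stand-in for AIRS
print(cross_val_score(clf, X4w, y, cv=10).mean())
```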

  17. Real Time Monitoring System of Pollution Waste on Musi River Using Support Vector Machine (SVM) Method

    NASA Astrophysics Data System (ADS)

    Fachrurrozi, Muhammad; Saparudin; Erwin

    2017-04-01

    Real-time monitoring and early detection systems that measure waste quality standards in the Musi River, Palembang, Indonesia, determine air and water pollution levels. This system was designed to create an integrated monitoring system and provide real-time information that can be read. It is designed to measure acidity and water turbidity caused by industrial waste, as well as to show and provide conditional data integrated in one system. The system consists of inputting data, processing the data, and giving output based on the processed data. Turbidity, substance, and pH sensors are used as detectors that produce an analog direct-current (DC) voltage. The early detection system works by determining threshold values for ammonia, acidity, and turbidity levels of water in the Musi River. The results are then classified into pollution-level groups using the Support Vector Machine method.
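
    The classification stage reduces to mapping a sensor vector (pH, turbidity, ammonia) to a pollution class with an SVM. The sketch below is illustrative only; the data, thresholds, and labeling rule are invented stand-ins for the river measurements.

```python
# Train an SVM on placeholder sensor readings, then classify one new
# reading into a pollution-level group.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(42)
# Columns: pH, turbidity (NTU), ammonia (mg/L)
X = np.column_stack([rng.uniform(5.0, 9.0, 300),
                     rng.uniform(0.0, 100.0, 300),
                     rng.uniform(0.0, 2.0, 300)])
# Toy labeling rule standing in for ground-truth pollution classes.
y = ((X[:, 1] > 50) | (X[:, 2] > 1.0)).astype(int)   # 1 = polluted

clf = SVC(kernel="rbf", gamma="scale").fit(X[:200], y[:200])
print("held-out accuracy:", clf.score(X[200:], y[200:]))

reading = [[6.8, 72.0, 0.4]]            # one new sensor reading
print("pollution class:", clf.predict(reading)[0])
```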

  18. Design modification and optimisation of the perfusion system of a tri-axial bioreactor for tissue engineering.

    PubMed

    Hussein, Husnah; Williams, David J; Liu, Yang

    2015-07-01

    A systematic design of experiments (DOE) approach was used to optimize the perfusion process of a tri-axial bioreactor designed for translational tissue engineering exploiting mechanical stimuli and mechanotransduction. Four controllable design parameters affecting the perfusion process were identified in a cause-effect diagram as potential improvement opportunities. A screening process was used to separate out the factors that have the largest impact from the insignificant ones. DOE was employed to find the settings of the platen design, return tubing configuration and the elevation difference that minimise the load on the pump and variation in the perfusion process and improve the controllability of the perfusion pressures within the prescribed limits. DOE was very effective for gaining increased knowledge of the perfusion process and optimizing the process for improved functionality. It is hypothesized that the optimized perfusion system will result in improved biological performance and consistency.

  19. LANDSAT information for state planning

    NASA Technical Reports Server (NTRS)

    Faust, N. L.; Spann, G. W.

    1977-01-01

    The transfer of remote sensing technology for the digital processing of LANDSAT data to state and local agencies in Georgia and other southeastern states is discussed. The project consists of a series of workshops, seminars, and demonstration efforts, and the transfer of NASA-developed hardware concepts and computer software to state agencies. Throughout the multi-year effort, digital processing techniques, particularly classification algorithms, have been emphasized. Software for LANDSAT data rectification and processing has been developed and/or transferred. A hardware system is available at EES (engineering experiment station) to allow user-interactive processing of LANDSAT data. Seminars and workshops emphasize the digital approach to LANDSAT data utilization and the system improvements scheduled for LANDSATs C and D. Results of the project indicate a substantially increased awareness of the utility of digital LANDSAT processing techniques among the agencies contacted throughout the southeast. In Georgia, several agencies have jointly funded a program to map the entire state using digitally processed LANDSAT data.

  20. Performance evaluation of the croissant production line with reparable machines

    NASA Astrophysics Data System (ADS)

    Tsarouhas, Panagiotis H.

    2015-03-01

    In this study, analytical probability models for a bufferless automated serial production system consisting of n machines in series with a common transfer mechanism and control system were developed. Both time to failure and time to repair are assumed to follow exponential distributions. Applying those models, the effect of system parameters on system performance in an actual croissant production line was studied. The production line consists of six workstations with different numbers of repairable machines in series. Mathematical models of the croissant production line were developed using a Markov process. The strength of this study is in the classification of the whole system into states representing failures of different machines. Failure and repair data from the actual production environment were used to estimate the reliability and maintainability of each machine, workstation, and the entire line based on the analytical models. The analysis provides useful insight into the system's behaviour, helps to find inherent design faults, and suggests optimal modifications to upgrade the system and improve its performance.
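
    For a single repairable machine with exponential failure rate λ and repair rate μ, the steady-state availability is μ/(λ + μ); for a bufferless series line with independent machines, the line availability is the product of the machine availabilities. The sketch below applies this with invented rates for six workstations; it is a simplification of the paper's full Markov state classification.

```python
# Steady-state availability of a bufferless series line, assuming
# independent machines with exponential failure and repair times.
import numpy as np

def availability(lam, mu):
    """Availability of one repairable machine: mu / (lam + mu)."""
    return mu / (lam + mu)

# Six workstations with illustrative failure/repair rates (per hour).
lams = [0.01, 0.02, 0.015, 0.01, 0.03, 0.02]
mus  = [0.5,  0.4,  0.5,   0.6,  0.3,  0.4]

station_avail = [availability(l, m) for l, m in zip(lams, mus)]
line_avail = np.prod(station_avail)     # line is down if any machine is down
print([round(a, 4) for a in station_avail], round(line_avail, 4))
```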

  1. Integrated Information Systems for Electronic Chemotherapy Medication Administration

    PubMed Central

    Levy, Mia A.; Giuse, Dario A.; Eck, Carol; Holder, Gwen; Lippard, Giles; Cartwright, Julia; Rudge, Nancy K.

    2011-01-01

    Introduction: Chemotherapy administration is a highly complex and distributed task in both the inpatient and outpatient infusion center settings. The American Society of Clinical Oncology and the Oncology Nursing Society (ASCO/ONS) have developed standards that specify procedures and documentation requirements for safe chemotherapy administration. Yet paper-based approaches to medication administration have several disadvantages and do not provide any decision support for patient safety checks. Electronic medication administration that includes bar coding technology may provide additional safety checks, enable consistent documentation structure, and have additional downstream benefits. Methods: We describe the specialized configuration of clinical informatics systems for electronic chemotherapy medication administration. The system integrates the patient registration system, the inpatient order entry system, the pharmacy information system, the nursing documentation system, and the electronic health record. Results: We describe the process of deploying this infrastructure in the adult and pediatric inpatient oncology, hematology, and bone marrow transplant wards at Vanderbilt University Medical Center. We have successfully adapted the system for the oncology-specific documentation requirements detailed in the ASCO/ONS guidelines for chemotherapy administration. However, several limitations remain with regard to recording the day of treatment and dose number. Conclusion: Overall, the configured systems facilitate compliance with the ASCO/ONS guidelines and improve the consistency of documentation and multidisciplinary team communication. Our success has prompted us to deploy this infrastructure in our outpatient chemotherapy infusion centers, a process that is currently underway and that will require a few unique considerations. PMID:22043185

  2. A generic biogeochemical module for Earth system models: Next Generation BioGeoChemical Module (NGBGC), version 1.0

    NASA Astrophysics Data System (ADS)

    Fang, Y.; Huang, M.; Liu, C.; Li, H.; Leung, L. R.

    2013-11-01

    Physical and biogeochemical processes regulate soil carbon dynamics and CO2 flux to and from the atmosphere, influencing global climate changes. Integration of these processes into Earth system models (e.g., community land models (CLMs)), however, currently faces three major challenges: (1) extensive efforts are required to modify modeling structures and to rewrite computer programs to incorporate new or updated processes as new knowledge is being generated, (2) computational cost is prohibitively expensive to simulate biogeochemical processes in land models due to large variations in the rates of biogeochemical processes, and (3) various mathematical representations of biogeochemical processes exist to incorporate different aspects of fundamental mechanisms, but systematic evaluation of the different mathematical representations is difficult, if not impossible. To address these challenges, we propose a new computational framework to easily incorporate physical and biogeochemical processes into land models. The new framework consists of a new biogeochemical module, Next Generation BioGeoChemical Module (NGBGC), version 1.0, with a generic algorithm and reaction database so that new and updated processes can be incorporated into land models without the need to manually set up the ordinary differential equations to be solved numerically. The reaction database consists of processes of nutrient flow through the terrestrial ecosystems in plants, litter, and soil. This framework facilitates effective comparison studies of biogeochemical cycles in an ecosystem using different conceptual models under the same land modeling framework. The approach was first implemented in CLM and benchmarked against simulations from the original CLM-CN code. A case study was then provided to demonstrate the advantages of using the new approach to incorporate a phosphorus cycle into CLM. To our knowledge, the phosphorus-incorporated CLM is a new model that can be used to simulate phosphorus limitation on the productivity of terrestrial ecosystems. The method presented here could in theory be applied to simulate biogeochemical cycles in other Earth system models.
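
    The framework's central idea, representing each process as a database entry (source pool, sink pool, rate law) and assembling the ODE right-hand side generically, can be sketched briefly. The pools, rate constants, and first-order rate laws below are invented placeholders, not NGBGC's actual reaction database.

```python
# Assemble and integrate a carbon-pool ODE system directly from a small
# "reaction database", so adding a process means adding a data row,
# not rewriting the model code.
import numpy as np
from scipy.integrate import solve_ivp

pools = ["plant_C", "litter_C", "soil_C"]
# (source pool, sink pool or None, first-order rate constant per day)
reactions = [("plant_C", "litter_C", 0.010),
             ("litter_C", "soil_C",  0.005),
             ("soil_C",  None,       0.001)]   # respiration loss to CO2

def rhs(t, y):
    dy = np.zeros_like(y)
    for src, dst, k in reactions:
        flux = k * y[pools.index(src)]
        dy[pools.index(src)] -= flux
        if dst is not None:
            dy[pools.index(dst)] += flux
    return dy

sol = solve_ivp(rhs, (0, 3650), [100.0, 20.0, 500.0], rtol=1e-8)
print(dict(zip(pools, sol.y[:, -1].round(2))))   # pools after ten years
```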

  3. Job-mix modeling and system analysis of an aerospace multiprocessor.

    NASA Technical Reports Server (NTRS)

    Mallach, E. G.

    1972-01-01

    An aerospace guidance computer organization, consisting of multiple processors and memory units attached to a central time-multiplexed data bus, is described. A job mix for this type of computer is obtained by analysis of Apollo mission programs. Multiprocessor performance is then analyzed using: 1) queuing theory, under certain 'limiting case' assumptions; 2) Markov process methods; and 3) system simulation. Results of the analyses indicate: 1) Markov process analysis is a useful and efficient predictor of simulation results; 2) efficient job execution is not seriously impaired even when the system is so overloaded that new jobs are inordinately delayed in starting; 3) job scheduling is significant in determining system performance; and 4) a system having many slow processors may or may not perform better than a system of equal power having few fast processors, but will not perform significantly worse.

  4. Space Generic Open Avionics Architecture (SGOAA) reference model technical guide

    NASA Technical Reports Server (NTRS)

    Wray, Richard B.; Stovall, John R.

    1993-01-01

    This report presents a full description of the Space Generic Open Avionics Architecture (SGOAA). The SGOAA consists of a generic system architecture for the entities in spacecraft avionics, a generic processing architecture, and a six class model of interfaces in a hardware/software system. The purpose of the SGOAA is to provide an umbrella set of requirements for applying the generic architecture interface model to the design of specific avionics hardware/software systems. The SGOAA defines a generic set of system interface points to facilitate identification of critical interfaces and establishes the requirements for applying appropriate low level detailed implementation standards to those interface points. The generic core avionics system and processing architecture models provided herein are robustly tailorable to specific system applications and provide a platform upon which the interface model is to be applied.

  5. Poka Yoke system based on image analysis and object recognition

    NASA Astrophysics Data System (ADS)

    Belu, N.; Ionescu, L. M.; Misztal, A.; Mazăre, A.

    2015-11-01

    Poka Yoke is a method of quality management aimed at preventing faults from arising during production processes; it deals with 'fail-safing' or 'mistake-proofing'. The Poka Yoke concept was generated and developed by Shigeo Shingo for the Toyota Production System. Poka Yoke is used in many fields, especially in monitoring production processes. In many cases, identifying faults in a production process costs more than disposing of the faulty part. Usually, Poka Yoke solutions are based on multiple sensors that identify nonconformities, which means placing additional mechanical and electronic equipment on the production line. This, coupled with the fact that the method itself is invasive and affects the production process, increases the cost of diagnostics, and the bulky machines by which a Poka Yoke system can be implemented have become more sophisticated. In this paper we propose a solution for the Poka Yoke system based on image analysis and fault identification. The solution consists of a module for image acquisition, mid-level processing, and an object recognition module using associative memory (a Hopfield network). All are integrated into an embedded system with an AD (analog-to-digital) converter and Zynq 7000 (22 nm technology).
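
    The associative-memory stage can be sketched with a classical Hopfield network: store reference patterns via Hebbian weights, then iterate to recover the nearest stored pattern from a corrupted input. The binary patterns below are invented; a real deployment would encode image features from the acquisition module.

```python
# Hopfield-type associative memory: store binary templates, then
# recall the nearest stored pattern from a noisy vector.
import numpy as np

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])   # stored templates

n = patterns.shape[1]
W = sum(np.outer(p, p) for p in patterns) / n          # Hebbian weights
np.fill_diagonal(W, 0)

def recall(x, steps=10):
    x = x.copy()
    for _ in range(steps):                             # synchronous updates
        x = np.sign(W @ x)
        x[x == 0] = 1
    return x

noisy = np.array([1, -1, 1, 1, 1, -1, 1, -1])          # one bit flipped
print(recall(noisy))        # recovers the first stored pattern
```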

  6. 42 CFR 51.32 - Resolving disputes.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... such as those involving negotiation, conciliation and mediation to resolve disputes early in the... process involving negotiation, mediation and conciliation. Consistent with State and Federal laws and... system shall be held to the standard of exhaustion of remedies provided under State and Federal law. The...

  7. 42 CFR 51.32 - Resolving disputes.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... such as those involving negotiation, conciliation and mediation to resolve disputes early in the... process involving negotiation, mediation and conciliation. Consistent with State and Federal laws and... system shall be held to the standard of exhaustion of remedies provided under State and Federal law. The...

  8. 42 CFR 51.32 - Resolving disputes.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... such as those involving negotiation, conciliation and mediation to resolve disputes early in the... process involving negotiation, mediation and conciliation. Consistent with State and Federal laws and... system shall be held to the standard of exhaustion of remedies provided under State and Federal law. The...

  9. 42 CFR 51.32 - Resolving disputes.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... such as those involving negotiation, conciliation and mediation to resolve disputes early in the... process involving negotiation, mediation and conciliation. Consistent with State and Federal laws and... system shall be held to the standard of exhaustion of remedies provided under State and Federal law. The...

  10. Autonomous magnetic float zone microgravity crystal growth application to TiC and GaAs

    NASA Astrophysics Data System (ADS)

    Chan, Tony Y.-T.; Choi, Sang-Keun

    1992-10-01

    The floating zone process is ideal for high temperature (greater than 3000 K) growth of titanium carbide because it is containerless. However, float zoning requires small melt volumes in order to maintain a stable melt configuration. The short melt columns make it difficult to achieve a controlled thermal profile, a necessity for producing crystals of high quality. Thus, an automated control strategy, based upon continuous monitoring of the growth process with processing parameters adjusted to values determined by the physical transport processes of growth, is very desirable for maintaining the stability and reproducibility of the process. The present work developed a Float-zone Acquisition and Control Technology (FACT) system which uses relations derived by combining empirical relations with a knowledge base deduced from detailed numerical analysis of the fluid mechanics and thermal transport of the growth process. The FACT system was assembled, tested and employed to grow two TiC ingots. One of the ingots was characterized by x-ray diffraction at different axial locations. The x-ray rocking curves showed characteristics consistent with a manually grown ingot. It was also found that with the FACT system, the process can be operated closer to its stability limits, owing to the fast response time and repeated fine adjustments from the FACT system. The FACT system shows major potential for growing quality TiC crystals in a cost-effective manner.

  11. Modeling methodology for MLS range navigation system errors using flight test data

    NASA Technical Reports Server (NTRS)

    Karmali, M. S.; Phatak, A. V.

    1982-01-01

    Flight test data was used to develop a methodology for modeling MLS range navigation system errors. The data used corresponded to the constant velocity and glideslope approach segment of a helicopter landing trajectory. The MLS range measurement was assumed to consist of low frequency and random high frequency components. The random high frequency component was extracted from the MLS range measurements. This was done by appropriate filtering of the range residual generated from a linearization of the range profile for the final approach segment. This range navigation system error was then modeled as an autoregressive moving average (ARMA) process. Maximum likelihood techniques were used to identify the parameters of the ARMA process.
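
    The modeling step, fitting an ARMA process to the extracted range residual by maximum likelihood, can be sketched with statsmodels. The synthetic ARMA(1,1) series below stands in for the detrended MLS range data; its true coefficients are known, so the fit can be checked.

```python
# Fit an ARMA(1,1) model by maximum likelihood to a synthetic residual
# series standing in for the detrended MLS range error.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
e = rng.normal(size=501)
# Simulate: x_t = 0.7 x_{t-1} + e_t + 0.3 e_{t-1}
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.7 * x[t - 1] + e[t] + 0.3 * e[t - 1]

fit = ARIMA(x, order=(1, 0, 1)).fit()   # ARMA(1,1): d = 0
print(fit.params)                        # AR, MA estimates near 0.7, 0.3
```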

  12. Large - scale Rectangular Ruler Automated Verification Device

    NASA Astrophysics Data System (ADS)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

    This paper introduces an automated verification device for large-scale rectangular rulers, which consists of a photoelectric autocollimator, a self-designed mechanical drive car, and an automatic data acquisition system. The mechanical design of the device covers the optical axis, the drive, the fixture, and the wheels. The control system design covers hardware and software: the hardware is based mainly on a single-chip microcontroller, and the software implements the photoelectric autocollimator control and the automatic data acquisition process. The device can acquire verticality measurement data automatically. The reliability of the device is verified by experimental comparison, and the results meet the requirements of the right-angle test procedure.

  13. A multiarchitecture parallel-processing development environment

    NASA Technical Reports Server (NTRS)

    Townsend, Scott; Blech, Richard; Cole, Gary

    1993-01-01

    A description is given of the hardware and software of a multiprocessor test bed - the second generation Hypercluster system. The Hypercluster architecture consists of a standard hypercube distributed-memory topology, with multiprocessor shared-memory nodes. By using standard, off-the-shelf hardware, the system can be upgraded to use rapidly improving computer technology. The Hypercluster's multiarchitecture nature makes it suitable for researching parallel algorithms in computational field simulation applications (e.g., computational fluid dynamics). The dedicated test-bed environment of the Hypercluster and its custom-built software allows experiments with various parallel-processing concepts such as message passing algorithms, debugging tools, and computational 'steering'. Such research would be difficult, if not impossible, to achieve on shared, commercial systems.

  14. Sociotechnical attributes of safe and unsafe work systems.

    PubMed

    Kleiner, Brian M; Hettinger, Lawrence J; DeJoy, David M; Huang, Yuang-Hsiang; Love, Peter E D

    2015-01-01

    Theoretical and practical approaches to safety based on sociotechnical systems principles place heavy emphasis on the intersections between social-organisational and technical-work process factors. Within this perspective, work system design emphasises factors such as the joint optimisation of social and technical processes, a focus on reliable human-system performance and safety metrics as design and analysis criteria, the maintenance of a realistic and consistent set of safety objectives and policies, and regular access to the expertise and input of workers. We discuss three current approaches to the analysis and design of complex sociotechnical systems: human-systems integration, macroergonomics and safety climate. Each approach emphasises key sociotechnical systems themes, and each prescribes a more holistic perspective on work systems than do traditional theories and methods. We contrast these perspectives with historical precedents such as system safety and traditional human factors and ergonomics, and describe potential future directions for their application in research and practice. The identification of factors that can reliably distinguish between safe and unsafe work systems is an important concern for ergonomists and other safety professionals. This paper presents a variety of sociotechnical systems perspectives on intersections between social-organisational and technical-work process factors as they impact work system analysis, design and operation.

  15. Distributed software framework and continuous integration in hydroinformatics systems

    NASA Astrophysics Data System (ADS)

    Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao

    2017-08-01

    When encountering multiple and complicated models, multisource structured and unstructured data, and complex requirements analysis, the platform design and integration of hydroinformatics systems become a challenge. To properly solve these problems, we describe a distributed software framework and its continuous integration process in hydroinformatics systems. This distributed framework mainly consists of a server cluster for models, a distributed database, GIS (Geographic Information System) servers, a master node, and clients. Based on it, a GIS-based decision support system for the joint regulation of water quantity and water quality of a group of lakes in Wuhan, China, was established.

  16. A new approach to criteria for health risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spickett, Jeffery, E-mail: J.Spickett@curtin.edu.au; Faculty of Health Sciences, School of Public Health, Curtin University, Perth, Western Australia; Katscherian, Dianne

    2012-01-15

    Health Impact Assessment (HIA) is a developing component of the overall impact assessment process and as such needs access to procedures that can enable more consistent approaches to the stepwise process that is now generally accepted in both EIA and HIA. The guidelines developed during this project provide a structured process, based on risk assessment procedures which use consequences and likelihood, as a way of ranking risks of adverse health outcomes from activities subjected to HIA, or to HIA as part of EIA. The aim is to assess the potential for both acute and chronic health outcomes. The consequences component also identifies a series of consequences for the health care system, expressed in terms of financial expenditure and the capacity of the health system. These more specific health risk assessment characteristics should provide for a broader consideration of health consequences and a more consistent estimation of the adverse health risks of a proposed development at both the scoping and risk assessment stages of the HIA process. - Highlights: A more objective approach to health risk assessment is provided; objective sets of criteria are given for the consequences of chronic and acute impacts, for the consequences on the health care system, and for the frequency of events that could impact health; the approach presented is currently being trialled in Australia.

  17. Two-step rapid sulfur capture. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-04-01

    The primary goal of this program was to test the technical and economic feasibility of a novel dry sorbent injection process called the Two-Step Rapid Sulfur Capture process for several advanced coal utilization systems. The Two-Step Rapid Sulfur Capture process consists of limestone activation in a high temperature auxiliary burner for short times followed by sorbent quenching in a lower temperature sulfur containing coal combustion gas. The Two-Step Rapid Sulfur Capture process is based on the Non-Equilibrium Sulfur Capture process developed by the Energy Technology Office of Textron Defense Systems (ETO/TDS). Based on the Non-Equilibrium Sulfur Capture studies the range of conditions for optimum sorbent activation were thought to be: activation temperature > 2,200 K for activation times in the range of 10-30 ms. Therefore, the aim of the Two-Step process is to create a very active sorbent (under conditions similar to the bomb reactor) and complete the sulfur reaction under thermodynamically favorable conditions. A flow facility was designed and assembled to simulate the temperature, time, stoichiometry, and sulfur gas concentration prevalent in advanced coal utilization systems such as gasifiers, fluidized bed combustors, mixed-metal oxide desulfurization systems, diesel engines, and gas turbines.

  18. Collaborative problem solving with a total quality model.

    PubMed

    Volden, C M; Monnig, R

    1993-01-01

    A collaborative problem-solving system committed to the interests of those involved complies with the teachings of the total quality management movement in health care. Deming espoused that any quality system must become an integral part of routine activities. A process that is used consistently in dealing with problems, issues, or conflicts provides a mechanism for accomplishing total quality improvement. The collaborative problem-solving process described here results in quality decision-making. This model incorporates Ishikawa's cause-and-effect (fishbone) diagram, Moore's key causes of conflict, and the steps of the University of North Dakota Conflict Resolution Center's collaborative problem solving model.

  19. Quantum thermodynamic cycles and quantum heat engines. II.

    PubMed

    Quan, H T

    2009-04-01

    We study the quantum-mechanical generalization of force or pressure, and then we extend the classical thermodynamic isobaric process to quantum-mechanical systems. Based on these efforts, we are able to study the quantum version of thermodynamic cycles that consist of quantum isobaric processes, such as the quantum Brayton cycle and quantum Diesel cycle. We also consider the implementation of the quantum Brayton cycle and quantum Diesel cycle with some model systems, such as single particle in a one-dimensional box and single-mode radiation field in a cavity. These studies lay the microscopic (quantum-mechanical) foundation for Szilard-Zurek single-molecule engine.

  20. Fuzzy Logic-Based Audio Pattern Recognition

    NASA Astrophysics Data System (ADS)

    Malcangi, M.

    2008-11-01

    Audio and audio-pattern recognition is becoming one of the most important technologies for automatically controlling embedded systems. Fuzzy logic may be the most important enabling methodology due to its ability to rapidly and economically model such applications. An audio and audio-pattern recognition engine based on fuzzy logic has been developed for use in very low-cost and deeply embedded systems to automate human-to-machine and machine-to-machine interaction. This engine consists of simple digital signal-processing algorithms for feature extraction and normalization, and a set of pattern-recognition rules manually tuned or automatically tuned by a self-learning process.

  1. GAP: yet another image processing system for solar observations.

    NASA Astrophysics Data System (ADS)

    Keller, C. U.

    GAP is a versatile, interactive image processing system for analyzing solar observations, in particular extended time sequences, and for preparing publication quality figures. It consists of an interpreter that is based on a language with a control flow similar to PASCAL and C. The interpreter may be accessed from a command line editor and from user-supplied functions, procedures, and command scripts. GAP is easily expandable via external FORTRAN programs that are linked to the GAP interface routines. The current version of GAP runs on VAX, DECstation, Sun, and Apollo computers. Versions for MS-DOS and OS/2 are in preparation.

  2. Static Frequency Converter System Installed and Tested

    NASA Technical Reports Server (NTRS)

    Brown, Donald P.; Sadhukhan, Debashis

    2003-01-01

    A new Static Frequency Converter (SFC) system has been installed and tested at the NASA Glenn Research Center's Central Air Equipment Building to provide consistent, reduced motor start times and improved reliability for the building's 14 large exhausters and compressors. The operational start times have been consistent around 2 min, 20 s per machine. This is at least a 3-min improvement (per machine) over the old variable-frequency motor generator sets. The SFC was designed and built by Asea Brown Boveri (ABB) and installed by Encompass Design Group (EDG) as part of a Construction of Facilities project managed by Glenn (Robert Scheidegger, project manager). The authors designed the Central Process Distributed Control Systems interface and control between the programmable logic controller, solid-state exciter, and switchgear, which was constructed by Gilcrest Electric.

  3. Experimental analysis of pressure controlled atomization process (PCAP) coatings for replacement of hard chromium plating

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tierney, J.C.; Glovan, R.J.; Witt, S.J.

    1995-12-31

    A four-phase experimental design was utilized to evaluate the abrasive wear and corrosion protection characteristics of VERSAlloy 50 coatings applied to AISI 4130 steel sheet. The coatings were applied with the Pressure Controlled Atomization Process (PCAP), a new thermal spray process being developed for the United States Air Force to replace hard chromium plating. Phase 1 of the design consisted of an evaluation of deposit profiles that were sprayed at five different standoff distances. Profile measurements yielded standard deviations (σ) of the plume at each of the spray distances. Phase 2 consisted of a completely randomized series of eight spray tests in which the track gap, or distance between consecutive spray passes, was varied by amounts of 0.5σ, 1σ, 2σ, and 3σ. The sprayed test coupons were then evaluated for corrosion protection, abrasive wear resistance, microhardness, and porosity. Results from Phase 2 were used to determine the best track gap or overlap for Phase 3 and Phase 4 testing. Phase 3 consisted of a 22-run central composite design; the test coupons were evaluated as in Phase 2. Statistical analysis of Phase 3 data revealed that the optimal system operating parameters produced coatings that would provide either superior corrosion protection or superior resistance to abrasive wear. Phase 4 consisted of four spray tests to validate the results obtained in Phase 3, with the test coupons again evaluated using the same analyses as in Phases 2 and 3. The validation tests indicated that PCAP system operating parameters could be controlled to produce VERSAlloy 50 coatings with superior corrosion protection or resistance to abrasive wear.

  4. Exploring the evolutionary mechanism of complex supply chain systems using evolving hypergraphs

    NASA Astrophysics Data System (ADS)

    Suo, Qi; Guo, Jin-Li; Sun, Shiwei; Liu, Han

    2018-01-01

    A new evolutionary model is proposed to describe the characteristics and evolution pattern of supply chain systems using evolving hypergraphs, in which nodes represent enterprise entities while hyperedges represent the relationships among diverse trades. The nodes arrive at the system in accordance with a Poisson process, with the evolving process incorporating the addition of new nodes, linking of old nodes, and rewiring of links. Grounded in the Poisson process theory and continuum theory, the stationary average hyperdegree distribution is shown to follow a shifted power law (SPL), and the theoretical predictions are consistent with the results of numerical simulations. Testing the impact of parameters on the model yields a positive correlation between hyperdegree and degree. The model also uncovers macro characteristics of the relationships among enterprises due to the microscopic interactions among individuals.
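
    The growth mechanism, Poisson arrivals plus hyperdegree-preferential attachment, can be sketched with an urn trick: keeping each node in a list once per unit of hyperdegree makes a uniform draw a preferential draw. The simulation below omits rewiring and uses invented parameters, so it only illustrates the heavy-tailed hyperdegree distribution, not the paper's full SPL derivation.

```python
# Toy preferential-attachment simulation in the spirit of the model:
# each new node forms a hyperedge with m existing nodes chosen
# proportionally to their hyperdegree.
import random
from collections import Counter

random.seed(0)
m = 3
# Each node appears in the urn once per unit of hyperdegree, so a
# uniform draw from the urn is a hyperdegree-preferential draw.
urn = [0, 1, 2, 3]
hyperdegree = Counter({0: 1, 1: 1, 2: 1, 3: 1})

for new in range(4, 20000):
    chosen = set()
    while len(chosen) < m:                 # m distinct partners
        chosen.add(random.choice(urn))
    for v in chosen:
        hyperdegree[v] += 1
        urn.append(v)
    hyperdegree[new] = 1                   # new node joins one hyperedge
    urn.append(new)

counts = Counter(hyperdegree.values())
print([(k, counts[k]) for k in sorted(counts)[:6]])  # heavy-tailed counts
```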

  5. Remediating ethylbenzene-contaminated clayey soil by a surfactant-aided electrokinetic (SAEK) process.

    PubMed

    Yuan, Ching; Weng, Chih-Huang

    2004-10-01

    The objectives of this research are to investigate the remediation efficiency and electrokinetic behavior of ethylbenzene-contaminated clay in a surfactant-aided electrokinetic (SAEK) process under a potential gradient of 2 V cm(-1). Experimental results indicated that the type of processing fluid played a key role in determining the removal performance of ethylbenzene from clay in the SAEK process. A mixed surfactant system consisting of 0.5% SDS and 2.0% PANNOX 110 showed the best ethylbenzene removal performance in the SAEK system. The removal efficiency of ethylbenzene was determined to be 63-98% in the SAEK system, while only 40% was achieved in an electrokinetic system with tap water as the processing fluid. It was found that ethylbenzene accumulated in the vicinity of the anode in an electrokinetic system with tap water as the processing fluid. However, the concentration front of ethylbenzene was shifted toward the cathode in the SAEK system. The electroosmotic permeability and power consumption were 0.17 x 10(-6)-3.01 x 10(-6) cm(2)V(-1)s(-1) and 52-123 kW h m(-3), respectively. The cost, including the expense of energy and surfactants, was estimated to be 5.15-12.65 USD m(-3) for SAEK systems, which was 2.0-4.9 times greater than that of the electrokinetic-alone system (2.6 USD m(-3)). Nevertheless, taking the remediation efficiency of ethylbenzene and the energy expenditure into account for the overall process performance evaluation, the SAEK system was still a cost-effective alternative treatment method.

  6. Extension of ERIM multispectral data processing capabilities through improved data handling techniques

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.

    1973-01-01

    The improvement and extension of the capabilities of the Environmental Research Institute of Michigan processing facility in handling multispectral data are discussed. Improvements consisted of implementing hardware modifications which permitted more rapid access to the recorded data through improved numbering and indexing of such data. In addition, techniques are discussed for handling data from sources other than the ERIM M-5 and M-7 scanner systems.

  7. `Counterfactual' interpretation of the quantum measurement process

    NASA Astrophysics Data System (ADS)

    Nisticò, Giuseppe

    1997-08-01

    The question of the determination of the state of the system during a measurement experiment is discussed within quantum theory, as part of the more general measurement problem. I propose a counterfactual interpretation of the measurement process which answers the question from a conceptual point of view. This interpretation turns out to be consistent with the predictions of quantum theory, but it presents difficulties from an operational point of view.

  8. Foundations of Observation: Considerations for Developing a Classroom Observation System That Helps Districts Achieve Consistent and Accurate Scores. MET Project, Policy and Practice Brief

    ERIC Educational Resources Information Center

    Joe, Jilliam N.; Tocci, Cynthia M.; Holtzman, Steven L.; Williams, Jean C.

    2013-01-01

    The purpose of this paper is to provide states and school districts with processes they can use to help ensure high-quality data collection during teacher observations. Educational Testing Service's (ETS's) goal in writing it is to share the knowledge and expertise they gained: (1) from designing and implementing scoring processes for the Measures…

  9. Perchlorate and nitrate treatment by ion exchange integrated with biological brine treatment.

    PubMed

    Lehman, S Geno; Badruzzaman, Mohammad; Adham, Samer; Roberts, Deborah J; Clifford, Dennis A

    2008-02-01

    Groundwater contaminated with perchlorate and nitrate was treated in a pilot plant using a commercially available ion exchange (IX) resin. Regenerant brine concentrate from the IX process, containing high perchlorate and nitrate, was treated biologically, and the treated brine was reused in IX resin regeneration. The nitrate concentration of the feed water determined the exhaustion lifetime (i.e., regeneration frequency) of the resin, and the regeneration condition was determined by the perchlorate elution profile from the exhausted resin. The biological brine treatment system, using a salt-tolerant perchlorate- and nitrate-reducing culture, was housed in a sequencing batch reactor (SBR). The biological process consistently reduced perchlorate and nitrate concentrations in the spent brine to below the treatment goals of 500 microg ClO4(-)/L and 0.5 mg NO3(-)-N/L established by equilibrium multicomponent IX modeling. During 20 cycles of regeneration, the system consistently treated the drinking water to below the MCL for nitrate (10 mg NO3(-)-N/L) and the California Department of Health Services (CDHS) notification level for perchlorate (6 microg/L). A conceptual cost analysis estimated perchlorate and nitrate treatment using the IX process with biological brine treatment to be approximately 20% less expensive than conventional IX with brine disposal.
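
    As background for why resin run length and regeneration are governed by anion elution, a toy binary ion-exchange equilibrium in the standard separation-factor form illustrates how strongly a preferred anion such as perchlorate loads onto the resin even at trace aqueous fractions. This is a textbook relation, not the paper's multicomponent model, and the separation factor used below is an assumed value:

      # y/(1-y) = alpha * x/(1-x)  =>  y = alpha*x / (1 + (alpha - 1)*x),
      # where x is the aqueous equivalent fraction of the preferred anion
      # and y its fraction on the resin sites.
      def resin_fraction(x, alpha):
          return alpha * x / (1 + (alpha - 1) * x)

      # Perchlorate is strongly preferred on conventional resins (large alpha, assumed here).
      for x in (0.0001, 0.001, 0.01):
          print(x, round(resin_fraction(x, alpha=100.0), 4))
      # Even at x = 0.001 in solution, the resin loading reaches ~0.09:
      # trace perchlorate dominates capacity, so its elution profile sets regeneration.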

  10. The Development and Technical Adequacy of Seventh-Grade Reading Comprehension Measures in a Progress Monitoring Assessment System. Technical Report #1102

    ERIC Educational Resources Information Center

    Park, Bitnara Jasmine; Alonzo, Julie; Tindal, Gerald

    2011-01-01

    This technical report describes the process of development and piloting of reading comprehension measures that are appropriate for seventh-grade students as part of an online progress screening and monitoring assessment system, http://easycbm.com. Each measure consists of an original fictional story of approximately 1,600 to 1,900 words with 20…

  11. Fuel Processor Development for a Soldier-Portable Fuel Cell System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palo, Daniel R.; Holladay, Jamie D.; Rozmiarek, Robert T.

    2002-01-01

    Battelle is currently developing a soldier-portable power system for the U.S. Army that will continuously provide 15 W (25 W peak) of base load electric power for weeks or months using a micro technology-based fuel processor. The fuel processing train consists of a combustor, two vaporizers, and a steam-reforming reactor. This paper describes the concept and experimental progress to date.

  12. Application of agent-based system for bioprocess description and process improvement.

    PubMed

    Gao, Ying; Kipling, Katie; Glassey, Jarka; Willis, Mark; Montague, Gary; Zhou, Yuhong; Titchener-Hooker, Nigel J

    2010-01-01

    Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making, either to maintain process consistency or to identify optimal operating conditions. To predict whole-bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze bioprocesses based on a whole-process understanding, considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases allowing fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components cooperate with each other in performing their tasks: describing the whole process behavior, evaluating process operating conditions, monitoring the operating processes, predicting critical process performance, and providing guidance for decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, it can be applied to identify abnormal process operation events and then to suggest how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. The implementation of the agent-based approach is illustrated via selected application scenarios, which demonstrate how such a framework may enable better integration of process operations by providing a plant-wide process description to facilitate process improvement. Copyright 2009 American Institute of Chemical Engineers.
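
    A minimal sketch of the agent-based idea follows; the class names, unit models, and knowledge-base entries are hypothetical, not from the paper. Each process unit is wrapped by an agent, a coordinator chains the agents so inter-unit interactions propagate downstream, and each unit's output is checked against the knowledge base to flag deviations:

      class UnitAgent:
          """Wraps one process unit; `model` maps a stream dict to a stream dict."""
          def __init__(self, name, model):
              self.name, self.model = name, model

          def run(self, stream):
              return self.model(stream)

      class Coordinator:
          """Chains unit agents so upstream outcomes feed downstream units."""
          def __init__(self, agents, knowledge_base):
              self.agents, self.kb = agents, knowledge_base

          def run_sequence(self, feed):
              stream, log = feed, []
              for agent in self.agents:
                  stream = agent.run(stream)
                  limit = self.kb.get(agent.name, {}).get("min_yield", 0.0)
                  log.append((agent.name, stream["yield"], stream["yield"] >= limit))
              return stream, log

      # Illustrative two-unit sequence: fermentation then capture chromatography.
      ferment = UnitAgent("fermentation", lambda s: {**s, "yield": s["yield"] * 0.9})
      capture = UnitAgent("capture", lambda s: {**s, "yield": s["yield"] * 0.8})
      kb = {"fermentation": {"min_yield": 0.85}, "capture": {"min_yield": 0.7}}

      final, log = Coordinator([ferment, capture], kb).run_sequence({"yield": 1.0})
      print(final["yield"])  # ~0.72: a whole-process prediction, not per-unit
      for name, y, ok in log:
          print(name, round(y, 2), "OK" if ok else "DEVIATION")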

  13. The General Mission Analysis Tool (GMAT) System Test Plan

    NASA Technical Reports Server (NTRS)

    Conway, Darrel J.; Hughes, Steven P.

    2007-01-01

    This document serves as the System Test Approach for the GMAT Project. Preparation for system testing consists of three major stages: 1) the Test Approach sets the scope of system testing, the overall strategy to be adopted, the activities to be completed, the general resources required, and the methods and processes to be used to test the release; 2) Test Planning details the activities, dependencies, and effort required to conduct the System Test; and 3) Test Cases documents the tests to be applied, the data to be processed, the automated testing coverage, and the expected results. This document covers the first two of these items and establishes the framework used for GMAT test case development. The test cases themselves exist as separate components and are managed outside of, and concurrently with, this System Test Plan.
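
    As an illustration of the third stage, a test-case record would capture the test applied, the data processed, whether it counts toward automated coverage, and the expected result. The field names and values below are invented for illustration, not GMAT's actual test-case schema:

      from dataclasses import dataclass

      @dataclass
      class TestCase:
          test_id: str           # hypothetical identifier
          description: str       # the test to be applied
          input_data: str        # script or data file to be processed
          automated: bool        # counts toward automated testing coverage
          expected_result: str   # pass criterion

      tc = TestCase("GMAT-ORB-001", "Two-body propagation, 1 day",
                    "sample_orbit.script", True,
                    "final state within tolerance of reference truth")
      print(tc.test_id, "automated" if tc.automated else "manual")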

  14. Exploration criteria for low permeability geothermal resources. Final report. [Coso KGRA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Norton, D.

    1977-10-01

    Low permeability geothermal systems related to high-temperature plutons in the upper crust were analyzed in order to ascertain those characteristics of these systems which could be detected by surface and shallow subsurface exploration methods. The analyses were designed to integrate data and concepts from the literature relating to transport processes with computer simulation of idealized systems. The systems were analyzed by systematically varying input parameters in order to understand their effect on the variables which might be measured in an exploration-assessment program. The methods were applied to a prospective system in its early stages of evaluation, using data from the Coso system. The study represents a first-order approximation to transport processes in geothermal systems, which consist of high-temperature intrusions, host rock, and fluids. An appendix includes operating procedures for the interactive graphics programs developed during the study. (MHR)
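
    Stripped to its essentials, the methodology is a parameter sweep: vary the inputs of a simplified system and record the observables an exploration program could actually measure. The sketch below uses a crude conductive heat-flux proxy with a toy convective enhancement; every number and the scaling itself are invented for illustration and do not reflect the report's transport model:

      import itertools

      K_ROCK = 2.5          # thermal conductivity, W m^-1 K^-1 (typical crustal value)
      T_SURF = 10.0         # surface temperature, deg C

      def heat_flux(t_pluton, depth_m, log10_perm):
          """Near-surface heat-flux proxy: conduction times a toy convective boost."""
          conduction = K_ROCK * (t_pluton - T_SURF) / depth_m
          convective_boost = max(1.0, 1.0 + 0.5 * (log10_perm + 16))  # invented scaling
          return conduction * convective_boost  # W m^-2

      # Systematically vary intrusion temperature, depth, and host-rock permeability.
      for T, d, logk in itertools.product([600, 800], [3000, 5000], [-17, -15, -13]):
          print(f"T={T} C  depth={d} m  log10(k)={logk}: q={heat_flux(T, d, logk):.3f} W/m^2")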

  15. System and method for air temperature control in an oxygen transport membrane based reactor

    DOEpatents

    Kelly, Sean M

    2016-09-27

    A system and method for air temperature control in an oxygen transport membrane based reactor is provided. The system and method involves introducing a specific quantity of cooling air or trim air in between stages in a multistage oxygen transport membrane based reactor or furnace to maintain generally consistent surface temperatures of the oxygen transport membrane elements and associated reactors. The associated reactors may include reforming reactors, boilers or process gas heaters.
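
    A minimal sketch of the control idea follows; the setpoint, gain, and stage model are invented for illustration, since the abstract does not specify a control law. Trim air is metered between stages in proportion to the surface-temperature error:

      SETPOINT_C = 900.0   # assumed target membrane surface temperature, deg C
      KP = 0.005           # assumed proportional gain, (kg/s) per deg C

      def trim_air_flow(surface_temp_c, base_flow_kg_s=0.5):
          """More cooling air when the stage runs hot; flow never goes negative."""
          error = surface_temp_c - SETPOINT_C
          return max(0.0, base_flow_kg_s + KP * error)

      # Toy multistage pass: each stage releases heat, then trim air cools it.
      temp = 920.0
      for stage in range(1, 5):
          temp += 60.0                  # invented per-stage heat release, deg C
          flow = trim_air_flow(temp)
          temp -= 100.0 * flow          # invented cooling effectiveness, deg C per (kg/s)
          print(f"stage {stage}: trim air {flow:.3f} kg/s -> surface {temp:.1f} C")
      # Note: pure proportional control settles near, not exactly at, the setpoint.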

  16. System and method for temperature control in an oxygen transport membrane based reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, Sean M.

    A system and method for temperature control in an oxygen transport membrane based reactor is provided. The system and method involves introducing a specific quantity of cooling air or trim air in between stages in a multistage oxygen transport membrane based reactor or furnace to maintain generally consistent surface temperatures of the oxygen transport membrane elements and associated reactors. The associated reactors may include reforming reactors, boilers or process gas heaters.

  17. Definition and Proposed Realization of the International Height Reference System (IHRS)

    NASA Astrophysics Data System (ADS)

    Ihde, Johannes; Sánchez, Laura; Barzaghi, Riccardo; Drewes, Hermann; Foerste, Christoph; Gruber, Thomas; Liebsch, Gunter; Marti, Urs; Pail, Roland; Sideris, Michael

    2017-05-01

    Studying, understanding and modelling global change require geodetic reference frames whose accuracy exceeds the magnitude of the effects under study and which offer high consistency and reliability worldwide. The International Association of Geodesy, which provides the precise geodetic infrastructure for monitoring the Earth system, promotes the implementation of an integrated global geodetic reference frame that supports consistent analysis and modelling of global phenomena and processes affecting the Earth's gravity field, the Earth's surface geometry and the Earth's rotation. The definition, realization, maintenance and wide utilization of the International Terrestrial Reference System guarantee a globally unified geometric reference frame with an accuracy at the millimetre level. An equivalent high-precision global physical reference frame that supports the reliable description of changes in the Earth's gravity field (such as sea level variations, mass displacements, and processes associated with geophysical fluids) is still missing. This paper addresses the theoretical foundations supporting the implementation of such a physical reference surface in terms of an International Height Reference System and provides guidance for the coming activities required for the practical and sustainable realization of this system. Based on the conceptual approaches of physical geodesy, the requirements for a unified global height reference system are derived, and, in accordance with existing practice, its realization as the International Height Reference Frame is designed. Further steps for the implementation are also proposed.
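
    In the standard formulation of physical geodesy, the coordinate such a physical reference frame assigns to a point P is its geopotential number, referred to a conventional reference potential W0 (the value below is the one adopted by IAG resolution in 2015); a metric height then follows by dividing by a mean or normal gravity value along the plumb line:

      % Geopotential number: W_0 is the conventional reference potential,
      % W(P) the actual gravity potential at point P.
      C_P = W_0 - W(P), \qquad W_0 = 62\,636\,853.4\ \mathrm{m^2\,s^{-2}}

      % Normal height: divide by the mean normal gravity \bar{\gamma}_P.
      H_P^{\mathrm{normal}} = \frac{C_P}{\bar{\gamma}_P}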

  18. Altitude deviations: Breakdowns of an error-tolerant system

    NASA Technical Reports Server (NTRS)

    Palmer, Everett A.; Hutchins, Edwin L.; Ritter, Richard D.; Vancleemput, Inge

    1993-01-01

    Pilot reports of aviation incidents to the Aviation Safety Reporting System (ASRS) provide a window on the problems occurring in today's airline cockpits. The narratives of 10 pilot reports of errors made in the automation-assisted altitude-change task are used to illustrate some of the issues of pilots interacting with automatic systems. These narratives are then used to construct a description of the cockpit as an information processing system. The analysis concentrates on the error-tolerant properties of the system and on how breakdowns can occasionally occur. An error-tolerant system can detect and correct its internal processing errors. The cockpit system consists of two or three pilots supported by autoflight, flight-management, and alerting systems. These humans and machines have distributed access to clearance information and perform redundant processing of information. Errors can be detected as deviations from either expected behavior or expected information. Breakdowns in this system can occur when the checking and cross-checking tasks that give the system its error-tolerant properties are not performed because of distractions or other task demands. Recommendations based on the analysis are given for improving the error tolerance of the cockpit system.

  19. A Microelectrode Array with Reproducible Performance Shows Loss of Consistency Following Functionalization with a Self-Assembled 6-Mercapto-1-hexanol Layer.

    PubMed

    Corrigan, Damion K; Vezza, Vincent; Schulze, Holger; Bachmann, Till T; Mount, Andrew R; Walton, Anthony J; Terry, Jonathan G

    2018-06-09

    For analytical applications involving label-free biosensors and multiple measurements, i.e., across an electrode array, it is essential to develop complete sensor systems capable of functionalization and of producing highly consistent responses. To achieve this, a multi-microelectrode device bearing twenty-four equivalent 50 µm diameter Pt disc microelectrodes was designed in an integrated 3-electrode system configuration and then fabricated. Cyclic voltammetry and electrochemical impedance spectroscopy were used for initial electrochemical characterization of the individual working electrodes. These confirmed the expected consistency of performance, with a high degree of measurement reproducibility for each microelectrode across the array. With the aim of assessing the potential for production of an enhanced multi-electrode sensor for biomedical use, the working electrodes were then functionalized with 6-mercapto-1-hexanol (MCH). This is a well-known and commonly employed surface modification process, which involves the same principles of thiol attachment chemistry and self-assembled monolayer (SAM) formation commonly employed in the functionalization of electrodes and the formation of biosensors. Following this SAM formation, the reproducibility of the observed electrochemical signal between electrodes decreased markedly, compromising the ability to achieve consistent analytical measurements from the sensor array after this relatively simple and well-established surface modification. To functionalize the sensors successfully and consistently, it was necessary to dilute the constituent molecules by a factor of ten thousand to support adequate SAM formation on the microelectrodes. The use of this multi-electrode device therefore demonstrates, in a high-throughput manner, irreproducibility in the SAM formation process at the higher concentration, even though the electrodes were functionalized simultaneously in the same film formation environment. This confirms that the significant electrode-to-electrode variation often seen in label-free SAM biosensing films formed under such conditions is likely due not to variation in film deposition conditions but to kinetically controlled variation in the SAM layer formation process at these microelectrodes.

  20. PRISM Software: Processing and Review Interface for Strong‐Motion Data

    USGS Publications Warehouse

    Jones, Jeanne M.; Kalkan, Erol; Stephens, Christopher D.; Ng, Peter

    2017-01-01

    A continually increasing number of high‐quality digital strong‐motion records from stations of the National Strong Motion Project (NSMP) of the U.S. Geological Survey, as well as data from regional seismic networks within the United States, calls for automated processing of strong‐motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. In combination with the Advanced National Seismic System Quake Monitoring System (AQMS), PRISM automates the processing of strong‐motion records. When used without AQMS, PRISM provides batch‐processing capabilities. The PRISM software is platform independent (coded in Java), open source, and does not depend on any closed‐source or proprietary software. The software consists of two major components: a record processing engine composed of modules for each processing step, and a review tool, which is a graphical user interface for manual review, edit, and processing. To facilitate use by non‐NSMP earthquake engineers and scientists, PRISM (both its processing engine and review tool) is easy to install and run as a stand‐alone system on common operating systems such as Linux, OS X, and Windows. PRISM was designed to be flexible and extensible to accommodate implementation of new processing techniques. All the computing features have been thoroughly tested.
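
    PRISM itself is implemented in Java; as a language-neutral illustration of the "one module per processing step" engine design described above (the module names and steps below are invented, not PRISM's actual pipeline), a record flows through an ordered list of swappable step functions:

      def demean(samples):
          """Remove the mean offset from a record (illustrative baseline step)."""
          mu = sum(samples) / len(samples)
          return [s - mu for s in samples]

      def integrate(samples, dt):
          """Cumulative rectangle-rule integration (e.g., acceleration to velocity)."""
          out, total = [], 0.0
          for s in samples:
              total += s * dt
              out.append(total)
          return out

      # The engine is just an ordered list of named, swappable modules,
      # which is what makes new processing techniques easy to slot in.
      PIPELINE = [("demean", demean),
                  ("integrate-to-velocity", lambda x: integrate(x, 0.01))]

      def process_record(samples):
          for _name, step in PIPELINE:
              samples = step(samples)
          return samples

      print(process_record([0.1, 0.3, -0.2, 0.05]))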
