Sample records for processing systems components

  1. Thermal Storage Process and Components Laboratory | Energy Systems

    Science.gov Websites

    The Energy Systems Integration Facility's Thermal Systems Process and Components Laboratory supports research and development, testing, and evaluation of new thermal energy storage systems.

  2. 76 FR 55944 - In the Matter of Certain Electronic Devices With Image Processing Systems, Components Thereof...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-09

    ... With Image Processing Systems, Components Thereof, and Associated Software; Notice of Commission... importation of certain electronic devices with image processing systems, components thereof, and associated... direct infringement is asserted and the accused article does not meet every limitation of the asserted...

  3. 75 FR 38118 - In the Matter of Certain Electronic Devices With Image Processing Systems, Components Thereof...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-01

    ... With Image Processing Systems, Components Thereof, and Associated Software; Notice of Investigation..., and associated software by reason of infringement of certain claims of U.S. Patent Nos. 7,043,087... processing systems, components thereof, and associated software that infringe one or more of claims 1, 6, and...

  4. High-efficiency high-reliability optical components for a large, high-average-power visible laser system

    NASA Astrophysics Data System (ADS)

    Taylor, John R.; Stolz, Christopher J.

    1993-08-01

    Laser system performance and reliability depend on the related performance and reliability of the optical components which define the cavity and transport subsystems. High average power and long transport lengths impose specific requirements on component performance. The complexity of the manufacturing process for optical components requires a high degree of process control and verification. Qualification has proven effective in ensuring confidence in the procurement process for these optical components. Issues related to component reliability have been studied and provide useful information to better understand the long-term performance and reliability of the laser system.

  5. High-efficiency high-reliability optical components for a large, high-average-power visible laser system

    NASA Astrophysics Data System (ADS)

    Taylor, J. R.; Stolz, C. J.

    1992-12-01

    Laser system performance and reliability depend on the related performance and reliability of the optical components which define the cavity and transport subsystems. High average power and long transport lengths impose specific requirements on component performance. The complexity of the manufacturing process for optical components requires a high degree of process control and verification. Qualification has proven effective in ensuring confidence in the procurement process for these optical components. Issues related to component reliability have been studied and provide useful information to better understand the long-term performance and reliability of the laser system.

  6. A Multi-mission Event-Driven Component-Based System for Support of Flight Software Development, ATLO, and Operations first used by the Mars Science Laboratory (MSL) Project

    NASA Technical Reports Server (NTRS)

    Dehghani, Navid; Tankenson, Michael

    2006-01-01

    This paper details an architectural description of the Mission Data Processing and Control System (MPCS), an event-driven, multi-mission ground data processing system providing uplink, downlink, and data management capabilities, which will support the Mars Science Laboratory (MSL) project as its first target mission. MPCS is developed from a set of small reusable components, implemented in Java, each designed with a specific function and well-defined interfaces. An industry-standard messaging bus is used to transfer information among system components. Components generate standard messages, which are used to capture system information as well as to trigger the event-driven behavior of the system. Event-driven systems are highly desirable for processing high-rate telemetry (science and engineering) data and for supporting automation of many mission operations processes.

  7. A framework for the computer-aided planning and optimisation of manufacturing processes for components with functional graded properties

    NASA Astrophysics Data System (ADS)

    Biermann, D.; Gausemeier, J.; Heim, H.-P.; Hess, S.; Petersen, M.; Ries, A.; Wagner, T.

    2014-05-01

    In this contribution a framework for the computer-aided planning and optimisation of functionally graded components is presented. The framework is divided into three modules - the "Component Description", the "Expert System" for the synthetisation of several process chains and the "Modelling and Process Chain Optimisation". The Component Description module enhances a standard computer-aided design (CAD) model by a voxel-based representation of the graded properties. The Expert System synthesises process steps stored in the knowledge base to generate several alternative process chains. Each process chain is capable of producing components according to the enhanced CAD model and usually consists of a sequence of heating, cooling, and forming processes. The dependencies between the component and the applied manufacturing processes, as well as between the processes themselves, need to be considered. The Expert System utilises an ontology for that purpose. The ontology represents all dependencies in a structured way and connects the information of the knowledge base via relations. The third module performs the evaluation of the generated process chains. To accomplish this, the parameters of each process are optimised with respect to the component specification, whereby the result of the best parameterisation is used as representative value. Finally, the process chain which is capable of manufacturing a functionally graded component in an optimal way with regard to the property distributions of the component description is presented by means of a dedicated specification technique.
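The Expert System module described above chains process steps from a knowledge base and evaluates the resulting alternatives. A minimal sketch of that idea, with entirely invented step names, interface states, and costs (none taken from the paper):

```python
# Hypothetical sketch of process-chain synthesis: steps from a knowledge base
# are chained when their interface states match, and each complete chain is
# scored so the best alternative can be selected.

from itertools import permutations

# Each step maps an input state to an output state at some cost (illustrative).
STEPS = {
    "heat":   ("raw", "heated", 3.0),
    "form":   ("heated", "formed", 5.0),
    "cool":   ("formed", "graded", 2.0),
    "quench": ("formed", "graded", 1.5),  # alternative cooling step
}

def synthesize_chains(start="raw", goal="graded"):
    """Enumerate all step sequences that transform `start` into `goal`."""
    chains = []
    for n in range(1, len(STEPS) + 1):
        for seq in permutations(STEPS, n):
            state, cost, ok = start, 0.0, True
            for name in seq:
                src, dst, c = STEPS[name]
                if src != state:     # interface states must match up
                    ok = False
                    break
                state, cost = dst, cost + c
            if ok and state == goal:
                chains.append((list(seq), cost))
    return sorted(chains, key=lambda sc: sc[1])  # cheapest chain first

chains = synthesize_chains()
for seq, cost in chains:
    print(" -> ".join(seq), cost)
```

In the real framework each chain would then be parameter-optimised against the component description; here the fixed cost simply stands in for that evaluation.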

  8. Optical components damage parameters database system

    NASA Astrophysics Data System (ADS)

    Tao, Yizheng; Li, Xinglan; Jin, Yuquan; Xie, Dongmei; Tang, Dingyong

    2012-10-01

    Optical components are key elements of large-scale laser devices: their load capacity directly determines the device's output-capacity targets, and that load capacity depends on many factors. A damage-parameter database digitizes the various factors that affect the load capacity of optical components, providing a scientific data basis for assessing it. Using business-process analysis and a model-driven approach, a component damage-parameter information model and database system were established. Application results show that the system meets the business-process and data-management requirements of optical-component damage testing, that component parameters are flexible and configurable, and that the system is simple and easy to use, improving the efficiency of optical-component damage tests.

  9. Closed-loop system for growth of aquatic biomass and gasification thereof

    DOEpatents

    Oyler, James R.

    2017-09-19

    Processes, systems, and methods for producing combustible gas from wet biomass are provided. In one aspect, for example, a process for generating a combustible gas from a wet biomass in a closed system is provided. Such a process may include growing a wet biomass in a growth chamber, moving at least a portion of the wet biomass to a reactor, heating the portion of the wet biomass under high pressure in the reactor to gasify the wet biomass into a total gas component, separating the gasified component into a liquid component, a non-combustible gas component, and a combustible gas component, and introducing the liquid component and non-combustible gas component containing carbon dioxide into the growth chamber to stimulate new wet biomass growth.

  10. Graphical Language for Data Processing

    NASA Technical Reports Server (NTRS)

    Alphonso, Keith

    2011-01-01

    A graphical language for processing data allows processing elements to be connected with virtual wires that represent data flows between processing modules. The processing of complex data, such as lidar data, requires many different algorithms to be applied. The purpose of this innovation is to automate the processing of complex data, such as lidar data, without the need for complex scripting and programming languages. The system consists of a set of user-interface components that allow the user to drag and drop various algorithmic and processing components onto a process graph. By working graphically, the user can completely visualize the process flow and create complex diagrams. This innovation supports the nesting of graphs, such that a graph can be included in another graph as a single step for processing. In addition to the user interface components, the system includes a set of .NET classes that represent the graph internally. These classes provide the internal system representation of the graphical user interface. The system includes a graph execution component that reads the internal representation of the graph (as described above) and executes that graph. The execution of the graph follows the interpreted model of execution in that each node is traversed and executed from the original internal representation. In addition, there are components that allow external code elements, such as algorithms, to be easily integrated into the system, thus making the system infinitely expandable.
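The interpreted execution model described above can be sketched in a few lines: nodes connected by virtual wires, and an executor that runs each node once its inputs are available. The node class, the executor, and the toy pipeline are illustrative, not taken from the actual system:

```python
# Minimal sketch of interpreted graph execution: "virtual wires" are the
# `inputs` lists naming upstream nodes whose results feed this node.

class Node:
    def __init__(self, name, func, inputs=()):
        self.name, self.func, self.inputs = name, func, list(inputs)

def execute(nodes):
    """Interpret the graph: repeatedly run any node whose inputs are ready."""
    results, pending = {}, list(nodes)
    while pending:
        ready = [n for n in pending
                 if all(i in results for i in n.inputs)]
        if not ready:
            raise ValueError("cycle or missing input in graph")
        for n in ready:
            results[n.name] = n.func(*(results[i] for i in n.inputs))
            pending.remove(n)
    return results

# A tiny lidar-like pipeline: load points -> filter -> count returns.
graph = [
    Node("load",   lambda: [1.0, -2.0, 3.5, -0.5]),
    Node("filter", lambda pts: [p for p in pts if p > 0], inputs=["load"]),
    Node("count",  lambda pts: len(pts), inputs=["filter"]),
]
print(execute(graph)["count"])  # 2 positive returns survive the filter
```

Nesting a graph inside another, as the record describes, would amount to wrapping one `execute` call as the `func` of a single node in the outer graph.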

  11. Oxygen Compatibility Assessment of Components and Systems

    NASA Technical Reports Server (NTRS)

    Stoltzfus, Joel; Sparks, Kyle

    2010-01-01

    Fire hazards are inherent in oxygen systems and a storied history of fires in rocket engine propulsion components exists. To detect and mitigate these fire hazards requires careful, detailed, and thorough analyses applied during the design process. The oxygen compatibility assessment (OCA) process designed by NASA Johnson Space Center (JSC) White Sands Test Facility (WSTF) can be used to determine the presence of fire hazards in oxygen systems and the likelihood of a fire. This process may be used as both a design guide and during the approval process to ensure proper design features and material selection. The procedure for performing an OCA is a structured step-by-step process to determine the most severe operating conditions; assess the flammability of the system materials at the use conditions; evaluate the presence and efficacy of ignition mechanisms; assess the potential for a fire to breach the system; and determine the reaction effect (the potential loss of life, mission, and system functionality as the result of a fire). This process should be performed for each component in a system. The results of each component assessment, and the overall system assessment, should be recorded in a report that can be used in the short term to communicate hazards and their mitigation and to aid in system/component development and, in the long term, to solve anomalies that occur during engine testing and operation.
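The step-by-step OCA procedure can be captured as an ordered checklist applied per component; the sketch below only encodes the sequence of assessment steps from the abstract, with hypothetical example findings:

```python
# Structured walk-through of the OCA steps named in the abstract.
# The example component and findings are invented for illustration.

OCA_STEPS = [
    "determine most severe operating conditions",
    "assess flammability of materials at use conditions",
    "evaluate presence and efficacy of ignition mechanisms",
    "assess potential for fire to breach the system",
    "determine reaction effect (loss of life, mission, system function)",
]

def assess_component(name, findings):
    """Collect one report row per OCA step for a single component."""
    return [(name, step, findings.get(step, "not yet assessed"))
            for step in OCA_STEPS]

report = assess_component("oxygen valve", {
    OCA_STEPS[0]: "34 MPa, 320 K, high flow",
    OCA_STEPS[1]: "body nonflammable; seat flammable",
})
for row in report:
    print(row)
```

Applying `assess_component` to every component and concatenating the rows mirrors the per-component, then system-level, reporting the abstract calls for.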

  12. Process management using component thermal-hydraulic function classes

    DOEpatents

    Morman, James A.; Wei, Thomas Y. C.; Reifman, Jaques

    1999-01-01

    A process management expert system that, following the malfunction of a component such as a pump, determines system realignment procedures, such as bypassing the malfunctioning component on-line, to maintain operation of the process at full or partial capacity or to provide safe shutdown of the system while isolating the malfunctioning component. The expert system uses thermal-hydraulic function classes at the component level for analyzing unanticipated as well as anticipated component malfunctions to provide recommended sequences of operator actions. Each component is classified according to its thermal-hydraulic function and the generic and component-specific characteristics for that function. Using the diagnosis of the malfunctioning component and its thermal-hydraulic class, the expert system analysis is carried out using generic thermal-hydraulic first principles. One aspect of the invention employs a qualitative physics-based forward search, directed primarily downstream from the malfunctioning component, in combination with a subsequent backward search directed primarily upstream from the serviced component. Generic classes of components are defined in the knowledge base according to the three thermal-hydraulic functions of mass, momentum, and energy transfer and are used to determine possible realignments of component configurations in response to the thermal-hydraulic function imbalance caused by the malfunctioning component. Each realignment to a new configuration produces an accompanying sequence of recommended operator actions. All possible new configurations are examined and a prioritized list of acceptable solutions is produced.

  13. Process management using component thermal-hydraulic function classes

    DOEpatents

    Morman, J.A.; Wei, T.Y.C.; Reifman, J.

    1999-07-27

    A process management expert system that, following the malfunction of a component such as a pump, determines system realignment procedures, such as bypassing the malfunctioning component on-line, to maintain operation of the process at full or partial capacity or to provide safe shutdown of the system while isolating the malfunctioning component. The expert system uses thermal-hydraulic function classes at the component level for analyzing unanticipated as well as anticipated component malfunctions to provide recommended sequences of operator actions. Each component is classified according to its thermal-hydraulic function and the generic and component-specific characteristics for that function. Using the diagnosis of the malfunctioning component and its thermal-hydraulic class, the expert system analysis is carried out using generic thermal-hydraulic first principles. One aspect of the invention employs a qualitative physics-based forward search, directed primarily downstream from the malfunctioning component, in combination with a subsequent backward search directed primarily upstream from the serviced component. Generic classes of components are defined in the knowledge base according to the three thermal-hydraulic functions of mass, momentum, and energy transfer and are used to determine possible realignments of component configurations in response to the thermal-hydraulic function imbalance caused by the malfunctioning component. Each realignment to a new configuration produces an accompanying sequence of recommended operator actions. All possible new configurations are examined and a prioritized list of acceptable solutions is produced. 5 figs.
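The realignment idea in the patent abstract, searching for an alternate flow path that bypasses a malfunctioning component, can be sketched as a simple forward search over a flow network. The topology and component names below are invented for illustration, not from the patent:

```python
# Toy flow network with redundant momentum-transfer components (the pumps):
# component -> downstream components.
TOPOLOGY = {
    "tank":   ["pump_A", "pump_B"],
    "pump_A": ["heater"],
    "pump_B": ["heater"],
    "heater": ["outlet"],
    "outlet": [],
}

def realign(source, sink, failed):
    """Depth-first forward search for a flow path that avoids `failed`."""
    stack = [[source]]
    while stack:
        path = stack.pop()
        node = path[-1]
        if node == sink:
            return path            # a workable realignment exists
        for nxt in TOPOLOGY[node]:
            if nxt != failed and nxt not in path:
                stack.append(path + [nxt])
    return None                    # no configuration avoids the failure

print(realign("tank", "outlet", failed="pump_A"))
```

In the invention the candidate configurations would additionally be screened by thermal-hydraulic function class (mass, momentum, energy transfer) and prioritized; this sketch keeps only the path-search skeleton.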

  14. 76 FR 74753 - Authority To Manufacture and Distribute Postage Evidencing Systems

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-01

    ... revision of the rules governing the inventory control processes of Postage Evidencing Systems (PES... destruction or disposal of all Postage Evidencing Systems and their components to enable accurate accounting...) Postage Evidencing System repair process--any physical or electronic access to the internal components of...

  15. NASA's Earth Science Data Systems

    NASA Technical Reports Server (NTRS)

    Ramapriyan, H. K.

    2015-01-01

    NASA's Earth Science Data Systems (ESDS) Program has evolved over the last two decades, and currently has several core and community components. Core components provide the basic operational capabilities to process, archive, manage and distribute data from NASA missions. Community components provide a path for peer-reviewed research in Earth Science Informatics to feed into the evolution of the core components. The Earth Observing System Data and Information System (EOSDIS) is a core component consisting of twelve Distributed Active Archive Centers (DAACs) and eight Science Investigator-led Processing Systems spread across the U.S. The presentation covers how the ESDS Program continues to evolve and benefits from as well as contributes to advances in Earth Science Informatics.

  16. Combined expert system/neural networks method for process fault diagnosis

    DOEpatents

    Reifman, Jaques; Wei, Thomas Y. C.

    1995-01-01

    A two-level hierarchical approach for process fault diagnosis of an operating system employs a function-oriented approach at a first level and a component characteristic-oriented approach at a second level, where the decision-making procedure is structured in order of decreasing intelligence with increasing precision. At the first level, the diagnostic method is general and has knowledge of the overall process including a wide variety of plant transients and the functional behavior of the process components. An expert system classifies malfunctions by function to narrow the diagnostic focus to a particular set of possible faulty components that could be responsible for the detected functional misbehavior of the operating system. At the second level, the diagnostic method limits its scope to component malfunctions, using more detailed knowledge of component characteristics. Trained artificial neural networks are used to further narrow the diagnosis and to uniquely identify the faulty component by classifying the abnormal condition data as a failure of one of the hypothesized components through component characteristics. Once an anomaly is detected, the hierarchical structure is used to successively narrow the diagnostic focus from a function misbehavior, i.e., a function-oriented approach, until the fault can be determined, i.e., a component characteristic-oriented approach.

  17. Combined expert system/neural networks method for process fault diagnosis

    DOEpatents

    Reifman, J.; Wei, T.Y.C.

    1995-08-15

    A two-level hierarchical approach for process fault diagnosis of an operating system employs a function-oriented approach at a first level and a component characteristic-oriented approach at a second level, where the decision-making procedure is structured in order of decreasing intelligence with increasing precision. At the first level, the diagnostic method is general and has knowledge of the overall process including a wide variety of plant transients and the functional behavior of the process components. An expert system classifies malfunctions by function to narrow the diagnostic focus to a particular set of possible faulty components that could be responsible for the detected functional misbehavior of the operating system. At the second level, the diagnostic method limits its scope to component malfunctions, using more detailed knowledge of component characteristics. Trained artificial neural networks are used to further narrow the diagnosis and to uniquely identify the faulty component by classifying the abnormal condition data as a failure of one of the hypothesized components through component characteristics. Once an anomaly is detected, the hierarchical structure is used to successively narrow the diagnostic focus from a function misbehavior, i.e., a function-oriented approach, until the fault can be determined, i.e., a component characteristic-oriented approach. 9 figs.
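A toy version of the two-level scheme: a rule-based first level narrows the fault to a candidate set by function, and a nearest-centroid classifier (standing in here for the trained neural networks) identifies the faulty component from its characteristics. All rules, signatures, and readings are invented for the sketch:

```python
# Level 1: function-oriented rules map a functional symptom to the set of
# components that could be responsible for it.
RULES = {
    "low_flow": ["pump", "valve"],        # momentum-transfer components
    "low_heat": ["heater", "exchanger"],  # energy-transfer components
}

# Level 2: characteristic signatures per component, here a pair of
# normalized features (pressure_drop, vibration).
SIGNATURES = {
    "pump":      (0.8, 0.9),
    "valve":     (0.9, 0.1),
    "heater":    (0.2, 0.2),
    "exchanger": (0.3, 0.1),
}

def diagnose(symptom, reading):
    candidates = RULES[symptom]            # level 1: narrow the scope
    def dist(name):                        # level 2: classify the reading
        sig = SIGNATURES[name]
        return sum((a - b) ** 2 for a, b in zip(sig, reading))
    return min(candidates, key=dist)

print(diagnose("low_flow", (0.85, 0.8)))  # high vibration points to the pump
```

The two levels keep the classifier small: it only ever discriminates among the components the first-level rules hypothesized, exactly the narrowing the abstract describes.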

  18. Privacy-Related Context Information for Ubiquitous Health

    PubMed Central

    Nykänen, Pirkko; Ruotsalainen, Pekka

    2014-01-01

    Background: Ubiquitous health has been defined as a dynamic network of interconnected systems. A system is composed of one or more information systems, their stakeholders, and the environment. These systems offer health services to individuals and thus implement ubiquitous computing. Privacy is the key challenge for ubiquitous health because of autonomous processing, rich contextual metadata, lack of predefined trust among participants, and the business objectives. Additionally, regulations and policies of stakeholders may be unknown to the individual. Context-sensitive privacy policies are needed to regulate information processing. Objective: Our goal was to analyze privacy-related context information and to define the corresponding components and their properties that support privacy management in ubiquitous health. These properties should describe the privacy issues of information processing. With components and their properties, individuals can define context-aware privacy policies and set their privacy preferences that can change in different information-processing situations. Methods: Scenarios and user stories are used to analyze typical activities in ubiquitous health to identify main actors, goals, tasks, and stakeholders. Context arises from an activity and, therefore, we can determine different situations, services, and systems to identify properties for privacy-related context information in information-processing situations. Results: Privacy-related context information components are situation, environment, individual, information technology system, service, and stakeholder. Combining our analyses and previously identified characteristics of ubiquitous health, more detailed properties for the components are defined. Properties define explicitly what context information for different components is needed to create context-aware privacy policies that can control, limit, and constrain information processing. With properties, we can define, for example, how data can be processed or how components are regulated or in what kind of environment data can be processed. Conclusions: This study added to the vision of ubiquitous health by analyzing information processing from the viewpoint of an individual's privacy. We learned that health and wellness-related activities may happen in several environments and situations with multiple stakeholders, services, and systems. We have provided new knowledge regarding privacy-related context information and corresponding components by analyzing typical activities in ubiquitous health. With the identified components and their properties, individuals can define their personal preferences on information processing based on situational information, and privacy services can capture privacy-related context of the information-processing situation. PMID:25100084

  19. Privacy-related context information for ubiquitous health.

    PubMed

    Seppälä, Antto; Nykänen, Pirkko; Ruotsalainen, Pekka

    2014-03-11

    Ubiquitous health has been defined as a dynamic network of interconnected systems. A system is composed of one or more information systems, their stakeholders, and the environment. These systems offer health services to individuals and thus implement ubiquitous computing. Privacy is the key challenge for ubiquitous health because of autonomous processing, rich contextual metadata, lack of predefined trust among participants, and the business objectives. Additionally, regulations and policies of stakeholders may be unknown to the individual. Context-sensitive privacy policies are needed to regulate information processing. Our goal was to analyze privacy-related context information and to define the corresponding components and their properties that support privacy management in ubiquitous health. These properties should describe the privacy issues of information processing. With components and their properties, individuals can define context-aware privacy policies and set their privacy preferences that can change in different information-processing situations. Scenarios and user stories are used to analyze typical activities in ubiquitous health to identify main actors, goals, tasks, and stakeholders. Context arises from an activity and, therefore, we can determine different situations, services, and systems to identify properties for privacy-related context information in information-processing situations. Privacy-related context information components are situation, environment, individual, information technology system, service, and stakeholder. Combining our analyses and previously identified characteristics of ubiquitous health, more detailed properties for the components are defined. Properties define explicitly what context information for different components is needed to create context-aware privacy policies that can control, limit, and constrain information processing. 
With properties, we can define, for example, how data can be processed or how components are regulated or in what kind of environment data can be processed. This study added to the vision of ubiquitous health by analyzing information processing from the viewpoint of an individual's privacy. We learned that health and wellness-related activities may happen in several environments and situations with multiple stakeholders, services, and systems. We have provided new knowledge regarding privacy-related context information and corresponding components by analyzing typical activities in ubiquitous health. With the identified components and their properties, individuals can define their personal preferences on information processing based on situational information, and privacy services can capture privacy-related context of the information-processing situation.

  20. Systems Suitable for Information Professionals.

    ERIC Educational Resources Information Center

    Blair, John C., Jr.

    1983-01-01

    Describes computer operating systems applicable to microcomputers, noting hardware components, advantages and disadvantages of each system, local area networks, distributed processing, and a fully configured system. Lists of hardware components (disk drives, solid state disk emulators, input/output and memory components, and processors) and…

  1. Solar industrial process heat systems: An assessment of standards for materials and components

    NASA Astrophysics Data System (ADS)

    Rossiter, W. J.; Shipp, W. E.

    1981-09-01

    A study was conducted to obtain information on the performance of materials and components in operational solar industrial process heat (IPH) systems, and to provide recommendations for the development of standards, including evaluative test procedures, for materials and components. An assessment of the needs for standards for evaluating the long-term performance of materials and components of IPH systems was made. The assessment was based on the availability of existing standards, and on information obtained from a field survey of operational systems, the literature, and discussions with individuals in the industry. Field inspections of 10 operational IPH systems were performed.

  2. An architecture for object-oriented intelligent control of power systems in space

    NASA Technical Reports Server (NTRS)

    Holmquist, Sven G.; Jayaram, Prakash; Jansen, Ben H.

    1993-01-01

    A control system for autonomous distribution and control of electrical power during space missions is being developed. This system should free the astronauts from localizing faults and reconfiguring loads if problems with the power distribution and generation components occur. The control system uses an object-oriented simulation model of the power system and first-principle knowledge to detect, identify, and isolate faults. Each power system component is represented as a separate object with knowledge of its normal behavior. The reasoning process takes place at three different levels of abstraction: the Physical Component Model (PCM) level, the Electrical Equivalent Model (EEM) level, and the Functional System Model (FSM) level, with the PCM the lowest level of abstraction and the FSM the highest. At the EEM level the power system components are reasoned about as their electrical equivalents, e.g., a resistive load is thought of as a resistor. However, at the PCM level detailed knowledge about the component's specific characteristics is taken into account. The FSM level models the system at the subsystem level, a level appropriate for reconfiguration and scheduling. The control system operates in two modes, a reactive and a proactive mode, simultaneously. In the reactive mode the control system receives measurement data from the power system and compares these values with values determined through simulation to detect the existence of a fault. The nature of the fault is then identified through a model-based reasoning process using mainly the EEM. Compound component models are constructed at the EEM level and used in the fault identification process. In the proactive mode the reasoning takes place at the PCM level. Individual components determine their future health status using a physical model and measured historical data. In case changes in the health status seem imminent, the component warns the control system about its impending failure. The fault isolation process uses the FSM level for its reasoning base.
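The reactive mode described above, comparing measured values against simulated expectations for each component object, can be sketched as follows; all component names, values, and tolerances are invented for illustration:

```python
# Each component object knows its normal (simulated) behavior; the reactive
# mode flags a fault when a measurement deviates beyond a tolerance.

class Component:
    def __init__(self, name, expected_current, tol=0.05):
        self.name = name
        self.expected = expected_current  # from the EEM-level simulation
        self.tol = tol                    # relative tolerance

    def check(self, measured):
        """Reactive mode: compare measurement against the simulated value."""
        return abs(measured - self.expected) <= self.tol * self.expected

bus = [Component("heater_load", 4.0), Component("fan_load", 1.5)]
measurements = {"heater_load": 4.1, "fan_load": 0.2}  # fan drawing too little

faults = [c.name for c in bus if not c.check(measurements[c.name])]
print(faults)  # the fan's deviation exceeds tolerance
```

The proactive mode would hang off the same objects, with each component extrapolating its own measured history against a physical model of itself.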

  3. The research on surface characteristics of optical lens by 3D printing technique and precise diamond turning technique

    NASA Astrophysics Data System (ADS)

    Huang, Chien-Yao; Chang, Chun-Ming; Ho, Cheng-Fong; Lee, Tai-Wen; Lin, Ping-Hung; Hsu, Wei-Yao

    2017-06-01

    The advantage of the 3D printing technique is its flexibility in design and fabrication; with 3D printing, traditional manufacturing limitations need not constrain the design. The optical lens is a key component of an optical system. The traditional process for manufacturing plastic optical lenses is injection molding. However, injection molding is suitable only for plastic lenses and cannot fabricate optical and mechanical components at the same time. Fabricating optical and mechanical components simultaneously can effectively reduce the assembly error of an optical system. A process for printing optical and mechanical components simultaneously was proposed in previous papers, but the optical surfaces of the printed components were not transparent. If the transmittance of the optical surface is increased, components fabricated by the 3D printing process can achieve high transmission. Therefore, in this paper a precise diamond turning technique is used to turn the surfaces of 3D-printed optical lenses. Precise diamond turning can process component surfaces to meet the requirements of an optical system. A 3D printing machine, Stratasys Connex 500, and a precise diamond turning machine, Precitech Freeform705XG, were used in this work. The dimensions, roughness, transmission, and printing types of the 3D-printed components are discussed. After the turning and polishing process, the roughness of the 3D-printed component is below 0.05 μm and the transmittance increases to above 80%. This optical module can be used in hand-held telescopes and other systems that need lenses and special mechanical structures fabricated simultaneously.

  4. Computational modeling of residual stress formation during the electron beam melting process for Inconel 718

    DOE PAGES

    Prabhakar, P.; Sames, William J.; Dehoff, Ryan R.; ...

    2015-03-28

    A computational modeling approach to simulate residual stress formation during the electron beam melting (EBM) process within additive manufacturing (AM) technologies for Inconel 718 is presented in this paper. The EBM process has demonstrated a high potential to fabricate components with complex geometries, but the resulting components are influenced by the thermal cycles observed during the manufacturing process. When processing nickel-based superalloys, very high temperatures (approx. 1000 °C) are observed in the powder bed, base plate, and build. These high temperatures, when combined with substrate adherence, can result in warping of the base plate and affect the final component by causing defects. It is important to have an understanding of the thermo-mechanical response of the entire system, that is, its mechanical behavior under the thermal loading occurring during the EBM process, prior to manufacturing a component. Therefore, computational models that predict the response of the system during the EBM process will aid in eliminating undesired process conditions, a priori, in order to fabricate the optimum component. Such a comprehensive computational modeling approach is demonstrated to analyze warping of the base plate, stress and plastic strain accumulation within the material, and thermal cycles in the system during different stages of the EBM process.
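A back-of-the-envelope calculation shows why these thermal cycles produce residual stress: a fully constrained layer cooling by ΔT develops a thermal stress on the order of σ = E·α·ΔT. The property values below are typical handbook figures for Inconel 718, assumed here rather than taken from the paper:

```python
# Order-of-magnitude thermal stress in a fully constrained cooling layer.
# Property values are typical handbook figures (assumed, not from the paper).

E = 165e9      # Young's modulus at elevated temperature, Pa
alpha = 14e-6  # coefficient of thermal expansion, 1/K
dT = 900.0     # cooling from the ~1000 C build temperature, K

sigma = E * alpha * dT          # constrained thermal stress, sigma = E*alpha*dT
print(f"{sigma/1e6:.0f} MPa")
```

The result, on the order of 2 GPa, far exceeds the alloy's yield strength, so the material yields plastically on cooling and is left with residual stress near yield once the constraint relaxes, which is exactly the warping and plastic-strain accumulation the full thermo-mechanical model resolves in detail.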

  5. A comparative study of the proposed models for the components of the national health information system.

    PubMed

    Ahmadi, Maryam; Damanabi, Shahla; Sadoughi, Farahnaz

    2014-04-01

    National Health Information System plays an important role in ensuring timely and reliable access to health information, which is essential for strategic and operational decisions that improve health, quality and effectiveness of health care. In other words, using the national health information system one can improve the quality of health data, information and knowledge used to support decision making at all levels and areas of the health sector. Since full identification of the components of this system - for better planning and management of influential factors on its performance - seems necessary, in this study different attitudes towards the components of this system are explored comparatively. This is a descriptive and comparative study. The study material includes printed and electronic documents containing components of the national health information system in three parts: input, process and output. In this context, searches for information using library resources and the internet were conducted, and data analysis was expressed using comparative tables and qualitative data. The findings showed that there are three different perspectives presenting the components of a national health information system: the Lippeveld, Sauerborn and Bodart model of 2000, the Health Metrics Network (HMN) model from the World Health Organization in 2008, and Gattini's 2009 model. All three models require, in the input (resources and structure) section, components of management and leadership, planning and design of programs, supply of staff, and software and hardware facilities and equipment. In the "process" section, all three models point to actions ensuring the quality of the health information system, and in the output section, except for the Lippeveld model, the two other models consider information products and the use and distribution of information as components of the national health information system. 
The results showed that all three models discuss the components of health information in the input section only briefly, and the Lippeveld model overlooks the components of a national health information system in the process and output sections. Therefore, the Health Metrics Network model appears to give the most comprehensive presentation of the components of the health system in all three sections: input, process and output.

  6. Increasing component functionality via multi-process additive manufacturing

    NASA Astrophysics Data System (ADS)

    Coronel, Jose L.; Fehr, Katherine H.; Kelly, Dominic D.; Espalin, David; Wicker, Ryan B.

    2017-05-01

    Additively manufactured components, although extensively customizable, are often limited in functionality. Multi-process additive manufacturing (AM) increases the functionality of components via subtractive manufacturing, wire embedding, foil embedding, and pick-and-place. These processes are scalable across several platforms, ranging from desktop to large-area printers. The Multi3D System is highlighted, which can perform all of the above processes while transferring a fabricated component with a robotic arm. Work was conducted to fabricate a patent-inspired, printed missile seeker; the seeker demonstrated the advantage of multi-process AM via the pick-and-place process. Wire embedding was also explored, with the successful interconnection of two layers of embedded wires in different planes. A final demonstration, a printed contour bracket, showed an 87.5% reduction in surface roughness on a printed part when subtractive manufacturing is implemented in tandem with AM. Functionality of the components was improved in all cases. Results included optical components embedded within the printed housing, embedded wires with interconnections, and reduced surface roughness. These results highlight the improved functionality of components achievable through multi-process AM, specifically through work conducted with the Multi3D System.

  7. Design and analysis of automobile components using industrial procedures

    NASA Astrophysics Data System (ADS)

    Kedar, B.; Ashok, B.; Rastogi, Nisha; Shetty, Siddhanth

    2017-11-01

    Today’s automobiles depend upon mechanical systems that are crucial to the movement and safety of the vehicle. Safety systems such as the Antilock Braking System (ABS) and passenger restraint systems have been developed to ensure the safety of the passenger in the event of a collision, head-on or otherwise. Manufacturers also want their customers to have a good driving experience and thus aim to improve the handling and drivability of the vehicle; electronic systems such as cruise control and active suspension are designed to ensure passenger comfort. Finally, to ensure optimal and safe driving, the various components of a vehicle must be manufactured using state-of-the-art processes and must be tested and inspected with utmost care, so that any defective component is caught at the very beginning of the supply chain. Processes that can improve the lifetime of their respective components are therefore in high demand, and much research and development is devoted to them. With a solid research base, these processes can be applied far more versatilely to different components, made of different materials and under different input conditions. This will increase the profitability of each process and upgrade its value to industry.

  8. Markov Modeling of Component Fault Growth over a Derived Domain of Feasible Output Control Effort Modifications

    NASA Technical Reports Server (NTRS)

    Bole, Brian; Goebel, Kai; Vachtsevanos, George

    2012-01-01

    This paper introduces a novel Markov process formulation of stochastic fault growth modeling, in order to facilitate the development and analysis of prognostics-based control adaptation. A metric representing the relative deviation between the nominal output of a system and the net output actually enacted by an implemented prognostics-based control routine will be used to define the action space of the formulated Markov process. The state space of the Markov process will be defined in terms of an abstracted metric representing the relative health remaining in each of the system's components. The proposed formulation of component fault dynamics will conveniently relate feasible system output performance modifications to predictions of future component health deterioration.
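    The core idea - a Markov chain over discretized component health whose transitions depend on a control action - can be sketched in a few lines. The state labels, derating action, and transition probabilities below are invented for illustration and are not the paper's model.

    ```python
    import numpy as np

    # States 3..0 = healthy..failed; the "action" is an output derating
    # factor in [0, 1]. Higher derating slows fault growth (assumed).
    def transition_matrix(derate):
        p = 0.3 * (1.0 - 0.8 * derate)   # per-step degradation probability
        P = np.zeros((4, 4))
        for s in range(1, 4):
            P[s, s - 1] = p              # degrade one health level
            P[s, s] = 1.0 - p            # or stay put
        P[0, 0] = 1.0                    # failed state is absorbing
        return P

    def expected_steps_to_failure(derate, start=3):
        # fundamental matrix N = (I - Q)^-1; its row sums give the expected
        # number of steps spent in transient states before absorption
        P = transition_matrix(derate)
        Q = P[1:, 1:]                    # transient (non-failed) block
        N = np.linalg.inv(np.eye(3) - Q)
        return N[start - 1].sum()

    # derating the output extends the expected remaining useful life
    assert expected_steps_to_failure(0.9) > expected_steps_to_failure(0.0)
    ```

    This is exactly the kind of health-state dynamics a prognostics-based controller can query when trading output performance against remaining component life.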

  9. Control of microstructure in soldered, brazed, welded, plated, cast or vapor deposited manufactured components

    DOEpatents

    Ripley, Edward B.; Hallman, Russell L.

    2015-11-10

    Disclosed are methods and systems for controlling the microstructures of a soldered, brazed, welded, plated, cast, or vapor deposited manufactured component. The systems typically use relatively weak magnetic fields of either constant or varying flux to affect material properties within a manufactured component, typically without modifying the alloy, changing the chemical composition of materials, or altering the time, temperature, or transformation parameters of a manufacturing process. Such systems and processes may be used with components consisting only of materials that are conventionally characterized as being uninfluenced by magnetic forces.

  10. Design and Production of the Injection Mould with a Cax Assistance

    NASA Astrophysics Data System (ADS)

    Likavčan, Lukáš; Frnčík, Martin; Zaujec, Rudolf; Satin, Lukáš; Martinkovič, Maroš

    2016-09-01

    This paper is focused on the process of designing the desired plastic component and injection mould using 3D CAD systems. A subsequent FEM analysis of the injection moulding process was carried out in a CAE system in order to determine the shrinkage and deformation of the plastic material. The dimensions of the mould were then modified to compensate for the shrinkage effect. Machining of the mould (milling and laser texturing) was performed using CAM systems. Finally, after production of the plastic components by injection moulding, inspection of the plastic component dimensions was carried out with a CAQ system in order to assess the accuracy of the whole CAx chain. It was also demonstrated that CAx systems are an integral part of the pre-production and production process.

  11. An integration architecture for the automation of a continuous production complex.

    PubMed

    Chacón, Edgar; Besembel, Isabel; Narciso, Flor; Montilva, Jonás; Colina, Eliezer

    2002-01-01

    The development of integrated automation systems for continuous production plants is a very complicated process. A variety of factors must be taken into account, such as their different components (e.g., production units control systems, planning systems, financial systems, etc.), the interaction among them, and their different behavior (continuous or discrete). Moreover, the difficulty of this process is increased by the fact that each component can be viewed in a different way depending on the kind of decisions to be made, and its specific behavior. Modeling continuous production complexes as a composition of components, where, in turn, each component may also be a composite, appears to be the simplest and safest way to develop integrated automation systems. In order to provide the most versatile way to develop this kind of system, this work proposes a new approach for designing and building them, where process behavior, operation conditions and equipment conditions are integrated into a hierarchical automation architecture.
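    The "components which may themselves be composites" idea maps directly onto the classic composite pattern. The sketch below is an illustration of that structural idea only; the class and method names are invented, not taken from the paper's architecture.

    ```python
    # A leaf production unit and a composite unit share the same status()
    # interface, so a hierarchy of arbitrary depth can be queried uniformly.
    class Unit:
        def __init__(self, name, healthy=True):
            self.name, self.healthy = name, healthy

        def status(self):
            return self.healthy

    class CompositeUnit:
        def __init__(self, name, children):
            self.name, self.children = name, children

        def status(self):
            # a composite is operational only if every sub-component is
            return all(c.status() for c in self.children)

    plant = CompositeUnit("plant", [
        CompositeUnit("distillation", [Unit("reboiler"), Unit("condenser")]),
        Unit("compressor", healthy=False),
    ])
    assert plant.status() is False   # one faulty unit propagates upward
    ```

    Because leaves and composites are interchangeable, a supervisory layer can make decisions at whatever level of the hierarchy suits the kind of decision being made, as the abstract argues.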

  12. Error-proofing test system of industrial components based on image processing

    NASA Astrophysics Data System (ADS)

    Huang, Ying; Huang, Tao

    2018-05-01

    With the improvement of modern industrial standards and accuracy, conventional manual testing fails to satisfy enterprise test standards, so digital image processing techniques should be used to gather and analyze information from the surfaces of industrial components in order to test them. To test the installation of automotive engine parts, this paper employs a camera to capture images of the components. After the images are preprocessed, including denoising, an image processing algorithm based on flood fill is used to test the installation of the components. The results show that this system achieves very high test accuracy.
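    The flood-fill step the abstract mentions can be sketched as a breadth-first region extraction on a binary image. The toy grid and the idea of checking a part's presence by region size are illustrative assumptions, not the paper's actual test criterion.

    ```python
    from collections import deque

    # BFS flood fill: collect all 4-connected pixels matching the target
    # value, starting from a seed pixel.
    def flood_fill(img, start, target=1):
        h, w = len(img), len(img[0])
        seen, queue = set(), deque([start])
        while queue:
            r, c = queue.popleft()
            if (r, c) in seen or not (0 <= r < h and 0 <= c < w):
                continue
            if img[r][c] != target:
                continue
            seen.add((r, c))
            queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
        return seen

    img = [[0, 1, 1, 0],
           [0, 1, 1, 0],
           [0, 0, 0, 1]]
    region = flood_fill(img, (0, 1))
    assert len(region) == 4          # the 2x2 blob, not the isolated pixel
    ```

    An inspection system can then compare the extracted region's size or shape against a reference to decide whether the component is correctly installed.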

  13. The Numerical Propulsion System Simulation: A Multidisciplinary Design System for Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    1999-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of the design process and to provide the designer with critical information about the components early in the design process. This paper describes the development of the Numerical Propulsion System Simulation (NPSS), a multidisciplinary system of analysis tools that is focused on extending the simulation capability from components to the full system. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  14. Interaction of dissolution, sorption and biodegradation on transport of BTEX in a saturated groundwater system: Numerical modeling and spatial moment analysis

    NASA Astrophysics Data System (ADS)

    Valsala, Renu; Govindarajan, Suresh Kumar

    2018-06-01

    Interaction of various physical, chemical and biological transport processes plays an important role in deciding the fate and migration of contaminants in groundwater systems. In this study, a numerical investigation on the interaction of various transport processes of BTEX in a saturated groundwater system is carried out. In addition, the multi-component dissolution from a residual BTEX source under unsteady flow conditions is incorporated in the modeling framework. The model considers Benzene, Toluene, Ethyl Benzene and Xylene dissolving from the residual BTEX source zone to undergo sorption and aerobic biodegradation within the groundwater aquifer. Spatial concentration profiles of dissolved BTEX components under the interaction of various sorption and biodegradation conditions have been studied. Subsequently, a spatial moment analysis is carried out to analyze the effect of interaction of various transport processes on the total dissolved mass and the mobility of dissolved BTEX components. Results from the present numerical study suggest that the interaction of dissolution, sorption and biodegradation significantly influence the spatial distribution of dissolved BTEX components within the saturated groundwater system. Mobility of dissolved BTEX components is also found to be affected by the interaction of these transport processes.
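    The spatial-moment bookkeeping used to quantify dissolved mass and plume mobility can be illustrated on a simple 1D profile. The Gaussian plume below is a stand-in for a simulated BTEX concentration field, not output from the paper's model.

    ```python
    import numpy as np

    # Concentration profile (assumed): Gaussian plume centred at x = 30 m
    # with sigma = 5 m, sampled on a uniform grid.
    x = np.linspace(0.0, 100.0, 1001)            # m
    c = np.exp(-(x - 30.0) ** 2 / (2 * 5.0 ** 2))
    dx = x[1] - x[0]

    m0 = c.sum() * dx                            # zeroth moment: total mass
    m1 = (x * c).sum() * dx / m0                 # first moment: centroid
    m2 = ((x - m1) ** 2 * c).sum() * dx / m0     # second central moment

    assert abs(m1 - 30.0) < 0.1                  # centroid at the source
    assert abs(m2 - 25.0) < 0.5                  # variance ~ sigma^2
    ```

    Comparing how m0 (total dissolved mass) and m1 (plume centroid) evolve under different sorption and biodegradation parameters is precisely how a moment analysis exposes the interaction of the transport processes.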

  15. Model reduction by weighted Component Cost Analysis

    NASA Technical Reports Server (NTRS)

    Kim, Jae H.; Skelton, Robert E.

    1990-01-01

    Component Cost Analysis considers any given system driven by a white noise process as an interconnection of different components, and assigns a metric called 'component cost' to each component. These component costs measure the contribution of each component to a predefined quadratic cost function. A reduced-order model of the given system may be obtained by deleting those components that have the smallest component costs. The theory of Component Cost Analysis is extended to include finite-bandwidth colored noises. The results also apply when actuators have dynamics of their own. Closed-form analytical expressions for component costs are also derived for a mechanical system described by its modal data. This is very useful for computing the modal costs of very high-order systems. A numerical example for the MINIMAST system is presented.
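    A minimal numerical sketch of the component-cost idea, under simplifying assumptions: for x' = Ax + Dw with white noise w, the steady-state covariance X solves the Lyapunov equation A X + X A^T + D D^T = 0, and the quadratic cost V = trace(QX) is attributed to components as V_i = (QX)_ii. A is taken diagonal here so the Lyapunov equation has the closed-form solution below; a general A would need a Lyapunov solver. All numbers are illustrative.

    ```python
    import numpy as np

    A = np.diag([-1.0, -5.0, -20.0])   # three decoupled "components"
    D = np.eye(3)                      # unit-intensity white noise input
    Q = np.eye(3)                      # quadratic cost weight

    G = D @ D.T
    a = np.diag(A)
    # Closed-form Lyapunov solution for diagonal A: X_ij = -G_ij / (a_i + a_j)
    X = -G / (a[:, None] + a[None, :])
    component_costs = np.diag(Q @ X)   # per-component share of trace(Q X)

    # the slowest (least damped) component carries the largest cost, so
    # model reduction would delete the fast, low-cost components first
    assert component_costs[0] > component_costs[1] > component_costs[2]
    ```

    Ranking states (or modes) by these costs and truncating the cheapest ones is the reduction rule the abstract describes.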

  16. Context sensitivity and ambiguity in component-based systems design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bespalko, S.J.; Sindt, A.

    1997-10-01

    Designers of component-based, real-time systems need to guarantee the correctness of software and its output. The complexity of a system, and thus its propensity for error, is best characterized by the number of states a component can encounter. In many cases, large numbers of states arise where the processing is highly dependent on context; in these cases, states are often missed, leading to errors. The following are proposals for compactly specifying system states that allow complex components to be factored into a control module and a semantic processing module. Further, the need for methods that allow the explicit representation of ambiguity and uncertainty in the design of components is discussed. Presented herein are examples of real-world problems that are highly context-sensitive or inherently ambiguous.

  17. Trajectory of the arctic as an integrated system

    USGS Publications Warehouse

    Hinzman, Larry; Deal, Clara; McGuire, Anthony David; Mernild, Sebastian H.; Polyakov, Igor V.; Walsh, John E.

    2013-01-01

    Although much remains to be learned about the Arctic and its component processes, many of the most urgent scientific, engineering, and social questions can only be approached through a broader system perspective. Here, we address interactions between components of the Arctic System and assess feedbacks and the extent to which feedbacks (1) are now underway in the Arctic; and (2) will shape the future trajectory of the Arctic system. We examine interdependent connections among atmospheric processes, oceanic processes, sea-ice dynamics, marine and terrestrial ecosystems, land surface stocks of carbon and water, glaciers and ice caps, and the Greenland ice sheet. Our emphasis on the interactions between components, both historical and anticipated, is targeted on the feedbacks, pathways, and processes that link these different components of the Arctic system. We present evidence that the physical components of the Arctic climate system are currently in extreme states, and that there is no indication that the system will deviate from this anomalous trajectory in the foreseeable future. The feedback for which the evidence of ongoing changes is most compelling is the surface albedo-temperature feedback, which is amplifying temperature changes over land (primarily in spring) and ocean (primarily in autumn-winter). Other feedbacks likely to emerge are those in which key processes include surface fluxes of trace gases, changes in the distribution of vegetation, changes in surface soil moisture, changes in atmospheric water vapor arising from higher temperatures and greater areas of open ocean, impacts of Arctic freshwater fluxes on the meridional overturning circulation of the ocean, and changes in Arctic clouds resulting from changes in water vapor content.

  18. Trajectory of the Arctic as an integrated system.

    PubMed

    Hinzman, Larry D; Deal, Clara J; McGuire, A David; Mernild, Sebastian H; Polyakov, Igor V; Walsh, John E

    2013-12-01

    Although much remains to be learned about the Arctic and its component processes, many of the most urgent scientific, engineering, and social questions can only be approached through a broader system perspective. Here, we address interactions between components of the Arctic system and assess feedbacks and the extent to which feedbacks (1) are now underway in the Arctic and (2) will shape the future trajectory of the Arctic system. We examine interdependent connections among atmospheric processes, oceanic processes, sea-ice dynamics, marine and terrestrial ecosystems, land surface stocks of carbon and water, glaciers and ice caps, and the Greenland ice sheet. Our emphasis on the interactions between components, both historical and anticipated, is targeted on the feedbacks, pathways, and processes that link these different components of the Arctic system. We present evidence that the physical components of the Arctic climate system are currently in extreme states, and that there is no indication that the system will deviate from this anomalous trajectory in the foreseeable future. The feedback for which the evidence of ongoing changes is most compelling is the surface albedo-temperature feedback, which is amplifying temperature changes over land (primarily in spring) and ocean (primarily in autumn-winter). Other feedbacks likely to emerge are those in which key processes include surface fluxes of trace gases, changes in the distribution of vegetation, changes in surface soil moisture, changes in atmospheric water vapor arising from higher temperatures and greater areas of open ocean, impacts of Arctic freshwater fluxes on the meridional overturning circulation of the ocean, and changes in Arctic clouds resulting from changes in water vapor content.

  19. Internal motion in high vacuum systems

    NASA Astrophysics Data System (ADS)

    Frank, J. M.

    Three transfer and positioning mechanisms have been developed for the non-air exposed, multistep processing of components in vacuum chambers. The functions to be performed in all of the systems include ultraviolet/ozone cleaning, vacuum baking, deposition of thin films, and thermocompression sealing of the enclosures. Precise positioning of the components is required during the evaporation and sealing processes. The three methods of transporting and positioning the components were developed to accommodate the design criteria and goals of each individual system. The design philosophy, goals, and operation of the three mechanisms are discussed.

  20. Coarse-grained component concurrency in Earth system modeling: parallelizing atmospheric radiative transfer in the GFDL AM3 model using the Flexible Modeling System coupling framework

    NASA Astrophysics Data System (ADS)

    Balaji, V.; Benson, Rusty; Wyman, Bruce; Held, Isaac

    2016-10-01

    Climate models represent a large variety of processes on a variety of timescales and space scales, a canonical example of multi-physics multi-scale modeling. Current hardware trends, such as Graphical Processing Units (GPUs) and Many Integrated Core (MIC) chips, are based on, at best, marginal increases in clock speed, coupled with vast increases in concurrency, particularly at the fine grain. Multi-physics codes face particular challenges in achieving fine-grained concurrency, as different physics and dynamics components have different computational profiles, and universal solutions are hard to come by. We propose here one approach for multi-physics codes. These codes are typically structured as components interacting via software frameworks. The component structure of a typical Earth system model consists of a hierarchical and recursive tree of components, each representing a different climate process or dynamical system. This recursive structure generally encompasses a modest level of concurrency at the highest level (e.g., atmosphere and ocean on different processor sets) with serial organization underneath. We propose to extend concurrency much further by running more and more lower- and higher-level components in parallel with each other. Each component can further be parallelized on the fine grain, potentially offering a major increase in the scalability of Earth system models. We present here first results from this approach, called coarse-grained component concurrency, or CCC. Within the Geophysical Fluid Dynamics Laboratory (GFDL) Flexible Modeling System (FMS), the atmospheric radiative transfer component has been configured to run in parallel with a composite component consisting of every other atmospheric component, including the atmospheric dynamics and all other atmospheric physics components. We will explore the algorithmic challenges involved in such an approach, and present results from such simulations. 
Plans to achieve even greater levels of coarse-grained concurrency by extending this approach within other components, such as the ocean, will be discussed.
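    The essence of coarse-grained component concurrency - the radiation component running in parallel with the rest of the atmosphere, consuming the previous step's state with a one-step coupling lag - can be sketched with two threads and queues. The physics here (a constant dynamics increment and a linear cooling tendency) is purely a placeholder for the real FMS components.

    ```python
    import threading
    import queue

    state_q, tend_q = queue.Queue(), queue.Queue()

    def radiation():
        # the "radiation" component: consumes the previous step's state,
        # returns a tendency (illustrative cooling law, not real physics)
        for _ in range(5):
            t = state_q.get()
            tend_q.put(-0.01 * t)

    temps = [300.0]
    threading.Thread(target=radiation, daemon=True).start()
    state_q.put(temps[-1])                 # prime with the initial state
    for _ in range(5):
        tend = tend_q.get()                # radiation result (one step old)
        temps.append(temps[-1] + 1.0 + tend)   # "dynamics" plus radiation
        state_q.put(temps[-1])

    assert len(temps) == 6 and abs(temps[1] - 298.0) < 1e-9
    ```

    The one-timestep lag is the price of concurrency: radiation always acts on slightly stale state, which is acceptable when the radiative timescale is long compared to the coupling step.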

  1. High performance VLSI telemetry data systems

    NASA Technical Reports Server (NTRS)

    Chesney, J.; Speciale, N.; Horner, W.; Sabia, S.

    1990-01-01

    NASA's deployment of major space complexes such as Space Station Freedom (SSF) and the Earth Observing System (EOS) will demand increased functionality and performance from ground-based telemetry acquisition systems, well above current system capabilities. Adoption of space telemetry data transport and processing standards, such as those specified by the Consultative Committee for Space Data Systems (CCSDS) and those required for commercial ground distribution of telemetry data, will drive these functional and performance requirements. In addition, budget limitations will force the requirement for higher modularity, flexibility, and interchangeability at lower cost in new ground telemetry data system elements. At NASA's Goddard Space Flight Center (GSFC), the design and development of generic ground telemetry data system elements over the last five years has resulted in significant solutions to these problems. This solution, referred to as the functional components approach, includes both hardware and software components ready for end-user application. The hardware functional components consist of modern data flow architectures utilizing Application Specific Integrated Circuits (ASICs) developed specifically to support NASA's telemetry data systems needs and designed to meet a range of data rate requirements up to 300 Mbps. Real-time operating system software components support both embedded local software intelligence and overall system control, status, processing, and interface requirements. These components, hardware and software, form the superstructure upon which project-specific elements are added to complete a telemetry ground data system installation. This paper describes the functional components approach, some specific component examples, and a project example of the evolution from VLSI component, to board-level functional component, to integrated telemetry data system.

  2. IDAPS (Image Data Automated Processing System) System Description

    DTIC Science & Technology

    1988-06-24

    This document describes the physical configuration and components used in the image processing system referred to as IDAPS (Image Data Automated Processing System). This system was developed by the Environmental Research Institute of Michigan (ERIM) for Eglin Air Force Base. The system is designed

  3. Combined Acquisition/Processing For Data Reduction

    NASA Astrophysics Data System (ADS)

    Kruger, Robert A.

    1982-01-01

    Digital image processing systems necessarily consist of three components: acquisition, storage/retrieval, and processing. The acquisition component requires the greatest data handling rates. By coupling the acquisition with some online hardwired processing, data rates and capacities for short-term storage can be reduced. Furthermore, long-term storage requirements can be reduced further by appropriate processing and editing of the image data held in short-term memory. The net result could be reduced performance requirements for mass storage, processing, and communication systems. Reduced amounts of data should also speed later data analysis and diagnostic decision making.

  4. Implementation of a VLSI Level Zero Processing system utilizing the functional component approach

    NASA Technical Reports Server (NTRS)

    Shi, Jianfei; Horner, Ward P.; Grebowsky, Gerald J.; Chesney, James R.

    1991-01-01

    A high rate Level Zero Processing system is currently being prototyped at NASA/Goddard Space Flight Center (GSFC). Based on state-of-the-art VLSI technology and the functional component approach, the new system promises capabilities of handling multiple Virtual Channels and Applications with a combined data rate of up to 20 Megabits per second (Mbps) at low cost.

  5. Holostrain system: a powerful tool for experimental mechanics

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.; Bhat, Gopalakrishna K.

    1992-09-01

    A portable holographic interferometer that can be used to measure displacements and strains in all kinds of mechanical components and structures is described. The holostrain system captures images on a TV camera that detects interference patterns produced by laser illumination. The video signals are digitized, and the digitized interferograms are processed by a fast processing system. The outputs of the system are the strains or stresses of the observed mechanical component or structure.

  6. Process Management inside ATLAS DAQ

    NASA Astrophysics Data System (ADS)

    Alexandrov, I.; Amorim, A.; Badescu, E.; Burckhart-Chromek, D.; Caprini, M.; Dobson, M.; Duval, P. Y.; Hart, R.; Jones, R.; Kazarov, A.; Kolos, S.; Kotov, V.; Liko, D.; Lucio, L.; Mapelli, L.; Mineev, M.; Moneta, L.; Nassiakou, M.; Pedro, L.; Ribeiro, A.; Roumiantsev, V.; Ryabov, Y.; Schweiger, D.; Soloviev, I.; Wolters, H.

    2002-10-01

    The Process Management component of the online software of the future ATLAS experiment data acquisition system is presented. The purpose of the Process Manager is to perform basic job control of the software components of the data acquisition system. It is capable of starting, stopping, and monitoring the status of those components on the data acquisition processors, independently of the underlying operating system. Its architecture is designed on the basis of a client-server model using CORBA-based communication. The server part relies on C++ software agent objects acting as an interface between the local operating system and client applications. Among the major design challenges of the software agents were achieving the maximum possible degree of autonomy and creating processes that are aware of dynamic conditions in their environment and able to determine corresponding actions. Issues such as the performance of the agents in terms of time needed for process creation and destruction, the scalability of the system with respect to the final ATLAS configuration, and minimizing the use of hardware resources were also of critical importance. Besides the details given on the architecture and the implementation, we also present scalability and performance test results for the Process Manager system.
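    The basic job control the abstract describes (start, status, stop) can be sketched with OS processes directly. This is only an illustration of the functionality; the real ATLAS component wraps such control in CORBA-based C++ software agents, which are omitted here, and the class below is invented.

    ```python
    import subprocess
    import sys

    class ProcessManager:
        """Minimal start/status/stop job control over named processes."""

        def __init__(self):
            self.procs = {}

        def start(self, name, argv):
            self.procs[name] = subprocess.Popen(argv)

        def status(self, name):
            # Popen.poll() returns None while the process is still running
            return "running" if self.procs[name].poll() is None else "stopped"

        def stop(self, name):
            p = self.procs[name]
            p.terminate()
            p.wait()

    pm = ProcessManager()
    pm.start("sleeper", [sys.executable, "-c", "import time; time.sleep(30)"])
    assert pm.status("sleeper") == "running"
    pm.stop("sleeper")
    assert pm.status("sleeper") == "stopped"
    ```

    A production agent layers monitoring, autonomy, and remote invocation on top of exactly these primitives.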

  7. Signal processing method and system for noise removal and signal extraction

    DOEpatents

    Fu, Chi Yung; Petrich, Loren

    2009-04-14

    A signal processing method and system combining smooth-level wavelet pre-processing with artificial neural networks, all in the wavelet domain, for signal denoising and extraction. Upon receiving a signal corrupted with noise, an n-level decomposition of the signal is performed using a discrete wavelet transform to produce a smooth component and a rough component for each decomposition level. The nth-level smooth component is then input into a corresponding neural network pre-trained to filter out noise in that component by pattern recognition in the wavelet domain. Additional rough components, beginning at the highest level, may also be retained and input into corresponding neural networks pre-trained to filter out noise in those components, also by pattern recognition in the wavelet domain. In any case, an inverse discrete wavelet transform is performed on the combined output from all the neural networks to recover a clean signal back in the time domain.
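    A one-level Haar version of this pipeline, as a sketch: split the signal into smooth and rough components, suppress noise in the rough component, and invert back to the time domain. A simple hard threshold stands in for the patent's pre-trained neural networks, and the test signal and noise level are assumptions.

    ```python
    import numpy as np

    def haar_forward(x):
        s = (x[0::2] + x[1::2]) / np.sqrt(2)   # smooth (approximation)
        r = (x[0::2] - x[1::2]) / np.sqrt(2)   # rough (detail)
        return s, r

    def haar_inverse(s, r):
        x = np.empty(2 * len(s))
        x[0::2] = (s + r) / np.sqrt(2)
        x[1::2] = (s - r) / np.sqrt(2)
        return x

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 256)
    clean = np.sin(2 * np.pi * 4 * t)
    noisy = clean + 0.3 * rng.standard_normal(256)

    s, r = haar_forward(noisy)
    r[np.abs(r) < 0.3] = 0.0        # denoise the rough component
    denoised = haar_inverse(s, r)

    # the denoised signal is closer to the clean one than the input was
    assert np.mean((denoised - clean) ** 2) < np.mean((noisy - clean) ** 2)
    ```

    Replacing the threshold with a network trained to recognize signal patterns among the wavelet coefficients, and repeating the split over n levels, gives the structure the patent claims.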

  8. Infinite Systems of Interacting Chains with Memory of Variable Length—A Stochastic Model for Biological Neural Nets

    NASA Astrophysics Data System (ADS)

    Galves, A.; Löcherbach, E.

    2013-06-01

    We consider a new class of non-Markovian processes with a countable number of interacting components. At each time unit, each component can take two values, indicating whether or not it has a spike at that precise moment. The system evolves as follows: for each component, the probability of having a spike at the next time unit depends on the entire time evolution of the system after the last spike time of the component. This class of systems extends in a non-trivial way both the interacting particle systems, which are Markovian (Spitzer in Adv. Math. 5:246-290, 1970), and the stochastic chains with memory of variable length, which have finite state space (Rissanen in IEEE Trans. Inf. Theory 29(5):656-664, 1983). These features make it suitable for describing the time evolution of biological neural systems. We construct a stationary version of the process using a probabilistic tool, a Kalikow-type decomposition either in random environment or in space-time; this construction implies uniqueness of the stationary process. Finally, we consider the case where the interactions between components are given by a critical directed Erdös-Rényi-type random graph with a large but finite number of components. In this framework we obtain an explicit upper bound for the correlation between successive inter-spike intervals, which is compatible with previous empirical findings.

  9. Resolving components of wind accreting systems: a case study of Mira AB

    NASA Astrophysics Data System (ADS)

    Karovska, M.

    2004-12-01

    Mass transfer in many systems occurs by wind interaction rather than by tidal interaction, because the primary does not fill its Roche surface. The nearby detached binary Mira AB provides a unique laboratory for studying wind accretion processes because this system can be resolved and the interacting components can be studied individually, which is not possible in most accreting systems. The study of Mira AB wind accretion and mass transfer may therefore help understand the accretion processes in many other astronomical systems.

  10. A Comparative Study of the Proposed Models for the Components of the National Health Information System

    PubMed Central

    Ahmadi, Maryam; Damanabi, Shahla; Sadoughi, Farahnaz

    2014-01-01

    Introduction: The National Health Information System plays an important role in ensuring timely and reliable access to health information, which is essential for strategic and operational decisions that improve health and the quality and effectiveness of health care. In other words, through the National Health Information System the quality of health data, information and knowledge used to support decision-making at all levels and areas of the health sector can be improved. Since full identification of the components of this system – for better planning and management of the factors influencing its performance – seems necessary, this study explores and compares different perspectives on its components. Methods: This is a descriptive, comparative study. The study population includes printed and electronic documents containing components of the national health information system in three parts: input, process and output. Information was gathered using library resources and internet searches, and the data were analyzed using comparative tables and qualitative analysis. Results: The findings showed that there are three different perspectives on the components of the national health information system: the Lippeveld, Sauerborn and Bodart model (2000), the Health Metrics Network (HMN) model from the World Health Organization (2008), and Gattini's model (2009). In the input section (resources and structure), all three models require components of management and leadership, planning and program design, staffing, software and hardware, and facilities and equipment. In the process section, all three models emphasize actions ensuring the quality of the health information system; in the output section, all except the Lippeveld model consider information products and the use and distribution of information as components of the national health information system. 
Conclusion: The results showed that all three models discuss the components of health information only briefly in the input section, and the Lippeveld model overlooks the components of the national health information system in the process and output sections. It therefore seems that the Health Metrics Network model presents the components of the health information system most comprehensively across all three sections: input, process and output. PMID:24825937

  11. Latent effects decision analysis

    DOEpatents

    Cooper, J Arlin [Albuquerque, NM; Werner, Paul W [Albuquerque, NM

    2004-08-24

    Latent effects on a system are broken down into components ranging from those far removed in time from the system under study (latent) to those which closely effect changes in the system. Each component is provided with weighted inputs either by a user or from outputs of other components. A non-linear mathematical process known as `soft aggregation` is performed on the inputs to each component to provide information relating to the component. This information is combined in decreasing order of latency to the system to provide a quantifiable measure of an attribute of a system (e.g., safety) or to test hypotheses (e.g., for forensic deduction or decisions about various system design options).
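The abstract does not give the formula for `soft aggregation`, so the sketch below is only one plausible nonlinear weighted combination, labeled as an assumption: it blends the worst-case input with the weighted average, so a single bad input degrades the result without solely determining it, and components can be chained in decreasing order of latency as the patent describes.

```python
def soft_aggregate(values, weights, softness=0.5):
    """Illustrative nonlinear aggregation (an assumption, not the patented
    formula): blend the worst-case input with the weighted average.
    values  -- component scores in [0, 1]
    weights -- relative importance of each input
    softness -- 0 gives a pure weighted average, 1 a pure worst case."""
    wavg = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    worst = min(values)
    return softness * worst + (1 - softness) * wavg

# Chaining components from most latent to most proximate: each component's
# output feeds the next, so remote (latent) effects propagate forward.
def chain(layers):
    """layers: list of (values_from_prior_output, weights) stages."""
    score = None
    for values, weights in layers:
        inputs = values if score is None else values + [score]
        w = weights[: len(inputs)]
        score = soft_aggregate(inputs, w)
    return score
```

For instance, equal inputs pass through unchanged (`soft_aggregate([0.8, 0.8, 0.8], [1, 2, 1])` is 0.8), while one low input pulls the aggregate down more than a plain average would.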

  12. A Generic Modeling Process to Support Functional Fault Model Development

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off-nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.

  13. Development of a Water Recovery System Resource Tracking Model

    NASA Technical Reports Server (NTRS)

    Chambliss, Joe; Stambaugh, Imelda; Sargusingh, Miriam; Shull, Sarah; Moore, Michael

    2015-01-01

    A simulation model has been developed to track water resources in an exploration vehicle using Regenerative Life Support (RLS) systems. The Resource Tracking Model (RTM) integrates the functions of all the vehicle components that affect the processing and recovery of water during simulated missions. The approach used in developing the RTM enables its use as part of a complete vehicle simulation for real time mission studies. Performance data for the components in the RTM is focused on water processing. The data provided to the model has been based on the most recent information available regarding the technology of the component. This paper will describe the process of defining the RLS system to be modeled, the way the modeling environment was selected, and how the model has been implemented. Results showing how the RLS components exchange water are provided in a set of test cases.

  14. Development of a Water Recovery System Resource Tracking Model

    NASA Technical Reports Server (NTRS)

    Chambliss, Joe; Stambaugh, Imelda; Sargusingh, Miriam; Shull, Sarah; Moore, Michael

    2014-01-01

    A simulation model has been developed to track water resources in an exploration vehicle using regenerative life support (RLS) systems. The model integrates the functions of all the vehicle components that affect the processing and recovery of water during simulated missions. The approach used in developing the model enables the Resource Tracking Model (RTM) to be used as part of a complete vehicle simulation for real-time mission studies. Performance data for the variety of components in the RTM is focused on water processing and has been defined based on the most recent information available for each component's technology. This paper describes the process of defining the RLS system to be modeled, the selection of the modeling environment, and the implementation of the model. Results showing how the variety of RLS components exchange water are provided in a set of test cases.

  15. Using Dual Process Models to Examine Impulsivity Throughout Neural Maturation.

    PubMed

    Leshem, Rotem

    2016-01-01

    The multivariate construct of impulsivity is examined through neural systems and connections that comprise the executive functioning system. It is proposed that cognitive and behavioral components of impulsivity can be divided into two distinct groups, mediated by (1) the cognitive control system: deficits in top-down cognitive control processes referred to as action/cognitive impulsivity and (2) the socioemotional system: related to bottom-up affective/motivational processes referred to as affective impulsivity. Examination of impulsivity from a developmental viewpoint can guide future research, potentially enabling the selection of more effective interventions for impulsive individuals, based on the cognitive components requiring improvement.

  16. Analysis of exergy efficiency of a super-critical compressed carbon dioxide energy-storage system based on the orthogonal method.

    PubMed

    He, Qing; Hao, Yinping; Liu, Hui; Liu, Wenyi

    2018-01-01

    Super-critical compressed carbon dioxide energy storage (SC-CCES) is a new type of gas energy-storage technology. This paper used the orthogonal method and variance analysis to determine which factors significantly affect the thermodynamic characteristics of the SC-CCES system, identifying the significant factors and interactions in the energy-storage process, the energy-release process and the whole energy-storage system. The results show that interactions between components have little influence on the energy-storage process, the energy-release process and the system as a whole; the significant factors relate mainly to the characteristics of the individual system components. These findings provide a reference for optimizing the thermal properties of the energy-storage system.
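The orthogonal-method workflow can be illustrated generically: run the experiments prescribed by an orthogonal array, then compare mean responses per factor level to rank factor significance. The array, factor assignments, and responses below are hypothetical stand-ins, not the paper's SC-CCES parameters.

```python
# L4(2^3) orthogonal array: 4 runs cover 3 two-level factors evenly,
# so each level of each factor appears the same number of times.
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

def factor_effects(array, responses):
    """Range analysis of an orthogonal experiment: for each factor,
    compare the mean response at each level; a larger spread between
    level means indicates a more significant factor."""
    n_factors = len(array[0])
    effects = []
    for f in range(n_factors):
        by_level = {}
        for run, y in zip(array, responses):
            by_level.setdefault(run[f], []).append(y)
        means = [sum(v) / len(v) for v in by_level.values()]
        effects.append(max(means) - min(means))
    return effects
```

With synthetic responses dominated by the first factor, e.g. `factor_effects(L4, [0.0, 1.0, 10.0, 11.0])`, the range analysis ranks factor 1 far above factors 2 and 3, which is the kind of significance ordering the paper derives (via variance analysis) for the SC-CCES components.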

  17. Analysis of exergy efficiency of a super-critical compressed carbon dioxide energy-storage system based on the orthogonal method

    PubMed Central

    He, Qing; Liu, Hui; Liu, Wenyi

    2018-01-01

    Super-critical compressed carbon dioxide energy storage (SC-CCES) is a new type of gas energy-storage technology. This paper used the orthogonal method and variance analysis to determine which factors significantly affect the thermodynamic characteristics of the SC-CCES system, identifying the significant factors and interactions in the energy-storage process, the energy-release process and the whole energy-storage system. The results show that interactions between components have little influence on the energy-storage process, the energy-release process and the system as a whole; the significant factors relate mainly to the characteristics of the individual system components. These findings provide a reference for optimizing the thermal properties of the energy-storage system. PMID:29634742

  18. Real-time diagnostics for a reusable rocket engine

    NASA Technical Reports Server (NTRS)

    Guo, T. H.; Merrill, W.; Duyar, A.

    1992-01-01

    A hierarchical, decentralized diagnostic system is proposed for the Real-Time Diagnostic System component of the Intelligent Control System (ICS) for reusable rocket engines. The proposed diagnostic system has three layers of information processing: condition monitoring, fault mode detection, and expert system diagnostics. The condition monitoring layer is the first level of signal processing. Here, important features of the sensor data are extracted. These processed data are then used by the higher level fault mode detection layer to do preliminary diagnosis on potential faults at the component level. Because of the closely coupled nature of the rocket engine propulsion system components, it is expected that a given engine condition may trigger more than one fault mode detector. Expert knowledge is needed to resolve the conflicting reports from the various failure mode detectors. This is the function of the diagnostic expert layer. Here, the heuristic nature of this decision process makes it desirable to use an expert system approach. Implementation of the real-time diagnostic system described above requires a wide spectrum of information processing capability. Generally, in the condition monitoring layer, fast data processing is often needed for feature extraction and signal conditioning. This is usually followed by some detection logic to determine the selected faults on the component level. Three different techniques are used to attack different fault detection problems in the NASA LeRC ICS testbed simulation. The first technique employed is the neural network application for real-time sensor validation which includes failure detection, isolation, and accommodation. The second approach demonstrated is the model-based fault diagnosis system using on-line parameter identification. Besides these model based diagnostic schemes, there are still many failure modes which need to be diagnosed by the heuristic expert knowledge. 
The heuristic expert knowledge is implemented using a real-time expert system tool called G2 by Gensym Corp. Finally, the distributed diagnostic system requires another level of intelligence to oversee the fault mode reports generated by component fault detectors. The decision making at this level can best be done using a rule-based expert system. This level of expert knowledge is also implemented using G2.
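The three-layer flow (condition monitoring, fault mode detection, expert arbitration) can be sketched as plain functions. The feature names, thresholds, and rules below are invented placeholders for illustration, not the ICS testbed logic or G2 rules.

```python
def condition_monitor(sample):
    """Layer 1: signal conditioning / feature extraction.
    A mean and an outlier count stand in for real features."""
    mean = sum(sample) / len(sample)
    spikes = sum(1 for x in sample if abs(x - mean) > 3.0)
    return {"mean": mean, "spikes": spikes}

def fault_detectors(features):
    """Layer 2: component-level detectors; more than one may fire
    for the same engine condition."""
    reports = []
    if features["spikes"] > 2:
        reports.append("sensor_noise")
    if features["mean"] > 10.0:
        reports.append("overpressure")
    return reports

def expert_resolve(reports):
    """Layer 3: rule-based arbitration of conflicting detector reports."""
    if "sensor_noise" in reports and "overpressure" in reports:
        return "validate sensor before acting"  # noise may explain the reading
    return reports[0] if reports else "nominal"
```

The key structural point the abstract makes survives even in this toy form: the expert layer exists precisely because layer-2 detectors can disagree, and only heuristic knowledge can rank the competing explanations.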

  19. Overview of the Smart Network Element Architecture and Recent Innovations

    NASA Technical Reports Server (NTRS)

    Perotti, Jose M.; Mata, Carlos T.; Oostdyk, Rebecca L.

    2008-01-01

    In industrial environments, system operators rely on the availability and accuracy of sensors to monitor processes and detect failures of components and/or processes. The sensors must be networked in such a way that their data is reported to a central human interface, where operators are tasked with making real-time decisions based on the state of the sensors and the components that are being monitored. Incorporating health management functions at this central location aids the operator by automating the decision-making process to suggest, and sometimes perform, the action required by current operating conditions. Integrated Systems Health Management (ISHM) aims to incorporate data from many sources, including real-time and historical data and user input, and extract information and knowledge from that data to diagnose failures and predict future failures of the system. Distributing health management processing to lower levels of the architecture reduces the bandwidth required for ISHM, enhances data fusion, makes systems and processes more robust, and improves the resolution for detecting and isolating failures in a system, subsystem, component, or process. The Smart Network Element (SNE) has been developed at NASA Kennedy Space Center to perform intelligent functions at the sensor and actuator level in support of ISHM.

  20. Automated assembly of fast-axis collimation (FAC) lenses for diode laser bar modules

    NASA Astrophysics Data System (ADS)

    Miesner, Jörn; Timmermann, Andre; Meinschien, Jens; Neumann, Bernhard; Wright, Steve; Tekin, Tolga; Schröder, Henning; Westphalen, Thomas; Frischkorn, Felix

    2009-02-01

    Laser diodes and diode laser bars are key components in high power semiconductor lasers and solid state laser systems. During manufacture, the assembly of the fast axis collimation (FAC) lens is a crucial step. The goal of our activities is to design an automated assembly system for high volume production. In this paper the results of an intermediate milestone will be reported: a demonstration system was designed, realized and tested to prove the feasibility of all of the system components and process features. The demonstration system consists of a high precision handling system, metrology for process feedback, a powerful digital image processing system and tooling for glue dispensing, UV curing and laser operation. The system components as well as their interaction with each other were tested in an experimental system in order to glean design knowledge for the fully automated assembly system. The adjustment of the FAC lens is performed by a series of predefined steps monitored by two cameras concurrently imaging the far field and the near field intensity distributions. Feedback from these cameras processed by a powerful and efficient image processing algorithm control a five axis precision motion system to optimize the fast axis collimation of the laser beam. Automated cementing of the FAC to the diode bar completes the process. The presentation will show the system concept, the algorithm of the adjustment as well as experimental results. A critical discussion of the results will close the talk.

  1. Development of a software tool to support chemical and biological terrorism intelligence analysis

    NASA Astrophysics Data System (ADS)

    Hunt, Allen R.; Foreman, William

    1997-01-01

    AKELA has developed a software tool which uses a systems analytic approach to model the critical processes which support the acquisition of biological and chemical weapons by terrorist organizations. This tool has four major components. The first is a procedural expert system which describes the weapon acquisition process. It shows the relationship between the stages a group goes through to acquire and use a weapon, and the activities in each stage required to be successful. It applies to both state sponsored and small group acquisition. An important part of this expert system is an analysis of the acquisition process which is embodied in a list of observables of weapon acquisition activity. These observables are cues for intelligence collection. The second component is a detailed glossary of technical terms which helps analysts with a non-technical background understand the potential relevance of collected information. The third component is a linking capability which shows where technical terms apply to the parts of the acquisition process. The final component is a simple, intuitive user interface which shows a picture of the entire process at a glance and lets the user move quickly to get more detailed information. This paper explains each of these four model components.

  2. High Resolution X-Ray Micro-CT of Ultra-Thin Wall Space Components

    NASA Technical Reports Server (NTRS)

    Roth, Don J.; Rauser, R. W.; Bowman, Randy R.; Bonacuse, Peter; Martin, Richard E.; Locci, I. E.; Kelley, M.

    2012-01-01

    A high resolution micro-CT system has been assembled and is being used to provide optimal characterization for ultra-thin wall space components. The Glenn Research Center NDE Sciences Team, using this CT system, has assumed the role of inspection vendor for the Advanced Stirling Convertor (ASC) project at NASA. This article will discuss many aspects of the development of the CT scanning for this type of component, including CT system overview; inspection requirements; process development, software utilized and developed to visualize, process, and analyze results; calibration sample development; results on actual samples; correlation with optical/SEM characterization; CT modeling; and development of automatic flaw recognition software. Keywords: Nondestructive Evaluation, NDE, Computed Tomography, Imaging, X-ray, Metallic Components, Thin Wall Inspection

  3. The design of red-blue 3D video fusion system based on DM642

    NASA Astrophysics Data System (ADS)

    Fu, Rongguo; Luo, Hao; Lv, Jin; Feng, Shu; Wei, Yifang; Zhang, Hao

    2016-10-01

    To address the uncertainty of traditional 3D video capture, including the camera focal lengths and the distance and angle between the two cameras, a red-blue 3D video fusion system with parallel optical axes is designed on the DM642 hardware processing platform. To counter the brightness reduction of traditional 3D video, a brightness-enhancement algorithm based on human visual characteristics is proposed, along with a luminance-component processing method based on the YCbCr color space. The BIOS real-time operating system is used to improve real-time performance. The video processing circuit, built around the DM642, enhances the brightness of the images, converts the video signals from YCbCr to RGB, and extracts the R component from one camera and, synchronously, the G and B components from the other, finally outputting fused 3D images. Real-time adjustments such as translation and scaling of the two color components are realized through serial communication between the VC software and BIOS. By adding the red and blue components, the system reduces the loss of chrominance information and retains more than 95% of the original color saturation. An optimized enhancement algorithm reduces the amount of data fused during video processing, shortening the fusion time and improving the viewing effect. Experimental results show that the system can capture images at close range and output red-blue 3D video, providing a good experience for audiences wearing red-blue glasses.
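The color-space conversion and channel-extraction step can be sketched with the standard ITU-R BT.601 full-range equations. The function names and the fusion convention (red from the left camera, green and blue from the right) are assumptions for illustration; the abstract does not specify which camera feeds which channel.

```python
import numpy as np

def ycbcr_to_rgb(y, cb, cr):
    """ITU-R BT.601 full-range YCbCr -> RGB conversion."""
    r = y + 1.402 * (cr - 128.0)
    g = y - 0.344136 * (cb - 128.0) - 0.714136 * (cr - 128.0)
    b = y + 1.772 * (cb - 128.0)
    return np.clip(np.stack([r, g, b], axis=-1), 0, 255).astype(np.uint8)

def fuse_red_blue(left_ycbcr, right_ycbcr):
    """Anaglyph fusion: take the R channel from the left view and the
    G and B channels from the right view (assumed eye assignment)."""
    left = ycbcr_to_rgb(*left_ycbcr)
    right = ycbcr_to_rgb(*right_ycbcr)
    fused = right.copy()
    fused[..., 0] = left[..., 0]   # replace the red channel with the left view
    return fused
```

A neutral-gray frame (Y = 128, Cb = Cr = 128) maps to RGB (128, 128, 128) in both views, which makes the conversion easy to sanity-check before running real camera frames through it.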

  4. The development of control and monitoring system on marine current renewable energy Case study: strait of Toyapakeh - Nusa Penida, Bali

    NASA Astrophysics Data System (ADS)

    Arief, I. S.; Suherman, I. H.; Wardani, A. Y.; Baidowi, A.

    2017-05-01

    Control and monitoring is a continuous process of securing the assets of a marine current renewable energy installation. A control and monitoring system is assigned to each critical component, where the critical components are identified using the Failure Mode and Effects Analysis (FMEA) method. The process developed in this paper takes the form of a sensor matrix: the matrix correlates the critical components with the monitoring system, which is supported by sensors to aid decision-making.
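An FMEA-style criticality selection can be sketched as follows. The component names, ratings, and RPN threshold are entirely hypothetical, chosen only to illustrate how critical components are selected for dedicated monitoring sensors; they are not the paper's actual FMEA data for the Toyapakeh installation.

```python
def rpn(severity, occurrence, detection):
    """FMEA Risk Priority Number: the product of the three 1-10 ratings."""
    return severity * occurrence * detection

# Hypothetical failure-mode ratings for a marine current turbine.
failure_modes = {
    "blade":     rpn(9, 3, 4),
    "gearbox":   rpn(8, 5, 5),
    "generator": rpn(7, 4, 3),
    "seal":      rpn(6, 6, 7),
}

def critical_components(modes, threshold=100):
    """Components whose RPN exceeds the threshold get dedicated sensors
    in the monitoring matrix, ordered from most to least critical."""
    return sorted((c for c, r in modes.items() if r > threshold),
                  key=lambda c: -modes[c])
```

The resulting ranked list is one axis of the sensor matrix; the other axis lists the sensors that cover each selected component.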

  5. Optical read/write memory system components

    NASA Technical Reports Server (NTRS)

    Kozma, A.

    1972-01-01

    The optical components of a breadboard holographic read/write memory system have been fabricated and the parameters specified of the major system components: (1) a laser system; (2) an x-y beam deflector; (3) a block data composer; (4) the read/write memory material; (5) an output detector array; and (6) the electronics to drive, synchronize, and control all system components. The objectives of the investigation were divided into three concurrent phases: (1) to supply and fabricate the major components according to the previously established specifications; (2) to prepare computer programs to simulate the entire holographic memory system so that a designer can balance the requirements on the various components; and (3) to conduct a development program to optimize the combined recording and reconstruction process of the high density holographic memory system.

  6. Constraints on Fluctuations in Sparsely Characterized Biological Systems.

    PubMed

    Hilfinger, Andreas; Norman, Thomas M; Vinnicombe, Glenn; Paulsson, Johan

    2016-02-05

    Biochemical processes are inherently stochastic, creating molecular fluctuations in otherwise identical cells. Such "noise" is widespread but has proven difficult to analyze because most systems are sparsely characterized at the single cell level and because nonlinear stochastic models are analytically intractable. Here, we exactly relate average abundances, lifetimes, step sizes, and covariances for any pair of components in complex stochastic reaction systems even when the dynamics of other components are left unspecified. Using basic mathematical inequalities, we then establish bounds for whole classes of systems. These bounds highlight fundamental trade-offs that show how efficient assembly processes must invariably exhibit large fluctuations in subunit levels and how eliminating fluctuations in one cellular component requires creating heterogeneity in another.

  7. Constraints on Fluctuations in Sparsely Characterized Biological Systems

    NASA Astrophysics Data System (ADS)

    Hilfinger, Andreas; Norman, Thomas M.; Vinnicombe, Glenn; Paulsson, Johan

    2016-02-01

    Biochemical processes are inherently stochastic, creating molecular fluctuations in otherwise identical cells. Such "noise" is widespread but has proven difficult to analyze because most systems are sparsely characterized at the single cell level and because nonlinear stochastic models are analytically intractable. Here, we exactly relate average abundances, lifetimes, step sizes, and covariances for any pair of components in complex stochastic reaction systems even when the dynamics of other components are left unspecified. Using basic mathematical inequalities, we then establish bounds for whole classes of systems. These bounds highlight fundamental trade-offs that show how efficient assembly processes must invariably exhibit large fluctuations in subunit levels and how eliminating fluctuations in one cellular component requires creating heterogeneity in another.

  8. 76 FR 70490 - Certain Electronic Devices With Graphics Data Processing Systems, Components Thereof, and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-14

    ... Graphics Data Processing Systems, Components Thereof, and Associated Software; Institution of Investigation... associated software by reason of infringement of certain claims of U.S. Patent No. 5,945,997 (``the `997... software that infringe one or more of claims 1, 3-5, 9, and 16 of the `997 patent; claims 1, 5, and 9 of...

  9. Advanced optical manufacturing digital integrated system

    NASA Astrophysics Data System (ADS)

    Tao, Yizheng; Li, Xinglan; Li, Wei; Tang, Dingyong

    2012-10-01

    The development of advanced optical manufacturing technology must keep pace with modern science and technology. To address the problems of low efficiency, low yield, poor repeatability, and poor consistency in the manufacture of large, high-precision optical components, this paper applies a business-driven approach and the Rational Unified Process method to study the advanced optical manufacturing process flow and the requirements of an Advanced Optical Manufacturing Integrated System, and proposes its architecture and key technologies. The optical-component core and the manufacturing-process-driven design of the Advanced Optical Manufacturing Digital Integrated System are presented. The results show that the system works effectively: it realizes dynamic planning of the manufacturing process, and information integration improved the production yield of the manufactory.

  10. Automated processing of whole blood units: operational value and in vitro quality of final blood components

    PubMed Central

    Jurado, Marisa; Algora, Manuel; Garcia-Sanchez, Félix; Vico, Santiago; Rodriguez, Eva; Perez, Sonia; Barbolla, Luz

    2012-01-01

    Background The Community Transfusion Centre in Madrid currently processes whole blood using a conventional procedure (Compomat, Fresenius) followed by automated processing of buffy coats with the OrbiSac system (CaridianBCT). The Atreus 3C system (CaridianBCT) automates the production of red blood cells, plasma and an interim platelet unit from a whole blood unit. Interim platelet units are pooled to produce a transfusable platelet unit. In this study the Atreus 3C system was evaluated and compared to the routine method with regard to product quality and operational value. Materials and methods Over a 5-week period 810 whole blood units were processed using the Atreus 3C system. The attributes of the automated process were compared to those of the routine method by assessing productivity, space, equipment and staffing requirements. The data obtained were evaluated in order to estimate the impact of implementing the Atreus 3C system in the routine setting of the blood centre. Yield and in vitro quality of the final blood components processed with the two systems were evaluated and compared. Results The Atreus 3C system enabled higher throughput while requiring less space and employee time by decreasing the amount of equipment and processing time per unit of whole blood processed. Whole blood units processed on the Atreus 3C system gave a higher platelet yield, a similar amount of red blood cells and a smaller volume of plasma. Discussion These results support the conclusion that the Atreus 3C system produces blood components meeting quality requirements while providing a high operational efficiency. Implementation of the Atreus 3C system could result in a large organisational improvement. PMID:22044958

  11. Automated processing of whole blood units: operational value and in vitro quality of final blood components.

    PubMed

    Jurado, Marisa; Algora, Manuel; Garcia-Sanchez, Félix; Vico, Santiago; Rodriguez, Eva; Perez, Sonia; Barbolla, Luz

    2012-01-01

    The Community Transfusion Centre in Madrid currently processes whole blood using a conventional procedure (Compomat, Fresenius) followed by automated processing of buffy coats with the OrbiSac system (CaridianBCT). The Atreus 3C system (CaridianBCT) automates the production of red blood cells, plasma and an interim platelet unit from a whole blood unit. Interim platelet units are pooled to produce a transfusable platelet unit. In this study the Atreus 3C system was evaluated and compared to the routine method with regard to product quality and operational value. Over a 5-week period 810 whole blood units were processed using the Atreus 3C system. The attributes of the automated process were compared to those of the routine method by assessing productivity, space, equipment and staffing requirements. The data obtained were evaluated in order to estimate the impact of implementing the Atreus 3C system in the routine setting of the blood centre. Yield and in vitro quality of the final blood components processed with the two systems were evaluated and compared. The Atreus 3C system enabled higher throughput while requiring less space and employee time by decreasing the amount of equipment and processing time per unit of whole blood processed. Whole blood units processed on the Atreus 3C system gave a higher platelet yield, a similar amount of red blood cells and a smaller volume of plasma. These results support the conclusion that the Atreus 3C system produces blood components meeting quality requirements while providing a high operational efficiency. Implementation of the Atreus 3C system could result in a large organisational improvement.

  12. 10 CFR 63.112 - Requirements for preclosure safety analysis of the geologic repository operations area.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... emergency power to instruments, utility service systems, and operating systems important to safety if there... include: (a) A general description of the structures, systems, components, equipment, and process... of the performance of the structures, systems, and components to identify those that are important to...

  13. 10 CFR 63.112 - Requirements for preclosure safety analysis of the geologic repository operations area.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... emergency power to instruments, utility service systems, and operating systems important to safety if there... include: (a) A general description of the structures, systems, components, equipment, and process... of the performance of the structures, systems, and components to identify those that are important to...

  14. 10 CFR 63.112 - Requirements for preclosure safety analysis of the geologic repository operations area.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... emergency power to instruments, utility service systems, and operating systems important to safety if there... include: (a) A general description of the structures, systems, components, equipment, and process... of the performance of the structures, systems, and components to identify those that are important to...

  15. 10 CFR 63.112 - Requirements for preclosure safety analysis of the geologic repository operations area.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... emergency power to instruments, utility service systems, and operating systems important to safety if there... include: (a) A general description of the structures, systems, components, equipment, and process... of the performance of the structures, systems, and components to identify those that are important to...

  16. Computer-Aided Modeling and Analysis of Power Processing Systems (CAMAPPS). Phase 1: Users handbook

    NASA Technical Reports Server (NTRS)

    Kim, S.; Lee, J.; Cho, B. H.; Lee, F. C.

    1986-01-01

    The EASY5 macro component models developed for spacecraft power system simulation are described. A brief explanation of how to use the macro components with the EASY5 Standard Components to build a specific system is given through an example. The macro components are ordered according to the following functional groups: converter power stage models, compensator models, current-feedback models, constant frequency control models, load models, solar array models, and shunt regulator models. Major equations, a circuit model, and a program listing are provided for each macro component.
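    To make the "macro component" idea concrete, here is a hedged Python sketch (not EASY5 code) of a converter power-stage model feeding a load model: an averaged buck stage whose states are stepped with explicit Euler. The parameter values, the resistive load, and the integrator are illustrative assumptions.

```python
# Illustrative composition of component models: an averaged buck-converter
# power stage plus a resistive load model, stepped with explicit Euler.
# All numbers are made-up placeholders, not EASY5 model data.

def buck_stage(state, duty, v_in, L=100e-6, C=220e-6, R=5.0, dt=1e-6):
    """Averaged power-stage model: inductor current and capacitor voltage."""
    i_l, v_c = state
    i_load = v_c / R                   # resistive load model
    di = (duty * v_in - v_c) / L       # averaged inductor dynamics
    dv = (i_l - i_load) / C            # capacitor charge balance
    return (i_l + di * dt, v_c + dv * dt)

state = (0.0, 0.0)
for _ in range(20000):                 # 20 ms of simulated time
    state = buck_stage(state, duty=0.5, v_in=12.0)

print(round(state[1], 2))              # settles near duty * v_in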

  17. Coal conversion systems design and process modeling. Volume 1: Application of MPPR and Aspen computer models

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the Advanced System for Process Engineering (ASPEN) computer program were selected from available steady-state and dynamic models. The MPPM was selected to serve as the basis for development of the system-level design model structure because it provided the capability for process-block material and energy balances and high-level system sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) that could be extracted and used in the system design modeling program. While the ASPEN physical-property calculation routines are capable of generating the physical properties required for process simulation, not all required physical property data are available; missing data must be user-entered.
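    The process-block material balance described above can be sketched in a few lines of Python: each block maps inlet stream masses to outlet streams via yield fractions, and chaining blocks yields a system-level balance. The block names and yield numbers are invented placeholders, not MPPM/ASPEN data.

```python
# Toy block-level material balance: each process block splits its total
# inlet mass among outlet streams by fixed yield fractions.

def block(yields):
    """Return a process block that splits total inlet mass by yield fractions."""
    def run(streams):
        total = sum(streams.values())
        return {name: total * frac for name, frac in yields.items()}
    return run

gasifier = block({"syngas": 0.80, "slag": 0.15, "losses": 0.05})
cleanup  = block({"clean_syngas": 0.97, "captured_solids": 0.03})

raw = gasifier({"coal": 900.0, "steam": 100.0})        # kg/h in
product = cleanup({"syngas": raw["syngas"]})

print(round(product["clean_syngas"], 1))
```

A real simulator adds energy balances and physical-property packages on top of exactly this stream-in, stream-out block structure.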

  18. Radiology information system: a workflow-based approach.

    PubMed

    Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; van der Aalst, W M P

    2009-09-01

    Introducing workflow management technology into healthcare appears promising for addressing the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study the method of developing a workflow-based information system in a radiology department as a use case. First, a workflow model of the typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. The legacy systems could be taken as special components, which also corresponded to tasks and were integrated by transferring non-workflow-aware interfaces to standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of a changing process, and that it enhanced process management in the department. It can also provide a more workflow-aware integration method compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing a radiology information system with more flexibility, more process-management functionality and more workflow-aware integration. The work of this paper is an initial endeavor toward introducing workflow management technology in healthcare.
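    A minimal Python sketch of the loose-coupling idea described above: each task in the workflow model maps to one component, and the engine assembles them at run time. The task names and registry API here are invented for illustration, not taken from the paper's system.

```python
# Toy workflow engine: one loosely coupled component per modeled task,
# assembled and executed in the order the process model dictates.

class Engine:
    def __init__(self):
        self.registry = {}           # task name -> component callable

    def register(self, task, component):
        self.registry[task] = component

    def run(self, process, case):
        for task in process:         # execute tasks in modeled order
            case = self.registry[task](case)
        return case

engine = Engine()
engine.register("register_patient", lambda c: {**c, "registered": True})
engine.register("schedule_exam",    lambda c: {**c, "slot": "09:30"})
engine.register("report",           lambda c: {**c, "status": "reported"})

result = engine.run(["register_patient", "schedule_exam", "report"],
                    {"patient": "P001"})
print(result["status"])
```

A legacy system would be wrapped as just another registered callable, which is the "special component" integration the paper describes.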

  19. 30 CFR 260.130 - What criteria does MMS use for selecting bidding systems and bidding system components?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... system or tract, or may present a conflict that we will have to resolve in the process of bidding system... bidding systems and bidding system components? 260.130 Section 260.130 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, REGULATION, AND ENFORCEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OUTER CONTINENTAL...

  20. Overview of the production of sintered SiC optics and optical sub-assemblies

    NASA Astrophysics Data System (ADS)

    Williams, S.; Deny, P.

    2005-08-01

    The following is an overview of sintered silicon carbide (SSiC) material properties and processing requirements for the manufacturing of components for advanced-technology optical systems. The overview compares SSiC material properties to those of typical materials used for optics and optical structures. In addition, it reviews in detail, step by step, the manufacturing processes required to produce optical components. The process overview illustrates the current manufacturing process and concepts for expanding the process size capability, and includes information on the substantial capital equipment employed in the manufacturing of SSiC. This paper also reviews common in-process inspection methodology and design rules. The design rules are used to improve production yield, minimize cost, and maximize the inherent benefits of SSiC for optical systems. Optimizing optical system designs for an SSiC manufacturing process will allow systems designers to utilize SSiC as a low-risk, cost-competitive, and fast-cycle-time technology for next-generation optical systems.

  1. A shared resource between declarative memory and motor memory.

    PubMed

    Keisler, Aysha; Shadmehr, Reza

    2010-11-03

    The neural systems that support motor adaptation in humans are thought to be distinct from those that support the declarative system. Yet, during motor adaptation changes in motor commands are supported by a fast adaptive process that has important properties (rapid learning, fast decay) that are usually associated with the declarative system. The fast process can be contrasted to a slow adaptive process that also supports motor memory, but learns gradually and shows resistance to forgetting. Here we show that after people stop performing a motor task, the fast motor memory can be disrupted by a task that engages declarative memory, but the slow motor memory is immune from this interference. Furthermore, we find that the fast/declarative component plays a major role in the consolidation of the slow motor memory. Because of the competitive nature of declarative and nondeclarative memory during consolidation, impairment of the fast/declarative component leads to improvements in the slow/nondeclarative component. Therefore, the fast process that supports formation of motor memory is not only neurally distinct from the slow process, but it shares critical resources with the declarative memory system.

  2. A shared resource between declarative memory and motor memory

    PubMed Central

    Keisler, Aysha; Shadmehr, Reza

    2010-01-01

    The neural systems that support motor adaptation in humans are thought to be distinct from those that support the declarative system. Yet, during motor adaptation changes in motor commands are supported by a fast adaptive process that has important properties (rapid learning, fast decay) that are usually associated with the declarative system. The fast process can be contrasted to a slow adaptive process that also supports motor memory, but learns gradually and shows resistance to forgetting. Here we show that after people stop performing a motor task, the fast motor memory can be disrupted by a task that engages declarative memory, but the slow motor memory is immune from this interference. Furthermore, we find that the fast/declarative component plays a major role in the consolidation of the slow motor memory. Because of the competitive nature of declarative and non-declarative memory during consolidation, impairment of the fast/declarative component leads to improvements in the slow/non-declarative component. Therefore, the fast process that supports formation of motor memory is not only neurally distinct from the slow process, but it shares critical resources with the declarative memory system. PMID:21048140

  3. Process and catalyst for carbonylating olefins

    DOEpatents

    Zoeller, Joseph Robert

    1998-06-02

    Disclosed is an improved catalyst system and process for preparing aliphatic carbonyl compounds such as aliphatic carboxylic acids, alkyl esters of aliphatic carboxylic acids and anhydrides of aliphatic carboxylic acids by carbonylating olefins in the presence of a catalyst system comprising (1) a first component selected from at least one Group 6 metal, i.e., chromium, molybdenum, and/or tungsten, (2) a second component selected from at least one of certain halides and tertiary and quaternary compounds of a Group 15 element, i.e., nitrogen, phosphorus and/or arsenic, and (3) as a third component, a polar, aprotic solvent. The process employing the improved catalyst system is carried out under carbonylating conditions of pressure and temperature discussed herein. The process constitutes an improvement over known processes since it can be carried out at moderate carbonylation conditions without the necessity of using an expensive noble metal catalyst or volatile, toxic materials such as nickel tetracarbonyl, formic acid or a formate ester. Further, the addition of a polar, aprotic solvent to the catalyst system significantly increases, or accelerates, the rate at which the carbonylation takes place.

  4. Introduction. ERIC Processing Manual, Section I.

    ERIC Educational Resources Information Center

    Brandhorst, Ted, Ed.

    This document describes the major organizational components of the Educational Resources Information Center (ERIC) system, the interactions between those components, and the major products and services provided by those components. (WTB)

  5. Developing seventh grade students' systems thinking skills in the context of the human circulatory system.

    PubMed

    Raved, Lena; Yarden, Anat

    2014-01-01

    Developing systems thinking skills in school can provide useful tools for dealing with the vast amount of medical and health information that may help learners in decision making in their future lives as citizens. Thus, there is a need to develop effective tools that will allow learners to analyze biological systems and organize their knowledge. Here, we examine junior high school students' systems thinking skills in the context of the human circulatory system. A model was formulated for developing teaching and learning materials and for characterizing students' systems thinking skills. Specifically, we asked whether seventh grade students, who studied the human circulatory system, acquired systems thinking skills, and what are the characteristics of those skills? Concept maps were used to characterize students' systems thinking components and examine possible changes in the students' knowledge structure. These maps were composed by the students before and following the learning process. The study findings indicate a significant improvement in the students' ability to recognize the system components and the processes that occur within the system, as well as the relationships between different levels of organization of the system, following the learning process. Thus, following learning, students were able to organize the system's components and processes within a framework of relationships; that is, the students' systems thinking skills improved in the course of learning using the teaching and learning materials.

  6. Clustering execution in a processing system to increase power savings

    DOEpatents

    Bose, Pradip; Buyuktosunoglu, Alper; Jacobson, Hans M.; Vega, Augusto J.

    2018-03-20

    Embodiments relate to clustering execution in a processing system. An aspect includes accessing a control flow graph that defines a data dependency and an execution sequence of a plurality of tasks of an application that executes on a plurality of system components. The execution sequence of the tasks in the control flow graph is modified as a clustered control flow graph that clusters active and idle phases of a system component while maintaining the data dependency. The clustered control flow graph is sent to an operating system, where the operating system utilizes the clustered control flow graph for scheduling the tasks.
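    A hedged sketch of the clustering idea in Python: reorder a task DAG so tasks on the same component run back-to-back (lengthening the idle phases of other components), while every data dependency is preserved. The example graph and the greedy heuristic are illustrative assumptions, not the patented method.

```python
# Greedy list scheduling over a task DAG that prefers keeping the current
# component "active": among ready tasks, pick one on the last-used component.

from collections import defaultdict

def cluster_schedule(tasks, deps):
    """tasks: {name: component}; deps: list of (before, after) edges."""
    indeg = defaultdict(int)
    succ = defaultdict(list)
    for a, b in deps:
        indeg[b] += 1
        succ[a].append(b)
    ready = sorted(t for t in tasks if indeg[t] == 0)
    order, last_comp = [], None
    while ready:
        # prefer a ready task on the component we just used
        pick = next((t for t in ready if tasks[t] == last_comp), ready[0])
        ready.remove(pick)
        order.append(pick)
        last_comp = tasks[pick]
        for s in succ[pick]:
            indeg[s] -= 1
            if indeg[s] == 0:
                ready.append(s)
    return order

tasks = {"a": "cpu", "b": "disk", "c": "cpu", "d": "disk"}
order = cluster_schedule(tasks, [("a", "d"), ("b", "d")])
print(order)   # cpu tasks cluster together while a->d and b->d still hold
```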

  7. Method and system for controlling a gasification or partial oxidation process

    DOEpatents

    Rozelle, Peter L; Der, Victor K

    2015-02-10

    A method and system for controlling a fuel gasification system includes optimizing a conversion of solid components in the fuel to gaseous fuel components, controlling the flux of solids entrained in the product gas through equipment downstream of the gasifier, and maximizing the overall efficiencies of processes utilizing gasification. A combination of models, when utilized together, can be integrated with existing plant control systems and operating procedures and employed to develop new control systems and operating procedures. Such an approach is further applicable to gasification systems that utilize both dry feed and slurry feed.

  8. Applicability and Limitations of Reliability Allocation Methods

    NASA Technical Reports Server (NTRS)

    Cruz, Jose A.

    2016-01-01

    The reliability allocation process may be described as the process of assigning reliability requirements to individual components within a system in order to attain the specified system reliability. For large systems, allocation is often performed at different stages of system design, often beginning at the conceptual stage. As the system design develops and more information about components and the operating environment becomes available, different allocation methods can be considered. Reliability allocation methods are usually divided into two categories: weighting factors and optimal reliability allocation. When properly applied, these methods can produce reasonable approximations. Reliability allocation techniques have limitations and implied assumptions that need to be understood by system engineers; applying them without understanding those limitations and assumptions can produce unrealistic results. This report addresses weighting factors and optimal reliability allocation techniques, and identifies the applicability and limitations of each.
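    A short sketch of the weighting-factor idea for a series system: each component i receives R_i = R_sys ** (w_i / Σw), so the allocated reliabilities multiply back to the system target. The component names and weights below are arbitrary example values, not from the report.

```python
# Weighting-factor reliability allocation for a series system:
# heavier weight => larger share of the allowed unreliability.

def allocate(r_sys, weights):
    total = sum(weights.values())
    return {name: r_sys ** (w / total) for name, w in weights.items()}

alloc = allocate(0.95, {"pump": 3.0, "valve": 1.0, "controller": 1.0})

product = 1.0
for r in alloc.values():
    product *= r
print(round(product, 6))   # the allocations multiply back to the target
```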

  9. Developing Seventh Grade Students’ Systems Thinking Skills in the Context of the Human Circulatory System

    PubMed Central

    Raved, Lena; Yarden, Anat

    2014-01-01

    Developing systems thinking skills in school can provide useful tools for dealing with the vast amount of medical and health information that may help learners in decision making in their future lives as citizens. Thus, there is a need to develop effective tools that will allow learners to analyze biological systems and organize their knowledge. Here, we examine junior high school students' systems thinking skills in the context of the human circulatory system. A model was formulated for developing teaching and learning materials and for characterizing students' systems thinking skills. Specifically, we asked whether seventh grade students, who studied the human circulatory system, acquired systems thinking skills, and what are the characteristics of those skills? Concept maps were used to characterize students' systems thinking components and examine possible changes in the students' knowledge structure. These maps were composed by the students before and following the learning process. The study findings indicate a significant improvement in the students' ability to recognize the system components and the processes that occur within the system, as well as the relationships between different levels of organization of the system, following the learning process. Thus, following learning, students were able to organize the system's components and processes within a framework of relationships; that is, the students' systems thinking skills improved in the course of learning using the teaching and learning materials. PMID:25520948

  10. Definition of Contravariant Velocity Components

    NASA Technical Reports Server (NTRS)

    Hung, Ching-moa; Kwak, Dochan (Technical Monitor)

    2002-01-01

    In this paper we have reviewed the basics of tensor analysis in an attempt to clarify some misconceptions regarding contravariant and covariant vector components as used in fluid dynamics. We have indicated that contravariant components are the components of a given vector expressed as a unique combination of the covariant base vector system and, vice versa, that covariant components are the components of a vector expressed in the contravariant base vector system. Mathematically, expressing a vector as a combination of base vectors is a decomposition process for a specific base vector system. Hence, the contravariant velocity components are the decomposed components of the velocity vector along the directions of the coordinate lines, with respect to the covariant base vector system. However, the contravariant (and covariant) components are not physical quantities: their magnitudes and dimensions are controlled by the corresponding covariant (and contravariant) base vectors.
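    In standard tensor notation, the decomposition described above can be written compactly (these are textbook identities, not equations quoted from the paper):

```latex
\mathbf{v} \;=\; v^{i}\,\mathbf{g}_{i} \;=\; v_{i}\,\mathbf{g}^{i},
\qquad
v^{i} \;=\; \mathbf{v}\cdot\mathbf{g}^{i},
\qquad
v_{i} \;=\; \mathbf{v}\cdot\mathbf{g}_{i},
\qquad
\mathbf{g}_{i}\cdot\mathbf{g}^{j} \;=\; \delta_{i}^{\;j},
```

where the \(\mathbf{g}_{i}\) are the covariant base vectors tangent to the coordinate lines and the \(\mathbf{g}^{i}\) are the contravariant (dual) base vectors. Because the base vectors are in general neither of unit length nor dimensionless, the components \(v^{i}\) and \(v_{i}\) are not physical velocity components.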

  11. A review on the relationship between food structure, processing, and bioavailability.

    PubMed

    Sensoy, Ilkay

    2014-01-01

    This review highlights the effects of processing and the food matrix on the bioaccessibility and bioavailability of functional components. The human digestive system is reviewed as an element in bioavailability. Methods for determining bioaccessibility and bioavailability are described. Information about the location of functional compounds in the tissue is presented to portray the matrix. Research data on the effects of the food matrix and processing on bioaccessibility and bioavailability are summarized. Finally, trends in the development of functional-component delivery systems are included.

  12. Analysis of electromagnetic interference from power system processing and transmission components for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Barber, Peter W.; Demerdash, Nabeel A. O.; Wang, R.; Hurysz, B.; Luo, Z.

    1991-01-01

    The goal is to analyze the potential effects of electromagnetic interference (EMI) originating from power system processing and transmission components for Space Station Freedom. The approach consists of four steps: (1) develop analytical tools (models and computer programs); (2) conduct parameterization studies; (3) predict the global space station EMI environment; and (4) provide a basis for modification of EMI standards.

  13. The Livingstone Model of a Main Propulsion System

    NASA Technical Reports Server (NTRS)

    Bajwa, Anupa; Sweet, Adam; Korsmeyer, David (Technical Monitor)

    2003-01-01

    Livingstone is a discrete, propositional-logic-based inference engine that has been used for diagnosis of physical systems. We present a component-based model of a Main Propulsion System (MPS) and describe how it is used with Livingstone (L2) to implement a diagnostic system for integrated vehicle health management (IVHM) in the Propulsion IVHM Technology Experiment (PITEX). We start by discussing the process of conceptualizing such a model and describe graphical tools that facilitated its generation. The model is composed of components (which map onto physical components), connections between components, and constraints. A component is specified by variables, with a set of discrete, qualitative values for each variable in its local nominal and failure modes. For each mode, the model specifies the component's behavior and transitions. We describe the MPS components' nominal and fault modes and the associated Livingstone variables and data structures. Given this model, and observed external commands and observations from the system, Livingstone tracks the state of the MPS over discrete time steps by choosing trajectories that are consistent with the observations. We briefly discuss how the compiled model fits into the overall PITEX architecture. Finally, we summarize our modeling experience, discuss advantages and disadvantages of our approach, and suggest enhancements to the modeling process.
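    A toy, Livingstone-flavored sketch (emphatically not the L2 engine or the PITEX model): a component is a set of discrete modes, each constraining how a command maps to an observation, and diagnosis retains the modes consistent with what was observed. The mode names and behaviors are invented.

```python
# Mode estimation in miniature: keep only the component modes whose
# predicted behavior matches the observation for the given command.

MODES = {
    "nominal":      lambda cmd: cmd,          # valve does what it is told
    "stuck_closed": lambda cmd: "closed",     # fault: output always closed
    "stuck_open":   lambda cmd: "open",       # fault: output always open
}

def consistent_modes(cmd, observed):
    return [m for m, behave in MODES.items() if behave(cmd) == observed]

# commanded open, but the flow observation says closed -> nominal ruled out
print(consistent_modes("open", "closed"))
```

Livingstone extends this picture with transitions between modes, connections among many components, and tracking of consistent trajectories over time.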

  14. An overview of Virginia's computerized crash records systems.

    DOT National Transportation Integrated Search

    1995-01-01

    This report identifies the various components of Virginia's computerized crash records systems and explains how these components process crash data. Emphasis has been placed on recording information that was previously not documented. Most of the sta...

  15. (n, N) type maintenance policy for multi-component systems with failure interactions

    NASA Astrophysics Data System (ADS)

    Zhang, Zhuoqi; Wu, Su; Li, Binfeng; Lee, Seungchul

    2015-04-01

    This paper studies maintenance policies for multi-component systems in which failure interactions and opportunistic maintenance (OM) are involved. This maintenance problem can be formulated as a Markov decision process (MDP). However, since the action set and state space of the MDP expand exponentially as the number of components increases, traditional approaches are computationally intractable. To deal with the curse of dimensionality, we decompose such a multi-component system into mutually influential single-component systems. Each single-component system is formulated as an MDP with the objective of minimising its long-run average maintenance cost. Under some reasonable assumptions, we prove the existence of the optimal (n, N) type policy for a single-component system. An algorithm to obtain the optimal (n, N) type policy is also proposed. Based on the proposed algorithm, we develop an iterative approximation algorithm to obtain an acceptable maintenance policy for a multi-component system. Numerical examples show that failure interactions and OM have significant effects on a maintenance policy.
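    The control-limit structure of an (n, N) type policy can be sketched directly (parameter values invented, and "age" standing in for whatever deterioration state the paper's MDP uses): a component is preventively maintained at its own threshold N, but when another component's failure creates a maintenance opportunity, it is maintained already at the lower threshold n ≤ N.

```python
# Decision rule for an (n, N) control-limit maintenance policy.

def nN_action(age, n, N, opportunity):
    if age >= N:
        return "preventive_maintenance"       # own threshold reached
    if opportunity and age >= n:
        return "opportunistic_maintenance"    # piggyback on another stop
    return "do_nothing"

print(nN_action(6, n=4, N=10, opportunity=True))    # worth piggybacking
print(nN_action(6, n=4, N=10, opportunity=False))   # not worth a dedicated stop
```

Optimising n and N (e.g. by the paper's iterative algorithm) is what trades off dedicated downtime against the savings from shared maintenance opportunities.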

  16. GITEWS, an extensible and open integration platform for manifold sensor systems and processing components based on Sensor Web Enablement and the principles of Service Oriented Architectures

    NASA Astrophysics Data System (ADS)

    Haener, Rainer; Waechter, Joachim; Fleischer, Jens; Herrnkind, Stefan; Schwarting, Herrmann

    2010-05-01

    The German Indonesian Tsunami Early Warning System (GITEWS) is a multifaceted system consisting of various sensor types, such as seismometers, sea level sensors and GPS stations, and processing components, each with its own system behavior and proprietary data structure. To operate a warning chain, from measurements up to warning products, all components have to interact correctly, both syntactically and semantically. In designing the system, great emphasis was laid on conformity to the Sensor Web Enablement (SWE) specification of the Open Geospatial Consortium (OGC). The technical infrastructure, the so-called Tsunami Service Bus (TSB), follows the blueprint of Service Oriented Architectures (SOA). The TSB is an integration concept (SWE) in which functionality (observe, task, notify, alert, and process) is grouped around business processes (Monitoring, Decision Support, Sensor Management) and packaged as interoperable services (SAS, SOS, SPS, WNS). The benefits of using a flexible architecture together with SWE lead to an open integration platform that: • accesses and controls heterogeneous sensors in a uniform way (Functional Integration) • assigns functionality to distinct services (Separation of Concerns) • allows resilient relationships between systems (Loose Coupling) • integrates services so that they can be accessed from everywhere (Location Transparency) • enables infrastructures that integrate heterogeneous applications (Encapsulation) • allows combination of services (Orchestration) and data exchange within business processes. Warning systems will evolve over time: new sensor types might be added, old sensors will be replaced and processing components will be improved. From a collection of a few basic services it shall be possible to compose the more complex functionality essential for specific warning systems.
    Given these requirements, a flexible infrastructure is a prerequisite for sustainable systems, and their architecture must be tailored for evolution. The use of well-known techniques and widely used open-source software implementing industrial standards reduces the impact of service modifications, allowing the evolution of the system as a whole. GITEWS implemented a solution to feed raw sensor data from any (remote) system into the infrastructure. Specific dispatchers enable plugging in sensor-type-specific processing without changing the architecture. Client components do not need to be adjusted when new sensor types or sensor instances are added to the system, because they access them via standardized services. One of the outstanding features of service-oriented architectures is the possibility of composing new services from existing ones. This so-called orchestration allows the definition of new warning processes that can be adapted easily to new requirements. This approach has the following advantages: • Implementing SWE makes it possible to establish the "detection" and integration of sensors via the internet; thus a system of systems combining early warning functionality at different levels of detail is feasible. • Any institution can add both its own components and components from third parties, if they are developed in conformance with SOA principles. In a federation, an institution keeps the ownership of its data and decides which data are provided by a service, and when. • A system can be deployed at minor cost as a core for further development at any institution, enabling autonomous early warning or monitoring systems. The presentation covers both the design and various instantiations (live demonstration) of the GITEWS architecture. Experiences concerning the design and complexity of SWE will be addressed in detail.
    A substantial amount of attention is devoted to the techniques and methods of extending the architecture, adapting proprietary components to SWE services and encodings, and orchestrating them in high-level workflows and processes. Furthermore, the potential of the architecture concerning adaptive behavior, collaboration across boundaries and semantic interoperability will be addressed.
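    To make the SWE service integration concrete, here is a client-side sketch that builds a key-value-pair GetObservation request for an OGC Sensor Observation Service (SOS). The endpoint URL, offering name, and observed-property URN are placeholders, not GITEWS identifiers.

```python
# Build a KVP GetObservation request for an OGC SOS 1.0 endpoint.
# Endpoint and identifiers below are illustrative placeholders.

from urllib.parse import urlencode

def sos_get_observation(endpoint, offering, observed_property):
    params = {
        "service": "SOS",
        "version": "1.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
    }
    return endpoint + "?" + urlencode(params)

url = sos_get_observation("https://example.org/sos",
                          "SEA_LEVEL",
                          "urn:ogc:def:property:waterlevel")
print("request=GetObservation" in url)
```

Because every sensor type is exposed through the same service interface, a client built against SOS needs no changes when a new sensor is plugged into the bus, which is exactly the Functional Integration benefit claimed above.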

  17. Status of the ITER Cryodistribution

    NASA Astrophysics Data System (ADS)

    Chang, H.-S.; Vaghela, H.; Patel, P.; Rizzato, A.; Cursan, M.; Henry, D.; Forgeas, A.; Grillot, D.; Sarkar, B.; Muralidhara, S.; Das, J.; Shukla, V.; Adler, E.

    2017-12-01

    Since the conceptual design of the ITER Cryodistribution, many modifications have been applied owing to both system optimization and improved knowledge of the clients' requirements. Process optimizations in the Cryoplant resulted in component simplifications, whereas increased heat loads in some of the superconducting magnet systems required a more complicated process configuration; the removal of one cold box was also made possible by component arrangement standardization. Another cold box, planned for redundancy, has been removed owing to a modification of the Tokamak in-Cryostat piping layout. In this proceeding we summarize the present design status and component configuration of the ITER Cryodistribution, with all implemented changes, which aim at process optimization and simplification as well as operational reliability, stability and flexibility.

  18. Product specification documentation standard and Data Item Descriptions (DID). Volume of the information system life-cycle and documentation standards, volume 3

    NASA Technical Reports Server (NTRS)

    Callender, E. David; Steinbacher, Jody

    1989-01-01

    This is the third of five volumes on Information System Life-Cycle and Documentation Standards, which present a well-organized, easily used standard for providing the technical information needed for developing information systems, components, and related processes. This volume states the Software Management and Assurance Program documentation standard for a product specification document and for data item descriptions. The framework can be applied to any NASA information system, its software, hardware, and operational-procedures components, and related processes.

  19. Implementation of Integrated System Fault Management Capability

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Schmalzel, John; Morris, Jon; Smith, Harvey; Turowski, Mark

    2008-01-01

    The goal is fault management that supports the rocket engine test mission with highly reliable and accurate measurements while improving availability and lifecycle costs. Core elements include: an architecture, taxonomy, and ontology (ATO) for DIaK management; intelligent sensor processes; intelligent element processes; intelligent controllers; intelligent subsystem processes; intelligent system processes; and intelligent component processes.

  20. Cost decomposition of linear systems with application to model reduction

    NASA Technical Reports Server (NTRS)

    Skelton, R. E.

    1980-01-01

    A means is provided to assess the value, or 'cost', of each component of a large-scale system when the total cost is a quadratic function. Such a 'cost decomposition' of the system has several important uses. When the components represent physical subsystems that can fail, the 'component cost' is useful in failure mode analysis. When the components represent mathematical equations that may be truncated, the 'component cost' becomes a criterion for model truncation. In the latter event, component costs provide a mechanism by which the specific control objectives dictate which components should be retained in the model reduction process. This information can be valuable in model reduction and decentralized control problems.
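    A numerical sketch of one simple version of the component-cost idea, under stated assumptions (the 2-state system and the per-state split below are illustrative, not the paper's formulation): for a stable linear system x' = Ax + Bw with quadratic cost E[xᵀQx], solve the Lyapunov equation AX + XAᵀ + BBᵀ = 0 for the state covariance X, then split the total cost tr(QX) into per-state contributions (XQ)_ii.

```python
# Component costs for a toy stable linear system via the Lyapunov equation,
# solved through its Kronecker (vectorized) form with plain NumPy.

import numpy as np

A = np.array([[-1.0, 0.5],
              [ 0.0, -2.0]])
B = np.array([[1.0],
              [1.0]])
Q = np.eye(2)

# solve A X + X A^T = -B B^T
n = A.shape[0]
lhs = np.kron(np.eye(n), A) + np.kron(A, np.eye(n))
X = np.linalg.solve(lhs, -(B @ B.T).ravel()).reshape(n, n)

component_costs = np.diag(X @ Q)       # cost carried by each state
total = component_costs.sum()
print(np.isclose(total, np.trace(Q @ X)))   # decomposition sums to total cost
```

States with small component cost are the candidates for truncation in model reduction, which is the use the abstract describes.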

  1. User's manual for the National Water Information System of the U.S. Geological Survey: Automated Data Processing System (ADAPS)

    USGS Publications Warehouse

    ,

    2003-01-01

    The Automated Data Processing System (ADAPS) was developed for the processing, storage, and retrieval of water data, and is part of the National Water Information System (NWIS) developed by the U.S. Geological Survey. NWIS is a distributed water database in which data can be processed over a network of computers at U.S. Geological Survey offices throughout the United States. NWIS comprises four subsystems: ADAPS, the Ground-Water Site Inventory System (GWSI), the Water-Quality System (QWDATA), and the Site-Specific Water-Use Data System (SWUDS). This section of the NWIS User's Manual describes the automated data processing of continuously recorded water data, which primarily are surface-water data; however, the system also allows for the processing of water-quality and ground-water data. This manual describes various components and features of ADAPS, and provides an overview of the data processing system and a description of the system framework. The components and features included are: (1) data collection and processing, (2) ADAPS menus and programs, (3) command line functions, (4) steps for processing station records, (5) postprocessor program control files, (6) the standard format for transferring and entering unit and daily values, and (7) relational database (RDB) formats.
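    As a hedged illustration of item (7), here is a small Python sketch of reading a tab-delimited RDB-style file: comment lines begin with '#', followed by a tab-separated header row, a column-format row, and then data rows. The sample content below is invented, not real ADAPS output.

```python
# Minimal reader for an RDB-style tab-delimited file:
# '#' comment lines, then header row, then a column-format row to skip.

def parse_rdb(text):
    lines = [l for l in text.splitlines() if l and not l.startswith("#")]
    header = lines[0].split("\t")
    return [dict(zip(header, l.split("\t"))) for l in lines[2:]]

sample = "\n".join([
    "# U.S. Geological Survey (sample comment line)",
    "site_no\tdatetime\tvalue",
    "15s\t19d\t12n",
    "01646500\t2003-01-01\t1210",
])
rows = parse_rdb(sample)
print(rows[0]["value"])
```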

  2. Compatibility Assessment Tool

    NASA Technical Reports Server (NTRS)

    Egbert, James Allen

    2016-01-01

    In support of ground system development for the Space Launch System (SLS), engineers are tasked with building immense engineering models of extreme complexity. The various systems require rigorous analysis of pneumatic, hydraulic, cryogenic, and hypergolic systems. Each of these systems must meet certain standards, documented in pressure vessel system (PVS) certification reports. These reports can be hundreds of pages long and require many hours to compile. Traditionally, each component is analyzed individually, often using hand calculations in the design process. The objective of this work is to perform these analyses in an integrated fashion within the parametric CAD/CAE environment, which allows systems to be analyzed at the assembly level in a semi-automated fashion and greatly improves accuracy and efficiency. To accomplish this, component-specific parameters were stored in the Windchill database and attached to individual Creo Parametric models based on spec control drawings. These parameters were then accessed using the Prime Analysis capability within Creo Parametric. MathCAD Prime spreadsheets were created that automatically extracted the parameters, performed calculations, and generated reports. The reports described component compatibility based on local conditions such as pressure, temperature, density, and flow rate. The reports also determined component pairing compatibility, such as properly sizing relief valves with regulators, and stored the input conditions used to determine compatibility, increasing the traceability of component selection. The desired workflow for this tool begins with a Creo Schematics diagram of a PVS, which stores local conditions and the locations of components. The schematic then populates an assembly within Creo Parametric using Windchill database parts. These parts have their attributes already assigned, so the MathCAD spreadsheets can run through the database parts to determine which components are suited for specific locations within the assembly. This eliminates a significant amount of time from the design process and makes initial analysis assessments more accurate. Each component checked against a location in the assembly generates a report showing whether it is compatible, and these reports can be used to build the PVS report without performing the same analysis multiple times. The process also has the potential for further automation: software codes or macros could automatically check hundreds of parts against each location on the schematic, and if the software could recognize which type of component is necessary at each location, starting the macro could select all the components needed for the schematic, and in turn the system, saving many hours of initial component selection. Overall, this process helps to automate initial component selection for PVS designs to fit local design specifications, and the automatically generated reports, which show how the design criteria are met by each chosen component, will make compiling the PVS certification reports, which currently take a great amount of time and effort to produce, considerably easier.
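
A toy sketch of the per-location compatibility check described above. The attribute names, margin, and conditions are illustrative stand-ins for the Windchill parameters and PVS criteria the actual tool uses:

```python
# Hedged sketch: screen candidate components against one location's
# local conditions.  Attribute names ('max_pressure', etc.) are
# invented for illustration, not taken from the actual tool.

def is_compatible(component, conditions, margin=1.0):
    """True if the component's ratings cover the local conditions."""
    return (component["max_pressure"] >= margin * conditions["pressure"]
            and component["min_temp"] <= conditions["temp"] <= component["max_temp"])

def select_components(candidates, conditions):
    """Names of the candidates compatible with one location."""
    return [c["name"] for c in candidates if is_compatible(c, conditions)]
```

Running every candidate through every schematic location and emitting one report per check is the batch loop the record describes.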

  3. Washington State University Algae Biofuels Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Shulin; McCormick, Margaret; Sutterlin, Rusty

    The goal of this project was to advance algal technologies for the production of biofuels and biochemicals by establishing the Washington State Algae Alliance, a collaborative partnership among two private companies, Targeted Growth, Inc. (TGI) and Inventure Chemicals, Inc. (Inventure; now Inventure Renewables, Inc.), and Washington State University (WSU). This project included three major components. The first was strain development at TGI, genetically engineering cyanobacteria to yield high levels of lipid and other specialty chemicals. The second was developing an algal culture system at WSU to produce algal biomass as biofuel feedstock year-round in the northern states of the United States. This system included two cultivation modes: a phototrophic process and a heterotrophic process. The phototrophic process would be used for algae production in open ponds during warm seasons; the heterotrophic process would be used in cold seasons so that year-round production of algal lipid would be possible. In warm seasons the heterotrophic process would also produce algal seeds for the phototrophic culture process. Selected strains of green algae and cyanobacteria developed by TGI were tested in the system. The third component was downstream algal biomass processing by Inventure, which included efficiently harvesting the usable fuel fractions from the algal biomass, isolating and separating the usable components into specific fractions, and converting the isolated fractions into green chemicals.

  4. The effects of polymers' visco-elastoplastic properties on the micro cavities filling step of hot embossing process

    NASA Astrophysics Data System (ADS)

    Cheng, Gang; Barrière, Thierry

    2018-05-01

    The hot embossing process has been widely used in the manufacturing of polymer components, especially for the fabrication of micro- or nano-scale components. Its significant advantage over the traditional injection moulding process is the excellent effective filling ratio for high-aspect-ratio components and large-surface structural components. The lack of material behavior modeling and numerical simulation limits the further development of the hot embossing process, especially at the micro and nano scales. In this paper, a visco-elastoplastic behavior law is proposed to describe the mechanical properties of amorphous thermoplastic polymers in the hot embossing processing temperature range, which is slightly above their glass transition temperature. Uniaxial compression tests were carried out to investigate the properties of amorphous thermoplastic polymers, and the material parameters of the visco-elastoplastic model were identified from the experimental results. A 3D numerical model was created in finite-element simulation software, and the filling step of the hot embossing process was simulated taking into account the viscous, elastic, and plastic behaviors of thermoplastic polymers. The micro hot embossing process was carried out on horizontal injection compression moulding equipment. A complete compression mould tool, equipped with heating, cooling, ejection, and vacuum systems, was designed and built for this research work. Microfluidic devices based on amorphous thermoplastic polymers were successfully fabricated by the hot embossing process, and good agreement between the numerical simulation and the experiments was obtained.

  5. Design, fabrication and testing of hierarchical micro-optical structures and systems

    NASA Astrophysics Data System (ADS)

    Cannistra, Aaron Thomas

    Micro-optical systems are becoming essential components in imaging, sensing, communications, computing, and other applications. Optically based designs are replacing electronic, chemical and mechanical systems for a variety of reasons, including low power consumption, reduced maintenance, and faster operation. However, as the number and variety of applications increases, micro-optical system designs are becoming smaller, more integrated, and more complicated. Micro and nano-optical systems found in nature, such as the imaging systems found in many insects and crustaceans, can have highly integrated optical structures that vary in size by orders of magnitude. These systems incorporate components such as compound lenses, anti-reflective lens surface structuring, spectral filters, and polarization selective elements. For animals, these hybrid optical systems capable of many optical functions in a compact package have been repeatedly selected during the evolutionary process. Understanding the advantages of these designs gives motivation for synthetic optical systems with comparable functionality. However, alternative fabrication methods that deviate from conventional processes are needed to create such systems. Further complicating the issue, the resulting device geometry may not be readily compatible with existing measurement techniques. This dissertation explores several nontraditional fabrication techniques for optical components with hierarchical geometries and measurement techniques to evaluate performance of such components. A micro-transfer molding process is found to produce high-fidelity micro-optical structures and is used to fabricate a spectral filter on a curved surface. By using a custom measurement setup we demonstrate that the spectral filter retains functionality despite the nontraditional geometry. A compound lens is fabricated using similar fabrication techniques and the imaging performance is analyzed. 
    A spray coating technique for applying photoresist to curved surfaces, combined with interference lithography, is also investigated. Using this technique, we generate polarizers on curved surfaces and measure their performance. This work furthers an understanding of how combining multiple optical components affects the performance of each component and of the final integrated devices, and leads toward the realization of biomimetically inspired imaging systems.

  6. Apollo experience report: Development of the extravehicular mobility unit

    NASA Technical Reports Server (NTRS)

    Lutz, C. C.; Stutesman, H. L.; Carson, M. A.; Mcbarron, J. W., II

    1975-01-01

    The development and performance history of the Apollo extravehicular mobility unit and its major subsystems is described. The three major subsystems, the pressure garment assembly, the portable life-support system, and the oxygen purge system, are defined and described in detail as is the evolutionary process that culminated in each major subsystem component. Descriptions of ground-support equipment and the qualification testing process for component hardware are also presented.

  7. Atomic vapor laser isotope separation process

    DOEpatents

    Wyeth, R.W.; Paisner, J.A.; Story, T.

    1990-08-21

    A laser spectroscopy system is utilized in an atomic vapor laser isotope separation process. The system determines spectral components of an atomic vapor utilizing a laser heterodyne technique. 23 figs.

  8. Micromechanical Machining Processes and their Application to Aerospace Structures, Devices and Systems

    NASA Technical Reports Server (NTRS)

    Friedrich, Craig R.; Warrington, Robert O.

    1995-01-01

    Micromechanical machining processes are those microfabrication techniques that directly remove workpiece material by either a physical cutting tool or an energy process. These processes are direct, and therefore they can help reduce the cost and time of prototype development for micromechanical components and systems. This is especially true for aerospace applications, where size and weight are critical, and reliability and the operating environment are an integral part of the design and development process. The micromechanical machining processes are rapidly being recognized as a complementary set of tools to traditional lithographic processes (such as LIGA) for the fabrication of micromechanical components. Worldwide efforts in the U.S., Germany, and Japan are leading to results that sometimes rival lithography at a fraction of the time and cost. Efforts to develop processes and systems specific to aerospace applications are well underway.

  9. Clustering execution in a processing system to increase power savings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bose, Pradip; Buyuktosunoglu, Alper; Jacobson, Hans M.

    Embodiments relate to clustering execution in a processing system. An aspect includes accessing a control flow graph that defines a data dependency and an execution sequence of a plurality of tasks of an application that executes on a plurality of system components. The execution sequence of the tasks in the control flow graph is modified as a clustered control flow graph that clusters active and idle phases of a system component while maintaining the data dependency. The clustered control flow graph is sent to an operating system, where the operating system utilizes the clustered control flow graph for scheduling the tasks.
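
One simple way to realize the clustering idea, offered as an illustration rather than the patented method, is a greedy list schedule that prefers ready tasks on the currently active component, so each component's busy and idle phases consolidate while the dependency order is preserved:

```python
# Illustrative sketch: reorder tasks so that tasks on the same
# component run back-to-back (lengthening idle gaps on the other
# components for power gating) while respecting dependencies.
# Names and structure are assumptions, not the patented algorithm.

def cluster_schedule(tasks, deps, component_of):
    """Greedy list schedule preferring the currently active component.

    tasks: iterable of task ids; deps: dict task -> set of
    prerequisite tasks; component_of: dict task -> component id.
    """
    remaining, done, order = set(tasks), set(), []
    current = None
    while remaining:
        ready = [t for t in remaining if deps.get(t, set()) <= done]
        # Prefer a ready task on the component that is already active.
        same = [t for t in ready if component_of[t] == current]
        pick = min(same) if same else min(ready)
        current = component_of[pick]
        order.append(pick)
        remaining.discard(pick)
        done.add(pick)
    return order
```

With no dependencies, tasks on the same component end up adjacent in the resulting order.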

  10. Simplifying and upscaling water resources systems models that combine natural and engineered components

    NASA Astrophysics Data System (ADS)

    McIntyre, N.; Keir, G.

    2014-12-01

    Water supply systems typically encompass components of both natural systems (e.g. catchment runoff, aquifer interception) and engineered systems (e.g. process equipment, water storages and transfers). Many physical processes of varying spatial and temporal scales are contained within these hybrid systems models. The need to aggregate and simplify system components has been recognised for reasons of parsimony and comprehensibility; and the use of probabilistic methods for modelling water-related risks also prompts the need to seek computationally efficient up-scaled conceptualisations. How to manage the up-scaling errors in such hybrid systems models has not been well-explored, compared to research in the hydrological process domain. Particular challenges include the non-linearity introduced by decision thresholds and non-linear relations between water use, water quality, and discharge strategies. Using a case study of a mining region, we explore the nature of up-scaling errors in water use, water quality and discharge, and we illustrate an approach to identification of a scale-adjusted model including an error model. Ways forward for efficient modelling of such complex, hybrid systems are discussed, including interactions with human, energy and carbon systems models.

  11. Visualization of Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Gerald-Yamasaki, Michael; Hultquist, Jeff; Bryson, Steve; Kenwright, David; Lane, David; Walatka, Pamela; Clucas, Jean; Watson, Velvin; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    Scientific visualization serves the dual purpose of exploration and exposition of the results of numerical simulations of fluid flow. Along with the basic visualization process, which transforms source data into images, there are four additional components to a complete visualization system: Source Data Processing, User Interface and Control, Presentation, and Information Management. The requirements imposed by the desired mode of operation (i.e., real-time, interactive, or batch) and by the source data have their effect on each of these visualization system components. The special requirements imposed by the wide variety and size of the source data provided by the numerical simulation of fluid flow present an enormous challenge to the visualization system designer. We describe the visualization system components, including specific visualization techniques, and how the mode of operation and source data requirements affect the construction of computational fluid dynamics visualization systems.

  12. Prognostics using Engineering and Environmental Parameters as Applied to State of Health (SOH) Radionuclide Aerosol Sampler Analyzer (RASA) Real-Time Monitoring

    NASA Astrophysics Data System (ADS)

    Hutchenson, K. D.; Hartley-McBride, S.; Saults, T.; Schmidt, D. P.

    2006-05-01

    The International Monitoring System (IMS) is composed in part of radionuclide particulate and gas monitoring systems. Monitoring the operational status of these systems is an important aspect of nuclear weapon test monitoring. Quality data, process control techniques, and predictive models are necessary to detect and predict system component failures. Predicting failures in advance provides time to mitigate them, thus minimizing operational downtime. The Provisional Technical Secretariat (PTS) requires that IMS radionuclide systems be operational 95 percent of the time. The United States National Data Center (US NDC) provides contributing components to the IMS. This effort focuses on the initial research and process development using prognostics for monitoring the RASA and predicting its failures two days into the future. The predictions, produced with time-series methods, are input to an expert decision system called SHADES (State of Health Airflow and Detection Expert System). The results enable personnel to make informed judgments about the health of the RASA system. Data are read from a relational database, processed, and displayed to the user in a GIS as a prototype GUI. This procedure mimics the real-time application process that could be implemented in an operational system. This initial proof-of-concept effort developed predictive models focused on RASA components for a single site (USP79). Future work shall include the incorporation of other RASA systems, as well as the environmental conditions that play a significant role in their performance. Similarly, SHADES currently accommodates specific component behaviors at this one site; future work shall also incorporate the important environmental variables into the prediction algorithms.
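
The record's two-day-ahead prediction can be illustrated with the simplest possible time-series model: fit a least-squares trend line to recent daily values and extrapolate two steps ahead. The actual SHADES models are surely richer; this only shows the shape of the computation:

```python
# Toy sketch of extrapolating a daily state-of-health metric.
# A least-squares line y = a + b*t is fitted to the values at
# t = 0..n-1 and evaluated `steps` days past the last observation.

def forecast(values, steps=2):
    """Extrapolate a least-squares trend line `steps` ahead."""
    n = len(values)
    t_mean = (n - 1) / 2
    y_mean = sum(values) / n
    num = sum((ti - t_mean) * (yi - y_mean) for ti, yi in enumerate(values))
    den = sum((ti - t_mean) ** 2 for ti in range(n))
    b = num / den          # slope
    a = y_mean - b * t_mean  # intercept
    return a + b * (n - 1 + steps)
```

A forecast drifting toward an alarm threshold is what would trigger preventive maintenance in a scheme like this.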

  13. Design and Analysis of a Sensor System for Cutting Force Measurement in Machining Processes

    PubMed Central

    Liang, Qiaokang; Zhang, Dan; Coppola, Gianmarc; Mao, Jianxu; Sun, Wei; Wang, Yaonan; Ge, Yunjian

    2016-01-01

    Multi-component force sensors have infiltrated a wide variety of automation products since the 1970s. However, one seldom finds full-component sensor systems available on the market for cutting force measurement in machining processes. In this paper, a new six-component sensor system with a compact monolithic elastic element (EE) is designed and developed to detect the tangential cutting forces Fx, Fy and Fz (i.e., forces along the x-, y-, and z-axes) as well as the cutting moments Mx, My and Mz (i.e., moments about the x-, y-, and z-axes) simultaneously. Optimal structural parameters of the EE are carefully designed via simulation-driven optimization. Moreover, a prototype sensor system is fabricated and applied to a 5-axis parallel kinematic machining center. Calibration experiments demonstrate that the system is capable of measuring cutting forces and moments with good linearity while minimizing coupling error. Both the Finite Element Analysis (FEA) and the calibration experiments validate the high performance of the proposed sensor system, which is expected to be adopted into machining processes. PMID:26751451

  14. Design and Analysis of a Sensor System for Cutting Force Measurement in Machining Processes.

    PubMed

    Liang, Qiaokang; Zhang, Dan; Coppola, Gianmarc; Mao, Jianxu; Sun, Wei; Wang, Yaonan; Ge, Yunjian

    2016-01-07

    Multi-component force sensors have infiltrated a wide variety of automation products since the 1970s. However, one seldom finds full-component sensor systems available on the market for cutting force measurement in machining processes. In this paper, a new six-component sensor system with a compact monolithic elastic element (EE) is designed and developed to detect the tangential cutting forces Fx, Fy and Fz (i.e., forces along the x-, y-, and z-axes) as well as the cutting moments Mx, My and Mz (i.e., moments about the x-, y-, and z-axes) simultaneously. Optimal structural parameters of the EE are carefully designed via simulation-driven optimization. Moreover, a prototype sensor system is fabricated and applied to a 5-axis parallel kinematic machining center. Calibration experiments demonstrate that the system is capable of measuring cutting forces and moments with good linearity while minimizing coupling error. Both the Finite Element Analysis (FEA) and the calibration experiments validate the high performance of the proposed sensor system, which is expected to be adopted into machining processes.
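
The coupling error that the calibration experiments minimize can be quantified, for example, as each channel's largest off-diagonal response relative to its direct response in the calibration matrix. The matrix in the test below is invented for illustration; real values come from the calibration itself:

```python
# Illustrative cross-coupling metric for a multi-component sensor:
# for a calibration matrix C mapping applied loads to channel
# outputs, channel i's coupling error is its largest off-diagonal
# response divided by its direct (diagonal) response.

def coupling_errors(C):
    """Per-channel max |off-diagonal| / |diagonal| of matrix C."""
    n = len(C)
    return [max(abs(C[i][j]) for j in range(n) if j != i) / abs(C[i][i])
            for i in range(n)]
```

For a six-component sensor, C would be 6x6; a 2x2 example keeps the arithmetic easy to check.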

  15. A Chinese Character Teaching System Using Structure Theory and Morphing Technology

    PubMed Central

    Sun, Linjia; Liu, Min; Hu, Jiajia; Liang, Xiaohui

    2014-01-01

    This paper proposes a Chinese character teaching system using Chinese character structure theory and 2D contour morphing technology. The system, comprising an offline phase and an online phase, automatically generates animation of the same Chinese character across different writing stages to intuitively show the evolution of shape and topology in the teaching of Chinese characters. The offline phase builds the component-model database for each script and the component-correspondence database between scripts. Given two or more different scripts of the same Chinese character, the online phase first divides the characters into components through Chinese character parsing, and then generates the evolution animation through Chinese character morphing. Finally, two writing stages of Chinese characters, i.e., seal script and clerical script, are used in an experiment to demonstrate the capability of the system. The results of the user experience study show that the system can successfully guide students to improve their learning of Chinese characters, and the users agree that the system is interesting and motivates them to learn. PMID:24978171

  16. A data processing method based on tracking light spot for the laser differential confocal component parameters measurement system

    NASA Astrophysics Data System (ADS)

    Shao, Rongjun; Qiu, Lirong; Yang, Jiamiao; Zhao, Weiqian; Zhang, Xin

    2013-12-01

    We have proposed a component-parameter measuring method based on differential confocal focusing theory. To improve the positioning precision of the laser differential confocal component parameters measurement system (LDDCPMS), this paper provides a data processing method based on tracking the light spot. To reduce the error caused by movement of the light spot while collecting the axial intensity signal, an image centroiding algorithm is used to find and track the center of the Airy disk in the images collected by the laser differential confocal system. To weaken the influence of higher-harmonic noise during the measurement, a Gaussian filter is used to process the axial intensity signal. Finally, the zero point corresponding to the focus of the objective in the differential confocal system is obtained by linear fitting of the differential confocal axial intensity data. Preliminary experiments indicate that the method based on tracking the light spot can accurately collect the axial intensity response signal at the virtual pinhole and improve the anti-interference ability of the system, thereby improving the system's positioning accuracy.
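
The centroiding step can be sketched in a few lines: the Airy-disk center is estimated as the intensity-weighted mean pixel coordinate. The toy images in the test stand in for the LDDCPMS camera frames:

```python
# Sketch of image centroiding: estimate the spot center of a 2-D
# intensity map (list of rows) as the intensity-weighted mean of
# the pixel coordinates.

def centroid(image):
    """Intensity-weighted centroid (row, col) of a 2-D intensity map."""
    total = sum(sum(row) for row in image)
    r = sum(i * sum(row) for i, row in enumerate(image)) / total
    c = sum(j * v for row in image for j, v in enumerate(row)) / total
    return r, c
```

Tracking then means recomputing this centroid per frame and reading the axial intensity at (or around) the moving center.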

  17. Facilitating preemptive hardware system design using partial reconfiguration techniques.

    PubMed

    Dondo Gazzano, Julio; Rincon, Fernando; Vaderrama, Carlos; Villanueva, Felix; Caba, Julian; Lopez, Juan Carlos

    2014-01-01

    In FPGA-based control system design, partial reconfiguration is especially well suited to implementing preemptive systems. In real-time systems, the deadline of a critical task can compel the preemption of a noncritical one. Moreover, an asynchronous event can demand immediate attention and thus force the launch of a reconfiguration process for high-priority task implementation. If the asynchronous event is scheduled in advance, an explicit activation of the reconfiguration process is performed; if the event cannot be programmed in advance, as in dynamically scheduled systems, an implicit activation of the reconfiguration process is required. This paper provides a hardware-based approach to explicit and implicit activation of the partial reconfiguration process in dynamically reconfigurable SoCs and includes all the tasks necessary to cope with this issue. Furthermore, the reconfiguration service introduced in this work allows remote invocation of the reconfiguration process and thus the remote integration of off-chip components. A model that offers component location transparency is also presented to enhance and facilitate system integration.

  18. Facilitating Preemptive Hardware System Design Using Partial Reconfiguration Techniques

    PubMed Central

    Rincon, Fernando; Vaderrama, Carlos; Villanueva, Felix; Caba, Julian; Lopez, Juan Carlos

    2014-01-01

    In FPGA-based control system design, partial reconfiguration is especially well suited to implementing preemptive systems. In real-time systems, the deadline of a critical task can compel the preemption of a noncritical one. Moreover, an asynchronous event can demand immediate attention and thus force the launch of a reconfiguration process for high-priority task implementation. If the asynchronous event is scheduled in advance, an explicit activation of the reconfiguration process is performed; if the event cannot be programmed in advance, as in dynamically scheduled systems, an implicit activation of the reconfiguration process is required. This paper provides a hardware-based approach to explicit and implicit activation of the partial reconfiguration process in dynamically reconfigurable SoCs and includes all the tasks necessary to cope with this issue. Furthermore, the reconfiguration service introduced in this work allows remote invocation of the reconfiguration process and thus the remote integration of off-chip components. A model that offers component location transparency is also presented to enhance and facilitate system integration. PMID:24672292

  19. Increased Reliability of Gas Turbine Components by Robust Coatings Manufacturing

    NASA Astrophysics Data System (ADS)

    Sharma, A.; Dudykevych, T.; Sansom, D.; Subramanian, R.

    2017-08-01

    The expanding operational windows of advanced gas turbine components demand increasing performance from protective coating systems. This demand has led in recent years to the development of novel multi-functional, multi-material coating system architectures. In addition, the increasing dependency of components exposed to extreme environments on protective coatings results in more severe penalties in the case of a coating system failure. This emphasizes that the reliability and consistency of protective coating systems are as important as their performance. By means of examples, this paper describes the effects of scatter in material properties resulting from manufacturing variations on coating life predictions. A strong foundation in process-property-performance correlations, together with regular monitoring and control of the coating process, is essential for a robust and well-controlled coating process. Proprietary and commercially available diagnostic tools can help in achieving these goals, but their use in industrial settings is still limited. Various key contributors to process variability are briefly discussed, along with the limitations of existing process and product control methods. Other aspects important for product reliability and consistency in serial manufacturing, as well as advanced testing methodologies to simplify and enhance product inspection and improve objectivity, are also briefly described.

  20. Coupling of snow and permafrost processes using the Basic Modeling Interface (BMI)

    NASA Astrophysics Data System (ADS)

    Wang, K.; Overeem, I.; Jafarov, E. E.; Piper, M.; Stewart, S.; Clow, G. D.; Schaefer, K. M.

    2017-12-01

    We developed a permafrost modeling tool by implementing the Kudryavtsev empirical permafrost active-layer-depth model (the so-called "Ku" component). The model is specifically set up with a Basic Model Interface (BMI), which enhances its potential coupling to other earth-surface process model components, and it is accessible through the Web Modeling Tool of the Community Surface Dynamics Modeling System (CSDMS). The Kudryavtsev model has been applied across all of Alaska to model permafrost distribution at high spatial resolution, and model predictions have been verified against Circumpolar Active Layer Monitoring (CALM) in-situ observations. The Ku component uses monthly meteorological forcing, including air temperature, snow depth, and snow density, and predicts the active layer thickness (ALT) and the temperature at the top of permafrost (TTOP), which are important factors in snow-hydrological processes. BMI provides an easy way to couple models with each other. Here, we present a case of coupling the Ku component to snow process components, namely the Snow-Degree-Day (SDD) and Snow-Energy-Balance (SEB) methods, which are existing components of the hydrological model TOPOFLOW. The workflow is: (1) get variables from the meteorology component, set their values in the snow process component, and advance the snow process component; (2) get variables from the meteorology and snow components, provide these to the Ku component, and advance it; (3) get variables from the snow process component, set their values in the meteorology component, and advance the meteorology component. The next phase is to couple the permafrost component with the fully BMI-compliant TOPOFLOW hydrological model, which would provide a useful tool to investigate the hydrological effects of permafrost.
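
The stepwise BMI workflow above can be sketched with toy components. The classes below are stand-ins for the snow and Ku components (the physics is fake); only the get_value/set_value/update calling pattern, which BMI standardizes, is the point:

```python
# Minimal sketch of a BMI-style coupling loop.  ToySnow and ToyKu
# are invented stand-ins: snow depth halves each step, and the fake
# "active layer thickness" deepens as snow thins.

class ToySnow:
    """Fake snow component: depth decays each (monthly) step."""
    def __init__(self, depth): self.depth = depth
    def get_value(self, name): return {"snow_depth": self.depth}[name]
    def update(self): self.depth *= 0.5

class ToyKu:
    """Fake permafrost component: thinner snow -> deeper active layer."""
    def __init__(self): self.snow, self.alt = 0.0, 0.0
    def set_value(self, name, value):
        if name == "snow_depth": self.snow = value
    def get_value(self, name): return {"alt": self.alt}[name]
    def update(self): self.alt = 2.0 / (1.0 + self.snow)

def couple(snow, ku, steps):
    """Pass snow depth to Ku each step, as in the BMI workflow."""
    for _ in range(steps):
        snow.update()
        ku.set_value("snow_depth", snow.get_value("snow_depth"))
        ku.update()
    return ku.get_value("alt")
```

Because every component exposes the same small interface, the coupling driver needs no knowledge of the internals of either model.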

  1. Automatic assembly of micro-optical components

    NASA Astrophysics Data System (ADS)

    Gengenbach, Ulrich K.

    1996-12-01

    Automatic assembly becomes an important issue as hybrid microsystems enter industrial fabrication. Moving from laboratory-scale production with manual assembly and bonding processes to automatic assembly requires a thorough re-evaluation of the design, the characteristics of the individual components, and the processes involved. Parts supply for automatic operation and sensitive, intelligent grippers adapted to the size, surface, and material properties of the microcomponents gain importance when the superior sensory and handling skills of a human are to be replaced by a machine. This holds in particular for the automatic assembly of micro-optical components. The paper outlines these issues, exemplified by the automatic assembly of a micro-optical duplexer consisting of a micro-optical bench fabricated by the LIGA technique, two spherical lenses, a wavelength filter, and an optical fiber. The spherical lenses, wavelength filter, and optical fiber are supplied by third-party vendors, which raises the question of parts supply for automatic assembly. The bonding processes for these components include press fitting and adhesive bonding. The prototype assembly system with all relevant components, e.g., the handling system, parts supply, grippers, and control, is described. Results of the first automatic assembly tests are presented.

  2. Systems level test and simulation for photonic processing systems

    NASA Astrophysics Data System (ADS)

    Erteza, I. A.; Stalker, K. T.

    1995-08-01

    Photonic technology is growing in importance throughout DOD. Programs have been underway in each of the Services to demonstrate the ability of photonics to enhance current electronic performance in several prototype systems, such as the Navy's SLQ-32 radar warning receiver, the Army's multi-role survivable radar and the phased array radar controller for the Airborne Warning and Control System (AWACS) upgrade. Little, though, is known about radiation effects; the component studies do not furnish the information needed to predict overall system performance in a radiation environment. To date, no comprehensive test and analysis program has been conducted to evaluate sensitivity of overall system performance to the radiation environment. The goal of this program is to relate component level effects to system level performance through modeling and testing of a selected optical processing system, and to help direct component testing to items which can directly and adversely affect overall system performance. This report gives a broad overview of the project, highlighting key results.

  3. Thermodynamic Modelling of Phase Transformation in a Multi-Component System

    NASA Astrophysics Data System (ADS)

    Vala, J.

    2007-09-01

    Diffusion in multi-component alloys can be characterized by the vacancy mechanism for substitutional components, by the existence of sources and sinks for vacancies, and by the motion of atoms of interstitial components. The description of diffusive and massive phase transformation of a multi-component system is based on the thermodynamic extremal principle by Onsager; the finite thickness of the interface between both phases is respected. The resulting system of partial differential equations of evolution with integral terms for unknown mole fractions (and additional variables in the case of non-ideal sources and sinks for vacancies) can be analyzed using the method of lines and the finite difference technique (or, alternatively, the finite element method), together with semi-analytic and numerical integration formulae and a certain iteration procedure that makes use of the spectral properties of linear operators. The original software code for the numerical evaluation of solutions of such systems, written in MATLAB, offers a chance to simulate various real processes of diffusional phase transformation. Some results for the (nearly) steady-state real processes in substitutional alloys have already been published. The aim of this paper is to demonstrate that the same approach can handle both substitutional and interstitial components, even in the case of a general evolution system.
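
    The method of lines named above (spatial discretization first, then time stepping) can be sketched on a much simpler problem than the paper's coupled system. The following is an illustrative minimal example, not the paper's MATLAB code: a single 1-D diffusion equation with made-up parameter values, central finite differences in space, and explicit Euler in time.

```python
import numpy as np

# Hedged sketch of the method of lines for du/dt = D * d2u/dx2:
# discretize space with central finite differences, then step the
# resulting ODE system in time with explicit Euler. All values below
# are illustrative assumptions, not taken from the paper.
D = 1.0                                  # diffusion coefficient (arbitrary)
nx, L = 51, 1.0                          # grid points, domain length
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D                     # stable explicit step (r = 0.4 < 0.5)
x = np.linspace(0.0, L, nx)
u = np.exp(-100.0 * (x - 0.5)**2)        # initial mole-fraction-like profile

for _ in range(200):
    lap = np.zeros_like(u)
    lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    u = u + dt * D * lap                 # interior update; boundary values fixed

print(round(float(u.max()), 4))
```

    The same skeleton extends to multi-component systems by carrying one array per mole fraction and coupling them through the right-hand side.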

  4. CDC WONDER: a cooperative processing architecture for public health.

    PubMed Central

    Friede, A; Rosen, D H; Reid, J A

    1994-01-01

    CDC WONDER is an information management architecture designed for public health. It provides access to information and communications without the user's needing to know the location of data or communication pathways and mechanisms. CDC WONDER users have access to extractions from some 40 databases; electronic mail (e-mail); and surveillance data processing. System components include the Remote Client, the Communications Server, the Queue Managers, and Data Servers and Process Servers. The Remote Client software resides in the user's machine; other components are at the Centers for Disease Control and Prevention (CDC). The Remote Client, the Communications Server, and the Applications Server provide access to the information and functions in the Data Servers and Process Servers. The system architecture is based on cooperative processing, and components are coupled via pure message passing, using several protocols. This architecture allows flexibility in the choice of hardware and software. One system limitation is that final results from some subsystems are obtained slowly. Although designed for public health, CDC WONDER could be useful for other disciplines that need flexible, integrated information exchange. PMID:7719813
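
    The cooperative-processing pattern described above, components coupled purely by message passing so that the client need not know where data or processing live, can be sketched in a few lines. This is an illustrative toy, not CDC WONDER code; the component names follow the abstract, everything else is assumed.

```python
import queue
import threading

# Toy sketch of pure message passing between a Remote Client and a
# Data Server, decoupled by queues standing in for the Queue Managers.
# Neither side holds a direct reference to the other.
requests, replies = queue.Queue(), queue.Queue()

def data_server():
    while True:
        msg = requests.get()
        if msg is None:                         # shutdown sentinel
            break
        # Illustrative "extraction": answer the query by message only.
        replies.put({"query": msg, "rows": [f"{msg}-record"]})

server = threading.Thread(target=data_server)
server.start()
requests.put("mortality-1992")                  # Remote Client sends a message
reply = replies.get()                           # ...and blocks for the reply
requests.put(None)
server.join()
print(reply["rows"][0])
```

    Because the coupling is only the message format, either side can be replaced by different hardware or software, which is the flexibility the architecture claims.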

  5. 76 FR 72691 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-25

    ... into the DLA accounting and financial management process. Records are used by the DOD Components who...: Financial Compliance and Process Management (J-89), Headquarters, Defense Logistics Agency, 8725 John J... DoD Components who receive accounting and financial management support from DLA under an...

  6. Analysis of electromagnetic interference from power system processing and transmission components for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Barber, Peter W.; Demerdash, Nabeel A. O.; Hurysz, B.; Luo, Z.; Denny, Hugh W.; Millard, David P.; Herkert, R.; Wang, R.

    1992-01-01

    The goal of this research project was to analyze the potential effects of electromagnetic interference (EMI) originating from power system processing and transmission components for Space Station Freedom. The approach consists of four steps: (1) developing analytical tools (models and computer programs); (2) conducting parameterization (what if?) studies; (3) predicting the global space station EMI environment; and (4) providing a basis for modification of EMI standards.

  7. The number processing and calculation system: evidence from cognitive neuropsychology.

    PubMed

    Salguero-Alcañiz, M P; Alameda-Bailén, J R

    2015-04-01

    Cognitive neuropsychology focuses on the concepts of dissociation and double dissociation. The performance of number processing and calculation tasks by patients with acquired brain injury can be used to characterise the way in which the healthy cognitive system manipulates number symbols and quantities. The objective of this study is to determine the components of the numerical processing and calculation system. Participants consisted of 6 patients with acquired brain injuries in different cerebral localisations. We used Batería de evaluación del procesamiento numérico y el cálculo, a battery assessing number processing and calculation. Data was analysed using the difference in proportions test. Quantitative numerical knowledge is independent from number transcoding, qualitative numerical knowledge, and calculation. Recodification is independent from qualitative numerical knowledge and calculation. Quantitative numerical knowledge and calculation are also independent functions. The number processing and calculation system comprises at least 4 components that operate independently: quantitative numerical knowledge, number transcoding, qualitative numerical knowledge, and calculation. Therefore, each one may be damaged selectively without affecting the functioning of another. According to the main models of number processing and calculation, each component has different characteristics and cerebral localisations. Copyright © 2013 Sociedad Española de Neurología. Published by Elsevier Espana. All rights reserved.
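
    The difference-in-proportions test mentioned in the abstract is the standard two-proportion z-test. A minimal sketch follows; the counts are fabricated for illustration, not taken from the study.

```python
from math import sqrt, erf

# Two-proportion z-test (difference in proportions), as used to compare
# task performance between dissociated functions. x = correct responses,
# n = trials; counts below are made-up illustrative numbers.
def two_proportion_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                  # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    phi = 0.5 * (1 + erf(abs(z) / sqrt(2)))    # standard normal CDF
    return z, 2 * (1 - phi)                    # two-sided p-value

z, p = two_proportion_z(18, 20, 9, 20)         # e.g. 90% vs 45% correct
print(round(z, 3), round(p, 4))
```

    A significant difference in one task with preserved performance in another is the dissociation pattern the abstract builds on.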

  8. The Effectiveness of Full Day School System for Students’ Character Building

    NASA Astrophysics Data System (ADS)

    Benawa, A.; Peter, R.; Makmun, S.

    2018-01-01

    The study aims to put forward that full day school which was delivered in Marsudirini Elementary School in Bogor is effective for students’ character building. The study focused on the implementation of full day school system. The qualitative-based research method applied in the study is characteristic evaluation involving non-participant observation, interview, and documentation analysis. The result of this study concludes that the full day school system is significantly effective in education system for elementary students’ character building. The full day school system embraced the entire relevant processes based on the character building standard. The synergy of comprehensive components in instructional process at full day school has influenced the building of the students’ character effectively and efficiently. The relationship emerged between instructional development process in full day school system and the character building of the students. By developing instructional process through systemic and systematic process in full day school system, the support of stakeholders (leaders, human resources, students, parents’ role) and other components (learning resources, facilities, budget) provides a potent and expeditious contribution for character building among the students eventually.

  9. Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    NASA Technical Reports Server (NTRS)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    This research is aimed at developing a new and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies with both new and proven simulation methodologies. The basic approach involves three major areas of research: aerospace system and component representation using a hierarchical object-oriented component model, which enables the use of multimodels and enforces component interoperability; a collaborative software environment that streamlines the process of developing, sharing and integrating aerospace design and analysis models; and development of a distributed infrastructure which enables Web-based exchange of models to simplify the collaborative design process and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.

  10. Experience with chemical system decontamination by the CORD process and electrochemical decontamination of pipe ends

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wille, H.; Bertholdt, H.O.; Operschall, H.

    Efforts to reduce occupational radiation exposure during inspection and repair work in nuclear power plants have turned steadily increasing attention to the decontamination of systems and components. As nuclear power plants age and dose rates rise, decontaminating components, or rather complete systems or loops, to protect operating and inspection personnel becomes increasingly important. Besides, decontaminating complete primary loops is in many cases less difficult than cleaning large components. Based on experience gained in nuclear power plants, an outline of two different decontamination methods performed recently is given. For the decontamination of complete systems or loops, Kraftwerk Union AG has developed CORD, a low-concentration process. For the decontamination of a subsystem, such as the steam generator (SG) channel heads of a pressurized water reactor or the recirculation loops of a boiling water reactor, the automated mobile decontamination appliance is used. The electrochemical decontamination process is primarily applicable to the treatment of spatially limited surface areas.

  11. Optical Information Processing for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Current research in optical processing is reviewed. Its role in future aerospace systems is determined. The development of optical devices and components demonstrates that system concepts can be implemented in practical aerospace configurations.

  12. Comprehensive system models: Strategies for evaluation

    NASA Technical Reports Server (NTRS)

    Field, Christopher; Kutzbach, John E.; Ramanathan, V.; Maccracken, Michael C.

    1992-01-01

    The task of evaluating comprehensive earth system models is vast, involving validation of every model component at every scale of organization, as well as tests of all the individual linkages. Even the most detailed evaluation of each of the component processes and the individual links among them should not, however, engender confidence in the performance of the whole. The integrated earth system is so rich with complex feedback loops, often involving components of the atmosphere, oceans, biosphere, and cryosphere, that it is certain to exhibit emergent properties very difficult to predict from the perspective of a narrow focus on any individual component of the system. Therefore, a substantial share of the task of evaluating comprehensive earth system models must reside at the level of whole-system evaluations. Since complete, integrated atmosphere/ocean/biosphere/hydrology models are not yet operational, questions of evaluation must be addressed at the level of the kinds of earth system processes that the models should be competent to simulate, rather than at the level of specific performance criteria. Here, we have tried to identify examples of earth system processes that are difficult to simulate with existing models and that involve a rich enough suite of feedbacks that they are unlikely to be satisfactorily described by highly simplified or toy models. Our purpose is not to specify a checklist of evaluation criteria but to introduce characteristics of the earth system that may present useful opportunities for model testing and, of course, improvement.

  13. Virtual commissioning of automated micro-optical assembly

    NASA Astrophysics Data System (ADS)

    Schlette, Christian; Losch, Daniel; Haag, Sebastian; Zontar, Daniel; Roßmann, Jürgen; Brecher, Christian

    2015-02-01

    In this contribution, we present a novel approach to enable virtual commissioning for process developers in micro-optical assembly. Our approach aims at supporting micro-optics experts to effectively develop assisted or fully automated assembly solutions without detailed prior experience in programming, while at the same time enabling them to easily implement their own libraries of expert schemes and algorithms for handling optical components. Virtual commissioning is enabled by a 3D simulation and visualization system in which the functionalities and properties of automated systems are modeled, simulated and controlled based on multi-agent systems. For process development, our approach supports event-, state- and time-based visual programming techniques for the agents and allows for their kinematic motion simulation in combination with looped-in simulation results for the optical components. First results have been achieved for simply switching the agents to command the real hardware setup after successful process implementation and validation in the virtual environment. We evaluated and adapted our system to meet the requirements set by industrial partners: laser manufacturers as well as hardware suppliers of assembly platforms. The concept is applied to the automated assembly of optical components for optically pumped semiconductor lasers and the positioning of optical components for beam shaping.

  14. Implementation status of accrual accounting system in health sector.

    PubMed

    Mehrolhassani, Mohammad Hossien; Khayatzadeh-Mahani, Akram; Emami, Mozhgan

    2014-07-29

    Management of financial resources in health systems is one of the major issues of concern for policy makers globally. As a sub-set of financial management, the accounting system is of paramount importance. In this paper, which presents part of the results of a wider research project on the transition process from a cash accounting system to an accrual accounting system, we look at the impact of components of change on implementation of the new system. Implementing changes is fraught with many obstacles, and surveying these challenges will help policy makers to better overcome them. The study applied a quantitative approach in 2012 at Kerman University of Medical Sciences in Iran. For the evaluation, a valid teacher-made questionnaire with a Likert scale was used (Cronbach's alpha of 0.89), which covered 7 change components in the accounting system. The study population was 32 subordinate units of Kerman University of Medical Sciences, and for data analysis, descriptive and inferential statistics and correlation coefficients in SPSS version 19 were used. The level of effect of all components on the implementation was below average (5.06±1.86), except for the component "management & leadership" (3.46±2.25; undesirable from the external evaluators' viewpoint) and the components "technology" (6.61±1.92) and "work processes" (6.35±2.19; middle to high from the internal evaluators' viewpoint). Results showed that the establishment of an accrual accounting system faces infrastructural challenges, especially the components of leadership and management and followers. As such, effective measures developed to overcome implementation obstacles should target these components.
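
    The reliability figure quoted above (Cronbach's alpha of 0.89) is a standard internal-consistency statistic for Likert questionnaires. A minimal sketch of its computation follows; the response matrix is fabricated for illustration, not the study's data.

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances)/variance of totals).
# Rows are respondents, columns are Likert items; the scores are made up.
def cronbach_alpha(scores):
    k = len(scores[0])                           # number of items
    def var(xs):                                 # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

data = [
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 4],
    [1, 2, 1, 2],
]
print(round(cronbach_alpha(data), 3))
```

    Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is why the study's 0.89 supports the questionnaire's validity claim.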

  15. Human Engineering Operations and Habitability Assessment: A Process for Advanced Life Support Ground Facility Testbeds

    NASA Technical Reports Server (NTRS)

    Connolly, Janis H.; Arch, M.; Elfezouaty, Eileen Schultz; Novak, Jennifer Blume; Bond, Robert L. (Technical Monitor)

    1999-01-01

    Design and Human Engineering (HE) processes strive to ensure that the human-machine interface is designed for optimal performance throughout the system life cycle. Each component can be tested and assessed independently to assure optimal performance, but it is not until full integration that the system and the inherent interactions between the system components can be assessed as a whole. HE processes (which define and apply requirements for human interaction with missions and systems) are included in space flight activities, but also need to be included in ground activities and, specifically, ground facility testbeds such as Bio-Plex. A unique aspect of the Bio-Plex Facility is the integral issue of Habitability, which includes qualities of the environment that allow humans to work and live. HE is a process by which Habitability and system performance can be assessed.

  16. The UARS and open data system concept and analysis study. Executive summary

    NASA Technical Reports Server (NTRS)

    Mittal, M.; Nebb, J.; Woodward, H.

    1983-01-01

    Alternative concepts for a common design for the UARS and OPEN Central Data Handling Facility (CDHF) are offered. The designs are consistent with requirements shared by UARS and OPEN and the data storage and data processing demands of these missions. Because more detailed information is available for UARS, the design approach was to size the system and to select components for a UARS CDHF, but in a manner that does not optimize the CDHF at the expense of OPEN. Costs for alternative implementations of the UARS designs are presented showing that the system design does not restrict the implementation to a single manufacturer. Processing demands on the alternative UARS CDHF implementations are discussed. With this information at hand together with estimates for OPEN processing demands, it is shown that any shortfall in system capability for OPEN support can be remedied by either component upgrades or array processing attachments rather than a system redesign.

  17. Integrable multi-component generalization of a modified short pulse equation

    NASA Astrophysics Data System (ADS)

    Matsuno, Yoshimasa

    2016-11-01

    We propose a multi-component generalization of the modified short pulse (SP) equation which was derived recently as a reduction of Feng's two-component SP equation. Above all, we address the two-component system in depth. We obtain the Lax pair, an infinite number of conservation laws and multisoliton solutions for the system, demonstrating its integrability. Subsequently, we show that the two-component system exhibits cusp solitons and breathers for which the detailed analysis is performed. Specifically, we explore the interaction process of two cusp solitons and derive the formula for the phase shift. While cusp solitons are singular solutions, smooth breather solutions are shown to exist, provided that the parameters characterizing the solutions satisfy certain conditions. Last, we discuss the relation between the proposed system and existing two-component SP equations.
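
    For background, the standard short pulse equation (of which the modified and multi-component systems studied in the paper are generalizations) is commonly written as:

```latex
% Standard (Schafer--Wayne) short pulse equation; the paper's modified
% and multi-component systems generalize this model.
u_{xt} = u + \frac{1}{6}\left(u^{3}\right)_{xx}
```

    The cusp solitons and breathers discussed above are exact solutions of the generalized system, singular and smooth respectively, whose existence depends on conditions on the solution parameters.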

  18. A Multi-mission Event-Driven Component-Based System for Support of Flight Software Development, ATLO, and Operations first used by the Mars Science Laboratory (MSL) Project

    NASA Technical Reports Server (NTRS)

    Dehghani, Navid; Tankenson, Michael

    2006-01-01

    This viewgraph presentation reviews the architectural description of the Mission Data Processing and Control System (MPCS). MPCS is an event-driven, multi-mission set of ground data processing components providing uplink, downlink, and data management capabilities, which will support the Mars Science Laboratory (MSL) project as its first target mission. MPCS is designed with these factors: (1) an enabling plug-and-play architecture; (2) strong inheritance from GDS components that have been developed for other flight projects (MER, MRO, DAWN, MSAP) and are currently being used in operations and ATLO; and (3) Java-based, platform-independent components designed to consume and produce XML-formatted data.

  19. Building of nested components by a double-nozzle droplet deposition process

    NASA Astrophysics Data System (ADS)

    Li, SuLi; Wei, ZhengYing; Du, Jun; Zhao, Guangxi; Wang, Xin; Lu, BingHeng

    2016-07-01

    For nested components joined from multiple parts, a double-nozzle droplet deposition process is put forward in this paper, and the experimental system was developed. Through research on the properties of the support materials and the double-nozzle droplet deposition process, linked control of metal droplet deposition and support material extrusion was realized, and a nested component with a complex structure was fabricated directly. Compared with traditional forming processes, this double-nozzle deposition process has advantages such as a short cycle and low cost. It provides an approach to building nested parts.

  20. Adaptive Multi-scale PHM for Robotic Assembly Processes

    PubMed Central

    Choo, Benjamin Y.; Beling, Peter A.; LaViers, Amy E.; Marvel, Jeremy A.; Weiss, Brian A.

    2017-01-01

    Adaptive multiscale prognostics and health management (AM-PHM) is a methodology designed to support PHM in smart manufacturing systems. As a rule, PHM information is not used in high-level decision-making in manufacturing systems. AM-PHM leverages and integrates component-level PHM information with hierarchical relationships across the component, machine, work cell, and production line levels in a manufacturing system. The AM-PHM methodology enables the creation of actionable prognostic and diagnostic intelligence up and down the manufacturing process hierarchy. Decisions are made with the knowledge of the current and projected health state of the system at decision points along the nodes of the hierarchical structure. A description of the AM-PHM methodology with a simulated canonical robotic assembly process is presented. PMID:28664161
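
    The hierarchical roll-up the abstract describes, component-level health integrated up through machine, work cell, and production line, can be sketched with a simple recursion. The hierarchy, health scores, and the worst-case (minimum) aggregation rule below are all illustrative assumptions, not the AM-PHM specification.

```python
# Toy roll-up of component-level PHM health scores up a manufacturing
# hierarchy (component -> machine -> work cell -> line). Aggregating by
# the minimum (worst component dominates) is an illustrative assumption.
hierarchy = {
    "line": ["cell_1", "cell_2"],
    "cell_1": ["robot_A"],
    "cell_2": ["robot_B"],
    "robot_A": ["gripper", "joint_motor"],
    "robot_B": ["welder"],
}
component_health = {"gripper": 0.95, "joint_motor": 0.60, "welder": 0.88}

def health(node):
    if node in component_health:               # leaf: measured component health
        return component_health[node]
    return min(health(child) for child in hierarchy[node])

print(health("line"))                          # → 0.6
```

    A decision point at any node (say, the work cell) can then act on the projected health of everything below it, which is the "actionable intelligence up and down the hierarchy" the methodology targets.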

  1. Image processing for flight crew enhanced situation awareness

    NASA Technical Reports Server (NTRS)

    Roberts, Barry

    1993-01-01

    This presentation describes the image processing work that is being performed for the Enhanced Situational Awareness System (ESAS) application. Specifically, the presented work supports the Enhanced Vision System (EVS) component of ESAS.

  2. Evolving Systems: An Outcome of Fondest Hopes and Wildest Dreams

    NASA Technical Reports Server (NTRS)

    Frost, Susan A.; Balas, Mark J.

    2012-01-01

    New theory is presented for evolving systems, which are autonomously controlled subsystems that self-assemble into a new evolved system with a higher purpose. Evolving systems of aerospace structures often require additional control when assembling to maintain stability during the entire evolution process. This is the concept of Adaptive Key Component Control that operates through one specific component to maintain stability during the evolution. In addition, this control must often overcome persistent disturbances that occur while the evolution is in progress. Theoretical results will be presented for Adaptive Key Component control for persistent disturbance rejection. An illustrative example will demonstrate the Adaptive Key Component controller on a system composed of rigid body and flexible body modes.

  3. 48 CFR 227.7103-13 - Government right to review, verify, challenge and validate asserted restrictions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... contracting officer shall not challenge a contractor's assertion that a commercial item, component, or process... to development of the item, component or process. (2) Presumption regarding development exclusively... validation of asserted restrictions for technical data related to commercial items, and to major systems, on...

  4. 48 CFR 227.7103-13 - Government right to review, verify, challenge and validate asserted restrictions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... contracting officer shall not challenge a contractor's assertion that a commercial item, component, or process... to development of the item, component or process. (2) Presumption regarding development exclusively... validation of asserted restrictions for technical data related to commercial items, and to major systems, on...

  5. 48 CFR 227.7103-13 - Government right to review, verify, challenge and validate asserted restrictions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... contracting officer shall not challenge a contractor's assertion that a commercial item, component, or process... to development of the item, component or process. (2) Presumption regarding development exclusively... validation of asserted restrictions for technical data related to commercial items, and to major systems, on...

  6. 48 CFR 227.7103-13 - Government right to review, verify, challenge and validate asserted restrictions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... contracting officer shall not challenge a contractor's assertion that a commercial item, component, or process... to development of the item, component or process. (2) Presumption regarding development exclusively... validation of asserted restrictions for technical data related to commercial items, and to major systems, on...

  7. Contingency theoretic methodology for agent-based web-oriented manufacturing systems

    NASA Astrophysics Data System (ADS)

    Durrett, John R.; Burnell, Lisa J.; Priest, John W.

    2000-12-01

    The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, and computing and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change the way they view the software design process: from solving a problem to dynamically creating teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty of organizational-information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the domain of the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software are discussed.

  8. Structured decision making as a conceptual framework to identify thresholds for conservation and management

    USGS Publications Warehouse

    Martin, J.; Runge, M.C.; Nichols, J.D.; Lubow, B.C.; Kendall, W.L.

    2009-01-01

    Thresholds and their relevance to conservation have become a major topic of discussion in the ecological literature. Unfortunately, in many cases the lack of a clear conceptual framework for thinking about thresholds may have led to confusion in attempts to apply the concept of thresholds to conservation decisions. Here, we advocate a framework for thinking about thresholds in terms of a structured decision making process. The purpose of this framework is to promote a logical and transparent process for making informed decisions for conservation. Specification of such a framework leads naturally to consideration of definitions and roles of different kinds of thresholds in the process. We distinguish among three categories of thresholds. Ecological thresholds are values of system state variables at which small changes bring about substantial changes in system dynamics. Utility thresholds are components of management objectives (determined by human values) and are values of state or performance variables at which small changes yield substantial changes in the value of the management outcome. Decision thresholds are values of system state variables at which small changes prompt changes in management actions in order to reach specified management objectives. The approach that we present focuses directly on the objectives of management, with an aim to providing decisions that are optimal with respect to those objectives. This approach clearly distinguishes the components of the decision process that are inherently subjective (management objectives, potential management actions) from those that are more objective (system models, estimates of system state). Optimization based on these components then leads to decision matrices specifying optimal actions to be taken at various values of system state variables. Values of state variables separating different actions in such matrices are viewed as decision thresholds. Utility thresholds are included in the objectives component, and ecological thresholds may be embedded in models projecting consequences of management actions. Decision thresholds are determined by the above-listed components of a structured decision process. These components may themselves vary over time, inducing variation in the decision thresholds inherited from them. These dynamic decision thresholds can then be determined using adaptive management. We provide numerical examples (based on patch occupancy models) of structured decision processes that include all three kinds of thresholds. © 2009 by the Ecological Society of America.
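
    The decision-matrix idea above can be sketched numerically: evaluate the expected value of each candidate action at every system state, and the state value where the optimal action switches is the decision threshold. The patch-occupancy framing follows the abstract; the utility numbers and action effects below are made-up assumptions for illustration.

```python
# Illustrative decision threshold: at each occupancy state (in percent),
# compare the expected value of "manage" vs "do nothing" and find where
# the optimal action switches. All utilities are fabricated assumptions.
def expected_value(occ, action):
    if action == "manage":
        return 10 * min(100, occ + 50) - 300   # raises occupancy, at a fixed cost
    return 10 * occ                            # do nothing

states = range(101)                            # occupancy 0..100 percent
policy = ["manage" if expected_value(s, "manage") > expected_value(s, "do_nothing")
          else "do_nothing" for s in states]
threshold = policy.index("do_nothing")         # first state where the action switches
print(threshold / 100)                         # → 0.7
```

    Changing the objectives (the utility function) or the system model moves the threshold, which is exactly the sense in which decision thresholds are derived quantities rather than fixed ecological facts.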

  9. Characterising the development of the understanding of human body systems in high-school biology students - a longitudinal study

    NASA Astrophysics Data System (ADS)

    Snapir, Zohar; Eberbach, Catherine; Ben-Zvi-Assaraf, Orit; Hmelo-Silver, Cindy; Tripto, Jaklin

    2017-10-01

    Science education today has become increasingly focused on research into complex natural, social and technological systems. In this study, we examined the development of high-school biology students' systems understanding of the human body, in a three-year longitudinal study. The development of the students' system understanding was evaluated using the Components Mechanisms Phenomena (CMP) framework for conceptual representation. We coded and analysed the repertory grid personal constructs of 67 high-school biology students at 4 points throughout the study. Our data analysis builds on the assumption that systems understanding entails a perception of all the system categories, including structures within the system (its Components), specific processes and interactions at the macro and micro levels (Mechanisms), and the Phenomena that present the macro scale of processes and patterns within a system. Our findings suggest that as the learning process progressed, the systems understanding of our students became more advanced, moving forward within each of the major CMP categories. Moreover, there was an increase in the mechanism complexity presented by the students, manifested by more students describing mechanisms at the molecular level. Thus, the 'mechanism' category and the micro level are critical components that enable students to understand system-level phenomena such as homeostasis.

  10. 10 CFR 70.72 - Facility changes and change process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... management system to evaluate, implement, and track each change to the site, structures, processes, systems, equipment, components, computer programs, and activities of personnel. This system must be documented in... licensed material; (3) Modifications to existing operating procedures including any necessary training or...

  11. 10 CFR 70.72 - Facility changes and change process.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... management system to evaluate, implement, and track each change to the site, structures, processes, systems, equipment, components, computer programs, and activities of personnel. This system must be documented in... licensed material; (3) Modifications to existing operating procedures including any necessary training or...

  12. 10 CFR 70.72 - Facility changes and change process.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... management system to evaluate, implement, and track each change to the site, structures, processes, systems, equipment, components, computer programs, and activities of personnel. This system must be documented in... licensed material; (3) Modifications to existing operating procedures including any necessary training or...

  13. 10 CFR 70.72 - Facility changes and change process.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... management system to evaluate, implement, and track each change to the site, structures, processes, systems, equipment, components, computer programs, and activities of personnel. This system must be documented in... licensed material; (3) Modifications to existing operating procedures including any necessary training or...

  14. 10 CFR 70.72 - Facility changes and change process.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... management system to evaluate, implement, and track each change to the site, structures, processes, systems, equipment, components, computer programs, and activities of personnel. This system must be documented in... licensed material; (3) Modifications to existing operating procedures including any necessary training or...

  15. Rapid tooling for functional prototyping of metal mold processes. CRADA final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zacharia, T.; Ludtka, G.M.; Bjerke, M.A.

    1997-12-01

The overall scope of this endeavor was to develop an integrated computer system, running on a network of heterogeneous computers, that would allow the rapid development of tool designs and then use process models to determine whether the initial tooling would have the characteristics needed to produce the prototype parts. The major thrust of this program for ORNL was defining the requirements for the integrated die design system, whose functional purpose is to link part design, tool design, and component fabrication through a seamless software environment. The principal product would be a system control program that would coordinate the various application programs and implement the data transfer so that any networked workstation would be usable. The overall system control architecture was required to easily accommodate any changes, upgrades, or replacements of the models from either the manufacturing end or the design-criteria standpoint. The initial design of such a program is described in the section labeled "Control Program Design". A critical aspect of this research was the design of the system flow chart showing the exact system components and the data to be transferred. All of the major system components would have been configured to ensure data file compatibility and transferability across the Internet. The intent was to use commercially available packages to model the various manufacturing processes for creating the die and die inserts, in addition to modeling the processes for which these parts were to be used. To meet all of these requirements, investigative research was conducted to determine the system flow features and software components within the various organizations contributing to this project. This research is summarized.

  16. [Noise hazard and hearing loss in workers in automotive component manufacturing industry in Guangzhou, China].

    PubMed

    Wang, Zhi; Liang, Jiabin; Rong, Xing; Zhou, Hao; Duan, Chuanwei; Du, Weijia; Liu, Yimin

    2015-12-01

To investigate noise hazard and its influence on hearing loss among workers in the automotive component manufacturing industry, noise levels were measured in the workplaces of automotive component manufacturing enterprises and hearing examinations were performed for workers, in order to analyze the characteristics and exposure levels of noise in each process and its influence on workers' hearing loss. Among the manufacturing processes for different products in this industry, the processes for automobile hubs and for suspension and steering systems had the highest degrees of noise hazard, with over-standard rates of 79.8% and 57.1%, respectively. Among the technical processes used in automotive component manufacturing, punching and casting had the highest degrees of noise hazard, with over-standard rates of 65.0% and 50.0%, respectively. Workers engaged in automotive air conditioning system manufacturing had the highest rate of abnormal hearing (up to 3.1%). In the automotive component manufacturing industry, noise levels seriously exceed the standard. Although the rate of abnormal hearing is lower than the average for the automobile manufacturing industry in China, it tends to increase gradually, so the noise hazard in this industry deserves greater attention.

  17. Framework Programmable Platform for the Advanced Software Development Workstation: Preliminary system design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, John W., IV; Henderson, Richard; Futrell, Michael T.

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The focus here is on the design of components that make up the FPP. These components serve as supporting systems for the Integration Mechanism and the Framework Processor and provide the 'glue' that ties the FPP together. Also discussed are the components that allow the platform to operate in a distributed, heterogeneous environment and to manage the development and evolution of software system artifacts.

  18. National Centers for Environmental Prediction

    Science.gov Websites

    System as follows: Changes to the model components Changes to the data assimilation and tropical storm relocation components Changes to the post-processing Changes to output products 1) Changes to the Global the land-atmosphere system from decoupling. 2) Changes to the Global Data Assimilation System (GDAS

  19. 21 CFR 20.43 - Multitrack processing.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL PUBLIC... maintained by that component. A multitrack system provides two or more tracks for processing requests, based... single track, ordinarily on a first-in, first-out basis. (c) If a multitrack processing system is...

  20. 21 CFR 20.43 - Multitrack processing.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL PUBLIC... maintained by that component. A multitrack system provides two or more tracks for processing requests, based... single track, ordinarily on a first-in, first-out basis. (c) If a multitrack processing system is...

  1. Mass transfer apparatus and method for separation of gases

    DOEpatents

    Blount, Gerald C.

    2015-10-13

    A process and apparatus for separating components of a source gas is provided in which more soluble components of the source gas are dissolved in an aqueous solvent at high pressure. The system can utilize hydrostatic pressure to increase solubility of the components of the source gas. The apparatus includes gas recycle throughout multiple mass transfer stages to improve mass transfer of the targeted components from the liquid to gas phase. Separated components can be recovered for use in a value added application or can be processed for long-term storage, for instance in an underwater reservoir.
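The use of hydrostatic pressure to increase gas solubility can be sketched with Henry's law. The following is a minimal, illustrative sketch only: the constants, the pure-CO2 gas-phase assumption, and the water solvent are assumptions for demonstration, not the patent's actual operating conditions or solvent chemistry.

```python
# Henry's-law sketch of why hydrostatic pressure boosts absorption.
# Constants are illustrative; the patent's actual operating conditions
# are not reproduced here.
RHO_WATER = 1000.0   # kg/m^3, solvent density
G = 9.81             # m/s^2, gravitational acceleration
P_ATM = 101325.0     # Pa, surface pressure
K_H_CO2 = 29.4       # L*atm/mol, approximate Henry constant for CO2 at 25 C

def hydrostatic_pressure(depth_m):
    """Absolute pressure (Pa) at a given depth in a water column."""
    return P_ATM + RHO_WATER * G * depth_m

def dissolved_conc(depth_m):
    """Equilibrium dissolved CO2 (mol/L), assuming a pure CO2 gas phase."""
    pressure_atm = hydrostatic_pressure(depth_m) / P_ATM
    return pressure_atm / K_H_CO2

surface = dissolved_conc(0.0)     # equilibrium concentration at the surface
at_depth = dissolved_conc(100.0)  # roughly 10x more gas dissolves at 100 m
```

Under these assumptions, operating the absorber 100 m underwater raises the equilibrium dissolved concentration by roughly an order of magnitude, which is the effect the multi-stage design exploits.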

  2. Mass transfer apparatus and method for separation of gases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blount, Gerald C.; Gorensek, Maximilian Boris; Hamm, Luther L.

    A process and apparatus for separating components of a source gas is provided in which more soluble components of the source gas are dissolved in an aqueous solvent at high pressure. The system can utilize hydrostatic pressure to increase solubility of the components of the source gas. The apparatus includes gas recycle throughout multiple mass transfer stages to improve mass transfer of the targeted components from the liquid to gas phase. Separated components can be recovered for use in a value added application or can be processed for long-term storage, for instance in an underwater reservoir.

  3. Microphone Array Phased Processing System (MAPPS): Version 4.0 Manual

    NASA Technical Reports Server (NTRS)

    Watts, Michael E.; Mosher, Marianne; Barnes, Michael; Bardina, Jorge

    1999-01-01

    A processing system has been developed to meet increasing demands for detailed noise measurement of individual model components. The Microphone Array Phased Processing System (MAPPS) uses graphical user interfaces to control all aspects of data processing and visualization. The system uses networked parallel computers to provide noise maps at selected frequencies in a near real-time testing environment. The system has been successfully used in the NASA Ames 7- by 10-Foot Wind Tunnel.

  4. Working on the Boundaries: Philosophies and Practices of the Design Process

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.

    1996-01-01

While the systems engineering process is a formal, contractually binding program management technique, the design process is the informal practice of achieving the project's design requirements throughout all design phases of the systems engineering process. The design process and organization are system and component dependent. Informal reviews include technical information meetings and concurrent engineering sessions, while formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles of the design process, identifies its role in integrating interacting systems and discipline analyses, and illustrates the process's application in experienced aerostructural designs.

  5. Space Shuttle critical function audit

    NASA Technical Reports Server (NTRS)

    Sacks, Ivan J.; Dipol, John; Su, Paul

    1990-01-01

    A large fault-tolerance model of the main propulsion system of the US space shuttle has been developed. This model is being used to identify single components and pairs of components that will cause loss of shuttle critical functions. In addition, this model is the basis for risk quantification of the shuttle. The process used to develop and analyze the model is digraph matrix analysis (DMA). The DMA modeling and analysis process is accessed via a graphics-based computer user interface. This interface provides coupled display of the integrated system schematics, the digraph models, the component database, and the results of the fault tolerance and risk analyses.
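The single- and paired-component loss analysis described above can be illustrated with a small reachability search over a component digraph. This is a toy sketch under stated assumptions: the component names, topology, and the brute-force cut-set search are invented for illustration and are not the actual shuttle propulsion model or the DMA algorithm.

```python
from itertools import combinations

# Hypothetical component digraph: an edge means "feeds the critical
# function downstream". Names and topology are illustrative only.
EDGES = {
    "tank": ["valve_a", "valve_b"],
    "valve_a": ["pump"],
    "valve_b": ["pump"],
    "pump": ["engine"],
    "engine": [],
}

def reachable(edges, removed, src, dst):
    """Depth-first search for dst from src, ignoring removed nodes."""
    stack, seen = [src], set()
    while stack:
        node = stack.pop()
        if node in removed or node in seen:
            continue
        if node == dst:
            return True
        seen.add(node)
        stack.extend(edges.get(node, []))
    return False

def cut_sets(edges, src, dst, order):
    """All sets of `order` internal components whose removal disconnects
    src from dst, i.e. causes loss of the critical function."""
    internal = [n for n in edges if n not in (src, dst)]
    return [set(c) for c in combinations(internal, order)
            if not reachable(edges, set(c), src, dst)]

singles = cut_sets(EDGES, "tank", "engine", 1)
pairs = [p for p in cut_sets(EDGES, "tank", "engine", 2)
         if not any(s <= p for s in singles)]  # keep minimal pairs only
```

In this toy graph the pump is a single-point failure, while the two redundant valves fail the function only as a pair; real digraph matrix analysis works at far larger scale, but the question it answers per component set is the same.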

  6. Extraction of CYP chemical interactions from biomedical literature using natural language processing methods.

    PubMed

    Jiao, Dazhi; Wild, David J

    2009-02-01

This paper proposes a system that automatically extracts CYP protein and chemical interactions from journal article abstracts, using natural language processing (NLP) and text mining methods. In our system, we employ a maximum entropy based learning method, using results from syntactic, semantic, and lexical analysis of texts. We first present our system architecture and then discuss the data set for training our machine learning based models and the methods used to build the components of our system, such as part-of-speech (POS) tagging, Named Entity Recognition (NER), dependency parsing, and relation extraction. An evaluation of the system is conducted at the end, yielding very promising results: the POS, dependency parsing, and NER components achieve a very high level of accuracy as measured by precision, ranging from 85.9% to 98.5%; the precision and recall of the interaction extraction component are 76.0% and 82.6%, and those of the overall system are 68.4% and 72.2%, respectively.
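The reported figures are the standard precision/recall/F1 metrics for extraction systems. A minimal sketch of their computation follows; the true/false positive and false negative counts are invented for illustration (chosen so the resulting scores land near the reported 76.0%/82.6% for interaction extraction) and are not taken from the paper.

```python
def precision_recall(tp, fp, fn):
    """Standard evaluation metrics for an extraction component:
    tp = correct extractions, fp = spurious, fn = missed."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Illustrative counts only (not from the paper): 76 correct, 24 spurious,
# and 16 missed extractions reproduce precision/recall close to the
# reported 76.0% / 82.6%.
p, r, f = precision_recall(76, 24, 16)
```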

  7. Development and fabrication of a solar cell junction processing system

    NASA Technical Reports Server (NTRS)

    Kiesling, R.

    1981-01-01

    The major component fabrication program was completed. Assembly and system testing of the pulsed electron beam annealing machine are described. The design program for the transport reached completion, and the detailed drawings were released for fabrication and procurement of the long lead time components.

  8. Design and Implementation of Hydrologic Process Knowledge-base Ontology: A case study for the Infiltration Process

    NASA Astrophysics Data System (ADS)

    Elag, M.; Goodall, J. L.

    2013-12-01

Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physical-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using the Formal Concept Approach (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology. Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.

  9. Structures Technology for Future Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Venneri, Samuel L.; Paul, Donald B.; Hopkins, Mark A.

    2000-01-01

    An overview of structures technology for future aerospace systems is given. Discussion focuses on developments in component technologies that will improve the vehicle performance, advance the technology exploitation process, and reduce system life-cycle costs. The component technologies described are smart materials and structures, multifunctional materials and structures, affordable composite structures, extreme environment structures, flexible load bearing structures, and computational methods and simulation-based design. The trends in each of the component technologies are discussed and the applicability of these technologies to future aerospace vehicles is described.

  10. 21 CFR 892.2050 - Picture archiving and communications system.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... processing of medical images. Its hardware components may include workstations, digitizers, communications... hardcopy devices. The software components may provide functions for performing operations related to image...

  11. 21 CFR 892.2050 - Picture archiving and communications system.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... processing of medical images. Its hardware components may include workstations, digitizers, communications... hardcopy devices. The software components may provide functions for performing operations related to image...

  12. 21 CFR 892.2050 - Picture archiving and communications system.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... processing of medical images. Its hardware components may include workstations, digitizers, communications... hardcopy devices. The software components may provide functions for performing operations related to image...

  13. 21 CFR 892.2050 - Picture archiving and communications system.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... processing of medical images. Its hardware components may include workstations, digitizers, communications... hardcopy devices. The software components may provide functions for performing operations related to image...

  14. Administrative Decision Making and Resource Allocation.

    ERIC Educational Resources Information Center

    Sardy, Susan; Sardy, Hyman

    This paper considers selected aspects of the systems analysis of administrative decisionmaking regarding resource allocations in an educational system. A model of the instructional materials purchase system is presented. The major components of this model are: environment, input, decision process, conversion structure, conversion process, output,…

  15. MEMS Technology for Space Applications

    NASA Technical Reports Server (NTRS)

    vandenBerg, A.; Spiering, V. L.; Lammerink, T. S. J.; Elwenspoek, M.; Bergveld, P.

    1995-01-01

Micro-technology enables the manufacturing of all kinds of components for miniature systems or micro-systems, such as sensors, pumps, valves, and channels. The integration of these components into a micro-electro-mechanical system (MEMS) drastically decreases the total system volume and mass. These properties, combined with the increasing need for monitoring and control of small flows in (bio)chemical experiments, make MEMS attractive for space applications. The level of integration and applied technology depends on the product demands and the market. The ultimate integration is process integration, which results in a one-chip system. An example of process integration is a dosing system of pump, flow sensor, micromixer, and hybrid feedback electronics to regulate the flow. However, for many applications, a hybrid integration of components is sufficient and offers the advantages of design flexibility and even the exchange of components in the case of a modular set-up. Currently, we are working on hybrid integration of all kinds of sensors (physical and chemical) and flow system modules towards a modular system: the micro total analysis system (micro TAS). The substrate contains electrical connections as in a printed circuit board (PCB) as well as fluid channels for a circuit channel board (CCB) which, when integrated, form a mixed circuit board (MCB).

  16. Experimental evaluation of a COTS system for space applications

    NASA Technical Reports Server (NTRS)

    Some, R. R.; Madeira, H.; Moreira, F.; Costa, D.; Rennels, D.

    2002-01-01

The use of COTS-based systems in space missions for scientific data processing is very attractive, as the performance-to-power ratio of commercial components can be an order of magnitude greater than that of radiation-hardened components, and the price differential is even higher.

  17. Visual event-related potential changes in multiple system atrophy: delayed N2 latency in selective attention to a color task.

    PubMed

    Kamitani, Toshiaki; Kuroiwa, Yoshiyuki

    2009-01-01

    Recent studies demonstrated an altered P3 component and prolonged reaction time during the visual discrimination tasks in multiple system atrophy (MSA). In MSA, however, little is known about the N2 component which is known to be closely related to the visual discrimination process. We therefore compared the N2 component as well as the N1 and P3 components in 17 MSA patients with these components in 10 normal controls, by using a visual selective attention task to color or to shape. While the P3 in MSA was significantly delayed in selective attention to shape, the N2 in MSA was significantly delayed in selective attention to color. N1 was normally preserved both in attention to color and in attention to shape. Our electrophysiological results indicate that the color discrimination process during selective attention is impaired in MSA.

  18. System Modeling of Lunar Oxygen Production: Mass and Power Requirements

    NASA Technical Reports Server (NTRS)

    Steffen, Christopher J.; Freeh, Joshua E.; Linne, Diane L.; Faykus, Eric W.; Gallo, Christopher A.; Green, Robert D.

    2007-01-01

    A systems analysis tool for estimating the mass and power requirements for a lunar oxygen production facility is introduced. The individual modeling components involve the chemical processing and cryogenic storage subsystems needed to process a beneficiated regolith stream into liquid oxygen via ilmenite reduction. The power can be supplied from one of six different fission reactor-converter systems. A baseline system analysis, capable of producing 15 metric tons of oxygen per annum, is presented. The influence of reactor-converter choice was seen to have a small but measurable impact on the system configuration and performance. Finally, the mission concept of operations can have a substantial impact upon individual component size and power requirements.

  19. Implementation Status of Accrual Accounting System in Health Sector

    PubMed Central

    Mehrolhassani, Mohammad Hossien; Khayatzadeh-Mahani, Akram; Emami, Mozhgan

    2015-01-01

Introduction: Management of financial resources in health systems is one of the major issues of concern for policy makers globally. As a sub-set of financial management, the accounting system is of paramount importance. In this paper, which presents part of the results of a wider research project on the transition from a cash accounting system to an accrual accounting system, we look at the impact of components of change on implementation of the new system. Implementing change is fraught with obstacles, and surveying these challenges will help policy makers to better overcome them. Methods: The study applied a quantitative approach in 2012 at Kerman University of Medical Sciences in Iran. For the evaluation, a validated researcher-made Likert-scale questionnaire was used (Cronbach's alpha of 0.89), covering 7 change components in the accounting system. The study population was 32 subordinate units of Kerman University of Medical Sciences, and descriptive and inferential statistics and correlation coefficients in SPSS version 19 were used for data analysis. Results: The effect of all components on the implementation was below average (5.06±1.86), except for the component "management & leadership" (3.46±2.25; undesirable from the external evaluators' viewpoint) and the components "technology" (6.61±1.92) and "work processes" (6.35±2.19; middle to high from the internal evaluators' viewpoint). Conclusions: Results showed that the establishment of an accrual accounting system faces infrastructural challenges, especially in the components of leadership and management and followers. As such, effective measures to overcome implementation obstacles should target these components. PMID:25560337

  20. Increasing the Automation and Autonomy for Spacecraft Operations with Criteria Action Table

    NASA Technical Reports Server (NTRS)

    Li, Zhen-Ping; Savki, Cetin

    2005-01-01

The Criteria Action Table (CAT) is an automation tool for monitoring real-time system messages for specific events and processes in order to take user-defined actions based on a set of user-defined rules. CAT was developed by Lockheed Martin Space Operations as part of a larger NASA effort at the Goddard Space Flight Center (GSFC) to create a component-based, middleware-based, and standards-based general-purpose ground system architecture referred to as GMSEC, the GSFC Mission Services Evolution Center. CAT has been integrated into the upgraded ground systems for the Tropical Rainfall Measuring Mission (TRMM) and Small Explorer (SMEX) satellites, where it plays the central role in the automation effort to reduce cost and increase reliability in spacecraft operations. The GMSEC architecture provides a standard communication interface and protocol for components to publish/subscribe messages on an information bus. It also provides a standard message definition so that components send and receive messages through the bus interface rather than to each other, thus reducing component-to-component coupling, interfaces, protocols, and link (socket) management. With the GMSEC architecture, components can publish standard event messages to the bus for all nominal, significant, and surprising events regarding the satellite, celestial events, the ground system, or any other activity. In addition to sending standard event messages, each GMSEC-compliant component is required to accept and process GMSEC directive request messages.
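The criteria-to-action pattern described above can be sketched as a small rule table: each rule pairs a predicate over an incoming event message with an action to run when it matches. This is a hypothetical sketch only; the message fields, rule predicates, and action functions below are invented for illustration and are not GMSEC's or CAT's actual schema or API.

```python
# Minimal sketch of a criteria-action table. Field names, rules, and
# actions are illustrative, not GMSEC's actual message schema.
ACTION_LOG = []

def notify_operator(msg):
    ACTION_LOG.append(f"page operator: {msg['text']}")

def restart_component(msg):
    ACTION_LOG.append(f"restart {msg['source']}")

# Each entry: (criteria predicate over a message, action to take).
CRITERIA_ACTION_TABLE = [
    (lambda m: m["severity"] >= 3, notify_operator),
    (lambda m: m["type"] == "HEARTBEAT_MISSED", restart_component),
]

def dispatch(msg):
    """Run every action whose criteria matches the event message."""
    for criteria, action in CRITERIA_ACTION_TABLE:
        if criteria(msg):
            action(msg)

# Two illustrative event messages arriving from the bus.
dispatch({"severity": 4, "type": "FAULT", "source": "gds1",
          "text": "downlink lost"})
dispatch({"severity": 1, "type": "HEARTBEAT_MISSED", "source": "fep2",
          "text": "no heartbeat"})
```

Keeping the rules as data rather than code is what lets operators add or change automated responses without modifying the monitoring component itself.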

  1. Knowledge-based reusable software synthesis system

    NASA Technical Reports Server (NTRS)

    Donaldson, Cammie

    1989-01-01

The Eli system, a knowledge-based reusable software synthesis system, is being developed for NASA Langley under a Phase 2 SBIR contract. Named after Eli Whitney, the inventor of interchangeable parts, Eli assists engineers of large-scale software systems in reusing components while they are composing their software specifications or designs. Eli will identify reuse potential, search for components, select component variants, and synthesize components into the developer's specifications. The Eli project began as a Phase 1 SBIR to define a reusable software synthesis methodology that integrates reusability into the top-down development process and to develop an approach for an expert system to promote and accomplish reuse. The objectives of the Eli Phase 2 work are to integrate advanced technologies to automate the development of reusable components within the context of large system developments, to integrate with user development methodologies without significant changes in method or learning of special languages, and to make reuse the easiest operation to perform. Eli will address a number of reuse problems, including developing software with reusable components, managing reusable components, identifying reusable components, and transitioning reuse technology. Eli is both a library facility for classifying, storing, and retrieving reusable components and a design environment that emphasizes, encourages, and supports reuse.

  2. Modular space vehicle boards, control software, reprogramming, and failure recovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judd, Stephen; Dallmann, Nicholas; McCabe, Kevin

    A space vehicle may have a modular board configuration that commonly uses some or all components and a common operating system for at least some of the boards. Each modular board may have its own dedicated processing, and processing loads may be distributed. The space vehicle may be reprogrammable, and may be launched without code that enables all functionality and/or components. Code errors may be detected and the space vehicle may be reset to a working code version to prevent system failure.

  3. Analytical Modeling and Performance Prediction of Remanufactured Gearbox Components

    NASA Astrophysics Data System (ADS)

    Pulikollu, Raja V.; Bolander, Nathan; Vijayakar, Sandeep; Spies, Matthew D.

Gearbox components operate in extreme environments, often leading to premature removal or overhaul. Though worn or damaged, these components can still function if the appropriate remanufacturing processes are deployed. Doing so saves a significant amount of the resources (time, materials, energy, manpower) otherwise required to produce a replacement part. Unfortunately, current design and analysis approaches require extensive testing and evaluation to validate the effectiveness and safety of a component that has been used in the field and then processed outside original OEM specifications. Testing every possible combination of component, level of damage, and repair processing option would be expensive and time consuming, prohibiting broad deployment of remanufacturing processes across industry. However, such evaluation and validation can occur through Integrated Computational Materials Engineering (ICME) modeling and simulation. Sentient developed a microstructure-based component life prediction (CLP) tool to quantify and assist the remanufacturing of gearbox components. This was achieved by modeling the design-manufacturing-microstructure-property relationship. The CLP tool supports the remanufacturing of high-value, high-demand rotorcraft, automotive, and wind turbine gears and bearings. This paper summarizes the development of the CLP models and the validation effort comparing simulation results with rotorcraft spiral bevel gear physical test data. CLP analyzes gear components and systems for safety, longevity, reliability, and cost by predicting (1) new gearbox component performance and optimal time-to-remanufacture, (2) qualification of used gearbox components for the remanufacturing process, and (3) remanufactured component performance.

  4. Exploring Systems That Support Good Clinical Care in Indigenous Primary Health-care Services: A Retrospective Analysis of Longitudinal Systems Assessment Tool Data from High-Improving Services.

    PubMed

    Woods, Cindy; Carlisle, Karen; Larkins, Sarah; Thompson, Sandra Claire; Tsey, Komla; Matthews, Veronica; Bailie, Ross

    2017-01-01

Continuous Quality Improvement is a process for raising the quality of primary health care (PHC) across Indigenous PHC services. In addition to clinical auditing using plan, do, study, and act cycles, engaging staff in a process of reflecting on systems to support quality care is vital. The One21seventy Systems Assessment Tool (SAT) supports staff to assess systems performance in terms of five key components. This study examines quantitative and qualitative SAT data from five high-improving Indigenous PHC services in northern Australia to understand the systems used to support quality care. High-improving services selected for the study were determined by calculating quality of care indices for Indigenous health services participating in the Audit and Best Practice in Chronic Disease National Research Partnership. Services that reported continuing high improvement in quality of care delivered across two or more audit tools in three or more audits were selected for the study. Precollected SAT data (from annual team SAT meetings) are presented longitudinally using radar plots for quantitative scores for each component, and content analysis is used to describe strengths and weaknesses of performance in each systems component. High-improving services were able to demonstrate strong processes for assessing system performance and consistent improvement in systems to support quality care across components. Key strengths in the quality support systems included an adequate and oriented workforce, appropriate health system supports, and engagement with other organizations and the community, while the weaknesses included lack of service infrastructure, recruitment, retention, and support for staff, and additional costs. Qualitative data revealed clear voices from health service staff expressing concerns with performance, and subsequent SAT data provided evidence of changes made to address concerns. Learning from the processes and strengths of high-improving services may be useful as we work with services striving to improve the quality of care provided in other areas.

  5. Deductive Glue Code Synthesis for Embedded Software Systems Based on Code Patterns

    NASA Technical Reports Server (NTRS)

    Liu, Jian; Fu, Jicheng; Zhang, Yansheng; Bastani, Farokh; Yen, I-Ling; Tai, Ann; Chau, Savio N.

    2006-01-01

    Automated code synthesis is a constructive process that can be used to generate programs from specifications; it can thus greatly reduce software development cost and time. The use of a formal code synthesis approach for software generation further increases the dependability of the system. Though code synthesis has many potential benefits, the synthesis techniques are still limited. Meanwhile, components are widely used in embedded system development. Applying code synthesis to the component-based software development (CBSD) process can greatly enhance the capability of code synthesis while reducing component composition effort. In this paper, we discuss the issues and techniques for applying deductive code synthesis techniques to CBSD. For deductive synthesis in CBSD, a rule base is the key to inferring appropriate component compositions. We use code patterns to guide the development of rules. Code patterns have been proposed to capture the typical usages of components, and several general composition operations have been identified to facilitate systematic composition. We present the technique for rule development and for automated generation of new patterns from existing code patterns. A case study of using this method in building a real-time control system is also presented.
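
The rule-based composition idea in this abstract can be illustrated with a small forward-chaining sketch. The rule format, interface names, and composition patterns below are invented for illustration and are not taken from the paper:

```python
# Minimal sketch of deductive component composition (hypothetical rule format).
# Each rule says: given components providing the listed input interfaces, a
# composition pattern can produce the listed output interface.
RULES = [
    ({"raw_frames"}, "filtered_frames", "pipe(filter)"),
    ({"filtered_frames"}, "features", "pipe(extract)"),
    ({"features", "model"}, "decision", "apply(classifier)"),
]

def synthesize(available, goal):
    """Forward-chain over RULES until `goal` is derivable; return the
    sequence of composition patterns used, or None if not derivable."""
    known = set(available)
    plan = []
    changed = True
    while changed and goal not in known:
        changed = False
        for inputs, output, pattern in RULES:
            if output not in known and inputs <= known:
                known.add(output)
                plan.append(pattern)
                changed = True
    return plan if goal in known else None

plan = synthesize({"raw_frames", "model"}, "decision")
```

A real deductive synthesizer would of course work over formal specifications rather than string labels, but the fixed-point inference over a rule base follows the same shape.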

  6. Quantifying the Relative Contributions of Divisive and Subtractive Feedback to Rhythm Generation

    PubMed Central

    Tabak, Joël; Rinzel, John; Bertram, Richard

    2011-01-01

    Biological systems are characterized by a high number of interacting components. Determining the role of each component is difficult, addressed here in the context of biological oscillations. Rhythmic behavior can result from the interplay of positive feedback that promotes bistability between high and low activity, and slow negative feedback that switches the system between the high and low activity states. Many biological oscillators include two types of negative feedback processes: divisive (decreases the gain of the positive feedback loop) and subtractive (increases the input threshold), both of which contribute to slowly moving the system between the high- and low-activity states. Can we determine the relative contribution of each type of negative feedback process to the rhythmic activity? Does one dominate? Do they control the active and silent phases equally? To answer these questions we use a neural network model with excitatory coupling, regulated by synaptic depression (divisive) and cellular adaptation (subtractive feedback). We first attempt to apply standard experimental methodologies: either passive observation to correlate the variations of a variable of interest to system behavior, or deletion of a component to establish whether that component is critical for the system. We find that these two strategies can lead to contradictory conclusions, and at best their interpretive power is limited. We instead develop a computational measure of the contribution of a process, by evaluating the sensitivity of the active (high activity) and silent (low activity) phase durations to the time constant of the process. The measure shows that both processes control the active phase, in proportion to their speed and relative weight. However, only the subtractive process plays a major role in setting the duration of the silent phase. This computational method can be used to analyze the role of negative feedback processes in a wide range of biological rhythms. PMID:21533065

  7. Building analytical platform with Big Data solutions for log files of PanDA infrastructure

    NASA Astrophysics Data System (ADS)

    Alekseev, A. A.; Barreiro Megino, F. G.; Klimentov, A. A.; Korchuganova, T. A.; Maendo, T.; Padolski, S. V.

    2018-05-01

    The paper describes the implementation of a high-performance system for the processing and analysis of log files for the PanDA infrastructure of the ATLAS experiment at the Large Hadron Collider (LHC), which is responsible for the workload management of on the order of 2M daily jobs across the Worldwide LHC Computing Grid. The solution is based on the ELK technology stack, which includes several components: Filebeat, Logstash, Elasticsearch (ES), and Kibana. Filebeat collects data from logs, Logstash processes the data and exports it to Elasticsearch, ES is responsible for centralized data storage, and the accumulated data can be viewed using Kibana. These components were integrated with the PanDA infrastructure and replaced previous log processing systems for increased scalability and usability. The authors describe all the components and their configuration tuning for the current tasks, describe the scale of the actual system, and give several real-life examples of how this centralized log processing and storage service is used, to showcase the advantages for daily operations.
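
The Filebeat, Logstash, Elasticsearch, Kibana flow described above can be mimicked in miniature: collect raw lines, structure them, index the documents, then query. The log format, field names, and in-memory "index" below are illustrative stand-ins, not the actual PanDA configuration:

```python
import re
from collections import defaultdict

# Toy stand-in for the Filebeat -> Logstash -> Elasticsearch stages.
RAW_LOGS = [  # "Filebeat" stage: raw collected lines (invented format)
    "2018-05-01 12:00:01 INFO job=1234 site=CERN status=finished",
    "2018-05-01 12:00:02 ERROR job=1235 site=BNL status=failed",
    "2018-05-01 12:00:03 INFO job=1236 site=CERN status=finished",
]

LINE_RE = re.compile(
    r"(?P<ts>\S+ \S+) (?P<level>\w+) job=(?P<job>\d+) "
    r"site=(?P<site>\w+) status=(?P<status>\w+)")

def parse(line):  # "Logstash" stage: turn each raw line into a document
    return LINE_RE.match(line).groupdict()

index = defaultdict(list)  # "Elasticsearch" stage: index docs by field value
for doc in map(parse, RAW_LOGS):
    index[("status", doc["status"])].append(doc)

# A "Kibana"-style query: which jobs failed?
failed_jobs = [d["job"] for d in index[("status", "failed")]]
```

The real stack replaces each stage with a dedicated service, but the parse-enrich-index-query division of labor is the same.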

  8. An efficient ASIC implementation of 16-channel on-line recursive ICA processor for real-time EEG system.

    PubMed

    Fang, Wai-Chi; Huang, Kuan-Ju; Chou, Chia-Ching; Chang, Jui-Chung; Cauwenberghs, Gert; Jung, Tzyy-Ping

    2014-01-01

    This paper proposes an efficient very-large-scale integration (VLSI) design: a 16-channel on-line recursive independent component analysis (ORICA) processor ASIC for real-time EEG systems, implemented in TSMC 40 nm CMOS technology. ORICA is well suited to real-time EEG systems for separating artifacts because of its efficient, real-time processing features. The proposed ORICA processor is composed of an ORICA processing unit and a singular value decomposition (SVD) processing unit. Compared with previous work [1], the proposed processor has enhanced effectiveness and reduced hardware complexity by utilizing a deeper pipeline architecture, a shared arithmetic processing unit, and shared registers. Sixteen channels of random signals containing 8 super-Gaussian and 8 sub-Gaussian components are used to analyze the independence of the source components; the average correlation coefficient between the original source signals and the extracted ORICA signals is 0.95452. Finally, the proposed ORICA processor ASIC is implemented in TSMC 40 nm CMOS technology and consumes 15.72 mW at a 100 MHz operating frequency.

  9. The Scenario-Based Engineering Process (SEP): a user-centered approach for the development of health care systems.

    PubMed

    Harbison, K; Kelly, J; Burnell, L; Silva, J

    1995-01-01

    The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.

  10. The Significance of the Understanding of Balance and Coordination in Self-Cognitive "Bio-Electro-Biblio/Info" Systems.

    ERIC Educational Resources Information Center

    Tsai, Bor-sheng

    1991-01-01

    Examines the information communication process and proposes a fuzzy commonality model for improving communication systems. Topics discussed include components of an electronic information programming and processing system and the flow of the formation and transfer of information, including DOS (disk operating system) commands, computer programming…

  11. User's guide to the Reliability Estimation System Testbed (REST)

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
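
The modularization idea (compute per-module reliabilities, then combine them into a total system figure) can be sketched as follows; the exponential failure model and the component names and rates are illustrative assumptions, not values or models from REST:

```python
import math

def reliability(failure_rate, hours):
    """Exponential reliability model R(t) = exp(-lambda * t)."""
    return math.exp(-failure_rate * hours)

def series(*rs):
    """System fails if any module in the chain fails."""
    r = 1.0
    for x in rs:
        r *= x
    return r

def parallel(*rs):
    """Redundant group fails only if every module fails."""
    q = 1.0
    for x in rs:
        q *= (1.0 - x)
    return 1.0 - q

t = 1000.0  # mission time in hours (illustrative)
cpu = reliability(1e-5, t)
bus = reliability(2e-6, t)
sensor = parallel(reliability(5e-5, t), reliability(5e-5, t))  # duplex pair
system_r = series(cpu, bus, sensor)  # combine modules into system reliability
```

Each module can be validated in isolation (as REST's modular approach suggests) before being combined, and redundancy shows up directly as a `parallel` grouping.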

  12. Calibration and Testing of Digital Zenith Camera System Components

    NASA Astrophysics Data System (ADS)

    Ulug, Rasit; Halicioglu, Kerem; Tevfik Ozludemir, M.; Albayrak, Muge; Basoglu, Burak; Deniz, Rasim

    2017-04-01

    Starting from the beginning of the new millennium, thanks to Charge-Coupled Device (CCD) technology, fully or partly automatic zenith camera systems have been designed and used to determine the astro-geodetic deflection of the vertical components in several countries, including Germany, Switzerland, Serbia, Latvia, Poland, Austria, China, and Turkey. The Digital Zenith Camera System (DZCS) of Turkey has performed successful observations, yet it needs to be improved in terms of automating the system and increasing observation accuracy. In order to optimize observation time and improve the system, some modifications have been implemented. Through the modification process that started at the beginning of 2016, some DZCS components have been replaced with new ones and some additional components have been installed. In this presentation, the ongoing calibration and testing process of the DZCS is summarized in general. In particular, one of the tested system components, the High Resolution Tiltmeter (HRTM), which enables orientation of the DZCS orthogonal to the direction of the plumb line, is discussed. For the calibration of these components, two tiltmeters with different accuracies (1 nrad and 0.001 mrad) were operated for nearly 30 days. The data, recorded under different environmental conditions, were divided into hourly, daily, and weekly subsets. In addition to the effects of temperature and humidity, the interoperability of the two tiltmeters was also investigated. Results show that with the integration of the HRTM and the other implementations, the modified DZCS provides higher accuracy for the determination of vertical deflections.
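
The cross-check between the two tiltmeters comes down to correlating two co-located records; a minimal pure-Python Pearson correlation sketch (with invented sample values, not the actual 30-day data) is:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Illustrative co-located tilt records (arbitrary units):
coarse = [1.0, 2.1, 2.9, 4.2, 5.0]   # lower-resolution instrument
fine = [1.1, 2.0, 3.0, 4.1, 5.1]     # higher-resolution instrument
r = pearson(coarse, fine)            # near 1.0 for consistent instruments
```

Binning the records into hourly, daily, and weekly subsets before correlating, as the abstract describes, is then just a matter of slicing the series.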

  13. Flexible distributed architecture for semiconductor process control and experimentation

    NASA Astrophysics Data System (ADS)

    Gower, Aaron E.; Boning, Duane S.; McIlrath, Michael B.

    1997-01-01

    Semiconductor fabrication requires an increasingly expensive and integrated set of tightly controlled processes, driving the need for a fabrication facility with fully computerized, networked processing equipment. We describe an integrated, open system architecture enabling distributed experimentation and process control for plasma etching. The system was developed at MIT's Microsystems Technology Laboratories and employs in-situ CCD-interferometry-based analysis in the sensor-feedback control of an Applied Materials Precision 5000 Plasma Etcher (AME5000). Our system supports accelerated, advanced research involving feedback control algorithms, and includes a distributed interface that utilizes the internet to make these fabrication capabilities available to remote users. The system architecture is both distributed and modular: the specific implementation of any one task does not restrict the implementation of another. The low-level architectural components include a host controller that communicates with the AME5000 equipment via SECS-II, and a host controller for the acquisition and analysis of the CCD sensor images. A cell controller (CC) manages communications between these equipment and sensor controllers. The CC is also responsible for process control decisions; algorithmic controllers may be integrated locally or via remote communications. Finally, a system server manages connections from internet/intranet (web) based clients and uses a direct link with the CC to access the system. Each component communicates via a predefined set of TCP/IP socket-based messages. This flexible architecture makes integration easier and more robust, and enables separate software components to run on the same or different computers independent of hardware or software platform.
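
The "predefined set of TCP/IP socket-based messages" implies a simple message-framing layer between controllers. The sketch below assumes a 4-byte big-endian length prefix followed by a JSON payload; the framing choice, message type, and fields are invented for illustration and are not the paper's actual protocol:

```python
import json
import struct

def encode_message(msg_type, payload):
    """Frame a message as: 4-byte big-endian length + JSON body."""
    body = json.dumps({"type": msg_type, "payload": payload}).encode("utf-8")
    return struct.pack(">I", len(body)) + body

def decode_message(data):
    """Inverse of encode_message: strip the length prefix, parse the body."""
    (length,) = struct.unpack(">I", data[:4])
    body = data[4:4 + length]
    msg = json.loads(body.decode("utf-8"))
    return msg["type"], msg["payload"]

# A cell controller could send a hypothetical command to an equipment host:
wire = encode_message("SET_ETCH_RATE", {"chamber": "A", "rate_nm_s": 3.2})
mtype, payload = decode_message(wire)
```

Length-prefixed framing is a common choice for such architectures because TCP is a byte stream: the prefix tells the receiver exactly how many bytes constitute one message regardless of how the stream is chunked.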

  14. Statistics of Shared Components in Complex Component Systems

    NASA Astrophysics Data System (ADS)

    Mazzolini, Andrea; Gherardi, Marco; Caselle, Michele; Cosentino Lagomarsino, Marco; Osella, Matteo

    2018-04-01

    Many complex systems are modular. Such systems can be represented as "component systems," i.e., sets of elementary components, such as LEGO bricks in LEGO sets. The bricks found in a LEGO set reflect a target architecture, which can be built following a set-specific list of instructions. In other component systems, instead, the underlying functional design and constraints are not obvious a priori, and their detection is often a challenge of both scientific and practical importance, requiring a clear understanding of component statistics. Importantly, some quantitative invariants appear to be common to many component systems, most notably a common broad distribution of component abundances, which often resembles the well-known Zipf's law. Such "laws" affect the component statistics in a general and nontrivial way, potentially hindering the identification of system-specific functional constraints or generative processes. Here, we specifically focus on the statistics of shared components, i.e., the distribution of the number of components shared by different system realizations, such as the common bricks found in different LEGO sets. To account for the effects of component heterogeneity, we consider a simple null model, which builds system realizations by random draws from a universe of possible components. Under general assumptions on abundance heterogeneity, we provide analytical estimates of component occurrence, which quantify exhaustively the statistics of shared components. Surprisingly, this simple null model can explain important features of empirical component-occurrence distributions obtained from large-scale data on bacterial genomes, LEGO sets, and book chapters. Specific architectural features and functional constraints can be detected from occurrence patterns as deviations from these null predictions, as we show for the illustrative case of the "core" genome in bacteria.
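
The null model described (system realizations built by random draws from a universe with heterogeneous, Zipf-like component abundances) can be simulated directly. The universe size, realization size, and counts below are arbitrary illustrative choices:

```python
import random
from collections import Counter

random.seed(0)

# Null model sketch: each "realization" (e.g., a LEGO set or a genome)
# draws components from a universe with Zipf-like abundances.
U = 200          # universe size
N_SETS = 50      # number of realizations
SET_SIZE = 30    # components drawn per realization

weights = [1.0 / (rank + 1) for rank in range(U)]  # Zipf-like abundances

def draw_realization():
    return set(random.choices(range(U), weights=weights, k=SET_SIZE))

realizations = [draw_realization() for _ in range(N_SETS)]

# Occurrence of component c = number of realizations containing c.
occurrence = Counter()
for r in realizations:
    for c in r:
        occurrence[c] += 1

# High-abundance components should be shared by many realizations,
# rare ones by few; deviations from this pattern in real data are
# what the paper's analytical estimates help detect.
top_occ = occurrence[0]
rare_occ = occurrence.get(U - 1, 0)
```

Comparing an empirical occurrence distribution against the one generated by this kind of random-draw model is the paper's route to spotting genuine architectural constraints.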

  15. [Absorption and metabolism of Chuanxiong Rhizoma decoction with multi-component sequential metabolism method].

    PubMed

    Liu, Yang; Luo, Zhi-Qiang; Lv, Bei-Ran; Zhao, Hai-Yu; Dong, Ling

    2016-04-01

    The multiple components in Chinese herbal medicines (CHMs) undergo complex absorption and metabolism before entering the blood system. Previous studies have often laid emphasis on the components found in blood; however, the dynamic and sequential absorption and metabolism process following multi-component oral administration has not been studied. In this study, an in situ closed-loop method combined with LC-MS techniques was employed to study the sequential process of Chuanxiong Rhizoma decoction (RCD). A total of 14 major components were identified in RCD. Among them, ferulic acid, senkyunolide J, senkyunolide I, senkyunolide F, senkyunolide G, and butylidenephthalide were detected in all of the samples, indicating that these six components could be absorbed into blood in their prototype forms. Butylphthalide, E-ligustilide, Z-ligustilide, cnidilide, senkyunolide A, and senkyunolide Q were not detected in any of the samples, suggesting that these six components may not be absorbed, or may be metabolized before entering the hepatic portal vein. Senkyunolide H could be metabolized by the liver, while senkyunolide M could be metabolized by both the liver and intestinal flora. This study clearly demonstrated the changes in the absorption and metabolism process following multi-component oral administration of RCD, converting the static picture of multi-component absorption into a comprehensive, dynamic, and continuous absorption and metabolism process. Copyright© by the Chinese Pharmaceutical Association.

  16. Scaled CMOS Reliability and Considerations for Spacecraft Systems: Bottom-Up and Top-Down Perspective

    NASA Technical Reports Server (NTRS)

    White, Mark

    2012-01-01

    New space missions will increasingly rely on more advanced technologies because of system requirements for higher performance, particularly in instruments and high-speed processing. Component-level reliability challenges with scaled CMOS in spacecraft systems are presented from a bottom-up perspective, and fundamental front-end and back-end processing reliability issues with more aggressively scaled parts are discussed. Effective thermal management from the system level to the component level (top-down) is a key element in the overall design of reliable systems. Thermal management in space systems must consider a wide range of issues, including the thermal loading of many different components and frequent temperature cycling of some systems. Both perspectives (top-down and bottom-up) play a large role in robust, reliable spacecraft system design.

  17. [Establishment of industry promotion technology system in Chinese medicine secondary exploitation based on "component structure theory"].

    PubMed

    Cheng, Xu-Dong; Feng, Liang; Zhang, Ming-Hua; Gu, Jun-Fei; Jia, Xiao-Bin

    2014-10-01

    The purpose of the secondary exploitation of Chinese medicine is to improve the quality of Chinese medicine products, enhance their core competitiveness, enable better use in clinical practice, and more effectively relieve patient suffering. Herbs, extraction, separation, refining, preparation, and quality control are all involved in the industrial promotion of Chinese medicine secondary exploitation. Chinese medicine quality improvement and industry promotion can be realized through whole-process optimization, quality control, and overall process improvement. Based on the "component structure theory", the "multi-dimensional structure and process dynamic quality control system", and the systematic and holistic character of Chinese medicine, impacts on the whole process are discussed. A technology system for Chinese medicine industry promotion was built to provide a theoretical basis for improving the quality and efficacy of secondarily developed traditional Chinese medicine products.

  18. ERIC Processing Manual. Rules and Guidelines for the Acquisition, Selection, and Technical Processing of Documents and Journal Articles by the Various Components of the ERIC Network.

    ERIC Educational Resources Information Center

    Brandhorst, Ted, Ed.; And Others

    This loose-leaf manual provides the detailed rules, guidelines, and examples to be used by the components of the Educational Resources Information Center (ERIC) Network in acquiring and selecting documents and in processing them (i.e., cataloging, indexing, abstracting) for input to the ERIC computer system and subsequent announcement in…

  19. JOB BUILDER remote batch processing subsystem

    NASA Technical Reports Server (NTRS)

    Orlov, I. G.; Orlova, T. L.

    1980-01-01

    The functions of the JOB BUILDER remote batch processing subsystem are described. Instructions are given for using it as a component of a display system developed by personnel of the System Programming Laboratory, Institute of Space Research, USSR Academy of Sciences.

  20. Bonded polyimide fuel cell package

    DOEpatents

    Morse, Jeffrey D.; Jankowski, Alan; Graff, Robert T.; Bettencourt, Kerry

    2010-06-08

    Described herein are processes for fabricating microfluidic fuel cell systems with embedded components in which micron-scale features are formed by bonding layers of DuPont Kapton™ polyimide laminate. A microfluidic fuel cell system fabricated using this process is also described.

  1. Architecture for Survivable System Processing (ASSP)

    NASA Astrophysics Data System (ADS)

    Wood, Richard J.

    1991-11-01

    The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground-based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High-technology developments in hardware, software, and networking models address technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP), and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture and are being developed to apply new technology to practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, the program provides for regular interactions with standardization working groups, e.g., the International Standards Organization (ISO), the American National Standards Institute (ANSI), the Society of Automotive Engineers (SAE), and the Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.

  2. Architecture for Survivable System Processing (ASSP)

    NASA Technical Reports Server (NTRS)

    Wood, Richard J.

    1991-01-01

    The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground-based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High-technology developments in hardware, software, and networking models address technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP), and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture and are being developed to apply new technology to practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, the program provides for regular interactions with standardization working groups, e.g., the International Standards Organization (ISO), the American National Standards Institute (ANSI), the Society of Automotive Engineers (SAE), and the Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.

  3. A Systematic Classification for HVAC Systems and Components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Han; Chen, Yan; Zhang, Jian

    Depending on the application, the complexity of an HVAC system can range from a small fan coil unit to a large centralized air conditioning system with primary and secondary distribution loops and central plant components. Currently, the taxonomy of HVAC systems and components has many aspects and can become quite complex because of the various components and system configurations. For example, based on the cooling and heating medium delivered to terminal units, systems can be classified as air systems, water systems, or air-water systems. In addition, some system names may be used interchangeably in a confusing manner, such as "unitary system" vs. "packaged system." Without a systematic classification, these component and system terms can be difficult to understand or differentiate from each other, creating ambiguity in communication, interpretation, and documentation. It is valuable to organize and classify HVAC systems and components so that they can be easily understood and used in a consistent manner. This paper aims to develop a systematic classification of HVAC systems and components. First, we summarize HVAC component information and definitions based on published literature, such as ASHRAE handbooks, regulations, and rating standards. Then, we identify common HVAC system types and map them to the collected components in a meaningful way. Classification charts are generated and described based on the component information. Six main categories are identified for HVAC components and equipment: heating and cooling production, heat extraction and rejection, air handling process, distribution system, terminal use, and stand-alone system. Components in each main category are further analyzed and classified in detail. More than fifty system names are identified and grouped based on their characteristics. The results of this paper will be helpful for education, communication, and systems and component documentation.
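
The six main categories identified in the abstract can be encoded as a simple nested mapping for lookup. The example components listed under each category are assumptions for illustration, not the paper's full classification charts:

```python
# Illustrative encoding of the six main HVAC categories; the member
# lists under each category are assumed examples, not the paper's charts.
HVAC_TAXONOMY = {
    "heating and cooling production": ["boiler", "chiller", "heat pump"],
    "heat extraction and rejection": ["cooling tower", "ground loop"],
    "air handling process": ["air handling unit", "fan coil unit"],
    "distribution system": ["ductwork", "hydronic piping", "pump", "fan"],
    "terminal use": ["VAV box", "radiator", "diffuser"],
    "stand-alone system": ["packaged rooftop unit", "window AC"],
}

def categorize(component):
    """Return the main category containing `component`, or None."""
    for category, members in HVAC_TAXONOMY.items():
        if component in members:
            return category
    return None

cat = categorize("chiller")
```

A machine-readable taxonomy like this is one way the paper's goal (consistent terminology for education and documentation) could be operationalized in building-simulation or audit tools.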

  4. Ion beam figuring of small optical components

    NASA Astrophysics Data System (ADS)

    Drueding, Thomas W.; Fawcett, Steven C.; Wilson, Scott R.; Bifano, Thomas G.

    1995-12-01

    Ion beam figuring provides a highly deterministic method for the final precision figuring of optical components, with advantages over conventional methods. The process involves bombarding a component with a stable beam of accelerated particles that selectively removes material from the surface. Figure corrections are achieved by rastering the fixed-current beam across the workpiece at appropriate, time-varying velocities. Unlike conventional methods, ion figuring is a noncontact technique and thus avoids such problems as edge rolloff effects, tool wear, and force loading of the workpiece. This work is directed toward the development of the precision ion machining system at NASA's Marshall Space Flight Center, which is designed for processing small (approximately 10-cm-diameter) optical components. Initial experiments were successful in figuring 8-cm-diameter fused silica and chemical-vapor-deposited SiC samples. The experiments, procedures, and results of figuring the sample workpieces to shallow spherical, parabolic (concave and convex), and non-axially-symmetric shapes are discussed. Several difficulties and limitations encountered with the current system are discussed, and the use of a 1-cm aperture for making finer corrections on optical components is also reported.
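
The "appropriate, time-varying velocities" follow from a dwell-time calculation: with a fixed-current beam, the depth removed at a point is the removal rate times the dwell time, so the raster slows down where more material must come off. The numbers below are illustrative, not parameters of the NASA system:

```python
# Dwell-time sketch for ion figuring (illustrative rates and step size;
# real systems also deconvolve the beam's spatial removal footprint).
REMOVAL_RATE_NM_S = 50.0   # nm of material removed per second under the beam
STEP_MM = 1.0              # raster step between figure-map points

def raster_velocities(error_map_nm):
    """Return mm/s raster velocities so each point gets the right dose."""
    velocities = []
    for depth_nm in error_map_nm:           # positive material to remove
        dwell_s = depth_nm / REMOVAL_RATE_NM_S
        velocities.append(STEP_MM / dwell_s)  # slower where more removal
    return velocities

v = raster_velocities([100.0, 50.0, 200.0])
```

This inverse relationship between figure error and raster velocity is what makes the process deterministic: the correction is computed, not iterated by hand.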

  5. An Integrated High Resolution Hydrometeorological Modeling Testbed using LIS and WRF

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Eastman, Joseph L.; Tao, Wei-Kuo

    2007-01-01

    Scientists have made great strides in modeling the physical processes that underlie various weather and climate phenomena. Many modeling systems that represent the major Earth system components (the atmosphere, land surface, and ocean) have been developed over the years. However, developing advanced Earth system applications that integrate these independently developed modeling systems has remained a daunting task due to limitations in computer hardware and software. Recently, efforts such as the Earth System Modeling Framework (ESMF) and Assistance for Land Modeling Activities (ALMA) have focused on developing standards, guidelines, and computational support for coupling Earth system model components. In this article, the development of a coupled land-atmosphere hydrometeorological modeling system that adopts these community interoperability standards is described. The land component is represented by the Land Information System (LIS), developed by scientists at the NASA Goddard Space Flight Center. The Weather Research and Forecasting (WRF) model, a mesoscale numerical weather prediction system, is used as the atmospheric component. LIS includes several community land surface models that can be executed at spatial scales as fine as 1 km. The data management capabilities in LIS enable the direct use of high-resolution satellite and observation data for modeling. Similarly, WRF includes several parameterizations and schemes for modeling radiation, microphysics, the PBL, and other processes. The integrated LIS-WRF system thus facilitates multi-model studies of land-atmosphere coupling that can be used to advance Earth system studies.

  6. 10-kW-class YAG laser application for heavy components

    NASA Astrophysics Data System (ADS)

    Ishide, Takashi; Tsubota, S.; Nayama, Michisuke; Shimokusu, Yoshiaki; Nagashima, Tadashi; Okimura, K.

    2000-02-01

    The authors have put kW-class YAG lasers to practical use for repair welding of nuclear power plant steam generator heat exchanger tubes, all-position welding of pipings, and similar applications. This paper describes the following developed methods and systems for high-power YAG laser processing. First, we apply 6 kW to 10 kW YAG lasers for welding and cutting of heavy components. The beam guide systems used are optical fibers with core diameters of 0.6 mm to 0.8 mm and a standard length of 200 m. Using these systems, we obtain one-pass penetration of 15 mm to 20 mm, with multi-pass welding for thicker plates. Data on cutting 100-mm-thick plate are also described, for the dismantling of nuclear power plants. In these systems we carried out in-process monitoring using CCD camera image processing and a monitoring fiber placed coaxially with the YAG optical lens system. For in-process monitoring with the monitoring fiber, we measured the light intensity from the welding area. Further, we have developed a new high-power hybrid welding method with a TIG electrode at the center of the lens. The hybrid TIG-YAG welding system aims at relaxed welding groove allowances and high-quality welds. Through these techniques, we have applied a 7-kW-class YAG laser for welding components of nuclear power plants.

  7. From scenarios to domain models: processes and representations

    NASA Astrophysics Data System (ADS)

    Haddock, Gail; Harbison, Karan

    1994-03-01

    The domain specific software architectures (DSSA) community has defined a philosophy for the development of complex systems. This philosophy improves productivity and efficiency by increasing the user's role in the definition of requirements, increasing the systems engineer's role in the reuse of components, and decreasing the software engineer's role to the development of new components and component modifications only. The scenario-based engineering process (SEP), the first instantiation of the DSSA philosophy, has been adopted by the next generation controller project. It is also the chosen methodology of the trauma care information management system project, and the surrogate semi-autonomous vehicle project. SEP uses scenarios from the user to create domain models and define the system's requirements. Domain knowledge is obtained from a variety of sources including experts, documents, and videos. This knowledge is analyzed using three techniques: scenario analysis, task analysis, and object-oriented analysis. Scenario analysis results in formal representations of selected scenarios. Task analysis of the scenario representations results in descriptions of tasks necessary for object-oriented analysis and also subtasks necessary for functional system analysis. Object-oriented analysis of task descriptions produces domain models and system requirements. This paper examines the representations that support the DSSA philosophy, including reference requirements, reference architectures, and domain models. The processes used to create and use the representations are explained through use of the scenario-based engineering process. Selected examples are taken from the next generation controller project.

  8. Systems analysis of a closed loop ECLSS using the ASPEN simulation tool. Thermodynamic efficiency analysis of ECLSS components. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Chatterjee, Sharmista

    1993-01-01

    Our first goal in this project was to perform a systems analysis of a closed loop Environmental Control Life Support System (ECLSS). This pertains to the development of a model of an existing real system from which to assess the state or performance of the existing system. Systems analysis is applied to conceptual models obtained from a system design effort. For our modeling purposes we used a simulator tool called ASPEN (Advanced System for Process Engineering). Our second goal was to evaluate the thermodynamic efficiency of the different components comprising an ECLSS. Use is made of the second law of thermodynamics to determine the amount of irreversibility or energy loss of each component. This will aid design scientists in selecting the components generating the least entropy, as our ultimate goal is to keep the entropy generation of the whole system at a minimum.
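
    The second-law screening described above rests on the Gouy-Stodola theorem, which converts a component's entropy generation rate into a lost-work rate, I = T0 * S_gen. The sketch below (hypothetical stream numbers, not taken from the thesis) computes entropy generation for a stream heated by a constant-temperature reservoir, the kind of per-component figure that lets components be ranked.

```python
import math

def entropy_generation_heat_exchanger(m_dot, cp, t_in, t_out, t_reservoir):
    """Entropy generation rate (W/K) for a stream heated or cooled by a
    constant-temperature reservoir. Temperatures in kelvin."""
    q = m_dot * cp * (t_out - t_in)                   # heat received by stream, W
    ds_stream = m_dot * cp * math.log(t_out / t_in)   # stream entropy change, W/K
    ds_reservoir = -q / t_reservoir                   # reservoir entropy change, W/K
    return ds_stream + ds_reservoir                   # always >= 0 by the second law

def irreversibility(s_gen, t0=298.15):
    """Gouy-Stodola theorem: lost work rate I = T0 * S_gen, in W."""
    return t0 * s_gen
```

    For example, heating 0.5 kg/s of water (cp = 4186 J/kg/K) from 300 K to 350 K using a 400 K reservoir generates roughly 61 W/K of entropy, i.e. about 18 kW of lost work potential; a reservoir closer to the stream temperature would generate less.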

  9. Optical systems fabricated by printing-based assembly

    DOEpatents

    Rogers, John; Nuzzo, Ralph; Meitl, Matthew; Menard, Etienne; Baca, Alfred J; Motala, Michael; Ahn, Jong-Hyun; Park, Sang-Il; Yu, Chang-Jae; Ko, Heung Cho; Stoykovich, Mark; Yoon, Jongseung

    2014-05-13

    Provided are optical devices and systems fabricated, at least in part, via printing-based assembly and integration of device components. In specific embodiments the present invention provides light emitting systems, light collecting systems, light sensing systems and photovoltaic systems comprising printable semiconductor elements, including large area, high performance macroelectronic devices. Optical systems of the present invention comprise semiconductor elements assembled, organized and/or integrated with other device components via printing techniques that exhibit performance characteristics and functionality comparable to single crystalline semiconductor based devices fabricated using conventional high temperature processing methods. Optical systems of the present invention have device geometries and configurations, such as form factors, component densities, and component positions, accessed by printing that provide a range of useful device functionalities. Optical systems of the present invention include devices and device arrays exhibiting a range of useful physical and mechanical properties including flexibility, shapeability, conformability and stretchability.

  10. Optical systems fabricated by printing-based assembly

    DOEpatents

    Rogers, John [Champaign, IL]; Nuzzo, Ralph [Champaign, IL]; Meitl, Matthew [Durham, NC]; Menard, Etienne [Durham, NC]; Baca, Alfred J [Urbana, IL]; Motala, Michael [Champaign, IL]; Ahn, Jong-Hyun [Suwon, KR]; Park, Sang-Il [Savoy, IL]; Yu, Chang-Jae [Urbana, IL]; Ko, Heung-Cho [Gwangju, KR]; Stoykovich, Mark [Dover, NH]; Yoon, Jongseung [Urbana, IL]

    2011-07-05

    Provided are optical devices and systems fabricated, at least in part, via printing-based assembly and integration of device components. In specific embodiments the present invention provides light emitting systems, light collecting systems, light sensing systems and photovoltaic systems comprising printable semiconductor elements, including large area, high performance macroelectronic devices. Optical systems of the present invention comprise semiconductor elements assembled, organized and/or integrated with other device components via printing techniques that exhibit performance characteristics and functionality comparable to single crystalline semiconductor based devices fabricated using conventional high temperature processing methods. Optical systems of the present invention have device geometries and configurations, such as form factors, component densities, and component positions, accessed by printing that provide a range of useful device functionalities. Optical systems of the present invention include devices and device arrays exhibiting a range of useful physical and mechanical properties including flexibility, shapeability, conformability and stretchability.

  11. Optical systems fabricated by printing-based assembly

    DOEpatents

    Rogers, John; Nuzzo, Ralph; Meitl, Matthew; Menard, Etienne; Baca, Alfred; Motala, Michael; Ahn, Jong -Hyun; Park, Sang -Il; Yu, Chang -Jae; Ko, Heung Cho; Stoykovich, Mark; Yoon, Jongseung

    2015-08-25

    Provided are optical devices and systems fabricated, at least in part, via printing-based assembly and integration of device components. In specific embodiments the present invention provides light emitting systems, light collecting systems, light sensing systems and photovoltaic systems comprising printable semiconductor elements, including large area, high performance macroelectronic devices. Optical systems of the present invention comprise semiconductor elements assembled, organized and/or integrated with other device components via printing techniques that exhibit performance characteristics and functionality comparable to single crystalline semiconductor based devices fabricated using conventional high temperature processing methods. Optical systems of the present invention have device geometries and configurations, such as form factors, component densities, and component positions, accessed by printing that provide a range of useful device functionalities. Optical systems of the present invention include devices and device arrays exhibiting a range of useful physical and mechanical properties including flexibility, shapeability, conformability and stretchability.

  12. Optical systems fabricated by printing-based assembly

    DOEpatents

    Rogers, John; Nuzzo, Ralph; Meitl, Matthew; Menard, Etienne; Baca, Alfred; Motala, Michael; Ahn, Jong-Hyun; Park, Sang-Il; Yu, Chang-Jae; Ko, Heung Cho; Stoykovich, Mark; Yoon, Jongseung

    2017-03-21

    Provided are optical devices and systems fabricated, at least in part, via printing-based assembly and integration of device components. In specific embodiments the present invention provides light emitting systems, light collecting systems, light sensing systems and photovoltaic systems comprising printable semiconductor elements, including large area, high performance macroelectronic devices. Optical systems of the present invention comprise semiconductor elements assembled, organized and/or integrated with other device components via printing techniques that exhibit performance characteristics and functionality comparable to single crystalline semiconductor based devices fabricated using conventional high temperature processing methods. Optical systems of the present invention have device geometries and configurations, such as form factors, component densities, and component positions, accessed by printing that provide a range of useful device functionalities. Optical systems of the present invention include devices and device arrays exhibiting a range of useful physical and mechanical properties including flexibility, shapeability, conformability and stretchability.

  13. System on a chip with MPEG-4 capability

    NASA Astrophysics Data System (ADS)

    Yassa, Fathy; Schonfeld, Dan

    2002-12-01

    Current products supporting video communication applications rely on existing computer architectures. RISC processors have been used successfully in numerous applications over several decades. DSP processors have become ubiquitous in signal processing and communication applications. Real-time applications such as speech processing in cellular telephony rely extensively on the computational power of these processors. Video processors designed to implement the computationally intensive codec operations have also been used to address the high demands of video communication applications (e.g., cable set-top boxes and DVDs). This paper presents an overview of a system-on-chip (SOC) architecture used for real-time video in wireless communication applications. The SOC specifications address the system requirements imposed by the application environment. A CAM-based video processor is used to accelerate data-intensive video compression tasks such as motion estimation and filtering. Other components are dedicated to system-level data processing and audio processing. A rich set of I/Os allows the SOC to communicate with other system components such as baseband and memory subsystems.
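
    The motion estimation task that the CAM-based processor accelerates is, at its core, a block-matching search: for each block of the current frame, find the displacement into the reference frame that minimizes a cost such as the sum of absolute differences (SAD). The following sketch is a plain full-search implementation for illustration (the hardware version is massively parallel; frame contents and parameters here are invented):

```python
def sad(block_a, block_b):
    """Sum of absolute differences between two equally sized blocks."""
    return sum(abs(a - b) for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def block(frame, top, left, size):
    """Extract a size x size block from a frame (list of pixel rows)."""
    return [row[left:left + size] for row in frame[top:top + size]]

def motion_vector(ref, cur, top, left, size=4, search=2):
    """Full search over +/-search pixels; returns the (dy, dx) displacement
    into the reference frame that best matches the current-frame block."""
    target = block(cur, top, left, size)
    best = None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            # Skip candidate positions that fall outside the reference frame.
            if 0 <= y and 0 <= x and y + size <= len(ref) and x + size <= len(ref[0]):
                cost = sad(block(ref, y, x, size), target)
                if best is None or cost < best[0]:
                    best = (cost, dy, dx)
    return best[1], best[2]
```

    Full search evaluates (2*search+1)^2 candidates per block, which is exactly the data-intensive inner loop that dedicated video processors exist to parallelize.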

  14. Online damage inspection of optics for ATP system

    NASA Astrophysics Data System (ADS)

    Chen, Jing; Jiang, Yu; Mao, Yao; Gan, Xun; Liu, Qiong

    2016-09-01

    In the electro-optical acquisition-tracking-pointing (ATP) system, optical components become damaged under several influencing factors, and the damage growth rate increases sharply once the damage reaches a certain extent. Because of the complex processing techniques and long processing cycles of optical components, such damage greatly increases system development cost and time. It is therefore important to detect laser-induced damage in the ATP system. At present, most research on on-line damage detection for optical components targets large optical systems; the corresponding detection systems have complicated structures with many components and require reserved installation space, which makes them unsuitable for ATP systems. To solve this problem, this paper uses a machine-vision method to detect damage on-line in an existing ATP system. First, a CCD camera and a PC are used for image acquisition. Second, smoothing filters suppress false damage points produced by noise. Then, exploiting the shape features in the damage image, the Otsu method, which determines the best segmentation threshold automatically, is used to locate the damage regions. Finally, analysis of the damage data, such as damage area and damage position, supports estimates of the lifetime of the optical components. The method requires few detectors and a simple structure, and can be installed without any changes to the original light path. Experimental results show that the method is stable and effective for on-line detection of optical component damage in the ATP system.
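
    The Otsu method referred to above selects the gray level that maximizes the between-class variance of the pixels below and above the threshold. A minimal histogram-based implementation (the pixel data in the usage example is invented for illustration) can be sketched as:

```python
def otsu_threshold(pixels, levels=256):
    """Return the gray level maximizing between-class variance (Otsu's method).
    Pixels at or below the returned level form the 'background' class."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = float(sum(i * h for i, h in enumerate(hist)))
    best_t, best_var = 0, -1.0
    w0, sum0 = 0, 0.0       # running count and intensity mass of class 0
    for t in range(levels):
        w0 += hist[t]
        if w0 == 0:
            continue        # class 0 still empty
        w1 = total - w0
        if w1 == 0:
            break           # class 1 empty; no further valid thresholds
        sum0 += t * hist[t]
        mu0 = sum0 / w0
        mu1 = (total_sum - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

    On a clearly bimodal image (e.g. dark substrate vs. bright damage sites) the maximizing threshold falls between the two intensity clusters, which is why no manual tuning is needed.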

  15. Capture of carbon dioxide by hybrid sorption

    DOEpatents

    Srinivasachar, Srivats

    2014-09-23

    A composition, process and system for capturing carbon dioxide from a combustion gas stream. The composition has a particulate porous support medium that has a high volume of pores, an alkaline component distributed within the pores and on the surface of the support medium, and water adsorbed on the alkaline component, wherein the proportion of water in the composition is between about 5% and about 35% by weight of the composition. The process and system contemplate contacting the sorbent and the flowing gas stream together at a temperature and for a time such that some water remains adsorbed in the alkaline component when the contact of the sorbent with the flowing gas ceases.

  16. Fabrication technology

    NASA Astrophysics Data System (ADS)

    1988-05-01

    Many laboratory programs continue to need optical components of ever-increasing size and accuracy. Unfortunately, optical surfaces produced by the conventional sequence of grinding, lapping, and polishing can become prohibitively expensive. Research in the Fabrication Technology area focuses on methods of fabricating components with heretofore unrealized levels of precision. In FY87, researchers worked to determine the fundamental mechanical limits of material removal, experimented with unique material removal and deposition processes, developed servo systems for controlling the geometric position of ultraprecise machine tools, and advanced the ability to precisely measure contoured workpieces. Continued work in these areas will lead to more cost-effective processes to fabricate even higher quality optical components for advanced lasers and for visible, ultraviolet, and X-ray diagnostic systems.

  17. Advances and Computational Tools towards Predictable Design in Biological Engineering

    PubMed Central

    2014-01-01

    The design process of complex systems in all fields of engineering requires a set of quantitatively characterized components and a method to predict the output of systems composed of such elements. This strategy relies on the modularity of the components used, or on the prediction of their context-dependent behaviour when a part's function depends on its specific context. Mathematical models usually support the whole process by guiding the selection of parts and by predicting the output of interconnected systems. Such a bottom-up design process cannot be trivially adopted for biological systems engineering, since part function is hard to predict when components are reused in different contexts. This issue and the intrinsic complexity of living systems limit the capability of synthetic biologists to predict the quantitative behaviour of biological systems. The high potential of synthetic biology strongly depends on the capability of mastering this issue. This review discusses the predictability issues of basic biological parts (promoters, ribosome binding sites, coding sequences, transcriptional terminators, and plasmids) when used to engineer simple and complex gene expression systems in Escherichia coli. A comparison between bottom-up and trial-and-error approaches is performed for all the discussed elements, and mathematical models supporting the prediction of part behaviour are illustrated. PMID:25161694
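
    The bottom-up prediction the review describes can be illustrated with the simplest common model: a two-stage transcription/translation system in which an inducible promoter is described by a Hill function, and promoter and RBS strengths enter as multiplicative parameters. All parameter names and values below are hypothetical placeholders for illustration, not measured part characterizations.

```python
def hill(inducer, k_half, n):
    """Hill activation function for an inducible promoter (0..1)."""
    return inducer ** n / (k_half ** n + inducer ** n)

def steady_state_protein(promoter_strength, rbs_strength, inducer,
                         k_half=10.0, n=2.0, mrna_deg=0.2, prot_deg=0.02):
    """Steady state of dm/dt = promoter*hill - mrna_deg*m and
    dp/dt = rbs*m - prot_deg*p, i.e. m* = promoter*hill/mrna_deg and
    p* = rbs*m*/prot_deg. Units are arbitrary."""
    mrna = promoter_strength * hill(inducer, k_half, n) / mrna_deg
    return rbs_strength * mrna / prot_deg
```

    The predictability problem discussed in the review is precisely that `promoter_strength` and `rbs_strength` measured in one genetic context often fail to transfer unchanged to another, so the multiplicative composition above is an idealization.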

  18. Maximum flow-based resilience analysis: From component to system

    PubMed Central

    Jin, Chong; Li, Ruiying; Kang, Rui

    2017-01-01

    Resilience, the ability to withstand disruptions and recover quickly, must be considered during system design because any disruption of the system may cause considerable loss, including economic and societal losses. This work develops analytic maximum flow-based resilience models for series and parallel systems using Zobel’s resilience measure. The two analytic models can be used to evaluate quantitatively and compare the resilience of systems with the corresponding performance structures. For systems with identical components, the resilience of the parallel system increases with increasing number of components, while the resilience remains constant in the series system. A Monte Carlo-based simulation method is also provided to verify the correctness of our analytic resilience models and to analyze the resilience of networked systems based on that of components. A road network example is used to illustrate the analysis process, and the resilience comparison among networks with different topologies but the same components indicates that a system with redundant performance is usually more resilient than one without redundant performance. However, not all redundant capacity of components improves system resilience; the effectiveness of the capacity redundancy depends on where the redundant capacity is located. PMID:28545135
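
    The maximum-flow view of resilience can be sketched concretely. The example below computes max flow with a plain Edmonds-Karp search and then expresses resilience simply as the ratio of post-disruption to nominal max flow; this ratio is a simplified stand-in for Zobel's measure (which also accounts for recovery time), and the network and capacities are invented for illustration.

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp maximum flow. `capacity` is a dict of dicts {u: {v: cap}}."""
    # Build the residual graph, adding zero-capacity reverse edges.
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u, nbrs in capacity.items():
        for v in nbrs:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # BFS for a shortest augmenting path from source to sink.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        # Recover the path, push the bottleneck amount along it.
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= push
            residual[v][u] += push
        flow += push

def flow_resilience(capacity, degraded, source, sink):
    """Ratio of post-disruption to nominal max flow (1.0 = no loss)."""
    return max_flow(degraded, source, sink) / max_flow(capacity, source, sink)
```

    Degrading different edges by the same amount generally yields different ratios, which is the paper's point that the effectiveness of redundant capacity depends on where it sits relative to the network's bottleneck cut.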

  19. The architecture and dynamics of developing mind: experiential structuralism as a frame for unifying cognitive developmental theories.

    PubMed

    Demetriou, A; Efklides, A; Platsidou, M

    1993-01-01

    This Monograph presents a theory of cognitive development. The theory argues that the mind develops across three fronts. The first refers to a general processing system that defines the general potentials of mind to develop cognitive strategies and skills. The second refers to a hypercognitive system that governs self-understanding and self-regulation. The third involves a set of specialized structural systems (SSSs) that are responsible for the representation and processing of different reality domains. There are specific forces that are responsible for this organization of mind. These are expressed in the Monograph in terms of a set of five organizational principles. The developmental course of the major systems is outlined. Developmental change is ascribed by the theory to the interaction between the various systems. Different types of development require different change mechanisms. Five studies are presented that provide empirical support for these postulates. Study 1 demonstrated the organizational power of principles and SSSs. Study 2 showed that the SSSs constrain the effect of learning. Study 3 established that the hypercognitive system does function as the interface between tasks and SSS-specific processes or between SSSs and general cognitive functions such as attention and memory. Study 4 investigated the relations between one of the components of the processing system, storage, and two different SSSs expressed via two different symbolic systems, namely, the numeric and the imaginal. Finally, Study 5 examined the interaction between the components of the processing system and the relations between each of these components and one SSS, namely, the quantitative-relational SSS. The theoretical implications of these studies with regard to general issues, such as the nature of representation, the causation of cognitive change, and individual differences in cognitive development, are discussed in the concluding chapter.

  20. Automation of NDE on RSRM Metal Components

    NASA Technical Reports Server (NTRS)

    Hartman, John; Kirby, Mark; McCool, Alex (Technical Monitor)

    2002-01-01

    An automated eddy current system has been designed and built, and is being implemented to inspect RSRM (Space Shuttle) metal components. The system provides a significant increase in inspection reliability, as well as other benefits such as data storage, chemical waste reduction and reduction in overall process time. This paper is in viewgraph form.

  1. Teachers Learning to Prepare Future Engineers: A Systemic Analysis Through Five Components of Development and Transfer

    ERIC Educational Resources Information Center

    Hardré, Patricia L.; Ling, Chen; Shehab, Randa L.; Nanny, Mark A.; Refai, Hazem; Nollert, Matthias U.; Ramseyer, Christopher; Wollega, Ebisa D.; Huang, Su-Min; Herron, Jason

    2018-01-01

    This study used a systemic perspective to examine a five-component experiential process of perceptual and developmental growth, and transfer-to-teaching. Nineteen secondary math and science teachers participated in a year-long, engineering immersion and support experience, with university faculty mentors. Teachers identified critical shifts in…

  2. Focusing on the Complexity of Emotion Issues in Academic Learning: A Dynamical Component Systems Approach

    ERIC Educational Resources Information Center

    Eynde, Peter Op 't; Turner, Jeannine E.

    2006-01-01

    Understanding the interrelations among students' cognitive, emotional, motivational, and volitional processes is an emerging focus in educational psychology. A dynamical, component systems theory of emotions is presented as a promising framework to further unravel these complex interrelations. This framework considers emotions to be a process…

  3. System and process for aluminization of metal-containing substrates

    DOEpatents

    Chou, Yeong-Shyung; Stevenson, Jeffry W.

    2017-12-12

    A system and method are detailed for aluminizing surfaces of metallic substrates, parts, and components with a protective alumina layer in-situ. Aluminum (Al) foil sandwiched between the metallic components and a refractory material when heated in an oxidizing gas under a compression load at a selected temperature forms the protective alumina coating on the surface of the metallic components. The alumina coating minimizes evaporation of volatile metals from the metallic substrates, parts, and components in assembled devices that can degrade performance during operation at high temperature.

  4. System and process for aluminization of metal-containing substrates

    DOEpatents

    Chou, Yeong-Shyung; Stevenson, Jeffry W

    2015-11-03

    A system and method are detailed for aluminizing surfaces of metallic substrates, parts, and components with a protective alumina layer in-situ. Aluminum (Al) foil sandwiched between the metallic components and a refractory material when heated in an oxidizing gas under a compression load at a selected temperature forms the protective alumina coating on the surface of the metallic components. The alumina coating minimizes evaporation of volatile metals from the metallic substrates, parts, and components in assembled devices that can degrade performance during operation at high temperature.

  5. Utilization of non-conventional systems for conversion of biomass to food components: Potential for utilization of algae in engineered foods

    NASA Technical Reports Server (NTRS)

    Karel, M.; Kamarei, A. R.; Nakhost, Z.

    1985-01-01

    The major nutritional components of the green algae (Scenedesmus obliquus) grown in a Constant Cell density Apparatus were determined. Suitable methodology to prepare proteins from which three major undesirable components of these cells (i.e., cell walls, nucleic acids, and pigments) were either removed or substantially reduced was developed. Results showed that processing of green algae to protein isolate enhances its potential nutritional and organoleptic acceptability as a diet component in a Controlled Ecological Life Support System.

  6. Microwave components for cellular portable radiotelephone

    NASA Astrophysics Data System (ADS)

    Muraguchi, Masahiro; Aikawa, Masayoshi

    1995-09-01

    Mobile and personal communication systems are expected to represent a huge market for microwave components in the coming years. A number of components in silicon bipolar, silicon Bi-CMOS, GaAs MESFET, HBT and HEMT technologies are now becoming available for system application. There are tradeoffs among the competing technologies with regard to performance, cost, reliability and time-to-market. This paper describes process selection and the cost and RF performance requirements for microwave semiconductor components for digital cellular and cordless telephones. Furthermore, new circuit techniques developed by NTT are presented.

  7. Mayo clinical Text Analysis and Knowledge Extraction System (cTAKES): architecture, component evaluation and applications

    PubMed Central

    Masanz, James J; Ogren, Philip V; Zheng, Jiaping; Sohn, Sunghwan; Kipper-Schuler, Karin C; Chute, Christopher G

    2010-01-01

    We aim to build and evaluate an open-source natural language processing system for information extraction from electronic medical record clinical free-text. We describe and evaluate our system, the clinical Text Analysis and Knowledge Extraction System (cTAKES), released open-source at http://www.ohnlp.org. The cTAKES builds on existing open-source technologies—the Unstructured Information Management Architecture framework and OpenNLP natural language processing toolkit. Its components, specifically trained for the clinical domain, create rich linguistic and semantic annotations. Performance of individual components: sentence boundary detector accuracy=0.949; tokenizer accuracy=0.949; part-of-speech tagger accuracy=0.936; shallow parser F-score=0.924; named entity recognizer and system-level evaluation F-score=0.715 for exact and 0.824 for overlapping spans, and accuracy for concept mapping, negation, and status attributes for exact and overlapping spans of 0.957, 0.943, 0.859, and 0.580, 0.939, and 0.839, respectively. Overall performance is discussed against five applications. The cTAKES annotations are the foundation for methods and modules for higher-level semantic processing of clinical free-text. PMID:20819853

  8. MOEMs, key optical components for future astronomical instrumentation in space

    NASA Astrophysics Data System (ADS)

    Zamkotsian, Frédéric; Dohlen, Kjetil; Burgarella, Denis; Ferrari, Marc; Buat, Veronique

    2017-11-01

    Based on micro-electronics fabrication processes, Micro-Opto-Electro-Mechanical Systems (MOEMS) are under study for integration into next-generation astronomical instruments and telescopes, especially for space missions. The main advantages of micro-optical components are their compactness, scalability, and specific task customization using elementary building blocks, and they allow remote control. As these systems are easily replicable, the price of the components decreases dramatically as their number increases. The two major applications of MOEMS are Multi-Object Spectroscopy masks and Deformable Mirror systems.

  9. A Stakeholder-Based System Dynamics Model of Return-to-Work: A Research Protocol.

    PubMed

    Jetha, Arif; Pransky, Glenn; Fish, Jon; Jeffries, Susan; Hettinger, Lawrence J

    2015-07-16

    Returning to work following a job-related injury or illness can be a complex process, influenced by a range of interrelated personal, psychosocial, and organizational components. System dynamics modelling (SDM) takes a sociotechnical systems perspective to view return-to-work (RTW) as a system made up of multiple feedback relationships between influential components. To build the RTW SDM, a mixed-method approach will be used. The first stage, which has already been completed, involved creating a baseline model using key informant interviews. Second, in two manufacturing companies, stakeholder-based models will be developed through interviews and focus groups with senior management, frontline workers, and frontline supervisors. Participants will be asked about the RTW process in general and more targeted questions regarding influential components. Participants will also be led through a reference mode exercise where they will be asked to estimate the direction, shape and magnitude of relationships between influential components. Data will be entered into the software program Vensim, which provides a platform for visualizing system structure and simulating the effects of adapting components. Finally, preliminary model validity testing will be conducted to provide insights on model generalizability and sensitivity. The proposed methodology will create a SDM of the RTW process using feedback relationships of influential components. It will also provide an important simulation tool to understand the system behaviour that underlies complex RTW cases, and examine anticipated and unanticipated consequences of disability management policies. Significance for public health: While the incidence of occupational injuries and illnesses has declined over the past two decades, the proportion resulting in sickness absence has actually increased.
Implementing strategies to address sickness absences and promote return-to-work (RTW) can significantly benefit physical and mental health, and work outcomes like worker engagement, job satisfaction and job strain. As a key social determinant of health, participation in paid work can also ensure that work-disabled individuals generate income necessary for access to housing, education, food, and social services that also benefit health. Improving RTW outcomes can also have significant societal benefits such as a reduction in workers compensation costs, increased economic activity and less burden on social assistance programs. Despite its benefits, returning to work after injury or illness is not a straightforward process and can be complicated by the individual, psychosocial, organizational and regulatory components that influence a disabled person's ability to resume work activities.
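
    The stock-and-flow structure that tools like Vensim simulate amounts to Euler integration of coupled rates with feedback. The sketch below is a deliberately tiny, hypothetical two-stock model (absent vs. returned workers, with a feedback loop in which a growing caseload erodes organizational support) and is not the protocol's calibrated model; every parameter is invented for illustration.

```python
def simulate_rtw(weeks=52, dt=1.0, injury_rate=2.0, base_recovery=0.08,
                 support_gain=0.5):
    """Euler integration of a two-stock return-to-work model.
    Stocks: absent workers and returned workers. The recovery flow is
    throttled by a support factor that declines as the caseload grows."""
    absent, returned = 20.0, 0.0
    history = []
    for _ in range(int(weeks / dt)):
        # Feedback loop: organizational support saturates with caseload.
        support = 1.0 / (1.0 + support_gain * absent / 10.0)
        recovery_flow = base_recovery * support * absent  # workers per week
        absent += (injury_rate - recovery_flow) * dt
        returned += recovery_flow * dt
        history.append((absent, returned))
    return history
```

    With these (invented) parameters the recovery capacity saturates below the injury inflow, so the absent caseload grows over the year; raising `base_recovery` or weakening `support_gain` reverses that, which is the kind of policy "what-if" such models are built for.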

  10. Creation of system of computer-aided design for technological objects

    NASA Astrophysics Data System (ADS)

    Zubkova, T. M.; Tokareva, M. A.; Sultanov, N. Z.

    2018-05-01

    Due to competition in the process equipment market, production must be flexible, retooling for various product configurations, raw materials, and productivity levels depending on current market needs. This is not possible without computer-aided design (CAD). The formation of a CAD system begins with planning. Synthesis, analysis, evaluation, and conversion operations, as well as visualization and decision-making operations, can be automated. Based on a formal description of the design procedures, the design route is constructed as an oriented graph. Decomposing the design process, represented by the formalized description of the design procedures, makes it possible to make an informed choice of CAD components for the task at hand. The object-oriented approach allows the CAD system to be considered an independent system whose properties are inherited from its components. The first step determines the range of tasks to be performed by the system and a set of components for their implementation; the second is the configuration of the selected components. Interaction between the selected components is carried out using the CALS standards. The chosen CAD/CAE-oriented approach allows a single model to be created and stored in the subject-area database. Each integration stage is implemented as a separate functional block. The transformation of the CAD model into the internal representation is performed by a block that searches for the geometric parameters of the technological machine, in which an XML model of the construction is obtained using the feature method from image recognition theory. The configuration of integrated components is divided into three consecutive steps: configuring tasks, components, and interfaces. The components are configured using soft computing, specifically the Mamdani fuzzy inference algorithm.
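
    Mamdani fuzzy inference, mentioned as the component-configuration mechanism, follows a fixed recipe: fuzzify the inputs, take the min of each rule's antecedent memberships, clip each rule's output set at that activation, aggregate with max, and defuzzify by centroid. The two-rule controller below is a self-contained sketch of that recipe; the variables, membership functions, and rules are invented for illustration and are not from the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function rising on [a, b] and falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def mamdani(load, speed):
    """Two-rule Mamdani inference with centroid defuzzification:
       R1: IF load is high AND speed is low  THEN power is high
       R2: IF load is low  AND speed is high THEN power is low"""
    # Rule activations: min over each rule's antecedent memberships.
    r1 = min(tri(load, 40, 80, 120), tri(speed, -20, 10, 40))
    r2 = min(tri(load, -20, 20, 60), tri(speed, 20, 60, 100))
    # Aggregate the clipped output sets (max) and take a discretized centroid.
    num = den = 0.0
    for p in range(0, 101):
        mu = max(min(r1, tri(p, 50, 80, 110)), min(r2, tri(p, -10, 20, 50)))
        num += p * mu
        den += mu
    return num / den if den else 0.0
```

    A high-load/low-speed input activates R1 and lands near the "high power" peak, while a low-load/high-speed input activates R2 and lands near the "low power" peak; intermediate inputs blend the two output sets smoothly.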

  11. System principles, mathematical models and methods to ensure high reliability of safety systems

    NASA Astrophysics Data System (ADS)

    Zaslavskyi, V.

    2017-04-01

    Modern safety and security systems are composed of a large number of components designed for detection, localization, tracking, collection, and processing of information from monitoring, telemetry, and control systems. They are required to be highly reliable so that data aggregation, processing, and analysis are performed correctly for subsequent decision-making support. During the design and construction phases of such systems, various types of components (elements, devices, and subsystems) are considered and used to ensure highly reliable signal detection, noise isolation, and reduction of erroneous commands. When generating design solutions for highly reliable systems, a number of restrictions and conditions, such as the available component types and various resource constraints, must be considered. Different component types can perform identical functions while being implemented with diverse principles and approaches, and they have distinct technical and economic indicators such as cost or power consumption. The systematic use of different component types increases the probability of successful task performance and mitigates common-cause failures. We consider the type-variety principle as an engineering principle of system analysis, present mathematical models based on this principle, and describe algorithms for solving the optimization problems arising in the design of highly reliable safety and security systems. The mathematical models are formalized as a class of two-level discrete optimization problems of large dimension. The proposed approach, mathematical models, and algorithms can be used to solve optimal redundancy problems based on a variety of methods and control devices for fault and defect detection in technical systems, telecommunication networks, and energy systems.
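
    The flavor of the type-variety principle can be shown with a toy allocation problem: choose one component type per redundant slot to maximize a score that rewards parallel reliability but penalizes repeated types (a crude proxy for common-cause failure risk), subject to a cost budget. This brute-force sketch, including the scoring rule and all numbers, is an invented illustration, not the paper's two-level discrete optimization model.

```python
from itertools import product

def best_diverse_redundancy(component_types, budget, slots=3):
    """Brute-force choice of one component type per redundant slot.
    Score = parallel reliability minus a penalty for repeated types."""
    best = None
    for combo in product(component_types, repeat=slots):
        cost = sum(c["cost"] for c in combo)
        if cost > budget:
            continue
        # Parallel reliability: 1 minus the product of failure probabilities.
        unrel = 1.0
        for c in combo:
            unrel *= 1.0 - c["reliability"]
        # Penalize repeated types as a stand-in for common-cause failure risk.
        distinct = len({c["name"] for c in combo})
        score = (1.0 - unrel) - 0.05 * (slots - distinct)
        if best is None or score > best[0]:
            best = (score, combo, cost)
    return best
```

    Even though three copies of the cheapest type may offer comparable nominal parallel reliability, the diversity penalty steers the search toward mixed-type solutions, which is the qualitative behavior the type-variety principle formalizes.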

  12. Laser materials processing of complex components: from reverse engineering via automated beam path generation to short process development cycles

    NASA Astrophysics Data System (ADS)

    Görgl, Richard; Brandstätter, Elmar

    2017-01-01

    The article presents an overview of what is possible nowadays in the field of laser materials processing. The state of the art in the complete process chain is shown, starting with the generation of a specific component's CAD data and continuing with the automated motion path generation for the laser head carried by a CNC or robot system. Application examples from laser cladding and laser-based additive manufacturing are given.

  13. The application of digital techniques to the analysis of metallurgical experiments

    NASA Technical Reports Server (NTRS)

    Rathz, T. J.

    1977-01-01

    The application of a specific digital computer system (known as the Image Data Processing System) to the analysis of three NASA-sponsored metallurgical experiments is discussed in some detail. The basic hardware and software components of the Image Data Processing System are presented. Many figures are presented in the discussion of each experimental analysis in an attempt to show the accuracy and speed that the Image Data Processing System affords in analyzing photographic images dealing with metallurgy, and in particular with material processing.

  14. Mass exchange in an experimental new-generation life support system model based on biological regeneration of environment.

    PubMed

    Tikhomirov, A A; Ushakova, S A; Manukovsky, N S; Lisovsky, G M; Kudenko, Yu A; Kovalev, V S; Gubanov, V G; Barkhatov, Yu V; Gribovskaya, I V; Zolotukhin, I G; Gros, J B; Lasseur, Ch

    2003-01-01

    An experimental model of a biological life support system was used to evaluate qualitative and quantitative parameters of the internal mass exchange. The photosynthesizing unit included the higher plant component (wheat and radish), and the heterotrophic unit consisted of a soil-like substrate (SLS), California worms, mushrooms and microbial microflora. The gas mass exchange involved evolution of oxygen by the photosynthesizing component and its uptake by the heterotrophic component, along with the formation and maintenance of the SLS structure, growth of mushrooms and California worms, human respiration, and some other processes. Human presence in the system took the form of a "virtual human" that at regular intervals took part in the respiratory gas exchange during the experiment. Experimental data demonstrated good oxygen/carbon dioxide balance, and the closure of the cycles of these gases was almost complete. The water cycle was nearly 100% closed. The main components in the water mass exchange were transpiration water and the watering solution with mineral elements. Human consumption of the edible plant biomass (grains and roots) was simulated by processing these products by a unique physicochemical method of oxidizing them to inorganic mineral compounds, which were then returned into the system and fully assimilated by the plants. The oxidation was achieved by "wet combustion" of organic biomass, using hydrogen peroxide following a special procedure that does not require high temperature or pressure. The hydrogen peroxide is produced from the water inside the system. The closure of the cycle was estimated for individual elements and compounds. Stoichiometric proportions are given for the main components included in the experimental model of the system. Approaches to the mathematical modeling of the cycling processes are discussed, using the data of the experimental model.
Nitrogen, as a representative of biogenic elements, shows an almost 100% closure of its cycle inside the system. The proposed experimental model of a biological system is discussed as a candidate for potential application in investigations aimed at creating ecosystems with largely closed cycles of the internal mass exchange. The formation and maintenance of sustainable cycling of vitally important chemical elements and compounds in biological life support systems (BLSS) is an extremely pressing problem. To attain the stable functioning of BLSS and to maintain a high degree of closure of material cycles in them, it is essential to understand the character of mass exchange processes and the stoichiometric proportions of the initial and synthesized components of the system. © 2003 COSPAR. Published by Elsevier Science Ltd. All rights reserved.

  15. A Core Plug and Play Architecture for Reusable Flight Software Systems

    NASA Technical Reports Server (NTRS)

    Wilmot, Jonathan

    2006-01-01

    The Flight Software Branch at Goddard Space Flight Center (GSFC) has been working on a run-time approach to facilitate a formal software reuse process. The reuse process is designed to enable rapid development and integration of high-quality software systems and to more accurately predict development costs and schedules. Previous reuse practices have been somewhat successful when the same teams are moved from project to project, but this typically requires adopting the software system in an all-or-nothing fashion, where useful components cannot be easily extracted from the whole. As a result, the system is less flexible and scalable, with limited applicability to new projects. This paper will focus on the rationale behind, and implementation of, the run-time executive. This executive is the core of the component-based flight software commonality and reuse process adopted at Goddard.

  16. Nonterrestrial material processing and manufacturing of large space systems

    NASA Technical Reports Server (NTRS)

    Vontiesenhausen, G. F.

    1978-01-01

    An attempt is made to provide pertinent and readily usable information on the extraterrestrial processing of materials and manufacturing of components and elements of these planned large space systems from preprocessed lunar materials which are made available at a processing and manufacturing site in space. Required facilities, equipment, machinery, energy and manpower are defined.

  17. Commercial Off-the-Shelf (COTS) Components and Enterprise Component Information System (eCIS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Minihan; Ed Schmidt; Greg Enserro

    The purpose of the project was to develop the processes for using commercial off-the-shelf (COTS) parts for WR production and to put in place a system for implementing the data management tools required to disseminate, store, track procurement, and qualify vendors. Much of the effort was devoted to determining if the use of COTS parts was possible. A basic question: How does the Nuclear Weapons Complex (NWC) begin to use COTS in the weapon Stockpile Life Extension Programs with high reliability and affordability, while managing risk at acceptable levels? In FY00, it was determined that a certain weapon refurbishment program could not be accomplished without the use of COTS components. The elements driving the use of COTS components included decreased cost, greater availability, and shorter delivery time. Key factors that required implementation included identifying the best suppliers and components; defining life cycles and predictions of obsolescence; testing the feasibility of using COTS components with a test contractor to ensure capability, as well as quality and reliability; and implementing the data management tools required to disseminate, store, track procurement, and qualify vendors. The primary effort of this project was therefore to concentrate on the risks involved in the use of COTS and to address the issues of part and vendor selection, procurement and acceptance processes, and qualification of the parts via part and sample testing. The Enterprise Component Information System (eCIS) was used to manage the information generated by the COTS process. eCIS is a common interface for both the design and production of NWC components and systems, integrating information between Sandia National Laboratories (SNL) and the Kansas City Plant (KCP). The implementation of COTS components utilizes eCIS from part selection through qualification release. All part-related data is linked across an unclassified network for access by both SNL and KCP personnel.
The system includes not only NWC part information but also technical reference data for over 25 million electronic and electromechanical commercial and military parts via a data subscription. With the capabilities added to the system through this project, eCIS provides decision support, parts list/BOM analysis, editing, tracking, workflows, reporting, and history/legacy information, integrating manufacturer reference, company technical, company business, and design data.

  18. Novel Applications of Rapid Prototyping in Gamma-ray and X-ray Imaging

    PubMed Central

    Miller, Brian W.; Moore, Jared W.; Gehm, Michael E.; Furenlid, Lars R.; Barrett, Harrison H.

    2010-01-01

    Advances in 3D rapid-prototyping printers, 3D modeling software, and casting techniques allow for the fabrication of cost-effective, custom components in gamma-ray and x-ray imaging systems. Applications extend to new fabrication methods for custom collimators, pinholes, calibration and resolution phantoms, mounting and shielding components, and imaging apertures. Details of the fabrication process for these components are presented, specifically the 3D printing process, cold casting with a tungsten epoxy, and lost-wax casting in platinum. PMID:22984341

  19. Optical fiber technology for space: challenges of development and qualification

    NASA Astrophysics Data System (ADS)

    Goepel, Michael

    2017-11-01

    Using fiber-optical components and assemblies for space flight applications brings several challenges for the design and qualification process. Good knowledge of the system and environmental requirements is needed to derive design decisions and select suitable components for the fiber-optical subsystem. Furthermore, manufacturing-process and integration limitations provide additional constraints, which have to be considered at the beginning of the design phase. Besides commercial off-the-shelf (COTS) components, custom-made parts are often necessary.

  20. A qualitative assessment of a community pharmacy cognitive pharmaceutical services program, using a work system approach.

    PubMed

    Chui, Michelle A; Mott, David A; Maxwell, Leigh

    2012-01-01

    Although lack of time, trained personnel, and reimbursement have been identified as barriers to pharmacists providing cognitive pharmaceutical services (CPS) in community pharmacies, the underlying contributing factors of these barriers have not been explored. One approach to better understand barriers and facilitators to providing CPS is to use a work system approach to examine different components of a work system and how the components may impact care processes. The goals of this study were to identify and describe pharmacy work system characteristics that pharmacists identified and changed to provide CPS in a demonstration program. A qualitative approach was used for data collection. A purposive sample of 8 pharmacists at 6 community pharmacies participating in a demonstration program was selected to be interviewed. Each semistructured interview was audio recorded and transcribed, and the text was analyzed in a descriptive and interpretive manner by 3 analysts. Themes were identified in the text and aligned with 1 of 5 components of the Systems Engineering Initiative for Patient Safety (SEIPS) work system model (organization, tasks, tools/technology, people, and environment). A total of 21 themes were identified from the interviews, and 7 themes were identified across all 6 interviews. The organization component of the SEIPS model contained the most (n=10) themes. Numerous factors within a pharmacy work system appear important to enable pharmacists to provide CPS. Leadership and foresight by the organization to implement processes (communication, coordination, planning, etc.) to facilitate providing CPS was a key finding across the interviews. Expanding technician responsibilities was reported to be essential for successfully implementing CPS. To be successful in providing CPS, pharmacists must be cognizant of the different components of the pharmacy work system and how these components influence providing CPS. Copyright © 2012 Elsevier Inc. 
All rights reserved.

  1. A qualitative assessment of a community pharmacy cognitive pharmaceutical services program, using a work system approach

    PubMed Central

    Chui, Michelle A.; Mott, David A.; Maxwell, Leigh

    2012-01-01

    Background Although lack of time, trained personnel, and reimbursement have been identified as barriers to pharmacists providing cognitive pharmaceutical services (CPS) in community pharmacies, the underlying contributing factors of these barriers have not been explored. One approach to better understand barriers and facilitators to providing CPS is to use a work system approach to examine different components of a work system and how the components may impact care processes. Objectives The goals of this study were to identify and describe pharmacy work system characteristics that pharmacists identified and changed to provide CPS in a demonstration program. Methods A qualitative approach was used for data collection. A purposive sample of 8 pharmacists at 6 community pharmacies participating in a demonstration program was selected to be interviewed. Each semistructured interview was audio recorded and transcribed, and the text was analyzed in a descriptive and interpretive manner by 3 analysts. Themes were identified in the text and aligned with 1 of 5 components of the Systems Engineering Initiative for Patient Safety (SEIPS) work system model (organization, tasks, tools/technology, people, and environment). Results A total of 21 themes were identified from the interviews, and 7 themes were identified across all 6 interviews. The organization component of the SEIPS model contained the most (n = 10) themes. Numerous factors within a pharmacy work system appear important to enable pharmacists to provide CPS. Leadership and foresight by the organization to implement processes (communication, coordination, planning, etc.) to facilitate providing CPS was a key finding across the interviews. Expanding technician responsibilities was reported to be essential for successfully implementing CPS. 
Conclusions To be successful in providing CPS, pharmacists must be cognizant of the different components of the pharmacy work system and how these components influence providing CPS. PMID:21824822

  2. Advanced optical sensing and processing technologies for the distributed control of large flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Williams, G. M.; Fraser, J. C.

    1991-01-01

    The objective was to examine state-of-the-art optical sensing and processing technology applied to controlling the motion of flexible spacecraft. Proposed large flexible space systems, such as optical telescopes and antennas, will require control over vast surfaces. Most likely, distributed control will be necessary, involving many sensors to accurately measure the surface; a similarly large number of actuators must act upon the system. The technical approach used included reviewing proposed NASA missions to assess system needs and requirements. A candidate mission was chosen as a baseline study spacecraft for comparison of conventional and optical control components. Control system requirements of the baseline system were used to design both a control system containing current off-the-shelf components and a system utilizing electro-optical devices for sensing and processing. State-of-the-art surveys of conventional sensor, actuator, and processor technologies were performed. A technology development plan is presented that lays out a logical, effective way to develop and integrate the advancing technologies.

  3. Laser beam soldering of micro-optical components

    NASA Astrophysics Data System (ADS)

    Eberhardt, R.

    2003-05-01

    MOTIVATION: Ongoing miniaturisation, higher requirements within optical assemblies, and the processing of temperature-sensitive components demand innovative selective joining techniques. So far, adhesive bonding has primarily been used to assemble and adjust hybrid micro-optical systems. However, the properties of the organic polymers used for the adhesives limit the application of these systems. In the fields of telecommunication and lithography, an enhancement of existing joining techniques is necessary to improve properties such as humidity resistance, laser stability, UV stability, thermal-cycle reliability, and lifetime reliability. Against this background, laser beam soldering of optical components is a reasonable alternative joining technology. Its properties, namely time- and area-restricted energy input, energy input controllable via the process temperature, the possibility of direct and indirect heating of the components, and the absence of mechanical contact between joining tool and components, provide good conditions for meeting the requirements of a joining technology for sensitive optical components. In addition to the laser soldering head, the assembly of optical components requires positioning units to adjust the position of the components with high accuracy before joining. Furthermore, suitable measurement methods to characterize the soldered assemblies (for instance in terms of position tolerances) need to be developed.

  4. Bonded polyimide fuel cell package and method thereof

    DOEpatents

    Morse, Jeffrey D.; Jankowski, Alan; Graff, Robert T.; Bettencourt, Kerry

    2005-11-01

    Described herein are processes for fabricating microfluidic fuel cell systems with embedded components in which micron-scale features are formed by bonding layers of DuPont Kapton™ polyimide laminate. A microfluidic fuel cell system fabricated using this process is also described.

  5. Effective Software Engineering Leadership for Development Programs

    ERIC Educational Resources Information Center

    Cagle West, Marsha

    2010-01-01

    Software is a critical component of systems ranging from simple consumer appliances to complex health, nuclear, and flight control systems. The development of quality, reliable, and effective software solutions requires the incorporation of effective software engineering processes and leadership. Processes, approaches, and methodologies for…

  6. Method of preparation of bonded polyimide fuel cell package

    DOEpatents

    Morse, Jeffrey D [Martinez, CA; Jankowski, Alan [Livermore, CA; Graff, Robert T [Modesto, CA; Bettencourt, Kerry [Dublin, CA

    2011-04-26

    Described herein are processes for fabricating microfluidic fuel cell systems with embedded components in which micron-scale features are formed by bonding layers of DuPont Kapton™ polyimide laminate. A microfluidic fuel cell system fabricated using this process is also described.

  7. A flexible continuous-variable QKD system using off-the-shelf components

    NASA Astrophysics Data System (ADS)

    Comandar, Lucian C.; Brunner, Hans H.; Bettelli, Stefano; Fung, Fred; Karinou, Fotini; Hillerkuss, David; Mikroulis, Spiros; Wang, Dawei; Kuschnerov, Maxim; Xie, Changsong; Poppe, Andreas; Peev, Momtchil

    2017-10-01

    We present the development of a robust and versatile CV-QKD architecture based on commercially available optical and electronic components. The system uses a pilot tone for phase synchronization with a local oscillator, as well as local feedback loops to mitigate frequency and polarization drifts. Transmit and receive-side digital signal processing is performed fully in software, allowing for rapid protocol reconfiguration. The quantum link is complemented with a software stack for secure-key processing, key storage and encrypted communication. All these features allow for the system to be at the same time a prototype for a future commercial product and a research platform.

  8. Cyber-Informed Engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Robert S.; Benjamin, Jacob; Wright, Virginia L.

    A continuing challenge for engineers who utilize digital systems is to understand the impact of cyber-attacks across the entire product and program lifecycle. This is a challenge due to the evolving nature of cyber threats, which may impact the design, development, deployment, and operational phases of all systems. Cyber-Informed Engineering is the process by which engineers are made aware both of how to use their engineering knowledge to improve the cyber security of the processes by which they architect and design components, and of the services and security of the components themselves.

  9. Microgravity Manufacturing Via Fused Deposition

    NASA Technical Reports Server (NTRS)

    Cooper, K. G.; Griffin, M. R.

    2003-01-01

    Manufacturing polymer hardware during space flight is currently outside the state of the art. A process called fused deposition modeling (FDM) can make this approach a reality by producing net-shaped components of polymer materials directly from a CAD model. FDM is a rapid prototyping process developed by Stratasys, Inc., which deposits a fine line of semi-molten polymer onto a substrate while moving via computer control to form the cross-sectional shape of the part it is building. The build platen is then lowered and the process is repeated, building a component layer by layer. This method enables net-shaped production of polymer components directly from a computer file. The layered manufacturing process allows for the manufacture of complex shapes and internal cavities otherwise impossible to machine. This task demonstrated the benefits of the FDM technique for quickly and inexpensively producing replacement components or repairing broken hardware in a Space Shuttle or Space Station environment. The intent of the task was to develop and fabricate an FDM system that was lightweight and compact and required minimal power to fabricate ABS plastic hardware in microgravity. The final product of the shortened task was a ground-based breadboard device demonstrating the miniaturization capability of the system.

  10. Security for safety critical space borne systems

    NASA Technical Reports Server (NTRS)

    Legrand, Sue

    1987-01-01

    The Space Station contains safety critical computer software components in systems that can affect life and vital property. These components require a multilevel secure system that provides dynamic access control of the data and processes involved. A study is under way to define requirements for a security model providing access control through level B3 of the Orange Book. The model will be prototyped at NASA-Johnson Space Center.

  11. The Characteristics of Earth System Thinking of Science Gifted Students in relation to Climate Changes

    NASA Astrophysics Data System (ADS)

    Chung, Duk Ho; Cho, Kyu Seong; Hong, Deok Pyo; Park, Kyeong Jin

    2016-04-01

    This study aimed to investigate the earth system thinking of science-gifted students engaged in future problem solving (FPS) in relation to climate change. For this study, a research problem associated with climate change was developed through a literature review. Thirty-seven science-gifted students participated in the lessons. The ideas produced by the students during the problem-solving process were analyzed using the semantic network analysis method. The results are as follows. In the problem-solving process, the students mentioned "changes of the sunlight by the water layer", "changes of the Earth's temperature", "changes of the air pressure", and "changes of the wind and weather", in that order. With regard to earth system thinking about climate change, the students frequently used subcomponents related to the atmosphere but made little use of subcomponents related to the biosphere, geosphere, and hydrosphere. However, an analysis of the structural relationships between the earth-system subcomponents showed that the biosphere, geosphere, and hydrosphere were recognized as very important in the network structures. In conclusion, the science-gifted students understood well that the components of the earth system influence each other. Keywords: science-gifted students, future problem solving, climate change, earth system thinking

  12. Personal Computer Transport Analysis Program

    NASA Technical Reports Server (NTRS)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for the analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air-scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated, so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components, i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components, and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vector has been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
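
    The inlet-dependency ordering described above amounts to a topological sort followed by an in-order update loop. The sketch below is a schematic analogue, not PCTAP's actual API; the component names and the `update` interface are invented.

```python
class Component:
    def __init__(self, name, inlets=()):
        self.name = name
        self.inlets = list(inlets)  # upstream components feeding this inlet

    def update(self, dt):
        # Placeholder for the component's outlet function.
        pass

def build_solution_vector(components):
    # Order components so each one appears after every component it
    # depends on for its inlet conditions (a topological sort).
    ordered, seen = [], set()

    def visit(comp):
        if comp.name in seen:
            return
        seen.add(comp.name)
        for upstream in comp.inlets:
            visit(upstream)
        ordered.append(comp)

    for comp in components:
        visit(comp)
    return ordered

def run(components, steps, dt):
    # Each time step, update components in solution-vector order.
    vector = build_solution_vector(components)
    for _ in range(steps):
        for comp in vector:
            comp.update(dt)
```

    Building the vector once at the start of the run and then cycling through it each time step mirrors the solution process the abstract describes.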

  13. Sensor fusion of phase measuring profilometry and stereo vision for three-dimensional inspection of electronic components assembled on printed circuit boards.

    PubMed

    Hong, Deokhwa; Lee, Hyunki; Kim, Min Young; Cho, Hyungsuck; Moon, Jeon Il

    2009-07-20

    Automatic optical inspection (AOI) for printed circuit board (PCB) assembly plays a very important role in modern electronics manufacturing industries. Well-developed inspection machines in each assembly process are required to ensure the manufacturing quality of electronics products. However, almost all AOI machines are based on 2D image-analysis technology. In this paper, a 3D-measurement-based AOI system is proposed, consisting of a phase-shifting profilometer and a stereo vision system, for electronic components on a PCB after component mounting and the reflow process. In this system, information from the two visual systems is fused to extend the shape-measurement range limited by the 2π phase ambiguity of the phase-shifting profilometer, while maintaining the profilometer's fine measurement resolution and high accuracy over the measurement range extended by stereo vision. The main purpose is to overcome the low inspection reliability of 2D-based inspection machines by using 3D information about the components. The 3D shape measurement results on PCB-mounted electronic components are shown and compared with results from contact and noncontact 3D measuring machines. Based on a series of experiments, the usefulness of the proposed sensor system and its fusion technique are discussed and analyzed in detail.
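
    The fusion idea, using the coarse stereo depth to resolve the integer fringe order of the wrapped phase, can be sketched in a few lines. The fringe period and heights below are illustrative numbers, not the paper's calibration.

```python
import math

def fuse_height(wrapped_phase, coarse_height, height_per_fringe):
    # Wrapped phase fixes the height modulo one fringe period; the coarse
    # stereo estimate selects the integer fringe order k.
    fine = (wrapped_phase / (2 * math.pi)) * height_per_fringe
    k = round((coarse_height - fine) / height_per_fringe)
    return fine + k * height_per_fringe
```

    As long as the stereo error stays below half a fringe period, the fringe order is recovered correctly and the result inherits the profilometer's fine resolution.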

  14. Definition and maintenance of a telemetry database dictionary

    NASA Technical Reports Server (NTRS)

    Knopf, William P. (Inventor)

    2007-01-01

    A telemetry dictionary database includes a component for receiving spreadsheet workbooks of telemetry data over a web-based interface from other computer devices. Another component routes the spreadsheet workbooks to a specified directory on the host processing device. A process then checks the received spreadsheet workbooks for errors, and if no errors are detected the spreadsheet workbooks are routed to another directory to await initiation of a remote database loading process. The loading process first converts the spreadsheet workbooks to comma separated value (CSV) files. Next, a network connection with the computer system that hosts the telemetry dictionary database is established and the CSV files are ported to the computer system that hosts the telemetry dictionary database. This is followed by a remote initiation of a database loading program. Upon completion of loading a flatfile generation program is manually initiated to generate a flatfile to be used in a mission operations environment by the core ground system.
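
    The error-check and CSV-conversion stage described above might look like the following minimal sketch; the ragged-row check is an assumed example of the validation, since the patent abstract does not specify the exact checks performed.

```python
import csv
import io

def check_and_convert(sheet_rows):
    # sheet_rows: one spreadsheet sheet as a list of rows (lists of cells).
    # Example validation: every row must match the header's column count.
    width = len(sheet_rows[0])
    bad = [i for i, row in enumerate(sheet_rows) if len(row) != width]
    if bad:
        raise ValueError(f"rows with wrong column count: {bad}")
    buf = io.StringIO()
    csv.writer(buf).writerows(sheet_rows)
    return buf.getvalue()
```

    A sheet that passes the check yields comma-separated-value text ready to be ported to the database host; a malformed sheet is rejected before the loading process is initiated.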

  15. Advanced Environmental Barrier Coating Development for SiC-SiC Ceramic Matrix Composite Components

    NASA Technical Reports Server (NTRS)

    Zhu, Dongming; Harder, Bryan; Bhatt, Ramakrishna; Kiser, Doug; Wiesner, Valerie L.

    2016-01-01

    This presentation reviews the NASA advanced environmental barrier coating (EBC) system development for SiC/SiC ceramic matrix composite (CMC) components for next-generation turbine engines. Emphasis has been placed on the current design challenges of 2700°F environmental barrier coatings; coating processing and integration with SiC/SiC CMCs and component systems; and performance evaluation and demonstration of EBC-CMC systems. This presentation also highlights the EBC-CMC system temperature capability and durability improvements achieved through advanced compositions and architecture designs, as shown in recent simulated-engine high-heat-flux and combustion-environment testing, in conjunction with mechanical creep and fatigue loading conditions.

  16. Contributions to the initial development of a microelectromechanical loop heat pipe, which is based on coherent porous silicon

    NASA Astrophysics Data System (ADS)

    Cytrynowicz, Debra G.

    The research project was the initiation of the development of a planar miniature loop heat pipe based on a capillary wick structure made of coherent porous silicon. Work on the project fell into four main categories: component fabrication, test system construction, characterization testing and test data collection, and performance analysis and thermal modeling. Component fabrication involved the production of various components for the evaporator; where applicable, these components were to be produced by microelectronic and MEMS (microelectromechanical) fabrication techniques. Required work involved analyses and, where necessary, modifications of the wafer processing sequence, the photo-electrochemical etching process, the etching system, and the controlling computer program to make them more reliable, flexible, and efficient. Developing more than one wick production process was also essential in the event of equipment failure; work on this alternative involved investigations into various details of the photo-electrochemical etching process itself. Test system construction involved the actual assembly of open- and closed-loop test systems. Characterization involved developing and administering a series of tests to evaluate the performance of the wicks and test systems. Although there were some indications that the devices were operating according to loop heat pipe theory, the indications were transient and unstable. Performance analysis involved the construction of a transparent evaporator, which enabled visual observation of the phenomena that occurred in the evaporator during operation; it also involved investigating the effect of the quartz-wool secondary wick on the operation of the device. Observations made during the visualization study indicated that the capillary and boiling limits were being reached at extremely low values of input power.
The work was performed in a collaborative effort between the Biomedical Nanotechnology Research Laboratory at the University of Toledo, the Center for Microelectronics and Sensors and MEMS at the University of Cincinnati, and the Thermo-Mechanical Systems Branch of the Power and On-Board Propulsion Division at the John H. Glenn Research Center of the National Aeronautics and Space Administration in Cleveland, Ohio. Work on the project produced six publications presenting various details of component fabrication, test system construction, characterization, and thermal modeling.

  17. GCS component development cycle

    NASA Astrophysics Data System (ADS)

    Rodríguez, Jose A.; Macias, Rosa; Molgo, Jordi; Guerra, Dailos; Pi, Marti

    2012-09-01

The GTC is an optical-infrared 10-meter segmented-mirror telescope at the ORM observatory in the Canary Islands (Spain). First light was on 13 July 2007, and the telescope has been in the operation phase since then. The GTC control system (GCS) is a distributed object- and component-oriented system based on RT-CORBA, and it is responsible for the management and operation of the telescope, including its instrumentation. GCS has used the Rational Unified Process (RUP) in its development. RUP is an iterative software development process framework. After analysing (use cases) and designing (UML) any of the GCS subsystems, an initial description of its component interface is obtained, and from that information a component specification is written. To improve code productivity, GCS has adopted code generation to transform this component specification into the skeleton of component classes based on a software framework called the Device Component Framework. Using the GCS development tools, based on javadoc and gcc, in a single step the component is generated, compiled, and deployed, ready to be tested for the first time through our GUI inspector. The main advantages of this approach are the following: it reduces the learning curve of new developers and the development error rate, allows the systematic use of design patterns and software reuse, speeds up delivery of the software product while improving design consistency and quality, and eliminates future refactoring of the code.
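
The specification-to-skeleton step described above can be illustrated with a minimal sketch. The spec format, the `DeviceComponent` base class, and all method names here are hypothetical stand-ins, not the actual GCS Device Component Framework interfaces:

```python
# Minimal sketch of specification-driven skeleton generation, loosely in the
# spirit of the approach described above. Spec format and names are invented.

SKELETON = '''class {name}(DeviceComponent):
    """Auto-generated skeleton for the {name} component."""
{methods}'''

METHOD = '''    def {method}(self{args}):
        raise NotImplementedError("generated stub")
'''

def generate_skeleton(spec):
    """Render a component class skeleton from an interface specification."""
    methods = "".join(
        METHOD.format(method=m["name"],
                      args="".join(", " + a for a in m.get("args", [])))
        for m in spec["methods"]
    )
    return SKELETON.format(name=spec["component"], methods=methods)

spec = {
    "component": "M2Positioner",
    "methods": [{"name": "moveTo", "args": ["x", "y"]},
                {"name": "stop"}],
}
print(generate_skeleton(spec))
```

A real generator would also emit build files and deployment metadata; the point here is only the mechanical spec-to-stub transformation that removes boilerplate from the developer's path.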

  18. Alerts Analysis and Visualization in Network-based Intrusion Detection Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Dr. Li

    2010-08-01

The alerts produced by network-based intrusion detection systems, e.g. Snort, can be difficult for network administrators to review and respond to efficiently because of the enormous number of alerts generated in a short time frame. This work describes how visualization of raw IDS alert data helps network administrators understand the current state of a network and quickens the process of reviewing and responding to intrusion attempts. The project presented in this work consists of three primary components. The first component provides a visual mapping of the network topology that allows the end-user to easily browse clustered alerts. The second component is based on the flocking behavior of birds, in which birds tend to follow other birds with similar behaviors; it allows the end-user to see the clustering process and provides an efficient means for reviewing alert data. The third component discovers and visualizes patterns of multistage attacks by profiling the attacker's behaviors.
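
The core idea of the first component — collapsing a flood of raw alerts into a small number of browsable clusters — can be sketched very simply. The grouping key (destination /24 subnet plus signature) and the alert fields are illustrative assumptions, not the paper's actual scheme:

```python
from collections import defaultdict

def cluster_alerts(alerts):
    """Group raw IDS alerts by (destination /24 subnet, signature) so an
    administrator reviews one cluster instead of many individual alerts.
    Grouping key is an illustrative choice, not the paper's exact method."""
    clusters = defaultdict(list)
    for a in alerts:
        subnet = ".".join(a["dst"].split(".")[:3]) + ".0/24"
        clusters[(subnet, a["sig"])].append(a)
    return clusters

alerts = [
    {"dst": "10.0.1.5", "sig": "SCAN nmap"},
    {"dst": "10.0.1.9", "sig": "SCAN nmap"},
    {"dst": "10.0.2.7", "sig": "SHELLCODE x86"},
]
clusters = cluster_alerts(alerts)
print(len(clusters))  # 2
```

Three raw alerts reduce to two clusters; at realistic volumes (thousands of alerts per hour) this reduction is what makes visual browsing feasible.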

  19. Machine vision systems using machine learning for industrial product inspection

    NASA Astrophysics Data System (ADS)

    Lu, Yi; Chen, Tie Q.; Chen, Jie; Zhang, Jian; Tisler, Anthony

    2002-02-01

Machine vision inspection requires efficient processing time and accurate results. In this paper, we present a machine vision inspection architecture, SMV (Smart Machine Vision). SMV decomposes a machine vision inspection problem into two stages: Learning Inspection Features (LIF) and On-Line Inspection (OLI). The LIF stage is designed to learn visual inspection features from design data and/or from inspected products. During the OLI stage, the inspection system uses the knowledge learnt by the LIF component to inspect the visual features of products. In this paper we present two machine vision inspection systems developed under the SMV architecture for two different types of products: Printed Circuit Boards (PCBs) and Vacuum Fluorescent Display (VFD) boards. In the VFD board inspection system, the LIF component learns inspection features from a VFD board and its display patterns. In the PCB inspection system, the LIF learns the inspection features from the CAD file of a PCB board. In both systems, the LIF component also incorporates interactive learning to make the inspection system more powerful and efficient. The VFD system has been deployed successfully at three different manufacturing companies, and the PCB inspection system is in the process of being deployed in a manufacturing plant.

  20. Simulation of thermally induced processes of diffusion and phase formation in layered binary metallic systems

    NASA Astrophysics Data System (ADS)

    Rusakov, V. S.; Sukhorukov, I. A.; Zhankadamova, A. M.; Kadyrzhanov, K. K.

    2010-05-01

Results of the simulation of thermally induced processes of diffusion and phase formation in model and experimentally investigated layered binary metallic systems are presented. The physical model is based on the Darken phenomenological theory and on the mechanism of interdiffusion of components along the continuous diffusion channels of phases in the two-phase regions of the system. Simulation of the model systems showed that the thermally stabilized concentration profiles in two-layer binary metallic systems are virtually independent of the partial diffusion coefficients. For systems in which the average concentration of components is the same over the sample depth, the time of thermal stabilization of the depth-inhomogeneous structural and phase state grows according to a power law with increasing thickness of the system, in such a manner that the thicknesses of the surface layers grow while the thickness of the intermediate layer approaches a constant value. The results of the simulation of the processes of diffusion and phase formation in the experimentally investigated layered binary systems Fe-Ti and Cu-Be upon sequential isothermal and isochronous annealings agree well with the experimental data.
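
A Darken-type interdiffusion calculation of the kind described above can be sketched with an explicit finite-difference scheme. This is a generic single-phase didactic model (no phase formation, invented coefficients), not the authors' full simulation:

```python
import numpy as np

def interdiffuse(c, D_A, D_B, dx, dt, steps):
    """Explicit finite-difference sketch of Darken-type interdiffusion in a
    binary diffusion couple. c is the atomic fraction of component A on a
    1-D grid; the interdiffusion coefficient follows Darken's relation
    D = c*D_B + (1 - c)*D_A, and the outer boundaries are closed (zero flux)."""
    for _ in range(steps):
        D = c * D_B + (1.0 - c) * D_A
        D_face = 0.5 * (D[1:] + D[:-1])              # D at cell faces
        flux = np.concatenate(([0.0], -D_face * np.diff(c) / dx, [0.0]))
        c = c - dt / dx * np.diff(flux)              # conservative update
    return c

c0 = np.zeros(100)
c0[:50] = 1.0                                        # A | B bilayer couple
c = interdiffuse(c0, D_A=1e-2, D_B=3e-2, dx=1.0, dt=1.0, steps=2000)
print(round(float(c.mean()), 3))                     # mass conserved: 0.5
```

The conservative flux form guarantees that the sample-averaged concentration stays fixed, which is the property the paper's thermally stabilized profiles rely on; the chosen time step satisfies the explicit stability bound D·dt/dx² ≤ 1/2.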

  1. Navigation Operations with Prototype Components of an Automated Real-Time Spacecraft Navigation System

    NASA Technical Reports Server (NTRS)

    Cangahuala, L.; Drain, T. R.

    1999-01-01

    At present, ground navigation support for interplanetary spacecraft requires human intervention for data pre-processing, filtering, and post-processing activities; these actions must be repeated each time a new batch of data is collected by the ground data system.

  2. Trickling Filters. Student Manual. Biological Treatment Process Control.

    ERIC Educational Resources Information Center

    Richwine, Reynold D.

    The textual material for a unit on trickling filters is presented in this student manual. Topic areas discussed include: (1) trickling filter process components (preliminary treatment, media, underdrain system, distribution system, ventilation, and secondary clarifier); (2) operational modes (standard rate filters, high rate filters, roughing…

  3. Three-dimensional modelling and geothermal process simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burns, K.L.

    1990-01-01

The subsurface geological model or 3-D GIS is constructed from three kinds of objects, which are a lithotope (in boundary representation), a number of fault systems, and volumetric textures (vector fields). The chief task of the model is to yield an estimate of the conductance tensors (fluid permeability and thermal conductivity) throughout an array of voxels. This is input as material properties to a FEHM numerical physical process model. The main task of the FEHM process model is to distinguish regions of convective from regions of conductive heat flow, and to estimate the fluid phase, pressure and flow paths. The temperature, geochemical, and seismic data provide the physical constraints on the process. The conductance tensors in the Franciscan Complex are to be derived by the addition of two components. The isotropic component is a stochastic spatial variable due to disruption of lithologies in melange. The deviatoric component is deterministic, due to smoothness and continuity in the textural vector fields. This decomposition probably also applies to the engineering hydrogeological properties of shallow terrestrial fluvial systems. However there are differences in quantity. The isotropic component is much more variable in the Franciscan, to the point where volumetric averages are misleading, and it may be necessary to select that component from several, discrete possible states. The deviatoric component is interpolated using a textural vector field. The Franciscan field is much more complicated, and contains internal singularities. 27 refs., 10 figs.

  4. SSME component assembly and life management expert system

    NASA Technical Reports Server (NTRS)

    Ali, M.; Dietz, W. E.; Ferber, H. J.

    1989-01-01

The space shuttle utilizes several rocket engine systems, all of which must function with a high degree of reliability for successful mission completion. The space shuttle main engine (SSME) is by far the most complex of the rocket engine systems and is designed to be reusable. The reusability of spacecraft systems introduces many problems related to testing, reliability, and logistics. Components must be assembled from parts inventories in a manner that most effectively utilizes the available parts. Assembly must be scheduled to efficiently utilize available assembly benches while still maintaining flight schedules. Assembled components must be assigned to as many contiguous flights as possible, to minimize component changes. Each component must undergo a rigorous testing program prior to flight. In addition, testing and assembly of flight engines and components must be done in conjunction with the assembly and testing of developmental engines and components. The development, testing, manufacture, and flight assignments of the engine fleet involve the satisfaction of many logistical and operational requirements, subject to many constraints. The purpose of the SSME Component Assembly and Life Management Expert System (CALMES) is to assist the engine assembly and scheduling process, and to ensure that these activities utilize available resources as efficiently as possible.
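
The bench-scheduling aspect of the problem can be illustrated with a classic greedy heuristic. This is a toy stand-in for the resource-allocation reasoning a system like CALMES automates; the job names and hours are hypothetical:

```python
import heapq

def assign_to_benches(jobs, n_benches):
    """Greedy longest-processing-time scheduling: repeatedly give the longest
    remaining assembly job to the currently least-loaded bench. A toy
    stand-in for expert-system scheduling; names and hours are invented."""
    benches = [(0, i, []) for i in range(n_benches)]   # (load, bench id, jobs)
    heapq.heapify(benches)
    for name, hours in sorted(jobs, key=lambda j: -j[1]):
        load, i, assigned = heapq.heappop(benches)
        heapq.heappush(benches, (load + hours, i, assigned + [name]))
    return sorted(benches, key=lambda b: b[1])

jobs = [("turbopump", 40), ("preburner", 25), ("nozzle", 30), ("controller", 10)]
for load, i, assigned in assign_to_benches(jobs, 2):
    print(f"bench {i}: {load} h  {assigned}")
```

A real assembly scheduler must also honor parts availability, test windows, and flight assignments; greedy load balancing only captures the "efficiently utilize available benches" constraint named in the abstract.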

  5. A 45° saw-dicing process applied to a glass substrate for wafer-level optical splitter fabrication for optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Maciel, M. J.; Costa, C. G.; Silva, M. F.; Gonçalves, S. B.; Peixoto, A. C.; Ribeiro, A. Fernando; Wolffenbuttel, R. F.; Correia, J. H.

    2016-08-01

This paper reports on the development of a technology for the wafer-level fabrication of an optical Michelson interferometer, which is an essential component in a micro opto-electromechanical system (MOEMS) for a miniaturized optical coherence tomography (OCT) system. The MOEMS consists of a titanium dioxide/silicon dioxide dielectric beam splitter and chromium/gold micro-mirrors. These optical components are deposited on 45° tilted surfaces to allow the horizontal/vertical separation of the incident beam in the final micro-integrated system. The fabrication process consists of 45° saw dicing of a glass substrate and the subsequent deposition of dielectric multilayers and metal layers. The 45° saw dicing is fully characterized in this paper, which also includes an analysis of the roughness. The optimum process results in surfaces with a roughness of 19.76 nm (rms). The actual saw dicing process for a high-quality final surface results from a compromise between the dicing blade's grit size (#1200) and the cutting speed (0.3 mm s⁻¹). The proposed wafer-level fabrication allows rapid and low-cost processing, high compactness and the possibility of wafer-level alignment/assembly with other optical micro components for OCT integrated imaging.
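
The rms (Rq) roughness figure quoted above is a standard statistic of a measured height profile, and is easy to compute. The sinusoidal profile below is a synthetic check, not the paper's measurement data:

```python
import numpy as np

def rms_roughness(profile_nm):
    """Root-mean-square (Rq) roughness of a surface height profile (nm),
    measured about its mean line -- the figure of merit quoted for the
    diced surfaces above."""
    h = np.asarray(profile_nm, dtype=float)
    return float(np.sqrt(np.mean((h - h.mean()) ** 2)))

# Synthetic check: a 20 nm-amplitude sinusoid has Rq = 20/sqrt(2) ≈ 14.14 nm
x = np.linspace(0.0, 2.0 * np.pi, 1000, endpoint=False)
print(round(rms_roughness(20.0 * np.sin(x)), 2))   # 14.14
```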

  6. Signal processing: opportunities for superconductive circuits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ralston, R.W.

    1985-03-01

Prime motivators in the evolution of increasingly sophisticated communication and detection systems are the needs for handling ever wider signal bandwidths and higher data processing speeds. These same needs drive the development of electronic device technology. Until recently the superconductive community has been tightly focused on digital devices for high speed computers. The purpose of this paper is to describe opportunities and challenges which exist for both analog and digital devices in a less familiar area, that of wideband signal processing. The function and purpose of analog signal-processing components, including matched filters, correlators and Fourier transformers, will be described and examples of superconductive implementations given. A canonic signal-processing system is then configured using these components in combination with analog/digital converters and digital output circuits to highlight the important issues of dynamic range, accuracy and equivalent computation rate. Superconductive circuits hold promise for processing signals of 10-GHz bandwidth. Signal processing systems, however, can be properly designed and implemented only through a synergistic combination of the talents of device physicists, circuit designers, algorithm architects and system engineers. An immediate challenge to the applied superconductivity community is to begin sharing ideas with these other researchers.
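
The matched filters and correlators named above all perform the same underlying operation: sliding a known template along the received signal and looking for the correlation peak. A minimal digital stand-in for those analog devices:

```python
import numpy as np

def matched_filter(received, template):
    """Correlate a received signal against a known template; the lag of the
    correlation peak locates the template in the signal. A digital stand-in
    for the analog correlators and matched filters discussed above."""
    corr = np.correlate(received, template, mode="valid")
    return int(np.argmax(corr)), corr

template = np.array([1.0, -1.0, 1.0, 1.0, -1.0])   # a short binary code
received = np.zeros(100)
received[37:42] += template                         # embed the code at lag 37
lag, _ = matched_filter(received, template)
print(lag)  # 37
```

A superconductive implementation performs this convolution in the analog domain at multi-GHz bandwidths; the digital version makes the dynamic-range and computation-rate trade-offs mentioned in the abstract concrete, since each output sample costs one multiply-accumulate per template tap.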

  7. Silicon Valley's Processing Needs versus San Jose State University's Manufacturing Systems Processing Component: Implications for Industrial Technology

    ERIC Educational Resources Information Center

    Obi, Samuel C.

    2004-01-01

    Manufacturing professionals within universities tend to view manufacturing systems from a global perspective. This perspective tends to assume that manufacturing processes are employed equally in every manufacturing enterprise, irrespective of the geography and the needs of the people in those diverse regions. But in reality local and societal…

  8. 21 CFR 111.120 - What quality control operations are required for components, packaging, and labels before use in...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... components, packaging, and labels before use in the manufacture of a dietary supplement? 111.120 Section 111..., OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for... labels before use in the manufacture of a dietary supplement? Quality control operations for components...

  9. 21 CFR 111.120 - What quality control operations are required for components, packaging, and labels before use in...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... components, packaging, and labels before use in the manufacture of a dietary supplement? 111.120 Section 111..., OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for... labels before use in the manufacture of a dietary supplement? Quality control operations for components...

  10. 21 CFR 111.120 - What quality control operations are required for components, packaging, and labels before use in...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... components, packaging, and labels before use in the manufacture of a dietary supplement? 111.120 Section 111..., OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for... labels before use in the manufacture of a dietary supplement? Quality control operations for components...

  11. 21 CFR 111.120 - What quality control operations are required for components, packaging, and labels before use in...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... components, packaging, and labels before use in the manufacture of a dietary supplement? 111.120 Section 111..., OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for... labels before use in the manufacture of a dietary supplement? Quality control operations for components...

  12. Working Memory Components and Intelligence in Children

    ERIC Educational Resources Information Center

    Tillman, Carin M.; Nyberg, Lilianne; Bohlin, Gunilla

    2008-01-01

    This study investigated, in children aged 6-13 years, how different components of the working memory (WM) system (short-term storage and executive processes), within both verbal and visuospatial domains, relate to fluid intelligence. We also examined the degree of domain-specificity of the WM components as well as the differentiation of storage…

  13. 21 CFR 111.120 - What quality control operations are required for components, packaging, and labels before use in...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... components, packaging, and labels before use in the manufacture of a dietary supplement? 111.120 Section 111..., OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for... labels before use in the manufacture of a dietary supplement? Quality control operations for components...

  14. Earth Orbiter 1: Wideband Advanced Recorder and Processor (WARP)

    NASA Technical Reports Server (NTRS)

    Smith, Terry; Kessler, John

    1999-01-01

    An advanced on-board spacecraft data system component is presented. The component is computer-based and provides science data acquisition, processing, storage, and base-band transmission functions. Specifically, the component is a very high rate solid state recorder, serving as a pathfinder for achieving the data handling requirements of next-generation hyperspectral imaging missions.

  15. Simultaneous estimation of deterministic and fractal stochastic components in non-stationary time series

    NASA Astrophysics Data System (ADS)

    García, Constantino A.; Otero, Abraham; Félix, Paulo; Presedo, Jesús; Márquez, David G.

    2018-07-01

In the past few decades, it has been recognized that 1/f fluctuations are ubiquitous in nature. The most widely used mathematical models to capture the long-term memory properties of 1/f fluctuations have been stochastic fractal models. However, physical systems do not usually consist of just stochastic fractal dynamics; they often also show some degree of deterministic behavior. The present paper proposes a model based on fractal stochastic and deterministic components that can provide a valuable basis for the study of complex systems with long-term correlations. The fractal stochastic component is assumed to be a fractional Brownian motion process and the deterministic component is assumed to be a band-limited signal. We also provide a method that, under the assumptions of this model, is able to characterize the fractal stochastic component and to provide an estimate of the deterministic components present in a given time series. The method is based on a Bayesian wavelet shrinkage procedure that exploits the self-similar properties of the fractal processes in the wavelet domain. The method has been validated on simulated signals and on real signals of economic and biological origin. Real examples illustrate how our model may be useful for exploring the deterministic-stochastic duality of complex systems and for uncovering interesting patterns present in time series.
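
The self-similarity that the paper's wavelet-domain method exploits can be seen directly in how the variance of wavelet detail coefficients scales with level. The didactic sketch below uses a plain Haar transform and ordinary Brownian motion (H = 0.5), not the paper's fractional Brownian motion data or its Bayesian shrinkage procedure:

```python
import numpy as np

def haar_detail_variances(x, levels):
    """Variance of Haar wavelet detail coefficients at successive scales.
    For fractional Brownian motion the log2-variance grows with scale with
    asymptotic slope 2H + 1, giving a crude estimator of the Hurst index H.
    Didactic sketch only, not the paper's Bayesian shrinkage method."""
    variances = []
    for _ in range(levels):
        detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
        x = (x[0::2] + x[1::2]) / np.sqrt(2.0)       # coarser approximation
        variances.append(float(detail.var()))
    return variances

rng = np.random.default_rng(1)
bm = np.cumsum(rng.standard_normal(2 ** 16))        # ordinary BM, H = 0.5
v = haar_detail_variances(bm, 8)
H = (np.diff(np.log2(v)).mean() - 1.0) / 2.0
print(round(H, 2))
```

For Brownian motion the estimate comes out near 0.5, slightly low because the log-variance slope only reaches 2H + 1 asymptotically at coarse scales; the paper's shrinkage procedure handles this scale dependence, and the presence of a deterministic component, far more carefully.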

  16. Knowledge Representation Artifacts for Use in Sensemaking Support Systems

    DTIC Science & Technology

    2015-03-12

and manual processing must be replaced by automated processing wherever it makes sense and is possible. Clearly, given the data and cognitive...knowledge-centric view to situation analysis and decision-making as previously discussed, has led to the development of several automated processing components...for use in sensemaking support systems [6-11]. In turn, automated processing has required the development of appropriate knowledge

  17. Four-Component Catalytic Machinery: Reversible Three-State Control of Organocatalysis by Walking Back and Forth on a Track.

    PubMed

    Mittal, Nikita; Özer, Merve S; Schmittel, Michael

    2018-04-02

A three-component supramolecular walker system is presented in which a two-footed ligand (biped) walks back and forth on a tetrahedral 3D track upon the addition and removal of copper(I) ions, respectively. The addition of N-methylpyrrolidine as a catalyst to the walker system generates a four-component catalytic machinery, which acts as a three-state switchable catalytic ensemble in the presence of substrates for a conjugate addition. The copper(I)-ion-initiated walking process of the biped ligand on the track regulates the catalytic activity in three steps: ON versus int ON (intermediate ON) versus OFF. To establish the operation of the four-component catalytic machinery in a mixture of all constituents, forward and backward cycles were performed in situ, illustrating that both the walking process and the catalytic action are fully reversible and reproducible.

  18. Virtually-synchronous communication based on a weak failure suspector

    NASA Technical Reports Server (NTRS)

    Schiper, Andre; Ricciardi, Aleta

    1993-01-01

Failure detectors (or, more accurately, failure suspectors (FS)) appear to be a fundamental service upon which to build fault-tolerant, distributed applications. This paper shows that a FS with very weak semantics (i.e., one that delivers failure and recovery information in no specific order) suffices to implement virtually-synchronous communication (VSC) in an asynchronous system subject to process crash failures and network partitions. The VSC paradigm is particularly useful in asynchronous systems and greatly simplifies building fault-tolerant applications that mask failures by replicating processes. We suggest a three-component architecture to implement virtually-synchronous communication: at the lowest level, the FS component; on top of it, a component (2a) that defines new views; and a component (2b) that reliably multicasts messages within a view. The issues covered in this paper also lead to a better understanding of the various membership service semantics proposed in recent literature.
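
The role of the view-defining component (2a) — turning unordered, possibly duplicated failure and recovery notifications from the FS into a clean sequence of numbered views — can be sketched in a few lines. This single-process toy is illustrative only; the paper's protocol also coordinates views and reliable multicast across processes:

```python
class ViewManager:
    """Toy sketch of a view manager fed by a weak failure suspector: FS
    notifications arrive in no specific order and may repeat; each genuine
    membership change installs a new numbered view."""

    def __init__(self, members):
        self.view_id = 0
        self.members = set(members)

    def on_suspicion(self, process):
        if process in self.members:      # ignore duplicate suspicions
            self.members.discard(process)
            self.view_id += 1            # install a new view

    def on_recovery(self, process):
        if process not in self.members:  # ignore duplicate recoveries
            self.members.add(process)
            self.view_id += 1

vm = ViewManager({"p1", "p2", "p3"})
vm.on_suspicion("p2")
vm.on_suspicion("p2")                    # duplicate: no new view
vm.on_recovery("p2")
print(vm.view_id, sorted(vm.members))    # 2 ['p1', 'p2', 'p3']
```

Idempotent handling of notifications is exactly what lets the architecture tolerate a suspector that delivers information in no specific order.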

  19. Modular System for Shelves and Coasts (MOSSCO v1.0) - a flexible and multi-component framework for coupled coastal ocean ecosystem modelling

    NASA Astrophysics Data System (ADS)

    Lemmen, Carsten; Hofmeister, Richard; Klingbeil, Knut; Hassan Nasermoaddeli, M.; Kerimoglu, Onur; Burchard, Hans; Kösters, Frank; Wirtz, Kai W.

    2018-03-01

    Shelf and coastal sea processes extend from the atmosphere through the water column and into the seabed. These processes reflect intimate interactions between physical, chemical, and biological states on multiple scales. As a consequence, coastal system modelling requires a high and flexible degree of process and domain integration; this has so far hardly been achieved by current model systems. The lack of modularity and flexibility in integrated models hinders the exchange of data and model components and has historically imposed the supremacy of specific physical driver models. We present the Modular System for Shelves and Coasts (MOSSCO; http://www.mossco.de), a novel domain and process coupling system tailored but not limited to the coupling challenges of and applications in the coastal ocean. MOSSCO builds on the Earth System Modeling Framework (ESMF) and on the Framework for Aquatic Biogeochemical Models (FABM). It goes beyond existing technologies by creating a unique level of modularity in both domain and process coupling, including a clear separation of component and basic model interfaces, flexible scheduling of several tens of models, and facilitation of iterative development at the lab and the station and on the coastal ocean scale. MOSSCO is rich in metadata and its concepts are also applicable outside the coastal domain. For coastal modelling, it contains dozens of example coupling configurations and tested set-ups for coupled applications. Thus, MOSSCO addresses the technology needs of a growing marine coastal Earth system community that encompasses very different disciplines, numerical tools, and research questions.

  20. A Fixed Point VHDL Component Library for a High Efficiency Reconfigurable Radio Design Methodology

    NASA Technical Reports Server (NTRS)

    Hoy, Scott D.; Figueiredo, Marco A.

    2006-01-01

Advances in Field Programmable Gate Array (FPGA) technologies enable the implementation of reconfigurable radio systems for both ground and space applications. The development of such systems challenges the current design paradigms and requires more robust design techniques to meet the increased system complexity. Among these techniques is the development of component libraries to reduce design cycle time and to improve design verification, consequently increasing the overall efficiency of the project development process while increasing design success rates and reducing engineering costs. This paper describes the reconfigurable radio component library developed at the Software Defined Radio Applications Research Center (SARC) at the Goddard Space Flight Center (GSFC) Microwave and Communications Branch (Code 567). The library is a set of fixed-point VHDL components that link the Digital Signal Processing (DSP) simulation environment with the FPGA design tools. This provides a direct synthesis path based on the latest developments of the VHDL tools as proposed by the BEE VHDL 2004, which allows for the simulation and synthesis of fixed-point math operations while maintaining bit and cycle accuracy. The VHDL Fixed Point Reconfigurable Radio Component library does not require the use of FPGA-vendor-specific automatic component generators and provides a generic path from high-level DSP simulations implemented in Mathworks Simulink to any FPGA device. Access to the components' synthesizable source code provides full design verification capability.
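
The bit accuracy the library must preserve comes down to fixed-point quantization and saturation semantics. A minimal model of those semantics (in Python rather than VHDL, for illustration):

```python
def to_fixed(x, frac_bits, total_bits):
    """Quantize a real value to two's-complement fixed point with saturation
    -- the bit-accurate behavior a fixed-point component library must
    reproduce identically in DSP simulation and in synthesized hardware."""
    scale = 1 << frac_bits
    lo = -(1 << (total_bits - 1))
    hi = (1 << (total_bits - 1)) - 1
    return max(lo, min(hi, round(x * scale)))

def from_fixed(q, frac_bits):
    """Convert a fixed-point integer back to a real value."""
    return q / (1 << frac_bits)

# Q1.15 format: 16 bits, 15 fractional -> resolution 2**-15, range [-1, 1 - 2**-15]
print(from_fixed(to_fixed(0.7071, 15, 16), 15))   # ≈ 0.70709
print(from_fixed(to_fixed(1.5, 15, 16), 15))      # saturates just below 1.0
```

Matching this rounding and saturation behavior bit-for-bit between the Simulink model and the synthesized VHDL is what "bit and cycle accuracy" means in practice.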

  1. Distillation Designs for the Lunar Surface

    NASA Technical Reports Server (NTRS)

    Boul, Peter J.; Lange,Kevin E.; Conger, Bruce; Anderson, Molly

    2010-01-01

Gravity-based distillation methods may be applied to the purification of wastewater at a lunar base. These are robust physical separation techniques, which may be more advantageous than many other techniques for their simplicity in design and operation. The two techniques can be used in conjunction with each other to obtain high-purity water. The components and feed compositions for modeling wastewater streams are presented in conjunction with the Aspen property system for traditional stage distillation. While the individual components of each waste stream will vary naturally within certain bounds, an analog model for wastewater processing is suggested based on typical concentration ranges for these components. Target purity levels for recycled water are determined for each individual component based on NASA's required maximum contaminant levels for potable water. Optimum parameters such as reflux ratio, feed stage location, and processing rates are determined with respect to the power consumption of the process. Multistage distillation is evaluated for components in wastewater to determine the minimum number of stages necessary for each of 65 components in mixed humidity-condensate and urine wastewater streams.
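
For a binary separation, the minimum number of theoretical stages at total reflux is given by the Fenske equation, which is the standard starting point for the stage-count question raised above. The compositions and relative volatility below are illustrative, not the paper's wastewater data:

```python
from math import log

def fenske_min_stages(x_dist, x_bot, alpha):
    """Fenske equation: minimum number of theoretical stages (total reflux)
    to take the light-key mole fraction from x_bot in the bottoms to x_dist
    in the distillate, given relative volatility alpha. Illustrative values
    only, not the paper's 65-component wastewater model."""
    return log((x_dist / (1.0 - x_dist)) * ((1.0 - x_bot) / x_bot)) / log(alpha)

# 99% light key overhead, 1% left in the bottoms, alpha = 2.5
print(round(fenske_min_stages(0.99, 0.01, 2.5), 1))   # ≈ 10.0
```

An actual column sized at finite reflux needs more stages than this total-reflux minimum, which is why the abstract's optimization also trades reflux ratio and feed stage location against power consumption.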

  2. Multiparameter magnetic inspection system with magnetic field control and plural magnetic transducers

    DOEpatents

    Jiles, D.C.

    1991-04-16

A multiparameter magnetic inspection system is disclosed for providing an efficient and economical way to derive a plurality of independent measurements regarding magnetic properties of the magnetic material under investigation. A plurality of transducers for a plurality of different types of measurements are operatively connected to the specimen. The transducers are in turn connected to analytical circuits for converting transducer signals into meaningful measurement signals of the magnetic properties of the specimen. The measurement signals are processed and can be simultaneously communicated to a control component. The measurement signals can also be selectively plotted against one another. The control component governs the functioning of the analytical circuits and operates components that impose magnetic fields of desired characteristics upon the specimen. The system therefore allows contemporaneous or simultaneous derivation of the plurality of different independent magnetic properties of the material, which can then be processed to derive characteristics of the material. 1 figure.

  3. Multiparameter magnetic inspection system with magnetic field control and plural magnetic transducers

    DOEpatents

    Jiles, David C.

    1991-04-16

A multiparameter magnetic inspection system for providing an efficient and economical way to derive a plurality of independent measurements regarding magnetic properties of the magnetic material under investigation. A plurality of transducers for a plurality of different types of measurements are operatively connected to the specimen. The transducers are in turn connected to analytical circuits for converting transducer signals into meaningful measurement signals of the magnetic properties of the specimen. The measurement signals are processed and can be simultaneously communicated to a control component. The measurement signals can also be selectively plotted against one another. The control component governs the functioning of the analytical circuits and operates components that impose magnetic fields of desired characteristics upon the specimen. The system therefore allows contemporaneous or simultaneous derivation of the plurality of different independent magnetic properties of the material, which can then be processed to derive characteristics of the material.

  4. Fatigue of Ti6Al4V Structural Health Monitoring Systems Produced by Selective Laser Melting.

    PubMed

    Strantza, Maria; Vafadari, Reza; de Baere, Dieter; Vrancken, Bey; van Paepegem, Wim; Vandendael, Isabelle; Terryn, Herman; Guillaume, Patrick; van Hemelrijck, Danny

    2016-02-11

Selective laser melting (SLM) is an additive manufacturing (AM) process which is used for producing metallic components. Currently, the integrity of components produced by SLM is in need of improvement due to residual stresses and unknown fracture behavior. Titanium alloys produced by AM are capable candidates for applications in aerospace and industrial fields due to their fracture resistance, fatigue behavior and corrosion resistance. On the other hand, structural health monitoring (SHM) system technologies are promising and in demand from industry. SHM systems can monitor the integrity of a structure, and during recent decades research in this field has been strongly influenced by bionic engineering. In that spirit, a new philosophy for SHM has been developed: the so-called effective structural health monitoring (eSHM) system. The current system uses the design freedom provided by AM. The working principle of the system is based on crack detection by means of a network of capillaries that are integrated in a structure. The main objective of this research is to evaluate the functionality of Ti6Al4V produced by the SLM process in the novel SHM system and to confirm that the eSHM system can successfully detect cracks in SLM components. In this study four-point bending fatigue tests on Ti6Al4V SLM specimens with an integrated SHM system were conducted. Fractographic analysis was performed after the final failure, while finite element simulations were used in order to determine the stress distribution in the capillary region and on the component. It was proven that the SHM system does not influence the crack initiation behavior during fatigue. The results highlight the effectiveness of the eSHM on SLM components, which can potentially be used by industrial and aerospace applications.

  5. Development of performance criteria for advanced Viking seismic experiments

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The characteristics and requirements of the seismic instrument for mapping the internal structure of the planet Mars are briefly described. The types of signals expected to exist are microseismic background generated by wind and pressure variations and thermal effects, disturbances of or in the landed vehicle, signals caused by faulting and volcanic activity, and signals due to meteoritic impacts. The advanced instrument package should include a short-period vertical component system, a long-period or wide-band 3-component system, a high frequency vertical component system, and a system for detection and rejection of lander noises. The Viking '75, Surveyor, and Apollo systems are briefly described as potential instruments to be considered for modification. Data processing and control systems are also summarized.

  6. Active control of complex, multicomponent self-assembly processes

    NASA Astrophysics Data System (ADS)

    Schulman, Rebecca

    The kinetics of many complex biological self-assembly processes such as cytoskeletal assembly are precisely controlled by cells. Spatiotemporal control over rates of filament nucleation, growth, and disassembly determines how self-assembly occurs and how the assembled form changes over time. These reaction rates can be manipulated by changing the concentrations of the components needed for assembly by activating or deactivating them. I will describe how we can use these principles to design driven self-assembly processes in which we assemble and disassemble multiple types of components to create micron-scale networks of semiflexible filaments assembled from DNA. The same set of primitive components can be assembled into many different structures depending on the concentrations of different components and on how designed DNA-based chemical reaction networks manipulate these concentrations over time. These chemical reaction networks can in turn interpret environmental stimuli to direct complex, multistage responses. Such a system is a laboratory for understanding complex active material behaviors, such as metamorphosis, self-healing, or adaptation to the environment, that are ubiquitous in biological systems but difficult to quantitatively characterize or engineer.

  7. Case management information systems: how to put the pieces together now and beyond year 2000.

    PubMed

    Matthews, Pamela

    2002-01-01

    The case management process is a critical management and operational component in the delivery of customer services across the patient care continuum. Case management has transcended time and will continue to be a viable infrastructure process for successful organizations in the future. A key component of the case management infrastructure is information systems and technology support. Case management challenges include effective deployment and use of systems and technology. As more sophisticated, integrated systems are made available, case managers can use these tools to continue to expand effectively beyond the patient's episodic event to provide greater levels of cradle-to-grave management of healthcare. This article explores methods for defining case management system needs and identifying automation options available to the case manager.

  8. Knowledge information management toolkit and method

    DOEpatents

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  9. Materials requirements for optical processing and computing devices

    NASA Technical Reports Server (NTRS)

    Tanguay, A. R., Jr.

    1985-01-01

    Devices for optical processing and computing systems are discussed, with emphasis on the materials requirements imposed by functional constraints. Generalized optical processing and computing systems are described in order to identify principal categories of requisite components for complete system implementation. Three principal device categories are selected for analysis in some detail: spatial light modulators, volume holographic optical elements, and bistable optical devices. The implications for optical processing and computing systems of the materials requirements identified for these device categories are described, and directions for future research are proposed.

  10. Signal processing: opportunities for superconductive circuits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ralston, R.W.

    1985-03-01

    Prime motivators in the evolution of increasingly sophisticated communication and detection systems are the needs for handling ever wider signal bandwidths and higher data-processing speeds. These same needs drive the development of electronic device technology. Until recently the superconductive community has been tightly focused on digital devices for high-speed computers. The purpose of this paper is to describe opportunities and challenges which exist for both analog and digital devices in a less familiar area, that of wideband signal processing. The function and purpose of analog signal-processing components, including matched filters, correlators, and Fourier transformers, will be described and examples of superconductive implementations given. A canonic signal-processing system is then configured using these components and digital output circuits to highlight the important issues of dynamic range, accuracy, and equivalent computation rate.

  11. Principles and Techniques of Radiation Chemistry.

    ERIC Educational Resources Information Center

    Dorfman, Leon M.

    1981-01-01

    Discusses the physical processes involved in the deposition of energy from ionizing radiation in the absorber system. Identifies principles relevant to these processes which are responsible for ionization and excitation of the components of the absorber system. Briefly describes some experimental techniques in use in radiation chemical studies.…

  12. 36 CFR 1225.12 - How are records schedules developed?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... activity to identify records series, systems, and nonrecord materials. (c) Determine the appropriate scope of the records schedule items, e.g., individual series/system component, work process, group of related work processes, or broad program area. (d) Evaluate the period of time the agency needs each...

  13. 36 CFR 1225.12 - How are records schedules developed?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... activity to identify records series, systems, and nonrecord materials. (c) Determine the appropriate scope of the records schedule items, e.g., individual series/system component, work process, group of related work processes, or broad program area. (d) Evaluate the period of time the agency needs each...

  14. 36 CFR 1225.12 - How are records schedules developed?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... activity to identify records series, systems, and nonrecord materials. (c) Determine the appropriate scope of the records schedule items, e.g., individual series/system component, work process, group of related work processes, or broad program area. (d) Evaluate the period of time the agency needs each...

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shah, Kedar G.; Pannu, Satinderpall S.

    An integrated circuit system having an integrated circuit (IC) component which is able to have its functionality destroyed upon receiving a command signal. The system may involve a substrate with the IC component being supported on the substrate. A module may be disposed in proximity to the IC component. The module may have a cavity and a dissolving compound in a solid form disposed in the cavity. A heater component may be configured to heat the dissolving compound to a point of sublimation where the dissolving compound changes from a solid to a gaseous dissolving compound. A triggering mechanism may be used for initiating a dissolution process whereby the gaseous dissolving compound is allowed to attack the IC component and destroy a functionality of the IC component.

  16. Characteristics, Process Parameters, and Inner Components of Anaerobic Bioreactors

    PubMed Central

    Abdelgadir, Awad; Chen, Xiaoguang; Liu, Jianshe; Xie, Xuehui; Zhang, Jian; Zhang, Kai; Wang, Heng; Liu, Na

    2014-01-01

    The anaerobic bioreactor applies the principles of biotechnology and microbiology, and it is now widely used in wastewater treatment plants because of its high efficiency, low energy use, and green energy generation. The advantages and disadvantages of the anaerobic process are outlined, and three main characteristics of the anaerobic bioreactor (AB), namely an inhomogeneous system, time instability, and space instability, are also discussed in this work. For efficient wastewater treatment, the process parameters of anaerobic digestion, such as temperature, pH, hydraulic retention time (HRT), organic loading rate (OLR), and sludge retention time (SRT), are introduced to account for the optimum conditions for the living, growth, and multiplication of bacteria. The inner components, which can improve SRT and even enhance mass transfer, are also explained and are divided into transverse inner components, longitudinal inner components, and biofilm-packing material. Finally, the newly developed special inner components are discussed and found to be more efficient and productive. PMID:24672798
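
    The two loading parameters named in the abstract have simple design definitions; a minimal sketch follows, with all numerical values invented for illustration (they are not from the study):

```python
# Basic anaerobic-digester design relations mentioned in the abstract:
# HRT (hydraulic retention time) and OLR (organic loading rate).
# All numbers below are assumed, illustrative values.

def hydraulic_retention_time(volume_m3, flow_m3_per_day):
    """HRT (days) = reactor volume / influent flow rate."""
    return volume_m3 / flow_m3_per_day

def organic_loading_rate(flow_m3_per_day, cod_kg_per_m3, volume_m3):
    """OLR (kg COD per m3 of reactor per day) = Q * S0 / V."""
    return flow_m3_per_day * cod_kg_per_m3 / volume_m3

V, Q, S0 = 500.0, 100.0, 4.0   # assumed volume (m3), flow (m3/d), COD (kg/m3)
print(hydraulic_retention_time(V, Q))   # 5.0 days
print(organic_loading_rate(Q, S0, V))   # 0.8 kg COD/m3/day
```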

  17. Characteristics, process parameters, and inner components of anaerobic bioreactors.

    PubMed

    Abdelgadir, Awad; Chen, Xiaoguang; Liu, Jianshe; Xie, Xuehui; Zhang, Jian; Zhang, Kai; Wang, Heng; Liu, Na

    2014-01-01

    The anaerobic bioreactor applies the principles of biotechnology and microbiology, and it is now widely used in wastewater treatment plants because of its high efficiency, low energy use, and green energy generation. The advantages and disadvantages of the anaerobic process are outlined, and three main characteristics of the anaerobic bioreactor (AB), namely an inhomogeneous system, time instability, and space instability, are also discussed in this work. For efficient wastewater treatment, the process parameters of anaerobic digestion, such as temperature, pH, hydraulic retention time (HRT), organic loading rate (OLR), and sludge retention time (SRT), are introduced to account for the optimum conditions for the living, growth, and multiplication of bacteria. The inner components, which can improve SRT and even enhance mass transfer, are also explained and are divided into transverse inner components, longitudinal inner components, and biofilm-packing material. Finally, the newly developed special inner components are discussed and found to be more efficient and productive.

  18. Removal and recovery of acetic acid and two furans during sugar purification of simulated phenols-free biomass hydrolysates.

    PubMed

    Lee, Sang Cheol

    2017-12-01

    A cost-effective five-step sugar purification process involving simultaneous removal and recovery of fermentation inhibitors from biomass hydrolysates was first proposed here. Only the three separation steps (PB, PC and PD) in the process were investigated here. Furfural was selectively removed up to 98.4% from a simulated five-component hydrolysate in a cross-current three-stage extraction system with n-hexane. Most of the acetic acid in a simulated four-component hydrolysate was selectively removed by an emulsion liquid membrane, and it could be concentrated in the stripping solution up to 4.5 times its initial concentration in the feed solution. 5-Hydroxymethylfurfural was selectively removed from a simulated three-component hydrolysate in batch and continuous fixed-bed column adsorption systems with L-493 adsorbent. Also, 5-hydroxymethylfurfural could be concentrated to about 9 times its feed concentration in the continuous adsorption system through a fixed-bed column desorption experiment with an aqueous ethanol solution. These results show that the proposed purification process is valid.

  19. Medical diagnosis of atherosclerosis from Carotid Artery Doppler Signals using principal component analysis (PCA), k-NN based weighting pre-processing and Artificial Immune Recognition System (AIRS).

    PubMed

    Latifoğlu, Fatma; Polat, Kemal; Kara, Sadik; Güneş, Salih

    2008-02-01

    In this study, we proposed a new medical diagnosis system based on principal component analysis (PCA), k-NN based weighting pre-processing, and the Artificial Immune Recognition System (AIRS) for the diagnosis of atherosclerosis from carotid artery Doppler signals. The suggested system consists of four stages. First, in the feature extraction stage, we obtained the features related to atherosclerosis using fast Fourier transform (FFT) modeling and by calculating the maximum frequency envelope of the sonograms. Second, in the dimensionality reduction stage, the 61 features of atherosclerosis were reduced to 4 features using PCA. Third, in the pre-processing stage, we weighted these 4 features using different values of k in a new weighting scheme based on k-NN. Finally, in the classification stage, the AIRS classifier was used to classify subjects as healthy or as having atherosclerosis. One hundred percent classification accuracy was obtained by the proposed system using 10-fold cross-validation. This success shows that the proposed system is robust and effective in the diagnosis of atherosclerosis.
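
    The staging of such a pipeline can be sketched on synthetic data. This is only an illustration of the shape of the method: the k-NN weighting step is omitted and the AIRS classifier is replaced by a plain nearest-centroid stand-in, and all data are random rather than Doppler features.

```python
# Sketch of the pipeline shape on synthetic data: reduce 61 features
# to 4 with PCA (via SVD), then classify with a simple stand-in rule.
# Nearest-centroid replaces AIRS here; nothing below is from the study.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 61))            # 40 subjects, 61 features (synthetic)
X[20:] += 2.0                            # shift one synthetic class apart
y = np.array([0] * 20 + [1] * 20)        # 0 = healthy, 1 = diseased (toy labels)

# Dimensionality reduction: PCA via SVD, 61 -> 4 components
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
X4 = Xc @ Vt[:4].T                       # shape (40, 4)

# Stand-in classifier: assign each subject to the nearest class centroid
c0, c1 = X4[y == 0].mean(axis=0), X4[y == 1].mean(axis=0)
pred = (np.linalg.norm(X4 - c1, axis=1) <
        np.linalg.norm(X4 - c0, axis=1)).astype(int)
accuracy = (pred == y).mean()
```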

  20. Automated reuseable components system study results

    NASA Technical Reports Server (NTRS)

    Gilroy, Kathy

    1989-01-01

    The Automated Reusable Components System (ARCS) was developed under a Phase 1 Small Business Innovative Research (SBIR) contract for the U.S. Army CECOM. The objectives of the ARCS program were: (1) to investigate issues associated with automated reuse of software components, identify alternative approaches, and select promising technologies, and (2) to develop tools that support component classification and retrieval. The approach followed was to research emerging techniques and experimental applications associated with reusable software libraries, to investigate the more mature information retrieval technologies for applicability, and to investigate the applicability of specialized technologies to improve the effectiveness of a reusable component library. Various classification schemes and retrieval techniques were identified and evaluated for potential application in an automated library system for reusable components. Strategies for library organization and management, component submittal and storage, and component search and retrieval were developed. A prototype ARCS was built to demonstrate the feasibility of automating the reuse process. The prototype was created using a subset of the classification and retrieval techniques that were investigated. The demonstration system was exercised and evaluated using reusable Ada components selected from the public domain. A requirements specification for a production-quality ARCS was also developed.
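
    A faceted classification-and-retrieval scheme of the kind ARCS investigated can be sketched in a few lines; the component names and facet vocabulary below are invented stand-ins, not the actual ARCS scheme:

```python
# Hedged sketch of faceted classification/retrieval for a reusable
# component library. Components, facets, and values are hypothetical.
library = {
    "stack_pkg": {"function": "storage",  "structure": "stack", "language": "Ada"},
    "queue_pkg": {"function": "storage",  "structure": "queue", "language": "Ada"},
    "sort_util": {"function": "ordering", "structure": "array", "language": "Ada"},
}

def retrieve(query):
    """Return components whose facets match every key/value in the query."""
    return sorted(name for name, facets in library.items()
                  if all(facets.get(k) == v for k, v in query.items()))

print(retrieve({"function": "storage"}))                        # ['queue_pkg', 'stack_pkg']
print(retrieve({"function": "storage", "structure": "stack"}))  # ['stack_pkg']
```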

  1. A perspective on future directions in aerospace propulsion system simulation

    NASA Technical Reports Server (NTRS)

    Miller, Brent A.; Szuch, John R.; Gaugler, Raymond E.; Wood, Jerry R.

    1989-01-01

    The design and development of aircraft engines is a lengthy and costly process using today's methodology. This is due, in large measure, to the fact that present methods rely heavily on experimental testing to verify the operability, performance, and structural integrity of components and systems. The potential exists for achieving significant speedups in the propulsion development process through increased use of computational techniques for simulation, analysis, and optimization. This paper outlines the concept and technology requirements for a Numerical Propulsion Simulation System (NPSS) that would provide capabilities to do interactive, multidisciplinary simulations of complete propulsion systems. By combining high performance computing hardware and software with state-of-the-art propulsion system models, the NPSS will permit the rapid calculation, assessment, and optimization of subcomponent, component, and system performance, durability, reliability, and weight before committing to building hardware.

  2. Conceptual design study: Forest Fire Advanced System Technology (FFAST)

    NASA Technical Reports Server (NTRS)

    Nichols, J. D.; Warren, J. R.

    1986-01-01

    An integrated forest fire detection and mapping system that will be based upon technology available in the 1990s was defined. Uncertainties in emerging and advanced technologies related to the conceptual design were identified and recommended for inclusion as preferred system components. System component technologies identified for an end-to-end system include thermal infrared, linear array detectors, automatic georeferencing and signal processing, geosynchronous satellite communication links, and advanced data integration and display. Potential system configuration options were developed and examined for possible inclusion in the preferred system configuration. The preferred system configuration will provide increased performance and be cost effective over the system currently in use. Forest fire management user requirements and the system component emerging technologies were the basis for the system configuration design. A preferred system configuration was defined that warrants continued refinement and development, examined economic aspects of the current and preferred system, and provided preliminary cost estimates for follow-on system prototype development.

  3. 77 FR 40082 - Certain Gaming and Entertainment Consoles, Related Software, and Components Thereof...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-06

    ... Commission remands for the ALJ to (1) apply the Commission's opinion in Certain Electronic Devices With Image Processing Systems, Components Thereof, and Associated Software, Inv. No. 337-TA-724, Comm'n Op. (Dec. 21...

  4. Water-saving liquid-gas conditioning system

    DOEpatents

    Martin, Christopher; Zhuang, Ye

    2014-01-14

    A method for treating a process gas with a liquid comprises contacting a process gas with a hygroscopic working fluid in order to remove a constituent from the process gas. A system for treating a process gas with a liquid comprises a hygroscopic working fluid comprising a component adapted to absorb or react with a constituent of a process gas, and a liquid-gas contactor for contacting the working fluid and the process gas, wherein the constituent is removed from the process gas within the liquid-gas contactor.

  5. Hydrological processes and the water budget of lakes

    USGS Publications Warehouse

    Winter, Thomas C.; Lerman, Abraham; Imboden, Dieter M.; Gat, Joel R.

    1995-01-01

    Lakes interact with all components of the hydrological system: atmospheric water, surface water, and groundwater. The fluxes of water to and from lakes with regard to each of these components represent the water budget of a lake. Mathematically, the concept of a water budget is deceptively simple: income equals outgo, plus or minus change in storage. In practice, however, measuring the water fluxes to and from lakes accurately is not simple, because understanding of the various hydrological processes and the ability to measure the various hydrological components are limited.
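
    The "income equals outgo, plus or minus change in storage" balance can be written out explicitly; the flux values below are made-up illustrative numbers, not measurements:

```python
# The lake water budget from the abstract as a balance check:
# precipitation + surface inflow + groundwater inflow
#   = evaporation + surface outflow + groundwater outflow + storage change.
# All flux values are assumed (think of them as mm/yr over the lake area).

def storage_change(p, si, gi, e, so, go):
    """Change in lake storage = total inflow - total outflow."""
    return (p + si + gi) - (e + so + go)

print(storage_change(p=900, si=400, gi=150, e=1000, so=300, go=100))  # 50
```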

  6. 3D printing in X-ray and Gamma-Ray Imaging: A novel method for fabricating high-density imaging apertures☆

    PubMed Central

    Miller, Brian W.; Moore, Jared W.; Barrett, Harrison H.; Fryé, Teresa; Adler, Steven; Sery, Joe; Furenlid, Lars R.

    2011-01-01

    Advances in 3D rapid-prototyping printers, 3D modeling software, and casting techniques allow for cost-effective fabrication of custom components in gamma-ray and X-ray imaging systems. Applications extend to new fabrication methods for custom collimators, pinholes, calibration and resolution phantoms, mounting and shielding components, and imaging apertures. Details of the fabrication process for these components, specifically the 3D printing process, cold casting with a tungsten epoxy, and lost-wax casting in platinum are presented. PMID:22199414

  7. Risk Analysis using Corrosion Rate Parameter on Gas Transmission Pipeline

    NASA Astrophysics Data System (ADS)

    Sasikirono, B.; Kim, S. J.; Haryadi, G. D.; Huda, A.

    2017-05-01

    In the oil and gas industry, the pipeline is a major component in the transmission and distribution of oil and gas. Distribution sometimes carries oil and gas across many types of environmental conditions, so the pipeline must operate safely and not harm the surrounding environment. Corrosion is still a major cause of failure in equipment components of a production facility. In pipeline systems, corrosion can cause wall failures and damage to the pipeline, so the pipeline system requires care and periodic inspection. Every production facility in an industry carries a level of risk for damage, determined by the likelihood and consequences of the damage. The purpose of this research is to analyze the risk level of a 20-inch natural gas transmission pipeline using semi-quantitative risk-based inspection per API 581, which combines the likelihood of failure and the consequences of failure for each equipment component. The result is then used to determine the next inspection plan. Nine pipeline components were examined, including straight inlet pipes, connection tees, and straight outlet pipes. The risk levels of the nine pipeline components are presented in a risk matrix; all were assessed at a medium risk level. The failure mechanism considered in this research is thinning. From the corrosion rate calculation, the remaining age of each pipeline component can be obtained, so the remaining lifetime of the components is known; the results vary for each component. The next step is planning the inspection of the pipeline components by external NDT methods.
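
    The remaining-life estimate implied by the thinning mechanism is a simple ratio; a minimal sketch follows, with all wall thicknesses and the corrosion rate assumed for illustration (not values from the study):

```python
# Remaining-life estimate for a thinning (corrosion) mechanism:
# remaining life = (current wall thickness - minimum required thickness)
#                  / corrosion rate. All numbers are assumed.

def remaining_life_years(t_current_mm, t_min_mm, rate_mm_per_year):
    """Years until the wall corrodes down to its minimum required thickness."""
    if rate_mm_per_year <= 0:
        raise ValueError("corrosion rate must be positive")
    return (t_current_mm - t_min_mm) / rate_mm_per_year

print(remaining_life_years(t_current_mm=9.5, t_min_mm=6.0,
                           rate_mm_per_year=0.25))  # 14.0 years
```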

  8. Automated system for measurement, collection and processing of hydrometeorological data aboard scientific research vessels of the GUGMS (SIGMA-s)

    NASA Technical Reports Server (NTRS)

    Borisenkov, Y. P.; Fedorov, O. M.

    1974-01-01

    A report is made on the automated system known as SIGMA-s for the measurement, collection, and processing of hydrometeorological data aboard scientific research vessels of the Hydrometeorological Service. The various components of the system and the interfacing between them are described, as well as the projects that the system is equipped to handle.

  9. Hybrid Modeling for Testing Intelligent Software for Lunar-Mars Closed Life Support

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Nicholson, Leonard S. (Technical Monitor)

    1999-01-01

    Intelligent software is being developed for closed life support systems with biological components, for human exploration of the Moon and Mars. The intelligent software functions include planning/scheduling, reactive discrete control and sequencing, management of continuous control, and fault detection, diagnosis, and management of failures and errors. Four types of modeling information have been essential to system modeling and simulation to develop and test the software and to provide operational model-based what-if analyses: discrete component operational and failure modes; continuous dynamic performance within component modes, modeled qualitatively or quantitatively; configuration of flows and power among components in the system; and operations activities and scenarios. CONFIG, a multi-purpose discrete event simulation tool that integrates all four types of models for use throughout the engineering and operations life cycle, has been used to model components and systems involved in the production and transfer of oxygen and carbon dioxide in a plant-growth chamber and between that chamber and a habitation chamber with physicochemical systems for gas processing.

  10. System Testing of Ground Cooling System Components

    NASA Technical Reports Server (NTRS)

    Ensey, Tyler Steven

    2014-01-01

    This internship focused primarily upon software unit testing of Ground Cooling System (GCS) components, one of the three types of tests (unit, integrated, and COTS/regression) utilized in software verification. Unit tests are used to test the software of necessary components before it is implemented into the hardware. A unit test exercises the control data, usage procedures, and operating procedures of a particular component to determine whether the program is fit for use. Three different files are used to build and complete an efficient unit test: the Model Test file (.mdl), the Simulink SystemTest (.test), and the autotest (.m). The Model Test file includes the component that is being tested with the appropriate Discrete Physical Interface (DPI) for testing. The Simulink SystemTest is a program used to test all of the requirements of the component. The autotest verifies that the component passes Model Advisor and System Testing, and puts the results into the proper files. Once unit testing is completed on the GCS components, they can be implemented into the GCS schematic, and the software of the GCS model as a whole can be tested using integrated testing. Unit testing is a critical part of software verification; it allows more basic components to be tested before a higher-fidelity model is tested, making the testing process flow in an orderly manner.
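
    The unit-testing idea described here, exercising one component against its requirements before integration, can be sketched outside the Simulink toolchain. The component and its requirement below are invented stand-ins, not actual GCS logic:

```python
# Sketch of component-level unit testing: verify one small component's
# behavior against its requirements before integrating it. The valve
# component and its setpoint rule are hypothetical examples.
import unittest

def coolant_valve_command(temp_c, setpoint_c):
    """Toy GCS-style component: command the valve open above setpoint."""
    return "OPEN" if temp_c > setpoint_c else "CLOSED"

class CoolantValveTest(unittest.TestCase):
    def test_opens_above_setpoint(self):
        self.assertEqual(coolant_valve_command(25.0, 20.0), "OPEN")

    def test_closed_at_or_below_setpoint(self):
        self.assertEqual(coolant_valve_command(20.0, 20.0), "CLOSED")
        self.assertEqual(coolant_valve_command(15.0, 20.0), "CLOSED")

# Run the suite explicitly so this works outside a test-runner context
suite = unittest.TestLoader().loadTestsFromTestCase(CoolantValveTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```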

  11. Modeling methodology for MLS range navigation system errors using flight test data

    NASA Technical Reports Server (NTRS)

    Karmali, M. S.; Phatak, A. V.

    1982-01-01

    Flight test data was used to develop a methodology for modeling MLS range navigation system errors. The data used corresponded to the constant velocity and glideslope approach segment of a helicopter landing trajectory. The MLS range measurement was assumed to consist of low frequency and random high frequency components. The random high frequency component was extracted from the MLS range measurements. This was done by appropriate filtering of the range residual generated from a linearization of the range profile for the final approach segment. This range navigation system error was then modeled as an autoregressive moving average (ARMA) process. Maximum likelihood techniques were used to identify the parameters of the ARMA process.
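
    The modeling step can be illustrated with the simplest member of the ARMA family: simulate an AR(1) "residual" series and recover its coefficient. This is a stand-in for the paper's maximum-likelihood ARMA identification, using the lag-1 Yule-Walker estimate on synthetic data:

```python
# Sketch of ARMA-style modeling of a high-frequency residual: simulate
# an AR(1) process x[t] = phi*x[t-1] + e[t] and estimate phi from the
# series. Synthetic data only; the real work used MLE on flight data.
import numpy as np

rng = np.random.default_rng(1)
phi = 0.7                     # true AR(1) coefficient (assumed)
n = 5000
e = rng.normal(size=n)        # white-noise driving term
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

# Yule-Walker (lag-1) estimate of phi from the simulated residual
phi_hat = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
```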

  12. Power processing

    NASA Technical Reports Server (NTRS)

    Schwarz, F. C.

    1971-01-01

    Processing of electric power has been presented as a discipline that draws on almost every field of electrical engineering, including system and control theory, communications theory, electronic network design, and power component technology. The cost of power processing equipment, which often equals that of expensive, sophisticated, and unconventional sources of electrical energy, such as solar batteries, is a significant consideration in the choice of electric power systems.

  13. 39 CFR 501.14 - Postage Evidencing System inventory control processes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 39 Postal Service 1 2010-07-01 2010-07-01 false Postage Evidencing System inventory control processes. 501.14 Section 501.14 Postal Service UNITED STATES POSTAL SERVICE POSTAGE PROGRAMS AUTHORIZATION... affect Postal Service revenues, or of any memory component, or that affects the accuracy of the registers...

  14. DEMONSTRATION BULLETIN: LOW TEMPERATURE THERMAL AERATION (LTTA®) SYSTEM - CANONIE ENVIRONMENTAL SERVICES, INC.

    EPA Science Inventory

    The Low Temperature Thermal Aeration (LTTA®) process was developed by Canonie Environmental Services, Inc. (Canonie), as a treatment system that desorbs organic contaminants from soils by heating the soils up to 800 °F. The main components of the LTTA process include the follow...

  15. A flexible system for the estimation of infiltration and hydraulic resistance parameters in surface irrigation

    USDA-ARS?s Scientific Manuscript database

    Critical to the use of modeling tools for the hydraulic analysis of surface irrigation systems is characterizing the infiltration and hydraulic resistance process. Since those processes are still not well understood, various formulations are currently used to represent them. A software component h...

  16. Oxidation Ditches. Student Manual. Biological Treatment Process Control.

    ERIC Educational Resources Information Center

    Nelsen, David

    The textual material for a two-lesson unit on oxidation ditches is presented in this student manual. Topics discussed in the first lesson (introduction, theory, and components) include: history of the oxidation ditch process; various designs of the oxidation ditch; multi-trench systems; carrousel system; advantages and disadvantages of the…

  17. Evaluation of Models of the Reading Process.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    A variety of reading process models have been proposed and evaluated in reading research. Traditional approaches to model evaluation specify the workings of a system in a simplified fashion to enable organized, systematic study of the system's components. Following are several statistical methods of model evaluation: (1) empirical research on…

  18. Improving Papanicolaou test quality and reducing medical errors by using Toyota production system methods.

    PubMed

    Raab, Stephen S; Andrew-Jaja, Carey; Condel, Jennifer L; Dabbs, David J

    2006-01-01

    The objective of the study was to determine whether the Toyota production system process improves Papanicolaou test quality and patient safety. An 8-month nonconcurrent cohort study that included 464 case and 639 control women who had a Papanicolaou test was performed. Office workflow was redesigned using Toyota production system methods by introducing a 1-by-1 continuous flow process. We measured the frequency of Papanicolaou tests without a transformation zone component, follow-up and Bethesda System diagnostic frequency of atypical squamous cells of undetermined significance, and diagnostic error frequency. After the intervention, the percentage of Papanicolaou tests lacking a transformation zone component decreased from 9.9% to 4.7% (P = .001). The percentage of Papanicolaou tests with a diagnosis of atypical squamous cells of undetermined significance decreased from 7.8% to 3.9% (P = .007). The frequency of error per correlating cytologic-histologic specimen pair decreased from 9.52% to 7.84%. The introduction of the Toyota production system process resulted in improved Papanicolaou test quality.

  19. Inertial Energy Storage for Spacecraft

    NASA Technical Reports Server (NTRS)

    Rodriguez, G. E.

    1984-01-01

The feasibility of inertial energy storage in a spacecraft power system is evaluated on the basis of a conceptual integrated design that encompasses a composite rotor, magnetic suspension, and a permanent magnet (PM) motor/generator for a 3-kW orbital average payload at a bus distribution voltage of 250 volts dc. The conceptual design is referred to as a Mechanical Capacitor. The baseline power system configuration selected is a series system employing peak-power-tracking for a low-Earth-orbit application. Power processing, required in the motor/generator, provides potential alternatives that, in systems with electrochemical energy storage, can only be achieved by adding power processing components. One such alternative configuration provides peak-power-tracking of the solar array while still maintaining a regulated bus, without the expense of additional power processing components. Precise speed control of the two counterrotating wheels is required to reduce interaction with the attitude control system (ACS); alternatively, the wheels can be used to perform attitude control functions.

  20. Solid oxide fuel cell power plant having a bootstrap start-up system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lines, Michael T

The bootstrap start-up system (42) achieves an efficient start-up of the power plant (10) that minimizes formation of soot within a reformed hydrogen rich fuel. A burner (48) receives un-reformed fuel directly from the fuel supply (30) and combusts the fuel to heat cathode air which then heats an electrolyte (24) within the fuel cell (12). A dilute hydrogen forming gas (68) cycles through a sealed heat-cycling loop (66) to transfer heat and generated steam from an anode side (32) of the electrolyte (24) through fuel processing system (36) components (38, 40) and back to an anode flow field (26) until the fuel processing system components (38, 40) achieve predetermined optimal temperatures and steam content. Then, the heat-cycling loop (66) is unsealed and the un-reformed fuel is admitted into the fuel processing system (36) and anode flow field (26) to commence ordinary operation of the power plant (10).

  1. A Vision-Based Driver Nighttime Assistance and Surveillance System Based on Intelligent Image Sensing Techniques and a Heterogamous Dual-Core Embedded System Architecture

    PubMed Central

    Chen, Yen-Lin; Chiang, Hsin-Han; Chiang, Chuan-Yen; Liu, Chuan-Ming; Yuan, Shyan-Ming; Wang, Jenq-Haur

    2012-01-01

    This study proposes a vision-based intelligent nighttime driver assistance and surveillance system (VIDASS system) implemented by a set of embedded software components and modules, and integrates these modules to accomplish a component-based system framework on an embedded heterogamous dual-core platform. Therefore, this study develops and implements computer vision and sensing techniques of nighttime vehicle detection, collision warning determination, and traffic event recording. The proposed system processes the road-scene frames in front of the host car captured from CCD sensors mounted on the host vehicle. These vision-based sensing and processing technologies are integrated and implemented on an ARM-DSP heterogamous dual-core embedded platform. Peripheral devices, including image grabbing devices, communication modules, and other in-vehicle control devices, are also integrated to form an in-vehicle-embedded vision-based nighttime driver assistance and surveillance system. PMID:22736956

  2. A vision-based driver nighttime assistance and surveillance system based on intelligent image sensing techniques and a heterogamous dual-core embedded system architecture.

    PubMed

    Chen, Yen-Lin; Chiang, Hsin-Han; Chiang, Chuan-Yen; Liu, Chuan-Ming; Yuan, Shyan-Ming; Wang, Jenq-Haur

    2012-01-01

    This study proposes a vision-based intelligent nighttime driver assistance and surveillance system (VIDASS system) implemented by a set of embedded software components and modules, and integrates these modules to accomplish a component-based system framework on an embedded heterogamous dual-core platform. Therefore, this study develops and implements computer vision and sensing techniques of nighttime vehicle detection, collision warning determination, and traffic event recording. The proposed system processes the road-scene frames in front of the host car captured from CCD sensors mounted on the host vehicle. These vision-based sensing and processing technologies are integrated and implemented on an ARM-DSP heterogamous dual-core embedded platform. Peripheral devices, including image grabbing devices, communication modules, and other in-vehicle control devices, are also integrated to form an in-vehicle-embedded vision-based nighttime driver assistance and surveillance system.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Qishi; Zhu, Mengxia; Rao, Nageswara S

We propose an intelligent decision support system based on sensor and computer networks that incorporates various component techniques for sensor deployment, data routing, distributed computing, and information fusion. The integrated system is deployed in a distributed environment composed of both wireless sensor networks for data collection and wired computer networks for data processing in support of homeland security defense. We present the system framework, formulate the analytical problems, and develop approximate or exact solutions for the subtasks: (i) a sensor deployment strategy based on a two-dimensional genetic algorithm to achieve maximum coverage under cost constraints; (ii) a data routing scheme to achieve maximum signal strength with minimum path loss, high energy efficiency, and effective fault tolerance; (iii) a network mapping method to assign computing modules to network nodes for high-performance distributed data processing; and (iv) a binary decision fusion rule that derives threshold bounds to improve the system hit rate and false alarm rate. These component solutions are implemented and evaluated through either experiments or simulations in various application scenarios. The extensive results demonstrate that these component solutions imbue the integrated system with the desirable and useful quality of intelligence in decision making.
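
    The abstract does not give the fusion rule's derivation; purely as an illustration (all numbers hypothetical), a minimal k-out-of-n fusion sketch shows how a decision threshold trades the system-level hit rate against the false-alarm rate:

    ```python
    from math import comb

    def at_least_k(n, k, p):
        """P(X >= k) for X ~ Binomial(n, p): chance that at least
        k of n independent sensors report a detection."""
        return sum(comb(n, m) * p**m * (1 - p)**(n - m) for m in range(k, n + 1))

    def fusion_rates(n, k, p_hit, p_fa):
        """System-level hit and false-alarm rates under a k-out-of-n
        binary fusion rule with i.i.d. sensors."""
        return at_least_k(n, k, p_hit), at_least_k(n, k, p_fa)

    # Example: 5 sensors, each with a 90% hit rate and a 10% false-alarm
    # rate. Requiring 3 agreeing detections raises the system hit rate
    # above any single sensor's while suppressing false alarms.
    hit, fa = fusion_rates(5, 3, 0.9, 0.1)
    ```

    Sweeping k from 1 to n makes the trade-off explicit: larger k suppresses false alarms at the cost of hit rate, and threshold bounds like those referenced above would pick a k that satisfies both targets.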

  4. Workshop on the Space Environment: The Effects on the Optical Properties of Airless Bodies

    NASA Technical Reports Server (NTRS)

    Hapke, B. (Editor); Clark, B. (Editor); Benedix, G. (Editor); Domingue, D. (Editor); Cintala, M. (Editor)

    1993-01-01

    Reflectance spectrophotometry and polarimetry are major tools in remote sensing studies of surfaces of solar system bodies. The interpretations of such measurements are often based on laboratory studies of meteoritic, lunar, and terrestrial materials. However, the optical properties of regoliths are known to be affected by the space environment. Thus, some of the major questions addressed in the workshop include identity of the soil component responsible for alteration of the optical properties, the process that produced this component, and how reliably the effects of these processes could be extrapolated to other bodies of the solar system.

  5. Future of Assurance: Ensuring that a System is Trustworthy

    NASA Astrophysics Data System (ADS)

    Sadeghi, Ahmad-Reza; Verbauwhede, Ingrid; Vishik, Claire

Significant efforts are put into defining and implementing strong security measures for all components of the computing environment. It is equally important to be able to evaluate the strength and robustness of these measures and establish trust among the components of the computing environment based on parameters and attributes of these elements and best practices associated with their production and deployment. Today the inventory of techniques used for security assurance and to establish trust -- audit, security-conscious development process, cryptographic components, external evaluation -- is somewhat limited. These methods have their indisputable strengths and have contributed significantly to the advancement of security assurance. However, shorter product and technology development cycles and the sheer complexity of modern digital systems and processes have begun to decrease the efficiency of these techniques. Moreover, these approaches and technologies address only some aspects of security assurance and, for the most part, evaluate assurance in a general design rather than an instance of a product. Additionally, various components of the computing environment participating in the same processes enjoy different levels of security assurance, making it difficult to ensure adequate levels of protection end-to-end. Finally, most evaluation methodologies rely on the knowledge and skill of the evaluators, making reliable assessments of the trustworthiness of a system even harder to achieve. The paper outlines some issues in security assurance that apply across the board, with a focus on the trustworthiness and authenticity of hardware components, and evaluates current approaches to assurance.

  6. Influence of the coating process on the tribological conditions during cold forging with a MoS2 based lubricant

    NASA Astrophysics Data System (ADS)

    Lorenz, Robby; Hagenah, Hinnerk; Merklein, Marion

    2018-05-01

Cold forging processes such as forward rod extrusion can be used to produce high-quality components like connecting rods, shafts, and gears. The main advantages of these extruded components are sufficient surface quality, work hardening, compressive residual stresses, and fatigue strength. Since one technical disadvantage of extruded components lies in the achievable tolerance classes, improving these is of crucial importance. For instance, the attainable workpiece accuracy and component quality can be influenced by adapting the tribological system in such a way that the resulting friction is specifically controlled in order to improve component forming. Lubricant modification is one practical way of adapting the tribological system to the requirements of the forming process. An industrially established and highly efficient lubricant system is the application of a zinc-phosphate conversion layer with a molybdenum disulfide-based lubricant. While offering many advantages, its tribological conditions appear to depend strongly on the layer weight and the application strategy. These parameters and their interdependencies have not been sufficiently investigated yet. In order to examine this, the tribological conditions depending on the layer weight are analyzed in greater detail using the Ring-Compression-Test (RCT). This tribometer provides a comparative representation of the forming conditions during cold forging. Furthermore, a potential dependency between the tribological conditions and two different coating techniques is analyzed. The latter are represented by the industrial standards dipping and dip-drumming.

  7. Advanced Turbine Technology Applications Project (ATTAP)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    ATTAP activities during the past year included test-bed engine design and development, ceramic component design, materials and component characterization, ceramic component process development and fabrication, ceramic component rig testing, and test-bed engine fabrication and testing. Significant technical challenges remain, but all areas exhibited progress. Test-bed engine design and development included engine mechanical design, combustion system design, alternate aerodynamic designs of gasifier scrolls, and engine system integration aimed at upgrading the AGT-5 from a 1038 C (1900 F) metal engine to a durable 1372 C (2500 F) structural ceramic component test-bed engine. ATTAP-defined ceramic and associated ceramic/metal component design activities completed include the ceramic gasifier turbine static structure, the ceramic gasifier turbine rotor, ceramic combustors, the ceramic regenerator disk, the ceramic power turbine rotors, and the ceramic/metal power turbine static structure. The material and component characterization efforts included the testing and evaluation of seven candidate materials and three development components. Ceramic component process development and fabrication proceeded for the gasifier turbine rotor, gasifier turbine scroll, gasifier turbine vanes and vane platform, extruded regenerator disks, and thermal insulation. Component rig activities included the development of both rigs and the necessary test procedures, and conduct of rig testing of the ceramic components and assemblies. Test-bed engine fabrication, testing, and development supported improvements in ceramic component technology that permit the achievement of both program performance and durability goals. Total test time in 1991 amounted to 847 hours, of which 128 hours were engine testing, and 719 were hot rig testing.

  8. Attacks and intrusion detection in wireless sensor networks of industrial SCADA systems

    NASA Astrophysics Data System (ADS)

    Kamaev, V. A.; Finogeev, A. G.; Finogeev, A. A.; Parygin, D. S.

    2017-01-01

The information security effectiveness of automated process control systems (APCS) and supervisory control and data acquisition (SCADA) systems depends on the protection technologies applied to the data transmission components of the transport environment. This article investigates the problem of detecting attacks in wireless sensor networks (WSN) of SCADA systems. As a result of analytical studies, the authors developed a detailed classification of external attacks and intrusion detection in sensor networks and provide a detailed description of attacking impacts on components of SCADA systems in accordance with the selected directions of attack.

  9. Toward the Modularization of Decision Support Systems

    NASA Astrophysics Data System (ADS)

    Raskin, R. G.

    2009-12-01

Decision support systems are typically developed entirely from scratch without the use of modular components. This “stovepiped” approach is inefficient and costly because it prevents a developer from leveraging the data, models, tools, and services of other developers. Even when a decision support component is made available, it is difficult to know what problem it solves, how it relates to other components, or even that the component exists. The Spatial Decision Support (SDS) Consortium was formed in 2008 to organize the body of knowledge in SDS within a common portal. The portal identifies the canonical steps in the decision process and enables decision support components to be registered, categorized, and searched. This presentation describes how a decision support system can be assembled from modular models, data, tools, and services, based on the needs of the Earth science application.

  10. CLIMLAB: a Python-based software toolkit for interactive, process-oriented climate modeling

    NASA Astrophysics Data System (ADS)

    Rose, B. E. J.

    2015-12-01

Global climate is a complex emergent property of the rich interactions between simpler components of the climate system. We build scientific understanding of this system by breaking it down into component process models (e.g. radiation, large-scale dynamics, boundary layer turbulence), understanding each component, and putting them back together. Hands-on experience and freedom to tinker with climate models (whether simple or complex) is invaluable for building physical understanding. CLIMLAB is an open-ended software engine for interactive, process-oriented climate modeling. With CLIMLAB you can interactively mix and match model components, or combine simpler process models together into a more comprehensive model. It was created primarily to support classroom activities, using hands-on modeling to teach fundamentals of climate science at both undergraduate and graduate levels. CLIMLAB is written in Python and ties in with the rich ecosystem of open-source scientific Python tools for numerics and graphics. The IPython notebook format provides an elegant medium for distributing interactive example code. I will give an overview of the current capabilities of CLIMLAB, the curriculum we have developed thus far, and plans for the future. Using CLIMLAB requires some basic Python coding skills. We consider this an educational asset, as we are targeting upper-level undergraduates and Python is an increasingly important language in STEM fields. However CLIMLAB is well suited to be deployed as a computational back-end for a graphical gaming environment based on earth-system modeling.
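
    CLIMLAB's actual API is documented with the package; purely as an illustration of the "mix and match process components" idea described above, a toy zero-dimensional energy-balance model can be assembled from interchangeable tendency-producing processes (the class names and constants here are hypothetical, not climlab's):

    ```python
    # Each process computes a tendency (W m^-2) on a shared temperature
    # state; a model is just the coupled sum of its processes.

    SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

    class AbsorbedShortwave:
        """Constant absorbed solar heating (W m^-2)."""
        def __init__(self, asr):
            self.asr = asr
        def tendency(self, T):
            return self.asr

    class GreyOLR:
        """Grey-body longwave cooling with effective emissivity eps."""
        def __init__(self, eps):
            self.eps = eps
        def tendency(self, T):
            return -self.eps * SIGMA * T**4

    class Model:
        """Couple any set of processes; time-step with heat capacity C
        (J m^-2 K^-1, roughly a 100 m ocean mixed layer)."""
        def __init__(self, processes, T=288.0, C=4e8):
            self.processes, self.T, self.C = processes, T, C
        def step(self, dt=86400.0):
            net = sum(p.tendency(self.T) for p in self.processes)
            self.T += dt * net / self.C

    model = Model([AbsorbedShortwave(240.0), GreyOLR(0.612)])
    for _ in range(365 * 50):   # integrate 50 "years" to equilibrium
        model.step()
    # model.T approaches the grey-body equilibrium (240/(eps*SIGMA))**0.25,
    # about 288 K for these parameter choices
    ```

    Swapping in a different OLR process, or adding another tendency term, changes the model without touching the time-stepping code; that composability is the pedagogical point.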

  11. Developing interpretable models with optimized set reduction for identifying high risk software components

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.

    1993-01-01

    Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low/high fault frequency components so that testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents the Optimized Set Reduction approach for constructing such models, intended to fulfill specific software engineering needs. Our approach to classification is to measure the software system and build multivariate stochastic models for predicting high risk system components. We present experimental results obtained by classifying Ada components into two classes: is or is not likely to generate faults during system and acceptance test. Also, we evaluate the accuracy of the model and the insights it provides into the error making process.
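
    The Optimized Set Reduction procedure itself is only summarized above; as a minimal sketch of the underlying idea (searching the measurement space for conditions that separate fault-prone from fault-free components), a single decision stump over hypothetical component metrics looks like this:

    ```python
    # Not the OSR algorithm; a decision-stump sketch of classifying
    # components as fault-prone from metric thresholds.

    def best_stump(components):
        """components: list of (metrics_dict, faulty_bool).
        Returns (metric, threshold, accuracy) for the single rule
        'metric > threshold => fault-prone' with fewest errors."""
        best = None
        for name in components[0][0].keys():
            for t in sorted({m[name] for m, _ in components}):
                correct = sum((m[name] > t) == faulty
                              for m, faulty in components)
                acc = correct / len(components)
                if best is None or acc > best[2]:
                    best = (name, t, acc)
        return best

    # Hypothetical metrics: cyclomatic complexity and size (LOC),
    # with a fault flag from past system and acceptance tests.
    data = [
        ({"complexity": 3,  "loc": 120}, False),
        ({"complexity": 5,  "loc": 300}, False),
        ({"complexity": 18, "loc": 900}, True),
        ({"complexity": 22, "loc": 450}, True),
        ({"complexity": 4,  "loc": 200}, False),
        ({"complexity": 15, "loc": 800}, True),
    ]
    rule = best_stump(data)   # picks the metric/threshold that separates best
    ```

    A real OSR model combines many such conditions into multivariate logical patterns; the stump only conveys the flavor of partitioning the measurement space.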

  12. Advanced and secure architectural EHR approaches.

    PubMed

    Blobel, Bernd

    2006-01-01

Electronic Health Records (EHRs) provided as a lifelong patient record advance towards core applications of distributed and co-operating health information systems and health networks. To meet the challenge of scalable, flexible, portable, secure EHR systems, the underlying EHR architecture must be based on the component paradigm and be model-driven, separating platform-independent and platform-specific models. To allow manageable models, real systems must be decomposed and simplified. The resulting modelling approach has to follow the ISO Reference Model for Open Distributed Processing (RM-ODP). The ISO RM-ODP describes any system component from different perspectives. Platform-independent perspectives contain the enterprise view (business process, policies, scenarios, use cases), the information view (classes and associations) and the computational view (composition and decomposition), whereas platform-specific perspectives concern the engineering view (physical distribution and realisation) and the technology view (implementation details from protocols up to education and training) on system components. Those views have to be established for components reflecting aspects of all domains involved in healthcare environments, including administrative, legal, medical, technical, etc. Thus, security-related component models reflecting all views mentioned have to be established to enable both application and communication security services as an integral part of the system's architecture. Besides the decomposition and simplification of systems through the different viewpoints on their components, different levels of system granularity can be defined, hiding internals or focusing on the properties of basic components to form a more complex structure. The resulting models describe both the structure and behaviour of component-based systems. The described approach has been deployed in different projects defining EHR systems and their underlying architectural principles.
In that context, the Australian GEHR project, the openEHR initiative, and the revision of CEN ENV 13606 "Electronic Health Record communication", all based on Archetypes, but also the HL7 version 3 activities are discussed in some detail. The latter include the HL7 RIM, the HL7 Development Framework, and HL7's Clinical Document Architecture (CDA), as well as the set of models from use cases, activity diagrams, and sequence diagrams up to Domain Information Models (DMIMs) and their building blocks, Common Message Element Types (CMETs), constraining models to their underlying concepts. A future-proof EHR architecture as an open, user-centric, user-friendly, flexible, scalable, portable core application in health information systems and health networks has to follow advanced architectural paradigms.

  13. The composite load spectra project

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Ho, H.; Kurth, R. E.

    1990-01-01

Probabilistic methods and generic load models capable of simulating the load spectra that are induced in space propulsion system components are being developed. Four engine component types (the transfer ducts, the turbine blades, the liquid oxygen posts, and the turbopump oxidizer discharge duct) were selected as representative hardware examples. The composite load spectra that simulate the probabilistic loads for these components are typically used as the input loads for a probabilistic structural analysis. The knowledge-based system approach used for the composite load spectra project provides an ideal environment for incremental development. The intelligent database paradigm employed in developing the expert system provides a smooth coupling between the numerical processing and the symbolic (information) processing. Large volumes of engine load information and engineering data are stored in database format and managed by a database management system. Numerical procedures for probabilistic load simulation and database management functions are controlled by rule modules. Rules were hard-wired as decision trees into rule modules to perform process control tasks. There are modules to retrieve load information and models, and modules to select loads and models to carry out quick load calculations or to prepare an input file for a full duty-cycle, time-dependent load simulation. The composite load spectra expert system implemented to date is capable of performing intelligent rocket engine load spectra simulation. Further development of the expert system will provide tutorial capability so that users can learn from it.

  14. Stripping Away the Soil: Plant Growth Promoting Microbiology Opportunities in Aquaponics.

    PubMed

    Bartelme, Ryan P; Oyserman, Ben O; Blom, Jesse E; Sepulveda-Villet, Osvaldo J; Newton, Ryan J

    2018-01-01

As the processes facilitated by plant growth promoting microorganisms (PGPMs) become better characterized, it is evident that PGPMs may be critical for successful sustainable agricultural practices. Microbes enrich plant growth through various mechanisms, such as enhancing resistance to disease and drought, producing beneficial molecules, and supplying nutrients and trace metals to the plant rhizosphere. Previous studies of PGPMs have focused primarily on soil-based crops. In contrast, aquaponics is a water-based agricultural system, in which production relies upon internal nutrient recycling to co-cultivate plants with fish. This arrangement has management benefits compared to soil-based agriculture, as system components may be designed to directly harness microbial processes that make nutrients bioavailable to plants in downstream components. However, aquaponic systems also present unique management challenges. Microbes may compete with plants for certain micronutrients, such as iron, which makes exogenous supplementation necessary, adding production cost and process complexity, and limiting profitability and system sustainability. Research on PGPMs in aquaponic systems currently lags behind that in traditional agricultural systems; however, it is clear that certain parallels in nutrient use and plant-microbe interactions are retained from soil-based agricultural systems.

  15. Observing System Simulation Experiments

    NASA Technical Reports Server (NTRS)

    Prive, Nikki

    2015-01-01

This presentation gives an overview of Observing System Simulation Experiments (OSSEs). The components of an OSSE are described, along with discussion of the process for validating, calibrating, and performing experiments.

  16. Blogs and Social Network Sites as Activity Systems: Exploring Adult Informal Learning Process through Activity Theory Framework

    ERIC Educational Resources Information Center

    Heo, Gyeong Mi; Lee, Romee

    2013-01-01

    This paper uses an Activity Theory framework to explore adult user activities and informal learning processes as reflected in their blogs and social network sites (SNS). Using the assumption that a web-based space is an activity system in which learning occurs, typical features of the components were investigated and each activity system then…

  17. Multifunctional nanoparticles for drug/gene delivery in nanomedicine

    NASA Astrophysics Data System (ADS)

    Seale, Mary-Margaret; Zemlyanov, Dimitry; Cooper, Christy L.; Haglund, Emily; Prow, Tarl W.; Reece, Lisa M.; Leary, James F.

    2007-02-01

    Multifunctional nanoparticles hold great promise for drug/gene delivery. Multilayered nanoparticles can act as nanomedical systems with on-board "molecular programming" to accomplish complex multi-step tasks. For example, the targeting process has only begun when the nanosystem has found the correct diseased cell of interest. Then it must pass the cell membrane and avoid enzymatic destruction within the endosomes of the cell. Since the nanosystem is only about one millionth the volume of a human cell, for it to have therapeutic efficacy with its contained package, it must deliver that drug or gene to the appropriate site within the living cell. The successive de-layering of these nanosystems in a controlled fashion allows the system to accomplish operations that would be difficult or impossible to do with even complex single molecules. In addition, portions of the nanosystem may be protected from premature degradation or mistargeting to non-diseased cells. All of these problems remain major obstacles to successful drug delivery with a minimum of deleterious side effects to the patient. This paper describes some of the many components involved in the design of a general platform technology for nanomedical systems. The feasibility of most of these components has been demonstrated by our group and others. But the integration of these interacting sub-components remains a challenge. We highlight four components of this process as examples. Each subcomponent has its own sublevels of complexity. But good nanomedical systems have to be designed/engineered as a full nanomedical system, recognizing the need for the other components.

  18. Doppler Lidar System Design via Interdisciplinary Design Concept at NASA Langley Research Center - Part I

    NASA Technical Reports Server (NTRS)

    Boyer, Charles M.; Jackson, Trevor P.; Beyon, Jeffrey Y.; Petway, Larry B.

    2013-01-01

    Optimized designs of the Navigation Doppler Lidar (NDL) instrument for Autonomous Landing Hazard Avoidance Technology (ALHAT) were accomplished via Interdisciplinary Design Concept (IDEC) at NASA Langley Research Center during the summer of 2013. Three branches in the Engineering Directorate and three students were involved in this joint task through the NASA Langley Aerospace Research Summer Scholars (LARSS) Program. The Laser Remote Sensing Branch (LRSB), Mechanical Systems Branch (MSB), and Structural and Thermal Systems Branch (STSB) were engaged to achieve optimal designs through iterative and interactive collaborative design processes. A preliminary design iteration was able to reduce the power consumption, mass, and footprint by removing redundant components and replacing inefficient components with more efficient ones. A second design iteration reduced volume and mass by replacing bulky components with excessive performance with smaller components custom-designed for the power system. Mechanical placement collaboration reduced potential electromagnetic interference (EMI). Through application of newly selected electrical components and thermal analysis data, a total electronic chassis redesign was accomplished. Use of an innovative forced convection tunnel heat sink was employed to meet and exceed project requirements for cooling, mass reduction, and volume reduction. Functionality was a key concern to make efficient use of airflow, and accessibility was also imperative to allow for servicing of chassis internals. The collaborative process provided for accelerated design maturation with substantiated function.

  19. Adsorption and removal of clofibric acid and diclofenac from water with MIEX resin.

    PubMed

    Lu, Xian; Shao, Yisheng; Gao, Naiyun; Chen, Juxiang; Zhang, Yansen; Wang, Qiongfang; Lu, Yuqi

    2016-10-01

This study demonstrates the use of MIEX resin as an efficient adsorbent for the removal of clofibric acid (CA) and diclofenac (DCF). The adsorption performance for CA and DCF is investigated in batch mode in single-component and bi-component adsorption systems. Various factors influencing the adsorption of CA and DCF, including initial concentration, contact time, adsorbent dosage, initial solution pH, agitation speed, natural organic matter, and coexistent anions, are studied. The Langmuir model describes CA adsorption well in the single-component system, while the Freundlich model gives a better fit in the bi-component system. DCF adsorption is well fitted by the Freundlich model in both systems. Thermodynamic analyses show that the adsorption of CA and DCF is an endothermic (ΔH° > 0), entropy-driven (ΔS° > 0) process, with more randomness in the DCF adsorption process. The values of the Gibbs free energy indicate that DCF adsorption is spontaneous (ΔG° < 0) but CA adsorption is nonspontaneous (ΔG° > 0). The kinetic data suggest that the adsorption of CA and DCF follows the pseudo-first-order model in both systems and that intra-particle diffusion is not the sole rate-limiting step. The adsorption process is controlled simultaneously by external mass transfer and surface diffusion, according to the surface-diffusion-modified Biot number (Bis) ranging from 1.06 to 26.15. Moreover, possible removal mechanisms for CA and DCF are proposed based on the ion exchange stoichiometry. Copyright © 2016 Elsevier Ltd. All rights reserved.
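
    The study's own fitted parameters are not reproduced in the abstract; as an illustration of how Freundlich parameters (q = K·C^(1/n)) are recovered from equilibrium data, here is a least-squares fit of the linearized form log q = log K + (1/n) log C on synthetic (hypothetical) data:

    ```python
    from math import log, exp

    def fit_freundlich(C, q):
        """Ordinary least-squares fit of log q vs log C.
        Returns (K, n_inv) for the Freundlich isotherm q = K * C**n_inv."""
        x = [log(c) for c in C]
        y = [log(v) for v in q]
        mx, my = sum(x) / len(x), sum(y) / len(y)
        n_inv = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
                 / sum((xi - mx) ** 2 for xi in x))
        K = exp(my - n_inv * mx)
        return K, n_inv

    # Synthetic equilibrium data generated from K = 2.5, 1/n = 0.6
    Ce = [1.0, 2.0, 5.0, 10.0, 20.0]      # equilibrium concentration, mg/L
    qe = [2.5 * c ** 0.6 for c in Ce]     # adsorbed amount, mg/g
    K, n_inv = fit_freundlich(Ce, qe)     # recovers ~2.5 and ~0.6
    ```

    With real data, the quality of the log-log linearity is what distinguishes Freundlich behavior from Langmuir behavior, as the study does when comparing the two systems.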

  20. The Chandra X-ray Center data system: supporting the mission of the Chandra X-ray Observatory

    NASA Astrophysics Data System (ADS)

    Evans, Janet D.; Cresitello-Dittmar, Mark; Doe, Stephen; Evans, Ian; Fabbiano, Giuseppina; Germain, Gregg; Glotfelty, Kenny; Hall, Diane; Plummer, David; Zografou, Panagoula

    2006-06-01

The Chandra X-ray Center Data System provides end-to-end scientific software support for Chandra X-ray Observatory mission operations. The data system includes the following components: (1) observers' science proposal planning tools; (2) science mission planning tools; (3) science data processing, monitoring, and trending pipelines and tools; and (4) data archive and database management. A subset of the science data processing component is ported to multiple platforms and distributed to end-users as a portable data analysis package. Web-based user tools are also available for data archive search and retrieval. We describe the overall architecture of the data system and its component pieces, and consider the design choices and their impacts on maintainability. We discuss the many challenges involved in maintaining a large, mission-critical software system with limited resources. These challenges include managing continually changing software requirements and ensuring the integrity of the data system and resulting data products while being highly responsive to the needs of the project. We describe our use of COTS and OTS software at the subsystem and component levels, our methods for managing multiple release builds, and adapting a large code base to new hardware and software platforms. We review our experiences during the life of the mission so far, and our approaches for keeping a small, but highly talented, development team engaged during the maintenance phase of a mission.

  1. In-Process Thermal Imaging of the Electron Beam Freeform Fabrication Process

    NASA Technical Reports Server (NTRS)

    Taminger, Karen M.; Domack, Christopher S.; Zalameda, Joseph N.; Taminger, Brian L.; Hafley, Robert A.; Burke, Eric R.

    2016-01-01

    Researchers at NASA Langley Research Center have been developing the Electron Beam Freeform Fabrication (EBF3) metal additive manufacturing process for the past 15 years. In this process, an electron beam is used as a heat source to create a small molten pool on a substrate into which wire is fed. The electron beam and wire feed assembly are translated with respect to the substrate to follow a predetermined tool path. This process is repeated in a layer-wise fashion to fabricate metal structural components. In-process imaging has been integrated into the EBF3 system using a near-infrared (NIR) camera. The images are processed to provide thermal and spatial measurements that have been incorporated into a closed-loop control system to maintain consistent thermal conditions throughout the build. Other information in the thermal images is being used to assess quality in real time by detecting flaws in prior layers of the deposit. NIR camera incorporation into the system has improved the consistency of the deposited material and provides the potential for real-time flaw detection which, ultimately, could lead to the manufacture of better, more reliable components using this additive manufacturing process.
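The closed-loop thermal control described above can be sketched, in highly simplified form, as a proportional update of beam power from the camera-derived melt-pool temperature. The gain, setpoint, and power values below are illustrative assumptions, not the NASA controller's parameters.

```python
def beam_power_update(power_W, T_measured_K, T_setpoint_K=2000.0, kp=0.5):
    """Proportional correction: raise power when the pool runs cold, lower it when hot."""
    return power_W + kp * (T_setpoint_K - T_measured_K)

power = 1000.0
for T in (1900.0, 1950.0, 1990.0):  # pool temperature rising toward the setpoint
    power = beam_power_update(power, T)
```

A real controller would also include integral/derivative terms and limits on slew rate, but the feedback structure is the same: image-derived temperature in, beam power command out.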

  2. Interactive, process-oriented climate modeling with CLIMLAB

    NASA Astrophysics Data System (ADS)

    Rose, B. E. J.

    2016-12-01

    Global climate is a complex emergent property of the rich interactions between simpler components of the climate system. We build scientific understanding of this system by breaking it down into component process models (e.g. radiation, large-scale dynamics, boundary layer turbulence), understanding each component, and putting them back together. Hands-on experience and freedom to tinker with climate models (whether simple or complex) is invaluable for building physical understanding. CLIMLAB is an open-ended software engine for interactive, process-oriented climate modeling. With CLIMLAB you can interactively mix and match model components, or combine simpler process models together into a more comprehensive model. It was created primarily to support classroom activities, using hands-on modeling to teach fundamentals of climate science at both undergraduate and graduate levels. CLIMLAB is written in Python and ties in with the rich ecosystem of open-source scientific Python tools for numerics and graphics. The Jupyter Notebook format provides an elegant medium for distributing interactive example code. I will give an overview of the current capabilities of CLIMLAB, the curriculum we have developed thus far, and plans for the future. Using CLIMLAB requires some basic Python coding skills. We consider this an educational asset, as we are targeting upper-level undergraduates and Python is an increasingly important language in STEM fields.
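The process-composition idea behind CLIMLAB can be sketched in plain Python (this is a generic illustration of the pattern, not CLIMLAB's actual API): independent process components update a shared model state, and a parent model steps them in sequence. The class names, rates, and state variable are illustrative.

```python
class Radiation:
    """Toy longwave cooling process (rate in K per step is illustrative)."""
    def __init__(self, cooling=0.10):
        self.cooling = cooling
    def step(self, state, dt):
        state["T"] -= self.cooling * dt

class Heating:
    """Toy absorbed-shortwave heating process."""
    def __init__(self, rate=0.15):
        self.rate = rate
    def step(self, state, dt):
        state["T"] += self.rate * dt

class Model:
    """Composes process components over a shared state dict."""
    def __init__(self, *processes, T0=288.0):
        self.processes = processes
        self.state = {"T": T0}
    def run(self, steps, dt=1.0):
        for _ in range(steps):
            for p in self.processes:
                p.step(self.state, dt)
        return self.state["T"]

# Mix and match components: swap Heating for another process to build a new model.
T_final = Model(Radiation(), Heating()).run(10)
```

The pedagogical payoff is that students can substitute one process for another and observe how the emergent behavior of the composed model changes.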

  3. Monitoring the Opinions of Parents of College Students as a Component of the Institution's In-House Education Quality Management System

    ERIC Educational Resources Information Center

    Briukhanov, V. M.; Kiselev, V. I.; Timchenko, N. S.; Vdovin, V. M.

    2010-01-01

    The intensive process observed in the past few years, in which higher professional education is coming to be included in the system of market relations, is setting new targets for the activity of institutions of higher learning, as well as for their management models. The marketing component is becoming more and more…

  4. First Workshop on Convergence and Consolidation towards Standard AAL Platform Services

    NASA Astrophysics Data System (ADS)

    Lázaro, Juan-Pablo; Guillén, Sergio; Farshchian, Babak; Mikalsen, Marius

    The following document describes the call for papers for a workshop aimed at identifying the potential commonalities that are important for an AAL system, so that they can be discussed and proposed as the starting point of a standardization process. Component groups such as context management, user-interaction management, and semantic description of services are frequently part of an AAL system.

  5. Evaluation of an Integrated Multi-Task Machine Learning System with Humans in the Loop

    DTIC Science & Technology

    2007-01-01

    machine learning components, natural language processing, and optimization... was examined with a test explicitly developed to measure the impact of integrated machine learning when used by a human user in a real-world setting... The study revealed that integrated machine learning does produce a positive impact on overall performance. This paper also discusses how specific machine learning components contributed to human-system

  6. Coherent-Phase Monitoring Of Cavitation In Turbomachines

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1996-01-01

    A digital electronic signal-processing system analyzes the outputs of accelerometers mounted on a turbomachine to detect vibrations characteristic of cavitation. It is designed to overcome the limitation imposed by interference from discrete spectral components. The system digitally implements a technique called "coherent-phase wide-band demodulation" (CPWBD), using phase-only (PO) filtering along with envelope detection to search for the unique coherent-phase relationship associated with cavitation and to minimize the influence of large-amplitude discrete components.
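The envelope-detection step mentioned above can be illustrated with an FFT-based analytic-signal (Hilbert) envelope. This is a generic textbook sketch, not the CPWBD implementation; the test signal (a 50 Hz carrier amplitude-modulated at 3 Hz) is illustrative.

```python
import numpy as np

def envelope(x):
    """Amplitude envelope via the analytic signal (one-sided spectrum)."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)          # weights that zero the negative frequencies
    h[0] = 1.0
    if N % 2 == 0:
        h[N // 2] = 1.0
        h[1:N // 2] = 2.0
    else:
        h[1:(N + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(X * h))

t = np.linspace(0.0, 1.0, 1000, endpoint=False)
carrier = np.cos(2 * np.pi * 50 * t)
modulation = 1.0 + 0.5 * np.sin(2 * np.pi * 3 * t)  # slow amplitude variation
env = envelope(modulation * carrier)                # recovers the modulation
```

In a cavitation monitor, the recovered envelope (rather than the raw broadband signal) is what gets examined for the coherent-phase signature.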

  7. Guest Editor's Introduction: Special section on dependable distributed systems

    NASA Astrophysics Data System (ADS)

    Fetzer, Christof

    1999-09-01

    We rely more and more on computers. For example, the Internet reshapes the way we do business. A `computer outage' can cost a company a substantial amount of money, not only in business lost during the outage but also in the negative publicity the company receives. This is especially true for Internet companies. After recent computer outages of Internet companies, we have seen drastic falls in the shares of the affected companies. There are multiple causes of computer outages. Although computer hardware is becoming more reliable, hardware-related outages remain an important issue. For example, some of the recent computer outages of companies were caused by failed memory and system boards, and even by crashed disks - a failure type which can easily be masked using disk mirroring. Transient hardware failures might also look like software failures and, hence, might be incorrectly classified as such. However, many outages are software related. Faulty system software, middleware, and application software can crash a system. Dependable computing systems are systems we can rely on. Dependable systems are, by definition, reliable, available, safe and secure [3]. This special section focuses on issues related to dependable distributed systems. Distributed systems have the potential to be more dependable than a single computer because the probability that all computers in a distributed system fail is smaller than the probability that a single computer fails. However, if a distributed system is not built well, it is potentially less dependable than a single computer, since the probability that at least one computer in a distributed system fails is higher than the probability that one computer fails. For example, if the crash of any computer in a distributed system can bring the complete system to a halt, the system is less dependable than a single-computer system. Building dependable distributed systems is an extremely difficult task. 
There is no silver bullet solution. Instead, one has to apply a variety of engineering techniques [2]: fault-avoidance (minimize the occurrence of faults, e.g. by using a proper design process), fault-removal (remove faults before they occur, e.g. by testing), fault-evasion (predict faults by monitoring and reconfigure the system before failures occur), and fault-tolerance (mask and/or contain failures). Building a system from scratch is an expensive and time-consuming effort. To reduce the cost of building dependable distributed systems, one would choose to use commercial off-the-shelf (COTS) components whenever possible. The use of COTS components has several potential advantages beyond minimizing costs. For example, through the widespread use of a COTS component, design failures might be detected and fixed before the component is used in a dependable system. Custom-designed components have to mature without the widespread in-field testing of COTS components. COTS components also have various potential disadvantages when used in dependable systems. For example, minimizing the time to market might lead to the release of components with inherent design faults (e.g. use of `shortcuts' that only work most of the time). In addition, the components might be more complex than needed and, hence, potentially have more design faults than simpler components. However, given economic constraints and the ability to cope with some of the problems using fault-evasion and fault-tolerance, only for a small percentage of systems can one justify not using COTS components. Distributed systems built from current COTS components are asynchronous systems in the sense that there exists no a priori known bound on the transmission delay of messages or the execution time of processes. When designing a distributed algorithm, one would like to make sure (e.g. by testing or verification) that it is correct, i.e. satisfies its specification. 
Many distributed algorithms make use of consensus (eventually all non-crashed processes have to agree on a value), leader election (a crashed leader is eventually replaced by a new leader, but at any time there is at most one leader) or a group membership detection service (a crashed process is eventually suspected to have crashed, but only crashed processes are suspected). From a theoretical point of view, the service specifications given for such services are not implementable in asynchronous systems. In particular, for each implementation one can derive a counterexample in which the service violates its specification. From a practical point of view, the consensus, leader election, and membership detection problems are solvable in asynchronous distributed systems. In this special section, Raynal and Tronel bridge this difference by showing how to implement the group membership detection problem with a negligible probability of failure [1] in an asynchronous system. The group membership detection problem is specified by a liveness condition (L) and a safety property (S): (L) if a process p crashes, then eventually every non-crashed process q has to suspect that p has crashed; and (S) if a process q suspects p, then p has indeed crashed. One can show that either (L) or (S) is implementable, but one cannot implement both (L) and (S) at the same time in an asynchronous system. In practice, one only needs to implement (L) and (S) such that the probability that (L) or (S) is violated becomes negligible. Raynal and Tronel propose and analyse a protocol that implements (L) with certainty and that can be tuned such that the probability that (S) is violated becomes negligible. Designing and implementing distributed fault-tolerant protocols for asynchronous systems is a difficult but not an impossible task. A fault-tolerant protocol has to detect and mask certain failure classes, e.g. crash failures and message omission failures. 
There is a trade-off between the performance of a fault-tolerant protocol and the failure classes the protocol can tolerate. One wants to tolerate as many failure classes as needed to satisfy the stochastic requirements of the protocol [1] while still maintaining sufficient performance. Since clients of a protocol have different requirements with respect to the performance/fault-tolerance trade-off, one would like to be able to customize protocols such that an appropriate performance/fault-tolerance trade-off can be selected. In this special section, Hiltunen et al describe how one can compose protocols from micro-protocols in their Cactus system. They show how a group RPC system can be tailored to the needs of a client. In particular, they show how considering additional failure classes affects the performance of a group RPC system. References [1] Cristian F 1991 Understanding fault-tolerant distributed systems Communications of the ACM 34 (2) 56-78 [2] Heimerdinger W L and Weinstock C B 1992 A conceptual framework for system fault tolerance Technical Report 92-TR-33, CMU/SEI [3] Laprie J C (ed) 1992 Dependability: Basic Concepts and Terminology (Vienna: Springer)
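The tension between the liveness condition (L) and the safety property (S) can be sketched with a classic timeout-based heartbeat detector: a longer timeout makes wrongly suspecting a slow-but-live process (a violation of (S)) less likely, at the cost of slower crash detection. The process names and timing values below are illustrative.

```python
class HeartbeatDetector:
    """Suspects any process whose last heartbeat is older than the timeout."""
    def __init__(self, timeout):
        self.timeout = timeout
        self.last_seen = {}

    def heartbeat(self, proc, now):
        self.last_seen[proc] = now

    def suspects(self, now):
        return {p for p, t in self.last_seen.items() if now - t > self.timeout}

d = HeartbeatDetector(timeout=3.0)
d.heartbeat("p", 0.0)
d.heartbeat("q", 0.0)
d.heartbeat("q", 4.0)   # q keeps sending heartbeats; p has gone silent
```

At time 5.0 the detector suspects only p. In an asynchronous system a message from p could still be in transit, which is exactly why (S) can only be made probabilistically, not certainly, true.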

  8. Doppler Lidar System Design via Interdisciplinary Design Concept at NASA Langley Research Center - Part III

    NASA Technical Reports Server (NTRS)

    Barnes, Bruce W.; Sessions, Alaric M.; Beyon, Jeffrey; Petway, Larry B.

    2014-01-01

    Optimized designs of the Navigation Doppler Lidar (NDL) instrument for Autonomous Landing Hazard Avoidance Technology (ALHAT) were accomplished via the Interdisciplinary Design Concept (IDEC) at NASA Langley Research Center during the summer of 2013. Three branches in the Engineering Directorate and three students were involved in this joint task through the NASA Langley Aerospace Research Summer Scholars (LARSS) Program. The Laser Remote Sensing Branch (LRSB), Mechanical Systems Branch (MSB), and Structural and Thermal Systems Branch (STSB) were engaged to achieve optimal designs through iterative and interactive collaborative design processes. A preliminary design iteration reduced power consumption, mass, and footprint by removing redundant components and replacing inefficient components with more efficient ones. A second design iteration reduced volume and mass by replacing bulky, over-specified components with smaller components custom-designed for the power system. The existing power system was analyzed to rank components in terms of inefficiency, power dissipation, footprint and mass. Design considerations and priorities are compared along with the results of each design iteration. Overall power system improvements are summarized for design implementations.

  9. Enabling Dissimilar Material Joining Using Friction Stir Scribe Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hovanski, Yuri; Upadyay, Piyush; Kleinbaum, Sarah

    2017-04-05

    One challenge in adapting welding processes to dissimilar material joining is the diversity of melting temperatures of the different materials. Although the use of mechanical fasteners and adhesives has mostly paved the way for near-term implementation of dissimilar material systems, these processes only accentuate the need for low-cost welding processes capable of joining dissimilar material components regardless of alloy, properties, or melting temperature. Friction stir scribe technology was developed to overcome the challenges of joining dissimilar material components where melting temperatures vary greatly, and properties and/or chemistry are not compatible with more traditional welding processes. Although the friction stir scribe process is capable of joining dissimilar metals and metal/polymer systems, a more detailed evaluation of several aluminum/steel joints is presented herein to demonstrate the ability to both chemically and mechanically join dissimilar materials.

  11. Energy Saving Melting and Revert Reduction Technology (Energy-SMARRT): Light Metals Permanent Mold Casting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fasoyinu, Yemi

    2014-03-31

    Current vehicles use mostly ferrous components for structural applications. It is possible to reduce the weight of the vehicle by substituting these parts with those made from light metals such as aluminum and magnesium. Many alloys and manufacturing processes can be used to produce these light metal components, and casting is known to be the most economical. One of the high-integrity casting processes is permanent mold casting, which is the focus of this research report. Many aluminum alloy castings used in automotive applications are produced by the sand casting process. Also, aluminum-silicon (Al-Si) alloys are the most widely used alloy systems for automotive applications. It is possible that by using high strength aluminum alloys based on an aluminum-copper (Al-Cu) system and permanent mold casting, the performance of these components can be enhanced significantly. This will also help to further reduce the weight. However, many technological obstacles need to be overcome before using these alloys in automotive applications in an economical way. There is very limited information in the open literature on gravity and low-pressure permanent mold casting of high strength aluminum alloys. This report summarizes the results and issues encountered during the casting trials of high strength aluminum alloy 206.0 (Al-Cu alloy) and moderate strength alloy 535.0 (Al-Mg alloy). Five engineering components were cast by gravity tilt-pour or low pressure permanent mold casting processes at CanmetMATERIALS (CMAT) and two production foundries. The results of the casting trials show that high integrity engineering components can be produced successfully from both alloys if specific processing parameters are used. It was shown that a combination of melt processing and mold temperature is necessary for the elimination of hot tears in both alloys.

  12. Obsolescence Risk Assessment Process Best Practice

    NASA Astrophysics Data System (ADS)

    Romero Rojo, F. J.; Roy, R.; Kelly, S.

    2012-05-01

    A component becomes obsolete when it is no longer available from the original manufacturer to the original specification. In long-lifecycle projects, obsolescence has become a major problem as it prevents the maintenance of the system. This is the reason why obsolescence management is now an essential part of product support activities in sectors such as defence, aerospace, nuclear and railway, where systems need to be supported for several decades. The obsolescence risk assessment for the bill of materials (BoM) is a paramount activity in order to manage obsolescence proactively and cost-effectively. It was therefore necessary to undertake a benchmarking study to develop best practice in this process. A total of 22 obsolescence experts from 13 different organisations/projects across the UK and USA participated in this study. Their current processes and experience have been taken into account in the development of the best-practice process for obsolescence risk assessment. The key factors that have to be analysed in the risk assessment process for each component in the BoM are: number of manufacturers, years to end of life, stock available, consumption rate and operational impact criticality. For the very high risk components, a more detailed analysis is required to inform decisions regarding the most suitable mitigation strategies. Conversely, for the low risk components, a fully proactive approach is neither appropriate nor cost-effective, so it is advised that obsolescence issues for these components are dealt with reactively. This process has been validated through case studies with several experts from industry and is currently being implemented by the UK Ministry of Defence as technical guidance within the JSP 886 Volume 7 Part 8.13 standards.
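The five key factors can be combined into a per-component risk band, as a rough sketch of how a BoM-level assessment might be automated. The weights, thresholds, and band names below are illustrative assumptions, not the JSP 886 guidance or the benchmarked best practice.

```python
def obsolescence_risk(n_manufacturers, years_to_eol, stock_years, criticality):
    """Toy risk banding. stock_years = stock available / annual consumption rate;
    criticality is operational impact in [0, 1]. All weights are illustrative."""
    score = 0.0
    score += 0.3 if n_manufacturers <= 1 else (0.1 if n_manufacturers <= 3 else 0.0)
    score += 0.3 if years_to_eol < 2 else (0.1 if years_to_eol < 5 else 0.0)
    score += 0.2 if stock_years < 1 else 0.0   # stock runs out within a year
    score += 0.2 * criticality
    if score >= 0.6:
        return "very high"   # warrants detailed mitigation analysis
    if score >= 0.3:
        return "medium"
    return "low"             # manage reactively
```

A sole-source, near-end-of-life, under-stocked, mission-critical part lands in the "very high" band, which is exactly the population the abstract says deserves the detailed follow-up analysis.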

  13. A framework for human-hydrologic system model development integrating hydrology and water management: application to the Cutzamala water system in Mexico

    NASA Astrophysics Data System (ADS)

    Wi, S.; Freeman, S.; Brown, C.

    2017-12-01

    This study presents a general approach to developing computational models of human-hydrologic systems where human modification of hydrologic surface processes is significant or dominant. A river basin system is represented by a network of human-hydrologic response units (HHRUs) identified based on locations where river regulation occurs (e.g., reservoir operation and diversions). Natural and human processes in HHRUs are simulated in a holistic framework that integrates component models representing rainfall-runoff, river routing, reservoir operation, flow diversion and water use processes. We illustrate the approach in a case study of the Cutzamala water system (CWS) in Mexico, a complex inter-basin water transfer system supplying the Mexico City Metropolitan Area (MCMA). The human-hydrologic system model for the CWS (CUTZSIM) is evaluated against streamflow and reservoir storage measured across the CWS and against water supplied to the MCMA. The CUTZSIM improves the representation of hydrology and river-operation interaction and, in so doing, advances evaluation of system-wide water management consequences under altered climatic and demand regimes. The integrated modeling framework enables evaluation and simulation of model errors throughout the river basin, including errors in representation of the human component processes. Heretofore, model error evaluation, predictive error intervals and the resultant improved understanding have been limited to hydrologic processes. The general framework represents an initial step towards fuller understanding and prediction of the many and varied processes that determine the hydrologic fluxes and state variables in real river basins.
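The HHRU chaining idea can be sketched as units that each transform an inflow series and pass the remainder downstream. The component classes, operating rules, and flow values below are illustrative, not the CUTZSIM implementation.

```python
class Reservoir:
    """Fixed-release reservoir; inflow beyond capacity is ignored for brevity."""
    def __init__(self, storage, capacity, release):
        self.storage, self.capacity, self.release = storage, capacity, release
    def step(self, inflow):
        self.storage = min(self.storage + inflow, self.capacity)
        out = min(self.release, self.storage)
        self.storage -= out
        return out

class Diversion:
    """Withdraws up to a demand; the remainder continues downstream."""
    def __init__(self, demand):
        self.demand = demand
    def step(self, inflow):
        return inflow - min(self.demand, inflow)

def route(inflows, units):
    """Pass each timestep's flow through the chain of HHRU-like units."""
    out = []
    for q in inflows:
        for u in units:
            q = u.step(q)
        out.append(q)
    return out

downstream = route([5.0, 5.0, 5.0], [Reservoir(0.0, 10.0, 2.0), Diversion(1.0)])
```

Calibrating such a chain against gauged streamflow at the unit boundaries is what lets model error be attributed to individual natural or human components.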

  14. Multimedia architectures: from desktop systems to portable appliances

    NASA Astrophysics Data System (ADS)

    Bhaskaran, Vasudev; Konstantinides, Konstantinos; Natarajan, Balas R.

    1997-01-01

    Future desktop and portable computing systems will have as their core an integrated multimedia system. Such a system will seamlessly combine digital video, digital audio, computer animation, text, and graphics. Furthermore, such a system will allow for mixed-media creation, dissemination, and interactive access in real time. Multimedia architectures that need to support these functions have traditionally required special display and processing units for the different media types. This approach tends to be expensive and is inefficient in its use of silicon. Furthermore, such media-specific processing units are unable to cope with the fluid nature of the multimedia market, wherein needs and standards are changing and system manufacturers may demand a single-component media engine across a range of products. This constraint has led to a shift towards providing a single-component multimedia-specific computing engine that can be integrated easily within desktop systems, tethered consumer appliances, or portable appliances. In this paper, we review some of the recent architectural efforts in developing integrated media systems. We primarily focus on two efforts, namely the evolution of multimedia-capable general-purpose processors and a more recent effort in developing single-component mixed-media co-processors. Design considerations that could facilitate the migration of these technologies to a portable integrated media system are also presented.

  15. Calibrating IR Cameras for In-Situ Temperature Measurement During the Electron Beam Melting Process using Inconel 718 and Ti-Al6-V4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dinwiddie, Ralph Barton; Lloyd, Peter D; Dehoff, Ryan R

    2016-01-01

    The Department of Energy's (DOE) Manufacturing Demonstration Facility (MDF) at Oak Ridge National Laboratory (ORNL) provides world-leading capabilities in advanced manufacturing (AM), leveraging previous and on-going government investments in materials science research and characterization. The MDF contains systems for fabricating components with complex geometries using AM techniques (i.e. 3D printing). Various metal alloy printers, for example, use electron beam melting (EBM) systems for creating components which are otherwise extremely difficult, if not impossible, to machine. ORNL has partnered with manufacturers on improving the final part quality of components and developing new materials for further advancing these devices. One method being used to study AM processes in more depth relies on the advanced imaging capabilities at ORNL. High-performance mid-wave infrared (IR) cameras are used for in-situ process monitoring and temperature measurements. However, standard factory calibrations are insufficient due to the very low transmission of the leaded glass window required for X-ray absorption. Two techniques for temperature calibration are presented and compared, and in-situ measurement of emittance is also discussed. Ample information can be learned from in-situ IR process monitoring of the EBM process. Ultimately, these imaging systems have the potential for routine use for online quality assurance and feedback control.
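The effect of the low-transmission window can be sketched with a gray-body, broadband (Stefan-Boltzmann-like) approximation: the factory calibration under-reports temperature because the window attenuates the radiance. Real MWIR calibration is band-limited and more involved; the transmittance and emissivity values below are illustrative assumptions.

```python
def true_temperature(T_apparent_K, window_tau, emissivity=1.0):
    """Broadband gray-body sketch: the camera reports T_apparent such that
    T_apparent**4 = emissivity * window_tau * T_true**4."""
    return T_apparent_K / (emissivity * window_tau) ** 0.25
```

With a window passing only 40% of the radiance, an apparent reading of 800 K corresponds to a true temperature above 1000 K, which is why recalibration through the actual window (or an in-situ emittance measurement) is essential.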

  16. Cosmic non-TEM radiation and synthetic feed array sensor system in ASIC mixed signal technology

    NASA Astrophysics Data System (ADS)

    Centureli, F.; Scotti, G.; Tommasino, P.; Trifiletti, A.; Romano, F.; Cimmino, R.; Saitto, A.

    2014-08-01

    The paper deals with the opportunity to introduce a "not strictly TEM waves" synthetic detection method (NTSM), consisting of three-axis digital beam processing (3ADBP), to enhance the performance of radio telescope and sensor systems. Current radio telescopes generally use the classic 3D "TEM waves" approximation detection method, which consists of a linear tomography process (single- or dual-axis beamforming) that neglects the small z component. The synthetic feed-array three-axis sensor system is an innovative technique that synthetically detects the generic "not strictly TEM" wave radiation coming from the cosmos and also processes the longitudinal component of angular momentum. The simultaneous extraction from the radiation of both the linear and quadratic information components may reduce the complexity of reconstructing the Early Universe at the different requested scales. This next-order approximation in the detection of the observed cosmological processes may improve the efficacy of the statistical numerical models used to elaborate the acquired information. The present work focuses on detection of such waves at carrier frequencies in the bands ranging from LF to MMW. The work shows in further detail the new generation of online programmable and reconfigurable mixed-signal ASIC technology that made the innovative synthetic sensor possible. Furthermore, the paper shows the ability of this technique to increase the performance of radio telescope array antennas.

  17. SORPTION OF TOXIC ORGANIC COMPOUNDS ON WASTEWATER SOLIDS: MECHANISMS AND MODELING

    EPA Science Inventory

    It is proposed that sorption is a combination of two fundamentally different processes: adsorption and partitioning. A sorption model was developed for both single-component and multicomponent systems. The model was tested using single-component experimental isotherm data of eig...

  18. Development of Parametric Mass and Volume Models for an Aerospace SOFC/Gas Turbine Hybrid System

    NASA Technical Reports Server (NTRS)

    Tornabene, Robert; Wang, Xiao-yen; Steffen, Christopher J., Jr.; Freeh, Joshua E.

    2005-01-01

    In aerospace power systems, mass and volume are key considerations to produce a viable design. The utilization of fuel cells is being studied for a commercial aircraft electrical power unit. Based on preliminary analyses, a SOFC/gas turbine system may be a potential solution. This paper describes the parametric mass and volume models that are used to assess an aerospace hybrid system design. The design tool utilizes input from the thermodynamic system model and produces component sizing, performance, and mass estimates. The software is designed such that the thermodynamic model is linked to the mass and volume model to provide immediate feedback during the design process. It allows for automating an optimization process that accounts for mass and volume in its figure of merit. Each component in the system is modeled with a combination of theoretical and empirical approaches. A description of the assumptions and design analyses is presented.
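The linkage from the thermodynamic model's outputs to parametric mass estimates can be sketched as simple scaling functions evaluated on the cycle results. The scaling constants, field names, and component set below are illustrative placeholders, not the NASA model's values.

```python
def heat_exchanger_mass(Q_kW, specific_mass_kg_per_kW=1.8):
    """Empirical-style scaling: mass proportional to heat duty (constant is illustrative)."""
    return Q_kW * specific_mass_kg_per_kW

def stack_mass(power_kW, power_density_kW_per_kg=0.35):
    """SOFC stack sized from gravimetric power density (value is illustrative)."""
    return power_kW / power_density_kW_per_kg

def system_mass(thermo):
    """'thermo' stands in for the output dict of the thermodynamic cycle model,
    so resizing the cycle immediately updates the mass estimate."""
    return stack_mass(thermo["sofc_power_kW"]) + heat_exchanger_mass(thermo["hx_duty_kW"])

total = system_mass({"sofc_power_kW": 70.0, "hx_duty_kW": 50.0})
```

Because the sizing functions consume the cycle outputs directly, an optimizer can fold mass and volume into its figure of merit and get immediate feedback on each design change, as the abstract describes.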

  19. A Methodology for the Design and Verification of Globally Asynchronous/Locally Synchronous Architectures

    NASA Technical Reports Server (NTRS)

    Miller, Steven P.; Whalen, Mike W.; O'Brien, Dan; Heimdahl, Mats P.; Joshi, Anjali

    2005-01-01

    Recent advances in model checking have made it practical to formally verify the correctness of many complex synchronous systems (i.e., systems driven by a single clock). However, many computer systems are implemented by asynchronously composing several synchronous components, where each component has its own clock and these clocks are not synchronized. Formal verification of such Globally Asynchronous/Locally Synchronous (GA/LS) architectures is a much more difficult task. In this report, we describe a methodology for developing and reasoning about such systems. This approach allows a developer to start from an ideal system specification and refine it along two axes. Along one axis, the system can be refined one component at a time towards an implementation. Along the other axis, the behavior of the system can be relaxed to produce a more cost-effective but still acceptable solution. We illustrate this process by applying it to the synchronization logic of a Dual Flight Guidance System, evolving the system from an ideal case in which the components do not fail and communicate synchronously to one in which the components can fail and communicate asynchronously. For each step, we show how the system requirements have to change if the system is to be implemented and prove that each implementation meets the revised system requirements through model checking.

  20. Beyond a series of security nets: Applying STAMP & STPA to port security

    DOE PAGES

    Williams, Adam D.

    2015-11-17

    Port security is an increasing concern considering the significant role of ports in global commerce and today’s increasingly complex threat environment. Current approaches to port security mirror traditional models of accident causality -- ‘a series of security nets’ based on component reliability and probabilistic assumptions. Traditional port security frameworks result in isolated and inconsistent improvement strategies. Recent work in engineered safety combines the ideas of hierarchy, emergence, control and communication into a new paradigm for understanding port security as an emergent complex system property. The ‘System-Theoretic Accident Model and Process (STAMP)’ is a new model of causality based on systems and control theory. The associated analysis process -- System Theoretic Process Analysis (STPA) -- identifies specific technical or procedural security requirements designed to work in coordination with (and be traceable to) overall port objectives. This process yields port security design specifications that can mitigate (if not eliminate) port security vulnerabilities related to an emphasis on component reliability, lack of coordination between port security stakeholders or economic pressures endemic in the maritime industry. This article aims to demonstrate how STAMP’s broader view of causality and complexity can better address the dynamic and interactive behaviors of social, organizational and technical components of port security.

  2. A database for TMT interface control documents

    NASA Astrophysics Data System (ADS)

    Gillies, Kim; Roberts, Scott; Brighton, Allan; Rogers, John

    2016-08-01

    The TMT Software System consists of software components that interact with one another through a software infrastructure called TMT Common Software (CSW). CSW consists of software services and library code that is used by developers to create the subsystems and components that participate in the software system. CSW also defines the types of components that can be constructed and their roles. The use of common component types and shared middleware services allows standardized software interfaces for the components. A software system called the TMT Interface Database System was constructed to support the documentation of the interfaces for components based on CSW. The programmer describes a subsystem and each of its components using JSON-style text files. A command interface file describes each command a component can receive and any commands a component sends. The event interface files describe status, alarms, and events a component publishes and status and events subscribed to by a component. A web application was created to provide a user interface for the required features. Files are ingested into the software system's database. The user interface allows browsing subsystem interfaces, publishing versions of subsystem interfaces, and constructing and publishing interface control documents that consist of the intersection of two subsystem interfaces. All published subsystem interfaces and interface control documents are versioned for configuration control and follow the standard TMT change control processes. Subsystem interfaces and interface control documents can be visualized in the browser or exported as PDF files.
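    The command/event interface files and the "intersection of two subsystem interfaces" idea can be pictured with a small sketch. The JSON field names, subsystem names, and event names below are hypothetical, not the actual CSW schema:

    ```python
    import json

    # Hypothetical JSON-style interface files in the spirit of the paper;
    # field and subsystem names are illustrative, not the real TMT schema.
    publisher = json.loads("""{
      "subsystem": "TCS",
      "publishes": {"events": ["zenithAngle", "parallacticAngle"],
                    "alarms": ["limitReached"]}
    }""")

    subscriber = json.loads("""{
      "subsystem": "IRIS",
      "subscribes": {"events": ["zenithAngle", "filterPosition"]}
    }""")

    def icd(pub, sub):
        """Sketch of an interface control document: the intersection of what
        one subsystem publishes and what another subscribes to."""
        shared = sorted(set(pub["publishes"]["events"]) &
                        set(sub["subscribes"]["events"]))
        return {"from": pub["subsystem"], "to": sub["subsystem"], "events": shared}

    document = icd(publisher, subscriber)
    ```

    A real ingest pipeline would validate the files against a schema and version the result for configuration control; here the intersection alone conveys the idea.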

  3. Nonterrestrial material processing and manufacturing of large space systems

    NASA Technical Reports Server (NTRS)

    Von Tiesenhausen, G.

    1979-01-01

    Nonterrestrial processing of materials and the manufacturing of large space system components from preprocessed lunar materials at a manufacturing site in space are described. Lunar materials mined and preprocessed at the lunar resource complex will be flown to the space manufacturing facility (SMF), where, together with supplementary terrestrial materials, they will undergo final processing and fabrication into space communication systems, solar cell blankets, radio frequency generators, and electrical equipment. Satellite Power System (SPS) material requirements and lunar material availability and utilization are detailed, and the SMF processing, refining, and fabricating facilities, material flow, and manpower requirements are described.

  4. Tracing CNO exposed layers in the Algol-type binary system u Her

    NASA Astrophysics Data System (ADS)

    Kolbas, V.; Dervişoğlu, A.; Pavlovski, K.; Southworth, J.

    2014-11-01

    The chemical composition of stellar photospheres in mass-transferring binary systems is a precious diagnostic of the nucleosynthesis processes that occur deep within stars, and preserves information on the components' history. The binary system u Her belongs to a group of hot Algols with both components being B stars. We have isolated the individual spectra of the two components by the technique of spectral disentangling of a new series of 43 high-resolution échelle spectra. Augmenting these with an analysis of the Hipparcos photometry of the system yields revised stellar quantities for the components of u Her. For the primary component (the mass-gaining star), we find MA = 7.88 ± 0.26 M⊙, RA = 4.93 ± 0.15 R⊙ and Teff, A = 21 600 ± 220 K. For the secondary (the mass-losing star) we find MB = 2.79 ± 0.12 M⊙, RB = 4.26 ± 0.06 R⊙ and Teff, B = 12 600 ± 550 K. A non-local thermodynamic equilibrium analysis of the primary star's atmosphere reveals deviations in the abundances of nitrogen and carbon from the standard cosmic abundance pattern, in accord with theoretical expectations for CNO nucleosynthesis processing. The best match to the observed properties of the stars in u Her from a grid of calculated evolutionary models enabled us to trace the initial properties and history of this binary system. We confirm that it has evolved according to case A mass transfer. A detailed abundance analysis of the primary star gives C/N = 0.9, which supports the evolutionary calculations and indicates strong mixing in the early evolution of the secondary component, which was originally the more massive of the two. The composition of the secondary component would be a further important constraint on the initial properties of the u Her system, but requires spectra of a higher signal-to-noise ratio.

  5. Evaluation of IT security – genesis and its state-of-art

    NASA Astrophysics Data System (ADS)

    Livshitz, I. I.; Neklyudov, A. V.; Lontsikh, P. A.

    2018-05-01

    Evolving the processes for evaluating IT security is a topical problem. The formation and application of common evaluation approaches for IT components, as practised by governmental and civil organizations, have still not solved it. Successfully passing an independent evaluation of conformity with a security standard is taken as the main criterion of an IT component's suitability for use in a trusted computer system. A solution to the above problem is suggested through the localization of all research, development and production processes in a national trusted area (digital sovereignty).

  6. New laser machining processes for shape memory alloys

    NASA Astrophysics Data System (ADS)

    Haferkamp, Heinz; Paschko, Stefan; Goede, Martin

    2001-04-01

    Due to their special material properties, shape memory alloys (SMA) are finding increasing attention in micro system technology. However, only a few processes are available for the machining of miniaturized SMA components. In this connection, laser material processing offers completely new possibilities. This paper describes the current status of two projects that are being carried out to qualify new methods of machining SMA components by means of laser radiation. Within one project, the laser material ablation of miniaturized SMA components using ultra-short laser pulses (pulse duration: approx. 200 fs) is being investigated in comparison to conventional laser material ablation. Especially for SMA micro-sensors and actuators, it is important to minimize the heat affected zone (HAZ) to maintain the special mechanical properties. Light-microscopic investigations of the grain texture of SMA devices processed with ultra-short laser pulses show that the HAZ can be neglected. Presently, the main goal of the project is to qualify this new processing technique for the micro-structuring of complex SMA micro devices with high precision. Within a second project, investigations are being carried out to induce the two-way memory effect (TWME) in SMA components using laser radiation. By precisely heating SMA components with laser radiation, local residual stresses remain near the component surface. In connection with the shape memory effect, these stresses can be used to make the components execute complicated movements. Compared to conventional training methods for inducing the TWME, this procedure is faster and easier. Furthermore, higher numbers of thermal cycles are expected because of the low dislocation density in the main part of the component.

  7. System stability and calibrations for hand-held electromagnetic frequency domain instruments

    NASA Astrophysics Data System (ADS)

    Saksa, Pauli J.; Sorsa, Joona

    2017-05-01

    There are a few multiple-frequency-domain electromagnetic induction (EMI) hand-held rigid-boom systems available for shallow geophysical resistivity investigations. They basically measure the secondary-field real and imaginary components after system calibration. One multiple-frequency system, the EMP-400 Profiler from Geophysical Survey Systems Inc., was tested for system calibration, stability, and various effects present in normal measurements such as height variation, tilting, signal stacking and time stability. Results indicated that in test conditions, repeatable high-accuracy imaginary component values can be recorded for near-surface frequency soundings. In test conditions the real components are also stable, but they vary strongly in normal surveying measurements. However, certain calibration issues related to the combination of user influence and measurement system height were recognised as important factors for reducing data errors and for further processing such as static offset corrections.

  8. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quock, D. E. R.; Cianciarulo, M. B.; APS Engineering Support Division

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.

  9. Integrating complex business processes for knowledge-driven clinical decision support systems.

    PubMed

    Kamaleswaran, Rishikesan; McGregor, Carolyn

    2012-01-01

    This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.

  10. On-rail solution for autonomous inspections in electrical substations

    NASA Astrophysics Data System (ADS)

    Silva, Bruno P. A.; Ferreira, Rafael A. M.; Gomes, Selson C.; Calado, Flavio A. R.; Andrade, Roberto M.; Porto, Matheus P.

    2018-05-01

    This work presents an alternative solution for autonomous inspections in electrical substations. The autonomous system is a robot that moves on rails, collects infrared and visible images of selected targets, processes the data, and predicts component lifetimes. The robot moves on rails to overcome the difficulties posed by the unpaved substations commonly encountered in Brazil. We take advantage of the rails to convey the data through them, minimizing electromagnetic interference, while at the same time transmitting the electrical energy that feeds the autonomous system. As part of the quality control process, we compared thermographic inspections made by the robot with inspections made by a trained thermographer using a scientific camera Flir® SC660. The results show that the robot achieved satisfactory performance, identifying components and measuring temperature accurately. The embedded routine considers weather changes throughout the day, providing a standard result for the components' thermal response; it also gives the uncertainty of the temperature measurement, contributing to quality in the decision-making process.

  11. Experimental Design and Interpretation of Functional Neuroimaging Studies of Cognitive Processes

    PubMed Central

    Caplan, David

    2008-01-01

    This article discusses how the relation between experimental and baseline conditions in functional neuroimaging studies affects the conclusions that can be drawn from a study about the neural correlates of components of the cognitive system and about the nature and organization of those components. I argue that certain designs in common use—in particular the contrast of qualitatively different representations that are processed at parallel stages of a functional architecture—can never identify the neural basis of a cognitive operation and have limited use in providing information about the nature of cognitive systems. Other types of designs—such as ones that contrast representations that are computed in immediately sequential processing steps and ones that contrast qualitatively similar representations that are parametrically related within a single processing stage—are more easily interpreted. PMID:17979122

  12. Primordial Evolution in the Finitary Process Soup

    NASA Astrophysics Data System (ADS)

    Görnerup, Olof; Crutchfield, James P.

    A general and basic model of primordial evolution—a soup of reacting finitary and discrete processes—is employed to identify and analyze fundamental mechanisms that generate and maintain complex structures in prebiotic systems. The processes—ɛ-machines as defined in computational mechanics—and their interaction networks both provide well defined notions of structure. This enables us to quantitatively demonstrate hierarchical self-organization in the soup in terms of complexity. We found that replicating processes evolve the strategy of successively building higher levels of organization by autocatalysis. Moreover, this is facilitated by local components that have low structural complexity, but high generality. In effect, the finitary process soup spontaneously evolves a selection pressure that favors such components. In light of the finitary process soup's generality, these results suggest a fundamental law of hierarchical systems: global complexity requires local simplicity.

  13. A system for automatic artifact removal in ictal scalp EEG based on independent component analysis and Bayesian classification.

    PubMed

    LeVan, P; Urrestarazu, E; Gotman, J

    2006-04-01

    To devise an automated system to remove artifacts from ictal scalp EEG, using independent component analysis (ICA). A Bayesian classifier was used to determine the probability that 2s epochs of seizure segments decomposed by ICA represented EEG activity, as opposed to artifact. The classifier was trained using numerous statistical, spectral, and spatial features. The system's performance was then assessed using separate validation data. The classifier identified epochs representing EEG activity in the validation dataset with a sensitivity of 82.4% and a specificity of 83.3%. An ICA component was considered to represent EEG activity if the sum of the probabilities that its epochs represented EEG exceeded a threshold predetermined using the training data. Otherwise, the component represented artifact. Using this threshold on the validation set, the identification of EEG components was performed with a sensitivity of 87.6% and a specificity of 70.2%. Most misclassified components were a mixture of EEG and artifactual activity. The automated system successfully rejected a good proportion of artifactual components extracted by ICA, while preserving almost all EEG components. The misclassification rate was comparable to the variability observed in human classification. Current ICA methods of artifact removal require a tedious visual classification of the components. The proposed system automates this process and removes simultaneously multiple types of artifacts.
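    The component-level decision rule (sum the epoch probabilities, compare against a trained threshold) can be sketched with a toy naive-Bayes classifier on synthetic epoch features. The feature dimensions, cluster means, and the threshold value below are illustrative assumptions, not the study's actual features or parameters:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for the study's epoch features (statistical,
    # spectral and spatial descriptors of 2 s ICA-component epochs).
    # EEG-like epochs cluster around one mean, artifact-like epochs around
    # another; dimensions and means are illustrative only.
    eeg_train = rng.normal(loc=0.0, scale=1.0, size=(200, 3))
    art_train = rng.normal(loc=3.0, scale=1.0, size=(200, 3))

    def fit_gaussian(x):
        return x.mean(axis=0), x.std(axis=0)

    def log_likelihood(x, mean, std):
        # Diagonal-Gaussian log density, summed over feature dimensions.
        return -0.5 * np.sum(((x - mean) / std) ** 2
                             + np.log(2 * np.pi * std ** 2), axis=-1)

    eeg_params = fit_gaussian(eeg_train)
    art_params = fit_gaussian(art_train)

    def p_eeg(epochs):
        """Naive-Bayes posterior P(EEG | epoch), assuming equal priors."""
        le = log_likelihood(epochs, *eeg_params)
        la = log_likelihood(epochs, *art_params)
        return 1.0 / (1.0 + np.exp(la - le))

    def component_is_eeg(epochs, threshold):
        # Component-level rule from the abstract: keep the component when
        # the summed epoch probabilities exceed a training-derived
        # threshold (the value passed in below is an assumed placeholder).
        return bool(p_eeg(epochs).sum() > threshold)

    clean = rng.normal(0.0, 1.0, size=(30, 3))   # EEG-like component
    noisy = rng.normal(3.0, 1.0, size=(30, 3))   # artifact-like component
    ```

    The published system uses many more features and a validated threshold; this sketch only shows the shape of the epoch-to-component aggregation.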

  14. 10 CFR 434.517 - HVAC systems and equipment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... simulation, except that excess capacity provided to meet process loads need not be modeled unless the process... Reference Buildings. The zones in the simulation shall correspond to the zones provided by the controls in... simulation. Table 517.4.1—HVAC System Description for Prototype and Reference Buildings 1,2 HVAC component...

  15. Probabilistic Design and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include the cord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
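    The core idea of perturbing geometry and loads to obtain a reliability estimate can be sketched with a plain Monte Carlo pass. The closed-form bending stress below stands in for the FEA solve that PRODAF would delegate to an external code, and every distribution and limit value is an illustrative assumption:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def blade_stress(load, cord, height):
        """Stand-in for an FEA solve: a closed-form bending-stress estimate,
        sigma = 6*M / (b*h^2) with a unit moment arm.  Purely illustrative;
        PRODAF would delegate this step to an external FEA program."""
        return 6.0 * load / (cord * height ** 2)

    n = 100_000
    load = rng.normal(1000.0, 100.0, n)    # applied load [N]; assumed spread
    cord = rng.normal(0.05, 0.002, n)      # blade cord length [m]; assumed
    height = rng.normal(0.01, 0.0005, n)   # blade height [m]; assumed

    stress = blade_stress(load, cord, height)
    allowable = 1.6e9                      # allowable stress [Pa]; illustrative
    p_fail = float(np.mean(stress > allowable))
    ```

    A response-surface approach would fit a cheap surrogate to a handful of FEA evaluations and sample that instead; the direct sampling above just illustrates how dimensional and load scatter propagate to a failure probability.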

  16. EOS MLS Science Data Processing System: A Description of Architecture and Capabilities

    NASA Technical Reports Server (NTRS)

    Cuddy, David T.; Echeverri, Mark D.; Wagner, Paul A.; Hanzel, Audrey T.; Fuller, Ryan A.

    2006-01-01

    This paper describes the architecture and capabilities of the Science Data Processing System (SDPS) for the EOS MLS. The SDPS consists of two major components--the Science Computing Facility and the Science Investigator-led Processing System. The Science Computing Facility provides the facilities for the EOS MLS Science Team to perform the functions of scientific algorithm development, processing software development, quality control of data products, and scientific analyses. The Science Investigator-led Processing System processes and reprocesses the science data for the entire mission and delivers the data products to the Science Computing Facility and to the Goddard Space Flight Center Earth Science Distributed Active Archive Center, which archives and distributes the standard science products.

  17. An Image Retrieval and Processing Expert System for the World Wide Web

    NASA Technical Reports Server (NTRS)

    Rodriguez, Ricardo; Rondon, Angelica; Bruno, Maria I.; Vasquez, Ramon

    1998-01-01

    This paper presents a system that is being developed in the Laboratory of Applied Remote Sensing and Image Processing at the University of P.R. at Mayaguez. It describes the components that constitute its architecture. The main elements are: a Data Warehouse, an Image Processing Engine, and an Expert System. Together, they provide a complete solution to researchers from different fields that make use of images in their investigations. Also, since it is available to the World Wide Web, it provides remote access and processing of images.

  18. Parallel Processing with Digital Signal Processing Hardware and Software

    NASA Technical Reports Server (NTRS)

    Swenson, Cory V.

    1995-01-01

    The assembling and testing of a parallel processing system is described which will allow a user to move a Digital Signal Processing (DSP) application from the design stage to the execution/analysis stage through the use of several software tools and hardware devices. The system will be used to demonstrate the feasibility of the Algorithm To Architecture Mapping Model (ATAMM) dataflow paradigm for static multiprocessor solutions of DSP applications. The individual components comprising the system are described followed by the installation procedure, research topics, and initial program development.

  19. Evaluation of pressurized water cleaning systems for hardware refurbishment

    NASA Technical Reports Server (NTRS)

    Dillard, Terry W.; Deweese, Charles D.; Hoppe, David T.; Vickers, John H.; Swenson, Gary J.; Hutchens, Dale E.

    1995-01-01

    Historically, refurbishment processes for RSRM motor cases and components have employed environmentally harmful materials. Specifically, vapor degreasing processes consume and emit large amounts of ozone depleting compounds. This program evaluates the use of pressurized water cleaning systems as a replacement for the vapor degreasing process. Tests have been conducted to determine if high pressure water washing, without any form of additive cleaner, is a viable candidate for replacing vapor degreasing processes. This paper discusses the findings thus far of Engineering Test Plan - 1168 (ETP-1168), 'Evaluation of Pressurized Water Cleaning Systems for Hardware Refurbishment.'

  20. 75 FR 68200 - Medical Devices; Radiology Devices; Reclassification of Full-Field Digital Mammography System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-05

    ... exposure control, image processing and reconstruction programs, patient and equipment supports, component..., acquisition workstation, automatic exposure control, image processing and reconstruction programs, patient and... may include was revised by adding automatic exposure control, image processing and reconstruction...

  1. Effect of Alloying Type and Lean Sintering Atmosphere on the Performance of PM Components

    NASA Astrophysics Data System (ADS)

    Sundaram, M. Vattur; Shvab, R.; Millot, S.; Hryha, E.; Nyborg, L.

    2017-12-01

    In order to be cost effective and to meet increasing performance demands, powder metallurgy steel components require continuous improvement in materials and process development. This study demonstrates the feasibility of manufacturing structural components using two different alloy systems, i.e. lean Cr-prealloyed and diffusion-bonded water-atomised powders, under different processing conditions. The components were sintered at two different temperatures, 1120 and 1250 °C, for 30 minutes in three different atmospheres: vacuum, an N2-10%H2 atmosphere, and a lean N2-5%H2-0.5%CO-(0.1-0.4)%CH4 sintering atmosphere. After sintering, the components were further processed by low pressure carburizing, sinter hardening, or case hardening. All trials were performed in industrial furnaces to simulate the actual production of the components. Microstructure, fractography, and apparent and micro hardness analyses were performed close to the surface and in the middle of each sample to characterize the degree of sintering (temperature and atmosphere) and the effect of heat treatment. In all cases, the components possess a mostly martensitic microstructure with a few bainitic regions. The fracture surfaces show well-developed sinter necks. Inter- and trans-granular ductile and cleavage fracture modes are dominant, and their fractions are determined by the alloy and processing route.

  2. Long term trending of engineering data for the Hubble Space Telescope

    NASA Technical Reports Server (NTRS)

    Cox, Ross M.

    1993-01-01

    A major goal in spacecraft engineering analysis is the detection of component failures before the fact. Trending is the process of monitoring subsystem states to discern unusual behaviors. This involves reducing vast amounts of data about a component or subsystem into a form that helps humans discern underlying patterns and correlations. A long term trending system has been developed for the Hubble Space Telescope. Besides processing the data for 988 distinct telemetry measurements each day, it produces plots of 477 important parameters for the entire 24 hours. Daily updates to the trend files also produce 339 thirty-day trend plots each month. The total system combines command procedures to control the execution of the C-based data processing program, user-written FORTRAN routines, and commercial off-the-shelf plotting software. This paper includes a discussion of the performance of the trending system and of its limitations.

  3. WAMS measurements pre-processing for detecting low-frequency oscillations in power systems

    NASA Astrophysics Data System (ADS)

    Kovalenko, P. Y.

    2017-07-01

    Processing the data received from measurement systems involves situations in which one or more registered values stand apart from the rest of the sample. These values are referred to as “outliers”, and their presence may significantly distort the processing results. In order to ensure the accuracy of low-frequency oscillation detection in power systems, an algorithm has been developed for outlier detection and elimination. The algorithm is based on the concept of the irregular component of a measurement signal; this component comprises measurement errors and is assumed to be a Gauss-distributed random variable. Median filtering is employed to detect values lying outside the range of the normally distributed measurement error on the basis of a 3σ criterion. The algorithm has been validated using both simulated signals and WAMS data.
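    A minimal version of the median-filter-plus-3σ scheme can be sketched as follows; the window length, the edge padding, and replacing flagged samples with the local median are illustrative assumptions beyond what the abstract states:

    ```python
    import numpy as np

    def remove_outliers(signal, window=5, n_sigma=3.0):
        """Median filtering plus a 3-sigma criterion: the median-filtered
        signal estimates the regular component; residuals larger than
        n_sigma standard deviations are flagged as outliers and replaced
        by the local median.  Window size and padding are assumptions."""
        pad = window // 2
        padded = np.pad(signal, pad, mode="edge")
        median = np.array([np.median(padded[i:i + window])
                           for i in range(len(signal))])
        residual = signal - median      # estimate of the irregular component
        outliers = np.abs(residual) > n_sigma * residual.std()
        return np.where(outliers, median, signal), outliers

    # Toy WAMS-like measurement: a slow oscillation with one gross outlier.
    t = np.linspace(0.0, 1.0, 50)
    signal = np.sin(2.0 * np.pi * t)
    signal[20] += 5.0                   # injected outlier
    cleaned, outliers = remove_outliers(signal)
    ```

    The median filter suppresses the spike in the regular-component estimate, so the spike's residual exceeds the 3σ band while the surrounding samples do not.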

  4. The numerical methods for the development of the mixture region in the vapor explosion simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Y.; Ohashi, H.; Akiyama, M.

    An attempt to numerically simulate the vapor explosion process with a general multi-component, multi-dimensional code is being undertaken. Because of the rapid change of the flow field and the extremely nonuniform distribution of components in a vapor explosion, numerical divergence and diffusion occur easily. A dispersed-component model and a multiregion scheme, by which these difficulties can be effectively overcome, were proposed. Simulations have been performed for the premixing and fragmentation-propagation processes of the vapor explosion.

  5. Automated fiber placement composite manufacturing: The mission at MSFC's Productivity Enhancement Complex

    NASA Technical Reports Server (NTRS)

    Vickers, John H.; Pelham, Larry I.

    1993-01-01

    Automated fiber placement is a manufacturing process used to produce complex composite structures, and it represents a notable advance in the state of the art of automated composite manufacturing. The fiber placement capability was established at the Marshall Space Flight Center's (MSFC) Productivity Enhancement Complex in 1992, in collaboration with Thiokol Corporation, to provide materials and processes research and development and to fabricate components for many of the Center's programs. The Fiber Placement System (FPX) was developed as a distinct solution to problems inherent in other automated composite manufacturing systems. This equipment provides unique capabilities to build composite parts in complex 3-D shapes with concave and other asymmetrical configurations. Components with complex geometries and localized reinforcements usually require labor-intensive efforts, resulting in expensive, less reproducible components; the fiber placement system has the features necessary to overcome these conditions. The mechanical systems of the equipment have the motion characteristics of a filament winder and the fiber lay-up attributes of a tape-laying machine, with the additional capabilities of differential tow payout speeds, compaction, and cut-restart to selectively place the correct number of fibers where the design dictates. This capability will produce a repeatable process resulting in lower cost and improved quality and reliability.

  6. System for loading executable code into volatile memory in a downhole tool

    DOEpatents

    Hall, David R.; Bartholomew, David B.; Johnson, Monte L.

    2007-09-25

    A system for loading an executable code into volatile memory in a downhole tool string component comprises a surface control unit comprising executable code. An integrated downhole network comprises data transmission elements in communication with the surface control unit and the volatile memory. The executable code, stored in the surface control unit, is not permanently stored in the downhole tool string component. In a preferred embodiment of the present invention, the downhole tool string component comprises boot memory. In another embodiment, the executable code is an operating system executable code. Preferably, the volatile memory comprises random access memory (RAM). A method for loading executable code to volatile memory in a downhole tool string component comprises sending the code from the surface control unit to a processor in the downhole tool string component over the network. A central processing unit writes the executable code in the volatile memory.

  7. Hybrid and electric advanced vehicle systems (heavy) simulation

    NASA Technical Reports Server (NTRS)

    Hammond, R. A.; Mcgehee, R. K.

    1981-01-01

    A computer program to simulate hybrid and electric advanced vehicle systems (HEAVY) is described. It is intended for use early in the design process: concept evaluation, alternative comparison, preliminary design, control and management strategy development, component sizing, and sensitivity studies. It allows the designer to quickly, conveniently, and economically predict the performance of a proposed drive train. The user defines the system to be simulated using a library of predefined component models that may be connected to represent a wide variety of propulsion systems. The development of three models are discussed as examples.

  8. Evidence Of A Black Hole In The X-ray Binary System Cygnus X-3

    NASA Astrophysics Data System (ADS)

    Lombardi, C.; Virgilli, E.; Titarchuk, L.; Frontera, F.; Farinelli, R.

    2011-09-01

    A close correlation has recently been found, in X-ray data on black holes (BH) in binary systems, between the photon index of the power-law component and either the frequency of quasi-periodic oscillations (QPOs) or the accretion-disk flow rate. The shape of this relationship, characterized by a saturation of the index when the system reaches high spectral brightness, finds a natural explanation in the processes of thermal and bulk Comptonization, which are a unique signature of the presence of a BH. For the whole set of observations we adopted a model consisting of the spectral component of the BMC (Bulk Motion Comptonization) model, which takes into account the direct blackbody emission and the Comptonization process.

  9. 'We didn't know anything, it was a mess!' Emergent structures and the effectiveness of a rescue operation multi-team system.

    PubMed

    Fleştea, Alina Maria; Fodor, Oana Cătălina; Curşeu, Petru Lucian; Miclea, Mircea

    2017-01-01

    Multi-team systems (MTS) are used to tackle unpredictable events and to respond effectively to fast-changing environmental contingencies. Their effectiveness is influenced by within- as well as between-team processes (i.e., communication, coordination) and emergent phenomena (i.e., situational awareness). The present case study explores the way in which emergent structures and the involvement of bystanders intertwine with the dynamics of processes and emergent states both within and between the component teams. Our findings show that an inefficient transition process and ambiguous leadership generated poor coordination and hindered the development of emergent phenomena within the whole system. Emergent structures and bystanders substituted for leadership functions and provided a pool of critical resources for the MTS. Their involvement fostered the emergence of situational awareness and facilitated contingency-planning processes. However, bystander involvement impaired the emergence of cross-understandings and interfered with coordination processes between the component teams. Practitioner Summary: Based on a real emergency situation, the present research provides important theoretical and practical insights about the role of bystander involvement in the dynamics of multi-team systems composed to tackle complex tasks and respond to fast-changing and unpredictable environmental contingencies.

  10. Advanced Photonic Processes for Photovoltaic and Energy Storage Systems.

    PubMed

    Sygletou, Maria; Petridis, Constantinos; Kymakis, Emmanuel; Stratakis, Emmanuel

    2017-10-01

    Solar-energy harvesting through photovoltaic (PV) conversion is the most promising technology for long-term renewable energy production. At the same time, significant progress has been made in the development of energy-storage (ES) systems, which are essential components within the cycle of energy generation, transmission, and usage. Toward commercial applications, the enhancement of the performance and competitiveness of PV and ES systems requires the adoption of precise, but simple and low-cost manufacturing solutions, compatible with large-scale and high-throughput production lines. Photonic processes enable cost-efficient, noncontact, highly precise, and selective engineering of materials via photothermal, photochemical, or photophysical routes. Laser-based processes, in particular, provide access to a plethora of processing parameters that can be tuned with a remarkably high degree of precision to enable innovative processing routes that cannot be attained by conventional approaches. The focus here is on the application of advanced light-driven approaches for the fabrication, as well as the synthesis, of materials and components relevant to PV and ES systems. Besides presenting recent achievements, the existing limitations are outlined and future possibilities and emerging prospects discussed. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Quality and Safety in Health Care, Part XII: The Work System, Testing, and Clinical Reasoning.

    PubMed

    Harolds, Jay A

    2016-07-01

    Donabedian felt the 3 major components affecting quality were process, structure, and outcome. Later investigators often replace the word "structure" with a broader concept called the "work system." One component of the latter is the people involved; for diagnosis, this often is best done with a diagnostic team. The work system in diagnosis faces many obstacles to achieving optimum performance. There are also important problems with how tests are ordered and interpreted, and with clinical reasoning and biases.

  12. Indigenous Peoples' Involvement in the Environmental Impact Assessment (EIA) Process in the Tabi Mamta Area of Papua Province

    NASA Astrophysics Data System (ADS)

    Dhiksawan, Ferdinand Saras; Hadi, Sudharto P.; Samekto, Adji; Sasongko, Dwi P.

    2018-02-01

    The purpose of this study is to obtain a picture of the involvement of the indigenous peoples of Tabi Mamta in the environmental impact assessment (EIA) process in the Tabi Mamta customary territory. The study is non-ethnographic qualitative research, with data collected through limited observation. Data and information from the field were analyzed using the constructivism paradigm, which is based on an interpretive understanding called hermeneutics (hermeneuein): interpreting, giving understanding to, and translating the data and information obtained at the research location as a reflection of social reality. The results indicate that the customary community of Tabi Mamta is a customary community unit that still holds its own customary territory, has a customary leadership structure, retains visible kinship relations, cultural values, and customary norms and sanctions, and possesses environmental wisdom for maintaining the existence of natural resources. The socio-cultural system of the customary community contains components such as customary stratification, permissiveness, communication, reciprocity, past history, cultural values, customary norms and sanctions, and religious and customary leadership. These components play a role in the EIA process in the Tabi Mamta customary area, especially in the environmental feasibility decision-making process. The components of customary stratification, cultural values, and customary norms are particularly influential; within the customary stratification there is a customary structure comprising the Ondoafi, Iram, and tribal leadership. The components of the socio-cultural system form a unity, arising from interaction between individuals and groups, that works to prevent environmental damage and disturbance of natural resources. Natural resources are regarded as ancestral symbols passed down from generation to generation.

  13. Development of expert systems for modeling of technological process of pressure casting on the basis of artificial intelligence

    NASA Astrophysics Data System (ADS)

    Gavarieva, K. N.; Simonova, L. A.; Pankratov, D. L.; Gavariev, R. V.

    2017-09-01

    The article considers the main component of an expert system for the pressure die-casting process, which consists of algorithms united in logical models. The characteristics of the system, which present data on the condition of the object under control, are described. A sequence of logically interconnected steps has been developed that makes it possible to increase the quality of the resulting castings.

  14. A System for the Individualization and Optimization of Learning Through Computer Management of the Educational Process. Final Report.

    ERIC Educational Resources Information Center

    Schure, Alexander

    A computer-based system model for the monitoring and management of the instructional process was conceived, developed and refined through the techniques of systems analysis. This report describes the various aspects and components of this project in a series of independent and self-contained units. The first unit provides an overview of the entire…

  15. Heat pump processes induced by laser radiation

    NASA Technical Reports Server (NTRS)

    Garbuny, M.; Henningsen, T.

    1980-01-01

    A carbon dioxide laser system was constructed for the demonstration of heat pump processes induced by laser radiation. The system consisted of a frequency doubling stage, a gas reaction cell with its vacuum and high-purity gas supply system, and provisions to measure the temperature changes by pressure or, alternatively, by density changes. The theoretical considerations for the choice of designs and components are discussed.

  16. [Support of the nursing process through electronic nursing documentation systems (UEPD) – Initial validation of an instrument].

    PubMed

    Hediger, Hannele; Müller-Staub, Maria; Petry, Heidi

    2016-01-01

    Electronic nursing documentation systems with standardized nursing terminology are IT-based systems for recording the nursing process. These systems have the potential to improve documentation of the nursing process and to support nurses in care delivery. This article describes the development and initial validation of an instrument (known by its German acronym UEPD) to measure the subjectively perceived benefits of an electronic nursing documentation system in care delivery. The validity of the UEPD was examined by means of an evaluation study carried out in an acute care hospital (n = 94 nurses) in German-speaking Switzerland. Construct validity was analyzed by principal components analysis. Initial evidence for the validity of the UEPD was obtained. The analysis showed a stable four-factor model (FS = 0.89) comprising 25 items. All factors loaded ≥ 0.50 and the scales demonstrated high internal consistency (Cronbach's α = 0.73 – 0.90). Principal component analysis revealed four dimensions of support: establishing nursing diagnoses and goals; recording a case history/an assessment and documenting the nursing process; implementation and evaluation; and information exchange. Further testing with larger control samples and with different electronic documentation systems is needed. Another potential direction would be to employ the UEPD in a comparison of various electronic documentation systems.
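    The internal-consistency statistic reported above, Cronbach's alpha, follows a standard formula and can be reproduced in a few lines. The toy data below are invented for illustration and have nothing to do with the UEPD study itself; the formula is alpha = k/(k-1) * (1 - sum of item variances / variance of total scores).

```python
# Illustrative computation of Cronbach's alpha; the rating data are fabricated.
from statistics import pvariance

def cronbach_alpha(responses):
    """responses: list of respondents, each a list of k item scores."""
    k = len(responses[0])
    items = list(zip(*responses))                     # one tuple per item
    item_vars = sum(pvariance(item) for item in items)
    total_var = pvariance([sum(r) for r in responses])
    return k / (k - 1) * (1 - item_vars / total_var)

# Toy data: 4 respondents answering a 3-item scale on a 1-5 rating
data = [[4, 5, 4], [3, 4, 3], [5, 5, 5], [2, 3, 2]]
alpha = cronbach_alpha(data)
```

Values near the study's reported range (0.73 – 0.90) indicate that the items of a scale vary together, i.e., that they plausibly measure one underlying dimension.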

  17. Static and dynamic high power, space nuclear electric generating systems

    NASA Technical Reports Server (NTRS)

    Wetch, J. R.; Begg, L. L.; Koester, J. K.

    1985-01-01

    Space nuclear electric generating systems concepts have been assessed for their potential in satisfying future spacecraft high power (several megawatt) requirements. Conceptual designs have been prepared for reactor power systems using the most promising static (thermionic) and the most promising dynamic conversion processes. Component and system layouts, along with system mass and envelope requirements have been made. Key development problems have been identified and the impact of the conversion process selection upon thermal management and upon system and vehicle configuration is addressed.

  18. Gas-Liquid Supersonic Cleaning and Cleaning Verification Spray System

    NASA Technical Reports Server (NTRS)

    Parrish, Lewis M.

    2009-01-01

    NASA Kennedy Space Center (KSC) recently entered into a nonexclusive license agreement with Applied Cryogenic Solutions (ACS), Inc. (Galveston, TX) to commercialize its Gas-Liquid Supersonic Cleaning and Cleaning Verification Spray System technology. This technology, developed by KSC, is a critical component of processes being developed and commercialized by ACS to replace current mechanical and chemical cleaning and descaling methods used by numerous industries. Pilot trials on heat exchanger tubing components have shown that the ACS technology provides: superior cleaning in a much shorter period of time; lower energy and labor requirements for cleaning and descaling operations; significant reductions in waste volumes by not using water, acidic or basic solutions, organic solvents, or nonvolatile solid abrasives as components in the cleaning process; and improved energy efficiency in post-cleaning heat exchanger operations. The ACS process consists of a spray head containing supersonic converging/diverging nozzles; a source of liquid gas; a novel, proprietary pumping system that permits pumping liquid nitrogen, liquid air, or supercritical carbon dioxide to pressures in the range of 20,000 to 60,000 psi; and various hoses, fittings, valves, and gauges. The size and number of nozzles can be varied so the system can be built in configurations ranging from small hand-held spray heads to large multinozzle cleaners. The system can also be used to verify whether a part has been adequately cleaned.

  19. Use of ceramics in point-focus solar receivers

    NASA Technical Reports Server (NTRS)

    Smoak, R. H.; Kudirka, A. A.

    1981-01-01

    One of the research and development efforts in the Solar Thermal Energy Systems Project at the Jet Propulsion Laboratory has been focused on application of ceramic components for advanced point-focus solar receivers. The impetus for this effort is a need for high efficiency, low cost solar receivers which operate in a temperature regime where use of metal components is impractical. The current status of the work on evaluation of ceramic components at JPL and elsewhere is outlined and areas where lack of knowledge is currently slowing application of ceramics are discussed. Future developments of ceramic processing technology and reliability assurance methodology should open up applications for the point-focus solar concentrator system in fuels and chemicals production, in thermochemical energy transport and storage, in detoxification of hazardous materials and in high temperature process heat as well as for electric power generation.

  20. Neural Networks for Rapid Design and Analysis

    NASA Technical Reports Server (NTRS)

    Sparks, Dean W., Jr.; Maghami, Peiman G.

    1998-01-01

    Artificial neural networks have been employed for rapid and efficient dynamics and control analysis of flexible systems. Specifically, feedforward neural networks are designed to approximate nonlinear dynamic components over prescribed input ranges and are used in simulations to speed up the overall time-response analysis process. To capture the recursive nature of dynamic components, recurrent networks are employed, which use state feedback with the appropriate number of time delays as inputs to the networks. Once properly trained, neural networks can give very good approximations to nonlinear dynamic components and, by their judicious use in simulations, allow the analyst to speed up the analysis process considerably. To illustrate this potential speedup, an existing simulation model of a spacecraft reaction wheel system is executed, first conventionally, and then with an artificial neural network in place.
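    The tapped-delay-line idea in the abstract can be sketched without any training machinery. This is an illustrative assumption-laden sketch, not NASA's actual models: a static map f receives the current input plus delayed copies of its own past outputs, which lets it reproduce a recursive (dynamic) component one time step at a time. A simple first-order lag stands in for a trained network.

```python
# Hypothetical sketch of recurrent simulation via a tapped delay line.
def simulate_with_delays(f, inputs, n_delays, y0=0.0):
    """Run f recursively: y[t] = f(u[t], y[t-1], ..., y[t-n_delays])."""
    history = [y0] * n_delays          # past outputs fed back as inputs
    outputs = []
    for u in inputs:
        y = f(u, *history)
        outputs.append(y)
        history = [y] + history[:-1]   # shift the delay line
    return outputs

# Stand-in for a trained network: first-order lag y[t] = 0.8*y[t-1] + 0.2*u[t]
lag = lambda u, y1: 0.8 * y1 + 0.2 * u
step_response = simulate_with_delays(lag, [1.0] * 5, n_delays=1)
```

In the scheme described in the abstract, `lag` would be replaced by a feedforward network trained on input/output histories of the nonlinear component; the surrounding delay-line loop is what gives the static network its dynamic behavior.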

  1. Singular value decomposition for photon-processing nuclear imaging systems and applications for reconstruction and computing null functions.

    PubMed

    Jha, Abhinav K; Barrett, Harrison H; Frey, Eric C; Clarkson, Eric; Caucci, Luca; Kupinski, Matthew A

    2015-09-21

    Recent advances in technology are enabling a new class of nuclear imaging systems consisting of detectors that use real-time maximum-likelihood (ML) methods to estimate the interaction position, deposited energy, and other attributes of each photon-interaction event and store these attributes in a list format. This class of systems, which we refer to as photon-processing (PP) nuclear imaging systems, can be described by a fundamentally different mathematical imaging operator that allows processing of the continuous-valued photon attributes on a per-photon basis. Unlike conventional photon-counting (PC) systems that bin the data into images, PP systems do not have any binning-related information loss. Mathematically, while PC systems have an infinite-dimensional null space due to dimensionality considerations, PP systems do not necessarily suffer from this issue. Therefore, PP systems have the potential to provide improved performance in comparison to PC systems. To study these advantages, we propose a framework to perform the singular-value decomposition (SVD) of the PP imaging operator. We use this framework to perform the SVD of operators that describe a general two-dimensional (2D) planar linear shift-invariant (LSIV) PP system and a hypothetical continuously rotating 2D single-photon emission computed tomography (SPECT) PP system. We then discuss two applications of the SVD framework. The first application is to decompose the object being imaged by the PP imaging system into measurement and null components. We compare these components to the measurement and null components obtained with PC systems. In the process, we also present a procedure to compute the null functions for a PC system. The second application is designing analytical reconstruction algorithms for PP systems. The proposed analytical approach exploits the fact that PP systems acquire data in a continuous domain to estimate a continuous object function. 
The approach is parallelizable and implemented for graphics processing units (GPUs). Further, this approach leverages another important advantage of PP systems, namely the possibility to perform photon-by-photon real-time reconstruction. We demonstrate the application of the approach to perform reconstruction in a simulated 2D SPECT system. The results help to validate and demonstrate the utility of the proposed method and show that PP systems can help overcome the aliasing artifacts that are otherwise intrinsically present in PC systems.
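    The measurement/null decomposition discussed above can be illustrated for a small discrete operator. This is a toy sketch under stated assumptions, not the paper's framework: for a matrix operator H, the measurement component of an object f is its projection onto the row space of H, and the null component is the orthogonal remainder, which H maps to zero.

```python
# Toy measurement/null split for a discrete imaging operator (illustrative only).
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def gram_schmidt(rows):
    """Orthonormal basis for the span of the given rows."""
    basis = []
    for row in rows:
        v = list(row)
        for q in basis:
            c = dot(v, q)
            v = [vi - c * qi for vi, qi in zip(v, q)]
        norm = dot(v, v) ** 0.5
        if norm > 1e-12:
            basis.append([vi / norm for vi in v])
    return basis

def measurement_null_split(H, f):
    """Project f onto the row space of H; the remainder is the null component."""
    f_meas = [0.0] * len(f)
    for q in gram_schmidt(H):
        c = dot(f, q)
        f_meas = [m + c * qi for m, qi in zip(f_meas, q)]
    f_null = [fi - mi for fi, mi in zip(f, f_meas)]
    return f_meas, f_null

# 2 measurements of a 3-component object: the system cannot see the third
# component, so it falls entirely into the null component.
H = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
f = [2.0, 3.0, 5.0]
f_meas, f_null = measurement_null_split(H, f)
```

In the paper's terms, a PC system's binning enlarges the null space (information invisibly lost), while a PP system's continuous-domain operator shrinks it; the toy split shows what "invisible to the system" means concretely.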

  2. Singular value decomposition for photon-processing nuclear imaging systems and applications for reconstruction and computing null functions

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Barrett, Harrison H.; Frey, Eric C.; Clarkson, Eric; Caucci, Luca; Kupinski, Matthew A.

    2015-09-01

    Recent advances in technology are enabling a new class of nuclear imaging systems consisting of detectors that use real-time maximum-likelihood (ML) methods to estimate the interaction position, deposited energy, and other attributes of each photon-interaction event and store these attributes in a list format. This class of systems, which we refer to as photon-processing (PP) nuclear imaging systems, can be described by a fundamentally different mathematical imaging operator that allows processing of the continuous-valued photon attributes on a per-photon basis. Unlike conventional photon-counting (PC) systems that bin the data into images, PP systems do not have any binning-related information loss. Mathematically, while PC systems have an infinite-dimensional null space due to dimensionality considerations, PP systems do not necessarily suffer from this issue. Therefore, PP systems have the potential to provide improved performance in comparison to PC systems. To study these advantages, we propose a framework to perform the singular-value decomposition (SVD) of the PP imaging operator. We use this framework to perform the SVD of operators that describe a general two-dimensional (2D) planar linear shift-invariant (LSIV) PP system and a hypothetical continuously rotating 2D single-photon emission computed tomography (SPECT) PP system. We then discuss two applications of the SVD framework. The first application is to decompose the object being imaged by the PP imaging system into measurement and null components. We compare these components to the measurement and null components obtained with PC systems. In the process, we also present a procedure to compute the null functions for a PC system. The second application is designing analytical reconstruction algorithms for PP systems. The proposed analytical approach exploits the fact that PP systems acquire data in a continuous domain to estimate a continuous object function. 
The approach is parallelizable and implemented for graphics processing units (GPUs). Further, this approach leverages another important advantage of PP systems, namely the possibility to perform photon-by-photon real-time reconstruction. We demonstrate the application of the approach to perform reconstruction in a simulated 2D SPECT system. The results help to validate and demonstrate the utility of the proposed method and show that PP systems can help overcome the aliasing artifacts that are otherwise intrinsically present in PC systems.

  3. Chemical annotation of small and peptide-like molecules at the Protein Data Bank

    PubMed Central

    Young, Jasmine Y.; Feng, Zukang; Dimitropoulos, Dimitris; Sala, Raul; Westbrook, John; Zhuravleva, Marina; Shao, Chenghua; Quesada, Martha; Peisach, Ezra; Berman, Helen M.

    2013-01-01

    Over the past decade, the number of polymers and their complexes with small molecules in the Protein Data Bank archive (PDB) has continued to increase significantly. To support scientific advancements and ensure the best quality and completeness of the data files over the next 10 years and beyond, the Worldwide PDB partnership that manages the PDB archive is developing a new deposition and annotation system. This system focuses on efficient data capture across all supported experimental methods. The new deposition and annotation system is composed of four major modules that together support all of the processing requirements for a PDB entry. In this article, we describe one such module called the Chemical Component Annotation Tool. This tool uses information from both the Chemical Component Dictionary and Biologically Interesting molecule Reference Dictionary to aid in annotation. Benchmark studies have shown that the Chemical Component Annotation Tool provides significant improvements in processing efficiency and data quality. Database URL: http://wwpdb.org PMID:24291661

  4. The N2-P3 complex of the evoked potential and human performance

    NASA Technical Reports Server (NTRS)

    Odonnell, Brian F.; Cohen, Ronald A.

    1988-01-01

    The N2-P3 complex and other endogenous components of human evoked potential provide a set of tools for the investigation of human perceptual and cognitive processes. These multidimensional measures of central nervous system bioelectrical activity respond to a variety of environmental and internal factors which have been experimentally characterized. Their application to the analysis of human performance in naturalistic task environments is just beginning. Converging evidence suggests that the N2-P3 complex reflects processes of stimulus evaluation, perceptual resource allocation, and decision making that proceed in parallel, rather than in series, with response generation. Utilization of these EP components may provide insights into the central nervous system mechanisms modulating task performance unavailable from behavioral measures alone. The sensitivity of the N2-P3 complex to neuropathology, psychopathology, and pharmacological manipulation suggests that these components might provide sensitive markers for the effects of environmental stressors on the human central nervous system.

  5. Chemical annotation of small and peptide-like molecules at the Protein Data Bank.

    PubMed

    Young, Jasmine Y; Feng, Zukang; Dimitropoulos, Dimitris; Sala, Raul; Westbrook, John; Zhuravleva, Marina; Shao, Chenghua; Quesada, Martha; Peisach, Ezra; Berman, Helen M

    2013-01-01

    Over the past decade, the number of polymers and their complexes with small molecules in the Protein Data Bank archive (PDB) has continued to increase significantly. To support scientific advancements and ensure the best quality and completeness of the data files over the next 10 years and beyond, the Worldwide PDB partnership that manages the PDB archive is developing a new deposition and annotation system. This system focuses on efficient data capture across all supported experimental methods. The new deposition and annotation system is composed of four major modules that together support all of the processing requirements for a PDB entry. In this article, we describe one such module called the Chemical Component Annotation Tool. This tool uses information from both the Chemical Component Dictionary and Biologically Interesting molecule Reference Dictionary to aid in annotation. Benchmark studies have shown that the Chemical Component Annotation Tool provides significant improvements in processing efficiency and data quality. Database URL: http://wwpdb.org.

  6. [Neural and cognitive correlates of social cognition: findings on neuropsychological and neuroimaging studies].

    PubMed

    Kobayakawa, Mutsutaka; Kawamura, Mitsuru

    2011-12-01

    Social cognition includes various components of information processing related to communication with other individuals. In this review, we have discussed 3 components of social cognitive function: face recognition, empathy, and decision making. Our social behavior involves recognition based on facial features and also involves empathizing with others; while making decisions, it is important to consider the social consequences of the course of action followed. Face recognition is divided into 2 routes for information processing: a route responsible for overt recognition of the face's identity and a route for emotional and orienting responses based on the face's personal affective significance. Two systems are possibly involved in empathy: a basic emotional contagion "mirroring" system and a more advanced "theory of mind" system that considers the cognitive perspective. Decision making is mediated by a widespread system that includes several cortical and subcortical components. Numerous lesion and neuroimaging studies have contributed to clarifying the neural correlates of social cognitive function, and greater information can be obtained on social cognitive function by combining these 2 approaches.

  7. Applied Research Study

    NASA Technical Reports Server (NTRS)

    Leach, Ronald J.

    1997-01-01

    The purpose of this project was to study the feasibility of reusing major components of a software system that had been used to control the operations of a spacecraft launched in the 1980s. The study was done in the context of a ground data processing system that was to be rehosted from a large mainframe to an inexpensive workstation. The study concluded that a systematic approach using inexpensive tools could aid in the reengineering process by identifying a set of certified reusable components. The study also developed procedures for determining duplicate versions of software, which were created because of inadequate naming conventions. Such procedures reduced reengineering costs by approximately 19.4 percent.
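    The study's actual duplicate-detection procedure is not given in the abstract; the following is one plausible sketch, with invented file names, of how duplicate component versions hiding behind inconsistent names can be flagged: hash each file's normalized contents and group files whose digests collide.

```python
# Hypothetical content-hash approach to finding duplicate component versions.
import hashlib
from collections import defaultdict

def find_duplicates(files):
    """files: dict of name -> source text; returns groups of duplicate names."""
    groups = defaultdict(list)
    for name, text in files.items():
        # Normalize trailing whitespace so trivially different copies match.
        normalized = "\n".join(line.rstrip() for line in text.splitlines())
        digest = hashlib.sha256(normalized.encode()).hexdigest()
        groups[digest].append(name)
    return [sorted(names) for names in groups.values() if len(names) > 1]

files = {
    "orbit_v1.f": "CALL INIT\nCALL PROPAGATE\n",
    "orbit_new.f": "CALL INIT\nCALL PROPAGATE\n",  # duplicate under a new name
    "attitude.f": "CALL ATTITUDE\n",
}
dupes = find_duplicates(files)
```

Pruning such duplicates before certifying reusable components is the kind of inexpensive, systematic step the study credits with reducing reengineering costs.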

  8. In situ retrieval of contaminants or other substances using a barrier system and leaching solutions and components, processes and methods relating thereto

    DOEpatents

    Nickelson, Reva A.; Walsh, Stephanie; Richardson, John G.; Dick, John R.; Sloan, Paul A.

    2005-06-28

    Processes and methods relating to treating contaminants and collecting desired substances from a zone of interest using subterranean collection and containment barriers. Tubular casings having interlock structures are used to create subterranean barriers for containing and treating buried waste and its effluents. The subterranean barrier includes an effluent collection system. Treatment solutions provided to the zone of interest pass therethrough and are collected by the barrier and treated or recovered, allowing on-site remediation. Barrier components may also be used in the treatment by collecting or removing contaminants or other materials from the zone of interest.

  9. Status of the calibration and alignment framework at the Belle II experiment

    NASA Astrophysics Data System (ADS)

    Dossett, D.; Sevior, M.; Ritter, M.; Kuhr, T.; Bilka, T.; Yaschenko, S.; Belle Software Group, II

    2017-10-01

    The Belle II detector at the SuperKEKB e+e- collider plans to take first collision data in 2018. The monetary and CPU time costs associated with storing and processing the data mean that it is crucial for the detector components at Belle II to be calibrated quickly and accurately. A fast and accurate calibration system would allow the high level trigger to increase the efficiency of event selection, and can give users analysis-quality reconstruction promptly. A flexible framework to automate the fast production of calibration constants is being developed in the Belle II Analysis Software Framework (basf2). Detector experts only need to create two components from C++ base classes in order to use the automation system. The first collects data from Belle II event data files and outputs much smaller files to pass to the second component. This runs the main calibration algorithm to produce calibration constants ready for upload into the conditions database. A Python framework coordinates the input files, order of processing, and submission of jobs. Splitting the operation into collection and algorithm processing stages allows the framework to optionally parallelize the collection stage on a batch system.
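    The two-stage collector/algorithm split described above can be sketched in miniature. The class and method names here are invented for illustration and are not the real basf2 API: a collector reduces large event files to small summaries, and an algorithm merges the summaries into a calibration constant ready for the conditions database.

```python
# Hypothetical collector/algorithm pattern; not the actual basf2 interfaces.
class PedestalCollector:
    """Stage 1: reduce a large event file to a small summary."""
    def collect(self, events):
        values = [e["adc"] for e in events]
        return {"n": len(values), "sum": sum(values)}

class PedestalAlgorithm:
    """Stage 2: merge summaries and compute the calibration constant."""
    def calibrate(self, summaries):
        n = sum(s["n"] for s in summaries)
        total = sum(s["sum"] for s in summaries)
        return {"pedestal": total / n}  # constant ready for the database

# The collection stage can run in parallel, one job per input file,
# because each summary depends on only one file.
files = [[{"adc": 100}, {"adc": 102}], [{"adc": 98}, {"adc": 100}]]
summaries = [PedestalCollector().collect(events) for events in files]
constants = PedestalAlgorithm().calibrate(summaries)
```

The design choice the abstract highlights falls out naturally: because only the small summaries cross the stage boundary, the expensive collection stage parallelizes on a batch system while the cheap algorithm stage runs once.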

  10. Automotive Stirling Engine Mod 1 Design Review, volume 2

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The auxiliaries and the control system for the ASE MOD I: (1) provide the required fuel and air flows for a well controlled combustion process, generating heat for the Stirling cycle; (2) provide a driver-acceptable method for controlling the power output of the engine; (3) provide adequate lubrication and cooling water circulation; (4) generate the electric energy required for engine and vehicle operation; (5) provide a driver-acceptable method for starting, stopping, and monitoring the engine; and (6) provide a guard system that protects the engine in the event of component or system malfunction. The control principles and the way the different components and sub-systems interact are described, as well as the different auxiliaries, the air fuel system, the power control systems, and the electronics. The arrangement and location of auxiliaries and other major components are also examined.

  11. IAPSA 2 small-scale system specification

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Torkelson, Thomas C.

    1990-01-01

    The details of a hardware implementation of a representative small scale flight critical system is described using Advanced Information Processing System (AIPS) building block components and simulated sensor/actuator interfaces. The system was used to study application performance and reliability issues during both normal and faulted operation.

  12. Low cost solar array project. Task 1: Silicon material, gaseous melt replenishment system

    NASA Technical Reports Server (NTRS)

    Jewett, D. N.; Bates, H. E.; Hill, D. M.

    1979-01-01

    A system to combine silicon formation, by hydrogen reduction of trichlorosilane, with the capability to replenish a crystal growth system is described. A variety of process parameters were estimated to allow sizing and specification of gas-handling system components.

  13. A Fully Non-Metallic Gas Turbine Engine Enabled by Additive Manufacturing Part I: System Analysis, Component Identification, Additive Manufacturing, and Testing of Polymer Composites

    NASA Technical Reports Server (NTRS)

    Grady, Joseph E.; Haller, William J.; Poinsatte, Philip E.; Halbig, Michael C.; Schnulo, Sydney L.; Singh, Mrityunjay; Weir, Don; Wali, Natalie; Vinup, Michael; Jones, Michael G.; hide

    2015-01-01

    The research and development activities reported in this publication were carried out under the NASA Aeronautics Research Institute (NARI)-funded project entitled "A Fully Nonmetallic Gas Turbine Engine Enabled by Additive Manufacturing." The objective of the project was to evaluate emerging materials and manufacturing technologies that will enable fully nonmetallic gas turbine engines. The results of the activities are described in a three-part report. The first part of the report contains the data and analysis of engine system trade studies, which were carried out to estimate the reduction in engine emissions and fuel burn enabled by advanced materials and manufacturing processes. A number of key engine components were identified in which advanced materials and additive manufacturing processes would provide the most significant benefits to engine operation. The technical scope of activities included an assessment of the feasibility of using additive manufacturing technologies to fabricate gas turbine engine components from polymer and ceramic matrix composites, which was accomplished by fabricating prototype engine components and testing them under simulated engine operating conditions. The manufacturing process parameters were developed and optimized for polymer and ceramic composites (described in detail in the second and third parts of the report). A number of prototype components (inlet guide vanes (IGV), acoustic liners, an engine access door) were additively manufactured using high temperature polymer materials. Ceramic matrix composite components included turbine nozzle components. In addition, IGVs and acoustic liners were tested under simulated engine conditions in test rigs. The test results are reported and discussed in detail.

  14. Pressure activated interconnection of micro transfer printed components

    NASA Astrophysics Data System (ADS)

    Prevatte, Carl; Guven, Ibrahim; Ghosal, Kanchan; Gomez, David; Moore, Tanya; Bonafede, Salvatore; Raymond, Brook; Trindade, António Jose; Fecioru, Alin; Kneeburg, David; Meitl, Matthew A.; Bower, Christopher A.

    2016-05-01

    Micro transfer printing and other forms of micro assembly deterministically produce heterogeneously integrated systems of miniaturized components on non-native substrates. Most micro assembled systems include electrical interconnections to the miniaturized components, typically accomplished by metal wires formed on the non-native substrate after the assembly operation. An alternative scheme establishing interconnections during the assembly operation is a cost-effective manufacturing method for producing heterogeneous microsystems, and facilitates the repair of integrated microsystems, such as displays, by ex post facto addition of components to correct defects after system-level tests. This letter describes pressure-concentrating conductor structures formed on silicon (1 0 0) wafers to establish connections to preexisting conductive traces on glass and plastic substrates during micro transfer printing with an elastomer stamp. The pressure concentrators penetrate a polymer layer to form the connection, and reflow of the polymer layer bonds the components securely to the target substrate. The experimental yield of series-connected test systems with >1000 electrical connections demonstrates the suitability of the process for manufacturing, and robustness of the test systems against exposure to thermal shock, damp heat, and mechanical flexure shows reliability of the resulting bonds.

  15. Frequency Response Function Expansion for Unmeasured Translation and Rotation Dofs for Impedance Modelling Applications

    NASA Astrophysics Data System (ADS)

    Avitabile, P.; O'Callahan, J.

    2003-07-01

    Inclusion of rotational effects is critical for the accuracy of the predicted system characteristics in almost all system modelling studies. However, experimentally derived information for the description of one or more of the components of the system will generally not have any rotational effects included in the description of the component. The lack of rotational effects has long affected the results from any system model development, whether using a modal-based approach or an impedance-based approach. Several new expansion processes are described herein for the development of FRFs needed for impedance-based system models. These techniques expand experimentally derived mode shapes, residual modes from the modal parameter estimation process, and FRFs directly, to allow for the inclusion of the necessary rotational dofs. The FRFs involving translational to rotational dofs are developed, as well as the rotational to rotational dofs. Examples are provided to show the use of these techniques.
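    To illustrate the flavour of such an expansion, the sketch below uses a SEREP-style mode-shape transformation with invented data; it is a minimal assumption-laden example, not the authors' exact formulation. A measured FRF matrix at translational dofs is mapped to a full dof set (translations plus rotations) through a transformation built from analytical mode shapes:

```python
import numpy as np

# Hypothetical sizes: 6 full dofs (3 translations + 3 rotations), 3 measured
# translational dofs, 2 retained modes. All data here are random placeholders.
rng = np.random.default_rng(0)
phi_full = rng.standard_normal((6, 2))   # FE mode shapes at all dofs
phi_meas = phi_full[:3, :]               # shape rows at the measured dofs

# SEREP-style expansion matrix: maps measured-dof motion to all dofs.
T = phi_full @ np.linalg.pinv(phi_meas)  # (6, 3)

# Expand a measured FRF matrix (one frequency line) from 3x3 to 6x6,
# producing translation-rotation and rotation-rotation FRF estimates.
H_meas = rng.standard_normal((3, 3))
H_full = T @ H_meas @ T.T
print(H_full.shape)
```

    The expanded partitions involving rotational dofs are only as good as the analytical mode shapes used to build the transformation, which is why the abstract emphasises expanding residual modes and FRFs directly as well.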

  16. Design and analysis of x-ray vision systems for high-speed detection of foreign body contamination in food

    NASA Astrophysics Data System (ADS)

    Graves, Mark; Smith, Alexander; Batchelor, Bruce G.; Palmer, Stephen C.

    1994-10-01

    In the food industry there is an ever increasing need to control and monitor food quality. In recent years fully automated x-ray inspection systems have been used to inspect food on-line for foreign body contamination. These systems involve a complex integration of x-ray imaging components with state of the art high speed image processing. The quality of the x-ray image obtained by such systems is very poor compared with images obtained from other inspection processes; this makes reliable detection of very small, low contrast defects extremely difficult. It is therefore important to optimize the x-ray imaging components to give the best image possible. In this paper we present a method of analyzing the x-ray imaging system in order to consider the contrast obtained when viewing small defects.
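    To make the low-contrast problem concrete, a minimal sketch using the Beer-Lambert attenuation law follows. The attenuation coefficients and thicknesses are illustrative values chosen for this example, not the paper's data:

```python
import math

def transmitted(i0, mu_per_cm, t_cm):
    """Beer-Lambert law: intensity after a thickness t of material with
    linear attenuation coefficient mu."""
    return i0 * math.exp(-mu_per_cm * t_cm)

# Illustrative case: 5 cm of food (mu ~ 0.2 /cm), with and without a
# 1 mm glass fragment (mu ~ 0.6 /cm) replacing 1 mm of food.
i_background = transmitted(1.0, 0.2, 5.0)
i_defect = transmitted(1.0, 0.2, 4.9) * math.exp(-0.6 * 0.1)

contrast = (i_background - i_defect) / i_background
print(f"fractional contrast: {contrast:.1%}")  # about 3.9%
```

    A few percent of contrast is easily swamped by detector noise, which is why optimizing every imaging component matters for detecting small defects.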

  17. ICIASF '85 - International Congress on Instrumentation in Aerospace Simulation Facilities, 11th, Stanford University, CA, August 26-28, 1985, Record

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Developments related to laser Doppler velocimetry are discussed, taking into account a three-component dual beam laser-Doppler-anemometer to be operated in large wind tunnels, a new optical system for three-dimensional laser-Doppler-anemometry using an argon-ion and a dye laser, and a two-component laser Doppler velocimeter by switching fringe orientation. Other topics studied are concerned with facilities, instrumentation, control, hot wire/thin film measurements, optical diagnostic techniques, signal and data processing, facilities and adaptive wall test sections, data acquisition and processing, ballistic instrument systems, dynamic testing and material deformation measurements, optical flow measurements, test techniques, force measurement systems, and holography. Attention is given to nonlinear calibration of integral wind tunnel balances, a microcomputer system for real time digitized image compression, and two phase flow diagnostics in propulsion systems.

  18. Composing Models of Geographic Physical Processes

    NASA Astrophysics Data System (ADS)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central for geographic information science; yet geographic information systems (GIS) lack capabilities to represent process related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed into basic models of geographic physical processes, which is shown by means of an example.
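    As a sketch of the "partial difference equation" view of a process component, the minimal example below (our own illustration, not the authors' language) writes 1-D diffusion as an explicit update rule over a grid, the discrete counterpart of a partial differential equation:

```python
def diffuse_step(u, d=0.2):
    """One explicit finite-difference step of du/dt = D * d2u/dx2.
    d = D*dt/dx^2 must be <= 0.5 for stability; boundary cells are held fixed."""
    return [u[i] + d * (u[i - 1] - 2 * u[i] + u[i + 1])
            if 0 < i < len(u) - 1 else u[i]
            for i in range(len(u))]

u = [0.0, 0.0, 1.0, 0.0, 0.0]   # initial concentration spike
for _ in range(10):
    u = diffuse_step(u)
print(u)  # the spike spreads toward its neighbours
```

    Composing such update rules (e.g. adding an advection term) is the kind of building-block combination a process description language aims to make explicit.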

  19. Understanding Female Receiver Psychology in Reproductive Contexts.

    PubMed

    Lynch, Kathleen S

    2017-10-01

    Mate choice decision-making requires four components: sensory, cognitive, motivation, and salience. During the breeding season, the neural mechanisms underlying these components act in concert to radically transform the way a female perceives the social cues around her, as well as the way in which cognitive and motivational processes influence her decision to respond to courting males. The role of each of these four components in mate choice responses is discussed here, as are the brain regions involved in regulating each component. These components are not independent, modular systems; instead, they are dependent on one another. This review discusses the many ways in which these components interact and affect one another. The interaction of these components, however, ultimately leads back to a few key neuromodulators that thread motivation, sensory, salience, and cognitive components into a set of interdependent processes. These neuromodulators are estrogens and catecholamines. This review highlights the need to understand estrogens in reproductive contexts not simply as a 'sexual motivation modulator', or catecholamines as 'cognitive regulators', but as neuromodulators that work together to fully transform a non-breeding female into a fully reproductive female displaying: heightened sexual interest in courting males, greater arousal and selective attention toward courtship signals, improved signal detection and discrimination abilities, enhanced contextual signal memory, and increased motivation to respond to signals assigned incentive salience. The aim of this review is to build a foundation for understanding the brain regions associated with cognitive, sensory, motivational, and signal salience processes not as independently acting systems but as a set of interacting processes that function together in a context-appropriate manner.

  20. Systems metabolic engineering in an industrial setting.

    PubMed

    Sagt, Cees M J

    2013-03-01

    Systems metabolic engineering is based on systems biology, synthetic biology, and evolutionary engineering and is now also applied in industry. Industrial use of systems metabolic engineering focuses on strain and process optimization. Since ambitious yields, titers, productivities, and low costs are key in an industrial setting, the use of effective and robust methods in systems metabolic engineering is becoming very important. Major improvements in the field of proteomics and metabolomics have been crucial in the development of genome-wide approaches in strain and process development. This is accompanied by a rapid increase in DNA sequencing and synthesis capacity. These developments enable the use of systems metabolic engineering in an industrial setting. Industrial systems metabolic engineering can be defined as the combined use of genome-wide genomics, transcriptomics, proteomics, and metabolomics to modify strains or processes. This approach has become very common since the technology for generating large data sets at all levels of the cellular processes has developed quite fast into robust, reliable, and affordable methods. The main challenge, and the scope of this mini review, is how to translate these large data sets into relevant biological leads that can be tested for strain or process improvements. Experimental setup, heterogeneity of the culture, and sample pretreatment are important issues which are easily underrated. In addition, the process of structuring, filtering, and visualizing data is important, as is the availability of a genetic toolbox and of equipment for medium/high-throughput fermentation, which is a key success factor. For an efficient bioprocess, all the different components in this process have to work together. Therefore, mutual tuning of these components is an important strategy.

  1. Behavioral System Feedback Measurement Failure: Sweeping Quality under the Rug

    ERIC Educational Resources Information Center

    Mihalic, Maria T.; Ludwig, Timothy D.

    2009-01-01

    Behavioral Systems rely on valid measurement systems to manage processes and feedback and to deliver contingencies. An examination of measurement system components designed to track customer service quality of furniture delivery drivers revealed the measurement system failed to capture information it was designed to measure. A reason for this…

  2. Computerized Adaptive Testing System Design: Preliminary Design Considerations.

    ERIC Educational Resources Information Center

    Croll, Paul R.

    A functional design model for a computerized adaptive testing (CAT) system was developed and presented through a series of hierarchy plus input-process-output (HIPO) diagrams. System functions were translated into system structure: specifically, into 34 software components. Implementation of the design in a physical system was addressed through…

  3. Novel cost controlled materials and processing for primary structures

    NASA Technical Reports Server (NTRS)

    Dastin, S. J.

    1993-01-01

    Textile laminates, developed a number of years ago, have recently been shown to be applicable to primary aircraft structures for both small and large components. Such structures have the potential to reduce acquisition costs but require advanced automated processing to keep costs controlled while verifying product reliability and assuring structural integrity, durability, and affordable life-cycle costs. Recently, resin systems and graphite-reinforced woven shapes have been developed that have the potential for improved RTM processes for aircraft structures. Ciba-Geigy, Brochier Division has registered an RTM prepreg reinforcement called 'Injectex' that has shown effectiveness for aircraft components. Other novel approaches discussed are thermotropic resins producing components by injection molding and ceramic polymers for long-duration hot structures. The potential of such materials and processing will be reviewed along with initial information/data available to date.

  4. On-line dimensional measurement of small components on the eyeglasses assembly line

    NASA Astrophysics Data System (ADS)

    Rosati, G.; Boschetti, G.; Biondi, A.; Rossi, A.

    2009-03-01

    Dimensional measurement of the subassemblies at the beginning of the assembly line is a very crucial process for the eyeglasses industry, since even small manufacturing errors of the components can lead to very visible defects on the final product. For this reason, all subcomponents of the eyeglass are verified before beginning the assembly process, either with a 100% inspection or on a statistical basis. Inspection is usually performed by human operators, with high costs and a degree of repeatability which is not always satisfactory. This paper presents a novel on-line measuring system for dimensional verification of small metallic subassemblies for the eyeglasses industry. The machine vision system proposed, which was designed to be used at the beginning of the assembly line, could also be employed in Statistical Process Control (SPC) by the manufacturer of the subassemblies. The automated system proposed is based on artificial vision, and exploits two CCD cameras and an anthropomorphic robot to inspect and manipulate the subcomponents of the eyeglass. Each component is recognized by the first camera in a quite large workspace, picked up by the robot, and placed in the small vision field of the second camera, which performs the measurement process. Finally, the part is palletized by the robot. The system can be easily taught by the operator by simply placing the template object in the vision field of the measurement camera (for dimensional data acquisition) and then instructing the robot via the Teaching Control Pendant within the vision field of the first camera (for pick-up transformation acquisition). The major problem we dealt with is that the shape and dimensions of the subassemblies can vary over quite a wide range, yet different positionings of the same component can look very similar to one another. For this reason, a specific shape recognition procedure was developed.
In the paper, the whole system is presented together with first experimental lab results.

  5. Design and implementation of highly parallel pipelined VLSI systems

    NASA Astrophysics Data System (ADS)

    Delange, Alphonsus Anthonius Jozef

    A methodology and its realization as a prototype CAD (Computer Aided Design) system for the design and analysis of complex multiprocessor systems is presented. The design is an iterative process in which the behavioral specifications of the system components are refined into structural descriptions consisting of interconnections, lower level components, and so on. A model for the representation and analysis of multiprocessor systems at several levels of abstraction, and an implementation of a CAD system based on this model, are described. A high level design language, an object oriented development kit for tool design, a design data management system, and design and analysis tools such as a high level simulator and a graphics design interface, all of which are integrated into the prototype system, are described. Procedures are described for the synthesis of semiregular processor arrays, for computing the switching of input/output signals, for memory management and control of the processor array, and for the sequencing and segmentation of input/output data streams that result from partitioning and clustering of the processor array during the subsequent synthesis steps. The architecture and control of a parallel system are designed, and each component is mapped to a module or module generator in a symbolic layout library and compacted for the design rules of the VLSI (Very Large Scale Integration) technology. An example is given of the design of a processor that is a useful building block for highly parallel pipelined systems in the signal/image processing domains.

  6. Research on design connotation of hair drier system

    NASA Astrophysics Data System (ADS)

    Li, Yongchuan; Wu, Qiong

    2018-04-01

    Following an analysis and summary of research on the design of hair drier systems, this paper focuses on system design. Product system design studies not only the product as an entity but also its parts, elements, and components as a system, in order to analyze in depth how product system design can innovate. On this basis, an association analysis of the component elements of hair driers is carried out, together with an overall analysis and study of the system design process for hair driers. Taking the product life cycle as the main goal, system analysis, system synthesis, and system optimization are applied to solve the problems of product design. This is of great practical significance.

  7. Integration, design, and construction of a CELSS breadboard facility for bioregenerative life support system research

    NASA Technical Reports Server (NTRS)

    Prince, R.; Knott, W.; Buchanan, Paul

    1987-01-01

    Design criteria for the Biomass Production Chamber (BPC), preliminary operating procedures, and requirements for the future development of the Controlled Ecological Life Support System (CELSS) are discussed. CELSS, which uses a bioregenerative system, includes the following three major units: (1) a biomass production component to grow plants under controlled conditions; (2) food processing components to derive maximum edible content from all plant parts; and (3) waste management components to recover and recycle all solids, liquids, and gases necessary to support life. The current status of the CELSS breadboard facility is reviewed; a block diagram of a simplified version of CELSS and schematic diagrams of the BPC are included.

  8. Simulation of the human-telerobot interface on the Space Station

    NASA Technical Reports Server (NTRS)

    Stuart, Mark A.; Smith, Randy L.

    1993-01-01

    Many issues remain unresolved concerning the components of the human-telerobot interface presented in this work. It is critical that these components be optimally designed and arranged to ensure not only that the overall system's goals are met, but also that the intended end-user has been optimally accommodated. With sufficient testing and evaluation throughout the development cycle, the selection of the components to use in the final telerobotic system can promote efficient, error-free performance. It is recommended that whole-system simulation with full-scale mockups be used to help design the human-telerobot interface. It is contended that the use of simulation can facilitate this design and evaluation process.

  9. Department of Defense picture archiving and communication system acceptance testing: results and identification of problem components.

    PubMed

    Allison, Scott A; Sweet, Clifford F; Beall, Douglas P; Lewis, Thomas E; Monroe, Thomas

    2005-09-01

    The PACS implementation process is complicated, requiring a tremendous amount of time, resources, and planning. The Department of Defense (DOD) has significant experience in developing and refining PACS acceptance testing (AT) protocols that assure contract compliance, clinical safety, and functionality. The DOD's AT experience under the initial Medical Diagnostic Imaging Support System contract led to the current Digital Imaging Network-Picture Archiving and Communications Systems (DIN-PACS) contract AT protocol. To identify the most common system and component deficiencies under the current DIN-PACS AT protocol, 14 tri-service sites were evaluated during 1998-2000. Sixteen system deficiency citations with 154 separate types of limitations were noted, with problems involving the workstation, interfaces, and the Radiology Information System comprising more than 50% of the citations. Larger PACS deployments were associated with a higher number of deficiencies. The most commonly cited system deficiencies were among the most expensive components of the PACS.

  10. Phase change water processing for Space Station

    NASA Technical Reports Server (NTRS)

    Zdankiewicz, E. M.; Price, D. F.

    1985-01-01

    The use of a vapor compression distillation subsystem (VCDS) for water recovery on the Space Station is analyzed. The self-contained automated system can process waste water at a rate of 32.6 kg/day and requires only 115 W of electric power. The improvements in the mechanical components of VCDS are studied. The operation of VCDS in the normal mode is examined. The VCDS preprototype is evaluated based on water quality, water production rate, and specific energy. The relation between water production rate and fluids pump speed is investigated; it is concluded that a variable speed fluids pump will optimize water production. Components development and testing currently being conducted are described. The properties and operation of the proposed phase change water processing system for the Space Station, based on vapor compression distillation, are examined.
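    The processing rate and power figures quoted above imply a specific energy that is easy to check; the short calculation below assumes the 115 W draw is continuous over a 24 h day, which the abstract does not state explicitly:

```python
# Figures from the abstract: 32.6 kg of waste water per day at 115 W.
rate_kg_per_day = 32.6
power_w = 115.0

# Assuming continuous operation, energy per day divided by mass per day.
energy_wh_per_day = power_w * 24.0
specific_energy = energy_wh_per_day / rate_kg_per_day
print(f"{specific_energy:.1f} Wh per kg of recovered water")  # ~84.7 Wh/kg
```

    For comparison, simply boiling water takes roughly 630 Wh/kg of latent heat, which is why vapor compression distillation, with its heat recovery, is attractive for spacecraft water recovery.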

  11. CIM at GE's factory of the future

    NASA Astrophysics Data System (ADS)

    Waldman, H.

    Functional features of a highly automated aircraft component batch processing factory are described. The system has processing, working, and methodology components. A rotating parts operation installed 20 years ago features a high density of numerically controlled machines, and is connected to a hierarchical network of data communications and apparatus for moving the rotating parts and tools of engines. Designs produced at one location in the country are sent by telephone link to other sites for development of manufacturing plans, tooling, numerical control programs, and process instructions for the rotating parts. Direct numerical control is implemented at the work stations, which have instructions stored on tape for back-up in case the host computer goes down. Each machine is automatically monitored at 48 points, and notice of failure can originate from any point in the system.
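    A minimal sketch of the kind of point-monitoring logic described, where any of the 48 monitored points can raise a failure notice; point names and limits here are hypothetical, not from the article:

```python
def check_points(readings, limits):
    """Return the ids of monitored points whose reading falls outside
    its (low, high) limit pair."""
    return [pid for pid, value in readings.items()
            if not (limits[pid][0] <= value <= limits[pid][1])]

# 48 hypothetical monitored points, all with the same illustrative limits.
limits = {f"pt{i}": (0.0, 100.0) for i in range(48)}
readings = {f"pt{i}": 50.0 for i in range(48)}
readings["pt7"] = 130.0   # simulate one out-of-range sensor

print(check_points(readings, limits))  # ['pt7']
```

    In the factory described, such per-point checks would feed the hierarchical data network so that a failure notice can originate from any point in the system.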

  12. Activated Biological Filters (ABF Towers). Student Manual. Biological Treatment Process Control.

    ERIC Educational Resources Information Center

    Wooley, John F.

    This student manual contains textual material for a two-lesson unit on activated bio-filters (ABF). The first lesson (the sewage treatment plant) examines those process units that are unique to the ABF system. The lesson includes a review of the structural components of the ABF system and their functions and a discussion of several operational…

  13. Apparatus and method for converting biomass to feedstock for biofuel and biochemical manufacturing processes

    DOEpatents

    Kania, John; Qiao, Ming; Woods, Elizabeth M.; Cortright, Randy D.; Myren, Paul

    2015-12-15

    The present invention includes improved systems and methods for producing biomass-derived feedstocks for biofuel and biochemical manufacturing processes. The systems and methods use components that are capable of transferring relatively high concentrations of solid biomass utilizing pressure variations between vessels, and allow for the recovery and recycling of heterogeneous catalyst materials.

  14. Health care professional workstation: software system construction using DSSA scenario-based engineering process.

    PubMed

    Hufnagel, S; Harbison, K; Silva, J; Mettala, E

    1994-01-01

    This paper describes a new method for the evolutionary determination of user requirements and system specifications called the scenario-based engineering process (SEP). Health care professional workstations are critical components of large scale health care system architectures. We suggest that domain-specific software architectures (DSSAs) be used to specify standard interfaces and protocols for reusable software components throughout those architectures, including workstations. We encourage the use of engineering principles and abstraction mechanisms. Engineering principles are flexible guidelines, adaptable to particular situations. Abstraction mechanisms are simplifications for management of complexity. We recommend object-oriented design principles, graphical structural specifications, and formal behavioral specifications of components. We give an ambulatory care scenario and associated models to demonstrate SEP. The scenario uses health care terminology and gives patients' and health care providers' views of the system. Our goal is a threefold benefit: (i) scenario view abstractions provide consistent interdisciplinary communications; (ii) hierarchical object-oriented structures provide useful abstractions for reuse, understandability, and long term evolution; and (iii) integration of SEP and the health care DSSA into computer aided software engineering (CASE) environments. These environments should support rapid construction and certification of individualized systems, from reuse libraries.

  15. Determining of a robot workspace using the integration of a CAD system with a virtual control system

    NASA Astrophysics Data System (ADS)

    Herbuś, K.; Ociepka, P.

    2016-08-01

    The paper presents a method for determining the workspace of an industrial robot using an approach consisting in the integration of a 3D model of an industrial robot with a virtual control system. The robot model with its work environment, prepared for motion simulation, was created in the "Motion Simulation" module of the Siemens PLM NX software. In this model, components of the "link" type were created, which map the geometrical form of particular elements of the robot, together with components of the "joint" type, which map the way the "link" components cooperate. The paper proposes a solution in which the control process of a virtual robot is similar to the control process of a real robot using the manual control panel (teach pendant). For this purpose, the control application "JOINT" was created, which provides manipulation of the virtual robot in accordance with its internal control system. The set of procedures stored in an .xlsx file is the element integrating the 3D robot model working in the CAD/CAE class system with the elaborated control application.

  16. Component-Level Electronic-Assembly Repair (CLEAR) System Architecture

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard C.; Bradish, Martin A.; Juergens, Jeffrey R.; Lewis, Michael J.; Vrnak, Daniel R.

    2011-01-01

    This document captures the system architecture for a Component-Level Electronic-Assembly Repair (CLEAR) capability needed for electronics maintenance and repair of the Constellation Program (CxP). CLEAR is intended to improve flight system supportability and reduce the mass of spares required to maintain the electronics of human rated spacecraft on long duration missions. By necessity it allows the crew to make repairs that would otherwise be performed by Earth based repair depots. Because of the practical knowledge and skill limitations of small spaceflight crews, they must be augmented by Earth based support crews and automated repair equipment. This system architecture covers the complete system from ground-user to flight hardware and flight crew and defines an Earth segment and a Space segment. The Earth Segment involves database management, operational planning, and remote equipment programming and validation processes. The Space Segment involves the automated diagnostic, test, and repair equipment required for a complete repair process. This document defines three major subsystems: tele-operations that link the flight hardware to ground support, highly reconfigurable diagnostics and test instruments, and a CLEAR Repair Apparatus that automates the physical repair process.

  17. Advanced Environmental Barrier Coating Development for SiC-SiC Ceramic Matrix Composite Components

    NASA Technical Reports Server (NTRS)

    Zhu, Dongming; Harder, Bryan; Hurst, Janet B.; Halbig, Michael Charles; Puleo, Bernadette J.; Costa, Gustavo; Mccue, Terry R.

    2017-01-01

    This presentation reviews the NASA advanced environmental barrier coating (EBC) system development for SiC-SiC Ceramic Matrix Composite (CMC) combustors, particularly under the NASA Environmentally Responsible Aviation, Fundamental Aeronautics, and Transformative Aeronautics Concepts Programs. The emphases have been placed on the current design challenges of 2700-3000 °F capable environmental barrier coatings for low NOX emission combustors for next generation turbine engines using advanced plasma spray based processes, and on the coating processing and integration with SiC-SiC CMCs and component systems. The developments have also included candidate coating composition system designs, degradation mechanisms, performance evaluation and down-selects; processing optimizations using TriplexPro Air Plasma Spray, Low Pressure Plasma Spray (LPPS), and Plasma Spray Physical Vapor Deposition; and demonstration of EBC-CMC systems. This presentation also highlights the EBC-CMC system temperature capability and durability improvements under the NASA development programs, as demonstrated in simulated engine high heat flux, combustion environments, in conjunction with high heat flux, mechanical creep, and fatigue loading testing conditions.

  18. Stripping Away the Soil: Plant Growth Promoting Microbiology Opportunities in Aquaponics

    PubMed Central

    Bartelme, Ryan P.; Oyserman, Ben O.; Blom, Jesse E.; Sepulveda-Villet, Osvaldo J.; Newton, Ryan J.

    2018-01-01

    As the processes facilitated by plant growth promoting microorganisms (PGPMs) become better characterized, it is evident that PGPMs may be critical for successful sustainable agricultural practices. Microbes enrich plant growth through various mechanisms, such as enhancing resistance to disease and drought, producing beneficial molecules, and supplying nutrients and trace metals to the plant rhizosphere. Previous studies of PGPMs have focused primarily on soil-based crops. In contrast, aquaponics is a water-based agricultural system, in which production relies upon internal nutrient recycling to co-cultivate plants with fish. This arrangement has management benefits compared to soil-based agriculture, as system components may be designed to directly harness microbial processes that make nutrients bioavailable to plants in downstream components. However, aquaponic systems also present unique management challenges. Microbes may compete with plants for certain micronutrients, such as iron, which makes exogenous supplementation necessary, adding production cost and process complexity, and limiting profitability and system sustainability. Research on PGPMs in aquaponic systems currently lags behind that in traditional agricultural systems; however, it is clear that certain parallels in nutrient use and plant-microbe interactions are retained from soil-based agricultural systems. PMID:29403461

  19. [Study on "multi-dimensional structure and process dynamics quality control system" of Danshen infusion solution based on component structure theory].

    PubMed

    Feng, Liang; Zhang, Ming-Hua; Gu, Jun-Fei; Wang, Gui-You; Zhao, Zi-Yu; Jia, Xiao-Bin

    2013-11-01

    As traditional Chinese medicine (TCM) preparation products feature complex compounds and multiple preparation processes, the implementation of quality control in line with the characteristics of TCM preparation products provides a firm guarantee for their clinical efficacy and safety. Danshen infusion solution is a preparation commonly used in the clinic, but its quality control is restricted to indexes of finished products, which cannot guarantee its inherent quality. Our study group has proposed a "multi-dimensional structure and process dynamics quality control system" on the basis of the "component structure theory", for the purpose of controlling the quality of Danshen infusion solution at multiple levels and in multiple links, from the efficacy-related material basis, the safety-related material basis, and the characteristics of the dosage form to the preparation process. In this article, we bring forth new ideas and models for the quality control of TCM preparation products.

  20. Modeling and Simulation Roadmap to Enhance Electrical Energy Security of U.S. Naval Bases

    DTIC Science & Technology

    2012-03-01

    ...a well-validated and consistent process for evaluating power system architectures and technologies and, therefore, can become a valuable tool for the implementation of the described plan for Navy...a consistent process for evaluating power system architectures and component technologies is needed to support the development and implementation of these new

  1. Color Image Processing and Object Tracking System

    NASA Technical Reports Server (NTRS)

    Klimek, Robert B.; Wright, Ted W.; Sielken, Robert S.

    1996-01-01

    This report describes a personal-computer-based system for automatic and semiautomatic tracking of objects on film or video tape, developed to meet the needs of the Microgravity Combustion and Fluids Science Research Programs at the NASA Lewis Research Center. The system consists of individual hardware components working under computer control to achieve a high degree of automation. The most important hardware components include 16-mm and 35-mm film transports, a high-resolution digital camera mounted on an x-y-z micro-positioning stage, an S-VHS tape deck, a Hi8 tape deck, a video laserdisc, and a framegrabber. All of the image input devices are remotely controlled by a computer. Software was developed to integrate the overall operation of the system, including device frame incrementation, grabbing of image frames, image processing of the object's neighborhood, locating the position of the object being tracked, and storing the coordinates in a file. This process is performed repeatedly until the last frame is reached. Several different tracking methods are supported. To illustrate the process, two representative applications of the system are described. These applications represent typical uses of the system and include tracking the propagation of a flame front and tracking the movement of a liquid-gas interface with extremely poor visibility.

  2. Security Implications of OPC, OLE, DCOM, and RPC in Control Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2006-01-01

    OPC is a collection of software programming standards and interfaces used in the process control industry. It is intended to provide open connectivity and vendor equipment interoperability. The use of OPC technology simplifies the development of control systems that integrate components from multiple vendors and support multiple control protocols. OPC-compliant products are available from most control system vendors and are widely used in the process control industry. OPC was originally known as OLE for Process Control; the first standards for OPC were based on underlying services in the Microsoft Windows computing environment. These underlying services (OLE [Object Linking and Embedding], DCOM [Distributed Component Object Model], and RPC [Remote Procedure Call]) have been the source of many severe security vulnerabilities. It is not feasible to automatically apply vendor patches and service packs to mitigate these vulnerabilities in a control systems environment. Control systems using the original OPC data access technology can thus inherit the vulnerabilities associated with these services. Current OPC standardization efforts are moving away from the original focus on Microsoft protocols, with a distinct trend toward web-based protocols that are independent of any particular operating system. However, the installed base of OPC equipment consists mainly of legacy implementations of the OLE for Process Control protocols.

  3. The development of a non-cryogenic nitrogen/oxygen supply system [using hydrazine/water electrolysis]

    NASA Technical Reports Server (NTRS)

    Greenough, B. M.; Mahan, R. E.

    1974-01-01

    A hydrazine/water electrolysis process system module design was fabricated and tested to demonstrate component and module performance. This module is capable of providing both the metabolic oxygen for crew needs and the oxygen and nitrogen for spacecraft leak makeup. The component designs evolved through previous R and D efforts, and were fabricated and tested individually and then were assembled into a complete module which was successfully tested for 1000 hours to demonstrate integration of the individual components. A survey was made of hydrazine sensor technology and a cell math model was derived.

  4. Digital imaging technology assessment: Digital document storage project

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An ongoing technical assessment and requirements definition project is examining the potential role of digital imaging technology at NASA's STI facility. The focus is on the basic components of imaging technology in today's marketplace as well as the components anticipated in the near future. Presented is a requirement specification for a prototype project, an initial examination of current image processing at the STI facility, and an initial summary of image processing projects at other sites. Operational imaging systems incorporate scanners, optical storage, high resolution monitors, processing nodes, magnetic storage, jukeboxes, specialized boards, optical character recognition gear, pixel addressable printers, communications, and complex software processes.

  5. Process for Selecting System Level Assessments for Human System Technologies

    NASA Technical Reports Server (NTRS)

    Watts, James; Park, John

    2006-01-01

    The integration of the many life support systems necessary to construct a stable habitat is difficult. The correct identification of the appropriate technologies and corresponding interfaces is an exhaustive process. Once technologies are selected, secondary issues such as mechanical and electrical interfaces must be addressed. The required analytical and testing work must be approached in a piecewise fashion to achieve timely results. A repeatable process has been developed to identify and prioritize system-level assessments and testing needs. This Assessment Selection Process has been defined to assess cross-cutting integration issues at the system or component level. Assessments are used to identify risks, encourage future actions to mitigate risks, or spur further studies.

  6. Analysis and optimization of solid oxide fuel cell-based auxiliary power units using a generic zero-dimensional fuel cell model

    NASA Astrophysics Data System (ADS)

    Göll, S.; Samsun, R. C.; Peters, R.

    Fuel-cell-based auxiliary power units can help to reduce fuel consumption and emissions in transportation. For this application, the combination of solid oxide fuel cells (SOFCs) with upstream fuel processing by autothermal reforming (ATR) is seen as a highly favorable configuration. Notwithstanding the necessity to improve each single component, an optimized architecture of the fuel cell system as a whole must be achieved. To enable model-based analyses, a system-level approach is proposed in which the fuel cell system is modeled as a multi-stage thermo-chemical process using the "flowsheeting" environment PRO/II™. Therein, the SOFC stack and the ATR are characterized entirely by corresponding thermodynamic processes together with global performance parameters. The developed model is then used to achieve an optimal system layout by comparing different system architectures. A system with anode and cathode off-gas recycling was identified to have the highest electric system efficiency. Taking this system as a basis, the potential for further performance enhancement was evaluated by varying four parameters characterizing different system components. Using methods from the design and analysis of experiments, the effects of these parameters and of their interactions were quantified, leading to an overall optimized system with encouraging performance data.
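
    The "design and analysis of experiments" step mentioned above can be sketched with a toy two-level factorial calculation. All factor names and efficiency values below are invented for illustration and are not taken from the paper; the sketch only shows how main effects and an interaction are estimated from a 2^2 design:

```python
# Hypothetical 2^2 factorial study of two system parameters:
#   factor 0 = anode off-gas recycle ratio (-1 low, +1 high)
#   factor 1 = fuel utilization            (-1 low, +1 high)
# Responses (electric efficiency, %) are invented for illustration.
runs = {
    (-1, -1): 38.0,
    (+1, -1): 41.0,
    (-1, +1): 40.5,
    (+1, +1): 45.5,
}

def main_effect(factor):
    # average response change when one factor goes from low to high
    return sum(levels[factor] * y for levels, y in runs.items()) / 2

def interaction():
    # extra response change when both factors are at the high level together
    return sum(levels[0] * levels[1] * y for levels, y in runs.items()) / 2

e_recycle = main_effect(0)  # 4.0: recycle ratio dominates in this toy data
e_util = main_effect(1)     # 3.5
e_inter = interaction()     # 1.0: small positive synergy
```

    With four runs, the two main effects and their interaction are fully resolved, which is why such screening designs are economical for ranking system parameters before detailed optimization.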

  7. Process and application of shock compression by nanosecond pulses of frequency-doubled Nd:YAG laser

    NASA Astrophysics Data System (ADS)

    Sano, Yuji; Kimura, Motohiko; Mukai, Naruhiko; Yoda, Masaki; Obata, Minoru; Ogisu, Tatsuki

    2000-02-01

    The authors have developed a new process of laser-induced shock compression to introduce a residual compressive stress on material surface, which is effective for prevention of stress corrosion cracking (SCC) and enhancement of fatigue strength of metal materials. The process developed is unique and beneficial. It requires no pre-conditioning for the surface, whereas the conventional process requires that the so-called sacrificial layer is made to protect the surface from damage. The new process can be freely applied to water- immersed components, since it uses water-penetrable green light of a frequency-doubled Nd:YAG laser. The process developed has the potential to open up new high-power laser applications in manufacturing and maintenance technologies. The laser-induced shock compression process (LSP) can be used to improve a residual stress field from tensile to compressive. In order to understand the physics and optimize the process, the propagation of a shock wave generated by the impulse of laser irradiation and the dynamic response of the material were analyzed by time-dependent elasto-plastic calculations with a finite element program using laser-induced plasma pressure as an external load. The analysis shows that a permanent strain and a residual compressive stress remain after the passage of the shock wave with amplitude exceeding the yield strength of the material. A practical system materializing the LSP was designed, manufactured, and tested to confirm the applicability to core components of light water reactors (LWRs). The system accesses the target component and remotely irradiates laser pulses to the heat affected zone (HAZ) along weld lines. Various functional tests were conducted using a full-scale mockup facility, in which remote maintenance work in a reactor vessel could be simulated. The results showed that the system remotely accessed the target weld lines and successfully introduced a residual compressive stress. 
After sufficient training for operational personnel, the system was applied to the core shroud of an existing nuclear power plant.

  8. Ground Testing of Prototype Hardware and Processing Algorithms for a Wide Area Space Surveillance System (WASSS)

    DTIC Science & Technology

    2013-09-01

    Ground testing of prototype hardware and processing algorithms for a Wide Area Space Surveillance System (WASSS) Neil Goldstein, Rainer A...at Magdalena Ridge Observatory using the prototype Wide Area Space Surveillance System (WASSS) camera, which has a 4 x 60 field of view, < 0.05...objects with larger-aperture cameras. The sensitivity of the system depends on multi-frame averaging and a Principal Component Analysis-based image

  9. Independent component analysis based digital signal processing in coherent optical fiber communication systems

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Luo, Ming; Qiu, Ying; Alphones, Arokiaswami; Zhong, Wen-De; Yu, Changyuan; Yang, Qi

    2018-02-01

    In this paper, channel equalization techniques based on independent component analysis (ICA) for coherent optical fiber transmission systems are reviewed. The principle of ICA for blind source separation is introduced. ICA-based channel equalization after both single-mode fiber and few-mode fiber transmission is investigated for single-carrier and orthogonal frequency division multiplexing (OFDM) modulation formats, respectively. Performance comparisons with conventional channel equalization techniques are discussed.
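
    The blind-source-separation idea behind ICA equalization can be sketched with a toy two-channel model. The signal models, the mixing matrix, and the kurtosis-based rotation search below are illustrative assumptions, not the equalizer from the paper; they show only the core ICA mechanism of whitening followed by a non-Gaussianity search:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
# two independent, non-Gaussian "transmitted" signals (toy stand-ins)
s1 = rng.choice([-1.0, 1.0], size=n)     # binary, sub-Gaussian
s2 = rng.uniform(-1.0, 1.0, size=n)      # uniform, sub-Gaussian
S = np.vstack([s1, s2])

# unknown linear channel crosstalk (hypothetical mixing matrix)
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S

# whiten the received mixtures (zero mean, identity covariance)
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# after whitening, the sources differ only by an orthogonal transform;
# pick the rotation maximizing non-Gaussianity (|excess kurtosis|)
def kurt(y):
    return np.mean(y ** 4) - 3.0 * np.mean(y ** 2) ** 2

theta = max(np.linspace(0.0, np.pi, 361),
            key=lambda t: abs(kurt(np.cos(t) * Z[0] + np.sin(t) * Z[1])))
W = np.array([[np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])
Y = W @ Z  # recovered signals, up to order, sign, and scale
```

    The permutation and sign ambiguity visible here is intrinsic to ICA, which is why practical ICA equalizers need a small amount of side information (e.g. training symbols) to label the separated tributaries.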

  10. Review of Existing Programs, 2009-10

    ERIC Educational Resources Information Center

    Nevada System of Higher Education, 2010

    2010-01-01

    Pursuant to Board policy ("Title 4, Chapter 14, Section 4"--in part), a review of existing programs shall be conducted by all institutions of the Nevada System of Higher Education on a regularly scheduled basis. The process for reviewing programs varies by institution but contains similar vital components. These components include…

  11. Probabilistic Structural Analysis Methods (PSAM) for Select Space Propulsion System Components

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Probabilistic Structural Analysis Methods (PSAM) are described for the probabilistic structural analysis of engine components for current and future space propulsion systems. Components for these systems are subjected to stochastic thermomechanical launch loads. Uncertainty or randomness also occurs in material properties, structural geometry, and boundary conditions. Material property stochasticity, such as in modulus of elasticity or yield strength, exists in every structure and is a consequence of variations in material composition and manufacturing processes. Procedures are outlined for computing the probabilistic structural response or reliability of the structural components. The response variables include static or dynamic deflections, strains, and stresses at one or several locations, natural frequencies, fatigue or creep life, etc. Sample cases illustrate how the PSAM methods and codes simulate input uncertainties and compute probabilistic response or reliability using a finite element model with probabilistic methods.
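
    A minimal Monte Carlo sketch of this kind of probabilistic analysis follows. The component (an axially loaded rod) and every distribution parameter are hypothetical choices for illustration and do not come from the PSAM codes; the sketch only shows how load and strength randomness combine into a failure probability:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# hypothetical component: an axially loaded rod of fixed cross-section
area = 4.0e-4                              # cross-sectional area, m^2
load = rng.normal(60e3, 9e3, n)            # stochastic launch load, N
strength = rng.normal(250e6, 15e6, n)      # random yield strength, Pa

stress = load / area                       # axial stress per sample, Pa
p_fail = np.mean(stress > strength)        # fraction of samples exceeding yield
reliability = 1.0 - p_fail
```

    Even this crude sampling scheme reproduces the key PSAM idea: a deterministic margin (mean strength well above mean stress) can still carry a nonzero failure probability once both sides are treated as random variables.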

  12. Analysis of Hospital Processes with Process Mining Techniques.

    PubMed

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for the analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The performed analyses allowed for redefining functions in the system and proposing a proper flow of information. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application in making clinical and administrative decisions for the management of hospital activities.
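
    The event-log structure such a component produces, and the first step of process discovery on it, can be sketched as follows. The case IDs, activities, and timestamps are invented for illustration, not taken from the HIS described in the paper:

```python
from collections import Counter

# hypothetical event log: (case_id, activity, timestamp)
log = [
    (1, "admit", 0), (1, "triage", 5), (1, "treat", 20), (1, "discharge", 60),
    (2, "admit", 1), (2, "triage", 7), (2, "lab test", 15),
    (2, "treat", 30), (2, "discharge", 90),
    (3, "admit", 2), (3, "triage", 6), (3, "treat", 25), (3, "discharge", 70),
]

# group events into per-case traces, ordered by timestamp
traces = {}
for case, act, ts in sorted(log, key=lambda e: (e[0], e[2])):
    traces.setdefault(case, []).append(act)

# directly-follows counts: the raw material of discovery algorithms
# such as the alpha miner or heuristics miner
df = Counter()
for acts in traces.values():
    for a, b in zip(acts, acts[1:]):
        df[(a, b)] += 1
```

    From these counts a discovery algorithm can reconstruct the control flow (here, "lab test" appears as an optional step between triage and treatment), which is exactly the kind of analysis the generated logs are meant to enable.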

  13. Supportability Technologies for Future Exploration Missions

    NASA Technical Reports Server (NTRS)

    Watson, Kevin; Thompson, Karen

    2007-01-01

    Future long-duration human exploration missions will be challenged by resupply limitations and mass and volume constraints. Consequently, it will be essential that the logistics footprint required to support these missions be minimized and that capabilities be provided to make them highly autonomous from a logistics perspective. Strategies to achieve these objectives include broad implementation of commonality and standardization at all hardware levels and across all systems, repair of failed hardware at the lowest possible hardware level, and manufacture of structural and mechanical replacement components as needed. Repair at the lowest hardware levels will require the availability of compact, portable systems for diagnosis of failures in electronic systems and verification of system functionality following repair. Rework systems will be required that enable the removal and replacement of microelectronic components with minimal human intervention to minimize skill requirements and training demand for crews. Materials used in the assembly of electronic systems (e.g. solders, fluxes, conformal coatings) must be compatible with the available repair methods and the spacecraft environment. Manufacturing of replacement parts for structural and mechanical applications will require additive manufacturing systems that can generate near-net-shape parts from the range of engineering alloys employed in the spacecraft structure and in the parts utilized in other surface systems. These additive manufacturing processes will need to be supported by real-time non-destructive evaluation during layer-additive processing for on-the-fly quality control. This will provide capabilities for quality control and may serve as an input for closed-loop process control. Additionally, non-destructive methods should be available for material property determination. 
These nondestructive evaluation processes should be incorporated with the additive manufacturing process - providing an in-process capability to ensure that material deposited during layer-additive processing meets required material property criteria.

  14. An optoelectronic system for fringe pattern analysis

    NASA Astrophysics Data System (ADS)

    Sciammarella, C. A.; Ahmadshahi, M.

    A system capable of retrieving and processing information recorded in fringe patterns is reported. The principal components are described as well as the architecture in which they are assembled. An example of application is given.

  15. Potential for utilization of algal biomass for components of the diet in CELSS

    NASA Technical Reports Server (NTRS)

    Kamarei, A. R.; Nakhost, Z.; Karel, M.

    1986-01-01

    The major nutritional components of the green algae (Scenedesmus obliquus) grown in a Constant Cell Density Apparatus were determined. A suitable methodology was developed to prepare protein isolates from which three major undesirable components of these cells (i.e., cell walls, nucleic acids, and pigments) were either removed or substantially reduced. Results showed that processing of the green algae to a protein isolate enhances its potential nutritional and organoleptic acceptability as a diet component in a Controlled Ecological Life Support System (CELSS).

  16. Methodology to identify risk-significant components for inservice inspection and testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, M.T.; Hartley, R.S.; Jones, J.L. Jr.

    1992-08-01

    Periodic inspection and testing of vital system components should be performed to ensure the safe and reliable operation of Department of Energy (DOE) nuclear processing facilities. Probabilistic techniques may be used to help identify and rank components by their relative risk. A risk-based ranking would allow varied DOE sites to implement inspection and testing programs in an effective and cost-efficient manner. This report describes a methodology that can be used to rank components, while addressing multiple risk issues.
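
    The risk-based ranking described above can be sketched in a few lines. The component names, failure probabilities, and consequence scores below are purely hypothetical placeholders, not data from the report; the sketch only shows the probability-times-consequence ordering:

```python
# hypothetical component data: (name, annual failure probability, consequence score)
components = [
    ("feed pump", 0.020, 40.0),
    ("relief valve", 0.001, 900.0),
    ("transfer line", 0.005, 100.0),
    ("instrument air", 0.050, 5.0),
]

# relative risk = probability x consequence; rank highest-risk first
ranked = sorted(components, key=lambda c: c[1] * c[2], reverse=True)
```

    Note how the ranking can invert intuition: the rarely failing relief valve outranks the frequently failing instrument air supply because its consequence term dominates, which is the argument for risk-informed rather than frequency-only inspection scheduling.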

  17. Linear response to nonstationary random excitation.

    NASA Technical Reports Server (NTRS)

    Hasselman, T.

    1972-01-01

    Development of a method for computing the mean-square response of linear systems to nonstationary random excitation of the form y(t) = f(t) x(t), in which x(t) is a stationary process and f(t) is deterministic. The method is suitable for application to multidegree-of-freedom systems when the mean-square response at a point due to excitation applied at another point is desired. Both the stationary process, x(t), and the modulating function, f(t), may be arbitrary. The method utilizes a fundamental component of transient response, dependent only on x(t) and the system and independent of f(t), to synthesize the total response. The role played by this component is analogous to that played by the Green's function, or impulse response function, in the convolution integral.
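
    The separable form y(t) = f(t) x(t) implies that the excitation's mean square factors as E[y^2(t)] = f(t)^2 E[x^2], which a small ensemble simulation can verify. The envelope and the unit-variance process below are illustrative choices, not the examples from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 2.0, 201)
f = 1.0 - np.exp(-3.0 * t)           # deterministic modulating envelope f(t)

# ensemble of stationary, zero-mean, unit-variance sample processes x(t)
X = rng.normal(0.0, 1.0, size=(5000, t.size))
Y = f * X                            # nonstationary excitation y(t) = f(t) x(t)

ms = (Y ** 2).mean(axis=0)           # ensemble mean square at each instant
# for this separable form, E[y^2(t)] = f(t)^2 * E[x^2] = f(t)^2
```

    The response computation in the paper then propagates this modulated excitation through the linear system; the factorization above is what lets the f(t)-independent transient component be computed once and reused for any envelope.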

  18. Image processing for optical mapping.

    PubMed

    Ravindran, Prabu; Gupta, Aditya

    2015-01-01

    Optical Mapping is an established single-molecule, whole-genome analysis system, which has been used to gain a comprehensive understanding of genomic structure and to study structural variation of complex genomes. A critical component of the Optical Mapping system is the image processing module, which extracts single-molecule restriction maps from image datasets of immobilized, restriction-digested, and fluorescently stained large DNA molecules. In this review, we describe robust and efficient image processing techniques for processing these massive datasets and extracting accurate restriction maps in the presence of noise, ambiguity, and confounding artifacts. We also highlight a few applications of the Optical Mapping system.

  19. ATLAS EventIndex monitoring system using the Kibana analytics and visualization platform

    NASA Astrophysics Data System (ADS)

    Barberis, D.; Cárdenas Zárate, S. E.; Favareto, A.; Fernandez Casani, A.; Gallas, E. J.; Garcia Montoro, C.; Gonzalez de la Hoz, S.; Hrivnac, J.; Malon, D.; Prokoshin, F.; Salt, J.; Sanchez, J.; Toebbicke, R.; Yuan, R.; ATLAS Collaboration

    2016-10-01

    The ATLAS EventIndex is a data catalogue system that stores event-related metadata for all (real and simulated) ATLAS events, at all processing stages. As it consists of different components that depend on other applications (such as distributed storage and different sources of information), we need to monitor the conditions of many heterogeneous subsystems to make sure everything is working correctly. This paper describes how we gather information about the EventIndex components and related subsystems: the producer-consumer architecture for data collection, health parameters from the servers that run EventIndex components, the EventIndex web interface status, and the Hadoop infrastructure that stores EventIndex data. This information is collected, processed, and then displayed using CERN service monitoring software based on the Kibana analytics and visualization package, provided by the CERN IT Department. EventIndex monitoring is used both by the EventIndex team and by the ATLAS Distributed Computing shift crew.

  20. Argo: an integrative, interactive, text mining-based workbench supporting curation

    PubMed Central

    Rak, Rafal; Rowley, Andrew; Black, William; Ananiadou, Sophia

    2012-01-01

    Curation of biomedical literature is often supported by the automatic analysis of textual content that generally involves a sequence of individual processing components. Text mining (TM) has been used to enhance the process of manual biocuration, but has been focused on specific databases and tasks rather than an environment integrating TM tools into the curation pipeline, catering for a variety of tasks, types of information and applications. Processing components usually come from different sources and often lack interoperability. The well established Unstructured Information Management Architecture is a framework that addresses interoperability by defining common data structures and interfaces. However, most of the efforts are targeted towards software developers and are not suitable for curators, or are otherwise inconvenient to use on a higher level of abstraction. To overcome these issues we introduce Argo, an interoperable, integrative, interactive and collaborative system for text analysis with a convenient graphic user interface to ease the development of processing workflows and boost productivity in labour-intensive manual curation. Robust, scalable text analytics follow a modular approach, adopting component modules for distinct levels of text analysis. The user interface is available entirely through a web browser that saves the user from going through often complicated and platform-dependent installation procedures. Argo comes with a predefined set of processing components commonly used in text analysis, while giving the users the ability to deposit their own components. The system accommodates various areas and levels of user expertise, from TM and computational linguistics to ontology-based curation. One of the key functionalities of Argo is its ability to seamlessly incorporate user-interactive components, such as manual annotation editors, into otherwise completely automatic pipelines. 
As a use case, we demonstrate the functionality of an in-built manual annotation editor that is well suited for in-text corpus annotation tasks. Database URL: http://www.nactem.ac.uk/Argo PMID:22434844

  1. SPS Energy Conversion Power Management Workshop

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Energy technology concerning photovoltaic conversion, solar thermal conversion systems, and electrical power distribution and processing is discussed. The manufacturing processes involved in solar cell and solar array production are summarized. Resource issues concerning gallium arsenide and silicon alternatives are reported. Collector structures for solar construction are described, and estimates of their service life, failure rates, and capabilities are presented. Theories of advanced thermal power cycles are summarized. Power distribution system configurations and processing components are presented.

  2. Anorexia nervosa and body dysmorphic disorder are associated with abnormalities in processing visual information.

    PubMed

    Li, W; Lai, T M; Bohon, C; Loo, S K; McCurdy, D; Strober, M; Bookheimer, S; Feusner, J

    2015-07-01

    Anorexia nervosa (AN) and body dysmorphic disorder (BDD) are characterized by distorted body image and are frequently co-morbid with each other, although their relationship remains little studied. While there is evidence of abnormalities in visual and visuospatial processing in both disorders, no study has directly compared the two. We used two complementary modalities--event-related potentials (ERPs) and functional magnetic resonance imaging (fMRI)--to test for abnormal activity associated with early visual signaling. We acquired fMRI and ERP data in separate sessions from 15 unmedicated individuals in each of three groups (weight-restored AN, BDD, and healthy controls) while they viewed images of faces and houses of different spatial frequencies. We used joint independent component analyses to compare activity in visual systems. AN and BDD groups demonstrated similar hypoactivity in early secondary visual processing regions and the dorsal visual stream when viewing low spatial frequency faces, linked to the N170 component, as well as in early secondary visual processing regions when viewing low spatial frequency houses, linked to the P100 component. Additionally, the BDD group exhibited hyperactivity in fusiform cortex when viewing high spatial frequency houses, linked to the N170 component. Greater activity in this component was associated with lower attractiveness ratings of faces. Results provide preliminary evidence of similar abnormal spatiotemporal activation in AN and BDD for configural/holistic information for appearance- and non-appearance-related stimuli. This suggests a common phenotype of abnormal early visual system functioning, which may contribute to perceptual distortions.

  3. Processing of odor mixtures in the zebrafish olfactory bulb.

    PubMed

    Tabor, Rico; Yaksi, Emre; Weislogel, Jan-Marek; Friedrich, Rainer W

    2004-07-21

    Components of odor mixtures often are not perceived individually, suggesting that neural representations of mixtures are not simple combinations of the representations of the components. We studied odor responses to binary mixtures of amino acids and food extracts at different processing stages in the olfactory bulb (OB) of zebrafish. Odor-evoked input to the OB was measured by imaging Ca2+ signals in afferents to olfactory glomeruli. Activity patterns evoked by mixtures were predictable within narrow limits from the component patterns, indicating that mixture interactions in the peripheral olfactory system are weak. OB output neurons, the mitral cells (MCs), were recorded extra- and intracellularly and responded to odors with stimulus-dependent temporal firing rate modulations. Responses to mixtures of amino acids often were dominated by one of the component responses. Responses to mixtures of food extracts, in contrast, were more distinct from both component responses. These results show that mixture interactions can result from processing in the OB. Moreover, our data indicate that mixture interactions in the OB become more pronounced with increasing overlap of input activity patterns evoked by the components. Emerging from these results are rules of mixture interactions that may explain behavioral data and provide a basis for understanding the processing of natural odor stimuli in the OB.

  4. Spaceborne sensors (1983-2000 AD): A forecast of technology

    NASA Technical Reports Server (NTRS)

    Kostiuk, T.; Clark, B. P.

    1984-01-01

    A technical review and forecast of space technology as it applies to spaceborne sensors for future NASA missions is presented. A format is developed for categorizing sensor systems covering the entire electromagnetic spectrum, including particles and fields. Major generic sensor systems are related to their subsystems and components, and to basic research and development. General supporting technologies such as cryogenics, optical design, and data processing electronics are addressed where appropriate. The dependence of many classes of instruments on common components, basic R&D, and support technologies is also illustrated. A forecast of important system designs and instrument and component performance parameters is provided for the 1983-2000 AD time frame. Some insight into the scientific and applications capabilities and goals of the sensor systems is also given.

  5. Industrial based volume manufacturing of lightweight aluminium alloy panel components with high-strength and complex-shape for car body and chassis structures

    NASA Astrophysics Data System (ADS)

    Anyasodor, Gerald; Koroschetz, Christian

    2017-09-01

To achieve high-volume manufacture of lightweight passenger cars at economic cost, as required in the automotive industry, low-density materials and new process routes are needed. While high-strength aluminium alloy grades AA7075 and AA6082 may provide an alternative material solution, the hot stamping process used for high-strength and ultra-high-strength steels such as boron steel 22MnB5 can enable the volume manufacture of panel components with high strength and complex shape for car body and chassis structures. These aluminium alloy grades can be used to manufacture panel components with yield strengths ≥ 500 MPa. Because the materials behave differently, the hot stamping process for 22MnB5 cannot be applied directly to high-strength aluminium alloy grades. Despite successes recorded in laboratory research and in niche hot forming processes for high-strength aluminium alloy grades, little has been achieved toward an adequate and efficient volume manufacturing system applicable in the automotive industry. Given the lack of such a system, and drawing on expert knowledge of hot stamping production lines, AP&T presents in this paper a hot stamping processing route for high-strength aluminium alloys that is suitable for production-line development and volume manufacturing.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washiya, Tadahiro; Komaki, Jun; Funasaka, Hideyuki

Japan Atomic Energy Agency (JAEA) has been developing a new aqueous reprocessing system named 'NEXT' (New Extraction system for TRU recovery), which provides many advantages, including waste volume reduction, cost savings through advanced components, and simplified process operation. The advanced head-end systems in the NEXT process consist of a fuel disassembly system, a fuel shearing system, and a continuous dissolver system. We developed a reliable fuel disassembly system with an innovative procedure, and the short-length shearing system and continuous dissolver system can provide the highly concentrated dissolution needed by the uranium crystallization process. We have carried out experimental studies and fabricated engineering-scale test devices to confirm the systems' performance. In this paper, research and development of the advanced head-end systems are described. (authors)

  7. Coherent systems in the terahertz frequency range: Elements, operation, and examples

    NASA Technical Reports Server (NTRS)

    Goldsmith, Paul F.

    1992-01-01

    The topics are presented in viewgraph form and include the following: terahertz coherent systems applications; a brief overview of selected components; radiometry and spectroscopy--astronomy; radiometry--aircraft all weather landing system; radiometry--atmospheric remote sensing; plasma diagnostics; communications; radar systems; and materials measurement and manufacturing process control.

  8. A Prototype of an Intelligent System for Information Retrieval: IOTA.

    ERIC Educational Resources Information Center

    Chiaramella, Y.; Defude, B.

    1987-01-01

    Discusses expert systems and their value as components of information retrieval systems related to semantic inference, and describes IOTA, a model of an intelligent information retrieval system which emphasizes natural language query processing. Experimental results are discussed and current and future developments are highlighted. (Author/LRW)

  9. A portable system for processing donated whole blood into high quality components without centrifugation.

    PubMed

    Gifford, Sean C; Strachan, Briony C; Xia, Hui; Vörös, Eszter; Torabian, Kian; Tomasino, Taylor A; Griffin, Gary D; Lichtiger, Benjamin; Aung, Fleur M; Shevkoplyas, Sergey S

    2018-01-01

    The use of centrifugation-based approaches for processing donated blood into components is routine in the industrialized world, as disparate storage conditions require the rapid separation of 'whole blood' into distinct red blood cell (RBC), platelet, and plasma products. However, the logistical complications and potential cellular damage associated with centrifugation/apheresis manufacturing of blood products are well documented. The objective of this study was to evaluate a proof-of-concept system for whole blood processing, which does not employ electromechanical parts, is easily portable, and can be operated immediately after donation with minimal human labor. In a split-unit study (n = 6), full (~500mL) units of freshly-donated whole blood were divided, with one half processed by conventional centrifugation techniques and the other with the new blood separation system. Each of these processes took 2-3 hours to complete and were performed in parallel. Blood products generated by the two approaches were compared using an extensive panel of cellular and plasma quality metrics. Comparison of nearly all RBC parameters showed no significant differences between the two approaches, although the portable system generated RBC units with a slight but statistically significant improvement in 2,3-diphosphoglyceric acid concentration (p < 0.05). More notably, several markers of platelet damage were significantly and meaningfully higher in products generated with conventional centrifugation: the increase in platelet activation (assessed via P-selectin expression in platelets before and after blood processing) was nearly 4-fold higher for platelet units produced via centrifugation, and the release of pro-inflammatory mediators (soluble CD40-ligand, thromboxane B2) was significantly higher for centrifuged platelets as well (p < 0.01). 
This study demonstrated that a simple, passive system for separating donated blood into components may be a viable alternative to centrifugation, particularly for applications in remote or resource-limited settings, or for patients requiring a highly functional platelet product.

  10. PRMS-IV, the precipitation-runoff modeling system, version 4

    USGS Publications Warehouse

    Markstrom, Steven L.; Regan, R. Steve; Hay, Lauren E.; Viger, Roland J.; Webb, Richard M.; Payn, Robert A.; LaFontaine, Jacob H.

    2015-01-01

    Computer models that simulate the hydrologic cycle at a watershed scale facilitate assessment of variability in climate, biota, geology, and human activities on water availability and flow. This report describes an updated version of the Precipitation-Runoff Modeling System. The Precipitation-Runoff Modeling System is a deterministic, distributed-parameter, physical-process-based modeling system developed to evaluate the response of various combinations of climate and land use on streamflow and general watershed hydrology. Several new model components were developed, and all existing components were updated, to enhance performance and supportability. This report describes the history, application, concepts, organization, and mathematical formulation of the Precipitation-Runoff Modeling System and its model components. This updated version provides improvements in (1) system flexibility for integrated science, (2) verification of conservation of water during simulation, (3) methods for spatial distribution of climate boundary conditions, and (4) methods for simulation of soil-water flow and storage.

  11. Optical Information Processing for Aerospace Applications 2

    NASA Technical Reports Server (NTRS)

    Stermer, R. L. (Compiler)

    1984-01-01

    Current research in optical processing, and determination of its role in future aerospace systems was reviewed. It is shown that optical processing offers significant potential for aircraft and spacecraft control, pattern recognition, and robotics. It is demonstrated that the development of optical devices and components can be implemented in practical aerospace configurations.

  12. Direct process estimation from tomographic data using artificial neural systems

    NASA Astrophysics Data System (ADS)

    Mohamad-Saleh, Junita; Hoyle, Brian S.; Podd, Frank J.; Spink, D. M.

    2001-07-01

The paper deals with the goal of component fraction estimation in multicomponent flows, a critical measurement in many processes. Electrical capacitance tomography (ECT) is a well-researched sensing technique for this task, due to its low cost, non-intrusiveness, and fast response. However, typical systems, which include practicable real-time reconstruction algorithms, give inaccurate results, and existing approaches to direct component fraction measurement are flow-regime dependent. In the investigation described, an artificial neural network approach is used to directly estimate the component fractions in gas-oil, gas-water, and gas-oil-water flows from ECT measurements. A 2D finite-element electric field model of a 12-electrode ECT sensor is used to simulate ECT measurements of various flow conditions. The raw measurements are reduced to a mutually independent set using principal components analysis and used with their corresponding component fractions to train multilayer feed-forward neural networks (MLFFNNs). The trained MLFFNNs are tested with patterns consisting of unlearned ECT simulated and plant measurements. Results included in the paper have a mean absolute error of less than 1% for the estimation of various multicomponent fractions of the permittivity distribution. They are also shown to give improved component fraction estimation compared to a well-known direct ECT method.
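The measurement-reduction-plus-network pipeline described in this abstract can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's model: the 66 "measurements" (the number of independent electrode pairs in a 12-electrode ECT sensor), the latent structure of the data, and all dimensions and hyperparameters are assumptions chosen for demonstration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic stand-in for ECT data: 66 inter-electrode capacitance
# measurements driven by a low-dimensional latent "flow state" plus noise.
n_samples, n_meas, n_latent = 500, 66, 5
Z = rng.normal(size=(n_samples, n_latent))          # hidden flow state
B = rng.normal(size=(n_latent, n_meas))             # latent -> measurements
X = Z @ B + 0.01 * rng.normal(size=(n_samples, n_meas))
w = rng.normal(size=n_latent)
y = Z @ w                                           # "component fraction" target

# Reduce the raw measurements to a smaller, decorrelated set with PCA,
# then regress with a multilayer feed-forward network (MLFFNN analogue).
model = make_pipeline(
    PCA(n_components=10),
    MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
)
model.fit(X[:400], y[:400])
score = model.score(X[400:], y[400:])  # R^2 on held-out "unlearned" patterns
```

Because the capacitance readings are strongly correlated, the PCA step shrinks the network's input without discarding much information, which is the same motivation the abstract gives for reducing the raw measurements to a mutually independent set.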

  13. Suppression of cognitive function in hyperthermia; From the viewpoint of executive and inhibitive cognitive processing

    NASA Astrophysics Data System (ADS)

    Shibasaki, Manabu; Namba, Mari; Oshiro, Misaki; Kakigi, Ryusuke; Nakata, Hiroki

    2017-03-01

Climate change has had a widespread impact on humans and natural systems. Heat stroke is a life-threatening condition in severe environments. The execution or inhibition of decision making is critical for survival in a hot environment. We hypothesized that, even with mild heat stress, not only executive processing but also inhibitory processing may be impaired, and investigated the effectiveness of body cooling approaches on these processes using the Go/No-go task with electroencephalographic event-related potentials. Passive heat stress increased esophageal temperature (Tes) by 1.30 ± 0.24 °C and decreased cerebral perfusion and thermal comfort. Mild heat stress reduced the amplitudes of the Go-P300 component (i.e. execution) and No-go-P300 component (i.e. inhibition). Cerebral perfusion and thermal comfort recovered following face/head cooling; however, the amplitudes of the Go-P300 and No-go-P300 components remained reduced. During whole-body cooling, the amplitude of the Go-P300 component returned to the pre-heat baseline, whereas that of the No-go-P300 component remained reduced. These results suggest that local cooling of the face and head does not restore impaired cognitive processing during mild heat stress, and response inhibition remains impaired despite the return to normothermia.

  14. Probabilistic Round Trip Contamination Analysis of a Mars Sample Acquisition and Handling Process Using Markovian Decompositions

    NASA Technical Reports Server (NTRS)

    Hudson, Nicolas; Lin, Ying; Barengoltz, Jack

    2010-01-01

A method for evaluating the probability of a Viable Earth Microorganism (VEM) contaminating a sample during the sample acquisition and handling (SAH) process of a potential future Mars Sample Return mission is developed. A scenario is analyzed in which multiple core samples would be acquired using a rotary percussive coring tool, deployed from an arm on a MER-class rover. The analysis is conducted in a structured way by decomposing the sample acquisition and handling process into a series of discrete time steps and breaking the physical system into a set of relevant components. At each discrete time step, two key functions are defined: the probability of a VEM being released from each component, and the transport matrix, which represents the probability of VEM transport from one component to another. By defining the expected number of VEMs on each component at the start of the sampling process, these decompositions allow the expected number of VEMs on each component at each sampling step to be represented as a Markov chain. This formalism provides a rigorous mathematical framework in which to analyze the probability of a VEM entering the sample chain, and makes the analysis tractable by breaking the process down into small analyzable steps.
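The release-probability and transport-matrix bookkeeping described above can be sketched numerically. This is a minimal illustration, not the mission analysis: the three components, the release probabilities, and the transport matrix below are invented for demonstration; only the Markov-chain propagation of expected VEM counts follows the abstract.

```python
import numpy as np

# Hypothetical 3-component system (e.g. coring bit, sample tube, rover arm).
expected_vems = np.array([10.0, 2.0, 5.0])  # E[VEMs] on each component at t=0

# release[i]: probability that a VEM on component i is released in one step.
release = np.array([0.01, 0.005, 0.02])

# transport[i, j]: probability a VEM released from component i lands on j.
# Each row sums to 1, so released VEMs are redistributed, not created.
transport = np.array([
    [0.0, 0.6, 0.4],
    [0.3, 0.0, 0.7],
    [0.5, 0.5, 0.0],
])

def step(e, release, transport):
    """One Markov step: VEMs either stay put or are released and transported."""
    released = e * release           # expected VEMs leaving each component
    stayed = e - released            # expected VEMs that remain in place
    arrived = released @ transport   # redistribute the released VEMs
    return stayed + arrived

for _ in range(5):  # five discrete sampling steps
    expected_vems = step(expected_vems, release, transport)
```

Because each transport row sums to one, the total expected VEM count is conserved across steps, which gives a quick sanity check on any concrete parameterization.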

  15. Gray matter correlates of set-shifting among neurodegenerative disease, mild cognitive impairment, and healthy older adults

    PubMed Central

    PA, JUDY; POSSIN, KATHERINE L.; WILSON, STEPHEN M.; QUITANIA, LOVINGLY C.; KRAMER, JOEL H.; BOXER, ADAM L.; WEINER, MICHAEL W.; JOHNSON, JULENE K.

    2010-01-01

    There is increasing recognition that set-shifting, a form of cognitive control, is mediated by different neural structures. However, these regions have not yet been carefully identified as many studies do not account for the influence of component processes (e.g., motor speed). We investigated gray matter correlates of set-shifting while controlling for component processes. Using the Design Fluency (DF), Trail Making Test (TMT), and Color Word Interference (CWI) subtests from the Delis-Kaplan Executive Function System (D-KEFS), we investigated the correlation between set-shifting performance and gray matter volume in 160 subjects with neurodegenerative disease, mild cognitive impairment, and healthy older adults using voxel-based morphometry. All three set-shifting tasks correlated with multiple, widespread gray matter regions. After controlling for the component processes, set-shifting performance correlated with focal regions in prefrontal and posterior parietal cortices. We also identified bilateral prefrontal cortex and the right posterior parietal lobe as common sites for set-shifting across the three tasks. There was a high degree of multicollinearity between the set-shifting conditions and the component processes of TMT and CWI, suggesting DF may better isolate set-shifting regions. Overall, these findings highlight the neuroanatomical correlates of set-shifting and the importance of controlling for component processes when investigating complex cognitive tasks. PMID:20374676

  16. C-Based Design Methodology and Topological Change for an Indian Agricultural Tractor Component

    NASA Astrophysics Data System (ADS)

    Matta, Anil Kumar; Raju, D. Ranga; Suman, K. N. S.; Kranthi, A. S.

    2018-06-01

The failure of tractor components and their replacement have now become very common in India because of recycling, resale, and duplication. To overcome the problem of failure, we propose a design methodology for topological change, co-simulating with software. In the proposed design methodology, the designer checks Paxial, Pcr, Pfailure, and τ by hand calculations, from which refined topological changes of the R.S. Arm are formed. We explain several techniques employed on the component for the reduction and removal of rib material to change the center of gravity and centroid point, using system C for mixed-level simulation and faster topological changes. The design process in system C can be compiled and executed with the software TURBO C7. The modified component is developed in Pro/E and analyzed in ANSYS. The topologically changed component, with a 120 × 4.75 × 32.5 mm slot at the center, showed greater effectiveness than the original component.

  18. EMC analysis of MOS-1

    NASA Astrophysics Data System (ADS)

    Ishizawa, Y.; Abe, K.; Shirako, G.; Takai, T.; Kato, H.

The electromagnetic compatibility (EMC) control method, system EMC analysis method, and system test method which have been applied to test the components of the MOS-1 satellite are described. The merits and demerits of the problem solving, specification, and system approaches to EMC control are summarized, as are the data requirements of the SEMCAP (specification and electromagnetic compatibility analysis program) computer program for verifying the EMI safety margin of the components. Examples of EMC design are mentioned, and the EMC design process and selection method for EMC critical points are shown along with sample EMC test results.

  19. Applying graphics user interface to group technology classification and coding at the Boeing Aerospace Company

    NASA Astrophysics Data System (ADS)

    Ness, P. H.; Jacobson, H.

    1984-10-01

    The thrust of 'group technology' is toward the exploitation of similarities in component design and manufacturing process plans to achieve assembly line flow cost efficiencies for small batch production. The systematic method devised for the identification of similarities in component geometry and processing steps is a coding and classification scheme implemented by interactive CAD/CAM systems. This coding and classification scheme has led to significant increases in computer processing power, allowing rapid searches and retrievals on the basis of a 30-digit code together with user-friendly computer graphics.

  20. Aviation System Analysis Capability Executive Assistant Design

    NASA Technical Reports Server (NTRS)

    Roberts, Eileen; Villani, James A.; Osman, Mohammed; Godso, David; King, Brent; Ricciardi, Michael

    1998-01-01

    In this technical document, we describe the design developed for the Aviation System Analysis Capability (ASAC) Executive Assistant (EA) Proof of Concept (POC). We describe the genesis and role of the ASAC system, discuss the objectives of the ASAC system and provide an overview of components and models within the ASAC system, and describe the design process and the results of the ASAC EA POC system design. We also describe the evaluation process and results for applicable COTS software. The document has six chapters, a bibliography, three appendices and one attachment.

  1. Aviation System Analysis Capability Executive Assistant Development

    NASA Technical Reports Server (NTRS)

    Roberts, Eileen; Villani, James A.; Anderson, Kevin; Book, Paul

    1999-01-01

    In this technical document, we describe the development of the Aviation System Analysis Capability (ASAC) Executive Assistant (EA) Proof of Concept (POC) and Beta version. We describe the genesis and role of the ASAC system, discuss the objectives of the ASAC system and provide an overview of components and models in the ASAC system, and describe the design process and the results of the ASAC EA POC and Beta system development. We also describe the evaluation process and results for applicable COTS software. The document has seven chapters, a bibliography, and two appendices.

  2. System design and operation of a 100 kilovolt, 2 kilohertz pulse modulator for plasma source ion implantation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reass, W.A.

    1994-07-01

This paper describes the electrical design and operation of a high power modulator system implemented for the Los Alamos Plasma Source Ion Implantation (PSII) facility. To test the viability of the PSII process for various automotive components, the modulator must accept wide variations of load impedance. Components have varying area and composition and must be processed with different plasmas. Additionally, the load impedance may change by large factors during the typical 20 µs pulse, due to plasma displacement currents and sheath growth. As a preliminary design to test the system's viability for automotive component implantation suitable for a manufacturing environment, the circuit topology must scale directly to higher-power versions for increased component throughput. We have chosen an evolutionary design approach with component families of characterized performance, which should result in a reliable modulator system with long component lifetimes. The modulator utilizes a pair of Litton L-3408 hollow beam amplifier tubes as switching elements in a "hot-deck" configuration. Internal to the main hot deck, an additional pair of planar triode decks, configured in a totem pole circuit, provide input drive to the L-3408 mod-anodes. The modulator can output over 2 amps average current (at 100 kV) with 1 kW of mod-anode drive. Diagnostic electronics monitor the load and stop pulses for 100 ms when a load arc occurs. In addition to providing detailed engineering design information, this paper provides operational characteristics and reliability data that direct the design toward the higher-power modulators needed for mass production lines.

  3. Computer Disaster Recovery Planning.

    ERIC Educational Resources Information Center

    Clark, Orvin R.

    Arguing that complete, reliable, up-to-date system documentation is critical for every data processing environment, this paper on computer disaster recovery planning begins by discussing the importance of such documentation both for recovering from a systems crash, and for system maintenance and enhancement. The various components of system…

  4. 45 CFR 310.1 - What definitions apply to this part?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... existing automated data processing computer system through an Intergovernmental Service Agreement; (4...) Office Automation means a generic adjunct component of a computer system that supports the routine... timely and satisfactory; (iv) Assurances that information in the computer system as well as access, use...

  5. 45 CFR 310.1 - What definitions apply to this part?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... existing automated data processing computer system through an Intergovernmental Service Agreement; (4...) Office Automation means a generic adjunct component of a computer system that supports the routine... timely and satisfactory; (iv) Assurances that information in the computer system as well as access, use...

  6. 45 CFR 310.1 - What definitions apply to this part?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... existing automated data processing computer system through an Intergovernmental Service Agreement; (4...) Office Automation means a generic adjunct component of a computer system that supports the routine... timely and satisfactory; (iv) Assurances that information in the computer system as well as access, use...

  7. 45 CFR 310.1 - What definitions apply to this part?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... existing automated data processing computer system through an Intergovernmental Service Agreement; (4...) Office Automation means a generic adjunct component of a computer system that supports the routine... timely and satisfactory; (iv) Assurances that information in the computer system as well as access, use...

  8. 45 CFR 310.1 - What definitions apply to this part?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... existing automated data processing computer system through an Intergovernmental Service Agreement; (4...) Office Automation means a generic adjunct component of a computer system that supports the routine... timely and satisfactory; (iv) Assurances that information in the computer system as well as access, use...

  9. Engineering Design Thinking

    ERIC Educational Resources Information Center

    Lammi, Matthew; Becker, Kurt

    2013-01-01

    Engineering design thinking is "a complex cognitive process" including divergence-convergence, a systems perspective, ambiguity, and collaboration (Dym, Agogino, Eris, Frey, & Leifer, 2005, p. 104). Design is often complex, involving multiple levels of interacting components within a system that may be nested within or connected to other systems.…

  10. Automatic Processing of Metallurgical Abstracts for the Purpose of Information Retrieval. Final Report.

    ERIC Educational Resources Information Center

    Melton, Jessica S.

    Objectives of this project were to develop and test a method for automatically processing the text of abstracts for a document retrieval system. The test corpus consisted of 768 abstracts from the metallurgical section of Chemical Abstracts (CA). The system, based on a subject indexing rational, had two components: (1) a stored dictionary of words…

  11. Adaptive Distributed Intelligent Control Architecture for Future Propulsion Systems (Preprint)

    DTIC Science & Technology

    2007-04-01

    weight will be reduced by replacing heavy harness assemblies and FADECs, with distributed processing elements interconnected. This paper reviews... Digital Electronic Controls (FADECs), with distributed processing elements interconnected through a serial bus. Efficient data flow throughout the... because intelligence is embedded in components while overall control is maintained in the FADEC. The need for Distributed Control Systems in

  12. The distributed agent-based approach in the e-manufacturing environment

    NASA Astrophysics Data System (ADS)

    Sękala, A.; Kost, G.; Dobrzańska-Danikiewicz, A.; Banaś, W.; Foit, K.

    2015-11-01

The deficiency of a coherent flow of information from a production department causes unplanned downtime and failures of machines and their equipment, which in turn results in a production planning process based on incorrect and out-of-date information. All of these factors entail, as a consequence, additional difficulties associated with the decision-making process. They concern, among others, the coordination of components of a distributed system and the provision of access to the required information, thereby generating unnecessary costs. The use of agent technology significantly speeds up the flow of information within the virtual enterprise. This paper includes a proposal of a multi-agent approach for the integration of processes within the virtual enterprise concept. The presented concept was elaborated to investigate possible solutions for the transmission of information in the production system, taking into account the self-organization of its constituent components. It thus links the concept of a multi-agent system with the system for managing production information, based on the idea of e-manufacturing. The paper presents a resulting scheme that should be the basis for elaborating an informatics model of the target virtual system. The computer system itself is intended to be developed next.

  13. Development of integrated, zero-G pneumatic transporter/rotating paddle incinerator/catalytic afterburner subsystem for processing human wastes on board spacecraft

    NASA Technical Reports Server (NTRS)

    Fields, S. F.; Labak, L. J.; Honegger, R. J.

    1974-01-01

    A four component system was developed which consists of a particle size reduction mechanism, a pneumatic waste transport system, a rotating-paddle incinerator, and a catalytic afterburner to be integrated into a six-man, zero-g subsystem for processing human wastes on board spacecraft. The study included the development of different concepts or functions, the establishment of operational specifications, and a critical evaluation for each of the four components. A series of laboratory tests was run, and a baseline subsystem design was established. An operational specification was also written in preparation for detailed design and testing of this baseline subsystem.

  14. Desktop publishing and medical imaging: paper as hardcopy medium for digital images.

    PubMed

    Denslow, S

    1994-08-01

Desktop-publishing software and hardware have progressed to the point that many widely used word-processing programs are capable of printing high-quality digital images with many shades of gray from black to white. Accordingly, it should be relatively easy to print digital medical images on paper for reports, instructional materials, and research notes. Components were assembled that were necessary for extracting image data from medical imaging devices and converting the data to a form usable by word-processing software. A system incorporating these components was implemented in a medical setting and has been operating for 18 months. The use of this system by medical staff has been monitored.

  15. Observing System Simulation Experiment (OSSE) for the HyspIRI Spectrometer Mission

    NASA Technical Reports Server (NTRS)

    Turmon, Michael J.; Block, Gary L.; Green, Robert O.; Hua, Hook; Jacob, Joseph C.; Sobel, Harold R.; Springer, Paul L.; Zhang, Qingyuan

    2010-01-01

The OSSE software provides an integrated end-to-end environment to simulate an Earth observing system by iteratively running a distributed modeling workflow based on the HyspIRI Mission, including atmospheric radiative transfer, surface albedo effects, detection, and retrieval for agile exploration of the mission design space. The software enables an Observing System Simulation Experiment (OSSE) and can be used for design trade space exploration of science return for proposed instruments by modeling the whole ground truth, sensing, and retrieval chain and to assess retrieval accuracy for a particular instrument and algorithm design. The OSSE infrastructure is extensible to future National Research Council (NRC) Decadal Survey concept missions where integrated modeling can improve the fidelity of coupled science and engineering analyses for systematic analysis and science return studies. This software has a distributed architecture that gives it a distinct advantage over other similar efforts. The workflow modeling components are typically legacy computer programs implemented in a variety of programming languages, including MATLAB, Excel, and FORTRAN. Integration of these diverse components is difficult and time-consuming. In order to hide this complexity, each modeling component is wrapped as a Web Service, and each component is able to pass analysis parameterizations, such as reflectance or radiance spectra, on to the next component downstream in the service workflow chain. In this way, the interface to each modeling component becomes uniform and the entire end-to-end workflow can be run using any existing or custom workflow processing engine. The architecture lets users extend workflows as new modeling components become available, chain together the components using any existing or custom workflow processing engine, and distribute them across any Internet-accessible Web Service endpoints. 
The workflow components can be hosted on any Internet-accessible machine. This has the advantages that the computations can be distributed to make best use of the available computing resources, and each workflow component can be hosted and maintained by their respective domain experts.
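The uniform component interface described above can be sketched in miniature. This is an illustrative stand-in, not the OSSE code: the three toy components and their physics-free arithmetic are assumptions; only the pattern (every component exposes the same call signature so any engine can chain them) follows the abstract.

```python
from typing import Callable, Dict, List

# Each modeling component (radiative transfer, sensor, retrieval) takes the
# upstream state dictionary and returns an extended copy of it.
Component = Callable[[Dict[str, float]], Dict[str, float]]

def radiative_transfer(state):
    out = dict(state)
    # toy stand-in: attenuate surface-reflected radiance by the atmosphere
    out["radiance"] = state["reflectance"] * state["solar_irradiance"] * 0.8
    return out

def sensor_model(state):
    out = dict(state)
    out["signal"] = state["radiance"] * 0.9  # illustrative detector efficiency
    return out

def retrieval(state):
    out = dict(state)
    # invert the (known, toy) forward chain back to reflectance
    out["retrieved_reflectance"] = out["signal"] / (
        out["solar_irradiance"] * 0.8 * 0.9
    )
    return out

def run_workflow(components: List[Component], state: Dict[str, float]):
    """Chain components: each consumes the upstream state and extends it."""
    for component in components:
        state = component(state)
    return state

result = run_workflow(
    [radiative_transfer, sensor_model, retrieval],
    {"reflectance": 0.3, "solar_irradiance": 1000.0},
)
```

In the real system each `component` call would be a Web Service invocation rather than a local function, but the chaining logic in `run_workflow` stays the same, which is what lets any existing or custom workflow engine drive the end-to-end simulation.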

  16. Systems medicine: a new approach to clinical practice.

    PubMed

    Cardinal-Fernández, Pablo; Nin, Nicolás; Ruíz-Cabello, Jesús; Lorente, José A

    2014-10-01

    Most respiratory diseases are considered complex diseases, as their susceptibility and outcomes are determined by the interaction between host-dependent factors (genetic factors, comorbidities, etc.) and environmental factors (exposure to microorganisms or allergens, treatments received, etc.). The reductionist approach in the study of diseases has been of fundamental importance for the understanding of the different components of a system. Systems biology or systems medicine is a complementary approach aimed at analyzing the interactions between the different components within one organizational level (genome, transcriptome, proteome), and then between the different levels. Systems medicine is currently used for the interpretation and understanding of the pathogenesis and pathophysiology of different diseases, biomarker discovery, design of innovative therapeutic targets, and the drawing up of computational models for different biological processes. In this review we discuss the most relevant concepts of the theory underlying systems medicine, as well as its applications in the various biological processes in humans. Copyright © 2013 SEPAR. Published by Elsevier España. All rights reserved.

  17. KENNEDY SPACE CENTER, FLA. - During power-up of the orbiter Discovery in the Orbiter Processing Facility, a technician moves a switch. Discovery has been undergoing Orbiter Major Modifications in the past year, ranging from wiring, control panels and black boxes to gaseous and fluid systems tubing and components. These systems were deserviced, disassembled, inspected, modified, reassembled, checked out and reserviced, as were most other systems onboard. The work includes the installation of the Multifunction Electronic Display Subsystem (MEDS) - a state-of-the-art “glass cockpit.”

    NASA Image and Video Library

    2003-08-27

    KENNEDY SPACE CENTER, FLA. - During power-up of the orbiter Discovery in the Orbiter Processing Facility, a technician moves a switch. Discovery has been undergoing Orbiter Major Modifications in the past year, ranging from wiring, control panels and black boxes to gaseous and fluid systems tubing and components. These systems were deserviced, disassembled, inspected, modified, reassembled, checked out and reserviced, as were most other systems onboard. The work includes the installation of the Multifunction Electronic Display Subsystem (MEDS) - a state-of-the-art “glass cockpit.”

  18. KENNEDY SPACE CENTER, FLA. - During power-up of the orbiter Discovery in the Orbiter Processing Facility, a technician turns on a switch. Discovery has been undergoing Orbiter Major Modifications in the past year, ranging from wiring, control panels and black boxes to gaseous and fluid systems tubing and components. These systems were deserviced, disassembled, inspected, modified, reassembled, checked out and reserviced, as were most other systems onboard. The work includes the installation of the Multifunction Electronic Display Subsystem (MEDS) - a state-of-the-art “glass cockpit.”

    NASA Image and Video Library

    2003-08-27

    KENNEDY SPACE CENTER, FLA. - During power-up of the orbiter Discovery in the Orbiter Processing Facility, a technician turns on a switch. Discovery has been undergoing Orbiter Major Modifications in the past year, ranging from wiring, control panels and black boxes to gaseous and fluid systems tubing and components. These systems were deserviced, disassembled, inspected, modified, reassembled, checked out and reserviced, as were most other systems onboard. The work includes the installation of the Multifunction Electronic Display Subsystem (MEDS) - a state-of-the-art “glass cockpit.”

  19. Enhancing links between visual short term memory, visual attention and cognitive control processes through practice: An electrophysiological insight.

    PubMed

    Fuggetta, Giorgio; Duke, Philip A

    2017-05-01

    The operation of attention on visible objects involves a sequence of cognitive processes. The current study firstly aimed to elucidate the effects of practice on neural mechanisms underlying attentional processes as measured with both behavioural and electrophysiological measures. Secondly, it aimed to identify any pattern in the relationship between Event-Related Potential (ERP) components which play a role in the operation of attention in vision. Twenty-seven participants took part in two recording sessions one week apart, performing an experimental paradigm which combined a match-to-sample task with a memory-guided efficient visual-search task within one trial sequence. Overall, practice decreased behavioural response times, increased accuracy, and modulated several ERP components that represent cognitive and neural processing stages. This neuromodulation through practice was also associated with an enhanced link between behavioural measures and ERP components and with an enhanced cortico-cortical interaction of functionally interconnected ERP components. Principal component analysis (PCA) of the ERP amplitude data revealed three components, having different rostro-caudal topographic representations. The first component included both the centro-parietal and parieto-occipital mismatch triggered negativity - involved in integration of visual representations of the target with current task-relevant representations stored in visual working memory - loaded with second negative posterior-bilateral (N2pb) component, involved in categorising specific pop-out target features. The second component comprised the amplitude of bilateral anterior P2 - related to detection of a specific pop-out feature - loaded with bilateral anterior N2, related to detection of conflicting features, and fronto-central mismatch triggered negativity. 
The third component included the parieto-occipital N1 - related to early neural responses to the stimulus array - which loaded with the second negative posterior-contralateral (N2pc) component, mediating the process of orienting and focusing covert attention on peripheral target features. We discussed these three components as representing different neurocognitive systems modulated with practice within which the input selection process operates. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
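The dimensionality reduction step this record relies on, principal component analysis of an amplitude matrix, can be sketched with plain NumPy: centre the columns, factor with an SVD, and read off components ordered by explained variance. The matrix shape and values below are synthetic stand-ins (e.g., 27 observations matching the participant count, 8 hypothetical ERP measures), not the study's data.

```python
import numpy as np

# Synthetic ERP amplitude matrix: rows = observations, columns = ERP
# component amplitudes (e.g., N1, P2, N2, N2pc, N2pb amplitudes per site).
rng = np.random.default_rng(0)
X = rng.normal(size=(27, 8))

# PCA via SVD of the column-centred data: rows of Vt are the principal
# loading directions; squared singular values (over n-1) give the variance
# each component explains, in descending order.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained_var = s**2 / (X.shape[0] - 1)

# Component scores: each observation projected onto the loading directions.
scores = Xc @ Vt.T

print(explained_var[:3])  # variance captured by the first three components
```

In a topographic analysis like the one described, one would then inspect which ERP measures load heavily on each of the retained components.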

  20. Low Cost, Upper Stage-Class Propulsion

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    The low cost, upper stage-class propulsion (LCUSP) element will develop a high strength copper alloy additive manufacturing (AM) process as well as critical components for an upper stage-class propulsion system that will be demonstrated with testing. As manufacturing technologies have matured, it now appears possible to build all the major components and subsystems of an upper stage-class rocket engine for substantially less money and much faster than traditionally done. However, several enabling technologies must be developed before that can happen. This activity will address these technologies and demonstrate the concept by designing, manufacturing, and testing the critical components of a rocket engine. The processes developed and materials' property data will be transitioned to industry upon completion of the activity. Technologies to enable the concept are AM copper alloy process development, AM post-processing finishing to minimize surface roughness, AM material deposition on existing copper alloy substrate, and materials characterization.
