Sample records for inherent operating reliability

  1. Space Transportation System Availability Relationships to Life Cycle Cost

    NASA Technical Reports Server (NTRS)

    Rhodes, Russel E.; Donahue, Benjamin B.; Chen, Timothy T.

    2009-01-01

    Future space transportation architectures and designs must be affordable. Consequently, their Life Cycle Cost (LCC) must be controlled. For the LCC to be controlled, it is necessary to identify all the requirements and elements of the architecture at the beginning of the concept phase. Controlling LCC requires the establishment of the major operational cost drivers. Two of these major cost drivers are reliability and maintainability, in other words, the system's availability (responsiveness). Potential reasons that may drive the inherent availability requirement are the need to control the number of unique parts and the spare parts required to support the transportation system's operation. For more typical space transportation systems used to place satellites in space, the productivity of the system will drive the launch cost. This system productivity is the resultant output of the system availability. Availability is equal to the mean uptime divided by the sum of the mean uptime plus the mean downtime. Since many operational factors cannot be projected early in the definition phase, the focus will be on inherent availability, which is equal to the mean time between failures (MTBF) divided by the MTBF plus the mean time to repair (MTTR) the system. The MTBF is a function of reliability or the expected frequency of failures. When the system experiences failures, the result is added operational flow time, parts consumption, and increased labor, with an impact to responsiveness resulting in increased LCC. The other function of availability is the MTTR, or maintainability. In other words, how accessible is the failed hardware that requires replacement, and what operational functions are required before and after change-out to make the system operable? This paper will describe how the MTTR can be equated to additional labor, additional operational flow time, and additional structural access capability, all of which drive up the LCC. A methodology will be presented that provides the decision makers with the understanding necessary to place constraints on the design definition. Applied to the major drivers, this methodology will determine the inherent availability, safety, reliability, maintainability, and life cycle cost of the fielded system. This methodology will focus on the achievement of an affordable, responsive space transportation system. It is the intent of this paper not only to provide visibility of the relationships of these major attribute drivers (variables) to each other and to the resultant system inherent availability, but also to provide the capability to bound the variables, thus providing the insight required to control the system's engineering solution. An example of this visibility is the need to provide integration of similar discipline functions to allow control of the total parts count of the space transportation system. Also, selecting a reliability requirement will place a constraint on parts count to achieve a given inherent availability requirement, or require accepting a larger parts count with the resulting higher individual part reliability requirements. This paper will provide an understanding of the relationships among mean repair time (mean downtime), maintainability (accessibility for repair), mean time between failures (reliability of hardware), and the system inherent availability.
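
    As a quick numerical illustration of the two availability relations quoted above, a minimal Python sketch (the MTBF and MTTR values are placeholders, not figures from the paper):

      # Inherent availability from the relations quoted in the abstract; the
      # numbers below are illustrative placeholders, not values from the paper.
      mtbf_hr = 2_000.0     # mean time between failures, hours
      mttr_hr = 40.0        # mean time to repair, hours
      inherent_availability = mtbf_hr / (mtbf_hr + mttr_hr)
      print(round(inherent_availability, 4))   # 0.9804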

  2. Using software metrics and software reliability models to attain acceptable quality software for flight and ground support software for avionic systems

    NASA Technical Reports Server (NTRS)

    Lawrence, Stella

    1992-01-01

    This paper is concerned with methods of measuring and developing quality software. Reliable flight and ground support software is a highly important factor in the successful operation of the space shuttle program. Reliability is probably the most important of the characteristics inherent in the concept of 'software quality'. It is the probability of failure-free operation of a computer program for a specified time and environment.

  3. Photovoltaic power system reliability considerations

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.

    1980-01-01

    An example of how modern engineering and safety techniques can be used to assure the reliable and safe operation of photovoltaic power systems is presented. This particular application is for a solar cell power system demonstration project designed to provide electric power requirements for remote villages. The techniques utilized involve a definition of the power system natural and operating environment, use of design criteria and analysis techniques, an awareness of potential problems via the inherent reliability and FMEA methods, and use of fail-safe and planned spare parts engineering philosophy.

  4. Photovoltaic power system reliability considerations

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.

    1980-01-01

    This paper describes an example of how modern engineering and safety techniques can be used to assure the reliable and safe operation of photovoltaic power systems. This particular application was for a solar cell power system demonstration project in Tangaye, Upper Volta, Africa. The techniques involve a definition of the power system natural and operating environment, use of design criteria and analysis techniques, an awareness of potential problems via the inherent reliability and FMEA methods, and use of a fail-safe and planned spare parts engineering philosophy.

  5. Aerospace Safety Advisory Panel

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The results of the Panel's activities are presented in a set of findings and recommendations. Highlighted here are both improvements in NASA's safety and reliability activities and specific areas where additional gains might be realized. One area of particular concern involves the curtailment or elimination of Space Shuttle safety and reliability enhancements. Several findings and recommendations address this area of concern, reflecting the opinion that safety and reliability enhancements are essential to the continued successful operation of the Space Shuttle. It is recommended that a comprehensive and continuing program of safety and reliability improvements in all areas of Space Shuttle hardware/software be considered an inherent component of ongoing Space Shuttle operations.

  6. Inherently safe in situ uranium recovery

    DOEpatents

    Krumhansl, James L; Brady, Patrick V

    2014-04-29

    An in situ recovery of uranium operation involves circulating reactive fluids through an underground uranium deposit. These fluids contain chemicals that dissolve the uranium ore. Uranium is recovered from the fluids after they are pumped back to the surface. Chemicals used to accomplish this include complexing agents that are organic, readily degradable, and/or have a predictable lifetime in an aquifer. Efficiency is increased through development of organic agents targeted to complexing tetravalent uranium rather than hexavalent uranium. The operation provides for in situ immobilization of some oxy-anion pollutants under oxidizing conditions as well as reducing conditions. The operation also artificially reestablishes reducing conditions on the aquifer after uranium recovery is completed. With the ability to have the impacted aquifer reliably remediated, the uranium recovery operation can be considered inherently safe.

  7. Space Transportation System Availability Requirements and Its Influencing Attributes Relationships

    NASA Technical Reports Server (NTRS)

    Rhodes, Russel E.; Adams, Timothy C.

    2008-01-01

    It is essential that management and engineering understand the need for an availability requirement for the customer's space transportation system, as it enables meeting the customer's needs, goals, and objectives. There are three types of availability: operational availability, achieved availability, and inherent availability. The basic definition of availability is equal to the mean uptime divided by the sum of the mean uptime plus the mean downtime. The major difference is the inclusiveness of the functions within the mean downtime and the mean uptime. This paper will address the inherent availability, which treats the mean downtime as only the mean time to repair, i.e., the time to determine the failed article, remove it, install a replacement article, and verify the functionality of the repaired system. The definitions of operational availability include the replacement hardware supply or maintenance delays and other non-design factors in the mean downtime. Also, with inherent availability the mean uptime considers only the mean time between failures that require repair for the system to be functional (other availability definitions consider this the mean time between maintenance actions, both preventive and corrective). It is also essential that management and engineering understand all influencing attributes' relationships to each other and to the resultant inherent availability requirement. This visibility will provide the decision makers with the understanding necessary to place constraints on the design definition for the major drivers that will determine the inherent availability, safety, reliability, maintainability, and life cycle cost of the fielded system provided to the customer. This inherent availability requirement may be driven by the need to use a multiple-launch approach to placing humans on the Moon, or the desire to control the number of spare parts required to support long stays either in orbit or on the surface of the Moon or Mars. It is the intent of this paper not only to provide visibility of the relationships of these major attribute drivers (variables) to each other and to the resultant system inherent availability, but also to provide the capability to bound the variables, giving engineering the insight required to control the system's engineering solution. An example of this visibility will be the need to provide integration of similar discipline functions to allow control of the total parts count of the space transportation system. Also, selecting a reliability requirement will place a constraint on parts count to achieve a given inherent availability requirement, or require accepting a larger parts count with the resulting higher reliability requirement. This paper will provide an understanding of the relationships among mean repair time (mean downtime), maintainability (e.g., accessibility for repair), mean time between failures (e.g., reliability of hardware), and the system inherent availability. Understanding these relationships and the resulting requirements before starting the architectural design concept definition will avoid the considerable time and money required to iterate the design through the redesign and assessment process needed to achieve the results required of the customer's space transportation system. In fact, the schedule impact of delivering a system that meets the customer's needs, goals, and objectives may cause the customer to compromise the desired operational goals and objectives, resulting in considerably increased life cycle cost of the fielded space transportation system.
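
    The parts-count versus part-reliability trade described above can be sketched numerically. The snippet below assumes a series system of n identical parts with a constant failure rate, an illustrative simplification that is not stated in the paper:

      # Minimal sketch: inherent availability of a series system of n identical
      # parts, each with constant failure rate lam (per hour). The series
      # configuration, exponential failures, and all numbers are assumptions.
      def inherent_availability(n_parts: int, lam: float, mttr_hr: float) -> float:
          mtbf_sys = 1.0 / (n_parts * lam)          # series-system MTBF, hours
          return mtbf_sys / (mtbf_sys + mttr_hr)

      # Holding part reliability and repair time fixed shows how parts count
      # erodes inherent availability, i.e., the parts-count constraint above.
      for n in (1_000, 5_000, 10_000):
          print(n, round(inherent_availability(n, lam=1e-6, mttr_hr=24.0), 4))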

  8. Wind and Solar on the Power Grid: Myths and Misperceptions, Greening the Grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katz, Jessica; Denholm, Paul; Pless, Jacquelyn

    Wind and solar are inherently more variable and uncertain than the traditional dispatchable thermal and hydro generators that have historically provided a majority of grid-supplied electricity. The unique characteristics of variable renewable energy (VRE) resources have resulted in many misperceptions regarding their contribution to a low-cost and reliable power grid. Common areas of concern include: 1) The potential need for increased operating reserves, 2) The impact of variability and uncertainty on operating costs and pollutant emissions of thermal plants, and 3) The technical limits of VRE penetration rates to maintain grid stability and reliability. This fact sheet corrects misperceptions in these areas.

  9. A study on the real-time reliability of on-board equipment of train control system

    NASA Astrophysics Data System (ADS)

    Zhang, Yong; Li, Shiwei

    2018-05-01

    Real-time reliability evaluation is conducive to establishing a condition-based maintenance system for the purpose of guaranteeing continuous train operation. According to the inherent characteristics of the on-board equipment, the connotation of reliability evaluation of on-board equipment is defined and the evaluation index of real-time reliability is provided in this paper. From the perspective of methodology and practical application, the real-time reliability of the on-board equipment is discussed in detail, and a method of evaluating the real-time reliability of on-board equipment at the component level based on a Hidden Markov Model (HMM) is proposed. In this method the performance degradation data is used directly to realize accurate perception of the hidden state transition process of the on-board equipment, which can achieve a better description of the real-time reliability of the equipment.
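
    The abstract does not give the model details, but the HMM machinery it refers to can be sketched as a forward (filtering) recursion over a hidden degradation state. Everything below, states, matrices, and observations, is an invented placeholder:

      import numpy as np

      # Hypothetical 3-state degradation HMM (healthy, degraded, faulty); the
      # transition and emission matrices are illustrative, not from the paper.
      A = np.array([[0.95, 0.04, 0.01],
                    [0.00, 0.90, 0.10],
                    [0.00, 0.00, 1.00]])       # state transition probabilities
      B = np.array([[0.80, 0.15, 0.05],
                    [0.20, 0.60, 0.20],
                    [0.05, 0.15, 0.80]])       # P(observation symbol | state)
      pi = np.array([1.0, 0.0, 0.0])           # assume the equipment starts healthy

      def filtered_state(obs):
          """Forward (filtering) recursion: P(state_t | observations up to t)."""
          alpha = pi * B[:, obs[0]]
          alpha /= alpha.sum()
          for o in obs[1:]:
              alpha = (alpha @ A) * B[:, o]
              alpha /= alpha.sum()
          return alpha

      # "Real-time reliability" is read here as the probability of not being in
      # the faulty state given the discretized degradation readings so far.
      obs = [0, 0, 1, 1, 2]
      p = filtered_state(obs)
      print("P(not faulty) =", round(1.0 - p[2], 3))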

  10. Solar power satellite system definition study. Volume 1, phase 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A systems definition study of the solar power satellite (SPS) system is presented. The technical feasibility of solar power satellites based on forecasts of technical capability in the various applicable technologies is assessed. The performance, cost, operational characteristics, reliability, and the suitability of SPSs as power generators for typical commercial electricity grids are discussed. The uncertainties inherent in the system characteristics forecasts are assessed.

  11. Effects of Response Bias and Judgment Framing on Operator Use of an Automated Aid in a Target Detection Task

    ERIC Educational Resources Information Center

    Rice, Stephen; McCarley, Jason S.

    2011-01-01

    Automated diagnostic aids prone to false alarms often produce poorer human performance in signal detection tasks than equally reliable miss-prone aids. However, it is not yet clear whether this is attributable to differences in the perceptual salience of the automated aids' misses and false alarms or is the result of inherent differences in…

  12. Reliability of COPVs Accounting for Margin of Safety on Design Burst

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L.N.

    2012-01-01

    In this paper, the stress rupture reliability of Carbon/Epoxy Composite Overwrapped Pressure Vessels (COPVs) is examined utilizing the classic Phoenix model and accounting for the differences between the design and the actual burst pressure, and the liner contribution effects. Stress rupture life primarily depends upon the fiber stress ratio which is defined as the ratio of stress in fibers at the maximum expected operating pressure to actual delivered fiber strength. The actual delivered fiber strength is calculated using the actual burst pressures of vessels established through burst tests. However, during the design phase the actual burst pressure is generally not known and to estimate the reliability of the vessels calculations are usually performed based upon the design burst pressure only. Since the design burst is lower than the actual burst, this process yields a much higher value for the stress ratio and consequently a conservative estimate for the reliability. Other complications arise due to the fact that the actual burst pressure and the liner contributions have inherent variability and therefore must be treated as random variables in order to compute the stress rupture reliability. Furthermore, the model parameters, which have to be established based on stress rupture tests of subscale vessels or coupons, have significant variability as well due to limited available data and hence must be properly accounted for. In this work an assessment of reliability of COPVs including both parameter uncertainties and physical variability inherent in liner and overwrap material behavior is made and estimates are provided in terms of degree of uncertainty in the actual burst pressure and the liner load sharing.
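
    A hedged sketch of the variability argument: treat the actual burst pressure and the liner load share as random variables and compute only the resulting fiber stress ratio (not the Phoenix stress-rupture model itself). All pressures and scatter values are invented for illustration:

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000

      # Placeholder values, not data from the paper.
      meop         = 31.0                                      # MPa, max expected operating pressure
      design_burst = 62.0                                      # MPa
      actual_burst = rng.normal(1.10 * design_burst, 2.5, n)   # MPa, vessel-to-vessel scatter
      liner_meop   = rng.normal(4.0, 0.5, n)                   # MPa carried by the liner at MEOP
      liner_burst  = rng.normal(2.0, 0.5, n)                   # MPa carried by the liner at burst

      # Fiber stress ratio ~ fiber stress at MEOP / delivered fiber strength, with
      # both taken proportional to the pressure carried by the overwrap.
      ratio = (meop - liner_meop) / (actual_burst - liner_burst)
      ratio_design_only = meop / design_burst                  # ignores liner and actual burst

      print("mean stress ratio:", round(float(ratio.mean()), 3))
      print("design-burst-only ratio (conservative):", round(ratio_design_only, 3))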

  13. Wind and Solar on the Power Grid: Myths and Misperceptions, Greening the Grid (Spanish Version)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denholm, Paul; Cochran, Jaquelin; Brancucci Martinez-Anido, Carlo

    This is the Spanish version of the 'Greening the Grid - Wind and Solar on the Power Grid: Myths and Misperceptions'. Wind and solar are inherently more variable and uncertain than the traditional dispatchable thermal and hydro generators that have historically provided a majority of grid-supplied electricity. The unique characteristics of variable renewable energy (VRE) resources have resulted in many misperceptions regarding their contribution to a low-cost and reliable power grid. Common areas of concern include: 1) The potential need for increased operating reserves, 2) The impact of variability and uncertainty on operating costs and pollutant emissions of thermal plants, and 3) The technical limits of VRE penetration rates to maintain grid stability and reliability. This fact sheet corrects misperceptions in these areas.

  14. Stability of a Crystal Oscillator, Type Si530, Inside and Beyond its Specified Operating Temperature Range

    NASA Technical Reports Server (NTRS)

    Patterson, Richard L.; Hammoud, Ahmad

    2011-01-01

    Data acquisition and control systems depend on timing signals for proper operation and required accuracy. These clocked signals are typically provided by some form of an oscillator set to produce a repetitive, defined signal at a given frequency. Crystal oscillators are commonly used because they are less expensive, smaller, and more reliable than other types of oscillators. Because of the inherent characteristics of the crystal, the oscillators exhibit excellent frequency stability within the specified range of operational temperature. In some cases, however, compensation techniques are adopted to further improve the thermal stability of a crystal oscillator. Very limited data exist on the performance and reliability of commercial-off-the-shelf (COTS) crystal oscillators at temperatures beyond the manufacturer's specified operating temperature range. This information is crucial if any of these parts were to be used in circuits designed for use in space exploration missions where extreme temperature swings and thermal cycling are encountered. This report presents the results obtained on the operation of a Silicon Laboratories crystal oscillator, type Si530, under specified and extreme ambient temperatures.

  15. Communication System Architecture for Planetary Exploration

    NASA Technical Reports Server (NTRS)

    Braham, Stephen P.; Alena, Richard; Gilbaugh, Bruce; Glass, Brian; Norvig, Peter (Technical Monitor)

    2001-01-01

    Future human missions to Mars will require effective communications supporting exploration activities and scientific field data collection. Constraints on cost, size, weight and power consumption for all communications equipment make optimization of these systems very important. These information and communication systems connect people and systems together into coherent teams performing the difficult and hazardous tasks inherent in planetary exploration. The communication network supporting vehicle telemetry data, mission operations, and scientific collaboration must have excellent reliability and flexibility.

  16. The Vocabulary of Brain Potentials: Inferring Cognitive Events from Brain Potentials in Operational Settings

    DTIC Science & Technology

    1976-08-01

    can easily change any of the parameters controlling the experiment... The PLATO Laboratory: a block diagram of the laboratory is... the parameters of an adaptive filter, or to perform the computations required by the more complex displays. In addition to its role as the prime... by the inherent response variability, which precludes reliable estimates of attention-sensitive parameters from a single observation. Thus...

  17. Reliable and Fault-Tolerant Software-Defined Network Operations Scheme for Remote 3D Printing

    NASA Astrophysics Data System (ADS)

    Kim, Dongkyun; Gil, Joon-Min

    2015-03-01

    The recent wide expansion of applicable three-dimensional (3D) printing and software-defined networking (SDN) technologies has led to a great deal of attention being focused on efficient remote control of manufacturing processes. SDN is a renowned paradigm for network softwarization, which has helped facilitate remote manufacturing in association with high network performance, since SDN is designed to control network paths and traffic flows, guaranteeing improved quality of services by obtaining network requests from end-applications on demand through the separated SDN controller or control plane. However, current SDN approaches are generally focused on the controls and automation of the networks, which indicates that there is a lack of management plane development designed for a reliable and fault-tolerant SDN environment. Therefore, in addition to the inherent advantage of SDN, this paper proposes a new software-defined network operations center (SD-NOC) architecture to strengthen the reliability and fault-tolerance of SDN in terms of network operations and management in particular. The cooperation and orchestration between SDN and SD-NOC are also introduced for the SDN failover processes based on four principal SDN breakdown scenarios derived from the failures of the controller, SDN nodes, and connected links. The abovementioned SDN troubles significantly reduce the network reachability to remote devices (e.g., 3D printers, super high-definition cameras, etc.) and the reliability of relevant control processes. Our performance consideration and analysis results show that the proposed scheme can shrink operations and management overheads of SDN, which leads to the enhancement of responsiveness and reliability of SDN for remote 3D printing and control processes.

  18. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and reduced inventory of radioactive material. Structural fasteners are responsible for maintaining structural integrity of the Stirling power convertor, which is critical to ensure reliable performance during the entire mission. Design of fasteners involves variables related to the fabrication, manufacturing, behavior of fasteners and joining parts material, structural geometry of the joining components, size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify the reliability, and provides results of the analysis in terms of quantified reliability and sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joining components. Based on the results, the paper also describes guidelines to improve the reliability and verification testing.

  19. Process for the physical segregation of minerals

    DOEpatents

    Yingling, Jon C.; Ganguli, Rajive

    2004-01-06

    With highly heterogeneous groups or streams of minerals, physical segregation using online quality measurements is an economically important first stage of the mineral beneficiation process. Segregation enables high quality fractions of the stream to bypass processing, such as cleaning operations, thereby reducing the associated costs and avoiding the yield losses inherent in any downstream separation process. The present invention includes various methods for reliably segregating a mineral stream into at least one fraction meeting desired quality specifications while at the same time maximizing yield of that fraction.

  20. Online Cable Tester and Rerouter

    NASA Technical Reports Server (NTRS)

    Lewis, Mark; Medelius, Pedro

    2012-01-01

    Hardware and algorithms have been developed to transfer electrical power and data connectivity safely, efficiently, and automatically from an identified damaged/defective wire in a cable to an alternate wire path. The combination of online cable testing capabilities, along with intelligent signal rerouting algorithms, allows the user to overcome the inherent difficulty of maintaining system integrity and configuration control, while autonomously rerouting signals and functions without introducing new failure modes. The incorporation of this capability will increase the reliability of systems by ensuring system availability during operations.

  1. Many-objective optimization and visual analytics reveal key trade-offs for London's water supply

    NASA Astrophysics Data System (ADS)

    Matrosov, Evgenii S.; Huskova, Ivana; Kasprzyk, Joseph R.; Harou, Julien J.; Lambert, Chris; Reed, Patrick M.

    2015-12-01

    In this study, we link a water resource management simulator to multi-objective search to reveal the key trade-offs inherent in planning a real-world water resource system. We consider new supplies and demand management (conservation) options while seeking to elucidate the trade-offs between the best portfolios of schemes to satisfy projected water demands. Alternative system designs are evaluated using performance measures that minimize capital and operating costs and energy use while maximizing resilience, engineering and environmental metrics, subject to supply reliability constraints. Our analysis shows many-objective evolutionary optimization coupled with state-of-the-art visual analytics can help planners discover more diverse water supply system designs and better understand their inherent trade-offs. The approach is used to explore future water supply options for the Thames water resource system (including London's water supply). New supply options include a new reservoir, water transfers, artificial recharge, wastewater reuse and brackish groundwater desalination. Demand management options include leakage reduction, compulsory metering and seasonal tariffs. The Thames system's Pareto approximate portfolios cluster into distinct groups of water supply options; for example, implementing a pipe refurbishment program leads to higher capital costs but greater reliability. This study highlights that traditional least-cost reliability constrained design of water supply systems masks asset combinations whose benefits only become apparent when more planning objectives are considered.
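
    The portfolio trade-off idea can be illustrated with a brute-force non-dominated (Pareto) filter. The portfolios and objective values below are invented, and this is not the evolutionary search used in the study:

      # Toy Pareto filter over candidate supply/demand-management portfolios.
      def dominates(a, b):
          """a dominates b if it is no worse on every minimized objective and better on one."""
          return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

      portfolios = {
          # name: (capital cost, operating cost, 1 - reliability), all minimized
          "reservoir + metering": (3.2, 1.1, 0.02),
          "reuse + leakage":      (2.1, 1.4, 0.03),
          "desalination only":    (3.5, 2.0, 0.03),
          "transfer + tariffs":   (1.9, 1.6, 0.05),
      }

      pareto = [name for name, obj in portfolios.items()
                if not any(dominates(other, obj)
                           for other_name, other in portfolios.items() if other_name != name)]
      print("non-dominated portfolios:", pareto)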

  2. Space Station Freedom power supply commonality via modular design

    NASA Technical Reports Server (NTRS)

    Krauthamer, S.; Gangal, M. D.; Das, R.

    1990-01-01

    At mature operations, Space Station Freedom will need more than 2000 power supplies to feed housekeeping and user loads. Advanced technology power supplies from 20 to 250 W have been hybridized for terrestrial, aerospace, and industry applications in compact, efficient, reliable, lightweight packages compatible with electromagnetic interference requirements. The use of these hybridized packages as modules, either singly or in parallel, to satisfy the wide range of user power supply needs for all elements of the station is proposed. Proposed characteristics for the power supplies include common mechanical packaging, digital control, self-protection, high efficiency at full and partial loads, synchronization capability to reduce electromagnetic interference, redundancy, and soft-start capability. The inherent reliability is improved compared with conventional discrete component power supplies because the hybrid circuits use high-reliability components such as ceramic capacitors. Reliability is further improved over conventional supplies because the hybrid packages, which may be treated as a single part, reduce the parts count in the power supply.

  3. Effects of response bias and judgment framing on operator use of an automated aid in a target detection task.

    PubMed

    Rice, Stephen; McCarley, Jason S

    2011-12-01

    Automated diagnostic aids prone to false alarms often produce poorer human performance in signal detection tasks than equally reliable miss-prone aids. However, it is not yet clear whether this is attributable to differences in the perceptual salience of the automated aids' misses and false alarms or is the result of inherent differences in operators' cognitive responses to different forms of automation error. The present experiments therefore examined the effects of automation false alarms and misses on human performance under conditions in which the different forms of error were matched in their perceptual characteristics. Young adult participants performed a simulated baggage x-ray screening task while assisted by an automated diagnostic aid. Judgments from the aid were rendered as text messages presented at the onset of each trial, and every trial was followed by a second text message providing response feedback. Thus, misses and false alarms from the aid were matched for their perceptual salience. Experiment 1 found that even under these conditions, false alarms from the aid produced poorer human performance and engendered lower automation use than misses from the aid. Experiment 2, however, found that the asymmetry between misses and false alarms was reduced when the aid's false alarms were framed as neutral messages rather than explicit misjudgments. Results suggest that automation false alarms and misses differ in their inherent cognitive salience and imply that changes in diagnosis framing may allow designers to encourage better use of imperfectly reliable automated aids.

  4. Asymptotically reliable transport of multimedia/graphics over wireless channels

    NASA Astrophysics Data System (ADS)

    Han, Richard Y.; Messerschmitt, David G.

    1996-03-01

    We propose a multiple-delivery transport service tailored for graphics and video transported over connections with wireless access. This service operates at the interface between the transport and application layers, balancing the subjective delay and image quality objectives of the application with the low reliability and limited bandwidth of the wireless link. While techniques like forward-error correction, interleaving and retransmission improve reliability over wireless links, they also increase latency substantially when bandwidth is limited. Certain forms of interactive multimedia datatypes can benefit from an initial delivery of a corrupt packet to lower the perceptual latency, as long as reliable delivery occurs eventually. Multiple delivery of successively refined versions of the received packet, terminating when a sufficiently reliable version arrives, exploits the redundancy inherently required to improve reliability without a traffic penalty. Modifications to acknowledgment-repeat-request (ARQ) methods to implement this transport service are proposed, which we term 'leaky ARQ'. For the specific case of pixel-coded window-based text/graphics, we describe additional functions needed to more effectively support urgent delivery and asymptotic reliability. X server emulation suggests that users will accept a multi-second delay between a (possibly corrupt) packet and the ultimate reliably-delivered version. The relaxed delay for reliable delivery can be exploited for traffic capacity improvement using scheduling of retransmissions.

  5. Processing and representation of meta-data for sleep apnea diagnosis with an artificial intelligence approach.

    PubMed

    Nettleton, D; Muñiz, J

    2001-09-01

    In this article, we revise and try to resolve some of the problems inherent in questionnaire screening of sleep apnea cases and apnea diagnosis based on attributes which are relevant and reliable. We present a way of learning information about the relevance of the data, comparing this with the definition of the information by the medical expert. We generate a predictive data model using a data aggregation operator which takes relevance and reliability information about the data into account to produce a diagnosis for each case. We also introduce a grade of membership for each question response which allows the patient to indicate a level of confidence or doubt in their own judgement. The method is tested with data collected from patients in a Sleep Clinic using questionnaires specially designed for the study. Other artificial intelligence predictive modeling algorithms are also tested on the same data and their predictive accuracy compared to that of the aggregation operator.

  6. Operation of a New COTS Crystal Oscillator - CXOMHT over a Wide Temperature Range

    NASA Technical Reports Server (NTRS)

    Patterson, Richard; Hammoud, Ahmad

    2011-01-01

    Crystal oscillators are extensively used in electronic circuits to provide timing or clocking signals in data acquisition, communications links, and control systems, to name a few. They are affordable, small in size, and reliable. Because of the inherent characteristics of the crystal, the oscillator usually exhibits extreme accuracy in its output frequency within the intrinsic crystal stability. Stability of the frequency could be affected under varying load levels or other operational conditions. Temperature is one of the important factors that influence the frequency stability of an oscillator, as it does the functionality of other electronic components. Electronics designed for use in NASA deep space and planetary exploration missions are expected to be exposed to extreme temperatures and thermal cycling over a wide range. Thus, it is important to design and develop circuits that are able to operate efficiently and reliably in these harsh temperature environments. Most of the commercial-off-the-shelf (COTS) devices are very limited in terms of their specified operational temperature, while very few custom-made commercial and military-grade parts have the ability to operate in a slightly wider range of temperature than those of the COTS parts. These parts are usually designed for operation under one temperature extreme, i.e., hot or cold, and do not address the wide swing in the operational temperature, which is typical of the space environment. For safe and successful space missions, electronic systems must therefore be designed not only to withstand the extreme temperature exposure but also to operate efficiently and reliably. This report presents the results obtained on the evaluation of a new COTS crystal oscillator under extreme temperatures.

  7. Improving Security for SCADA Sensor Networks with Reputation Systems and Self-Organizing Maps.

    PubMed

    Moya, José M; Araujo, Alvaro; Banković, Zorana; de Goyeneche, Juan-Mariano; Vallejo, Juan Carlos; Malagón, Pedro; Villanueva, Daniel; Fraga, David; Romero, Elena; Blesa, Javier

    2009-01-01

    The reliable operation of modern infrastructures depends on computerized systems and Supervisory Control and Data Acquisition (SCADA) systems, which are also based on the data obtained from sensor networks. The inherent limitations of the sensor devices make them extremely vulnerable to cyberwarfare/cyberterrorism attacks. In this paper, we propose a reputation system enhanced with distributed agents, based on unsupervised learning algorithms (self-organizing maps), in order to achieve fault tolerance and enhanced resistance to previously unknown attacks. This approach has been extensively simulated and compared with previous proposals.

  8. Improving Security for SCADA Sensor Networks with Reputation Systems and Self-Organizing Maps

    PubMed Central

    Moya, José M.; Araujo, Álvaro; Banković, Zorana; de Goyeneche, Juan-Mariano; Vallejo, Juan Carlos; Malagón, Pedro; Villanueva, Daniel; Fraga, David; Romero, Elena; Blesa, Javier

    2009-01-01

    The reliable operation of modern infrastructures depends on computerized systems and Supervisory Control and Data Acquisition (SCADA) systems, which are also based on the data obtained from sensor networks. The inherent limitations of the sensor devices make them extremely vulnerable to cyberwarfare/cyberterrorism attacks. In this paper, we propose a reputation system enhanced with distributed agents, based on unsupervised learning algorithms (self-organizing maps), in order to achieve fault tolerance and enhanced resistance to previously unknown attacks. This approach has been extensively simulated and compared with previous proposals. PMID:22291569

  9. NASA flight cell and battery issues

    NASA Technical Reports Server (NTRS)

    Schulze, N. R.

    1989-01-01

    The author presents the important battery and cell problems, encompassing both test failures and accidents, which were encountered during the past year. Practical issues facing programs, which have to be considered in the development of a battery program strategy, are addressed. The problems of one program, the GRO (Gamma Ray Observatory), during the past year are focused on to illustrate the fundamental types of battery problems that occur. Problems encountered by other programs are briefly mentioned to complete the accounting. Two major categories of issues are defined, namely, those which are quality and design related, i.e., problems having inherent manufacturing-process-related aspects with an impact on cell reliability, and those which are accident triggered or man induced, i.e., those operational issues having an impact on battery and cell reliability.

  10. Electronic synoptic operative reporting: assessing the reliability and completeness of synoptic reports for pancreatic resection.

    PubMed

    Park, Jason; Pillarisetty, Venu G; Brennan, Murray F; Jarnagin, William R; D'Angelica, Michael I; Dematteo, Ronald P; Coit, Daniel G; Janakos, Maria; Allen, Peter J

    2010-09-01

    Electronic synoptic operative reports (E-SORs) have replaced dictated reports at many institutions, but whether E-SORs adequately document the components and findings of an operation has received limited study. This study assessed the reliability and completeness of E-SORs for pancreatic surgery developed at our institution. An attending surgeon and surgical fellow prospectively and independently completed an E-SOR after each of 112 major pancreatic resections (78 proximal, 29 distal, and 5 central) over a 10-month period (September 2008 to June 2009). Reliability was assessed by calculating the interobserver agreement between attending physician and fellow reports. Completeness was assessed by comparing E-SORs to a case-matched (surgeon and procedure) historical control of dictated reports, using a 39-item checklist developed through an internal and external query of 13 high-volume pancreatic surgeons. Interobserver agreement between attending and fellow was moderate to very good for individual categorical E-SOR items (kappa = 0.65 to 1.00, p < 0.001 for all items). Compared with dictated reports, E-SORs had significantly higher completeness checklist scores (mean 88.8 +/- 5.4 vs 59.6 +/- 9.2 [maximum possible score, 100], p < 0.01) and were available in patients' electronic records in a significantly shorter interval of time (median 0.5 vs 5.8 days from case end, p < 0.01). The mean time taken to complete E-SORs was 4.0 +/- 1.6 minutes per case. E-SORs for pancreatic surgery are reliable, complete in data collected, and rapidly available, all of which support their clinical implementation. The inherent strengths of E-SORs offer real promise of a new standard for operative reporting and health communication. Copyright 2010 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
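
    For reference, interobserver agreement of the kind reported above is commonly computed as Cohen's kappa; a minimal sketch for one binary checklist item, with invented ratings rather than study data:

      from collections import Counter

      # Paired attending/fellow ratings for a single E-SOR item (invented).
      attending = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no"]
      fellow    = ["yes", "yes", "no", "no",  "no", "yes", "yes", "no"]

      n = len(attending)
      p_observed = sum(a == f for a, f in zip(attending, fellow)) / n
      pa, pf = Counter(attending), Counter(fellow)
      p_expected = sum((pa[c] / n) * (pf[c] / n) for c in set(attending) | set(fellow))
      kappa = (p_observed - p_expected) / (1 - p_expected)
      print("kappa =", round(kappa, 2))   # 0.75 for these made-up ratings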

  11. Reliability analysis of a wastewater treatment plant using fault tree analysis and Monte Carlo simulation.

    PubMed

    Taheriyoun, Masoud; Moradinejad, Saber

    2015-01-01

    The reliability of a wastewater treatment plant is a critical issue when the effluent is reused or discharged to water resources. Main factors affecting the performance of the wastewater treatment plant are the variation of the influent, inherent variability in the treatment processes, deficiencies in design, mechanical equipment, and operational failures. Thus, meeting the established reuse/discharge criteria requires assessment of plant reliability. Among many techniques developed in system reliability analysis, fault tree analysis (FTA) is one of the popular and efficient methods. FTA is a top-down, deductive failure analysis in which an undesired state of a system is analyzed. In this study, the problem of reliability was studied for the Tehran West Town wastewater treatment plant. This plant is a conventional activated sludge process, and the effluent is reused in landscape irrigation. The fault tree diagram was established with the violation of allowable effluent BOD as the top event in the diagram, and the deficiencies of the system were identified based on the developed model. Some basic events are operator's mistake, physical damage, and design problems. The analytical methods are minimal cut sets (based on numerical probability) and Monte Carlo simulation. Basic event probabilities were calculated according to available data and experts' opinions. The results showed that human factors, especially human error, had a great effect on top event occurrence. The mechanical, climate, and sewer system factors were in the subsequent tiers. The literature shows that FTA has seldom been applied in past wastewater treatment plant (WWTP) risk analysis studies. Thus, the developed FTA model in this study considerably improves the insight into causal failure analysis of a WWTP. It provides an efficient tool for WWTP operators and decision makers to achieve the standard limits in wastewater reuse and discharge to the environment.
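
    The cut-set/Monte Carlo combination described above can be sketched generically. The basic events, probabilities, and cut sets below are illustrative placeholders, not the Tehran West Town plant model:

      import random

      # Hypothetical basic events and occurrence probabilities.
      basic_events = {"operator_error": 0.05, "aeration_failure": 0.02,
                      "clarifier_damage": 0.01, "design_deficiency": 0.005}
      # Each minimal cut set triggers the top event (effluent BOD violation)
      # when all of its basic events occur.
      cut_sets = [{"operator_error", "aeration_failure"},
                  {"clarifier_damage"},
                  {"operator_error", "design_deficiency"}]

      random.seed(1)
      trials, top = 200_000, 0
      for _ in range(trials):
          state = {e: random.random() < p for e, p in basic_events.items()}
          if any(all(state[e] for e in cs) for cs in cut_sets):
              top += 1
      print("estimated top-event probability:", top / trials)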

  12. Inherently Safe and Long-Life Fission Power System for Lunar Outposts

    NASA Astrophysics Data System (ADS)

    Schriener, T. M.; El-Genk, Mohamed S.

    Power requirements for future lunar outposts, of tens to hundreds of kWe, can be fulfilled using nuclear reactor power systems. In addition to long life and operational reliability, safety is paramount in all phases, including fabrication and assembly, launch, emplacement below grade on the lunar surface, operation, post-operation decay heat removal, and long-term storage and eventual retrieval. This paper introduces the Solid Core-Sectored Compact Reactor (SC-SCoRe) and power system with static components and no single point failures. They ensure reliable continuous operation for ~21 years and fulfill the safety requirements. The SC-SCoRe nominally generates 1.0 MWth at liquid NaK-56 coolant inlet and exit temperatures of 850 K and 900 K, and the power system provides 38 kWe at high DC voltage using SiGe thermoelectric (TE) conversion assemblies. In case of a loss of coolant or cooling in a reactor core sector, the power system continues to operate, generating ~4 kWe for the outpost's emergency life support needs. The post-operation storage of the reactor below grade on the lunar surface is a safe and practical choice. The total radioactivity in the reactor drops from ~1 million Ci, immediately at shutdown, to below 164 Ci after 300 years of storage. At such time, the reactor is retrieved safely with no contamination or environmental concerns.

  13. Government/Industry Workshop on Payload Loads Technology

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A fully operational space shuttle is discussed which will offer science the opportunity to explore near earth orbit and finally interplanetary space on nearly a limitless basis. This multiplicity of payload/experiment combinations and frequency of launches places many burdens on dynamicists to predict launch and landing environments accurately and efficiently. Two major problems are apparent in the attempt to design for the diverse environments: (1) balancing the design criteria (loads, etc.) between launch and orbit operations, and (2) developing analytical techniques that are reliable, accurate, efficient, and low cost to meet the challenge of multiple launches and payloads. This paper deals with the key issues inherent in these problems, the key trades required, the basic approaches needed, and a summary of the state-of-the-art techniques.

  14. Developing a model for hospital inherent safety assessment: Conceptualization and validation.

    PubMed

    Yari, Saeed; Akbari, Hesam; Gholami Fesharaki, Mohammad; Khosravizadeh, Omid; Ghasemi, Mohammad; Barsam, Yalda; Akbari, Hamed

    2018-01-01

    Paying attention to the safety of hospitals, as the most crucial institutions for providing medical and health services and places where a large number of facilities, equipment, and human resources coexist, is of significant importance. The present research aims at developing a model for assessing hospitals' safety based on principles of inherent safety design. Face validity (30 experts), content validity (20 experts), construct validity (268 examples), convergent validity, and divergent validity have been employed to validate the prepared questionnaire; and item analysis, the Cronbach's alpha test, the ICC test (to measure reliability of the test), and the composite reliability coefficient have been used to measure primary reliability. The relationship between variables and factors has been confirmed at the 0.05 significance level by conducting confirmatory factor analysis (CFA) and the structural equation modeling (SEM) technique with the use of Smart-PLS. R-square and factor loading values, which were higher than 0.67 and 0.300 respectively, indicated a strong fit. Moderation (0.970), simplification (0.959), substitution (0.943), and minimization (0.5008) had the highest weights in determining the inherent safety of the hospital, respectively. Moderation, simplification, and substitution, among the other dimensions, have more weight on the inherent safety, while minimization has less weight, which could be due to its definition as minimizing the risk.

  15. Making statistical inferences about software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1988-01-01

    Failure times of software undergoing random debugging can be modelled as order statistics of independent but nonidentically distributed exponential random variables. Using this model, inferences can be made about current reliability and, if debugging continues, future reliability. This model also shows the difficulty inherent in statistical verification of very highly reliable software such as that used by digital avionics in commercial aircraft.
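
    A minimal simulation of the stated model, assuming each residual fault has its own constant detection rate (the rates below are invented):

      import math, random

      random.seed(0)
      # Each residual fault j has its own detection rate lam_j; the observed
      # failure times are the order statistics of the independent exponential
      # detection times. The rates are illustrative only.
      rates = [0.9, 0.5, 0.3, 0.1, 0.05]                  # per-hour detection rates of 5 faults
      times = [(random.expovariate(lam), lam) for lam in rates]
      for t_fail, lam in sorted(times):
          print(f"fault with rate {lam}/h detected at {t_fail:.2f} h")

      # If debugging stops at time T, the undetected faults remain; under this
      # model the probability of failure-free operation over a further mission
      # of length t is exp(-(sum of remaining rates) * t).
      T, mission = 5.0, 10.0
      remaining_rate = sum(lam for t_fail, lam in times if t_fail > T)
      print("P(no failure during mission) =", round(math.exp(-remaining_rate * mission), 3))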

  16. Reliability Evaluation of Machine Center Components Based on Cascading Failure Analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Ying-Zhi; Liu, Jin-Tong; Shen, Gui-Xiang; Long, Zhe; Sun, Shu-Guang

    2017-07-01

    To rectify the problems that, in traditional reliability evaluation of machine center components, the component reliability model exhibits deviation and the evaluation result is low because failure propagation is overlooked, a new reliability evaluation method based on cascading failure analysis and failure influenced degree assessment is proposed. A directed graph model of cascading failure among components is established according to cascading failure mechanism analysis and graph theory. The failure influenced degrees of the system components are assessed by the adjacency matrix and its transposition, combined with the PageRank algorithm. Based on the comprehensive failure probability function and the total probability formula, the inherent failure probability function is determined to realize the reliability evaluation of the system components. Finally, the method is applied to a machine center, and the results show the following: 1) The reliability evaluation values of the proposed method are at least 2.5% higher than those of the traditional method; 2) The difference between the comprehensive and inherent reliability of a system component presents a positive correlation with the failure influenced degree of the system component, which provides a theoretical basis for reliability allocation of the machine center system.
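
    One reading of the failure-influenced-degree step is a PageRank power iteration over the cascading-failure graph; the sketch below uses only the adjacency matrix (the paper also uses its transposition), and the 4-component graph is invented for illustration:

      import numpy as np

      # Directed cascading-failure graph: edge i -> j means a failure of
      # component i can propagate to component j (toy example).
      adj = np.array([[0, 1, 1, 0],
                      [0, 0, 1, 0],
                      [0, 0, 0, 1],
                      [1, 0, 0, 0]], dtype=float)

      def pagerank(adj, d=0.85, tol=1e-9):
          n = adj.shape[0]
          out = adj.sum(axis=1, keepdims=True)
          P = np.where(out > 0, adj / np.where(out == 0, 1, out), 1.0 / n)
          r = np.full(n, 1.0 / n)
          while True:
              r_new = (1 - d) / n + d * (P.T @ r)
              if np.abs(r_new - r).sum() < tol:
                  return r_new
              r = r_new

      # Higher scores mark components most affected by propagated failures in
      # this toy graph (one reading of the failure influenced degree).
      print(np.round(pagerank(adj), 3))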

  17. Effects Of Environmental And Operational Stresses On RF MEMS Switch Technologies For Space Applications

    NASA Technical Reports Server (NTRS)

    Jah, Muzar; Simon, Eric; Sharma, Ashok

    2003-01-01

    Micro Electro Mechanical Systems (MEMS) have been heralded for their ability to provide tremendous advantages in electronic systems through increased electrical performance, reduced power consumption, and higher levels of device integration with a reduction of board real estate. RF MEMS switch technology offers advantages such as low insertion loss (0.1-0.5 dB), wide bandwidth (1 GHz-100 GHz), and compatibility with many different process technologies (quartz, high resistivity Si, GaAs), which can replace the use of traditional electronic switches, such as GaAs FETs and PIN diodes, in microwave systems for low signal power (< 500 mW) applications. Although the electrical characteristics of RF MEMS switches far surpass any existing technologies, the unknown reliability, due to the lack of information concerning failure modes and mechanisms inherent to MEMS devices, creates an obstacle to insertion of MEMS technology into high reliability applications. All MEMS devices are sensitive to moisture and contaminants, issues easily resolved by hermetic or near-hermetic packaging. Two well-known failure modes of RF MEMS switches are charging in the dielectric layer of capacitive membrane switches and contact interface stiction of metal-metal switches. Determining the integrity of MEMS devices when subjected to the shock, vibration, temperature extremes, and radiation of the space environment is necessary to facilitate integration into space systems. This paper will explore the effects of different environmental stresses, operational life cycling, temperature, mechanical shock, and vibration on the first commercially available RF MEMS switches to identify relevant failure modes and mechanisms inherent to these devices and packaging schemes for space applications. This paper will also describe RF MEMS switch technology under development at NASA GSFC.

  18. Small space station electrical power system design concepts

    NASA Technical Reports Server (NTRS)

    Jones, G. M.; Mercer, L. N.

    1976-01-01

    A small manned facility, i.e., a small space station, placed in earth orbit by the Shuttle transportation system would be a viable, cost effective addition to the basic Shuttle system to provide many opportunities for R&D programs, particularly in the area of earth applications. The small space station would have many similarities with Skylab. This paper presents design concepts for an electrical power system (EPS) for the small space station based on Skylab experience, in-house work at Marshall Space Flight Center, SEPS (Solar Electric Propulsion Stage) solar array development studies, and other studies sponsored by MSFC. The proposed EPS would be a solar array/secondary battery system. Design concepts expressed are based on maximizing system efficiency and five year operational reliability. Cost, weight, volume, and complexity considerations are inherent in the concepts presented. A small space station EPS based on these concepts would be highly efficient, reliable, and relatively inexpensive.

  19. Modeling reality

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1990-01-01

    Although powerful computers have allowed complex physical and manmade hardware systems to be modeled successfully, we have encountered persistent problems with the reliability of computer models for systems involving human learning, human action, and human organizations. This is not a misfortune; unlike physical and manmade systems, human systems do not operate under a fixed set of laws. The rules governing the actions allowable in the system can be changed without warning at any moment, and can evolve over time. That the governing laws are inherently unpredictable raises serious questions about the reliability of models when applied to human situations. In these domains, computers are better used, not for prediction and planning, but for aiding humans. Examples are systems that help humans speculate about possible futures, offer advice about possible actions in a domain, systems that gather information from the networks, and systems that track and support work flows in organizations.

  20. Design of a dispersion interferometer combined with a polarimeter to increase the electron density measurement reliability on ITER

    NASA Astrophysics Data System (ADS)

    Akiyama, T.; Sirinelli, A.; Watts, C.; Shigin, P.; Vayakis, G.; Walsh, M.

    2016-11-01

    A dispersion interferometer is a reliable density measurement system and is being designed as a complementary density diagnostic on ITER. The dispersion interferometer is inherently insensitive to mechanical vibrations, and a combined polarimeter with the same line of sight can correct fringe jump errors. A proof of the principle of the CO₂ laser dispersion interferometer combined with the PEM polarimeter was recently conducted, where the phase shift and the polarization angle were successfully measured simultaneously. Standard deviations of the line-average density and the polarization angle measurements over 1 s are 9 × 10¹⁶ m⁻² and 0.19°, respectively, with a time constant of 100 μs. Drifts of the zero point, which determine the resolution in steady-state operation, correspond to 0.25% and 1% of the phase shift and the Faraday rotation angle expected on ITER.

  1. Design of a dispersion interferometer combined with a polarimeter to increase the electron density measurement reliability on ITER.

    PubMed

    Akiyama, T; Sirinelli, A; Watts, C; Shigin, P; Vayakis, G; Walsh, M

    2016-11-01

    A dispersion interferometer is a reliable density measurement system and is being designed as a complementary density diagnostic on ITER. The dispersion interferometer is inherently insensitive to mechanical vibrations, and a combined polarimeter with the same line of sight can correct fringe jump errors. A proof of the principle of the CO₂ laser dispersion interferometer combined with the PEM polarimeter was recently conducted, where the phase shift and the polarization angle were successfully measured simultaneously. Standard deviations of the line-average density and the polarization angle measurements over 1 s are 9 × 10¹⁶ m⁻² and 0.19°, respectively, with a time constant of 100 μs. Drifts of the zero point, which determine the resolution in steady-state operation, correspond to 0.25% and 1% of the phase shift and the Faraday rotation angle expected on ITER.

  2. STGT program: Ada coding and architecture lessons learned

    NASA Technical Reports Server (NTRS)

    Usavage, Paul; Nagurney, Don

    1992-01-01

    STGT (Second TDRSS Ground Terminal) is currently halfway through the System Integration Test phase (Level 4 Testing). To date, many software architecture and Ada language issues have been encountered and solved. This paper, which is the transcript of a presentation at the 3 Dec. meeting, attempts to define these lessons plus others learned regarding software project management and risk management issues, training, performance, reuse, and reliability. Observations are included regarding the use of particular Ada coding constructs, software architecture trade-offs during the prototyping, development and testing stages of the project, and dangers inherent in parallel or concurrent systems, software, hardware, and operations engineering.

  3. Synthetic Space Vector Modulation

    DTIC Science & Technology

    2013-06-01

    ...reliable DC power supplies, especially batteries without fancy controls. Inherently, DC machine commutation is environmentally sensitive and maintenance intensive as well as...

  4. Inherent Conservatism in Deterministic Quasi-Static Structural Analysis

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1997-01-01

    The cause of the long-suspected excessive conservatism in the prevailing structural deterministic safety factor has been identified as an inherent violation of the error propagation laws when reducing statistical data to deterministic values and then combining them algebraically through successive structural computational processes. These errors are restricted to the applied stress computations, and because mean and variations of the tolerance limit format are added, the errors are positive, serially cumulative, and excessively conservative. Reliability methods circumvent these errors and provide more efficient and uniform safe structures. The document is a tutorial on the deficiencies and nature of the current safety factor and of its improvement and transition to absolute reliability.

  5. On Applications of Disruption Tolerant Networking to Optical Networking in Space

    NASA Technical Reports Server (NTRS)

    Hylton, Alan Guy; Raible, Daniel E.; Juergens, Jeffrey; Iannicca, Dennis

    2012-01-01

    The integration of optical communication links into space networks via Disruption Tolerant Networking (DTN) is a largely unexplored area of research. Building on successful foundational work accomplished at JPL, we discuss a multi-hop multi-path network featuring optical links. The experimental test bed is constructed at the NASA Glenn Research Center featuring multiple Ethernet-to-fiber converters coupled with free space optical (FSO) communication channels. The test bed architecture models communication paths from deployed Mars assets to the deep space network (DSN) and finally to the mission operations center (MOC). Reliable versus unreliable communication methods are investigated and discussed, including reliable transport protocols, custody transfer, and fragmentation. Potential commercial applications may include an optical communications infrastructure deployment to support developing nations and remote areas, which are unburdened with supporting an existing heritage means of telecommunications. Narrow laser beam widths and control of polarization states offer inherent physical layer security benefits with optical communications over RF solutions. This paper explores whether or not DTN is appropriate for space-based optical networks, optimal payload sizes, reliability, and a discussion on security.

  6. Capabilities and constraints of combustion diagnostics in microgravity

    NASA Technical Reports Server (NTRS)

    Greenberg, Paul S.

    1993-01-01

    A significant scientific return from both existing and proposed microgravity combustion science experiments is substantially dependent on the availability of diagnostic systems for the collection of the required scientific data. To date, the available diagnostic instrumentation has consisted primarily of conventional photographic media and intrusive temperature and velocity probes, such as thermocouples and hot wire anemometers. This situation has arisen primarily due to the unique and severe operational constraints inherent in reduced gravity experimentation. Each of the various reduced gravity facilities is accompanied by its own peculiar envelope of capabilities and constraints. Drop towers, for example, pose strict limitations on available working volume and power, as well as autonomy of operation. In contrast, hardware developed for space flight applications can be somewhat less constrained in regards to the aforementioned quantities, but is additionally concerned with numerous issues involving safety and reliability.

  7. Digital Sensor Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, Ken D.; Quinn, Edward L.; Mauck, Jerry L.

    The nuclear industry has been slow to incorporate digital sensor technology into nuclear plant designs due to concerns with digital qualification issues. However, the benefits of digital sensor technology for nuclear plant instrumentation are substantial in terms of accuracy and reliability. This paper, which refers to a final report issued in 2013, demonstrates these benefits in direct comparisons of digital and analog sensor applications. Improved accuracy results from the superior operating characteristics of digital sensors. These include improvements in sensor accuracy and drift and other related parameters which reduce total loop uncertainty and thereby increase safety and operating margins. An example instrument loop uncertainty calculation for a pressure sensor application is presented to illustrate these improvements. This is a side-by-side comparison of the instrument loop uncertainty for both an analog and a digital sensor in the same pressure measurement application. Similarly, improved sensor reliability is illustrated with a sample calculation for determining the probability of failure on demand, an industry standard reliability measure. This looks at equivalent analog and digital temperature sensors to draw the comparison. The results confirm substantial reliability improvement with the digital sensor, due in large part to the ability to continuously monitor the health of a digital sensor such that problems can be immediately identified and corrected. This greatly reduces the likelihood of a latent failure condition of the sensor at the time of a design basis event. Notwithstanding the benefits of digital sensors, there are certain qualification issues that are inherent with digital technology and these are described in the report. One major qualification impediment for digital sensor implementation is software common cause failure (SCCF).
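    As a rough illustration of the kind of side-by-side loop uncertainty comparison described above, the sketch below combines independent 1-sigma uncertainty contributors by square-root-sum-of-squares for a hypothetical analog and digital pressure loop. All contributor names and values are invented for illustration and are not taken from the report.

      import math

      # Hypothetical 1-sigma uncertainty contributors (percent of span) for a
      # pressure loop; the report's actual values are not reproduced here.
      analog_terms  = {"sensor accuracy": 0.50, "drift": 0.75, "temperature effect": 0.50,
                       "rack A/D conversion": 0.25, "signal conditioning": 0.20}
      digital_terms = {"sensor accuracy": 0.15, "drift": 0.10, "temperature effect": 0.15,
                       "digital transmission": 0.05}

      def srss(terms):
          """Square-root-of-sum-of-squares combination of independent random terms."""
          return math.sqrt(sum(v * v for v in terms.values()))

      print(f"analog loop uncertainty : {srss(analog_terms):.2f} % span")
      print(f"digital loop uncertainty: {srss(digital_terms):.2f} % span")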

  8. Computing Reliabilities Of Ceramic Components Subject To Fracture

    NASA Technical Reports Server (NTRS)

    Nemeth, N. N.; Gyekenyesi, J. P.; Manderscheid, J. M.

    1992-01-01

    CARES calculates fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. Program uses results from commercial structural-analysis program (MSC/NASTRAN or ANSYS) to evaluate reliability of component in presence of inherent surface- and/or volume-type flaws. Computes measure of reliability by use of finite-element mathematical model applicable to multiple materials, in the sense that the model is made a function of statistical characterizations of many ceramic materials. Reliability analysis uses element stress, temperature, area, and volume outputs, obtained from two-dimensional shell and three-dimensional solid isoparametric or axisymmetric finite elements. Written in FORTRAN 77.
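    A minimal sketch of one common statistical form behind this kind of fast-fracture reliability calculation, a two-parameter Weibull volume-flaw model with element-by-element risk accumulation, is shown below. The Weibull modulus, characteristic strength, and element data are illustrative placeholders, not CARES inputs or outputs.

      import math

      def weibull_volume_pof(stress_mpa, volume_mm3, sigma0_mpa=350.0, m=10.0, v0_mm3=1.0):
          """Two-parameter Weibull volume-flaw failure probability,
          Pf = 1 - exp[-(V/V0) * (sigma/sigma0)^m]. Parameters are illustrative."""
          return 1.0 - math.exp(-(volume_mm3 / v0_mm3) * (stress_mpa / sigma0_mpa) ** m)

      # Accumulate element risk contributions the way a finite-element
      # post-processor might (element stress in MPa, element volume in mm^3):
      elements = [(220.0, 4.0), (260.0, 2.5), (300.0, 1.0)]
      survival = 1.0
      for stress, vol in elements:
          survival *= 1.0 - weibull_volume_pof(stress, vol)
      print(f"component failure probability: {1.0 - survival:.3e}")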

  9. A reliability analysis framework with Monte Carlo simulation for weld structure of crane's beam

    NASA Astrophysics Data System (ADS)

    Wang, Kefei; Xu, Hongwei; Qu, Fuzheng; Wang, Xin; Shi, Yanjun

    2018-04-01

    The reliability of a crane product is central to its competitiveness in engineering practice. This paper uses the Monte Carlo method to analyze the reliability of the welded metal structure of a bridge crane whose limit state function is given as a mathematical expression. We then obtain the minimum reliable weld leg height for the welds between the cover plate and the web plate on the main beam under different coefficients of variation. This paper provides a new idea and reference for improving the inherent reliability of cranes.
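    The sketch below illustrates the general Monte Carlo reliability procedure the abstract describes: sample the random variables, evaluate a limit state function g, and estimate the failure probability as the fraction of samples with g < 0. The limit state, distributions, and numerical values here are hypothetical stand-ins, not the paper's weld model.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 200_000

      # Hypothetical limit state g = capacity - load stress; the paper's actual
      # expression for the welded main-beam joint is not reproduced here.
      weld_height = rng.normal(8.0, 8.0 * 0.05, n)     # mm, assumed mean and CoV
      load_stress = rng.normal(90.0, 90.0 * 0.10, n)   # MPa, assumed
      allow_stress = 140.0                             # MPa, assumed weld strength

      capacity = allow_stress * weld_height / 10.0     # toy resistance model
      g = capacity - load_stress                       # g < 0 means failure

      pf = np.mean(g < 0.0)
      print(f"estimated failure probability: {pf:.2e}, reliability: {1 - pf:.5f}")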

  10. Sustained performance of 8 MeV Microtron

    NASA Astrophysics Data System (ADS)

    Sanjeev, Ganesh

    2012-11-01

    Energetic electrons and intense bremsstrahlung radiation from the 8 MeV Microtron are being utilized in a variety of collaborative research programs in radiation physics and allied sciences involving premier institutions of the country and sister universities of the region. The first electron accelerator of its kind in the country, set up at Mangalore University in collaboration with RRCAT Indore and BARC Mumbai, has been facilitating researchers since its inception with its inherent simplicity, ease of construction, low cost and excellent beam quality. A bird's-eye view of the reliability aspects of the machine, the efforts behind the continuous operation of the accelerator and important applications of the accelerator in physical and biological sciences are presented in this paper.

  11. Wind Power Plant Evaluation Naval Auxiliary Landing Field, San Clemente Island, California: Period of Performance 24 September 1999--15 December 2000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olsen, T.L.; Gulman, P.J.; McKenna, E.

    2000-12-11

    The purpose of this report is to evaluate the wind power benefits and impacts to the San Clemente Island wind power system, including energy savings, emissions reduction, system stability, and decreased naval dependence on fossil fuel at the island. The primary goal of the SCI wind power system has been to operate with the existing diesel power plant and provide equivalent or better power quality and system reliability than the existing diesel system. The wind system is intended to reduce, as far as possible, the use of diesel fuel and the inherent generation of nitrogen oxide emissions and other pollutants.

  12. The inherent weaknesses in industrial control systems devices; hacking and defending SCADA systems

    NASA Astrophysics Data System (ADS)

    Bianco, Louis J.

    The North American Electric Reliability Corporation (NERC) is about to enforce their NERC Critical Infrastructure Protection (CIP) Version Five and Six requirements on July 1st 2016. The NERC CIP requirements are a set of cyber security standards designed to protect cyber assets essential to the reliable operation of the electric grid. The new Version Five and Six requirements are a major revision to the Version Three (currently enforced) requirements. The new requirements also bring substations into scope alongside Energy Control Centers. When the Version Five requirements were originally drafted they were vague, causing in-depth discussions throughout the industry. The ramifications of these requirements have made owners look at their systems in depth, questioning how much money it will take to meet these requirements. Some owners saw backing down from routable networks to non-routable as a means to save money, as they would be held to fewer requirements within the standards. Some owners saw removing routable connections as a proper security move. The purpose of this research was to uncover the inherent weaknesses in Industrial Control Systems (ICS) devices, to show how ICS devices can be hacked, and to identify potential protections for these Critical Infrastructure devices. In addition, this research also aimed to validate the decision to move from external routable connectivity to non-routable connectivity as a security measure and not as a means of savings. The results reveal that in order to ultimately protect Industrial Control Systems they must be removed from the Internet and all bi-directional external routable connections must be removed. Furthermore, non-routable serial connections should be utilized, and these non-routable serial connections should be encrypted on different layers of the OSI model. The research concluded that most weaknesses in SCADA systems are due to the inherent weaknesses in ICS devices and, because of these weaknesses, human intervention is the biggest threat to SCADA systems.

  13. A stellar tracking reference system

    NASA Technical Reports Server (NTRS)

    Klestadt, B.

    1971-01-01

    A stellar attitude reference system concept for satellites was studied which promises to permit continuous precision pointing of payloads with accuracies of 0.001 degree without the use of gyroscopes. It is accomplished with the use of a single, clustered star tracker assembly mounted on a non-orthogonal, two gimbal mechanism, driven so as to unwind satellite orbital and orbit precession rates. A set of eight stars was found which assures the presence of an adequate inertial reference on a continuous basis in an arbitrary orbit. Acquisition and operational considerations were investigated and inherent reference redundancy/reliability was established. Preliminary designs for the gimbal mechanism, its servo drive, and the star tracker cluster with its associated signal processing were developed for a baseline sun-synchronous, noon-midnight orbit. The functions required of the onboard computer were determined and the equations to be solved were found. In addition detailed error analyses were carried out, based on structural, thermal and other operational considerations.

  14. Resistance controllability and variability improvement in a TaOx-based resistive memory for multilevel storage application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prakash, A.; Song, J.; Hwang, H.

    In order to obtain reliable multilevel cell (MLC) characteristics, resistance controllability between the different resistance levels is required especially in resistive random access memory (RRAM), which is prone to resistance variability mainly due to its intrinsic random nature of defect generation and filament formation. In this study, we have thoroughly investigated the multilevel resistance variability in a TaOx-based nanoscale (<30 nm) RRAM operated in MLC mode. It is found that the resistance variability not only depends on the conductive filament size but also is a strong function of oxygen vacancy concentration in it. Based on the insights gained through experimental observations and simulation, it is suggested that forming thinner but denser conductive filament may greatly improve the temporal resistance variability even at low operation current despite the inherent stochastic nature of resistance switching process.

  15. Photovoltaics as an operating energy system

    NASA Astrophysics Data System (ADS)

    Jones, G. J.; Post, H. N.; Thomas, M. G.

    In the short time since the discovery of the modern solar cell in 1954, terrestrial photovoltaic power system technology has matured in all areas, from collector reliability to system and subsystem design and operations. Today's PV systems are finding widespread use in powering loads where conventional sources are either unavailable, unreliable, or too costly. A broad range of applications is possible because of the modularity of the technology---it can be used to power loads ranging from less than a watt to several megawatts. This inherent modularity makes PV an excellent choice to play a major role in rural electrification in the developing world. The future for grid-connected photovoltaic systems is also very promising. Indications are that several of today's technologies, at higher production rates and in megawatt-sized installations, will generate electricity in the vicinity of $0.12/kWh in the near future.

  16. Technology verification phase. Dynamic isotope power system. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halsey, D.G.

    1982-03-10

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed while the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique and analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total h of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance. (LCL)

  17. Photon extraction and conversion for scalable ion-trap quantum computing

    NASA Astrophysics Data System (ADS)

    Clark, Susan; Benito, Francisco; McGuinness, Hayden; Stick, Daniel

    2014-03-01

    Trapped ions represent one of the most mature and promising systems for quantum information processing. They have high-fidelity one- and two-qubit gates, long coherence times, and their qubit states can be reliably prepared and detected. Taking advantage of these inherent qualities in a system with many ions requires a means of entangling spatially separated ion qubits. One architecture achieves this entanglement through the use of emitted photons to distribute quantum information - a favorable strategy if photon extraction can be made efficient and reliable. Here I present results for photon extraction from an ion in a cavity formed by integrated optics on a surface trap, as well as results in frequency converting extracted photons for long distance transmission or interfering with photons from other types of optically active qubits. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U. S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  18. Molecular Filters for Noise Reduction.

    PubMed

    Laurenti, Luca; Csikasz-Nagy, Attila; Kwiatkowska, Marta; Cardelli, Luca

    2018-06-19

    Living systems are inherently stochastic and operate in a noisy environment, yet despite all these uncertainties, they perform their functions in a surprisingly reliable way. The biochemical mechanisms used by natural systems to tolerate and control noise are still not fully understood, and this issue also limits our capacity to engineer reliable, quantitative synthetic biological circuits. We study how representative models of biochemical systems propagate and attenuate noise, accounting for intrinsic as well as extrinsic noise. We investigate three molecular noise-filtering mechanisms, study their noise-reduction capabilities and limitations, and show that nonlinear dynamics such as complex formation are necessary for efficient noise reduction. We further suggest that the derived molecular filters are widespread in gene expression and regulation and, particularly, that microRNAs can serve as such noise filters. To our knowledge, our results provide new insight into how biochemical networks control noise and could be useful to build robust synthetic circuits. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  19. Probabilistic assessment of dynamic system performance. Part 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belhadj, Mohamed

    1993-01-01

    Accurate prediction of dynamic system failure behavior can be important for the reliability and risk analyses of nuclear power plants, as well as for their backfitting to satisfy given constraints on overall system reliability, or optimization of system performance. Global analysis of dynamic systems through investigating the variations in the structure of the attractors of the system and the domains of attraction of these attractors as a function of the system parameters is also important for nuclear technology in order to understand the fault-tolerance as well as the safety margins of the system under consideration and to ensure safe operation of nuclear reactors. Such a global analysis would be particularly relevant to future reactors with inherent or passive safety features that are expected to rely on natural phenomena rather than active components to achieve and maintain safe shutdown. Conventionally, failure and global analysis of dynamic systems necessitate the utilization of different methodologies which have computational limitations on the system size that can be handled. Using a Chapman-Kolmogorov interpretation of system dynamics, a theoretical basis is developed that unifies these methodologies as special cases and which can be used for a comprehensive safety and reliability analysis of dynamic systems.
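    For readers unfamiliar with the Chapman-Kolmogorov interpretation mentioned above, the sketch below shows the discrete-time version of the idea on a hypothetical three-state (nominal/degraded/failed) system: multi-step transition probabilities factor through intermediate times, P(n+m) = P(n)P(m). The transition matrix is illustrative only; the paper's formulation is far more general.

      import numpy as np

      # One-step transition matrix for a hypothetical three-state system:
      # 0 = nominal, 1 = degraded, 2 = failed (absorbing). Values are illustrative.
      P = np.array([[0.990, 0.008, 0.002],
                    [0.000, 0.970, 0.030],
                    [0.000, 0.000, 1.000]])

      # State probabilities after n steps are p0 @ P^n.
      p0 = np.array([1.0, 0.0, 0.0])
      for n in (10, 100, 1000):
          pn = p0 @ np.linalg.matrix_power(P, n)
          print(f"n={n:5d}  P(nominal)={pn[0]:.4f}  P(failed)={pn[2]:.4f}")

      # Consistency check of the Chapman-Kolmogorov relation for n = 6 = 2 + 4:
      assert np.allclose(np.linalg.matrix_power(P, 6),
                         np.linalg.matrix_power(P, 2) @ np.linalg.matrix_power(P, 4))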

  20. Pitfalls and Precautions When Using Predicted Failure Data for Quantitative Analysis of Safety Risk for Human Rated Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hatfield, Glen S.; Hark, Frank; Stott, James

    2016-01-01

    Launch vehicle reliability analysis is largely dependent upon using predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account risks attributable to manufacturing, assembly, and process controls. These sources often dominate component-level reliability or risk of failure probability. While the consequences of failure are often understood in assessing risk, using predicted values in a risk model to estimate the probability of occurrence will likely underestimate the risk. Managers and decision makers often use the probability of occurrence in determining whether to accept the risk or require a design modification. Due to the absence of system-level test and operational data inherent in aerospace applications, the actual risk threshold for acceptance may not be appropriately characterized for decision making purposes. This paper will establish a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach will provide a set of guidelines that may be useful to arrive at a more realistic quantification of risk prior to acceptance by a program.
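    To make the point concrete, the sketch below converts a handbook-style predicted failure rate (quoted per 10^6 operating hours) into a probability of occurrence over a mission using the constant-failure-rate exponential model, and repeats the calculation with a purely hypothetical multiplier standing in for the manufacturing and process risks that the prediction does not capture. The rates and the multiplier are illustrative assumptions, not values from the paper.

      import math

      def prob_of_failure(lambda_per_1e6_hr, mission_hours):
          """Constant-failure-rate (exponential) model: P = 1 - exp(-lambda * t).
          Handbook-style rates are quoted per 10^6 operating hours."""
          lam = lambda_per_1e6_hr / 1.0e6
          return 1.0 - math.exp(-lam * mission_hours)

      # Illustrative only: a predicted piece-part rate versus the same rate inflated
      # by an assumed factor to acknowledge process/assembly escapes.
      predicted = prob_of_failure(2.0, 500.0)      # 2 failures / 10^6 hr, 500 hr mission
      with_margin = prob_of_failure(20.0, 500.0)   # hypothetical 10x process factor
      print(f"predicted-only: {predicted:.2e}   with assumed process factor: {with_margin:.2e}")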

  1. Fully Passive Wireless Acquisition of Neuropotentials

    NASA Astrophysics Data System (ADS)

    Schwerdt, Helen N.

    The ability to monitor electrophysiological signals from the sentient brain is requisite to decipher its enormously complex workings and initiate remedial solutions for the vast amount of neurologically-based disorders. Despite immense advancements in creating a variety of instruments to record signals from the brain, the translation of such neurorecording instrumentation to real clinical domains places heavy demands on their safety and reliability, both of which are not entirely portrayed by presently existing implantable recording solutions. In an attempt to lower these barriers, alternative wireless radar backscattering techniques are proposed to render the technical burdens of the implant chip to entirely passive neurorecording processes that transpire in the absence of formal integrated power sources or powering schemes along with any active circuitry. These radar-like wireless backscattering mechanisms are used to conceive of fully passive neurorecording operations of an implantable microsystem. The fully passive device potentially manifests inherent advantages over current wireless implantable and wired recording systems: negligible heat dissipation to reduce risks of brain tissue damage and minimal circuitry for long term reliability as a chronic implant. Fully passive neurorecording operations are realized via intrinsic nonlinear mixing properties of the varactor diode. These mixing and recording operations are directly activated by wirelessly interrogating the fully passive device with a microwave carrier signal. This fundamental carrier signal, acquired by the implant antenna, mixes through the varactor diode along with the internal targeted neuropotential brain signals to produce higher frequency harmonics containing the targeted neuropotential signals. These harmonics are backscattered wirelessly to the external interrogator that retrieves and recovers the original neuropotential brain signal. The passive approach removes the need for internal power sources and may alleviate heat trauma and reliability issues that limit practical implementation of existing implantable neurorecorders.

  2. Advanced Stirling Convertor Heater Head Durability and Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Krause, David L.; Shah, Ashwin R.; Korovaichuk, Igor; Kalluri, Sreeramesh

    2008-01-01

    The National Aeronautics and Space Administration (NASA) has identified the high efficiency Advanced Stirling Radioisotope Generator (ASRG) as a candidate power source for long duration Science missions, such as lunar applications, Mars rovers, and deep space missions, that require reliable design lifetimes of up to 17 years. Resistance to creep deformation of the MarM-247 heater head (HH), a structurally critical component of the ASRG Advanced Stirling Convertor (ASC), under high temperatures (up to 850 C) is a key design driver for durability. Inherent uncertainties in the creep behavior of the thin-walled HH and the variations in the wall thickness, control temperature, and working gas pressure need to be accounted for in the life and reliability prediction. Due to the availability of very limited test data, assuring life and reliability of the HH is a challenging task. The NASA Glenn Research Center (GRC) has adopted an integrated approach combining available uniaxial MarM-247 material behavior testing, HH benchmark testing and advanced analysis in order to demonstrate the integrity, life and reliability of the HH under expected mission conditions. The proposed paper describes analytical aspects of the deterministic and probabilistic approaches and results. The deterministic approach involves development of the creep constitutive model for the MarM-247 (akin to the Oak Ridge National Laboratory master curve model used previously for Inconel 718 (Special Metals Corporation)) and nonlinear finite element analysis to predict the mean life. The probabilistic approach includes evaluation of the effect of design variable uncertainties in material creep behavior, geometry and operating conditions on life and reliability for the expected life. The sensitivity of the uncertainties in the design variables on the HH reliability is also quantified, and guidelines to improve reliability are discussed.

  3. High-reliability gas-turbine combined-cycle development program: Phase II. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hecht, K.G.; Sanderson, R.A.; Smith, M.J.

    This three-volume report presents the results of Phase II of the multiphase EPRI-sponsored High-Reliability Gas Turbine Combined-Cycle Development Program whose goal is to achieve a highly reliable gas turbine combined-cycle power plant, available by the mid-1980s, which would be an economically attractive baseload generation alternative for the electric utility industry. The Phase II program objective was to prepare the preliminary design of this power plant. This volume presents information on the reliability, availability, and maintainability (RAM) analysis of a representative plant and the preliminary design of the gas turbine, the gas turbine ancillaries, and the balance of plant including the steam turbine generator. To achieve the program goals, a gas turbine was incorporated which combined proven reliability characteristics with improved performance features. This gas turbine, designated the V84.3, is the result of a cooperative effort between Kraftwerk Union AG and United Technologies Corporation. Gas turbines of similar design operating in Europe under baseload conditions have demonstrated mean time between failures in excess of 40,000 hours. The reliability characteristics of the gas turbine ancillaries and balance-of-plant equipment were improved through system simplification and component redundancy and by selection of components with inherent high reliability. A digital control system was included with logic, communications, sensor redundancy, and manual backup. An independent condition monitoring and diagnostic system was also included. Program results provide the preliminary design of a gas turbine combined-cycle baseload power plant. This power plant has a predicted mean time between failure of nearly twice the 3000-hour EPRI goal. The cost of added reliability features is offset by improved performance, which results in a comparable specific cost and an 8% lower cost of electricity compared to present market offerings.

  4. Data for Preparedness Metrics: Legal, Economic, and Operational

    PubMed Central

    Potter, Margaret A.; Houck, Olivia C.; Miner, Kathleen; Shoaf, Kimberley

    2013-01-01

    Tracking progress toward the goal of preparedness for public health emergencies requires a foundation in evidence derived both from scientific inquiry and from preparedness officials and professionals. Proposed in this article is a conceptual model for this task from the perspective of the Centers for Disease Control and Prevention–funded Preparedness and Emergency Response Research Centers. The necessary data capture the areas of responsibility of not only preparedness professionals but also legislative and executive branch officials. It meets the criteria of geographic specificity, availability in standardized and reliable measures, parameterization as quantitative values or qualitative distinction, and content validity. The technical challenges inherent in preparedness tracking are best resolved through consultation with the jurisdictions and communities whose preparedness is at issue. PMID:23903389

  5. Developing an Approach for Analyzing and Verifying System Communication

    NASA Technical Reports Server (NTRS)

    Stratton, William C.; Lindvall, Mikael; Ackermann, Chris; Sibol, Deane E.; Godfrey, Sally

    2009-01-01

    This slide presentation reviews a project for developing an approach for analyzing and verifying inter-system communications. The motivation for the study was that software systems in the aerospace domain are inherently complex and operate under tight constraints for resources, so that systems of systems must communicate with each other to fulfill their tasks. These systems of systems require reliable communications. The technical approach was to develop a system, DynSAVE, that detects communication problems among the systems. The project enhanced the proven Software Architecture Visualization and Evaluation (SAVE) tool to create Dynamic SAVE (DynSAVE). The approach monitors and records low-level network traffic, converts the low-level traffic into meaningful messages, and displays the messages in a way that allows issues to be detected.

  6. Resilient Control Systems Practical Metrics Basis for Defining Mission Impact

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craig G. Rieger

    "Resilience” describes how systems operate at an acceptable level of normalcy despite disturbances or threats. In this paper we first consider the cognitive, cyber-physical interdependencies inherent in critical infrastructure systems and how resilience differs from reliability to mitigate these risks. Terminology and metrics basis are provided to integrate the cognitive, cyber-physical aspects that should be considered when defining solutions for resilience. A practical approach is taken to roll this metrics basis up to system integrity and business case metrics that establish “proper operation” and “impact.” A notional chemical processing plant is the use case for demonstrating how the system integritymore » metrics can be applied to establish performance, and« less

  7. Overview of NASA Ultracapacitor Technology

    NASA Technical Reports Server (NTRS)

    Hill, Curtis W.

    2017-01-01

    NASA needed a lower mass, reliable, and safe medium for energy storage for ground-based and space applications. Existing industry electrochemical systems are limited in weight, charge rate, energy density, reliability, and safety. We chose a ceramic perovskite material for development, due to its high inherent dielectric properties, long history of use in the capacitor industry, and the safety of a solid state material.

  8. High-reliability gas-turbine combined-cycle development program: Phase II, Volume 3. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hecht, K.G.; Sanderson, R.A.; Smith, M.J.

    This three-volume report presents the results of Phase II of the multiphase EPRI-sponsored High-Reliability Gas Turbine Combined-Cycle Development Program whose goal is to achieve a highly reliable gas turbine combined-cycle power plant, available by the mid-1980s, which would be an economically attractive baseload generation alternative for the electric utility industry. The Phase II program objective was to prepare the preliminary design of this power plant. The power plant was addressed in three areas: (1) the gas turbine, (2) the gas turbine ancillaries, and (3) the balance of plant including the steam turbine generator. To achieve the program goals, a gas turbine was incorporated which combined proven reliability characteristics with improved performance features. This gas turbine, designated the V84.3, is the result of a cooperative effort between Kraftwerk Union AG and United Technologies Corporation. Gas turbines of similar design operating in Europe under baseload conditions have demonstrated mean time between failures in excess of 40,000 hours. The reliability characteristics of the gas turbine ancillaries and balance-of-plant equipment were improved through system simplification and component redundancy and by selection of components with inherent high reliability. A digital control system was included with logic, communications, sensor redundancy, and manual backup. An independent condition monitoring and diagnostic system was also included. Program results provide the preliminary design of a gas turbine combined-cycle baseload power plant. This power plant has a predicted mean time between failure of nearly twice the 3000-h EPRI goal. The cost of added reliability features is offset by improved performance, which results in a comparable specific cost and an 8% lower cost of electricity compared to present market offerings.

  9. Reliability of a Parallel Pipe Network

    NASA Technical Reports Server (NTRS)

    Herrera, Edgar; Chamis, Christopher (Technical Monitor)

    2001-01-01

    The goal of this NASA-funded research is to advance research and education objectives in theoretical and computational probabilistic structural analysis, reliability, and life prediction methods for improved aerospace and aircraft propulsion system components. Reliability methods are used to quantify response uncertainties due to inherent uncertainties in design variables. In this report, several reliability methods are applied to a parallel pipe network. The observed responses are the head delivered by a main pump and the head values of two parallel lines at certain flow rates. The probability that the flow rates in the lines will be less than their specified minimums will be discussed.

  10. Scheduler for multiprocessor system switch with selective pairing

    DOEpatents

    Gara, Alan; Gschwind, Michael Karl; Salapura, Valentina

    2015-01-06

    System, method and computer program product for scheduling threads in a multiprocessing system with selective pairing of processor cores for increased processing reliability. A selective pairing facility is provided that selectively connects, i.e., pairs, multiple microprocessor or processor cores to provide one highly reliable thread (or thread group). The method configures the selective pairing facility to use checking to provide one highly reliable thread for high-reliability execution and to allocate threads to the corresponding processor cores indicating a need for hardware checking. The method configures the selective pairing facility to provide multiple independent cores and to allocate threads to the corresponding processor cores indicating inherent resilience.

  11. Drawing conclusions: a re-examination of empirical and conceptual bases for psychological evaluation of children from their drawings.

    PubMed

    Thomas, G V; Jolley, R P

    1998-05-01

    Although consideration of children's art work (usually drawings) in clinical investigations of children referred to psychologists is fairly common, there is little evidence for the reliability and validity of such assessments. We consider a variety of possible mechanisms which could operate to influence the characteristics of children's drawings, and review the evidence that such mechanisms operate to allow meaningful psychological evaluations of children from their drawings. The problem for making a reliable interpretation of the significance of a drawing is that a given feature could plausibly support several very different interpretations, depending on which of many possible processes was active or dominant in the production of the drawing. Evidence from studies of clinical populations and from experimental studies with non-selected samples is reviewed in the light of these possibilities. The review indicates that drawings are inaccurate and unreliable as personality or state assessments but can be influenced by children's emotional attitudes towards the topics depicted. The form of that expression, however, may be personal and idiosyncratic. Analogue studies of these effects undertaken with non-clinical samples under controlled conditions have produced mixed results. At best the reported effects are small. Children's drawings on their own are too complexly determined and inherently ambiguous to be reliable sole indicators of the emotional experiences of the children who drew them. Further research is needed to establish the extent to which such drawings can usefully facilitate assessment of children by other means or provide useful support as one of several converging lines of evidence.

  12. "Long life" DC brush motor for use on the Mars surveyor program

    NASA Technical Reports Server (NTRS)

    Braun, David; Noon, Don

    1998-01-01

    DC brush motors have several qualities which make them very attractive for space flight applications. Their mechanical commutation is simple and lightweight, requiring no external sensing and control in order to function properly. They are extremely efficient in converting electrical energy into mechanical energy. Efficiencies over 80% are not uncommon, resulting in high power throughput to weight ratios. However, the inherent unreliability and short life of sliding electrical contacts, especially in vacuum, have driven previous programs to utilize complex brushless DC motors or less efficient stepper motors. The Mars Surveyor Program (MSP'98) and the Shuttle Radar Topography Mission (SRTM) have developed a reliable "long life" brush type DC motor for operation in low temperature, low pressure CO2 and N2, utilizing silver-graphite brushes. The original intent was to utilize this same motor for SRTM's space operation, but the results thus far have been unsatisfactory in vacuum. This paper describes the design, test, and results of this development.

  13. Inherently Safe Fission Power System for Lunar Outposts

    NASA Astrophysics Data System (ADS)

    Schriener, Timothy M.; El-Genk, Mohamed S.

    2013-09-01

    This paper presents the Solid Core-Sectored Compact Reactor (SC-SCoRe) and power system for future lunar outposts. The power system nominally provides 38 kWe continuously for 21 years, employs static components and has no single point failures in reactor cooling or power generation. The reactor core has six sectors, each with a separate pair of primary and secondary loops with liquid NaK-56 working fluid, thermoelectric (TE) power conversion and heat-pipe radiator panels. The electromagnetic (EM) pumps in the primary and secondary loops, powered with separate TE power units, ensure operation reliability and passive decay heat removal from the reactor after shutdown. The reactor poses no radiological concerns during launch, and remains sufficiently subcritical, with the radial reflector disassembled, when submerged in wet sand and the core flooded with seawater, following a launch abort accident. After 300 years of storage below grade on the Moon, the total radioactivity in the post-operation reactor drops below 164 Ci, low enough for recovery and safe handling of the reactor.

  14. A Mathematical Model of Marine Diesel Engine Speed Control System

    NASA Astrophysics Data System (ADS)

    Sinha, Rajendra Prasad; Balaji, Rajoo

    2018-02-01

    A diesel engine is inherently an unstable machine and requires a reliable control system to regulate its speed for safe and efficient operation. Also, the diesel engine may operate at fixed or variable speeds depending upon the user's needs, and accordingly the speed control system should have the essential features to fulfil these requirements. This paper proposes a mathematical model of a marine diesel engine speed control system with a droop governing function. The mathematical model includes static and dynamic characteristics of the control loop components. A model of the static characteristic of the rotating flyweight speed-sensing element provides insight into the speed droop features of the speed controller. Because of its large size and long time delay, the turbocharged diesel engine is represented as a first-order system or sometimes even simplified to a pure integrator with constant gain, which is considered acceptable in the control literature. The proposed model is mathematically less complex and quick to use for preliminary analysis of diesel engine speed controller performance.
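    To illustrate the droop-governing behaviour the abstract describes, the sketch below integrates a hypothetical per-unit model (first-order fuel actuator plus rotor inertia) with a proportional droop governor using forward Euler. The parameter values, the 50% load step, and the model structure are illustrative assumptions, not the paper's model.

      # Hypothetical per-unit model: first-order fuel actuator plus rotor inertia,
      # with proportional droop governing. Values are illustrative, not the paper's.
      R = 0.04          # 4 % speed droop
      tau_f = 0.5       # fuel actuator time constant [s]
      two_H = 4.0       # rotor inertia constant 2H [s]
      n_ref = 1.0       # per-unit speed setpoint

      n, fuel, load = 1.0, 0.0, 0.0
      dt = 0.01
      for step in range(int(60.0 / dt)):
          t = step * dt
          if t >= 20.0:
              load = 0.5                              # 50 % load step at t = 20 s
          fuel_cmd = (n_ref - n) / R                  # droop: fuel index from speed error
          fuel += dt * (fuel_cmd - fuel) / tau_f      # fuel rack dynamics
          n += dt * (fuel - load) / two_H             # torque balance on the rotor

      print(f"speed after 50% load step: {n:.4f} pu (droop relation predicts {n_ref - R*0.5:.4f})")

    With a 4% droop setting, the simulated steady-state speed settles about 0.02 pu below the setpoint under a 0.5 pu load, which is the offset the droop relation is designed to produce.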

  15. Properties and Performance Attributes of Novel Co-Extruded Polyolefin Battery Separator Materials. Part 1; Mechanical Properties

    NASA Technical Reports Server (NTRS)

    Baldwin, Richard S.; Guzik, Monica; Skierski, Michael

    2011-01-01

    As NASA prepares for its next era of manned spaceflight missions, advanced energy storage technologies are being developed and evaluated to address future mission needs and technical requirements and to provide new mission-enabling technologies. Cell-level components for advanced lithium-ion batteries possessing higher energy, more reliable performance and enhanced, inherent safety characteristics are actively under development within the NASA infrastructure. A key component for safe and reliable cell performance is the cell separator, which separates the two energetic electrodes and functions to prevent the occurrence of an internal short-circuit while enabling ionic transport. Recently, a new generation of co-extruded separator films has been developed by ExxonMobil Chemical and introduced into their battery separator product portfolio. Several grades of this new separator material have been evaluated with respect to dynamic mechanical properties and safety-related performance attributes. This paper presents the results of these evaluations in comparison to a current state-of-the-practice separator material. The results are discussed with respect to potential opportunities to enhance the inherent safety characteristics and reliability of future, advanced lithium-ion cell chemistries.

  16. Legal Issues inherent in space shuttle operations. [reviewed by NASA Deputy General Counsel

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The legal issues inherent in NASA's proceeding into the day-to-day operations of the space shuttle and other elements of the Space Transportation System are considered in light of the National Aeronautics and Space Act of 1958. Based on this review, it was concluded that there is no immediate need for substantive amendments to that legislation.

  17. 16 CFR 1211.13 - Inherent force activated secondary door sensors.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    16 Commercial Practices 2 (2010-01-01): § 1211.13 Inherent force activated secondary door sensors. (a) Normal operation test. (1) A force... when the door applies a 15 pound (66.7 N) or less force in the down or closing direction and when the...

  18. Interim reliability evaluation program, Browns Ferry fault trees

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, M.E.

    1981-01-01

    An abbreviated fault tree method is used to evaluate and model Browns Ferry systems in the Interim Reliability Evaluation programs, simplifying the recording and displaying of events, yet maintaining the system of identifying faults. The level of investigation is not changed. The analytical thought process inherent in the conventional method is not compromised. But the abbreviated method takes less time, and the fault modes are much more visible.
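    A minimal sketch of how a small fault tree of this kind is evaluated numerically is shown below, assuming independent basic events so that AND gates multiply probabilities and OR gates combine complements. The tree topology and event probabilities are illustrative placeholders, not Browns Ferry data.

      # Minimal fault-tree evaluation with independent basic events (illustrative).
      def p_and(*probs):
          out = 1.0
          for p in probs:
              out *= p
          return out

      def p_or(*probs):
          out = 1.0
          for p in probs:
              out *= (1.0 - p)
          return 1.0 - out

      pump_a_fails = 1.0e-3
      pump_b_fails = 1.0e-3
      valve_fails  = 5.0e-4
      signal_fails = 2.0e-4

      # Top event: "no injection flow" = (both pumps fail) OR valve fails OR signal fails
      no_flow = p_or(p_and(pump_a_fails, pump_b_fails), valve_fails, signal_fails)
      print(f"top event probability: {no_flow:.3e}")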

  19. Projection Operator: A Step Towards Certification of Adaptive Controllers

    NASA Technical Reports Server (NTRS)

    Larchev, Gregory V.; Campbell, Stefan F.; Kaneshige, John T.

    2010-01-01

    One of the major barriers to wider use of adaptive controllers in commercial aviation is the lack of appropriate certification procedures. In order to be certified by the Federal Aviation Administration (FAA), an aircraft controller is expected to meet a set of guidelines on functionality and reliability while not negatively impacting other systems or safety of aircraft operations. Due to their inherent time-variant and non-linear behavior, adaptive controllers cannot be certified via the metrics used for linear conventional controllers, such as gain and phase margin. Projection Operator is a robustness augmentation technique that bounds the output of a non-linear adaptive controller while conforming to the Lyapunov stability rules. It can also be used to limit the control authority of the adaptive component so that the said control authority can be arbitrarily close to that of a linear controller. In this paper we will present the results of applying the Projection Operator to a Model-Reference Adaptive Controller (MRAC), varying the amount of control authority, and comparing the controller's performance and stability characteristics with those of a linear controller. We will also show how adjusting Projection Operator parameters can make it easier for the controller to satisfy the certification guidelines by enabling a tradeoff between the controller's performance and robustness.
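    For readers unfamiliar with the operator itself, the sketch below implements one common smooth form of the projection operator used in MRAC adaptive laws (the boundary-layer, Pomet-Praly style form): the raw parameter update is left untouched inside a nominal ball and is bent tangentially as the estimate enters an outer layer, which is how the parameter estimates stay bounded. The bound theta_max, the layer width eps, and the example vectors are illustrative assumptions, not values from the paper.

      import numpy as np

      def proj(theta, y, theta_max=10.0, eps=0.1):
          """Smooth projection of the raw adaptive update y at parameter estimate
          theta. Unmodified in the interior; bent tangentially in the boundary
          layer so ||theta|| stays within theta_max*sqrt(1+eps)."""
          f = (theta @ theta - theta_max**2) / (eps * theta_max**2)
          grad_f = 2.0 * theta / (eps * theta_max**2)
          if f > 0.0 and y @ grad_f > 0.0:
              return y - grad_f * (grad_f @ y) * f / (grad_f @ grad_f)
          return y

      # Example: estimate slightly outside the nominal bound, with an update that
      # would otherwise push it further out.
      theta = np.array([10.2, 1.0])
      raw_update = np.array([5.0, 0.5])
      print("projected update:", proj(theta, raw_update))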

  20. Probabilistic structural mechanics research for parallel processing computers

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Martin, William R.

    1991-01-01

    Aerospace structures and spacecraft are a complex assemblage of structural components that are subjected to a variety of complex, cyclic, and transient loading conditions. Significant modeling uncertainties are present in these structures, in addition to the inherent randomness of material properties and loads. To properly account for these uncertainties in evaluating and assessing the reliability of these components and structures, probabilistic structural mechanics (PSM) procedures must be used. Much research has focused on basic theory development and the development of approximate analytic solution methods in random vibrations and structural reliability. Practical application of PSM methods was hampered by their computationally intense nature. Solution of PSM problems requires repeated analyses of structures that are often large, and exhibit nonlinear and/or dynamic response behavior. These methods are all inherently parallel and ideally suited to implementation on parallel processing computers. New hardware architectures and innovative control software and solution methodologies are needed to make solution of large scale PSM problems practical.
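    Because the abstract stresses that PSM solutions are inherently parallel (repeated, independent structural analyses), a minimal sketch of that idea follows: independent Monte Carlo streams are farmed out to worker processes and their failure fractions averaged. The simple resistance-minus-load limit state and its distributions are placeholders for the much larger structural analyses the paper has in mind.

      import numpy as np
      from multiprocessing import Pool

      def failure_fraction(seed, samples=100_000):
          """One worker's share of a simple limit-state Monte Carlo (illustrative
          resistance-minus-load model, not a specific PSM formulation)."""
          rng = np.random.default_rng(seed)
          resistance = rng.normal(500.0, 50.0, samples)   # e.g. MPa
          load = rng.normal(350.0, 40.0, samples)
          return np.mean(resistance - load < 0.0)

      if __name__ == "__main__":
          seeds = range(8)                    # one independent stream per worker
          with Pool() as pool:
              parts = pool.map(failure_fraction, seeds)
          print(f"estimated failure probability: {np.mean(parts):.3e}")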

  1. Reliable enumeration of malaria parasites in thick blood films using digital image analysis.

    PubMed

    Frean, John A

    2009-09-23

    Quantitation of malaria parasite density is an important component of laboratory diagnosis of malaria. Microscopy of Giemsa-stained thick blood films is the conventional method for parasite enumeration. Accurate and reproducible parasite counts are difficult to achieve, because of inherent technical limitations and human inconsistency. Inaccurate parasite density estimation may have adverse clinical and therapeutic implications for patients, and for endpoints of clinical trials of anti-malarial vaccines or drugs. Digital image analysis provides an opportunity to improve performance of parasite density quantitation. Accurate manual parasite counts were done on 497 images of a range of thick blood films with varying densities of malaria parasites, to establish a uniformly reliable standard against which to assess the digital technique. By utilizing descriptive statistical parameters of parasite size frequency distributions, particle counting algorithms of the digital image analysis programme were semi-automatically adapted to variations in parasite size, shape and staining characteristics, to produce optimum signal/noise ratios. A reliable counting process was developed that requires no operator decisions that might bias the outcome. Digital counts were highly correlated with manual counts for medium to high parasite densities, and slightly less well correlated with conventional counts. At low densities (fewer than 6 parasites per analysed image) signal/noise ratios were compromised and correlation between digital and manual counts was poor. Conventional counts were consistently lower than both digital and manual counts. Using open-access software and avoiding custom programming or any special operator intervention, accurate digital counts were obtained, particularly at high parasite densities that are difficult to count conventionally. The technique is potentially useful for laboratories that routinely perform malaria parasite enumeration. The requirements of a digital microscope camera, personal computer and good quality staining of slides are potentially reasonably easy to meet.
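    As a rough sketch of the kind of size-filtered particle counting the abstract describes, the code below thresholds an image, labels connected components, and counts only objects whose pixel area falls inside a window that would, in practice, be derived from the parasite size-frequency distribution. The threshold and size limits are placeholders, not the study's calibrated values, and the toy array stands in for a Giemsa-stained thick-film field.

      import numpy as np
      from scipy import ndimage

      def count_parasites(image, intensity_threshold, min_px, max_px):
          """Count stained objects whose pixel area falls inside the size window.
          Thresholds here are placeholders, not the values used in the paper."""
          binary = image > intensity_threshold          # segment stained material
          labels, n_objects = ndimage.label(binary)     # connected components
          if n_objects == 0:
              return 0
          sizes = np.bincount(labels.ravel())[1:]       # area of each object (skip background)
          return int(np.sum((sizes >= min_px) & (sizes <= max_px)))

      # Toy 2-D "field" with three bright blobs of different sizes
      field = np.zeros((60, 60))
      field[5:8, 5:8] = 1.0       # 9 px  -> counted
      field[20:23, 30:33] = 1.0   # 9 px  -> counted
      field[40:41, 40:42] = 1.0   # 2 px  -> rejected as debris
      print(count_parasites(field, 0.5, min_px=4, max_px=100))   # -> 2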

  2. Autonomous navigation system based on GPS and magnetometer data

    NASA Technical Reports Server (NTRS)

    Julie, Thienel K. (Inventor); Richard, Harman R. (Inventor); Bar-Itzhack, Itzhack Y. (Inventor)

    2004-01-01

    This invention is drawn to an autonomous navigation system using Global Positioning System (GPS) and magnetometers for low Earth orbit satellites. As a magnetometer is reliable and always provides information on spacecraft attitude, rate, and orbit, the magnetometer-GPS configuration solves the GPS initialization problem, decreasing the convergence time for the navigation estimate and improving the overall accuracy. Eventually the magnetometer-GPS configuration enables the system to avoid costly and inherently less reliable gyros for rate estimation. Being autonomous, this invention would provide for black-box spacecraft navigation, producing attitude, orbit, and rate estimates without any ground input, with high accuracy and reliability.

  3. Dynamics of aesthetic appreciation

    NASA Astrophysics Data System (ADS)

    Carbon, Claus-Christian

    2012-03-01

    Aesthetic appreciation is a complex cognitive process with inherent aspects of cold as well as hot cognition. Empirical research from the last decades has shown that evaluations of aesthetic appreciation are highly reliable. Most frequently, facial attractiveness was used as the corner case for investigating aesthetic appreciation. Evaluating facial attractiveness shows indeed high internal consistencies and impressively high inter-rater reliabilities, even across cultures. Although this indicates general and stable mechanisms underlying aesthetic appreciation, it is also obvious that our taste for specific objects changes dynamically. Aesthetic appreciation of artificial object categories, such as fashion, design or art, is inherently very dynamic. Gaining insights into the cognitive mechanisms that trigger and enable corresponding changes of aesthetic appreciation is of particular interest for research, as this will provide possibilities for modeling aesthetic appreciation over longer durations and from a dynamic perspective. The present paper refers to a recent two-step model ("the dynamical two-step-model of aesthetic appreciation"), which adapts dynamically and accounts for typical dynamics of aesthetic appreciation found in different research areas such as art history, philosophy and psychology. The first step assumes singular creative sources creating and establishing innovative material towards which, in a second step, people adapt by integrating it into their visual habits. This inherently leads to dynamic changes of the beholders' aesthetic appreciation.

  4. Threshold-voltage modulated phase change heterojunction for application of high density memory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Baihan; Tong, Hao, E-mail: tonghao@hust.edu.cn; Qian, Hang

    2015-09-28

    Phase change random access memory is one of the most important candidates for the next generation non-volatile memory technology. However, the ability to reduce its memory size is compromised by the fundamental limitations inherent in the CMOS technology. While the 0T1R configuration without any additional access transistor shows great advantages in improving the storage density, the leakage current and small operation window limit its application in large-scale arrays. In this work, a phase change heterojunction based on GeTe and n-Si is fabricated to address those problems. The relationship between threshold voltage and doping concentration is investigated, and energy band diagrams and X-ray photoelectron spectroscopy measurements are provided to explain the results. The threshold voltage is modulated to provide a large operational window based on this relationship. The switching performance of the heterojunction is also tested, showing a good reverse characteristic, which could effectively decrease the leakage current. Furthermore, a reliable read-write-erase function is achieved during the tests. The phase change heterojunction is proposed for high-density memory, showing some notable advantages, such as modulated threshold voltage, large operational window, and low leakage current.

  5. Reliability of Wireless Sensor Networks

    PubMed Central

    Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo

    2014-01-01

    Wireless Sensor Networks (WSNs) consist of hundreds or thousands of sensor nodes with limited processing, storage, and battery capabilities. There are several strategies to reduce the power consumption of WSN nodes (thereby increasing the network lifetime) and to increase the reliability of the network (thereby improving the WSN Quality of Service). However, there is an inherent conflict between power consumption and reliability: an increase in reliability usually leads to an increase in power consumption. For example, routing algorithms can send the same packet through different paths (multipath strategy), which is important for reliability but significantly increases the WSN power consumption. In this context, this paper proposes a model for evaluating the reliability of WSNs considering the battery level as a key factor. Moreover, this model is based on the routing algorithms used by WSNs. In order to evaluate the proposed models, three scenarios were considered to show the impact of the power consumption on the reliability of WSNs. PMID:25157553
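    A minimal sketch of the kind of battery-aware reliability roll-up the abstract points to is shown below: each node's reliability is scaled by its remaining battery fraction, a path's reliability is the product over its nodes, and a multipath route succeeds if at least one (assumed independent) path succeeds. The scaling rule and all numbers are illustrative assumptions, not the paper's model.

      # Illustrative battery-aware reliability roll-up; not the paper's formal model.
      def node_reliability(base_reliability, battery_level):
          """Scale a node's reliability by its remaining battery fraction (0..1)."""
          return base_reliability * battery_level

      def path_reliability(nodes):
          r = 1.0
          for base, battery in nodes:
              r *= node_reliability(base, battery)
          return r

      def multipath_reliability(paths):
          """Packet delivered if at least one path works (paths assumed independent)."""
          fail = 1.0
          for path in paths:
              fail *= 1.0 - path_reliability(path)
          return 1.0 - fail

      path_a = [(0.99, 0.9), (0.98, 0.8), (0.99, 0.7)]   # (base reliability, battery)
      path_b = [(0.97, 1.0), (0.99, 0.95)]
      print(f"single path A : {path_reliability(path_a):.3f}")
      print(f"multipath A+B : {multipath_reliability([path_a, path_b]):.3f}")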

  6. Distributed Electrical Energy Systems: Needs, Concepts, Approaches and Vision (in Chinese)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yingchen; Zhang, Jun; Gao, Wenzhong

    Intelligent distributed electrical energy systems (IDEES) are characterized by vast numbers of system components, diversified component types, and difficulties in operation and management, which means that the traditional centralized power system management approach no longer fits their operation. Thus, it is believed that blockchain technology is one of the important feasible technical paths for building future large-scale distributed electrical energy systems. An IDEES inherently has both social and technical characteristics; as a result, a distributed electrical energy system needs to be divided into multiple layers, and at each layer a blockchain is utilized to model and manage its logic and physical functionalities. The blockchains at different layers coordinate with each other to achieve successful operation of the IDEES. Specifically, the multi-layer blockchains, named the 'blockchain group', consist of a distributed data access and service blockchain, an intelligent property management blockchain, a power system analysis blockchain, an intelligent contract operation blockchain, and an intelligent electricity trading blockchain. It is expected that the blockchain group can be self-organized into a complex, autonomous and distributed IDEES. In this complex system, frequent and in-depth interactions and computing will give rise to intelligence, and it is expected that such intelligence can bring stable, reliable and efficient electrical energy production, transmission and consumption.

  7. Graphical programming: A systems approach for telerobotic servicing of space assets

    NASA Technical Reports Server (NTRS)

    Pinkerton, James T.; Mcdonald, Michael J.; Palmquist, Robert D.; Patten, Richard

    1994-01-01

    Satellite servicing is in many ways analogous to subsea robotic servicing in the late 1970's. A cost-effective, reliable, telerobotic capability had to be demonstrated before the oil companies invested money in deep-water robot-serviceable production facilities. In the same sense, aeronautic engineers will not design satellites for telerobotic servicing until such a quantifiable capability has been demonstrated. New space servicing systems will be markedly different from existing space robot systems. Past space manipulator systems, including the Space Shuttle's robot arm, have used master/slave technologies with poor fidelity, slow operating speeds and, most importantly, in-orbit human operators. In contrast, new systems will be capable of precision operations, conducted at higher rates of speed, and be commanded via ground-control communication links. Challenges presented by this environment include achieving a mandated level of robustness and dependability, radiation hardening, minimum weight and power consumption, and a system which accommodates the inherent communication delay between the ground station and the satellite. There is also a need for a user interface which is easy to use, ensures collision-free motions, and is capable of adjusting to an unknown workcell (for repair operations the condition of the satellite may not be known in advance). This paper describes the novel technologies required to deliver such a capability.

  8. Designing robots for industrial environments. [economic factors and vulnerability

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Environmental hazards to industrial robots are summarized. The inherent reliability of the Unimate robot's design is assessed, and the data are used in a management system to bring reliability performance up to a level approaching what is theoretically available. The design is shown to be capable of a mean time between failures of 400 hours and an average up time of 98%. Specific design decisions made in view of application requirements are explored.

  9. The welfare effects of integrating renewable energy into electricity markets

    NASA Astrophysics Data System (ADS)

    Lamadrid, Alberto J.

    The challenges of deploying more renewable energy sources on an electric grid are caused largely by their inherent variability. In this context, energy storage can help make the electric delivery system more reliable by mitigating this variability. This thesis analyzes a series of models for procuring electricity and ancillary services for both individual decision makers and social planners with high penetrations of stochastic wind energy. The results obtained for an individual decision maker using stochastic optimization are ambiguous, with closed-form solutions dependent on technological parameters and no consideration of system reliability. The social planner models correctly reflect the effect of system reliability, and in the case of a Stochastic, Security-Constrained Optimal Power Flow (S-SC-OPF or SuperOPF), determine reserve capacity endogenously so that system reliability is maintained. A single-period SuperOPF shows that including ramping costs in the objective function leads to more wind spilling and increased capacity requirements for reliability. However, this model does not reflect the intertemporal tradeoffs of using Energy Storage Systems (ESS) to improve reliability and mitigate wind variability. The results with the multiperiod SuperOPF determine the optimum use of storage for a typical day, and compare the effects of collocating ESS at wind sites with the same amount of storage (deferrable demand) located at demand centers. The collocated ESS has slightly lower operating costs and spills less wind generation compared to deferrable demand, but the total amount of conventional generating capacity needed for system adequacy is higher. In terms of total system costs, which include the capital cost of conventional generating capacity, the cost with deferrable demand is substantially lower because the daily demand profile is flattened and less conventional generation capacity is then needed for reliability purposes. The analysis also demonstrates that the optimum daily pattern of dispatch and reserves is seriously distorted if the stochastic characteristics of wind generation are ignored.

  10. Interim Status Report for Risk Management for SFRs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jankovsky, Zachary Kyle; Denman, Matthew R.; Groth, Katrina

    2015-10-01

    Accident management is an important component of maintaining risk at acceptable levels for all complex systems, such as nuclear power plants. With the introduction of passive, or inherently safe, reactor designs, the focus has shifted from management by operators to allowing the system's design to take advantage of natural phenomena to manage the accident. Inherently and passively safe designs are laudable, but extreme boundary conditions can nonetheless interfere with the design attributes which facilitate inherent safety, thus resulting in unanticipated and undesirable end states. This report examines an inherently safe and small sodium fast reactor experiencing a variety of beyond design basis events, with the intent of exploring the utility of a Dynamic Bayesian Network to infer the state of the reactor and inform the operator's corrective actions. These inferences also serve to identify the instruments most critical to informing an operator's actions as candidates for hardening against radiation and other extreme environmental conditions that may exist in an accident. This reduction in uncertainty serves to inform ongoing discussions of how small sodium reactors would be licensed and may serve to reduce regulatory risk and cost for such reactors.

  11. Space Shuttle RTOS Bayesian Network

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Beling, Peter A.

    2001-01-01

    With shrinking budgets and the requirement to increase the reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high-priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems, such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, a joint venture between Boeing and Lockheed Martin and the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure of the network and its prior probabilities are elicited from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores. Using a prioritization of measures from the decision-maker, trade-offs between the scores are used to rank order the available set of RTOS candidates.
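
    To make the first stage of the scoring approach concrete, here is a minimal sketch of Bayesian evidence combination; the two-level structure, the measure names and all probabilities are invented placeholders for illustration, not the methodology's actual network:

```python
# Minimal Bayesian scoring sketch: a two-level network where an overall
# "RTOS acceptable" node explains independent evidence nodes (reliability,
# certifiability, cost). Structure and probabilities are invented placeholders,
# not the methodology's actual network.
def posterior_acceptable(evidence, prior=0.5):
    """P(acceptable | evidence) for conditionally independent evidence nodes."""
    # (P(observed rating | acceptable), P(observed rating | not acceptable))
    likelihoods = {
        "reliability_high":   (0.85, 0.30),
        "certifiable":        (0.90, 0.40),
        "cost_within_budget": (0.60, 0.55),   # weak evidence: barely discriminates
    }
    p_acc, p_not = prior, 1.0 - prior
    for item in evidence:
        l_acc, l_not = likelihoods[item]
        p_acc *= l_acc
        p_not *= l_not
    return p_acc / (p_acc + p_not)

score = posterior_acceptable(["reliability_high", "certifiable", "cost_within_budget"])
print(f"score for candidate RTOS: {score:.2f}")   # used later for trade-off ranking
```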

  12. Advancement of a 30 kW Solar Electric Propulsion System Capability for NASA Human and Robotic Exploration Missions

    NASA Technical Reports Server (NTRS)

    Smith, Bryan K.; Nazario, Margaret L.; Manzella, David H.

    2012-01-01

    Solar Electric Propulsion (SEP) has evolved into a demonstrated operational capability, performing station keeping for geosynchronous satellites, enabling challenging deep-space science missions, and assisting in the transfer of satellites from an elliptical Geostationary Transfer Orbit (GTO) to a Geostationary Earth Orbit (GEO). Advancing higher-power SEP systems will enable numerous future applications for human, robotic, and commercial missions. These missions are enabled either by the increased performance of the SEP system or by the cost reductions when compared to conventional chemical propulsion systems. Higher-power SEP systems that provide very high payload for robotic missions also trade favorably for the advancement of human exploration beyond low Earth orbit. Demonstrated reliable systems are required for human space flight, and due to their successful present-day widespread use and inherent high reliability, SEP systems have progressively become a viable entrant into these future human exploration architectures. NASA studies have identified a 30 kW-class SEP capability as the next appropriate evolutionary step, applicable to a wide range of both human and robotic missions. This paper describes the planning options, mission applications, and technology investments for representative 30 kW-class SEP mission concepts under consideration by NASA.

  13. Economics of small fully reusable launch systems (SSTO vs. TSTO)

    NASA Astrophysics Data System (ADS)

    Koelle, Dietrich E.

    1997-01-01

    The paper presents a design and cost comparison of an SSTO vehicle concept with two TSTO vehicle options. It is shown that the feasibility of the ballistic SSTO concept is NOT a matter of technology but of proper vehicle SIZING, which also allows the design to include sufficient performance margin. The cost analysis has been performed with the TRANSCOST model, also using the "Standardized Cost per Flight" definition for the CpF comparison. The results show that a present-technology SSTO for LEO missions is about 30% less expensive than any TSTO vehicle, based on life-cycle cost analysis, in addition to the inherent operational/reliability advantages of a single-stage vehicle. In the case of commercial development and operation, it is estimated that an SSTO vehicle with 400 Mg propellant mass can be flown for some 9 million per mission (94/95) with 14 Mg payload to LEO, 7 Mg to the Space Station orbit, or 2 Mg to a 200/800 km polar orbit. This means a specific transportation cost of 650 /kg (300 $/lb), resp. 3.2 MYr/Mg, to LEO, which is 6-10% of present expendable launch vehicles.
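
    As a quick, hedged consistency check of the quoted figures (the currency symbol is missing in the record, so the per-mission cost is assumed here to be roughly 9 million in 1994/95 currency units), dividing the cost per flight by the LEO payload mass reproduces the stated specific transportation cost:

```python
# Rough consistency check of the TRANSCOST figures quoted above.
# Assumption: "some 9 Million per mission" refers to ~9e6 currency units (1994/95).
cost_per_flight = 9e6          # currency units per mission (assumed)
payload_to_leo_kg = 14_000     # 14 Mg payload to LEO

specific_cost = cost_per_flight / payload_to_leo_kg
print(f"Specific transportation cost: {specific_cost:.0f} per kg")  # ~643, consistent with the quoted ~650 /kg
print(f"Per pound: {specific_cost * 0.4536:.0f} per lb")            # ~292, consistent with ~300 /lb
```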

  14. Uncertainty management, spatial and temporal reasoning, and validation of intelligent environmental decision support systems

    USGS Publications Warehouse

    Sànchez-Marrè, Miquel; Gilbert, Karina; Sojda, Rick S.; Steyer, Jean Philippe; Struss, Peter; Rodríguez-Roda, Ignasi; Voinov, A.A.; Jakeman, A.J.; Rizzoli, A.E.

    2006-01-01

    There are inherent open problems that arise when developing and running Intelligent Environmental Decision Support Systems (IEDSS), and several open challenges appear during their daily operation. The uncertainty of the data being processed is intrinsic to the environmental system, which is monitored by several on-line sensors and off-line data sources. Thus, anomalous data values at the data-gathering level, or uncertain reasoning at later levels such as diagnosis, decision support or planning, can lead the environmental process into unsafe critical operation states. At the diagnosis, decision support or planning levels, spatial reasoning, temporal reasoning, or both can influence the reasoning processes undertaken by the IEDSS. Most environmental systems must take into account the spatial relationships between the target environmental area and nearby areas, and the temporal relationships between the current state and past states of the environmental system, in order to produce accurate and reliable assertions for use in the diagnosis, decision support or planning processes. Finally, a related issue is crucial: are the decisions proposed by the IEDSS really reliable and safe? Are we sure about the quality and performance of the proposed solutions? How can we ensure a correct evaluation of the IEDSS? The main goal of this paper is to analyse these four issues, review possible approaches and techniques to cope with them, and examine new trends for future research within the IEDSS field.

  15. Performance analysis of automated evaluation of Crithidia luciliae-based indirect immunofluorescence tests in a routine setting - strengths and weaknesses.

    PubMed

    Hormann, Wymke; Hahn, Melanie; Gerlach, Stefan; Hochstrate, Nicola; Affeldt, Kai; Giesen, Joyce; Fechner, Kai; Damoiseaux, Jan G M C

    2017-11-27

    Antibodies directed against dsDNA are a highly specific diagnostic marker for the presence of systemic lupus erythematosus and of particular importance in its diagnosis. To assess anti-dsDNA antibodies, the Crithidia luciliae-based indirect immunofluorescence test (CLIFT) is one of the assays considered to be the best choice. To overcome the drawback of subjective result interpretation that is inherent in indirect immunofluorescence assays in general, automated systems have been introduced into the market in recent years. Among these systems is the EUROPattern Suite, an advanced automated fluorescence microscope equipped with different software packages, capable of automated pattern interpretation and result suggestion for ANA, ANCA and CLIFT analysis. We analyzed the performance of the EUROPattern Suite with its automated fluorescence interpretation for CLIFT in a routine setting, reflecting the everyday life of a diagnostic laboratory. Three hundred and twelve consecutive samples, sent to the Central Diagnostic Laboratory of the Maastricht University Medical Centre with a request for anti-dsDNA analysis, were collected over a period of 7 months. Agreement between EUROPattern assay analysis and the visual read was 93.3%. Sensitivity and specificity were 94.1% and 93.2%, respectively. The EUROPattern Suite performed reliably and greatly supported result interpretation. Automated image acquisition is readily performed and automated image classification gives a reliable recommendation for assay evaluation to the operator. The EUROPattern Suite optimizes workflow and contributes to standardization between different operators or laboratories.

  16. Fiber laser platform for highest flexibility and reliability in industrial femtosecond micromachining: TruMicro Series 2000

    NASA Astrophysics Data System (ADS)

    Jansen, Florian; Kanal, Florian; Kahmann, Max; Tan, Chuong; Diekamp, Holger; Scelle, Raphael; Budnicki, Aleksander; Sutter, Dirk

    2018-02-01

    In this work we present an ultrafast laser system distinguished by its industry-ready reliability and its outstanding flexibility, which allows for real-time adjustment of process-inherent parameters. The robust system design and linear amplifier architecture make the all-fiber TruMicro Series 2000 ideally suited for passive coupling to hollow-core delivery fibers. In addition to details on the laser system itself, various application examples are shown, including welding of different glasses and ablation of silicon carbide and silicon.

  17. Formative Assessment as Mediation

    ERIC Educational Resources Information Center

    De Vos, Mark; Belluigi, Dina Zoe

    2011-01-01

    Whilst principles of validity, reliability and fairness should be central concerns for the assessment of student learning in higher education, simplistic notions of "transparency" and "explicitness" in terms of assessment criteria should be critiqued more rigorously. This article examines the inherent tensions resulting from CRA's links to both…

  18. Improved Accuracy of the Inherent Shrinkage Method for Fast and More Reliable Welding Distortion Calculations

    NASA Astrophysics Data System (ADS)

    Mendizabal, A.; González-Díaz, J. B.; San Sebastián, M.; Echeverría, A.

    2016-07-01

    This paper describes the implementation of a simple strategy adopted for the inherent shrinkage method (ISM) to predict welding-induced distortion. This strategy not only makes it possible for the ISM to reach accuracy levels similar to the detailed transient analysis method (considered the most reliable technique for calculating welding distortion) but also significantly reduces the time required for these types of calculations. This strategy is based on the sequential activation of welding blocks to account for welding direction and transient movement of the heat source. As a result, a significant improvement in distortion prediction is achieved. This is demonstrated by experimentally measuring and numerically analyzing distortions in two case studies: a vane segment subassembly of an aero-engine, represented with 3D-solid elements, and a car body component, represented with 3D-shell elements. The proposed strategy proves to be a good alternative for quickly estimating the correct behaviors of large welded components and may have important practical applications in the manufacturing industry.

  19. Tailoring a Human Reliability Analysis to Your Industry Needs

    NASA Technical Reports Server (NTRS)

    DeMott, D. L.

    2016-01-01

    Accidents caused by human error can have catastrophic consequences across many industries: airline mishaps, medical malpractice, medication mistakes, aerospace failures, major oil spills, transportation mishaps, power production failures and manufacturing facility incidents. Human Reliability Assessment (HRA) is used to analyze the inherent risk of human behavior or actions introducing errors into the operation of a system or process. These assessments can be used to identify where errors are most likely to arise and the potential risks involved if they do occur. Using the basic concepts of HRA, an evolving group of methodologies is used to meet various industry needs. Determining which methodology or combination of techniques will provide a quality human reliability assessment is a key element in developing effective strategies for understanding and dealing with risks caused by human errors. There are a number of concerns and difficulties in "tailoring" an HRA for different industries. Although a variety of HRA methodologies are available to analyze human error events, determining the most appropriate tools to provide the most useful results can depend on industry-specific cultures and requirements. Methodology selection may be based on a variety of factors that include: 1) how people act and react in different industries, 2) expectations based on industry standards, 3) factors that influence how the human errors could occur, such as tasks, tools, environment, workplace, support, training and procedure, 4) type and availability of data, 5) how the industry views risk and reliability, and 6) types of emergencies, contingencies and routine tasks. Other considerations for methodology selection should be based on what information is needed from the assessment. If the principal concern is determination of the primary risk factors contributing to potential human error, a more detailed analysis method may be employed than when the requirement is simply to provide a numerical value as part of a probabilistic risk assessment. Industries involving humans operating large equipment or transport systems (e.g., railroads or airlines) have more need to address the man-machine interface than those in which workers administer medications. Human error occurs in every industry; in most cases the consequences are relatively benign and occasionally beneficial. In cases where the results can have disastrous consequences, the use of Human Reliability techniques to identify and classify the risk of human errors allows a company more opportunities to mitigate or eliminate these types of risks and prevent costly tragedies.

  20. 76 FR 42534 - Mandatory Reliability Standards for Interconnection Reliability Operating Limits; System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-19

    ... Reliability Operating Limits; System Restoration Reliability Standards AGENCY: Federal Energy Regulatory... data necessary to analyze and monitor Interconnection Reliability Operating Limits (IROL) within its... Interconnection Reliability Operating Limits, Order No. 748, 134 FERC ] 61,213 (2011). \\2\\ The term ``Wide-Area...

  1. Variability in Non-Target Terrestrial Plant Studies Should Inform Endpoint Selection.

    PubMed

    Staveley, J P; Green, J W; Nusz, J; Edwards, D; Henry, K; Kern, M; Deines, A M; Brain, R; Glenn, B; Ehresman, N; Kung, T; Ralston-Hooper, K; Kee, F; McMaster, S

    2018-05-04

    Inherent variability in Non-Target Terrestrial Plant (NTTP) testing of pesticides creates challenges for using and interpreting these data for risk assessment. Standardized NTTP testing protocols were initially designed to calculate the application rate causing a 25% effect (ER25, used in the U.S.) or a 50% effect (ER50, used in Europe) for various measures based on the observed dose-response. More recently, the requirement to generate a no-observed-effect rate (NOER) or, in the absence of a NOER, the rate causing a 5% effect (ER05), has raised questions about the inherent variability in, and statistical detectability of, these tests. Statistically significant differences observed between test and control groups may be a product of this inherent variability and may not represent biological relevance. Attempting to derive an ER05, and the associated risk assessment conclusions drawn from such values, can overestimate risk. To address these concerns, we evaluated historical data from approximately 100 seedling emergence and vegetative vigor guideline studies on pesticides to assess the variability of control results across studies for each plant species, examined potential causes for the variation in control results, and defined the minimum percent effect that can be reliably detected. The results indicate that with the current test design and implementation, the ER05 cannot be reliably estimated. This article is protected by copyright. All rights reserved.
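
    The detectability question can be made concrete with a rough, illustrative minimum-detectable-difference (MDD) calculation; the two-sample t-test framing, the 30% control coefficient of variation and the replication level below are assumptions, not the authors' analysis:

```python
# Illustrative minimum-detectable-difference (MDD) calculation (not the study's method).
# Assumption: two-sample t-test framing with equal group sizes and a known control CV.
from scipy import stats

def mdd_percent(cv_percent, n_per_group, alpha=0.05, power=0.8):
    """Smallest % difference from control detectable with the given CV and replication."""
    df = 2 * (n_per_group - 1)
    t_alpha = stats.t.ppf(1 - alpha / 2, df)   # two-sided significance threshold
    t_beta = stats.t.ppf(power, df)            # power requirement
    return (t_alpha + t_beta) * cv_percent * (2 / n_per_group) ** 0.5

# Example: 30% control variability with 4 replicates per application rate
print(f"MDD ~ {mdd_percent(30, 4):.0f}% effect")  # far larger than a 5% (ER05) effect
```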

  2. 16 CFR 1211.13 - Inherent force activated secondary door sensors.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... sensors. 1211.13 Section 1211.13 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION CONSUMER PRODUCT... § 1211.13 Inherent force activated secondary door sensors. (a) Normal operation test. (1) A force activated door sensor of a door system installed according to the installation instructions shall actuate...

  3. 16 CFR 1211.13 - Inherent force activated secondary door sensors.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... sensors. 1211.13 Section 1211.13 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION CONSUMER PRODUCT... § 1211.13 Inherent force activated secondary door sensors. (a) Normal operation test. (1) A force activated door sensor of a door system installed according to the installation instructions shall actuate...

  4. 16 CFR 1211.13 - Inherent force activated secondary door sensors.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... sensors. 1211.13 Section 1211.13 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION CONSUMER PRODUCT... § 1211.13 Inherent force activated secondary door sensors. (a) Normal operation test. (1) A force activated door sensor of a door system installed according to the installation instructions shall actuate...

  5. Performance characteristics of a nanoscale double-gate reconfigurable array

    NASA Astrophysics Data System (ADS)

    Beckett, Paul

    2008-12-01

    The double gate transistor is a promising device applicable to deep sub-micron design due to its inherent resistance to short-channel effects and superior subthreshold performance. Using both TCAD and SPICE circuit simulation, it is shown that the characteristics of fully depleted dual-gate thin-body Schottky barrier silicon transistors will not only uncouple the conflicting requirements of high performance and low standby power in digital logic, but will also allow the development of a locally-connected reconfigurable computing mesh. The magnitude of the threshold shift effect will scale with device dimensions and will remain compatible with oxide reliability constraints. A field-programmable architecture based on the double gate transistor is described in which the operating point of the circuit is biased via one gate while the other gate is used to form the logic array, such that complex heterogeneous computing functions may be developed from this homogeneous, mesh-connected organization.

  6. On-board B-ISDN fast packet switching architectures. Phase 2: Development. Proof-of-concept architecture definition report

    NASA Technical Reports Server (NTRS)

    Shyy, Dong-Jye; Redman, Wayne

    1993-01-01

    For the next-generation packet-switched communications satellite system with onboard processing and spot-beam operation, a reliable onboard fast packet switch is essential to route packets from different uplink beams to different downlink beams. The rapid emergence of point-to-point services such as video distribution, and the large demand for video conferencing, distributed data processing, and network management, make the multicast function essential to a fast packet switch (FPS). The satellite's inherent broadcast features give the satellite network an advantage over the terrestrial network in providing multicast services. This report evaluates alternative multicast FPS architectures for onboard baseband switching applications and selects a candidate for subsequent breadboard development. Architecture evaluation and selection will be based on the study performed in phase 1, 'Onboard B-ISDN Fast Packet Switching Architectures', and on other switch architectures which have become commercially available as large scale integration (LSI) devices.

  7. Inherent overload protection for the series resonant converter

    NASA Technical Reports Server (NTRS)

    King, R. J.; Stuart, T. A.

    1983-01-01

    The overload characteristics of the full bridge series resonant power converter are considered. This includes analyses of the two most common control methods presently in use. The first of these uses a current zero crossing detector to synchronize the control signals and is referred to as the alpha controller. The second is driven by a voltage controlled oscillator and is referred to as the gamma controller. It is shown that the gamma controller has certain reliability advantages in that it can be designed with inherent short circuit protection. Experimental results are included for an 86 kHz converter using power metal-oxide-semiconductor field-effect transistors (MOSFETs).

  8. International English Language Testing: A Critical Response

    ERIC Educational Resources Information Center

    Hall, Graham

    2010-01-01

    Uysal's article provides a research agenda for IELTS and lists numerous issues concerning the test's reliability and validity. She asks useful questions, but her analysis ignores the uncertainties inherent in all language test development and the wider social and political context of international high-stakes language testing. In this response, I…

  9. Testing of printed circuit board solder joints by optical correlation

    NASA Technical Reports Server (NTRS)

    Espy, P. N.

    1975-01-01

    An optical correlation technique for the nondestructive evaluation of printed circuit board solder joints was evaluated. Reliable indications of induced stress levels in solder joint lead wires are achievable. Definite relations between the inherent strength of a solder joint and its associated ability to survive stress are demonstrable.

  10. Enhancing the Internet of Things Architecture with Flow Semantics

    ERIC Educational Resources Information Center

    DeSerranno, Allen Ronald

    2017-01-01

    Internet of Things ("IoT") systems are complex, asynchronous solutions often comprised of various software and hardware components developed in isolation of each other. These components function with different degrees of reliability and performance over an inherently unreliable network, the Internet. Many IoT systems are developed within…

  11. Simple algorithm for improved security in the FDDI protocol

    NASA Astrophysics Data System (ADS)

    Lundy, G. M.; Jones, Benjamin

    1993-02-01

    We propose a modification to the Fiber Distributed Data Interface (FDDI) protocol based on a simple algorithm which will improve confidential communication capability. This proposed modification provides a simple and reliable system which exploits some of the inherent security properties of a fiber optic ring network. This method differs from conventional methods in that end-to-end encryption can be facilitated at the media access control sublayer of the data link layer in the OSI network model. Our method is based on a variation of the bit stream cipher method. The transmitting station takes the intended confidential message and uses a simple modulo-two (XOR) addition operation against an initialization vector. The encrypted message is virtually unbreakable without the initialization vector. None of the stations on the ring will have access to both the encrypted message and the initialization vector except the transmitting and receiving stations. The generation of the initialization vector is unique for each confidential transmission and thus provides a unique approach to the key distribution problem. The FDDI protocol is of particular interest to the military in terms of LAN/MAN implementations. Both the Army and the Navy are considering the standard as the basis for future network systems. A simple and reliable security mechanism with the potential to support real-time communications is a necessary consideration in the implementation of these systems. The proposed method offers several advantages over traditional methods in terms of speed, reliability, and standardization.
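
    The core operation, modulo-two addition of the message against keying material derived from a per-transmission initialization vector, can be sketched as follows; the keystream generator here (SHA-256 in counter mode) is a stand-in assumption, since the abstract does not specify how the vector is expanded:

```python
# Minimal sketch of keystream encryption by modulo-two (XOR) addition,
# illustrating the operation described above. The keystream generator is a
# stand-in (SHA-256 in counter mode); the proposed FDDI scheme only specifies
# XOR of the message against material derived from a per-transmission
# initialization vector.
import hashlib

def keystream(iv: bytes, length: int) -> bytes:
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(iv + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def xor_cipher(message: bytes, iv: bytes) -> bytes:
    ks = keystream(iv, len(message))
    return bytes(m ^ k for m, k in zip(message, ks))  # modulo-two addition

msg = b"confidential frame payload"
iv = b"unique-per-transmission-IV"
ct = xor_cipher(msg, iv)          # encrypt at the transmitting station
assert xor_cipher(ct, iv) == msg  # the receiving station recovers the message
```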

  12. Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, D.; Brunett, A.; Passerini, S.

    Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), and is funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence-specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.

  13. Impact of wind farms with energy storage on transient stability

    NASA Astrophysics Data System (ADS)

    Bowman, Douglas Allen

    Today's energy infrastructure will need to expand rapidly in terms of reliability and flexibility due to aging infrastructure, changing energy market conditions, projected load increases, and system reliability requirements. Several states in the U.S. now require increasing levels of wind penetration. These requirements will have impacts on grid reliability given the inherent intermittency of wind generation, and much research has been completed on the impact of wind on grid reliability. Energy storage has been proposed as a tool to provide greater levels of reliability; however, little research has occurred in the area of wind with storage and its impact on stability under different possible scenarios. This thesis addresses the impact of wind farm penetration on transient stability when energy storage is added. The results show that battery energy storage located at the wind energy site can improve the stability response of the system.

  14. 16 CFR § 1211.13 - Inherent force activated secondary door sensors.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... sensors. § 1211.13 Section § 1211.13 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION CONSUMER... Standard § 1211.13 Inherent force activated secondary door sensors. (a) Normal operation test. (1) A force activated door sensor of a door system installed according to the installation instructions shall actuate...

  15. Staffing for Unmanned Aircraft Systems (UAS) Operations

    DTIC Science & Technology

    2016-06-01

    This DTIC report excerpt covers the Service's staffing determination for UAS operations: business rules and observations enabling assessment of individual UAS mission elements, a staffing determination framework, and the determination of inherently governmental (IG) activities, including a table comparing annual total cost across rank/pay grades (O-4, O-3, W-3, W-2).

  16. Functional Near-Infrared Spectroscopy Signals Measure Neuronal Activity in the Cortex

    NASA Technical Reports Server (NTRS)

    Harrivel, Angela; Hearn, Tristan

    2013-01-01

    Functional near infrared spectroscopy (fNIRS) is an emerging optical neuroimaging technology that indirectly measures neuronal activity in the cortex via neurovascular coupling. It quantifies hemoglobin concentration ([Hb]) and thus measures the same hemodynamic response as functional magnetic resonance imaging (fMRI), but is portable, non-confining, relatively inexpensive, and is appropriate for long-duration monitoring and use at the bedside. Like fMRI, it is noninvasive and safe for repeated measurements. Patterns of [Hb] changes are used to classify cognitive state. Thus, fNIRS technology offers much potential for application in operational contexts. For instance, the use of fNIRS to detect the mental state of commercial aircraft operators in near real time could allow intelligent flight decks of the future to optimally support human performance in the interest of safety by responding to hazardous mental states of the operator. However, many opportunities remain for improving robustness and reliability. It is desirable to reduce the impact of motion and poor optical coupling of probes to the skin. Such artifacts degrade signal quality and thus cognitive state classification accuracy. Field application calls for further development of algorithms and filters for the automation of bad channel detection and dynamic artifact removal. This work introduces a novel adaptive filter method for automated real-time fNIRS signal quality detection and improvement. The output signal (after filtering) will have had contributions from motion and poor coupling reduced or removed, thus leaving a signal more indicative of changes due to hemodynamic brain activations of interest. Cognitive state classifications based on these signals reflect brain activity more reliably. The filter has been tested successfully with both synthetic and real human subject data, and requires no auxiliary measurement. This method could be implemented as a real-time filtering option or bad channel rejection feature of software used with frequency domain fNIRS instruments for signal acquisition and processing. Use of this method could improve the reliability of any operational or real-world application of fNIRS in which motion is an inherent part of the functional task of interest. Other optical diagnostic techniques (e.g., for NIR medical diagnosis) also may benefit from the reduction of probe motion artifact during any use in which motion avoidance would be impractical or limit usability.
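
    The abstract does not describe the filter's internals, so the following is only a generic, hedged illustration (not the authors' algorithm) of windowed signal-quality scoring with suppression of motion-like spikes in a single fNIRS channel; the sampling rate, window length and threshold are arbitrary assumptions:

```python
# Generic illustration of automated signal-quality detection and artifact
# suppression for a single fNIRS channel (NOT the paper's filter; window
# length and thresholds are arbitrary assumptions).
import numpy as np

def clean_channel(signal, fs=10.0, window_s=2.0, z_thresh=4.0):
    """Flag windows with excessive variation and clamp spike-like samples."""
    x = np.asarray(signal, dtype=float)
    win = max(int(window_s * fs), 2)
    cleaned = x.copy()
    quality = np.ones(len(x), dtype=bool)
    for start in range(0, len(x), win):
        seg = x[start:start + win]
        med = np.median(seg)
        mad = np.median(np.abs(seg - med)) + 1e-9            # robust spread estimate
        z = np.abs(seg - med) / (1.4826 * mad)
        bad = z > z_thresh
        quality[start:start + win] = ~bad
        cleaned[start:start + win] = np.where(bad, med, seg)  # clamp outliers to the local median
    return cleaned, quality

# Synthetic test: slow hemodynamic drift plus one simulated motion spike
t = np.arange(0, 30, 0.1)
raw = 0.5 * np.sin(2 * np.pi * 0.02 * t)
raw[150] += 5.0
clean, ok = clean_channel(raw)
print("samples flagged as bad:", int((~ok).sum()))
```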

  17. Stochastic inversion of ocean color data using the cross-entropy method.

    PubMed

    Salama, Mhd Suhyb; Shen, Fang

    2010-01-18

    Improving the inversion of ocean color data is an ever-continuing effort to increase the accuracy of derived inherent optical properties. In this paper we present a stochastic inversion algorithm to derive inherent optical properties from ocean color, ship- and space-borne data. The inversion algorithm is based on the cross-entropy method, in which sets of inherent optical properties are generated and converged to the optimal set using an iterative process. The algorithm is validated against four data sets: simulated, noisy simulated, in-situ measured and satellite match-up data sets. Statistical analysis of the validation results is based on model-II regression using five goodness-of-fit indicators; only R2 and root mean square error (RMSE) are mentioned hereafter. Accurate values of the total absorption coefficient are derived with R2 > 0.91 and RMSE, of log-transformed data, less than 0.55. Reliable values of the total backscattering coefficient are also obtained with R2 > 0.7 (after removing outliers) and RMSE < 0.37. The developed algorithm has the ability to derive reliable results from noisy data, with R2 above 0.96 for the total absorption and above 0.84 for the backscattering coefficients. The algorithm is self-contained and easy to implement and modify to derive the variability of chlorophyll-a absorption that may correspond to different phytoplankton species. It gives consistently accurate results and is therefore worth considering for ocean color global products.
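
    The generate-select-update loop of the cross-entropy method can be sketched as follows; the two-band forward model and all numerical values are made-up placeholders, and only the iterative elite-update scheme mirrors the approach described above:

```python
# Toy sketch of the cross-entropy (CE) method for retrieving inherent optical
# properties (IOPs) from a reflectance-like signal. The forward model is a
# made-up placeholder; only the CE sampling/elite-update loop mirrors the
# approach described above.
import numpy as np

rng = np.random.default_rng(0)

def forward_model(a_ph, bb_p):
    """Placeholder two-band reflectance model (illustrative only)."""
    a_water = np.array([0.01, 0.30])             # assumed pure-water absorption per band
    return bb_p / (a_ph + a_water + bb_p)

observed_r = forward_model(0.35, 0.015)          # synthetic "measurement"

mean = np.array([0.5, 0.05])                     # initial guess: [absorption, backscatter]
std = np.array([0.3, 0.03])
n_samples, n_elite = 300, 30

for _ in range(40):
    samples = np.abs(rng.normal(mean, std, size=(n_samples, 2)))  # keep IOPs positive
    preds = forward_model(samples[:, 0:1], samples[:, 1:2])
    errors = np.linalg.norm(preds - observed_r, axis=1)
    elite = samples[np.argsort(errors)[:n_elite]]                 # best-fitting candidates
    mean, std = elite.mean(axis=0), elite.std(axis=0) + 1e-6      # CE update of the sampling law

print("retrieved absorption, backscatter:", mean.round(3))  # should approach the true 0.35, 0.015
```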

  18. Cognitive mechanisms for explaining dynamics of aesthetic appreciation

    PubMed Central

    Carbon, Claus-Christian

    2011-01-01

    For many domains aesthetic appreciation has proven to be highly reliable. Evaluations of facial attractiveness, for instance, show high internal consistencies and impressively high inter-rater reliabilities, even across cultures. This indicates general mechanisms underlying such evaluations. It is, however, also obvious that our taste for specific objects is not always stable—in some realms such stability is hardly conceivable at all since aesthetic domains such as fashion, design, or art are inherently very dynamic. Gaining insights into the cognitive mechanisms that trigger and enable corresponding changes of aesthetic appreciation is of particular interest for psychologists as this will probably reveal essential mechanisms of aesthetic evaluations per se. The present paper develops a two-step model, dynamically adapting itself, which accounts for typical dynamics of aesthetic appreciation found in different research areas such as art history, philosophy, and psychology. The first step assumes singular creative sources creating and establishing innovative material towards which, in a second step, people adapt by integrating it into their visual habits. This inherently leads to dynamic changes of the beholders' aesthetic appreciation. PMID:23145254

  19. Long-Term Reliability of a Hard-Switched Boost Power Processing Unit Utilizing SiC Power MOSFETs

    NASA Technical Reports Server (NTRS)

    Ikpe, Stanley A.; Lauenstein, Jean-Marie; Carr, Gregory A.; Hunter, Don; Ludwig, Lawrence L.; Wood, William; Iannello, Christopher J.; Del Castillo, Linda Y.; Fitzpatrick, Fred D.; Mojarradi, Mohammad M.; hide

    2016-01-01

    Silicon carbide (SiC) power devices have demonstrated many performance advantages over their silicon (Si) counterparts. As the inherent material limitations of Si devices are being swiftly realized, wide-band-gap (WBG) materials such as SiC have become increasingly attractive for high-power applications. In particular, the high breakdown field tolerance, superior thermal conductivity and low-resistivity drift regions of SiC power metal oxide semiconductor field effect transistors (MOSFETs) make these devices excellent candidates for power-dense, low-loss, high-frequency switching applications in extreme environment conditions. In this paper, a novel power processing unit (PPU) architecture is proposed utilizing commercially available 4H-SiC power MOSFETs from CREE Inc. A multiphase straight boost converter topology is implemented to supply up to 10 kilowatts full-scale. High Temperature Gate Bias (HTGB) and High Temperature Reverse Bias (HTRB) characterization is performed to evaluate the long-term reliability of both the gate oxide and the body diode of the SiC components. Finally, the susceptibility of the CREE SiC MOSFETs to damaging effects from heavy-ion radiation, representative of the on-orbit galactic cosmic ray environment, is explored. The results provide baseline performance metrics of operation as well as demonstrating the feasibility of a hard-switched PPU in harsh environments.
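
    For orientation, the ideal continuous-conduction boost relationship that such a converter stage relies on can be sketched as below; apart from the 10 kW full-scale figure, the voltages and phase count are illustrative assumptions, not the PPU's design values:

```python
# Illustrative ideal (lossless, continuous-conduction) boost-converter sizing,
# not the PPU's actual design values; only the 10 kW full-scale figure comes
# from the abstract. Ideal relationship: Vout = Vin / (1 - D).
def boost_duty_cycle(v_in: float, v_out: float) -> float:
    return 1.0 - v_in / v_out

def phase_current(p_total_w: float, v_in: float, n_phases: int) -> float:
    """Average input current carried by each phase of a multiphase boost stage."""
    return p_total_w / v_in / n_phases

v_in, v_out = 80.0, 120.0                           # assumed bus voltages for illustration
d = boost_duty_cycle(v_in, v_out)
i_phase = phase_current(10_000, v_in, n_phases=4)   # 10 kW full scale, 4 assumed phases
print(f"duty cycle D = {d:.2f}, per-phase input current = {i_phase:.1f} A")
```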

  20. 75 FR 72664 - System Personnel Training Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-26

    ...Under section 215 of the Federal Power Act, the Commission approves two Personnel Performance, Training and Qualifications (PER) Reliability Standards, PER-004-2 (Reliability Coordination--Staffing) and PER-005-1 (System Personnel Training), submitted to the Commission for approval by the North American Electric Reliability Corporation, the Electric Reliability Organization certified by the Commission. The approved Reliability Standards require reliability coordinators, balancing authorities, and transmission operators to establish a training program for their system operators, verify each of their system operators' capability to perform tasks, and provide emergency operations training to every system operator. The Commission also approves NERC's proposal to retire two existing PER Reliability Standards that are replaced by the standards approved in this Final Rule.

  1. Marginal Cost Pricing in a World without Perfect Competition: Implications for Electricity Markets with High Shares of Low Marginal Cost Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frew, Bethany A.; Clark, Kara; Bloom, Aaron P.

    A common approach to regulating electricity is through auction-based competitive wholesale markets. The goal of this approach is to provide a reliable supply of power at the lowest reasonable cost to the consumer. This necessitates market structures and operating rules that ensure revenue sufficiency for all generators needed for resource adequacy purposes. Wholesale electricity markets employ marginal-cost pricing to provide cost-effective dispatch such that resources are compensated for their operational costs. However, marginal-cost pricing alone cannot guarantee cost recovery outside of perfect competition, and electricity markets have at least six attributes that preclude them from functioning as perfectly competitive markets. These attributes include market power, externalities, public good attributes, lack of storage, wholesale price caps, and an ineffective demand curve. Until (and unless) these failures are ameliorated, some form of corrective action will be necessary to improve market efficiency so that prices can correctly reflect the needed level of system reliability. Many of these options necessarily involve some form of administrative or out-of-market action, such as scarcity pricing, capacity payments, bilateral or other out-of-market contracts, or some hybrid combination. A key focus of these options is to create a connection between the electricity market and long-term reliability/loss-of-load expectation targets, which are inherently disconnected in the native markets because of the aforementioned market failures. The addition of variable generation resources can exacerbate the revenue sufficiency and resource adequacy concerns caused by these underlying market failures. Because variable generation resources have near-zero marginal costs, they effectively suppress energy prices and reduce the capacity factors of conventional generators through the merit-order effect in the simplest case of a convex market; non-convexities can also suppress prices.
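
    The merit-order effect mentioned above can be illustrated with a toy clearing calculation (the generator offer data are made up): adding near-zero-marginal-cost wind pushes the marginal unit down the supply stack and lowers the clearing price.

```python
# Toy merit-order dispatch illustrating the price-suppression effect described
# above (illustrative only; generator data are made up).
def clearing_price(offers, demand_mw):
    """Sort offers by marginal cost and return the price set by the marginal unit."""
    dispatched = 0.0
    for cost, capacity in sorted(offers):          # (marginal cost $/MWh, capacity MW)
        dispatched += capacity
        if dispatched >= demand_mw:
            return cost
    raise ValueError("insufficient capacity for demand")

conventional = [(20, 400), (35, 300), (60, 200)]   # baseload, mid-merit, peaker
demand = 800

print("price without wind:", clearing_price(conventional, demand))        # 60 $/MWh
print("price with 300 MW of near-zero-cost wind:",
      clearing_price([(0, 300)] + conventional, demand))                  # 35 $/MWh
```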

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pind, C.

    The SECURE heating reactor was designed by ASEA-ATOM as a realistic alternative for district heating in urban areas and for supplying heat to process industries. SECURE has unique safety characteristics that are based on fundamental laws of physics. Its safety does not depend on active components or operator intervention for shutdown and cooling of the reactor, and the inherent safety characteristics of the plant cannot be affected by operator errors. Due to its very low environmental impact, it can be sited close to heat consumers. The SECURE heating reactor has been shown to be competitive in comparison with other alternatives for heating Helsinki and Seoul. The SECURE heating reactor forms the basis for the power-producing SECURE-P reactor known as PIUS (Process Inherent Ultimate Safety), which is based on the same inherent safety principles. The thermohydraulic function and transient response have been demonstrated in a large electrically heated loop at the ASEA-ATOM laboratories.

  3. Landmark-based robust navigation for tactical UGV control in GPS-denied communication-degraded environments

    NASA Astrophysics Data System (ADS)

    Endo, Yoichiro; Balloch, Jonathan C.; Grushin, Alexander; Lee, Mun Wai; Handelman, David

    2016-05-01

    Control of current tactical unmanned ground vehicles (UGVs) is typically accomplished through two alternative modes of operation, namely, low-level manual control using joysticks and high-level planning-based autonomous control. Each mode has its own merits as well as inherent mission-critical disadvantages. Low-level joystick control is vulnerable to communication delay and degradation, and high-level navigation often depends on uninterrupted GPS signals and/or energy-emissive (non-stealth) range sensors such as LIDAR for localization and mapping. To address these problems, we have developed a mid-level control technique where the operator semi-autonomously drives the robot relative to visible landmarks that are commonly recognizable by both humans and machines such as closed contours and structured lines. Our novel solution relies solely on optical and non-optical passive sensors and can be operated under GPS-denied, communication-degraded environments. To control the robot using these landmarks, we developed an interactive graphical user interface (GUI) that allows the operator to select landmarks in the robot's view and direct the robot relative to one or more of the landmarks. The integrated UGV control system was evaluated based on its ability to robustly navigate through indoor environments. The system was successfully field tested with QinetiQ North America's TALON UGV and Tactical Robot Controller (TRC), a ruggedized operator control unit (OCU). We found that the proposed system is indeed robust against communication delay and degradation, and provides the operator with steady and reliable control of the UGV in realistic tactical scenarios.

  4. Perceiving numbers does not cause automatic shifts of spatial attention.

    PubMed

    Fattorini, Enrico; Pinto, Mario; Rotondaro, Francesca; Doricchi, Fabrizio

    2015-12-01

    It is frequently assumed that the brain codes number magnitudes according to an inherent left-to-right spatial organization. In support of this hypothesis it has been reported that, in humans, perceiving small numbers induces automatic shifts of attention toward the left side of space whereas perceiving large numbers automatically shifts attention to the right side of space (i.e., the Attentional SNARC: Att-SNARC; Fischer, Castel, Dodd, & Pratt, 2003). Nonetheless, the Att-SNARC has often not been replicated and its reliability has never been tested. To ascertain whether the mere perception of numbers causes shifts of spatial attention or whether the number-space interaction takes place at a different stage of cognitive processing, we re-assessed the consistency and reliability of the Att-SNARC and investigated its role in the production of SNARC effects in Parity Judgement (PJ) and Magnitude Comparison (MC) tasks. In a first study of 60 participants, we found no Att-SNARC, despite finding strong PJ- and MC-SNARC effects. No correlation was present between the Att-SNARC and the SNARC. Split-half tests showed no reliability for the Att-SNARC and high reliability for the PJ- and MC-SNARC. In a second study, we re-assessed the Att-SNARC and tested its direct influence on an MC-SNARC task with laterally presented targets. No Att-SNARC and no influence of the Att-SNARC on the MC-SNARC were found. Also in this case, the SNARC was reliable whereas the Att-SNARC task was not. Finally, in a third study we observed a significant Att-SNARC when participants were asked to recall the position occupied on a ruler by the numbers presented in each trial; however, the Att-SNARC task was again not reliable. These results show that perceiving numbers does not cause automatic shifts of spatial attention and that, whenever present, these shifts do not modulate the SNARC. The same results suggest that numbers have no inherent mental left-to-right organization and that, whenever present, this organization can have both response-related and strategically driven memory-related origins. Nonetheless, response-related factors generate more reliable and stable spatial representations of numbers. Copyright © 2015 Elsevier Ltd. All rights reserved.
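
    For readers unfamiliar with the statistic, split-half reliability with the Spearman-Brown correction can be illustrated with synthetic data (this is not the study's data or analysis code):

```python
# Illustration of split-half reliability with the Spearman-Brown correction,
# the kind of statistic used above to contrast the Att-SNARC and SNARC tasks
# (synthetic data; not the study's data or code).
import numpy as np

rng = np.random.default_rng(1)

def split_half_reliability(trial_scores):
    """Correlate odd vs. even trials per participant, then apply Spearman-Brown."""
    odd = trial_scores[:, 0::2].mean(axis=1)
    even = trial_scores[:, 1::2].mean(axis=1)
    r = np.corrcoef(odd, even)[0, 1]
    return 2 * r / (1 + r)          # Spearman-Brown step-up to full test length

# 60 participants x 80 trials: a stable per-person effect plus trial-level noise
true_effect = rng.normal(20, 15, size=(60, 1))
scores = true_effect + rng.normal(0, 40, size=(60, 80))
print(f"split-half reliability: {split_half_reliability(scores):.2f}")
```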

  5. Reliability-based optimization of maintenance scheduling of mechanical components under fatigue

    PubMed Central

    Beaurepaire, P.; Valdebenito, M.A.; Schuëller, G.I.; Jensen, H.A.

    2012-01-01

    This study presents the optimization of the maintenance scheduling of mechanical components under fatigue loading. Cracks in damaged structures may be detected during non-destructive inspection and subsequently repaired. Fatigue crack initiation and growth show inherent variability, as does the outcome of inspection activities. The problem is addressed within the framework of reliability-based optimization. The initiation and propagation of fatigue cracks are efficiently modeled using cohesive zone elements. The applicability of the method is demonstrated by a numerical example, which involves a plate with two holes subject to alternating stress. PMID:23564979
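
    A toy Monte Carlo version of the reliability framing, with one imperfect inspection and repair, is sketched below; it uses a crude multiplicative crack-growth model rather than the cohesive-zone finite-element model of the study, and all distributions are illustrative assumptions:

```python
# Toy Monte Carlo reliability sketch for a component under fatigue with one
# imperfect inspection/repair (NOT the study's cohesive-zone model; all
# distributions and thresholds are illustrative assumptions).
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

a0 = rng.lognormal(np.log(0.5), 0.3, n)        # initial crack size, mm
growth = rng.lognormal(np.log(4.0), 0.5, n)    # crack-size multiplication per service interval
a_crit = 25.0                                  # critical crack size, mm

a_mid = a0 * growth                            # size at the scheduled inspection
p_detect = 1 - np.exp(-a_mid / 5.0)            # larger cracks are easier to find
repaired = rng.random(n) < p_detect
a_mid = np.where(repaired, a0, a_mid)          # detected cracks restored to initial size

a_end = a_mid * growth                         # size at the end of the second interval
print(f"failure probability with inspection:    {np.mean(a_end > a_crit):.4f}")
print(f"failure probability without inspection: {np.mean(a0 * growth**2 > a_crit):.4f}")
```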

  6. The DSM-III personality disorders section: a commentary.

    PubMed

    Frances, A

    1980-09-01

    The author reviews the DSM-III section on personality disorders, discusses several of its more controversial diagnoses, and suggests some possible alternatives. He attributes the continued low reliability of personality diagnoses, compared with the other major sections of DSM-III, to two inherent obstacles: the lack of clear boundaries demarcating the personality disorders from normality and from one another, and the confounding influence of state and role factors. Nonetheless, the DSM-III multiaxial system highlights the importance of personality diagnosis and, together with the provision of clearly specified diagnostic criteria, achieves a considerably improved reliability compared with previous nomenclatures.

  7. Mars Exploration Rover: surface operations

    NASA Technical Reports Server (NTRS)

    Erickson, J. K.; Adler, M.; Crisp, J.; Mishkin, A.; Welch, R.

    2002-01-01

    This paper will provide an overview of the planned mission, and also focus on the different operations challenges inherent in operating these two off-road vehicles, and the solutions adopted to enable the best utilization of their capabilities for high science return and responsiveness to scientific discovery.

  8. Correction of Visual Perception Based on Neuro-Fuzzy Learning for the Humanoid Robot TEO.

    PubMed

    Hernandez-Vicen, Juan; Martinez, Santiago; Garcia-Haro, Juan Miguel; Balaguer, Carlos

    2018-03-25

    New applications related to robotic manipulation or transportation tasks, with or without physical grasping, are continuously being developed. To perform these activities, the robot takes advantage of different kinds of perceptions. One of the key perceptions in robotics is vision. However, some problems related to image processing make the application of visual information within robot control algorithms difficult. Camera-based systems have inherent errors that affect the quality and reliability of the information obtained. The need to correct image distortion slows down image parameter computing, which decreases the performance of control algorithms. In this paper, a new approach to correcting several sources of visual distortion on images in only one computing step is proposed. The goal of this system/algorithm is the computation of the tilt angle of an object transported by a robot, minimizing image inherent errors and increasing computing speed. After capturing the image, the computer system extracts the angle using a Fuzzy filter that corrects all possible distortions at the same time, obtaining the real angle in only one processing step. This filter has been developed by means of Neuro-Fuzzy learning techniques, using datasets with information obtained from real experiments. In this way, the computing time has been decreased and the performance of the application has been improved. The resulting algorithm has been tried out experimentally in robot transportation tasks with the humanoid robot TEO (Task Environment Operator) from the University Carlos III of Madrid.

  9. Correction of Visual Perception Based on Neuro-Fuzzy Learning for the Humanoid Robot TEO

    PubMed Central

    2018-01-01

    New applications related to robotic manipulation or transportation tasks, with or without physical grasping, are continuously being developed. To perform these activities, the robot takes advantage of different kinds of perceptions. One of the key perceptions in robotics is vision. However, some problems related to image processing make the application of visual information within robot control algorithms difficult. Camera-based systems have inherent errors that affect the quality and reliability of the information obtained. The need to correct image distortion slows down image parameter computing, which decreases the performance of control algorithms. In this paper, a new approach to correcting several sources of visual distortion on images in only one computing step is proposed. The goal of this system/algorithm is the computation of the tilt angle of an object transported by a robot, minimizing image inherent errors and increasing computing speed. After capturing the image, the computer system extracts the angle using a Fuzzy filter that corrects all possible distortions at the same time, obtaining the real angle in only one processing step. This filter has been developed by means of Neuro-Fuzzy learning techniques, using datasets with information obtained from real experiments. In this way, the computing time has been decreased and the performance of the application has been improved. The resulting algorithm has been tried out experimentally in robot transportation tasks with the humanoid robot TEO (Task Environment Operator) from the University Carlos III of Madrid. PMID:29587392

  10. CLON: Overlay Networks and Gossip Protocols for Cloud Environments

    NASA Astrophysics Data System (ADS)

    Matos, Miguel; Sousa, António; Pereira, José; Oliveira, Rui; Deliot, Eric; Murray, Paul

    Although epidemic or gossip-based multicast is a robust and scalable approach to reliable data dissemination, its inherent redundancy results in high resource consumption on both links and nodes. This problem is aggravated in settings with costlier or resource-constrained links, as happens in Cloud Computing infrastructures composed of several interconnected data centers across the globe.
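
    The redundancy referred to above can be seen in a tiny push-gossip simulation (network size, fanout and round count are illustrative assumptions): the number of transmissions needed to inform every node greatly exceeds the theoretical minimum of one message per node.

```python
# Tiny push-gossip simulation illustrating the redundancy mentioned above
# (fanout, rounds and network size are illustrative assumptions).
import random

random.seed(7)

def gossip(n_nodes=1000, fanout=4, rounds=12):
    infected = {0}                      # node 0 originates the message
    transmissions = 0
    for _ in range(rounds):
        new = set()
        for node in infected:
            for _ in range(fanout):     # each informed node pushes to random peers
                peer = random.randrange(n_nodes)
                transmissions += 1
                if peer not in infected:
                    new.add(peer)
        infected |= new
    return len(infected), transmissions

reached, sent = gossip()
print(f"nodes reached: {reached}/1000, transmissions: {sent} (minimum would be 999)")
```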

  11. Laser Capture Microdissection Revisited as a Tool for Transcriptomic Analysis: Application of an Excel-Based qPCR Preparation Software (PREXCEL-Q)

    USDA-ARS?s Scientific Manuscript database

    The ability to reliably analyze cellular and molecular profiles of normal or diseased tissues is frequently obfuscated by the inherent heterogeneous nature of tissues. Laser Capture Microdissection (LCM) is an innovative technique that allows the isolation and enrichment of pure subpopulations of c...

  12. Reconceptualisation of Approaches to Teaching Evaluation in Higher Education

    ERIC Educational Resources Information Center

    Tran, Nga D.

    2015-01-01

    The ubiquity of using Student Evaluation of Teaching (SET) in higher education is inherently controversial. Issues mostly revolve around whether the instrument is reliable and valid for the purpose for which it was intended. Controversies exist, in part, due to the lack of a theoretical framework upon which SETs can be based and tested for their…

  13. The Q-Sort method: use in landscape assessment research and landscape planning

    Treesearch

    David G. Pitt; Ervin H. Zube

    1979-01-01

    The assessment of visual quality inherently involves the measurement of perceptual response to landscape. The Q-Sort Method is a psychometric technique which produces reliable and valid interval measurements of people's perceptions of landscape visual quality as depicted in photographs. It is readily understood by participants across a wide range of age groups and...

  14. Students, History Textbooks, and the Hidden Dimension. Occasional Paper Number 77-1.

    ERIC Educational Resources Information Center

    Kingman, Barry

    Since history textbooks omit and/or emphasize certain data, students are left with a false sense of history. Although the "hard data" presented in history texts is generally regarded as reliable, the selection and organization of that data is inherently manipulative because other data has been excluded. Because authors do not begin with a…

  15. Reliability Analysis of Retaining Walls Subjected to Blast Loading by Finite Element Approach

    NASA Astrophysics Data System (ADS)

    GuhaRay, Anasua; Mondal, Stuti; Mohiuddin, Hisham Hasan

    2018-02-01

    Conventional design methods adopt factors of safety based on practice and experience, which are deterministic in nature. The limit state method, though not completely deterministic, does not account for the effects of design parameters that are inherently variable, such as soil cohesion and angle of internal friction. Reliability analysis provides a way to bring these variations into the analysis and hence results in a more realistic design. Several studies have been carried out on the reliability of reinforced concrete walls and masonry walls under explosions, and reliability analyses of retaining structures against various kinds of failure have been performed. However, very little research is available on the reliability analysis of retaining walls subjected to blast loading. The present paper therefore considers the effect of variation of geotechnical parameters when a retaining wall is subjected to blast loading. It is found, however, that the variation of the geotechnical random variables does not have a significant effect on the stability of retaining walls subjected to blast loading.
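
    As a generic illustration of the kind of reliability analysis the abstract describes (not the authors' finite element model), a limit-state function g = R - S with random soil parameters can be evaluated by Monte Carlo sampling to obtain a failure probability and a reliability index. The distributions, parameter values, and resistance model below are assumptions made only for this sketch.

      # Crude Monte Carlo reliability sketch for a generic limit state g = R - S.
      # Distributions, means, and the resistance model are placeholder assumptions,
      # not the retaining-wall/blast model of the paper.
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(0)
      n = 200_000
      cohesion = rng.lognormal(mean=np.log(20.0), sigma=0.2, size=n)   # kPa
      phi_deg = rng.normal(loc=32.0, scale=2.0, size=n)                # friction angle, deg
      demand = rng.normal(loc=60.0, scale=10.0, size=n)                # blast-induced demand

      # Placeholder resistance model combining the random soil parameters.
      resistance = 2.5 * cohesion + 1.5 * np.tan(np.radians(phi_deg)) * 40.0

      g = resistance - demand                      # limit state: failure when g < 0
      pf = np.mean(g < 0.0)                        # failure probability
      beta = -norm.ppf(pf) if pf > 0 else np.inf   # reliability index
      print(f"Pf ~= {pf:.4f}, beta ~= {beta:.2f}")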

  16. Reliability Considerations for the Operation of Large Accelerator User Facilities

    DOE PAGES

    Willeke, F. J.

    2016-01-29

    The lecture provides an overview of considerations relevant to achieving highly reliable operation of accelerator-based user facilities. The article starts with an overview of the statistical reliability formalism, which is followed by high-reliability design considerations with examples. Finally, the article closes with operational aspects of high reliability, such as preventive maintenance and spares inventory.
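
    The abstract does not reproduce the formalism itself, but the constant-failure-rate relations that such overviews commonly start from can be stated as follows (standard textbook expressions, not necessarily the notation used in the lecture):

      % Constant failure rate \lambda: survival (reliability) function, MTBF,
      % and reliability of a series chain of independent subsystems.
      R(t) = e^{-\lambda t}, \qquad
      \mathrm{MTBF} = \int_{0}^{\infty} R(t)\,dt = \frac{1}{\lambda}, \qquad
      R_{\mathrm{series}}(t) = \prod_{i} R_i(t)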

  17. Development of an inherently digital transducer

    NASA Technical Reports Server (NTRS)

    Richard, R. R.

    1972-01-01

    The term digital transducer normally implies the combination of conventional analog sensors with encoders or analog-to-digital converters. Because of the objectionable characteristics of most digital transducers, a program was instituted to investigate the possibility of producing a transducer that is inherently digital, instead of a transducer that is digital in the usual sense. Such a device would have improved accuracy and reliability and would have reduced power and bulk requirements because two processes, sensing and conditioning, would be combined into one process. A Curie-point-temperature sensor is described that represents a realization of the stated goal. Also, a metal-insulator semiconductor is described that does not conform precisely to the program goals but that appears to have applications as a new and interesting transduction device.

  18. Research on best practices for winter weather operations.

    DOT National Transportation Integrated Search

    2012-10-01

    There is a growing need to identify actionable practices relative to winter weather operations. Because of the : potential and inherent hazards during cold weather, it has become increasingly important to ensure that these : practices can be effectiv...

  19. 3D multifunctional integumentary membranes for spatiotemporal cardiac measurements and stimulation across the entire epicardium

    NASA Astrophysics Data System (ADS)

    Xu, Lizhi; Gutbrod, Sarah R.; Bonifas, Andrew P.; Su, Yewang; Sulkin, Matthew S.; Lu, Nanshu; Chung, Hyun-Joong; Jang, Kyung-In; Liu, Zhuangjian; Ying, Ming; Lu, Chi; Webb, R. Chad; Kim, Jong-Seon; Laughner, Jacob I.; Cheng, Huanyu; Liu, Yuhao; Ameen, Abid; Jeong, Jae-Woong; Kim, Gwang-Tae; Huang, Yonggang; Efimov, Igor R.; Rogers, John A.

    2014-02-01

    Means for high-density multiparametric physiological mapping and stimulation are critically important in both basic and clinical cardiology. Current conformal electronic systems are essentially 2D sheets, which cannot cover the full epicardial surface or maintain reliable contact for chronic use without sutures or adhesives. Here we create 3D elastic membranes shaped precisely to match the epicardium of the heart via the use of 3D printing, as a platform for deformable arrays of multifunctional sensors, electronic and optoelectronic components. Such integumentary devices completely envelop the heart, in a form-fitting manner, and possess inherent elasticity, providing a mechanically stable biotic/abiotic interface during normal cardiac cycles. Component examples range from actuators for electrical, thermal and optical stimulation, to sensors for pH, temperature and mechanical strain. The semiconductor materials include silicon, gallium arsenide and gallium nitride, co-integrated with metals, metal oxides and polymers, to provide these and other operational capabilities. Ex vivo physiological experiments demonstrate various functions and methodological possibilities for cardiac research and therapy.

  20. Exploring the Charge Transport in Conjugated Polymers.

    PubMed

    Xu, Yong; Sun, Huabin; Li, Wenwu; Lin, Yen-Fu; Balestra, Francis; Ghibaudo, Gerard; Noh, Yong-Young

    2017-11-01

    Conjugated polymers have reached an unprecedented stage in which charge transport is limited only by small disorder within aggregated domains. Accurate evaluation of transport performance is thus vital to further optimization of molecular design. Yet the routine method based on conventional field-effect transistors may not satisfy such a requirement. Here, it is shown that the extrinsic effects of the Schottky barrier, access transport through the semiconductor bulk, and concurrent ambipolar conduction seriously influence transport analysis. Planar transistors incorporating ohmic contacts, free of access and ambipolar conduction, afford ideal access to charge transport. It is found, however, that only planar transistors operating in the low-field regime are reliable for exploring the inherent transport properties, because the lateral field induced by a high drain voltage lowers the energetic disorder. This work opens up a robust approach to comprehending the delicate charge transport in conjugated polymers so as to develop high-performance semiconducting polymers for promising plastic electronics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
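
    For context, the low-field (linear-regime) mobility extraction that such transistor analyses rely on is the textbook gradual-channel expression below; this is a standard relation, not a formula quoted from the paper:

      % Linear-regime drain current of a field-effect transistor (|V_D| << |V_G - V_T|)
      I_D = \frac{W}{L}\,\mu\,C_i\,(V_G - V_T)\,V_D
      \quad\Longrightarrow\quad
      \mu = \frac{L}{W\,C_i\,V_D}\,\frac{\partial I_D}{\partial V_G}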

  1. Review of two-phase flow liquid metal MHD and turbine energy conversion concepts for space applications

    NASA Technical Reports Server (NTRS)

    Fabris, Gracio

    1992-01-01

    Two-phase energy conversion systems could be liquid metal magnetohydrodynamic (LMMHD) generators with no moving parts or two-phase turbines. Both are inherently simple and reliable devices which can operate over a wide range of temperatures. Their thermal efficiency is significantly higher than that of conventional cycles due to reheat of the vapor by the liquid phase during the energy-converting expansion. Often they can be more easily coupled to heat sources. These features make two-phase systems particularly promising for space applications, yet insufficient research has been done in the past. The LMMHD generator and two-phase turbine efficiencies achieved so far are in the 40 to 45 percent range. However, if certain fluid-dynamic and design problems are resolved, these efficiencies could be brought into the range of 70 percent. This would make two-phase systems extremely competitive compared with present or other proposed conversion systems for space. Accordingly, a well-directed research effort on potential space applications of two-phase conversion systems would be a wise investment.

  2. Asymmetric Supercapacitor Electrodes and Devices.

    PubMed

    Choudhary, Nitin; Li, Chao; Moore, Julian; Nagaiah, Narasimha; Zhai, Lei; Jung, Yeonwoong; Thomas, Jayan

    2017-06-01

    The world is currently witnessing an explosive development of novel electronic and optoelectronic devices that demand more-reliable power sources combining higher energy density and longer-term durability. Supercapacitors have become one of the most promising energy-storage systems, as they present the multifold advantages of high power density, fast charging-discharging, and long cyclic stability. However, the intrinsically low energy density inherent to traditional supercapacitors severely limits their widespread applications, triggering researchers to explore new types of supercapacitors with improved performance. Asymmetric supercapacitors (ASCs) assembled using two dissimilar electrode materials offer the distinct advantage of a wide operational voltage window, and thereby significantly enhance the energy density. Recent progress made in the field of ASCs is critically reviewed, with the main focus on an extensive survey of the materials developed for ASC electrodes, as well as covering the progress made in the fabrication of ASC devices over the last few decades. Current challenges and a future outlook of the field of ASCs are also discussed. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
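
    The energy-density argument for asymmetric cells follows directly from the standard capacitor relations below (general textbook expressions, not equations taken from the review): widening the operating voltage window V raises the stored energy quadratically.

      % Stored energy and maximum power of a supercapacitor cell
      E = \tfrac{1}{2}\,C V^{2}, \qquad
      P_{\max} = \frac{V^{2}}{4\,R_{\mathrm{ESR}}}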

  3. Ultrahigh Temperature Capacitive Pressure Sensor

    NASA Technical Reports Server (NTRS)

    Harsh, Kevin

    2014-01-01

    Robust, miniaturized sensing systems are needed to improve performance, increase efficiency, and track system health status and failure modes of advanced propulsion systems. Because microsensors must operate in extremely harsh environments, there are many technical challenges involved in developing reliable systems. In addition to high temperatures and pressures, sensing systems are exposed to oxidation, corrosion, thermal shock, fatigue, fouling, and abrasive wear. In these harsh conditions, sensors must be able to withstand high flow rates, vibration, jet fuel, and exhaust. In order for existing and future aeropropulsion turbine engines to improve safety and reduce cost and emissions while controlling engine instabilities, more accurate and complete sensor information is necessary. High-temperature (300 to 1,350 C) capacitive pressure sensors are of particular interest due to their high measurement bandwidth and inherent suitability for wireless readout schemes. The objective of this project is to develop a capacitive pressure sensor based on silicon carbon nitride (SiCN), a new class of high-temperature ceramic materials, which possesses excellent mechanical and electric properties at temperatures up to 1,600 C.
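
    The sensing principle behind a capacitive pressure sensor is the parallel-plate relation below; as pressure deflects the diaphragm and reduces the gap d, the capacitance rises. This is a generic relation, not a design equation from the project:

      % Parallel-plate capacitance; pressure-induced diaphragm deflection reduces d
      C = \frac{\varepsilon_0 \varepsilon_r A}{d}, \qquad
      \frac{\Delta C}{C} \approx \frac{\Delta d}{d - \Delta d}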

  4. Qubits, qutrits, and ququads stored in single photons from an atom-cavity system

    NASA Astrophysics Data System (ADS)

    Holleczek, Annemarie; Barter, Oliver; Langfahl-Klabes, Gunnar; Kuhn, Axel

    2015-03-01

    One of today's challenges in realizing computing based on quantum mechanics is to reliably and scalably encode information in quantum systems. Here, we present a photon source that delivers photonic quantum bits of information on demand, based on a strongly coupled atom-cavity system. It operates intermittently for periods of up to 100 μs, with a single-photon repetition rate of 1 MHz, and an intra-cavity production efficiency of up to 85%. Due to the photons' inherent coherence time of 500 ns and our ability to arbitrarily shape their amplitude and phase profile, we time-bin encode information within one photon. To do so, the spatio-temporal envelope of a single photon is sub-divided into d time bins, which allows for the delivery of arbitrary qu-d-its. The latter is done with a fidelity of >95% for qubits and 94% for qutrits, verified using a newly developed time-resolved quantum-homodyne technique.
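
    A time-bin encoded qu-d-it of the kind described here can be written as a superposition over the d temporal bins of the single-photon wavepacket; this is the standard notation rather than the paper's own:

      % Single photon spread over d time bins carrying a qu-d-it of information
      \lvert \psi \rangle = \sum_{k=0}^{d-1} c_k \,\lvert k \rangle_{\mathrm{time\ bin}},
      \qquad \sum_{k=0}^{d-1} \lvert c_k \rvert^{2} = 1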

  5. Middle Eastern rhinoplasty.

    PubMed

    Azizzadeh, Babak; Mashkevich, Grigoriy

    2010-02-01

    The ethnic appearance of the Middle Eastern nose is defined by several unique visual features, particularly a high radix, wide overprojecting dorsum, and an amorphous hanging nasal tip. These external characteristics reflect distinct structural properties of the osseo-cartilaginous nasal framework and skin-soft tissue envelope in patients of Middle Eastern extraction. The goal, and the ultimate challenge, of rhinoplasty on Middle Eastern patients is to achieve balanced aesthetic refinement, while avoiding surgical westernization. Detailed understanding of the ethnic visual harmony in a Middle Eastern nose greatly assists in preserving native nasal-facial relationships during rhinoplasty on Middle Eastern patients. Esthetic alteration of a Middle Eastern nose follows a different set of goals and principles compared with rhinoplasties on white or other ethnic patients. This article highlights the inherent nasal features of the Middle Eastern nose and reviews pertinent concepts of rhinoplasty on Middle Eastern patients. Essential considerations in the process spanning the consultation and surgery are reviewed. Reliable operative techniques that achieve a successful aesthetic outcome are discussed in detail. Copyright 2010 Elsevier Inc. All rights reserved.

  6. Manual Torque Data Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mundt, Mark Osroe; Martinez, Matthew Ronald; Varela, Jeanette Judith

    At the Pantex Plant in Amarillo, TX, Production Technicians (PTs) build and disassemble nuclear weapon systems. The weapons are held in an integrated work stand for stability and to increase the safety environment for the workers and for the materials being processed. There are many occasions in which a knob must be turned to tighten an assembly part. This can help to secure or manipulate pieces of the system. As there are so many knobs to turn, the instructions given to the PTs are to twist the knob to a hand-tight setting, without the aid of a torque wrench. There are inherent risks in this procedure as the knobs can be tightened too loosely such that the apparatus falls apart or too tightly such that the force can crush or pinch components in the system that contain energetic materials. We want to study these operations at Pantex. Our goal is to collect torque data to assess the safety and reliability of human-tooling interfaces.

  7. Work design and management in the manufacturing sector: development and validation of the Work Organisation Assessment Questionnaire.

    PubMed

    Griffiths, A; Cox, T; Karanika, M; Khan, S; Tomás, J M

    2006-10-01

    To examine the factor structure, reliability, and validity of a new context-specific questionnaire for the assessment of work and organisational factors. The Work Organisation Assessment Questionnaire (WOAQ) was developed as part of a risk assessment and risk reduction methodology for hazards inherent in the design and management of work in the manufacturing sector. Two studies were conducted. Data were collected from 524 white- and blue-collar employees from a range of manufacturing companies. Exploratory factor analysis was carried out on 28 items that described the most commonly reported failures of work design and management in companies in the manufacturing sector. Concurrent validity data were also collected. A reliability study was conducted with a further 156 employees. Principal component analysis, with varimax rotation, revealed a strong 28-item, five factor structure. The factors were named: quality of relationships with management, reward and recognition, workload, quality of relationships with colleagues, and quality of physical environment. Analyses also revealed a more general summative factor. Results indicated that the questionnaire has good internal consistency and test-retest reliability and validity. Being associated with poor employee health and changes in health related behaviour, the WOAQ factors are possible hazards. It is argued that the strength of those associations offers some estimation of risk. Feedback from the organisations involved indicated that the WOAQ was easy to use and meaningful for them as part of their risk assessment procedures. The studies reported here describe a model of the hazards to employee health and health related behaviour inherent in the design and management of work in the manufacturing sector. It offers an instrument for their assessment. The scales derived which form the WOAQ were shown to be reliable, valid, and meaningful to the user population.

  8. Cocurrent scrubber evaluation: TVA's Colbert lime-limestone wet-scrubbing pilot plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollinden, G.A.; Robards, R.F.; Moore, N.D.

    1979-01-01

    The Tennessee Valley Authority (TVA) is actively engaged in a pilot plant program to develop and/or evaluate wet-scrubbing processes for removing sulfur dioxide (SO/sub 2/) from boiler flue gas. The physical size and general arrangement of flue gas scrubbing systems have a major impact on capital investment and operating cost, as do potential operating and maintenance advantages inherent to some systems. The equipment configuration for a cocurrent scrubber reflects some of these advantages. EPRI funded TVA to perform preliminary screening tests at TVA's 1 MW pilot plant (Colbert Steam Plant) to develop operating data on the cocurrent design for use in designing and operating a 10 MW prototype cocurrent scrubber at TVA's Shawnee Scrubber Test Facility. Results of Colbert tests showed excellent sulfur dioxide removal efficiencies, generally greater than 85%, low pressure drop, and high particulate removal efficiencies. This report covers these screening tests. The results indicate that commercial application of the cocurrent scrubber concept may save substantial capital investment by reducing the number of scrubber modules and auxiliary equipment. These evaluation tests provided the basis for the design and construction of the 10 MW cocurrent scrubber at the Shawnee Facility. Operation of this scrubber began in August 1978 to develop the scale-up similarities and differences between the Colbert test program (1 MW) and the Shawnee test program (10 MW). It also demonstrated the practicality and reliability of the 10 MW prototype. Detailed results of the prototype test series will be available in late 1979.

  9. Reliability Assessment for Low-cost Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Freeman, Paul Michael

    Existing low-cost unmanned aerospace systems are unreliable, and engineers must blend reliability analysis with fault-tolerant control in novel ways. This dissertation introduces the University of Minnesota unmanned aerial vehicle flight research platform, a comprehensive simulation and flight test facility for reliability and fault-tolerance research. An industry-standard reliability assessment technique, the failure modes and effects analysis, is performed for an unmanned aircraft. Particular attention is afforded to the control surface and servo-actuation subsystem. Maintaining effector health is essential for safe flight; failures may lead to loss of control incidents. Failure likelihood, severity, and risk are qualitatively assessed for several effector failure modes. Design changes are recommended to improve aircraft reliability based on this analysis. Most notably, the control surfaces are split, providing independent actuation and dual-redundancy. The simulation models for control surface aerodynamic effects are updated to reflect the split surfaces using a first-principles geometric analysis. The failure modes and effects analysis is extended by using a high-fidelity nonlinear aircraft simulation. A trim state discovery is performed to identify the achievable steady, wings-level flight envelope of the healthy and damaged vehicle. Tolerance of elevator actuator failures is studied using familiar tools from linear systems analysis. This analysis reveals significant inherent performance limitations for candidate adaptive/reconfigurable control algorithms used for the vehicle. Moreover, it demonstrates how these tools can be applied in a design feedback loop to make safety-critical unmanned systems more reliable. Control surface impairments that do occur must be quickly and accurately detected. This dissertation also considers fault detection and identification for an unmanned aerial vehicle using model-based and model-free approaches and applies those algorithms to experimental faulted and unfaulted flight test data. Flight tests are conducted with actuator faults that affect the plant input and sensor faults that affect the vehicle state measurements. A model-based detection strategy is designed and uses robust linear filtering methods to reject exogenous disturbances, e.g. wind, while providing robustness to model variation. A data-driven algorithm is developed to operate exclusively on raw flight test data without physical model knowledge. The fault detection and identification performance of these complementary but different methods is compared. Together, enhanced reliability assessment and multi-pronged fault detection and identification techniques can help to bring about the next generation of reliable low-cost unmanned aircraft.

  10. Fracture mechanics concepts in reliability analysis of monolithic ceramics

    NASA Technical Reports Server (NTRS)

    Manderscheid, Jane M.; Gyekenyesi, John P.

    1987-01-01

    Basic design concepts for high-performance, monolithic ceramic structural components are addressed. The design of brittle ceramics differs from that of ductile metals because of the inability of ceramic materials to redistribute high local stresses caused by inherent flaws. Random flaw size and orientation requires that a probabilistic analysis be performed in order to determine component reliability. The current trend in probabilistic analysis is to combine linear elastic fracture mechanics concepts with the two parameter Weibull distribution function to predict component reliability under multiaxial stress states. Nondestructive evaluation supports this analytical effort by supplying data during verification testing. It can also help to determine statistical parameters which describe the material strength variation, in particular the material threshold strength (the third Weibull parameter), which in the past was often taken as zero for simplicity.
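
    The two-parameter Weibull failure-probability model referred to here, with the threshold strength entering as an optional third parameter, is conventionally written as below (standard form, not a formula reproduced from the report):

      % Weibull probability of failure at stress \sigma; m = Weibull modulus,
      % \sigma_0 = characteristic strength, \sigma_u = threshold strength
      % (set \sigma_u = 0 to recover the two-parameter form)
      P_f(\sigma) = 1 - \exp\!\left[-\left(\frac{\sigma - \sigma_u}{\sigma_0}\right)^{m}\right],
      \qquad \sigma \ge \sigma_u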

  11. Vibration-free stirling cryocooler for high definition microscopy

    NASA Astrophysics Data System (ADS)

    Riabzev, S. V.; Veprik, A. M.; Vilenchik, H. S.; Pundak, N.; Castiel, E.

    2009-12-01

    The normal operation of high-definition Scanning Electron and Helium Ion microscope tools often relies on maintaining particular components at cryogenic temperatures. This has traditionally been accomplished by using liquid coolants such as liquid Nitrogen. This inherently limits the useful temperature range to above 77 K, produces various operational hazards, and typically involves elevated ownership costs, inconvenient logistics, and maintenance. Mechanical coolers, which outperform this traditional method and can deliver the required cooling (even below 77 K) to the components concerned, have been well known elsewhere for many years, but their typical drawbacks, such as high purchase cost, size, low reliability, and high power consumption, have so far prevented their widespread use. A further critical drawback is the inevitable degradation of imaging performance caused by the wideband vibration export typical of mechanical coolers incorporating numerous moving components. Recent advances in the development of reliable, compact, reasonably priced, and dynamically quiet linear cryogenic coolers gave rise to so-called "dry cooling" technologies aimed at eventually replacing the traditional use of outdated liquid Nitrogen cooling facilities. Although much improved, these newer cryogenic coolers still produce relatively high vibration export, which makes them incompatible with modern high-definition microscopy tools. This has motivated further research activity towards developing a vibration-free closed-cycle mechanical cryocooler. The authors have successfully adapted the standard low-vibration Stirling cryogenic refrigerator (Ricor model K535-LV), delivering 5 W at 40 K heat lift, for use in vibration-sensitive high-definition microscopy. This has been achieved by using passive mechanical counterbalancing of the main portion of the low-frequency vibration export in combination with an active feed-forward multi-axes suppression of the residual wideband vibration, thermo-conductive vibration isolation struts, and soft vibration mounts. The attainable performance of the resulting vibration-free linear Stirling cryocooler (Ricor model K535-ULV) is evaluated through full-scale experimentation.

  12. Exercise Countermeasure Hardware Evolution on ISS: The First Decade.

    PubMed

    Korth, Deborah W

    2015-12-01

    The hardware systems necessary to support exercise countermeasures to the deconditioning associated with microgravity exposure have evolved and improved significantly during the first decade of the International Space Station (ISS), resulting in both new types of hardware and enhanced performance capabilities for initial hardware items. The original suite of countermeasure hardware supported the first crews to arrive on the ISS and the improved countermeasure system delivered in later missions continues to serve the astronauts today with increased efficacy. Due to aggressive hardware development schedules and constrained budgets, the initial approach was to identify existing spaceflight-certified exercise countermeasure equipment, when available, and modify it for use on the ISS. Program management encouraged the use of commercial-off-the-shelf (COTS) hardware, or hardware previously developed (heritage hardware) for the Space Shuttle Program. However, in many cases the resultant hardware did not meet the additional requirements necessary to support crew health maintenance during long-duration missions (3 to 12 mo) and anticipated future utilization activities in support of biomedical research. Hardware development was further complicated by performance requirements that were not fully defined at the outset and tended to evolve over the course of design and fabrication. Modifications, ranging from simple to extensive, were necessary to meet these evolving requirements in each case where heritage hardware was proposed. Heritage hardware was anticipated to be inherently reliable without the need for extensive ground testing, due to its prior positive history during operational spaceflight utilization. As a result, developmental budgets were typically insufficient and schedules were too constrained to permit long-term evaluation of dedicated ground-test units ("fleet leader" type testing) to identify reliability issues when applied to long-duration use. In most cases, the exercise unit with the most operational history was the unit installed on the ISS.

  13. Modeling regional and climatic variation of wood density and ring width in intensively managed Douglas-fir

    Treesearch

    Cosmin N. Filipescue; Eini C. Lowell; Ross Koppenaal; Al K. Mitchell

    2014-01-01

    Characteristics of annual rings are reliable indicators of growth and wood quality in trees. The main objective of our study was to model the variation in annual ring attributes due to intensive silviculture and inherent regional differences in climate and site across a wide geographic range of Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco)....

  14. Test-Retest Analyses of the Test of English as a Foreign Language. TOEFL Research Reports Report 45.

    ERIC Educational Resources Information Center

    Henning, Grant

    This study provides information about the total and component scores of the Test of English as a Foreign Language (TOEFL). First, the study provides comparative global and component estimates of test-retest, alternate-form, and internal-consistency reliability, controlling for sources of measurement error inherent in the examinees and the testing…

  15. Determining Optimal Machine Replacement Events with Periodic Inspection Intervals

    DTIC Science & Technology

    2013-03-01

    2.3 Remaining Useful Life Estimation … has some idea of the characteristic reliability inherent to that system. From assembly lines, to computers, to aircraft, quantities such as mean time to failure, mean time to critical failure, and others have been quantified to a great extent. Further, any entity concerned with cost will also have an…

  16. Theoretical Issues of Validity in the Measurement of Aided Speech Reception Threshold in Noise for Comparing Nonlinear Hearing Aid Systems.

    PubMed

    Naylor, Graham

    2016-07-01

    Adaptive Speech Reception Threshold in noise (SRTn) measurements are often used to make comparisons between alternative hearing aid (HA) systems. Such measurements usually do not constrain the signal-to-noise ratio (SNR) at which testing takes place. Meanwhile, HA systems increasingly include nonlinear features that operate differently in different SNRs, and listeners differ in their inherent SNR requirements. To show that SRTn measurements, as commonly used in comparisons of alternative HA systems, suffer from threats to their validity, to illustrate these threats with examples of potentially invalid conclusions in the research literature, and to propose ways to tackle these threats. An examination of the nature of SRTn measurements in the context of test theory, modern nonlinear HAs, and listener diversity. Examples from the audiological research literature were used to estimate typical interparticipant variation in SRTn and to illustrate cases where validity may have been compromised. There can be no doubt that SRTn measurements, when used to compare nonlinear HA systems, in principle, suffer from threats to their internal and external/ecological validity. Interactions between HA nonlinearities and SNR, and interparticipant differences in inherent SNR requirements, can act to generate misleading results. In addition, SRTn may lie at an SNR outside the range for which the HA system is designed or expected to operate in. Although the extent of invalid conclusions in the literature is difficult to evaluate, examples of studies were nevertheless identified where the risk of each form of invalidity is significant. Reliable data on ecological SNRs is becoming available, so that ecological validity can be assessed. Methodological developments that can reduce the risk of invalid conclusions include variations on the SRTn measurement procedure itself, manipulations of stimulus or scoring conditions to place SRTn in an ecologically relevant range, and design and analysis approaches that take account of interparticipant differences. American Academy of Audiology.

  17. Integrated performance and reliability specification for digital avionics systems

    NASA Technical Reports Server (NTRS)

    Brehm, Eric W.; Goettge, Robert T.

    1995-01-01

    This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs, and that addresses interdependencies between performance and reliability behavior via exchange of parameters and results between mathematical models of each type. A multi-layer tool set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process, and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool set structure. ADTS research and development to date has focused on development of a language for specification of system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS, which will ultimately encompass an integrated set of analytic and simulation-based techniques for performance, reliability, and combined design assessment.

  18. Study on evaluation of construction reliability for engineering project based on fuzzy language operator

    NASA Astrophysics Data System (ADS)

    Shi, Yu-Fang; Ma, Yi-Yi; Song, Ping-Ping

    2018-03-01

    System reliability theory has been a research hotspot of management science and systems engineering in recent years, and construction reliability is useful for the quantitative evaluation of project management level. Based on reliability theory and the target system of engineering project management, a definition of construction reliability is given. Using fuzzy mathematics theory and language operators, the value space of construction reliability is divided into seven fuzzy subsets; correspondingly, seven membership functions and fuzzy evaluation intervals are obtained through the operation of the language operator, which provides the method and parameters for the evaluation of construction reliability. This method is shown to be scientific and reasonable for construction conditions and a useful attempt at theory and method research on engineering project system reliability.
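
    As a generic illustration of a seven-subset fuzzy partition of a normalized reliability score (the paper's actual membership functions and intervals are not given in the abstract, so the triangular shapes, breakpoints, and linguistic labels below are assumptions):

      # Seven overlapping triangular fuzzy subsets over a normalized reliability
      # score in [0, 1]. Labels, shapes, and breakpoints are illustrative only.
      import numpy as np

      LABELS = ["very low", "low", "fairly low", "medium",
                "fairly high", "high", "very high"]
      CENTERS = np.linspace(0.0, 1.0, 7)          # peaks of the seven subsets
      HALF_WIDTH = 1.0 / 6.0                      # each triangle spans two intervals

      def memberships(x):
          """Degree of membership of score x in each of the seven fuzzy subsets."""
          return {lab: max(0.0, 1.0 - abs(x - c) / HALF_WIDTH)
                  for lab, c in zip(LABELS, CENTERS)}

      score = 0.72
      mu = memberships(score)
      best = max(mu, key=mu.get)
      print({k: round(v, 2) for k, v in mu.items() if v > 0})
      print(f"construction reliability {score} is evaluated as '{best}'")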

  19. A Thermal Melt Probe System for Extensive, Low-Cost Instrument Deployment Within and Beneath Ice Sheets

    NASA Astrophysics Data System (ADS)

    Winebrenner, D. P.; Elam, W. T.; Carpenter, M.; Kintner, P., III

    2014-12-01

    More numerous observations within and beneath ice sheets are needed to address a broad variety of important questions concerning ice sheets and climate. However, emplacement of instruments continues to be constrained by logistical burdens, especially in cold ice a kilometer or more thick. Electrically powered thermal melt probes are inherently logistically light and efficient, especially for reaching greater depths in colder ice. They therefore offer a means of addressing current measurement problems, but have been limited historically by a lack of technology for reliable operation at the necessary voltages and powers. Here we report field tests in Greenland of two new melt probes. We operated one probe at 2.2 kilowatts (kW) and 1050 volts (V), achieving a depth of 400 m in the ice in ~ 120 hours, without electrical failure. That depth is the second greatest achieved thus far with a thermal melt probe, exceeded only by one deployment to 1005 m in Greenland in 1968, which ended in an electrical failure. Our test run took place in two intervals separated by a year, with the probe frozen at 65 m depth during the interim, after which we re-established communication, unfroze the probe, and proceeded to the greater depth. During the second field test we operated a higher-power probe, initially at 2.5 kW and 1500 V and progressing to 4.5 kW and 2000 V. Initial data indicate that this probe achieved a descent rate of 8 m/hr, which if correct would be the fastest rate yet achieved for such probes. Moreover, we observed maintenance of vertical probe travel using pendulum steering throughout both tests, as well as autonomous descent without operator-intervention after launch. The latter suggests potential for crews of 1-2 to operate several melt probes concurrently. However, the higher power probe did suffer electrical failure of a heating element after 7 hours of operation at 2000 V (24 hours after the start of the test), contrary to expectations based on laboratory component and system testing. We are therefore revising the probe heaters using a newer but more development-intensive technology. With probe systems now validated in our tests, this will result in a reliable means to emplace instruments for studies of subglacial hydrology, ice dynamics, and possible subglacial ecologies.

  20. Radiation Challenges for Electronics in the Vision for Space Exploration

    NASA Technical Reports Server (NTRS)

    LaBel, Kenneth A.

    2006-01-01

    The slides present a brief snapshot of electronics and exploration-related challenges. Radiation effects have been the prime target; however, electronic parts reliability issues must also be considered. Modern electronics are designed with a 3-5 year lifetime. Upscreening does not improve reliability; it merely determines inherent levels. Testing costs are driven by device complexity, which increases tester complexity, beam requirements, and facility choices. Commercial devices may improve performance, but are not cost panaceas. There is a need for more cost-effective access to high-energy heavy-ion facilities such as NSCL and NSRL. Costs for capable test equipment can run more than $1M for full testing.

  1. GEOMAGIA50: An archeointensity database with PHP and MySQL

    NASA Astrophysics Data System (ADS)

    Korhonen, K.; Donadini, F.; Riisager, P.; Pesonen, L. J.

    2008-04-01

    The GEOMAGIA50 database stores 3798 archeomagnetic and paleomagnetic intensity determinations dated to the past 50,000 years. It also stores details of the measurement setup for each determination, which are used for ranking the data according to prescribed reliability criteria. The ranking system aims to alleviate the data reliability problem inherent in this kind of data. GEOMAGIA50 is based on two popular open source technologies. The MySQL database management system is used for storing the data, whereas the functionality and user interface are provided by server-side PHP scripts. This technical brief gives a detailed description of GEOMAGIA50 from a technical viewpoint.

  2. 76 FR 66328 - Callaway Golf Ball Operations, Inc., Including On-Site Leased Workers From Reliable Temp Services...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-26

    ... Operations, Inc., Including On-Site Leased Workers From Reliable Temp Services, Inc., Johnson & Hill Staffing... Golf Ball Operations, Inc., including on-site leased workers from Reliable Temp Services, Inc., and... Reliable Temp Services, Inc., Johnson & Hill Staffing and Apollo Security, Chicopee, Massachusetts, who...

  3. On the reliable probing of discrete ‘plasma bullet’ propagation

    NASA Astrophysics Data System (ADS)

    Svarnas, P.; Gazeli, K.; Gkelios, A.; Amanatides, E.; Mataras, D.

    2018-04-01

    This report is devoted to the imaging of the spatiotemporal evolution of ‘plasma bullets’ during their propagation at atmospheric pressure. Although numerous studies have been realized on this topic with high gating rate cameras, triggering issues and statistical analyses of single-shot events over different cycles of the driving high voltage have not been discussed properly. The present work demonstrates the related difficulties faced due to the inherently erratic propagation of the bullets. A way of capturing and statistically analysing discrete bullet events is introduced, which is reliable even when low gating rate cameras are used and multiple bullets are formed within the voltage cycle. The method is based on plasma observations by means of two photoelectron multiplier tubes. It is suggested that these signals correlate better with bullet propagation events than the driving voltage or bullet current waveforms do, and allow either the elimination of issues arising from erratic propagation and hardware delays or at least the quantification of certain uncertainties. Herein, the entire setup, the related concept and the limits of accuracy are discussed in detail. Snapshots of the bullets are captured and commented on, with the bullets being produced by a sinusoidally driven single-electrode plasma jet reactor operating with helium. Finally, the instantaneous velocities of bullets on the order of 10⁴-10⁵ m s⁻¹ are measured and propagation phases are distinguished in good agreement with the bibliography.
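
    The velocity estimate implied by the two photoelectron multiplier (PMT) tubes reduces to a time-of-flight measurement: the delay between the two PMT signals divided into their axial separation. The sketch below cross-correlates two synthetic PMT traces to recover that delay; the signal shapes, separation, and sampling rate are illustrative assumptions, not the paper's values.

      # Estimate plasma-bullet velocity from two PMT signals by cross-correlation
      # time-of-flight. Synthetic data; all numbers are illustrative assumptions.
      import numpy as np

      fs = 1e9                      # 1 GS/s sampling (assumed)
      separation = 0.02             # 2 cm between the two PMT viewing points (assumed)
      t = np.arange(0, 2e-6, 1 / fs)

      def pmt_pulse(t, t0, width=50e-9):
          return np.exp(-0.5 * ((t - t0) / width) ** 2)

      true_delay = 200e-9           # bullet takes 200 ns to travel 2 cm -> 1e5 m/s
      sig1 = pmt_pulse(t, 0.5e-6) + 0.02 * np.random.default_rng(0).normal(size=t.size)
      sig2 = pmt_pulse(t, 0.5e-6 + true_delay) + 0.02 * np.random.default_rng(1).normal(size=t.size)

      corr = np.correlate(sig2 - sig2.mean(), sig1 - sig1.mean(), mode="full")
      lag = np.argmax(corr) - (len(sig1) - 1)       # samples by which sig2 lags sig1
      delay = lag / fs
      print(f"estimated velocity: {separation / delay:.2e} m/s")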

  4. Linear-drive cryocoolers for the Department of Defense standard advanced dewar assembly (SADA)

    NASA Astrophysics Data System (ADS)

    Tate, Garin S.

    2005-05-01

    The Standard Advanced Dewar Assembly (SADA) is the critical module in the Department of Defense (DoD) standardization of scanning second-generation thermal imaging systems. The DoD has established a family of SADAs to fulfill a range of performance requirements for various platforms. The SADA consists of the Infrared Focal Plane Array (IRFPA), Dewar, Command & Control Electronics (C&CE), and the cryogenic cooler, and is used in platforms such as the Apache helicopter, the M1A2 Abrams main battle tank, the M2 Bradley Infantry Fighting Vehicle, and the Javelin Command Launch Unit (CLU). In support of the family of SADAs, the DoD defined a complementary family of tactical linear drive cryocoolers. The Stirling cycle linear drive cryocoolers are utilized to cool the Infrared Focal Plane Arrays (IRFPAs) in the SADAs. These coolers are required to have low input power, a quick cool-down time, low vibration output, low audible noise, and a higher reliability than currently fielded rotary coolers. These coolers must also operate in a military environment with its inherent high vibration level and temperature extremes. This paper will (1) outline the characteristics of each cryocooler, (2) present the status and results of qualification tests, (3) present the status of production efforts, and (4) present the status of efforts to increase linear drive cooler reliability.

  7. Method to Increase Performance of Foil Bearings Through Passive Thermal Management

    NASA Technical Reports Server (NTRS)

    Bruckner, Robert

    2013-01-01

    This invention is a new approach to designing foil bearings to increase their load capacity and improve their reliability through passive thermal management. In the present case, the bearing is designed in such a way as to prevent the carryover of lubricant from the exit of one sector to the inlet of the ensuing sector of the foil bearing. When such passive thermal management techniques are used, bearing load capacity is improved by multiples, and reliability is enhanced when compared to current foil bearings. This concept has recently been tested and validated, and shows that load capacity performance of foil bearings can be improved by a factor of two at relatively low speeds with potentially greater relative improvements at higher speeds. Such improvements in performance with respect to speed are typical of foil bearings. Additionally, operation of these newly conceived bearings shows much more reliability and repeatable performance. This trait can be exploited in machine design to enhance safety, reliability, and overall performance. Finally, lower frictional torque has been demonstrated when operating at lower (non-load capacity) loads, thus providing another improvement above the current state of the art. The objective of the invention is to incorporate features into a foil bearing that both enhance passive thermal management and temperature control, while at the same time improve the hydrodynamic (load capacity) performance of the foil bearing. Foil bearings are unique antifriction devices that can utilize the working fluid of a machine as a lubricant (typically air for turbines and motors, liquids for pumps), and as a coolant to remove excess energy due to frictional heating. The current state of the art of foil bearings utilizes forced cooling of the bearing and shaft, which represents poor efficiency and poor reliability. This invention embodies features that utilize the bearing geometry in such a manner as to both support load and provide an inherent and passive cooling mechanism. This cooling mechanism functions in such a way as to prevent used (higher temperature) lubricant from being carried over from the exit of one sector into the entry of the next sector of the foil bearing. The disclosed innovation is an improved foil bearing design that reduces or eliminates the need for force cooling of the bearing, while at the same time improving the load capacity of the bearing by at least a factor of two. These improvements are due to the elimination of lubricant carryover from the trailing edge of one sector into the leading edge of the next, and the mixing of used lubricant with the surrounding ambient fluid.

  8. On the ambiguity in relativistic tidal deformability

    NASA Astrophysics Data System (ADS)

    Gralla, Samuel E.

    2018-04-01

    The LIGO collaboration recently reported the first gravitational-wave constraints on the tidal deformability of neutron stars. I discuss an inherent ambiguity in the notion of relativistic tidal deformability that, while too small to affect the present measurement, may become important in the future. I propose a new way to understand the ambiguity and discuss future prospects for reliably linking observed gravitational waveforms to compact object microphysics.
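
    For reference, the tidal deformability in question is conventionally defined through the induced quadrupole response and its dimensionless form (standard definitions, not taken from this paper):

      % Tidal deformability: induced quadrupole Q_{ij} in an external tidal field E_{ij}
      Q_{ij} = -\lambda\,\mathcal{E}_{ij}, \qquad
      \lambda = \frac{2}{3}\,\frac{k_2 R^{5}}{G}, \qquad
      \Lambda = \frac{2}{3}\,k_2 \left(\frac{c^{2} R}{G m}\right)^{5}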

  9. Tactical Medical Training for Police Officers: Lessons from U.S. Special Forces

    DTIC Science & Technology

    2012-12-01

    … time is reliable, dangers inherent in law enforcement combined with police officers' working locations is often problematic. In tactical situations, a … technique. Eight people were arrested in August 2012 and indicted on sex trafficking charges for allegedly forcing teenage girls into prostitution … Six of the people detained were arrested by the Inland Child Exploitation/Prostitution Task Force. The Inland Child Exploitation/Prostitution Task…

  10. Turbomachine Sealing and Secondary Flows. Part 2; Review of Rotordynamics Issues in Inherently Unsteady Flow Systems With Small Clearances

    NASA Technical Reports Server (NTRS)

    Hendricks, R. C.; Tam, L. T.; Muszynska, A.

    2004-01-01

    Today's computational methods enable the determination of forces in complex systems, but without field validation data, or feedback, there is a high risk of failure when the design envelope is challenged. The data of Childs and Bently and field data reported in NASA Conference Proceedings serve as sources of design information for the development of these computational codes. Over time all turbomachines degrade and instabilities often develop, requiring responsible, accurate turbomachine diagnostics and proper decisions to prevent failures. The Tam et al. (numerical) and Bently and Muszynska (analytical) models corroborate one another and indicate that destabilizing factors are related through increases in the fluid-force average circumferential velocity. The stability threshold can be controlled by external swirl and swirl brakes and by increases in radial fluid film stiffness (e.g., hydrostatic and ambient pressures) to enhance rotor stability. Also cited are drum rotor self-excited oscillations, where the classic fix is to add a split or severed damper ring or cylindrical damper drum, and the Benkert-Wachter work that engendered swirl brake concepts. For a smooth-operating, reliable, long-lived machine, designers must pay very close attention to sealing dynamics and diagnostic methods. Correcting the seals enabled the space shuttle main engine high-pressure fuel turbopump (SSME HPFTP) to operate successfully.
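
    In the Bently-Muszynska style of fluid-film model referred to here, the fluid circumferential average velocity ratio (commonly written as lambda) sets the whirl frequency and hence the instability threshold. The schematic statement below is a hedged, textbook-style summary of that idea, not an equation quoted from the paper:

      % Fluid-induced instability (schematic): whirl frequency ~ \lambda \Omega;
      % onset when it reaches the rotor natural frequency \omega_n, so lowering
      % \lambda (swirl brakes, anti-swirl injection) or raising \omega_n
      % (higher radial fluid-film stiffness) raises the threshold speed.
      \omega_{\mathrm{whirl}} \approx \lambda\,\Omega, \qquad
      \Omega_{\mathrm{threshold}} \approx \frac{\omega_n}{\lambda}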

  11. Portable head computed tomography scanner--technology and applications: experience with 3421 scans.

    PubMed

    Carlson, Andrew P; Yonas, Howard

    2012-10-01

    The use of head computed tomography (CT) is standard in the management of acute brain injury; however, there are inherent risks of transport of critically ill patients. Portable CT can be brought to the patient at any location. We describe the clinical use of a portable head CT scanner (CereTom: NeuroLogica: Danvers, MA) that can be brought to the patient's bedside or to other locations such as the operating room or angiography suite. Between June of 2006 and December of 2009, a total of 3421 portable CTs were performed. A total of 3278 (95.8%) were performed in the neuroscience intensive care unit (ICU) for an average of 2.6 neuroscience ICU CT scans per day. Other locations where CTs were performed included other ICUs (n = 97), the operating room (n = 53), the emergency department (n = 1), and the angiography suite (n = 2). Most studies were non-contrasted head CT, though other modalities including xenon/CT, contrasted CT, and CT angiography were performed. Portable head CT can reliably and consistently be performed at the patient's bedside. This should lead to decreased transportation-related morbidity and improved rapid decision making in the ICU, OR, and other locations. Further studies to confirm this clinical advantage are needed. Copyright © 2011 by the American Society of Neuroimaging.

  12. An effective rectification method for lenselet-based plenoptic cameras

    NASA Astrophysics Data System (ADS)

    Jin, Jing; Cao, Yiwei; Cai, Weijia; Zheng, Wanlu; Zhou, Ping

    2016-10-01

    The lenselet-based plenoptic camera has recently drawn a lot of attention in the field of computational photography. The additional information inherent in the light field allows a wide range of applications, but some preliminary processing of the raw image is necessary before further operations. In this paper, an effective method is presented for the rotation rectification of the raw image. The rotation is caused by the imperfect positioning of the micro-lens array relative to the sensor plane in commercially available Lytro plenoptic cameras. The key to our method is locating the center of each micro-lens image. Because of vignetting, the pixel values at the centers of the micro-lens images are higher than those at the periphery. A mask is applied to probe the micro-lens image to locate the center area by finding the local maximum response. The error of the center coordinate estimate is corrected and the angle of rotation is computed via a subsequent line fitting. The algorithm is performed on two images captured by different Lytro cameras. The angles of rotation are -0.3600° and -0.0621° respectively, and the rectified raw image is useful and reliable for further operations, such as extraction of the sub-aperture images. The experimental results demonstrate that our method is efficient and accurate.
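
    The center-finding and line-fitting steps described above can be sketched generically as follows: detect local maxima of a smoothed raw image as micro-lens image centers, fit a line through the centers of one lenslet row, and take the arctangent of its slope as the rotation angle. Window sizes and thresholds are illustrative assumptions, and scipy/numpy are used in place of whatever implementation the authors employed.

      # Generic sketch of micro-lens center detection and rotation-angle estimation
      # for a plenoptic raw image. Window sizes and thresholds are assumptions.
      import numpy as np
      from scipy.ndimage import gaussian_filter, maximum_filter

      def rotation_angle(raw, lenslet_pitch=10, row_tol=3.0):
          """Estimate the rotation (degrees) of the micro-lens array from a raw image."""
          smooth = gaussian_filter(raw.astype(float), sigma=1.0)
          # Vignetting makes each micro-lens image brightest at its center,
          # so centers show up as local maxima of the smoothed image.
          local_max = (smooth == maximum_filter(smooth, size=lenslet_pitch))
          strong = local_max & (smooth > smooth.mean())
          ys, xs = np.nonzero(strong)
          # Keep the centers belonging to (approximately) one lenslet row.
          ref_y = ys[np.argmin(np.abs(ys - raw.shape[0] // 2))]
          row = np.abs(ys - ref_y) < row_tol
          # Fit y = a*x + b through that row of centers; rotation = arctan(a).
          a, _b = np.polyfit(xs[row], ys[row], 1)
          return np.degrees(np.arctan(a))

      # Usage with a synthetic, slightly rotated grid of bright spots:
      img = np.zeros((200, 200))
      angle = np.radians(-0.3)
      for gy in range(5, 200, 10):
          for gx in range(5, 200, 10):
              y = int(round(gy + np.tan(angle) * gx))
              if 0 <= y < 200:
                  img[y, gx] = 1.0
      print(rotation_angle(img))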

  13. Regulatory Risk Reduction for Advanced Reactor Technologies – FY2016 Status and Work Plan Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moe, Wayne Leland

    2016-08-01

    Millions of public and private sector dollars have been invested over recent decades to realize greater efficiency, reliability, and the inherent and passive safety offered by advanced nuclear reactor technologies. However, a major challenge in realizing those benefits resides in the existing U.S. regulatory framework. This framework governs all commercial nuclear plant construction, operations, and safety issues and is highly centered on large light water reactor (LWR) technology. The framework must be modernized to effectively deal with non-LWR advanced designs if those designs are to become part of the U.S. energy supply. The U.S. Department of Energy's (DOE) Advanced Reactor Technologies (ART) Regulatory Risk Reduction (RRR) initiative, managed by the Regulatory Affairs Department at the Idaho National Laboratory (INL), is establishing a capability that can systematically retire extraneous licensing risks associated with regulatory framework incompatibilities. This capability proposes to rely heavily on the perspectives of the affected regulated community (i.e., commercial advanced reactor designers/vendors and prospective owner/operators) yet remain tuned to assuring public safety and acceptability by regulators responsible for license issuance. The extent to which broad industry perspectives are being incorporated into the proposed framework makes this initiative unique and of potential benefit to all future domestic non-LWR applicants.

  14. A hybrid approach for nondestructive assessment and design optimisation and testing of in-service machinery

    NASA Astrophysics Data System (ADS)

    Rahman, Abdul Ghaffar Abdul; Noroozi, Siamak; Dupac, Mihai; Mahathir Syed Mohd Al-Attas, Syed; Vinney, John E.

    2013-03-01

    Complex rotating machinery requires regular condition monitoring inspections to assess its running condition and structural integrity and to prevent catastrophic failures. Machine failures can be divided into two categories. The first is wear and tear during operation, ranging from bearing defects and gear damage to misalignment, imbalance, or mechanical looseness, for which simple condition-based maintenance techniques can easily detect the root cause and trigger a remedial action process. The second factor in machine failure is inherent design faults, which usually arise for many reasons such as improper installation, poor servicing, bad workmanship, and structural dynamics design deficiencies. In fact, individual machine components are generally dynamically well designed and rigorously tested. However, when these machines are assembled on site and linked together, their dynamic characteristics change, causing unexpected behaviour of the system. Since nondestructive evaluation provides an excellent alternative to classical monitoring and has proved attractive due to the possibility of performing reliable assessments of all types of machinery, the novel dynamic design verification procedure proposed here, based on the combination of in-service operating deflection shape measurement, experimental modal analysis, and iterative inverse finite element analysis, allows quick identification of structural weaknesses and helps to provide and verify solutions.

  15. Non-contact temperature measurements in support of microgravity combustion experiments

    NASA Technical Reports Server (NTRS)

    Greenberg, Paul S.

    1989-01-01

    Recent conceptual advances in the understanding of combustion science fundamentals in the context of microgravity processes and phenomena have resulted in an increased demand for diagnostic systems of greater sophistication. Owing primarily to the severe operational constraints that accompany the space flight environment, measurement systems to date remain fairly primitive in nature. Qualitative pictures provided by photographic recording media comprise the majority of the existing data, the remainder consisting of the output of conventional transducers, such as thermocouples, hot wires, and pressure transducers. The absence of the rather strong influence of buoyant convection renders microgravity combustion phenomena more fragile than their 1-G counterparts, so the emphasis was placed on nonperturbing optical diagnostics. Other factors, such as limited supplies of expendable reactants and the need for microgravity periods of sufficient duration, coupled with more fundamental questions regarding inherent length and time scales and reproducibility, have favored multipoint or multidimensional techniques. While the development of optical diagnostics for application to combustion science is an extremely active area at present, the peculiarities of space flight hardware severely restrict the feasibility of implementing the majority of techniques being utilized in terrestrial applications. The additional requirements for system reliability and operational simplicity have tended to promote somewhat less commonly emphasized techniques such as refractive index mapping and molecular Rayleigh scattering, which are briefly discussed.

  16. Modeling of unit operating considerations in generating-capacity reliability evaluation. Volume 1. Mathematical models, computing methods, and results. Final report. [GENESIS, OPCON and OPPLAN]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, A.D.; Ayoub, A.K.; Singh, C.

    1982-07-01

    Existing methods for generating-capacity reliability evaluation do not explicitly recognize a number of operating considerations which may have important effects on system reliability performance. Thus, current methods may yield estimates of system reliability which differ appreciably from actual observed reliability. Further, current methods offer no means of accurately studying or evaluating alternatives which may differ in one or more operating considerations. Operating considerations which are considered important in generating-capacity reliability evaluation include: unit duty cycles as influenced by load cycle shape, reliability performance of other units, unit commitment policy, and operating reserve policy; unit start-up failures distinct from unit running failures; unit start-up times; and unit outage postponability and the management of postponable outages. A detailed Monte Carlo simulation computer model called GENESIS and two analytical models called OPCON and OPPLAN have been developed which are capable of incorporating the effects of many operating considerations, including those noted above. These computer models have been used to study a variety of actual and synthetic systems and are available from EPRI. The new models are shown to produce system reliability indices which differ appreciably from index values computed using traditional models which do not recognize operating considerations.
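
    The GENESIS, OPCON, and OPPLAN models themselves are not reproduced in this record. Purely as an illustration of how a Monte Carlo simulation can estimate a generating-capacity reliability index while treating start-up failures separately from running failures, the following Python sketch uses invented unit parameters; the capacities, failure rates, repair rates, and start-up failure probabilities are hypothetical and are not taken from the report.

      import random

      # Hypothetical two-unit system serving a constant 120 MW load.
      # Each unit has a capacity, a probability of failing to start, a running
      # failure rate (per hour), and a repair rate (per hour); all values are
      # illustrative only.
      UNITS = [
          {"cap": 100.0, "p_start_fail": 0.02, "lam": 0.001, "mu": 0.02},
          {"cap": 80.0,  "p_start_fail": 0.05, "lam": 0.002, "mu": 0.05},
      ]
      LOAD = 120.0
      HOURS = 24 * 365          # one simulated year per replication
      REPLICATIONS = 200

      def simulate_year(rng):
          """Return hours of lost load in one simulated year."""
          up = [rng.random() > u["p_start_fail"] for u in UNITS]
          lole_hours = 0
          for _ in range(HOURS):
              for i, u in enumerate(UNITS):
                  if up[i] and rng.random() < u["lam"]:        # running failure
                      up[i] = False
                  elif not up[i] and rng.random() < u["mu"]:   # repair finished
                      up[i] = rng.random() > u["p_start_fail"] # restart may fail
              available = sum(u["cap"] for i, u in enumerate(UNITS) if up[i])
              if available < LOAD:
                  lole_hours += 1
          return lole_hours

      rng = random.Random(42)
      results = [simulate_year(rng) for _ in range(REPLICATIONS)]
      print("Estimated loss-of-load expectation (hours/year):",
            sum(results) / REPLICATIONS)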

  17. Nanopatterned Quantum Dot Lasers for High Speed, High Efficiency, Operation

    DTIC Science & Technology

    2015-04-27

    significant inhomogeneous broadening of the spectral gain. SK QDs inherently form on top of a two-dimensional “wetting layer”, leading to weak electron and hole confinement to the QD, which results in low gain... exhibit full three-dimensional nano-scale confinement and elimination of the wetting layer states. The objectives of this project were to develop

  18. Evaluation and display of polarimetric image data using long-wave cooled microgrid focal plane arrays

    NASA Astrophysics Data System (ADS)

    Bowers, David L.; Boger, James K.; Wellems, L. David; Black, Wiley T.; Ortega, Steve E.; Ratliff, Bradley M.; Fetrow, Matthew P.; Hubbs, John E.; Tyo, J. Scott

    2006-05-01

    Recent developments for Long Wave InfraRed (LWIR) imaging polarimeters include incorporating a microgrid polarizer array onto the focal plane array (FPA). Inherent advantages over typical polarimeters include packaging and instantaneous acquisition of thermal and polarimetric information. This allows for real-time video of thermal and polarimetric products. The microgrid approach has inherent polarization measurement error due to the spatial sampling of a non-uniform scene, residual pixel to pixel variations in the gain-corrected responsivity and in the noise equivalent input (NEI), and variations in the pixel to pixel micro-polarizer performance. The Degree of Linear Polarization (DoLP) is highly sensitive to these parameters and is consequently used as a metric to explore instrument sensitivities. Image processing and fusion techniques are used to take advantage of the inherent thermal and polarimetric sensing capability of this FPA, providing additional scene information in real time. Optimal operating conditions are employed to improve FPA uniformity and sensitivity. Data from two DRS Infrared Technologies, L.P. (DRS) microgrid polarizer HgCdTe FPAs are presented. One FPA resides in a liquid nitrogen (LN2) pour-filled dewar with an 80 K nominal operating temperature. The other FPA resides in a cryogenic (cryo) dewar with a 60 K nominal operating temperature.
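
    As a back-of-the-envelope illustration of the Degree of Linear Polarization metric used above, the following Python sketch computes DoLP from the four micro-polarizer orientations of a 2x2 super-pixel; the intensity arrays are synthetic placeholders rather than calibrated FPA data.

      import numpy as np

      # Synthetic intensities for the four micro-polarizer orientations
      # (0, 45, 90, 135 degrees) of each super-pixel; real data would come from
      # the gain/NEI-corrected focal plane array.
      rng = np.random.default_rng(0)
      I0, I45, I90, I135 = rng.uniform(100.0, 200.0, size=(4, 64, 64))

      # Linear Stokes parameters estimated per super-pixel.
      S0 = 0.5 * (I0 + I45 + I90 + I135)
      S1 = I0 - I90
      S2 = I45 - I135

      # Degree of Linear Polarization; near-zero S0 values are masked so the
      # division stays well behaved.
      DoLP = np.sqrt(S1**2 + S2**2) / np.where(S0 > 1e-6, S0, np.nan)
      print("mean DoLP:", float(np.nanmean(DoLP)))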

  19. Defect recognition in CFRP components using various NDT methods within a smart manufacturing process

    NASA Astrophysics Data System (ADS)

    Schumacher, David; Meyendorf, Norbert; Hakim, Issa; Ewert, Uwe

    2018-04-01

    The manufacturing process of carbon fiber reinforced polymer (CFRP) components is gaining an increasingly significant role given the growing amount of CFRPs used in industry today. The monitoring of the manufacturing process, and hence the reliability of the manufactured products, is one of the major challenges we need to face in the near future. Common defects which arise during the manufacturing process are, e.g., porosity and voids, which may lead to delaminations during operation and under load. Finding irregularities and classifying them as possible defects in an early stage of the manufacturing process is of high importance for the safety and reliability of the finished products, as well as of significant impact from an economic point of view. In this study we compare various NDT methods which were applied to similar CFRP laminate samples in order to detect and characterize regions of defective volume. Besides ultrasound, thermography and eddy current, different X-ray methods like radiography, laminography and computed tomography are used to investigate the samples. These methods are compared with the intention of evaluating their capability to reliably detect and characterize defective volume. Beyond the detection and evaluation of defects, we also investigate possibilities to combine various NDT methods within a smart manufacturing process in which the decision of which method shall be applied is inherent within the process. Is it possible to design an in-line or at-line testing process which can recognize defects reliably while reducing testing time and costs? This study aims to highlight opportunities for designing a smart NDT process synchronized to production, based on the concepts of smart production (Industry 4.0). A set of defective CFRP laminate samples and different NDT methods were used to demonstrate how effectively defects are recognized and how communication between interconnected NDT sensors and the manufacturing process could be organized.

  20. Failure Modes Effects and Criticality Analysis, an Underutilized Safety, Reliability, Project Management and Systems Engineering Tool

    NASA Astrophysics Data System (ADS)

    Mullin, Daniel Richard

    2013-09-01

    The majority of space programs, whether manned or unmanned, for science or exploration, require that a Failure Modes Effects and Criticality Analysis (FMECA) be performed as part of their safety and reliability activities. This comes as no surprise given that FMECAs have been an integral part of the reliability engineer's toolkit since the 1950s. The reasons for performing a FMECA are well known, including fleshing out system single point failures, system hazards, and critical components and functions. However, in the author's ten years' experience as a space systems safety and reliability engineer, findings demonstrate that the FMECA is often performed as an afterthought, simply to meet contract deliverable requirements, and is often started long after the system requirements allocation and preliminary design have been completed. There are also important qualitative and quantitative components often missing which can provide useful data to all project stakeholders. These include: probability of occurrence, probability of detection, time to effect and time to detect, and, finally, the Risk Priority Number. This is unfortunate, as the FMECA is a powerful system design tool that, when used effectively, can help optimize system function while minimizing the risk of failure. When performed as early as possible in conjunction with writing the top level system requirements, the FMECA can provide instant feedback on the viability of the requirements while providing a valuable sanity check early in the design process. It can indicate which areas of the system will require redundancy and which areas are inherently the most risky from the onset. Based on historical and practical examples, it is this author's contention that FMECAs are an immense source of important information for all involved stakeholders in a given project and can provide several benefits, including efficient project management with respect to cost and schedule, system engineering and requirements management, assembly integration and test (AI&T), and operations, if applied early, performed to completion, and updated along with the system design.

  1. Operator adaptation to changes in system reliability under adaptable automation.

    PubMed

    Chavaillaz, Alain; Sauer, Juergen

    2017-09-01

    This experiment examined how operators coped with a change in system reliability between training and testing. Forty participants were trained for 3 h on a complex process control simulation modelling six levels of automation (LOA). In training, participants either experienced a high- (100%) or low-reliability system (50%). The impact of training experience on operator behaviour was examined during a 2.5 h testing session, in which participants either experienced a high- (100%) or low-reliability system (60%). The results showed that most operators did not often switch between LOA. Most chose an LOA that relieved them of most tasks but maintained their decision authority. Training experience did not have a strong impact on the outcome measures (e.g. performance, complacency). Low system reliability led to decreased performance and self-confidence. Furthermore, complacency was observed under high system reliability. Overall, the findings suggest benefits of adaptable automation because it accommodates different operator preferences for LOA. Practitioner Summary: The present research shows that operators can adapt to changes in system reliability between training and testing sessions. Furthermore, it provides evidence that each operator has his/her preferred automation level. Since this preference varies strongly between operators, adaptable automation seems to be suitable to accommodate these large differences.

  2. External quality-assurance programs managed by the U.S. Geological Survey in support of the National Atmospheric Deposition Program/National Trends Network

    USGS Publications Warehouse

    Latysh, Natalie E.; Wetherbee, Gregory A.

    2005-01-01

    The U.S. Geological Survey, Branch of Quality Systems, operates the external quality-assurance programs for the National Atmospheric Deposition Program/National Trends Network (NADP/NTN). Beginning in 1978, six different programs have been implemented: the intersite-comparison program, the blind-audit program, the sample-handling evaluation program, the field-audit program, the interlaboratory-comparison program, and the collocated-sampler program. Each program was designed to measure error contributed by specific components in the data-collection process. The intersite-comparison program, which was discontinued in 2004, was designed to assess the accuracy and reliability of field pH and specific-conductance measurements made by site operators. The blind-audit and sample-handling evaluation programs, which also were discontinued in 2002 and 2004, respectively, assessed contamination that may result from sampling equipment and routine handling and processing of the wet-deposition samples. The field-audit program assesses the effects of sample handling, processing, and field exposure. The interlaboratory-comparison program evaluates bias and precision of analytical results produced by the contract laboratory for NADP, the Illinois State Water Survey, Central Analytical Laboratory, and compares its performance with the performance of international laboratories. The collocated-sampler program assesses the overall precision of wet-deposition data collected by NADP/NTN. This report documents historical operations and the operating procedures for each of these external quality-assurance programs. USGS quality-assurance information allows NADP/NTN data users to discern between actual environmental trends and inherent measurement variability.

  3. SOI N-Channel Field Effect Transistors, CHT-NMOS80, for Extreme Temperatures

    NASA Technical Reports Server (NTRS)

    Patterson, Richard L.; Hammoud, Ahmad

    2009-01-01

    Extreme temperatures, both hot and cold, are anticipated in many NASA space exploration missions as well as in terrestrial applications. One can seldom find electronics that are capable of operation under both regimes. Even for operation under one temperature extreme (hot or cold), some thermal controls need to be introduced to provide appropriate ambient temperatures so that spacecraft on-board or field on-site electronic systems work properly. The inclusion of these controls, which comprise heating elements and radiators along with their associated structures, adds to the complexity of the system design, increases cost and weight, and affects overall reliability. Thus, it would be highly desirable and very beneficial to eliminate these thermal measures in order to simplify the system's design, improve efficiency, reduce development and launch costs, and improve reliability. These requirements can only be met through the development of electronic parts that are designed for proper and efficient operation under extreme temperature conditions. Silicon-on-insulator (SOI) based devices are finding more use in harsh environments due to the benefits that their inherent design offers in terms of reduced leakage currents, less power consumption, faster switching speeds, good radiation tolerance, and extreme temperature operability. Little is known, however, about their performance at cryogenic temperatures and under wide thermal swings. The objective of this work was to evaluate the performance of new commercial-off-the-shelf (COTS) SOI parts over an extended temperature range and to determine the effects of thermal cycling on their performance. The results will establish a baseline on the suitability of such devices for use in space exploration missions under extreme temperatures, and will aid mission planners and circuit designers in the proper selection of electronic parts and circuits. The electronic part investigated in this work was a CHT-NMOS80 high temperature N-channel MOSFET (metal-oxide semiconductor field-effect transistor) device manufactured by CISSOID. This high voltage, medium-power transistor is fabricated using SOI processes and is designed for extremely wide temperature applications such as geothermal well logging, aerospace and avionics, and the automotive industry. It has a high DC current capability and is specified for operation in the temperature range of -55 C to +225 C.

  4. Adaptation of the ToxRTool to Assess the Reliability of Toxicology Studies Conducted with Genetically Modified Crops and Implications for Future Safety Testing.

    PubMed

    Koch, Michael S; DeSesso, John M; Williams, Amy Lavin; Michalek, Suzanne; Hammond, Bruce

    2016-01-01

    To determine the reliability of food safety studies carried out in rodents with genetically modified (GM) crops, a Food Safety Study Reliability Tool (FSSRTool) was adapted from the European Centre for the Validation of Alternative Methods' (ECVAM) ToxRTool. Reliability was defined as the inherent quality of the study with regard to use of standardized testing methodology, full documentation of experimental procedures and results, and the plausibility of the findings. Codex guidelines for GM crop safety evaluations indicate toxicology studies are not needed when comparability of the GM crop to its conventional counterpart has been demonstrated. This guidance notwithstanding, animal feeding studies have routinely been conducted with GM crops, but their conclusions on safety are not always consistent. To accurately evaluate potential risks from GM crops, risk assessors need clearly interpretable results from reliable studies. The development of the FSSRTool, which provides the user with a means of assessing the reliability of a toxicology study to inform risk assessment, is discussed. Its application to the body of literature on GM crop food safety studies demonstrates that reliable studies report no toxicologically relevant differences between rodents fed GM crops or their non-GM comparators.

  5. Integrated Application of Active Controls (IAAC) technology to an advanced subsonic transport project: Test act system description

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The engineering and fabrication of the test ACT system, produced in the third program element of the IAAC Project is documented. The system incorporates pitch-augmented stability and wing-load alleviation, plus full authority fly-by-wire control of the elevators. The pitch-augmented stability is designed to have reliability sufficient to allow flight with neutral or negative inherent longitudinal stability.

  6. Using fine-scale fuel measurements to assess wildland fuels, potential fire behavior and hazard mitigation treatments in the southeastern USA

    Treesearch

    Roger D. Ottmar; John I. Blake; William T. Crolly

    2012-01-01

    The inherent spatial and temporal heterogeneity of fuel beds in forests of the southeastern United States may require fine scale fuel measurements for providing reliable fire hazard and fuel treatment effectiveness estimates. In a series of five papers, an intensive, fine scale fuel inventory from the Savanna River Site in the southeastern United States is used for...

  7. Review of Literature on Probability of Detection for Liquid Penetrant Nondestructive Testing

    DTIC Science & Technology

    2011-11-01

    increased maintenance costs, or catastrophic failure of safety-critical structure. Knowledge of the reliability achieved by NDT methods, including...representative components to gather data for statistical analysis, which can be prohibitively expensive. To account for sampling variability inherent in any...Sioux City and Pensacola. (Those recommendations were discussed in Section 3.4.) Drury et al. report on a factorial experiment aimed at identifying the

  8. 77 FR 24594 - Version 4 Critical Infrastructure Protection Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-25

    ... framework for the identification and protection of ``Critical Cyber Assets'' to support the reliable... documentation of Critical Cyber Assets associated with ``Critical Assets'' that support the reliable operation... ``Critical Cyber Assets'' that are associated with ``Critical Assets'' to support the reliable operation of...

  9. Inherent structures of crystalline pentacene

    NASA Astrophysics Data System (ADS)

    Della Valle, Raffaele Guido; Venuti, Elisabetta; Brillante, Aldo; Girlando, Alberto

    2003-01-01

    Using a quasi-Monte Carlo scheme, we search the potential energy surface of crystalline pentacene to sample its local minima, which represent the "inherent" structures, i.e., the possible configurations of mechanical equilibrium. The system is described in terms of rigid molecules interacting through a standard atom-atom potential model. Several hundred distinct minima are encountered, with a surprising variety of structural arrangements. We find that deep minima are easily accessible because they exhibit a favorable energy distribution and their attraction basins tend to be wide. Thanks to these features of the potential surface, the localization of the global minimum becomes entirely feasible, allowing reliable a priori predictions of the crystallographic structures. The results for pentacene are very satisfactory. In fact, the two deepest minima correspond to the structures of the two known experimental polymorphs, which are described correctly. Further polymorphs are also likely to exist.

  10. Interdependent networks: the fragility of control

    PubMed Central

    Morris, Richard G.; Barthelemy, Marc

    2013-01-01

    Recent work in the area of interdependent networks has focused on interactions between two systems of the same type. However, an important and ubiquitous class of systems are those involving monitoring and control, an example of interdependence between processes that are very different. In this Article, we introduce a framework for modelling ‘distributed supervisory control' in the guise of an electrical network supervised by a distributed system of control devices. The system is characterised by degrees of freedom salient to real-world systems, namely the number of control devices, their inherent reliability, and the topology of the control network. Surprisingly, the behavior of the system depends crucially on the reliability of control devices. When devices are completely reliable, cascade sizes are percolation controlled; the number of devices being the relevant parameter. For unreliable devices, the topology of the control network is important and can dramatically reduce the resilience of the system. PMID:24067404
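
    A toy numerical version of this kind of experiment (not the authors' actual model) can be sketched in a few lines of Python: nodes of a random graph standing in for the electrical network survive only if their supervising control device works, and the size of the largest surviving connected component serves as a crude resilience measure. The graph size, edge probability, and device count below are arbitrary illustrative choices.

      import random

      def erdos_renyi(n, p, rng):
          """Adjacency sets of a G(n, p) random graph standing in for the grid."""
          adj = {i: set() for i in range(n)}
          for i in range(n):
              for j in range(i + 1, n):
                  if rng.random() < p:
                      adj[i].add(j)
                      adj[j].add(i)
          return adj

      def giant_component(adj, alive):
          """Size of the largest connected component among surviving nodes."""
          seen, best = set(), 0
          for start in alive:
              if start in seen:
                  continue
              stack, size = [start], 0
              seen.add(start)
              while stack:
                  u = stack.pop()
                  size += 1
                  for v in adj[u]:
                      if v in alive and v not in seen:
                          seen.add(v)
                          stack.append(v)
              best = max(best, size)
          return best

      rng = random.Random(1)
      N_NODES, N_DEVICES, EDGE_P = 500, 50, 0.01
      adj = erdos_renyi(N_NODES, EDGE_P, rng)
      # Each grid node is supervised by one control device chosen at random.
      controller = [rng.randrange(N_DEVICES) for _ in range(N_NODES)]

      for device_reliability in (1.0, 0.9, 0.7):
          device_ok = [rng.random() < device_reliability for _ in range(N_DEVICES)]
          alive = {i for i in range(N_NODES) if device_ok[controller[i]]}
          frac = giant_component(adj, alive) / N_NODES
          print(f"device reliability {device_reliability}: giant component {frac:.2f}")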

  11. Optimal Implementations for Reliable Circadian Clocks

    NASA Astrophysics Data System (ADS)

    Hasegawa, Yoshihiko; Arita, Masanori

    2014-09-01

    Circadian rhythms are acquired through evolution to increase the chances for survival through synchronizing with the daylight cycle. Reliable synchronization is realized through two trade-off properties: regularity to keep time precisely, and entrainability to synchronize the internal time with daylight. We find by using a phase model with multiple inputs that achieving the maximal limit of regularity and entrainability entails many inherent features of the circadian mechanism. At the molecular level, we demonstrate the role sharing of two light inputs, phase advance and delay, as is well observed in mammals. At the behavioral level, the optimal phase-response curve inevitably contains a dead zone, a time during which light pulses neither advance nor delay the clock. We reproduce the results of phase-controlling experiments entrained by two types of periodic light pulses. Our results indicate that circadian clocks are designed optimally for reliable clockwork through evolution.
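
    As a rough sketch of the phase-only modelling approach mentioned above (not the authors' optimized model), the following Python snippet integrates a single clock phase driven by a light-dark cycle through a phase-response curve containing delay and advance regions and a dead zone; all parameter values and functional forms are invented for illustration.

      import math

      # Toy phase-only clock: dtheta/dt = omega + Z(theta) * L(t), where Z is a
      # phase-response curve with a dead zone and L is a 12 h light / 12 h dark
      # cycle. Parameter values are illustrative only.
      OMEGA = 2 * math.pi / 24.5        # free-running period of 24.5 h
      DT = 0.01                         # integration step, hours

      def light(t):
          """12 h light / 12 h dark cycle."""
          return 1.0 if (t % 24.0) < 12.0 else 0.0

      def prc(theta):
          """Phase-response curve (radians): delay, advance, then a dead zone."""
          x = theta % (2 * math.pi)
          if x < math.pi / 2:           # early subjective night: phase delay
              return -0.05 * math.sin(2 * x)
          if x < math.pi:               # late subjective night: phase advance
              return 0.05 * math.sin(2 * (x - math.pi / 2))
          return 0.0                    # subjective day: dead zone

      theta, t = 0.0, 0.0
      for _ in range(int(30 * 24 / DT)):  # 30 simulated days
          theta += DT * (OMEGA + prc(theta) * light(t))
          t += DT
      print("phase at the end of day 30 (rad, mod 2*pi):", theta % (2 * math.pi))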

  12. Enabling High-Energy, High-Voltage Lithium-Ion Cells: Standardization of Coin-Cell Assembly, Electrochemical Testing, and Evaluation of Full Cells

    DOE PAGES

    Long, Brandon R.; Rinaldo, Steven G.; Gallagher, Kevin G.; ...

    2016-11-09

    Coin cells are often the test format of choice for laboratories engaged in battery research and development, as they provide a convenient platform for rapid testing of new materials on a small scale. However, obtaining reliable, reproducible data via the coin-cell format is inherently difficult, particularly in the full-cell configuration. In addition, the statistical evaluation needed to prove the consistency and reliability of such data is often neglected. Herein we report on several studies aimed at formalizing physical process parameters and coin-cell construction related to full cells. Statistical analysis and performance benchmarking approaches are advocated as a means to more confidently track changes in cell performance. Finally, we show that trends in the electrochemical data obtained from coin cells can be reliable and informative when standardized approaches are implemented in a consistent manner.
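
    As an illustration of the kind of statistical benchmarking the authors advocate (the actual metrics and data in the study may differ), the following Python sketch summarizes hypothetical replicate coin-cell capacities for two assembly procedures with a mean, a standard deviation, and an approximate confidence interval on the mean.

      import statistics
      from math import sqrt

      # Hypothetical discharge capacities (mAh/g) from replicate full coin cells
      # built with two different assembly procedures; the numbers are invented
      # purely to illustrate the statistical treatment.
      procedure_a = [172.1, 168.4, 170.9, 171.5, 169.8, 170.2]
      procedure_b = [165.3, 160.7, 168.9, 158.2, 166.1, 162.4]

      def summarize(name, data):
          mean = statistics.mean(data)
          sd = statistics.stdev(data)
          # Approximate 95% confidence interval on the mean (normal approximation).
          half_width = 1.96 * sd / sqrt(len(data))
          print(f"{name}: mean={mean:.1f}, sd={sd:.2f}, "
                f"95% CI=({mean - half_width:.1f}, {mean + half_width:.1f})")

      summarize("procedure A", procedure_a)
      summarize("procedure B", procedure_b)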

  13. 78 FR 73112 - Monitoring System Conditions-Transmission Operations Reliability Standards; Interconnection...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-05

    ..., RM13-14-000 and RM13-15-000] Monitoring System Conditions--Transmission Operations Reliability...) 502-6817, [email protected] . Robert T. Stroh (Legal Information), Office of the General... Reliability Standards ``address the important reliability goal of ensuring that the transmission system is...

  14. Operation Reliability Assessment for Cutting Tools by Applying a Proportional Covariate Model to Condition Monitoring Information

    PubMed Central

    Cai, Gaigai; Chen, Xuefeng; Li, Bing; Chen, Baojia; He, Zhengjia

    2012-01-01

    The reliability of cutting tools is critical to machining precision and production efficiency. The conventional statistic-based reliability assessment method aims at providing a general and overall estimation of reliability for a large population of identical units under given and fixed conditions. However, it has limited effectiveness in depicting the operational characteristics of a cutting tool. To overcome this limitation, this paper proposes an approach to assess the operation reliability of cutting tools. A proportional covariate model is introduced to construct the relationship between operation reliability and condition monitoring information. The wavelet packet transform and an improved distance evaluation technique are used to extract sensitive features from vibration signals, and a covariate function is constructed based on the proportional covariate model. Ultimately, the failure rate function of the cutting tool being assessed is calculated using the baseline covariate function obtained from a small sample of historical data. Experimental results and a comparative study show that the proposed method is effective for assessing the operation reliability of cutting tools. PMID:23201980
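
    A schematic of the proportional covariate idea described in this abstract is sketched below in Python: a condition indicator Z(t) is assumed proportional to the failure rate, a baseline covariate function is formed from historical data, and the indicator of the tool being assessed is then converted into a failure rate estimate. The Weibull baseline, the indicator trajectories, and all numerical values are hypothetical and are not taken from the paper.

      import numpy as np

      # Operating time grid (minutes); all values below are synthetic.
      t = np.linspace(1.0, 100.0, 100)

      # Baseline Weibull hazard fitted to (hypothetical) historical failure times.
      beta, eta = 2.5, 80.0                      # shape and scale parameters
      h_hist = (beta / eta) * (t / eta) ** (beta - 1)

      # Historical condition indicator (e.g. a normalized wavelet-packet energy
      # of the vibration signal) averaged over the historical population.
      z_hist = 0.02 * t ** 1.6

      # Proportional covariate model: Z(t) = C(t) * h(t), so the baseline
      # covariate function is C(t) = Z_hist(t) / h_hist(t).
      c_baseline = z_hist / h_hist

      # Indicator measured on the cutting tool currently being assessed.
      z_new = 0.03 * t ** 1.6
      h_new = z_new / c_baseline                 # estimated failure rate

      print("estimated failure rate at t = 50 min: %.4f per min" % h_new[49])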

  15. Work design and management in the manufacturing sector: development and validation of the Work Organisation Assessment Questionnaire

    PubMed Central

    Griffiths, A; Cox, T; Karanika, M; Khan, S; Tomás, J‐M

    2006-01-01

    Objectives To examine the factor structure, reliability, and validity of a new context‐specific questionnaire for the assessment of work and organisational factors. The Work Organisation Assessment Questionnaire (WOAQ) was developed as part of a risk assessment and risk reduction methodology for hazards inherent in the design and management of work in the manufacturing sector. Method Two studies were conducted. Data were collected from 524 white‐ and blue‐collar employees from a range of manufacturing companies. Exploratory factor analysis was carried out on 28 items that described the most commonly reported failures of work design and management in companies in the manufacturing sector. Concurrent validity data were also collected. A reliability study was conducted with a further 156 employees. Results Principal component analysis, with varimax rotation, revealed a strong 28‐item, five factor structure. The factors were named: quality of relationships with management, reward and recognition, workload, quality of relationships with colleagues, and quality of physical environment. Analyses also revealed a more general summative factor. Results indicated that the questionnaire has good internal consistency and test‐retest reliability and validity. Being associated with poor employee health and changes in health related behaviour, the WOAQ factors are possible hazards. It is argued that the strength of those associations offers some estimation of risk. Feedback from the organisations involved indicated that the WOAQ was easy to use and meaningful for them as part of their risk assessment procedures. Conclusions The studies reported here describe a model of the hazards to employee health and health related behaviour inherent in the design and management of work in the manufacturing sector. It offers an instrument for their assessment. The scales derived which form the WOAQ were shown to be reliable, valid, and meaningful to the user population. PMID:16858081

  16. Statistical sensor fusion analysis of near-IR polarimetric and thermal imagery for the detection of minelike targets

    NASA Astrophysics Data System (ADS)

    Weisenseel, Robert A.; Karl, William C.; Castanon, David A.; DiMarzio, Charles A.

    1999-02-01

    We present an analysis of statistical model based data-level fusion for near-IR polarimetric and thermal data, particularly for the detection of mines and mine-like targets. Typical detection-level data fusion methods, approaches that fuse detections from individual sensors rather than fusing at the level of the raw data, do not account rationally for the relative reliability of different sensors, nor the redundancy often inherent in multiple sensors. Representative examples of such detection-level techniques include logical AND/OR operations on detections from individual sensors and majority vote methods. In this work, we exploit a statistical data model for the detection of mines and mine-like targets to compare and fuse multiple sensor channels. Our purpose is to quantify the amount of knowledge that each polarimetric or thermal channel supplies to the detection process. With this information, we can make reasonable decisions about the usefulness of each channel. We can use this information to improve the detection process, or we can use it to reduce the number of required channels.
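
    The contrast between data-level and detection-level fusion can be illustrated with a small Monte Carlo experiment in Python. The two-channel Gaussian model, noise levels, and false-alarm targets below are invented for illustration and are not the statistical model used in the paper; the point is only that a likelihood-based fusion rule automatically down-weights the noisier channel, which a logical OR of single-channel detections cannot do.

      import numpy as np

      rng = np.random.default_rng(3)

      # Two synthetic channels (say, a polarimetric and a thermal pixel value).
      # Under H1 (target) channel k has mean mu[k]; under H0 the mean is zero.
      mu = np.array([1.0, 1.0])
      sigma = np.array([0.5, 2.0])      # channel 2 is much noisier
      N = 20000

      def simulate(target):
          means = mu if target else np.zeros(2)
          return means + sigma * rng.standard_normal((N, 2))

      def llr(x):
          """Per-sample Gaussian log-likelihood ratio summed over channels."""
          return np.sum((mu / sigma**2) * (x - mu / 2.0), axis=1)

      x0, x1 = simulate(False), simulate(True)

      # Data-level fusion: threshold the fused LLR at roughly 1% false-alarm rate.
      thr = np.quantile(llr(x0), 0.99)
      pd_fused = np.mean(llr(x1) > thr)

      # Detection-level fusion: OR of per-channel detections, each at ~0.5% FAR,
      # so the combined false-alarm rate is also roughly 1%.
      chan_thr = np.quantile(x0, 0.995, axis=0)
      pd_or = np.mean(np.any(x1 > chan_thr, axis=1))

      print(f"P(detect) fused LLR: {pd_fused:.3f}, per-channel OR: {pd_or:.3f}")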

  17. Broadband Processing in a Noisy Shallow Ocean Environment: A Particle Filtering Approach

    DOE PAGES

    Candy, J. V.

    2016-04-14

    Here we report that when a broadband source propagates sound in a shallow ocean, the received data can become quite complicated due to temperature-related sound-speed variations and therefore a highly dispersive environment. Noise and uncertainties disrupt this already chaotic environment even further, because disturbances propagate through the same inherent acoustic channel. The broadband (signal) estimation/detection problem can be decomposed into a set of narrowband solutions that are processed separately and then combined to achieve greater enhancement of signal levels than is available from a single frequency, thereby allowing more information to be extracted and leading to more reliable source detection. A Bayesian solution to the broadband modal function tracking, pressure-field enhancement, and source detection problem is developed that leads to nonparametric estimates of the desired posterior distributions, enabling the estimation of useful statistics and an improved processor/detector. In conclusion, to investigate the processor capabilities, we synthesize an ensemble of noisy, broadband, shallow-ocean measurements to evaluate its overall performance, using an information theoretical metric for the preprocessor and the receiver operating characteristic curve for the detector.
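
    A minimal bootstrap particle filter, the basic sequential Bayesian machinery behind processors of this kind, can be sketched in Python as follows; the scalar random-walk state and Gaussian measurement model are stand-ins chosen for brevity, not the ocean-acoustic state-space model developed in the paper.

      import numpy as np

      rng = np.random.default_rng(7)
      T, N = 200, 1000                    # time steps, particles
      q, r = 0.05, 0.5                    # process and measurement noise std

      # Synthetic "truth" (a slowly varying scalar, a stand-in for one modal
      # function) and the corresponding noisy measurements.
      truth = 1.0 + np.cumsum(q * rng.standard_normal(T))
      meas = truth + r * rng.standard_normal(T)

      particles = rng.normal(1.0, 1.0, N)
      estimates = np.empty(T)
      for t in range(T):
          # Predict: propagate particles through the random-walk state model.
          particles = particles + q * rng.standard_normal(N)
          # Update: weight particles by the Gaussian measurement likelihood.
          w = np.exp(-0.5 * ((meas[t] - particles) / r) ** 2)
          w /= w.sum()
          estimates[t] = np.sum(w * particles)
          # Resample (multinomial) to avoid weight degeneracy.
          particles = particles[rng.choice(N, size=N, p=w)]

      rms = float(np.sqrt(np.mean((estimates - truth) ** 2)))
      print("RMS tracking error:", rms)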

  18. 3D multifunctional integumentary membranes for spatiotemporal cardiac measurements and stimulation across the entire epicardium

    PubMed Central

    Xu, Lizhi; Gutbrod, Sarah R.; Bonifas, Andrew P.; Su, Yewang; Sulkin, Matthew S.; Lu, Nanshu; Chung, Hyun-Joong; Jang, Kyung-In; Liu, Zhuangjian; Ying, Ming; Lu, Chi; Webb, R. Chad; Kim, Jong-Seon; Laughner, Jacob I.; Cheng, Huanyu; Liu, Yuhao; Ameen, Abid; Jeong, Jae-Woong; Kim, Gwang-Tae; Huang, Yonggang; Efimov, Igor R.; Rogers, John A.

    2015-01-01

    Means for high-density multiparametric physiological mapping and stimulation are critically important in both basic and clinical cardiology. Current conformal electronic systems are essentially 2D sheets, which cannot cover the full epicardial surface or maintain reliable contact for chronic use without sutures or adhesives. Here we create 3D elastic membranes shaped precisely to match the epicardium of the heart via the use of 3D printing, as a platform for deformable arrays of multifunctional sensors, electronic and optoelectronic components. Such integumentary devices completely envelop the heart, in a form-fitting manner, and possess inherent elasticity, providing a mechanically stable biotic/abiotic interface during normal cardiac cycles. Component examples range from actuators for electrical, thermal and optical stimulation, to sensors for pH, temperature and mechanical strain. The semiconductor materials include silicon, gallium arsenide and gallium nitride, co-integrated with metals, metal oxides and polymers, to provide these and other operational capabilities. Ex vivo physiological experiments demonstrate various functions and methodological possibilities for cardiac research and therapy. PMID:24569383

  19. Using timing of ice retreat to predict timing of fall freeze-up in the Arctic

    NASA Astrophysics Data System (ADS)

    Stroeve, Julienne C.; Crawford, Alex D.; Stammerjohn, Sharon

    2016-06-01

    Reliable forecasts of the timing of sea ice advance are needed in order to reduce risks associated with operating in the Arctic, as well as for planning for human and environmental emergencies. This study investigates the use of a simple statistical model relating the timing of ice retreat to the timing of ice advance, taking advantage of the inherent predictive power supplied by the seasonal ice-albedo feedback and ocean heat uptake. Results show that using the last retreat date to predict the first advance date is applicable in some regions, such as Baffin Bay and the Laptev and East Siberian seas, where predictive skill is found even after accounting for the long-term trend in both variables. Elsewhere in the Arctic, there is some predictive skill depending on the year (e.g., the Kara and Beaufort seas), but none in regions such as the Barents and Bering seas or the Sea of Okhotsk. While there is some suggestion that the relationship is strengthening over time, this may reflect that higher correlations are expected during periods when the underlying trend is strong.
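
    A minimal version of the statistical model described above can be written in a few lines of Python: both day-of-year series are detrended and the advance-date anomaly is regressed on the retreat-date anomaly. The dates below are synthetic, generated only to show the mechanics, not observed sea ice data.

      import numpy as np

      rng = np.random.default_rng(11)
      years = np.arange(1980, 2016)

      # Synthetic day-of-year series: retreat trends later over time, and the
      # advance date is partly coupled to the retreat anomaly.
      retreat = 150 + 0.8 * (years - 1980) + rng.normal(0, 6, years.size)
      advance = (300 + 0.5 * (years - 1980)
                 + 0.4 * (retreat - retreat.mean())
                 + rng.normal(0, 5, years.size))

      def detrend(y):
          """Remove the least-squares linear trend against year."""
          slope, intercept = np.polyfit(years, y, 1)
          return y - (slope * years + intercept)

      r_anom, a_anom = detrend(retreat), detrend(advance)
      corr = np.corrcoef(r_anom, a_anom)[0, 1]
      slope = np.polyfit(r_anom, a_anom, 1)[0]
      print(f"detrended correlation: {corr:.2f}, slope: {slope:.2f} days per day")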

  20. Self-actuated shutdown system for a commercial size LMFBR. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dupen, C.F.G.

    1978-08-01

    A Self-Actuated Shutdown System (SASS) is defined as a reactor shutdown system in which sensors, release mechanisms, and neutron absorbers are contained entirely within the reactor core structure, where they respond inherently to abnormal local process conditions by shutting down the reactor, independently of the plant protection system (PPS). It is argued that a SASS, having a response time similar to that of the PPS, would so reduce the already very low probability of a failure-to-scram event that costly design features, derived from core disruptive accident analysis, could be eliminated. However, the thrust of the report is the feasibility and reliability of the in-core SASS hardware to achieve sufficiently rapid shutdown. A number of transient overpower and transient undercooling-responsive systems were investigated, leading to the selection of a primary candidate and a backup concept. During a transient undercooling event, the recommended device is triggered by the associated rate of change of pressure, whereas the alternate concept responds to the reduction in core pressure drop and requires calibration and adjustment by the operators to accommodate changes in reactor power.

  1. Deflection monitoring for a box girder based on a modified conjugate beam method

    NASA Astrophysics Data System (ADS)

    Chen, Shi-Zhi; Wu, Gang; Xing, Tuo

    2017-08-01

    After several years of operation, a box girder bridge commonly experiences excessive deflection, which endangers the bridge's life span as well as the safety of vehicles travelling on it. In order to avoid potential risks, it is essential to constantly monitor the deflection of box girders. However, currently, direct deflection monitoring methods are limited by the complicated environments beneath the bridges, such as rivers or other traffic lanes, which severely impede the layout of the sensors. The other, indirect deflection monitoring methods mostly do not thoroughly consider the inherent shear lag effect and shear deformation in the box girder, resulting in a rather large error. Under these circumstances, a deflection monitoring method suited to box girders is proposed in this article, based on the conjugate beam method and distributed long-gauge fibre Bragg grating (FBG) sensors. A lab experiment was conducted to verify the reliability and feasibility of this method under practical application. Further, the serviceability under different span-depth ratios and web thicknesses was examined through a finite element model.
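
    As a simplified illustration of the underlying idea (ignoring the shear-lag and shear-deformation corrections that the article adds), the Python sketch below converts a piecewise curvature distribution, such as one derived from long-gauge FBG strain readings, into a deflection profile for a simply supported span by double integration with zero end deflections; the span length, segment count, and curvature values are invented.

      import numpy as np

      # Long-gauge sensors give an approximately piecewise-constant curvature
      # kappa(x). Loading the conjugate beam with kappa(x) and taking its bending
      # moment is equivalent to integrating curvature twice with w(0) = w(L) = 0.
      L = 30.0                                   # span length, m
      n = 10                                     # number of long-gauge segments
      x_mid = (np.arange(n) + 0.5) * L / n       # segment mid-points
      kappa = 4e-4 * np.sin(np.pi * x_mid / L)   # curvature per segment, 1/m

      x = np.linspace(0.0, L, 301)
      kappa_x = np.interp(x, x_mid, kappa)

      def cumtrapz(y, x):
          """Cumulative trapezoidal integral starting at zero."""
          inc = 0.5 * (y[1:] + y[:-1]) * np.diff(x)
          return np.concatenate(([0.0], np.cumsum(inc)))

      theta = cumtrapz(kappa_x, x)               # slope (up to a constant)
      w = cumtrapz(theta, x)                     # deflection (up to a line)
      w -= w[0] + (w[-1] - w[0]) * x / L         # enforce w(0) = w(L) = 0

      print("midspan deflection magnitude: %.1f mm" % abs(1e3 * w[len(w) // 2]))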

  2. Rhesus macaques recognize unique multi-modal face-voice relations of familiar individuals and not of unfamiliar ones

    PubMed Central

    Habbershon, Holly M.; Ahmed, Sarah Z.; Cohen, Yale E.

    2013-01-01

    Communication signals in non-human primates are inherently multi-modal. However, for laboratory-housed monkeys, there is relatively little evidence in support of the use of multi-modal communication signals in individual recognition. Here, we used a preferential-looking paradigm to test whether laboratory-housed rhesus could “spontaneously” (i.e., in the absence of operant training) use multi-modal communication stimuli to discriminate between known conspecifics. The multi-modal stimulus was a silent movie of two monkeys vocalizing and an audio file of the vocalization from one of the monkeys in the movie. We found that the gaze patterns of those monkeys that knew the individuals in the movie were reliably biased toward the individual that did not produce the vocalization. In contrast, there was not a systematic gaze pattern for those monkeys that did not know the individuals in the movie. These data are consistent with the hypothesis that laboratory-housed rhesus can recognize and distinguish between conspecifics based on auditory and visual communication signals. PMID:23774779

  3. Apparatus for improving performance of electrical insulating structures

    DOEpatents

    Wilson, Michael J.; Goerz, David A.

    2004-08-31

    An apparatus for removing the electrical field from the internal volume of high-voltage structures, e.g., bushings, connectors, capacitors, and cables. The electrical field is removed from inherently weak regions of the interconnect, such as between the center conductor and the solid dielectric, and placed in the primary insulation. This is accomplished by providing a conductive surface on the inside surface of the principal solid dielectric insulator surrounding the center conductor and connecting the center conductor to this conductive surface. Moving the electric field from the weaker dielectric region to a stronger one improves reliability, increases component life and operating levels, reduces noise and losses, and allows for a smaller, more compact design. This electric field control approach is currently possible on many existing products at a modest cost. Several techniques are available to provide the level of electric field control needed. Choosing the optimum technique depends on material, size, and surface accessibility. The simplest deposition method uses a standard electroless plating technique, but other metalization techniques include vapor and energetic deposition, plasma spraying, conductive painting, and other controlled coating methods.

  4. Apparatus for improving performance of electrical insulating structures

    DOEpatents

    Wilson, Michael J.; Goerz, David A.

    2002-01-01

    An apparatus for removing the electrical field from the internal volume of high-voltage structures, e.g., bushings, connectors, capacitors, and cables. The electrical field is removed from inherently weak regions of the interconnect, such as between the center conductor and the solid dielectric, and placed in the primary insulation. This is accomplished by providing a conductive surface on the inside surface of the principal solid dielectric insulator surrounding the center conductor and connecting the center conductor to this conductive surface. Moving the electric field from the weaker dielectric region to a stronger one improves reliability, increases component life and operating levels, reduces noise and losses, and allows for a smaller, more compact design. This electric field control approach is currently possible on many existing products at a modest cost. Several techniques are available to provide the level of electric field control needed. Choosing the optimum technique depends on material, size, and surface accessibility. The simplest deposition method uses a standard electroless plating technique, but other metalization techniques include vapor and energetic deposition, plasma spraying, conductive painting, and other controlled coating methods.

  5. Method for improving performance of highly stressed electrical insulating structures

    DOEpatents

    Wilson, Michael J.; Goerz, David A.

    2002-01-01

    A method for removing the electrical field from the internal volume of high-voltage structures, e.g., bushings, connectors, capacitors, and cables. The electrical field is removed from inherently weak regions of the interconnect, such as between the center conductor and the solid dielectric, and placed in the primary insulation. This is accomplished by providing a conductive surface on the inside surface of the principal solid dielectric insulator surrounding the center conductor and connecting the center conductor to this conductive surface. Moving the electric field from the weaker dielectric region to a stronger one improves reliability, increases component life and operating levels, reduces noise and losses, and allows for a smaller, more compact design. This electric field control approach is currently possible on many existing products at a modest cost. Several techniques are available to provide the level of electric field control needed. Choosing the optimum technique depends on material, size, and surface accessibility. The simplest deposition method uses a standard electroless plating technique, but other metalization techniques include vapor and energetic deposition, plasma spraying, conductive painting, and other controlled coating methods.

  6. PEGASUS - A Flexible Launch Solution for Small Satellites with Unique Requirements

    NASA Astrophysics Data System (ADS)

    Richards, B. R.; Ferguson, M.; Fenn, P. D.

    The financial advantages inherent in building small satellites are negligible if an equally low cost launch service is not available to deliver them to the orbit they require. The weight range of small satellites puts them within the capability of virtually all launch vehicles. Initially, this would appear to help drive down costs through competition since, by one estimate, there are roughly 75 active space launch vehicles around the world that either have an established flight record or are planning to make an inaugural launch within the year. When reliability, budget constraints, and other issues such as inclination access are factored in, this list of available launch vehicles is often times reduced to a very limited few, if any at all. This is especially true for small satellites with unusual or low inclination launch requirements where the cost of launching on the heavy-lift launchers that have the capacity to execute the necessary plane changes or meet the mission requirements can be prohibitive. For any small satellite, reducing launch costs by flying as a secondary or even tertiary payload is only advantageous in the event that a primary payload can be found that either requires or is passing through the same final orbit and has a launch date that is compatible. If the satellite is able to find a ride on a larger vehicle that is only passing through the correct orbit, the budget and technical capability must exist to incorporate a propulsive system on the satellite to modify the orbit to that required for the mission. For these customers a launch vehicle such as Pegasus provides a viable alternative due to its proven flight record, relatively low cost, self-contained launch infrastructure, and mobility. Pegasus supplements the existing world-wide launch capability by providing additional services to a targeted niche of payloads that benefit greatly from Pegasus' mobility and flexibility. Pegasus can provide standard services to satellites that do not require the benefits inherent in a mobile platform. In this regard Pegasus is no different from a ground-launched vehicle in that it repeatedly launches from a fixed location at each range, albeit a location that is not on land. However, Pegasus can also offer services that avoid many of the restrictions inherent in being constrained to a particular launch site, few of which are trivial. They include inclination restrictions, large plane changes required to achieve low inclination orbits from high latitude launch sites, politically inopportune launch locations, and low frequency launch opportunities for missions that require phasing. Pegasus has repeatedly demonstrated this flexibility through the course of 31 flights, including 17 consecutive successes dating back to 1996, originating from seven different locations around the world including two outside the United States. Recently, Pegasus launched NASA's HETE-2 satellite in an operation that included satellite integration and vehicle mate in California, pre-launch staging operations from Kwajalein Island in the South Pacific, and launch operations controlled from over 7000 miles away in Florida. Pegasus has also used the Canary Islands as a launch point with the associated control room in Spain, and Florida as a launch point for a mission controlled from Virginia.
This paper discusses the operational uniqueness of the Pegasus launch vehicle and the activities associated with establishing low-cost, flexible-inclination, low-risk launch operations that utilize Pegasus' greatest asset: its mobility.

  7. A comprehensive multi-scenario based approach for a reliable flood-hazard assessment: a case-study application

    NASA Astrophysics Data System (ADS)

    Lanni, Cristiano; Mazzorana, Bruno; Volcan, Claudio; Bertagnolli, Rudi

    2015-04-01

    Flood hazard is generally assessed by assuming the return period of the rainfall as a proxy for the return period of the discharge and the related hydrograph. Frequently this deterministic view is extended also to the straightforward application of hydrodynamic models. However, the climate (i.e. precipitation), the catchment (i.e. geology, soil and antecedent soil-moisture condition) and the anthropogenic (i.e. drainage system and its regulation) systems interact in a complex way, and the occurrence probability of a flood inundation event can differ significantly from the occurrence probability of the triggering event (i.e. rainfall). In order to reliably determine the spatial patterns of flood intensities and probabilities, the rigorous determination of flood event scenarios is beneficial because it provides a clear, rational method to recognize and unveil the inherent stochastic behavior of natural processes. Therefore, a multi-scenario approach for hazard assessment should be applied which considers the possible events taking place in the area potentially subject to flooding (i.e. floodplains). Here, we apply a multi-scenario approach for the assessment of the flood hazard around the Idro lake (Italy). We consider and estimate the probability of occurrence of several scenarios related to the initial (i.e. initial water level in the lake) and boundary (i.e. shape of the hydrograph, downslope drainage, spillway opening operations) conditions characterizing the lake. Finally, we discuss the advantages and issues of the presented methodological procedure compared to traditional (and essentially deterministic) approaches.

  8. Posttest analysis of the FFTF inherent safety tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Padilla, A. Jr.; Claybrook, S.W.

    Inherent safety tests were performed during 1986 in the 400-MW (thermal) Fast Flux Test Facility (FFTF) reactor to demonstrate the effectiveness of an inherent shutdown device called the gas expansion module (GEM). The GEM device provided a strong negative reactivity feedback during loss-of-flow conditions by increasing the neutron leakage as a result of an expanding gas bubble. The best-estimate pretest calculations for these tests were performed using the IANUS plant analysis code (Westinghouse Electric Corporation proprietary code) and the MELT/SIEX3 core analysis code. These two codes were also used to perform the required operational safety analyses for the FFTF reactor and plant. Although it was intended to also use the SASSYS systems (core and plant) analysis code, the calibration of the SASSYS code for FFTF core and plant analysis was not completed in time to perform pretest analyses. The purpose of this paper is to present the results of the posttest analysis of the 1986 FFTF inherent safety tests using the SASSYS code.

  9. Defense AT and L. Volume 38, Number 4

    DTIC Science & Technology

    2009-06-01

    accuracy at extended ranges. Today, Afghanistan- and Iraq-bound medics get realistic training on a Florida-based company’s Mini-Combat Trauma Patient... school basketball team and drone on about how we miss 100 percent of the shots we don’t take. Fine. They may be right; failure might be good for us... be developed (or procured) that exhibits high inherent reliability and maintainability plus advanced self-diagnostics. Do the ICD and Gate 1

  10. Development of Theoretical and Computational Methods for Single-Source Bathymetric Data

    DTIC Science & Technology

    2016-09-15

    ...A method is outlined for fusing the information inherent in such source documents, at different scales, into a single picture for the marine... algorithm reliability, which reflects the degree of inconsistency of the source documents, is also provided. A conceptual outline of the method, and a

  11. Does the Modified Gartland Classification Clarify Decision Making?

    PubMed

    Leung, Sophia; Paryavi, Ebrahim; Herman, Martin J; Sponseller, Paul D; Abzug, Joshua M

    2018-01-01

    The modified Gartland classification system for pediatric supracondylar fractures is often utilized as a communication tool to aid in determining whether or not a fracture warrants operative intervention. This study sought to determine the interobserver and intraobserver reliability of the Gartland classification system, as well as to determine whether there was agreement that a fracture warranted operative intervention regardless of the classification system. A total of 200 anteroposterior and lateral radiographs of pediatric supracondylar humerus fractures were retrospectively reviewed by 3 fellowship-trained pediatric orthopaedic surgeons and 2 orthopaedic residents and then classified as type I, IIa, IIb, or III. The surgeons then recorded whether they would treat the fracture nonoperatively or operatively. The κ coefficients were calculated to determine interobserver and intraobserver reliability. Overall, the Wilkins-modified Gartland classification has low-moderate interobserver reliability (κ=0.475) and high intraobserver reliability (κ=0.777). A low interobserver reliability was found when differentiating between type IIa and IIb (κ=0.240) among attendings. There was moderate-high interobserver reliability for the decision to operate (κ=0.691) and high intraobserver reliability (κ=0.760). Decreased interobserver reliability was present for decision to operate among residents. For fractures classified as type I, the decision to operate was made 3% of the time and 27% for type IIa. The decision was made to operate 99% of the time for type IIb and 100% for type III. There is almost full agreement for the nonoperative treatment of type I fractures and operative treatment for type III fractures. There is agreement that type IIb fractures should be treated operatively and that the majority of type IIa fractures should be treated nonoperatively. However, the interobserver reliability for differentiating between type IIa and IIb fractures is low. Our results validate the Gartland classification system as a method to help direct treatment of pediatric supracondylar humerus fractures, although the modification of the system, IIa versus IIb, seems to have limited reliability and utility. Terminology based on decision to treat may lead to a more clinically useful classification system in the evaluation and treatment of pediatric supracondylar humerus fractures. Level III-diagnostic studies.
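
    For readers unfamiliar with the agreement statistic reported here, the following Python sketch computes Cohen's kappa for two hypothetical raters classifying a dozen radiographs into the four Gartland categories; the ratings are invented and do not reproduce the study data.

      from collections import Counter

      # Invented ratings from two raters over the same 12 radiographs.
      rater1 = ["I", "IIa", "IIb", "III", "IIa", "IIb",
                "I", "III", "IIa", "IIb", "III", "I"]
      rater2 = ["I", "IIb", "IIb", "III", "IIa", "IIa",
                "I", "III", "IIb", "IIb", "III", "I"]

      n = len(rater1)
      observed = sum(a == b for a, b in zip(rater1, rater2)) / n

      # Chance agreement from the marginal frequencies of each rater.
      c1, c2 = Counter(rater1), Counter(rater2)
      expected = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n**2

      kappa = (observed - expected) / (1 - expected)
      print(f"observed={observed:.2f}, chance={expected:.2f}, kappa={kappa:.2f}")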

  12. CARES/LIFE Ceramics Analysis and Reliability Evaluation of Structures Life Prediction Program

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.

    2003-01-01

    This manual describes the Ceramics Analysis and Reliability Evaluation of Structures Life Prediction (CARES/LIFE) computer program. The program calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. CARES/LIFE is an extension of the CARES (Ceramic Analysis and Reliability Evaluation of Structures) computer program. The program uses results from MSC/NASTRAN, ABAQUS, and ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker law. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled by using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. The probabilistic time-dependent theories used in CARES/LIFE, along with the input and output for CARES/LIFE, are described. Example problems to demonstrate various features of the program are also included.
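
    As a schematic of the two-parameter Weibull, weakest-link reliability calculation that underlies programs of this type (greatly simplified relative to CARES/LIFE, which also handles multiaxial stresses, surface versus volume flaws, and subcritical crack growth), the Python sketch below multiplies element survival probabilities for a single stress measure; the stresses, volumes, and Weibull parameters are hypothetical.

      import math

      # Hypothetical Weibull parameters for the ceramic.
      m = 10.0              # Weibull modulus (shape)
      sigma_0 = 300.0       # characteristic strength per unit volume, MPa

      # (Maximum principal stress in MPa, element volume in mm^3) for a few
      # finite elements of an imagined component.
      elements = [(180.0, 2.0), (210.0, 1.5), (240.0, 1.0), (150.0, 3.0)]

      # Weakest-link model: ln R = -sum_e V_e * (sigma_e / sigma_0)^m
      log_R = 0.0
      for stress, volume in elements:
          log_R -= volume * (stress / sigma_0) ** m

      reliability = math.exp(log_R)
      print(f"fast-fracture reliability estimate: {reliability:.4f}")
      print(f"probability of failure: {1 - reliability:.4f}")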

  13. 49 CFR 800.4 - Operation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for the Board that are inherent in the staff's position in the organizational structure or that the... which govern the activities of employees and organizational components of the Board. The internal...

  14. Investigation of the impact of main control room digitalization on operators cognitive reliability in nuclear power plants.

    PubMed

    Zhou, Yong; Mu, Haiying; Jiang, Jianjun; Zhang, Li

    2012-01-01

    Currently, there is a trend in nuclear power plants (NPPs) toward introducing digital and computer technologies into main control rooms (MCRs). Safe generation of electric power in NPPs requires reliable performance of cognitive tasks such as fault detection, diagnosis, and response planning. The digitalization of MCRs has dramatically changed the whole operating environment and the ways operators interact with the plant systems. If the design and implementation of the digital technology is incompatible with operators' cognitive characteristics, it may have negative effects on operators' cognitive reliability. Firstly, on the basis of three essential prerequisites for successful cognitive tasks, a causal model is constructed to reveal the typical human performance issues arising from digitalization. The cognitive mechanisms by which they impact cognitive reliability are analyzed in detail. Then, Bayesian inference is used to quantify and prioritize the influences of these factors. The results suggest that interface management and unbalanced workload distribution have more significant impacts on operators' cognitive reliability.

  15. System reliability, performance and trust in adaptable automation.

    PubMed

    Chavaillaz, Alain; Wastell, David; Sauer, Jürgen

    2016-01-01

    The present study examined the effects of reduced system reliability on operator performance and automation management in an adaptable automation environment. 39 operators were randomly assigned to one of three experimental groups: low (60%), medium (80%), and high (100%) reliability of automation support. The support system provided five incremental levels of automation which operators could freely select according to their needs. After 3 h of training on a simulated process control task (AutoCAMS) in which the automation worked infallibly, operator performance and automation management were measured during a 2.5-h testing session. Trust and workload were also assessed through questionnaires. Results showed that although reduced system reliability resulted in lower levels of trust towards automation, there were no corresponding differences in the operators' reliance on automation. While operators showed overall a noteworthy ability to cope with automation failure, there were, however, decrements in diagnostic speed and prospective memory with lower reliability. Copyright © 2015. Published by Elsevier Ltd.

  16. Operation of SOI P-Channel Field Effect Transistors, CHT-PMOS30, under Extreme Temperatures

    NASA Technical Reports Server (NTRS)

    Patterson, Richard; Hammoud, Ahmad

    2009-01-01

    Electronic systems are required to operate under extreme temperatures in NASA planetary exploration and deep space missions. Electronics on-board spacecraft must also tolerate thermal cycling between extreme temperatures. Thermal management means are usually included in today's spacecraft systems to provide adequate temperatures for proper operation of the electronics. These measures, which may include heating elements, heat pipes, radiators, etc., add to the complexity of the system design, increase its cost and weight, and affect its performance and reliability. Electronic parts and circuits capable of withstanding and operating under extreme temperatures would therefore improve system efficiency, reduce cost, and improve overall reliability. Semiconductor chips based on silicon-on-insulator (SOI) technology are designed mainly for high temperature applications and find extensive use in terrestrial well-logging fields. Their inherent design offers advantages over silicon devices in terms of reduced leakage currents, lower power consumption, faster switching speeds, and good radiation tolerance. Little is known, however, about their performance at cryogenic temperatures and under wide thermal swings. An experimental investigation of the operation of SOI N-channel field effect transistors over a wide temperature range was reported earlier [1]. This work examines the performance of P-channel devices of these SOI transistors. The electronic part investigated in this work was a Cissoid CHT-PMOS30 high-temperature P-channel MOSFET (metal-oxide semiconductor field-effect transistor) [2]. This high-voltage, medium-power transistor is designed for geothermal well-logging applications, aerospace and avionics, and the automotive industry, and is specified for operation in the temperature range of -55 C to +225 C. Table I shows some specifications of this transistor [2]. The CHT-PMOS30 device was characterized at various temperatures over the range of -190 C to +225 C in terms of its voltage/current characteristic curves. The test temperatures included +22, -50, -100, -150, -175, -190, +50, +100, +150, +175, +200, and +225 C. Limited thermal cycling testing was also performed on the device. These tests consisted of subjecting the transistor to a total of twelve thermal cycles between -190 C and +225 C. A temperature rate of change of 10 C/min and a soak time at the test temperature of 10 minutes were used throughout this work. Post-cycling measurements were also performed at selected temperatures. In addition, re-start capability at extreme temperatures, i.e. power switched on while the device was soaking for a period of 20 minutes at the test temperatures of -190 C and +225 C, was investigated.

  17. Modeling and Simulation Reliable Spacecraft On-Board Computing

    NASA Technical Reports Server (NTRS)

    Park, Nohpill

    1999-01-01

    The proposed project will investigate modeling and simulation-driven testing and fault tolerance schemes for spacecraft on-board computing, thereby achieving reliable spacecraft telecommunication. A spacecraft communication system has inherent capabilities of providing multipoint and broadcast transmission, connectivity between any two distant nodes within a wide-area coverage, quick network configuration/reconfiguration, rapid allocation of space segment capacity, and distance-insensitive cost. To realize these capabilities, both the size and cost of the ground-station terminals have to be reduced by using a reliable, high-throughput, fast, and cost-effective on-board computing system, which is known to be a critical contributor to the overall performance of space mission deployment. Controlled vulnerability of mission data (measured in sensitivity), improved performance (measured in throughput and delay), and fault tolerance (measured in reliability) are some of the most important features of these systems. The system should be thoroughly tested and diagnosed before fault tolerance is employed in the system. Testing and fault tolerance strategies should be driven by accurate performance models (i.e., throughput, delay, reliability, and sensitivity) to find an optimal solution in terms of reliability and cost. The modeling and simulation tools will be integrated with a system architecture module, a testing module, and a fault tolerance module, all of which interact through a central graphical user interface.

  18. The Linac Coherent Light Source

    DOE PAGES

    White, William E.; Robert, Aymeric; Dunne, Mike

    2015-05-01

    The Linac Coherent Light Source (LCLS) at the SLAC National Accelerator Laboratory was the first hard X-ray free-electron laser (FEL) to operate as a user facility. After five years of operation, LCLS is now a mature FEL user facility. Our personal views about opportunities and challenges inherent to these unique light sources are discussed.

  19. 78 FR 42538 - Information Collection Activities: Sulphur Operations, Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-16

    ..., section 301(a) of the Federal Oil and Gas Royalty Management Act (FOGRMA), 30 U.S.C. 1751(a), grants... requirements. The BSEE uses the information collected to ascertain the condition of drilling sites for the purpose of preventing hazards inherent in sulphur drilling and production operations and to evaluate the...

  20. Washington State Community College Operating Budget, 1985-87. Management Summary.

    ERIC Educational Resources Information Center

    Washington State Board for Community Coll. Education, Olympia.

    A summary is presented of the 1985-87 community college operating budget request for the Washington State Community colleges, along with a description of the policy considerations inherent in the request and the anticipated effect of the request on community college programs. The philosophy and objectives underpinning the budget request are…

  1. 16 CFR 1211.7 - Inherent entrapment protection requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... installation and at various heights under the edge of the door and located in line with the driving point of... installation, the bottom edge of the door under the driving force of the operator is to be against the floor... that represents the most severe operating condition. Any accessories having an effect on the intended...

  2. Direct electrical arc ignition of hybrid rocket motors

    NASA Astrophysics Data System (ADS)

    Judson, Michael I., Jr.

    Hybrid rocket motors provide distinct safety advantages when compared to traditional liquid or solid propellant systems, due to the inherent stability and relative inertness of the propellants prior to established combustion. As a result of this inherent propellant stability, hybrid motors have historically proven difficult to ignite. State-of-the-art hybrid igniter designs continue to require solid or liquid reactants distinct from the main propellants. These ignition methods, however, reintroduce to the hybrid propulsion system the safety and complexity disadvantages associated with traditional liquid or solid propellants. The results of this study demonstrate the feasibility of a novel direct electrostatic arc ignition method for hybrid motors. A series of small prototype stand-alone thrusters demonstrating this technology were successfully designed and tested using acrylonitrile butadiene styrene (ABS) plastic and gaseous oxygen (GOX) as propellants. Measurements of input voltage and current demonstrated that arc ignition will occur using as little as 10 watts peak power and less than 5 joules total energy. The motor developed for the stand-alone small thruster was adapted as a gas generator to ignite a medium-scale hybrid rocket motor using nitrous oxide and HTPB as propellants. Multiple consecutive ignitions were performed. A large data set and a collection of development 'lessons learned' were compiled to guide future development and research. Since the completion of this original groundwork research, the concept has been developed into a reliable, operational igniter system for a 75 mm hybrid motor using both gaseous oxygen and liquid nitrous oxide as oxidizers. A development map of the direct spark ignition concept is presented showing the flow of key lessons learned between this original work and later follow-on development.

  3. Mitigating the Shortage of Special Operations Aviation By an Unconventional Approach

    DTIC Science & Technology

    2017-12-01

    Second World War, and the majority of air power theorists suggested that when technology finally caught up with the inherent ability of aviation, air...assessment of an American expert [Richard D. Newton, Joint Special Operations University] in air special operations at the Air Force’s annual Air Power ...scope and time in order to “seize, destroy, disrupt, capture, exploit, recover, or damage high value or high pay-off targets.”48 When these operations

  4. Hybrid time-variant reliability estimation for active control structures under aleatory and epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Xiong, Chuang; Wang, Xiaojun; Li, Yunlong; Xu, Menghui

    2018-04-01

    Considering that multi-source uncertainties arising from the inherent nature of the system as well as from the external environment are unavoidable and severely affect controller performance, dynamic safety assessment with high confidence is of great significance for scientists and engineers. In view of this, uncertainty quantification analysis and time-variant reliability estimation for closed-loop control problems are conducted in this study under a mixture of random, interval, and convex uncertainties. By combining the state-space transformation and the natural set expansion, the bounds of the controlled response histories are first established, with specific treatment of the random terms. For nonlinear cases, the collocation set methodology and the fourth-order Runge-Kutta algorithm are introduced as well. Inspired by the first-passage model in random process theory and by static probabilistic reliability concepts, a new definition of the hybrid time-variant reliability measurement is provided for vibration control systems, and the related solution details are further expounded. Two engineering examples are eventually presented to demonstrate the validity and applicability of the methodology developed.
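
    The hybrid reliability measure defined in the paper is not reproduced here, but the underlying first-passage idea, reliability as the probability that the controlled response never exceeds a safety threshold during the operating time, can be sketched with a plain Monte Carlo estimate for a forced single-degree-of-freedom system with uncertain parameters. The model, parameter distributions, and threshold below are hypothetical, and the mixed interval/convex formulation is omitted.

```python
# Conceptual Monte Carlo sketch of a time-variant (first-passage) reliability
# estimate: the probability that the displacement response never exceeds a
# safety threshold during the operating time. All parameter values and the
# threshold are hypothetical placeholders.
import math
import random

def peak_response(omega_n, zeta, f0=1.0, omega_f=6.0, dt=0.005, t_end=10.0):
    """Semi-implicit Euler integration of a forced SDOF oscillator; returns max |x|."""
    x, v, t, peak = 0.0, 0.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        a = f0 * math.sin(omega_f * t) - 2.0 * zeta * omega_n * v - omega_n**2 * x
        v += a * dt
        x += v * dt
        t += dt
        peak = max(peak, abs(x))
    return peak

def first_passage_reliability(threshold=0.35, samples=1000, seed=1):
    random.seed(seed)
    ok = 0
    for _ in range(samples):
        omega_n = random.gauss(6.0, 0.4)    # uncertain natural frequency
        zeta = random.uniform(0.02, 0.08)   # uncertain damping ratio
        if peak_response(omega_n, zeta) < threshold:
            ok += 1
    return ok / samples

print(f"estimated first-passage reliability: {first_passage_reliability():.3f}")
```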

  5. Laser-plasma-based Space Radiation Reproduction in the Laboratory

    PubMed Central

    Hidding, B.; Karger, O.; Königstein, T.; Pretzler, G.; Manahan, G. G.; McKenna, P.; Gray, R.; Wilson, R.; Wiggins, S. M.; Welsh, G. H.; Beaton, A.; Delinikolas, P.; Jaroszynski, D. A.; Rosenzweig, J. B.; Karmakar, A.; Ferlet-Cavrois, V.; Costantino, A.; Muschitiello, M.; Daly, E.

    2017-01-01

    Space radiation is a great danger to electronics and astronauts onboard space vessels. The spectral flux of space electrons, protons and ions for example in the radiation belts is inherently broadband, but this is a feature hard to mimic with conventional radiation sources. Using laser-plasma-accelerators, we reproduced relativistic, broadband radiation belt flux in the laboratory, and used this man-made space radiation to test the radiation hardness of space electronics. Such close mimicking of space radiation in the lab builds on the inherent ability of laser-plasma-accelerators to directly produce broadband Maxwellian-type particle flux, akin to conditions in space. In combination with the established sources, utilisation of the growing number of ever more potent laser-plasma-accelerator facilities worldwide as complementary space radiation sources can help alleviate the shortage of available beamtime and may allow for development of advanced test procedures, paving the way towards higher reliability of space missions. PMID:28176862

  6. Perceptions and culture of safety among helicopter emergency medical service personnel in the UK.

    PubMed

    Chesters, Adam; Grieve, Philip H; Hodgetts, Timothy J

    2016-11-01

    The use of helicopter emergency medical services (HEMS) has increased significantly in the UK since 1987. To date there has been no research that addresses HEMS pilots' and medical crews' own ideas on the risks that they view as inherent in their line of work and how to mitigate those risks. The aim of this survey is to describe and compare the attitudes and perceptions of these staff towards risk in HEMS operations. A questionnaire was administered electronically to a representative selection of HEMS doctors, paramedics and pilots in the UK. Questions were grouped into common themes and presented as Likert scales and rankings where appropriate. Descriptive and comparative results were presented and statistically analysed. The target sample of 100 consecutive respondents was achieved. All questionnaires were completed in full. Respondents attributed the most risk to night HEMS operations without the use of night vision goggles, commercial pressure, and mechanical aircraft failure. There was no statistical difference in overall perception of safety by years of experience (p=0.58) or between professions (p=0.08). Those who had experienced a crash were more likely to believe that HEMS operations are not inherently safe (p=0.05). We have surveyed a cross-section of the HEMS operational community in the UK in order to describe their perceptions of safety and risk within their professional life. Two-thirds of respondents believed that HEMS operations were inherently safe. Those who did not seemed to be influenced by personal experience of a crash or serious incident. We support increased operational training for clinical crewmembers, an increased emphasis on incident reporting and a culture of safety, and careful attention to minimum training and equipment requirements for all HEMS missions. Published by the BMJ Publishing Group Limited.

  7. Thermodynamic Performance and Cost Optimization of a Novel Hybrid Thermal-Compressed Air Energy Storage System Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houssainy, Sammy; Janbozorgi, Mohammad; Kavehpour, Pirouz

    Compressed Air Energy Storage (CAES) can potentially allow renewable energy sources to meet electricity demands as reliably as coal-fired power plants. However, conventional CAES systems rely on the combustion of natural gas, require large storage volumes, and operate at high pressures, which bring inherent problems such as high costs, restrictive geological siting requirements, and the production of greenhouse gas emissions. A novel and patented hybrid thermal-compressed air energy storage (HT-CAES) design is presented which allows a portion of the available energy, from the grid or renewable sources, to operate a compressor and the remainder to be converted and stored in the form of heat, through joule heating in a sensible thermal storage medium. The HT-CAES design includes a turbocharger unit that provides supplementary mass flow rate alongside the air storage. The hybrid design and the addition of a turbocharger have the beneficial effect of mitigating the shortcomings of conventional CAES systems and their derivatives by eliminating combustion emissions and reducing storage volumes, operating pressures, and costs. Storage efficiency and cost are the two key factors which, upon integration with renewable energies, would allow the sources to operate as independent forms of sustainable energy. The potential of the HT-CAES design is illustrated through a thermodynamic optimization study, which outlines key variables that have a major impact on the performance and economics of the storage system. The optimization analysis quantifies the required distribution of energy between thermal and compressed air energy storage, for maximum efficiency and for minimum cost. This study provides a roundtrip energy and exergy efficiency map of the storage system and illustrates a trade-off that exists between its capital cost and performance.
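
    The split of input energy between compression and joule-heated thermal storage described above can be made concrete with a toy energy balance; the component efficiencies below are assumed placeholders, not values from the study, and the real optimization trades capital cost as well as efficiency.

```python
# Toy roundtrip-efficiency estimate for a hybrid thermal/compressed-air
# energy store. 'split' is the fraction of input electricity sent to the
# compressor; the remainder is stored as heat via joule heating. All
# component efficiencies are assumed placeholders, not values from the paper.
def roundtrip_efficiency(split, eta_compression=0.80, eta_thermal_store=0.95,
                         eta_heat_to_work=0.35, eta_turbine=0.85):
    e_in = 1.0                                    # 1 unit of electricity in
    e_air = split * e_in * eta_compression        # recoverable compressed-air share
    e_heat = (1.0 - split) * e_in * eta_thermal_store
    # On discharge, only part of the stored heat converts to shaft work,
    # while the compressed-air share is recovered through the turbine.
    e_out = e_air * eta_turbine + e_heat * eta_heat_to_work
    return e_out / e_in

for split in (0.2, 0.4, 0.6, 0.8):
    print(f"split to compressor = {split:.1f} -> roundtrip = {roundtrip_efficiency(split):.2f}")
```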

  8. Analysis of a multi-wavelength multi-camera phase-shifting profilometric system for real-time operation

    NASA Astrophysics Data System (ADS)

    Stoykova, Elena; Gotchev, Atanas; Sainov, Ventseslav

    2011-01-01

    Real-time implementation of phase-shifting profilometry through simultaneous projection and recording of fringe patterns requires a reliable phase retrieval procedure. In the present work we consider a four-wavelength, multi-camera system with four sinusoidal phase gratings for pattern projection that implements a four-step algorithm. Successful operation of the system depends on overcoming two challenges which stem from the inherent limitations of the phase-shifting algorithm, namely the demand for a sinusoidal fringe profile and the necessity to ensure equal background and contrast of the fringes in the recorded fringe patterns. As a first task, we analyze the systematic errors due to the combined influence of the higher harmonics and multi-wavelength illumination in the Fresnel diffraction zone, considering the case when the modulation parameters of the four gratings are different. As a second task, we simulate the system performance to evaluate the degrading effect of speckle noise and spatially varying fringe modulation at non-uniform illumination on the overall accuracy of the profilometric measurement. We consider the case of non-correlated speckle realizations in the recorded fringe patterns due to the four-wavelength illumination. Finally, we apply a phase retrieval procedure which includes normalization, background removal, and denoising of the recorded fringe patterns to both simulated and measured data obtained for a dome surface.
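
    The four-step algorithm referred to above is, in its standard form, a pointwise arctangent of differences of four fringe patterns shifted by 90 degrees; a minimal sketch with synthetic single-wavelength fringes is shown below. It deliberately ignores the higher-harmonic, multi-wavelength, and speckle effects that are the subject of the paper.

```python
# Minimal sketch of the standard four-step phase-shifting algorithm:
# four fringe patterns shifted by pi/2 are combined pixel-wise to recover
# the wrapped phase. Synthetic single-wavelength fringes are used here.
import numpy as np

def wrapped_phase(i1, i2, i3, i4):
    """phi = atan2(I4 - I2, I1 - I3), wrapped to (-pi, pi]."""
    return np.arctan2(i4 - i2, i1 - i3)

# Build synthetic fringe patterns with a known test phase.
x = np.linspace(0.0, 4.0 * np.pi, 512)
true_phase = 0.8 * np.sin(x) + x            # arbitrary smooth test phase
background, contrast = 0.5, 0.4
frames = [background + contrast * np.cos(true_phase + k * np.pi / 2.0)
          for k in range(4)]

phi = wrapped_phase(*frames)
print("max wrapping-aware error:",
      np.max(np.abs(np.angle(np.exp(1j * (phi - true_phase))))))
```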

  9. FACT, Mega-ROSA, SOLAROSA

    NASA Technical Reports Server (NTRS)

    Spence, Brian; White, Steve; Schmid, Kevin; Douglas, Mark

    2012-01-01

    The Flexible Array Concentrator Technology (FACT) is a lightweight, high-performance reflective concentrator blanket assembly that can be used on flexible solar array blankets. The FACT concentrator replaces every other row of solar cells on a solar array blanket, significantly reducing the cost of the array. The modular design is highly scalable for the array system designer, and exhibits compact stowage, good off-pointing acceptance, and mass/cost savings. The assembly's relatively low concentration ratio, accompanied by a large radiative area, provides for a low cell operating temperature and eliminates many of the thermal problems inherent in high-concentration-ratio designs. Unlike other reflector technologies, the FACT concentrator modules function on both z-fold and rolled flexible solar array blankets, as well as rigid array systems. Mega-ROSA (Mega Roll-Out Solar Array) is a new, highly modularized and extremely scalable version of ROSA that provides an immense power level range capability from 100 kW to several MW in size. Mega-ROSA will enable extremely high-power spacecraft and SEP-powered missions, including space-tug and large-scale planetary science and lunar/asteroid exploration missions. Mega-ROSA's inherent broad power scalability is achieved while retaining ROSA's solar array performance metrics and mission-enabling features for lightweight, compact stowage volume and affordability. This innovation will enable future ultra-high-power missions through low cost (25 to 50% cost savings, depending on PV and blanket technology), light weight, high specific power (greater than 200 to 400 Watts per kilogram BOL (beginning-of-life) at the wing level, depending on PV and blanket technology), compact stowage volume (greater than 50 kilowatts per cubic meter for very large arrays), high reliability, platform simplicity (low failure modes), high deployed strength/stiffness when scaled to huge sizes, and high-voltage operation capability. Mega-ROSA is adaptable to all photovoltaic and concentrator flexible blanket technologies, and can readily accommodate standard multijunction and emerging ultra-lightweight IMM (inverted metamorphic) photovoltaic flexible blanket assemblies, as well as ENTECH's Stretched Lens Array (SLA) and DSS's (Deployable Space Systems) FACT, which allows for cost reduction at the array level.

  10. Challenges for operational forecasting and early warning of rainfall induced landslides

    NASA Astrophysics Data System (ADS)

    Guzzetti, Fausto

    2017-04-01

    In many areas of the world, landslides occur every year, claiming lives and producing severe economic and environmental damage. Many of the landslides with human or economic consequences are the result of intense or prolonged rainfall. For this reason, in many areas the timely forecast of rainfall-induced landslides is of both scientific interest and social relevance. In recent years, there has been mounting interest and an increasing demand for operational landslide forecasting, and for associated landslide early warning systems. Despite the relevance of the problem, and the increasing interest and demand, only a few systems have been designed and are currently operated. Inspection of the limited literature on operational landslide forecasting and on the associated early warning systems reveals that common criteria and standards for the design, implementation, operation, and performance evaluation of the systems are lacking. This limits the ability to compare and critically evaluate the systems, to identify their inherent strengths and weaknesses, and to improve their performance. Lack of common criteria and established standards can also limit the credibility of the systems, and consequently their usefulness and potential practical impact. Landslides are very diversified phenomena, and the information and the modelling tools used to attempt landslide forecasting vary largely, depending on the type and size of the landslides, the extent of the geographical area considered, the timeframe of the forecasts, and the scope of the predictions. Consequently, systems for landslide forecasting and early warning can be designed and implemented at several different geographical scales, from the local (site or slope specific) to the regional, or even national scale. The talk focuses on regional to national scale landslide forecasting systems, and specifically on operational systems based on empirical rainfall threshold models. Building on the experience gained in designing, implementing, and operating national and regional landslide forecasting systems in Italy, and on a preliminary review of the existing literature on regional landslide early warning systems, the talk discusses concepts, limitations and challenges inherent to the design of reliable forecasting and early warning systems for rainfall-triggered landslides, the evaluation of the performance of the systems, and problems related to the use of the forecasts and the issuing of landslide warnings. Several of the typical elements of an operational landslide forecasting system are considered, including: (i) the rainfall and landslide information used to establish the threshold models, (ii) the methods and tools used to define the empirical rainfall thresholds and their associated uncertainty, (iii) the quality (e.g., the temporal and spatial resolution) of the rainfall information used for operational forecasting, including rain gauge and radar measurements, satellite estimates, and quantitative weather forecasts, (iv) the ancillary information used to prepare the forecasts, including e.g. the terrain subdivisions and the landslide susceptibility zonations, (v) the criteria used to transform the forecasts into landslide warnings and the methods used to communicate the warnings, and (vi) the criteria and strategies adopted to evaluate the performance of the systems, and to define minimum or optimal performance levels.
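
    Empirical rainfall thresholds of the kind discussed here are commonly expressed as a power law between mean rainfall intensity and duration; the sketch below applies such a threshold to a few events. The coefficients and events are placeholders, not values from any operational Italian system.

```python
# Minimal sketch of an empirical intensity-duration (ID) rainfall threshold
# of the form I = alpha * D**beta, as commonly used in regional landslide
# early warning. The alpha/beta values and the events below are placeholders.
def exceeds_threshold(rain_mm, duration_h, alpha=7.5, beta=-0.60):
    """Return True if the event's mean intensity lies above the ID threshold."""
    mean_intensity = rain_mm / duration_h          # mm/h
    threshold_intensity = alpha * duration_h**beta # mm/h
    return mean_intensity > threshold_intensity

events = [(10.0, 6.0), (55.0, 12.0), (20.0, 48.0)]  # (cumulated mm, duration h)
for rain, dur in events:
    flag = "WARNING" if exceeds_threshold(rain, dur) else "below threshold"
    print(f"{rain:5.1f} mm in {dur:4.1f} h -> {flag}")
```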

  11. Modeling for Battery Prognostics

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.; Goebel, Kai; Khasin, Michael; Hogge, Edward; Quach, Patrick

    2017-01-01

    For any battery-powered vehicle (be it an unmanned aerial vehicle, a small passenger aircraft, or an asset in exoplanetary operations) to operate at maximum efficiency and reliability, it is critical to monitor battery health as well as performance and to predict end of discharge (EOD) and end of useful life (EOL). To fulfil these needs, it is important to capture the battery's inherent characteristics as well as operational knowledge in the form of models that can be used by monitoring, diagnostic, and prognostic algorithms. Several battery modeling methodologies have been developed in the last few years as the understanding of the underlying electrochemical mechanisms has advanced. The models can generally be classified as empirical models, electrochemical engineering models, multi-physics models, and molecular/atomistic models. Empirical models are based on fitting certain functions to past experimental data, without making use of any physicochemical principles; electrical circuit equivalent models are an example of such empirical models. Electrochemical engineering models are typically continuum models that include electrochemical kinetics and transport phenomena. Each model type has its advantages and disadvantages. The former has the advantage of being computationally efficient, but has limited accuracy and robustness due to the approximations used in developing the model and, as a result of those approximations, cannot represent aging well. The latter has the advantage of being very accurate, but is often computationally inefficient, having to solve complex sets of partial differential equations, and is thus not well suited for online prognostic applications. In addition, both multi-physics and atomistic models are computationally expensive and hence even less suited to online application. An electrochemistry-based model of Li-ion batteries has been developed that captures crucial electrochemical processes, captures the effects of aging, is computationally efficient, and is of suitable accuracy for reliable EOD prediction in a variety of operational profiles. The model can be considered an electrochemical engineering model, but unlike most such models found in the literature, certain approximations are made that retain computational efficiency for online implementation of the model. Although the focus here is on Li-ion batteries, the model is quite general and can be applied to different chemistries through a change of model parameter values. Progress on model development is presented, including model validation results and EOD prediction results.
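
    As a point of contrast with the electrochemistry-based model described above, the sketch below shows the simplest empirical alternative mentioned in the abstract: an equivalent-circuit (open-circuit voltage plus internal resistance) model stepped forward by coulomb counting to predict end of discharge under a constant-current load. The OCV curve and all parameter values are illustrative assumptions, not values from the work.

```python
# Sketch of the simplest empirical battery model mentioned above: an
# equivalent circuit with an OCV-vs-SOC curve and a series resistance,
# stepped forward in time to predict end of discharge (EOD) under a
# constant-current load. All parameter values are illustrative.
def ocv(soc):
    """Hypothetical open-circuit voltage curve for one Li-ion cell [V]."""
    return 3.0 + 1.2 * soc - 0.3 * (1.0 - soc) ** 4

def predict_eod(capacity_ah=2.2, current_a=2.0, r_internal=0.05,
                v_cutoff=3.0, dt_s=1.0):
    """Return predicted time to EOD (seconds) for a constant-current discharge."""
    soc, t = 1.0, 0.0
    while soc > 0.0:
        v_terminal = ocv(soc) - current_a * r_internal
        if v_terminal <= v_cutoff:
            return t
        soc -= current_a * dt_s / (capacity_ah * 3600.0)  # coulomb counting
        t += dt_s
    return t

eod_s = predict_eod()
print(f"predicted end of discharge: {eod_s / 60.0:.1f} minutes")
```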

  12. Low Pressure Nuclear Thermal Rocket (LPNTR) concept

    NASA Technical Reports Server (NTRS)

    Ramsthaler, J. H.

    1991-01-01

    A background and a description of the low pressure nuclear thermal system are presented. Performance, mission analysis, development, critical issues, and some conclusions are discussed. The following subject areas are covered: LPNTR's inherent advantages with respect to critical NTR requirements; reactor trade studies; the reference LPNTR; internal configuration and flow of the preliminary LPNTR; particle bed fuel assembly; preliminary LPNTR neutronic study results; multiple LPNTR engine concept; tank and engine configuration for mission analysis; LPNTR reliability potential; LPNTR development program; and LPNTR program costs.

  13. In search of consensus: Terminology for entheseal changes (EC).

    PubMed

    Villotte, Sébastien; Assis, Sandra; Cardoso, Francisca Alves; Henderson, Charlotte Yvette; Mariotti, Valentina; Milella, Marco; Pany-Kucera, Doris; Speith, Nivien; Wilczak, Cynthia A; Jurmain, Robert

    2016-06-01

    This article presents a consensus terminology for entheseal changes that was developed in English by an international team of scholars and then translated into French, Italian, Portuguese, Spanish and German. Use of a standard, neutral terminology to describe entheseal morphology will reduce misunderstandings between researchers, improve the reliability of comparisons between studies, and eliminate unwarranted etiological assumptions inherent in some of the descriptive terms presently used in the literature. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Blending forest fire smoke forecasts with observed data can improve their utility for public health applications

    NASA Astrophysics Data System (ADS)

    Yuchi, Weiran; Yao, Jiayun; McLean, Kathleen E.; Stull, Roland; Pavlovic, Radenko; Davignon, Didier; Moran, Michael D.; Henderson, Sarah B.

    2016-11-01

    Fine particulate matter (PM2.5) generated by forest fires has been associated with a wide range of adverse health outcomes, including exacerbation of respiratory diseases and increased risk of mortality. Due to the unpredictable nature of forest fires, it is challenging for public health authorities to reliably evaluate the magnitude and duration of potential exposures before they occur. Smoke forecasting tools are a promising development from the public health perspective, but their widespread adoption is limited by their inherent uncertainties. Observed measurements from air quality monitoring networks and remote sensing platforms are more reliable, but they are inherently retrospective. It would be ideal to reduce the uncertainty in smoke forecasts by integrating any available observations. This study takes spatially resolved PM2.5 estimates from an empirical model that integrates air quality measurements with satellite data, and averages them with PM2.5 predictions from two smoke forecasting systems. Two different indicators of population respiratory health are then used to evaluate whether the blending improved the utility of the smoke forecasts. Among a total of six models, including two single forecasts and four blended forecasts, the blended estimates always performed better than the forecast values alone. Integrating measured observations into smoke forecasts could improve public health preparedness for smoke events, which are becoming more frequent and intense as the climate changes.

  15. Biomechanical Behavior of Bioprosthetic Heart Valve Heterograft Tissues: Characterization, Simulation, and Performance

    PubMed Central

    Soares, Joao S.; Feaver, Kristen R.; Zhang, Will; Kamensky, David; Aggarwal, Ankush; Sacks, Michael S.

    2017-01-01

    The use of replacement heart valves continues to grow due to the increased prevalence of valvular heart disease resulting from an ageing population. Since bioprosthetic heart valves (BHVs) continue to be the preferred replacement valve, there continues to be a strong need to develop better and more reliable BHVs through an improved general understanding of BHV failure mechanisms. The major technological hurdle for the lifespan of the BHV implant continues to be the durability of the constituent leaflet biomaterials, which if improved can lead to substantial clinical impact. In order to develop improved solutions for BHV biomaterials, it is critical to have a better understanding of the inherent biomechanical behaviors of the leaflet biomaterials, including the effects of chemical treatment technologies, the impact of repetitive mechanical loading, and the inherent failure modes. This review seeks to provide a comprehensive overview of these issues, with a focus on developing insight into the mechanisms of BHV function and failure. Additionally, this review provides a detailed summary of the computational biomechanical simulations that have been used to inform and develop a higher level of understanding of BHV tissues and their failure modes. Collectively, this information should serve as a tool not only to infer reliable and dependable prosthesis function, but also to instigate and facilitate the design of future bioprosthetic valves and to clinically impact cardiology. PMID:27507280

  16. Micro-Inspector Spacecraft for Space Exploration Missions

    NASA Technical Reports Server (NTRS)

    Mueller, Juergen; Alkalai, Leon; Lewis, Carol

    2005-01-01

    NASA is seeking to embark on a new set of human and robotic exploration missions back to the Moon, to Mars, and destinations beyond. Key strategic technical challenges will need to be addressed to realize this new vision for space exploration, including improvements in safety and reliability to improve robustness of space operations. Under sponsorship by NASA's Exploration Systems Mission, the Jet Propulsion Laboratory (JPL), together with its partners in government (NASA Johnson Space Center) and industry (Boeing, Vacco Industries, Ashwin-Ushas Inc.) is developing an ultra-low mass (<3.0 kg) free-flying micro-inspector spacecraft in an effort to enhance safety and reduce risk in future human and exploration missions. The micro-inspector will provide remote vehicle inspections to ensure safety and reliability, or to provide monitoring of in-space assembly. The micro-inspector spacecraft represents an inherently modular system addition that can improve safety and support multiple host vehicles in multiple applications. On human missions, it may help extend the reach of human explorers, decreasing human EVA time to reduce mission cost and risk. The micro-inspector development is the continuation of an effort begun under NASA's Office of Aerospace Technology Enabling Concepts and Technology (ECT) program. The micro-inspector uses miniaturized celestial sensors; relies on a combination of solar power and batteries (allowing for unlimited operation in the sun and up to 4 hours in the shade); utilizes a low-pressure, low-leakage liquid butane propellant system for added safety; and includes multi-functional structure for high system-level integration and miniaturization. Versions of this system to be designed and developed under the H&RT program will include additional capabilities for on-board, vision-based navigation, spacecraft inspection, and collision avoidance, and will be demonstrated in a ground-based, space-related environment. These features make the micro-inspector design unique in its ability to serve crewed as well as robotic spacecraft, well beyond Earth-orbit and into arenas such as robotic missions, where human teleoperation capability is not locally available.

  17. Using triggered operations to offload collective communication operations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barrett, Brian W.; Hemmert, K. Scott; Underwood, Keith Douglas

    2010-04-01

    Efficient collective operations are a major component of application scalability. Offload of collective operations onto the network interface reduces many of the latencies that are inherent in network communications and, consequently, reduces the time to perform the collective operation. To support offload, it is desirable to expose semantic building blocks that are simple to offload and yet powerful enough to implement a variety of collective algorithms. This paper presents the implementation of barrier and broadcast leveraging triggered operations - a semantic building block for collective offload. Triggered operations are shown to be both semantically powerful and capable of improving performance.

  18. 77 FR 25753 - Biweekly Notice; Applications and Amendments to Facility Operating Licenses and Combined Licenses...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-01

    ... provide post accident design basis cooling. Therefore, the proposed change does not involve a significant... operating margin inherent in the design orifices of the RHR suppression pool cooling test return line and... information in comment submissions that you do not want to be publicly disclosed. The NRC posts all comment...

  19. Maslow and Field Experiences in Competency-Based Teacher Education.

    ERIC Educational Resources Information Center

    Warner, Allen R.

    Student teaching is examined in relation to Maslow's theory of human motivation that proposes an inherent human tendency toward self-actualization. It is pointed out that the majority of student teachers operate in fear as they enter their final phase of teacher training, and according to Maslow, they are operating at the safety level, concerned…

  20. 250 Robotic Pancreatic Resections: Safety and Feasibility

    PubMed Central

    Zureikat, Amer H.; Moser, A. James; Boone, Brian A.; Bartlett, David L.; Zenati, Mazen; Zeh, Herbert J.

    2015-01-01

    Background and Objectives Computer-assisted robotic surgery allows complex resections and anastomotic reconstructions to be performed with nearly identical standards to open surgery. We applied this technology to a variety of pancreatic resections to assess the safety, feasibility, versatility, and reliability of this technology. Methods A retrospective review of a prospective database of robotic pancreatic resections at a single institution between August 2008 and November 2012 was performed. Perioperative outcomes were analyzed. Results 250 consecutive robotic pancreatic resections were analyzed: pancreaticoduodenectomy (PD=132), distal pancreatectomy (DP=83), central pancreatectomy (CP=13), pancreatic enucleation (10), total pancreatectomy (TP=5), Appleby resection (4), and Frey procedure (3). Thirty-day and 90-day mortality were 0.8% and 2.0%, respectively. Rates of Clavien grade 3 and 4 complications were 14% and 6%. The ISGPF grade C fistula rate was 4%. Mean operative time for the two most common procedures was 529 ± 103 min for PD and 257 ± 93 min for DP. Continuous improvement in operative times was observed over the course of the experience. Conversion to an open procedure was required in 16 patients (6%) (11 PD, 2 DP, 2 CP, 1 TP), for failure to progress (14) and bleeding (2). Conclusions This represents, to our knowledge, the largest series of robotic pancreatic resections. Safety and feasibility metrics, including the low incidence of conversion, support the robustness of this platform and suggest no unanticipated risks inherent to this new technology. By defining these early outcome metrics, this report begins to establish a framework for comparative effectiveness studies of this platform. PMID:24002300

  1. Towards "DRONE-BORNE" Disaster Management: Future Application Scenarios

    NASA Astrophysics Data System (ADS)

    Tanzi, Tullio Joseph; Chandra, Madhu; Isnard, Jean; Camara, Daniel; Sebastien, Olivier; Harivelo, Fanilo

    2016-06-01

    Information plays a key role in crisis management and relief efforts for natural disaster scenarios. Given their flight properties, UAVs (Unmanned Aerial Vehicles) provide new and interesting perspectives on data gathering for disaster management. A new generation of UAVs may help to improve situational awareness and information assessment. Among the advantages UAVs may bring to the disaster management field, we can highlight the gain in terms of time and human resources, as they can free rescue teams from time-consuming data collection tasks and assist search operations with more insightful and precise guidance thanks to advanced sensing capabilities. However, in order to be useful, UAVs need to overcome two main challenges. The first is to achieve a sufficient autonomy level, both in terms of navigation and interpretation of the sensed data. The second major challenge relates to the reliability of the UAV with respect to accidental (safety) or malicious (security) risks. This paper first discusses the potential of UAVs in assisting different humanitarian relief scenarios, as well as possible issues in such situations. Based on recent experiments, we discuss the inherent advantages of autonomous flight operations, both lone flights and formation flights. The question of autonomy is then addressed, and a secure embedded architecture and its specific hardware capabilities are sketched out. We finally present a typical use case based on the new detection and observation abilities that UAVs can bring to rescue teams. Although this approach still has limits that have to be addressed, both technically and operationally, it is a very promising way to enhance disaster management activities.

  2. Improving the Safety of Moving Lane Closures

    DOT National Transportation Integrated Search

    2009-06-01

    Moving lane closures are an increasingly utilized and inherently hazardous traffic control procedure for highway : maintenance and operations activities. To improve the safety of moving lane closures for workers and motorists, : this research studied...

  3. Chapter 15: Reliability of Wind Turbines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheng, Shuangwen; O'Connor, Ryan

    The global wind industry has witnessed exciting developments in recent years. The future will be even brighter with further reductions in capital and operation and maintenance costs, which can be accomplished with improved turbine reliability, especially when turbines are installed offshore. One opportunity for the industry to improve wind turbine reliability is through the exploration of reliability engineering life data analysis based on readily available data or maintenance records collected at typical wind plants. If adopted and conducted appropriately, these analyses can quickly save operation and maintenance costs in a potentially impactful manner. This chapter discusses wind turbine reliability by highlighting the methodology of reliability engineering life data analysis. It first briefly discusses fundamentals of wind turbine reliability and the current industry status. Then, the reliability engineering method for life analysis, including data collection, model development, and forecasting, is presented in detail and illustrated through two case studies. The chapter concludes with some remarks on potential opportunities to improve wind turbine reliability. An owner and operator's perspective is taken, and mechanical components are used to exemplify the potential benefits of reliability engineering analysis for improving wind turbine reliability and availability.
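
    Reliability engineering life data analysis of the kind highlighted in the chapter typically reduces to fitting a Weibull distribution to component times-to-failure; a simplified median-rank regression fit on made-up gearbox failure times is sketched below. Real wind-plant data also contain censored (still-running) units, which this sketch ignores.

```python
# Simplified sketch of reliability-engineering life data analysis: fitting a
# two-parameter Weibull distribution to component times-to-failure by
# median-rank regression. The failure times are made up, and censored
# (still-running) units are ignored.
import math

def fit_weibull_mrr(failure_times):
    """Return (shape beta, scale eta) from median-rank regression."""
    t = sorted(failure_times)
    n = len(t)
    xs, ys = [], []
    for i, ti in enumerate(t, start=1):
        median_rank = (i - 0.3) / (n + 0.4)          # Bernard's approximation
        xs.append(math.log(ti))
        ys.append(math.log(-math.log(1.0 - median_rank)))
    x_mean, y_mean = sum(xs) / n, sum(ys) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) /
             sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    beta = slope
    eta = math.exp(-intercept / slope)
    return beta, eta

gearbox_hours = [11000, 16500, 21000, 26000, 30500, 41000]  # hypothetical
beta, eta = fit_weibull_mrr(gearbox_hours)
print(f"shape beta = {beta:.2f}, scale eta = {eta:.0f} h")
print(f"B10 life   = {eta * (-math.log(0.9)) ** (1 / beta):.0f} h")
```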

  4. Operational present status and reliability analysis of the upgraded EAST cryogenic system

    NASA Astrophysics Data System (ADS)

    Zhou, Z. W.; Zhang, Q. Y.; Lu, X. F.; Hu, L. B.; Zhu, P.

    2017-12-01

    Since its first commissioning in 2005, the cryogenic system for EAST (Experimental Advanced Superconducting Tokamak) has been cooled down and warmed up for thirteen experimental campaigns. In order to improve refrigeration efficiency and reliability, the EAST cryogenic system was upgraded gradually from 2012 to 2015 with new helium screw compressors and new dynamic gas-bearing helium turbine expanders with eddy current brakes, addressing the originally poor mechanical and operational performance. The fully upgraded cryogenic system was put into operation in the eleventh cool-down experiment and has been operated for the latest several experimental campaigns. The upgraded system has successfully coped with the various normal operational modes during cool-down and 4.5 K steady-state operation under pulsed heat load from the tokamak, as well as abnormal fault modes including turbine protection stops. In this paper, the upgraded EAST cryogenic system, including its functional analysis and new cryogenic control networks, will be presented in detail. Its present operational status in the latest cool-down experiments will also be presented and the system reliability will be analyzed, showing high reliability and a low fault rate after the upgrade. Finally, the work needed to meet the higher reliability requirements of future uninterrupted long-term experimental operation will also be proposed.

  5. The effect of robot dynamics on smoothness during wrist pointing.

    PubMed

    Erwin, Andrew; Pezent, Evan; Bradley, Joshua; O'Malley, Marcia K

    2017-07-01

    The improvement of movement smoothness over the course of therapy is one of the positive outcomes observed during robotic rehabilitation. Although movements are generally robust to disturbances, certain perturbations might disrupt an individual's ability to produce these smooth movements. In this paper, we explore how a rehabilitation robot's inherent dynamics impact movement smoothness during pointing tasks. Able-bodied participants made wrist pointing movements under four different operating conditions. Despite the relative transparency of the device, inherent dynamic characteristics negatively impacted movement smoothness. Active compensation for Coulomb friction effects failed to mitigate the degradation in smoothness. Assessment of movements that involved coupled motions of the robot's joints reduced the bias seen in single degree of freedom movements. When using robotic devices for assessment of movement quality, the impact of the inherent dynamics must be considered.
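
    The abstract does not name the smoothness measure used; the sketch below uses one common choice, the log dimensionless jerk of the sampled trajectory, where less negative values indicate smoother movement. The wrist-angle traces are synthetic.

```python
# Sketch of one common movement-smoothness metric, the log dimensionless
# jerk, computed from a sampled wrist-angle trajectory. The metric choice
# is an illustrative assumption and the trajectories below are synthetic.
import numpy as np

def log_dimensionless_jerk(angle, fs):
    """Larger (less negative) values indicate smoother movement."""
    dt = 1.0 / fs
    jerk = np.diff(angle, n=3) / dt**3          # third derivative of position
    duration = (len(angle) - 1) * dt
    amplitude = abs(angle[-1] - angle[0])
    dimensionless = np.sum(jerk**2) * dt * duration**5 / amplitude**2
    return -np.log(dimensionless)

fs = 1000.0                                      # sampling rate, Hz
t = np.arange(0.0, 1.0, 1.0 / fs)
smooth = 0.5 * (1.0 - np.cos(np.pi * t))         # smooth 0 -> 1 rad movement
jittery = smooth + 0.005 * np.sin(2.0 * np.pi * 15.0 * t)   # added tremor

print("smooth  :", round(log_dimensionless_jerk(smooth, fs), 2))
print("jittery :", round(log_dimensionless_jerk(jittery, fs), 2))
```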

  6. Next market opportunities for phosphoric acid fuel cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McClelland, R.H.

    Key early entry markets for the next-step PC25 Model C fuel cell are most likely to include: premium quality power markets such as data centers, communications facilities, and the like; healthcare facilities, particularly nursing homes and hospitals having 300 or more beds, where the thermal side of a 200 kW fuel cell is an excellent match and some importance is also attached to power quality and reliability; and auxiliary electric power at natural gas compression facilities, which also tend to place a premium on reliability and low maintenance; moreover, the fuel cell's inherently low emissions can be very important within the northeast Ozone Transport Region. For the fuel cell concept to remain viable, penetration of this class of early entry markets is needed to sustain economic and reliability progress toward a goal of moderate production volumes. This can then build the needed bridge to further markets and to other emerging fuel cell technologies.

  7. From Operating-System Correctness to Pervasively Verified Applications

    NASA Astrophysics Data System (ADS)

    Daum, Matthias; Schirmer, Norbert W.; Schmidt, Mareike

    Though program verification is known and has been used for decades, the verification of a complete computer system still remains a grand challenge. Part of this challenge is the interaction of application programs with the operating system, which is usually entrusted with retrieving input data from and transferring output data to peripheral devices. In this scenario, the correct operation of the applications inherently relies on operating-system correctness. Based on the formal correctness of our real-time operating system Olos, this paper describes an approach to pervasively verify applications running on top of the operating system.

  8. An Investigation of the Immediate Effect of Static Stretching on the Morphology and Stiffness of Achilles Tendon in Dominant and Non-Dominant Legs

    PubMed Central

    Chiu, Tsz-chun Roxy; Ngo, Hiu-ching; Lau, Lai-wa; Leung, King-wah; Lo, Man-him; Yu, Ho-fai; Ying, Michael

    2016-01-01

    Aims This study was undertaken to investigate the immediate effect of static stretching on normal Achilles tendon morphology and stiffness, and whether the effect differs between dominant and non-dominant legs; and to evaluate the inter-operator and intra-operator reliability of shear-wave elastography in measuring Achilles tendon stiffness. Methods 20 healthy subjects (13 males, 7 females) were included in the study. Thickness, cross-sectional area, and stiffness of the Achilles tendons in both legs were measured before and after 5 minutes of static stretching using grey-scale ultrasound and shear-wave elastography. Inter-operator and intra-operator reliability of tendon stiffness measurements by six operators were evaluated. Results Results showed that there was no significant change in the thickness or cross-sectional area of the Achilles tendon after static stretching in either the dominant or the non-dominant leg (p > 0.05). Tendon stiffness showed a significant increase in the non-dominant leg (p < 0.05) but not in the dominant leg (p > 0.05). The inter-operator reliability of shear-wave elastography measurements was 0.749 and the intra-operator reliability ranged from 0.751 to 0.941. Conclusion Shear-wave elastography is a useful and non-invasive imaging tool to assess the immediate stiffness change of the Achilles tendon in response to static stretching, with high intra-operator and inter-operator reliability. PMID:27120097

  9. Seismic fiber optic multiplexed sensors for exploration and reservoir management

    NASA Astrophysics Data System (ADS)

    Houston, Mark H.

    2000-12-01

    Reliable downhole communications, control, and sensor networks will dramatically improve oil reservoir management practices and will enable the construction of intelligent or smart-well completions. Fiber optic technology will play a key role in the implementation of these communication, control, and sensing systems because of its inherent advantages in power, weight, and reliability over more conventional electronic-based systems. Field test data, acquired using an array of fiber optic seismic hydrophones within a steam-flood, heavy oil-production field, showed a significant improvement (10X in this specific case) in subsurface resolution as compared to conventional surface seismic acquisition. These results demonstrate the viability of using multiplexed fiber optic sensors for exploration and reservoir management in 3D vertical seismic profiling (VSP) surveys and in permanent sensor arrays for 4D surveys.

  10. Theory and practice in the electrometric determination of pH in precipitation

    NASA Astrophysics Data System (ADS)

    Brennan, Carla Jo; Peden, Mark E.

    Basic theory and laboratory investigations have been applied to the electrometric determination of pH in precipitation samples in an effort to improve the reliability of the results obtained from these low ionic strength samples. The theoretical problems inherent in the measurement of pH in rain have been examined using natural precipitation samples with varying ionic strengths and pH values. The importance of electrode design and construction is stressed. The proper choice of electrode can minimize or eliminate problems arising from residual liquid junction potentials, streaming potentials, and temperature differences. Reliable pH measurements can be made in precipitation samples using commercially available calibration buffers, provided that low ionic strength quality control solutions are routinely used to verify electrode and meter performance.
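
    The electrode behavior underlying these measurements follows the Nernst relation between cell potential and pH, so the ideal slope depends on temperature; the short calculation below shows why temperature differences between calibration buffers and samples matter. It assumes ideal electrode response.

```python
# The ideal pH electrode response follows the Nernst relation:
#   E = E0 - (R*T*ln(10)/F) * pH
# so the slope (mV per pH unit) depends on temperature, which is one reason
# temperature differences between calibration buffers and samples matter.
import math

R = 8.314462            # gas constant, J / (mol K)
F = 96485.332           # Faraday constant, C / mol

def nernst_slope_mV_per_pH(temp_c):
    temp_k = temp_c + 273.15
    return 1000.0 * R * temp_k * math.log(10.0) / F

for t in (5.0, 25.0, 40.0):
    print(f"{t:4.1f} C : {nernst_slope_mV_per_pH(t):.2f} mV per pH unit")

# Example: a 2 mV offset error at 25 C in terms of pH units
slope_25 = nernst_slope_mV_per_pH(25.0)
print(f"2 mV offset at 25 C corresponds to ~{2.0 / slope_25:.3f} pH units")
```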

  11. Autonomous Control of Space Nuclear Reactors

    NASA Technical Reports Server (NTRS)

    Merk, John

    2013-01-01

    Nuclear reactors to support future robotic and manned missions impose new and innovative technological requirements for their control and protection instrumentation. Long-duration surface missions necessitate reliable autonomous operation, and manned missions impose added requirements for failsafe reactor protection. There is a need for an advanced instrumentation and control system for space-nuclear reactors that addresses both aspects of autonomous operation and safety. The Reactor Instrumentation and Control System (RICS) consists of two functionally independent systems: the Reactor Protection System (RPS) and the Supervision and Control System (SCS). Through these two systems, the RICS both supervises and controls a nuclear reactor during normal operational states, as well as monitors the operation of the reactor and, upon sensing a system anomaly, automatically takes the appropriate actions to prevent an unsafe or potentially unsafe condition from occurring. The RPS encompasses all electrical and mechanical devices and circuitry, from sensors to actuation device output terminals. The SCS contains a comprehensive data acquisition system to measure continuously different groups of variables consisting of primary measurement elements, transmitters, or conditioning modules. These reactor control variables can be categorized into two groups: those directly related to the behavior of the core (known as nuclear variables) and those related to secondary systems (known as process variables). Reliable closed-loop reactor control is achieved by processing the acquired variables and actuating the appropriate device drivers to maintain the reactor in a safe operating state. The SCS must prevent a deviation from the reactor nominal conditions by managing limitation functions in order to avoid RPS actions. The RICS has four identical redundancies that comply with physical separation, electrical isolation, and functional independence. This architecture complies with the safety requirements of a nuclear reactor and provides high availability to the host system. The RICS is intended to interface with a host computer (the computer of the spacecraft where the reactor is mounted). The RICS leverages the safety features inherent in Earth-based reactors and also integrates the wide range neutron detector (WRND). A neutron detector provides the input that allows the RICS to do its job. The RICS is based on proven technology currently in use at a nuclear research facility. In its most basic form, the RICS is a ruggedized, compact data-acquisition and control system that could be adapted to support a wide variety of harsh environments. As such, the RICS could be a useful instrument outside the scope of a nuclear reactor, including military applications where failsafe data acquisition and control is required with stringent size, weight, and power constraints.

  12. Critical issues in assuring long lifetime and fail-safe operation of optical communications network

    NASA Astrophysics Data System (ADS)

    Paul, Dilip K.

    1993-09-01

    Major factors in assuring long lifetime and fail-safe operation in optical communications networks are reviewed in this paper. Reliable functionality to design specifications, complexity of implementation, and cost are the most critical issues. As economics is the driving force to set the goals as well as priorities for the design, development, safe operation, and maintenance schedules of reliable networks, a balance is sought between the degree of reliability enhancement, cost, and acceptable outage of services. Protecting both the link and the network with high reliability components, hardware duplication, and diversity routing can ensure the best network availability. Case examples include both fiber optic and lasercom systems. Also, the state-of-the-art reliability of photonics in space environment is presented.
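
    The trade between hardware duplication, diversity routing, and availability mentioned above can be made concrete with a standard series/parallel availability calculation; the MTBF and MTTR figures below are hypothetical, and the parallel formula assumes independent failures on the duplicated paths.

```python
# Standard series/parallel availability sketch for an optical link protected
# by hardware duplication (1+1 redundancy). The MTBF/MTTR figures below are
# hypothetical, not values from any fielded network.
def availability(mtbf_h, mttr_h):
    return mtbf_h / (mtbf_h + mttr_h)

def series(*avail):                       # all elements must be up
    a = 1.0
    for x in avail:
        a *= x
    return a

def parallel(*avail):                     # at least one path must be up
    a_fail = 1.0
    for x in avail:
        a_fail *= (1.0 - x)
    return 1.0 - a_fail

transmitter = availability(mtbf_h=50_000, mttr_h=8)
fiber_span  = availability(mtbf_h=20_000, mttr_h=24)
receiver    = availability(mtbf_h=60_000, mttr_h=8)

single_path = series(transmitter, fiber_span, receiver)
protected   = parallel(single_path, single_path)   # 1+1 duplicated path

for name, a in [("single path", single_path), ("1+1 protected", protected)]:
    downtime_min_per_year = (1.0 - a) * 365.0 * 24.0 * 60.0
    print(f"{name:14s} A = {a:.6f}  (~{downtime_min_per_year:.0f} min/yr down)")
```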

  13. Temporalis myofascial flap transfer into the oral cavity without zygomatic arch osteotomy

    PubMed Central

    Tauro, David P.; Mishra, Madan; Singh, Gaurav

    2013-01-01

    Among plethora of options, the temporalis myofascial flap remains a workhorse for the maxillofacial reconstruction. The inherent advantages include reliable vascularity, adequate size, and proximity to the defect. Although contemporary surgical techniques provide fair surgical results with low rate of complications, their intraoral transposition involve additional surgical trauma by intentional fracturing of the zygomatic arch. We have proposed herein a simpler technique of temporalis myofascial flap transposition into the oral cavity without zygomatic arch osteotomy. PMID:24665182

  14. Magnetic Gearing Versus Conventional Gearing in Actuators for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Puchhammer, Gregor

    2014-01-01

    Magnetic geared actuators (MGA) are designed to perform highly reliable, robust and precise motion on satellite platforms or aerospace vehicles. The design allows MGA to be used for various tasks in space applications. In contrast to conventional geared drives, the contact and lubrication free force transmitting elements lead to a considerable lifetime and range extension of drive systems. This paper describes the fundamentals of magnetic wobbling gears (MWG) and the deduced inherent characteristics, and compares conventional and magnetic gearing.

  15. Displacement Damage Induced Catastrophic Second Breakdown in Silicon Carbide Schottky Power Diodes

    NASA Technical Reports Server (NTRS)

    Scheick, Leif; Selva, Luis

    2004-01-01

    A novel catastrophic breakdown mode in reverse-biased silicon carbide diodes has been seen for low-LET particles. These particles are too low in LET to induce single-event burnout (SEB), although SEB was seen for particles of higher LET. The low-LET mechanism correlates with second breakdown in diodes due to increased leakage and assisted charge injection from incident particles. Percolation theory was used to predict some basic responses of the devices, but the inherent reliability issues with silicon carbide have proven challenging.

  16. [Automated procedure for volumetric measurement of metastases: estimation of tumor burden].

    PubMed

    Fabel, M; Bolte, H

    2008-09-01

    Cancer is a common and increasing disease worldwide. Therapy monitoring in oncologic patient care requires accurate and reliable measurement methods for evaluation of the tumor burden. RECIST (response evaluation criteria in solid tumors) and WHO criteria are still the current standards for therapy response evaluation, with inherent disadvantages due to considerable interobserver variation in the manual diameter estimations. Volumetric analysis of, e.g., lung, liver and lymph node metastases promises to be a more accurate, precise and objective method for tumor burden estimation.

  17. 16 CFR 1211.7 - Inherent entrapment protection requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... door a minimum of 2 inches (50.8 mm). (b)(1) A solid object is to be placed on the floor of the test... the operator. When tested on the floor, the object shall be 1 inch (25.4 mm) high. In the test... when the door is fully closed. (2) For operators other than those attached to the door, a solid object...

  18. Theater Logistics Management: A Case for a Joint Distribution Solution

    DTIC Science & Technology

    2008-03-15

    Multinational (JIIM) operations necessitate creating joint-multinational-based distribution management centers which effectively manage materiel...in the world. However, as the operation continued, the inherent weakness of the intra-theater logistical distribution management link became clear...compounded the distribution management problem. The common thread between each of the noted GAO failures is the lack of a defined joint, theater

  19. 16 CFR 1211.7 - Inherent entrapment protection requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... door a minimum of 2 inches (50.8 mm). (b)(1) A solid object is to be placed on the floor of the test... when the door is fully closed. (2) For operators other than those attached to the door, a solid object is not required to be located in line with the driving point of the operator. The solid object is to...

  20. 16 CFR 1211.7 - Inherent entrapment protection requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... door a minimum of 2 inches (50.8 mm). (b)(1) A solid object is to be placed on the floor of the test... when the door is fully closed. (2) For operators other than those attached to the door, a solid object is not required to be located in line with the driving point of the operator. The solid object is to...

  1. Evaluation of Improved Pushback Forecasts Derived from Airline Ground Operations Data

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Theis, Georg; Feron, Eric; Clarke, John-Paul

    2003-01-01

    Accurate and timely predictions of airline pushbacks can potentially lead to improved performance of automated decision-support tools for airport surface traffic, thus reducing the variability and average duration of costly airline delays. One factor which affects the realization of these benefits is the level of uncertainty inherent in the turn processes. To characterize this inherent uncertainty, three techniques are developed for predicting time-to-go until pushback as a function of available ground-time; elapsed ground-time; and the status (not-started/in-progress/completed) of individual turn processes (cleaning, fueling, etc.). These techniques are tested against a large and detailed dataset covering approximately 10^4 real-world turn operations obtained through collaboration with Deutsche Lufthansa AG. Even after the dataset is filtered to obtain a sample of turn operations with minimal uncertainty, the standard deviation of forecast error for all three techniques is lower-bounded away from zero, indicating that turn operations have a significant stochastic component. This lower-bound result shows that decision-support tools must be designed to incorporate robust mechanisms for coping with pushback demand stochasticity, rather than treating the pushback demand process as a known deterministic input.
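
    The abstract names the predictor inputs but not the model form. The sketch below is one plausible reading: an ordinary least-squares regression of time-to-go on elapsed ground-time, available ground-time, and turn-process status flags, fitted to synthetic data; the residual standard deviation plays the role of the forecast-error spread discussed above. Everything in it is an assumption for illustration.

      # Hedged sketch of one predictor style described above; feature encoding
      # and data are illustrative assumptions, not Lufthansa operations data.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 500
      elapsed = rng.uniform(0, 60, n)                  # minutes since on-block
      available = rng.uniform(45, 90, n)               # scheduled ground time
      cleaning_done = (elapsed > 25).astype(float)     # 0/1 status flags
      fueling_done = (elapsed > 35).astype(float)
      # synthetic "truth": time-to-go shrinks with elapsed time and completed processes
      time_to_go = available - elapsed - 5 * cleaning_done - 8 * fueling_done + rng.normal(0, 4, n)

      X = np.column_stack([np.ones(n), elapsed, available, cleaning_done, fueling_done])
      coef, *_ = np.linalg.lstsq(X, time_to_go, rcond=None)
      resid = time_to_go - X @ coef
      print("std. dev. of forecast error (min):", resid.std(ddof=X.shape[1]))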

  2. Medicine is not science: guessing the future, predicting the past.

    PubMed

    Miller, Clifford

    2014-12-01

    Irregularity limits human ability to know, understand and predict. A better understanding of irregularity may improve the reliability of knowledge. Irregularity and its consequences for knowledge are considered. Reliable predictive empirical knowledge of the physical world has always been obtained by observation of regularities, without needing science or theory. Prediction from observational knowledge can remain reliable despite some theories based on it proving false. A naïve theory of irregularity is outlined. Reducing irregularity and/or increasing regularity can increase the reliability of knowledge. Beyond long experience and specialization, improvements include implementing supporting knowledge systems of libraries of appropriately classified prior cases and clinical histories and education about expertise, intuition and professional judgement. A consequence of irregularity and complexity is that classical reductionist science cannot provide reliable predictions of the behaviour of complex systems found in nature, including of the human body. Expertise, expert judgement and their exercise appear overarching. Diagnosis involves predicting the past will recur in the current patient applying expertise and intuition from knowledge and experience of previous cases and probabilistic medical theory. Treatment decisions are an educated guess about the future (prognosis). Benefits of the improvements suggested here are likely in fields where paucity of feedback for practitioners limits development of reliable expert diagnostic intuition. Further analysis, definition and classification of irregularity is appropriate. Observing and recording irregularities are initial steps in developing irregularity theory to improve the reliability and extent of knowledge, albeit some forms of irregularity present inherent difficulties. © 2014 John Wiley & Sons, Ltd.

  3. Inter-operator and inter-device agreement and reliability of the SEM Scanner.

    PubMed

    Clendenin, Marta; Jaradeh, Kindah; Shamirian, Anasheh; Rhodes, Shannon L

    2015-02-01

    The SEM Scanner is a medical device designed for use by healthcare providers as part of pressure ulcer prevention programs. The objective of this study was to evaluate the inter-rater and inter-device agreement and reliability of the SEM Scanner. Thirty-one (31) volunteers free of pressure ulcers or broken skin at the sternum, sacrum, and heels were assessed with the SEM Scanner. Each of three operators utilized each of three devices to collect readings from four anatomical sites (sternum, sacrum, left and right heels) on each subject for a total of 108 readings per subject collected over approximately 30 min. For each combination of operator-device-anatomical site, three SEM readings were collected. Inter-operator and inter-device agreement and reliability were estimated. Over the course of this study, more than 3000 SEM Scanner readings were collected. Agreement between operators was good with mean differences ranging from -0.01 to 0.11. Inter-operator and inter-device reliability exceeded 0.80 at all anatomical sites assessed. The results of this study demonstrate the high reliability and good agreement of the SEM Scanner across different operators and different devices. Given the limitations of current methods to prevent and detect pressure ulcers, the SEM Scanner shows promise as an objective, reliable tool for assessing the presence or absence of pressure-induced tissue damage such as pressure ulcers. Copyright © 2015 Bruin Biometrics, LLC. Published by Elsevier Ltd.. All rights reserved.
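
    The record reports reliability coefficients above 0.80 without naming the estimator. One common choice is a one-way random-effects intraclass correlation, ICC(1,1); the sketch below computes it for synthetic readings of 31 subjects by 3 operators, so both the data and the choice of coefficient are assumptions.

      # Illustrative sketch (the abstract does not specify the coefficient used):
      # one-way random-effects intraclass correlation for repeated readings.
      import numpy as np

      def icc_1_1(data: np.ndarray) -> float:
          """data shape: (n_subjects, k_raters)."""
          n, k = data.shape
          grand = data.mean()
          ms_between = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
          ms_within = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
          return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

      rng = np.random.default_rng(1)
      true_sem = rng.uniform(1.0, 4.0, size=31)                   # 31 subjects, as in the study
      readings = true_sem[:, None] + rng.normal(0, 0.2, (31, 3))  # 3 operators, small noise
      print(f"ICC(1,1) = {icc_1_1(readings):.3f}")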

  4. Low vibration microminiature split Stirling cryogenic cooler for infrared aerospace applications

    NASA Astrophysics Data System (ADS)

    Veprik, A.; Zechtzer, S.; Pundak, N.; Kirkconnel, C.; Freeman, J.; Riabzev, S.

    2011-06-01

    The operation of the thermo-mechanical unit of a cryogenic cooler may induce resonant excitation of the spacecraft frame, optical bench or components of the optical train. This may result in degraded functionality of the inherently vibration-sensitive space-borne infrared imager directly associated with the cooler, or of neighboring instrumentation typically requiring a quiet micro-g environment. The best practice for controlling cooler-induced vibration relies on the principle of active momentum cancellation. In particular, the pressure wave generator typically contains two oppositely actuated piston compressors, while the single-piston expander is counterbalanced by an auxiliary active counter-balancer. Active vibration cancellation is supervised by a dedicated DSP feed-forward controller, where the error signals are delivered by vibration sensors (accelerometers or load cells). This can result in oversized, overweight and overpriced cryogenic coolers with degraded electromechanical performance and impaired reliability. The authors advocate a reliable, compact, cost- and power-saving approach capitalizing on the combined application of a passive tuned dynamic absorber and a low-frequency vibration isolator. This concept appears to be especially suitable for low-budget missions involving mini and micro satellites, where price, size, weight and power consumption are of concern. The authors present the results of a theoretical study and experimentation on the attainable performance using a full-scale technology demonstrator relying on a Ricor model K527 tactical split Stirling cryogenic cooler. The theoretical predictions are in fair agreement with the experimental data. Experimentation showed that the residual vibration export is suitable for a wide range of demanding aerospace applications. The authors give practical recommendations on heatsinking and further maximizing performance.

  5. Brain networks for confidence weighting and hierarchical inference during probabilistic learning.

    PubMed

    Meyniel, Florent; Dehaene, Stanislas

    2017-05-09

    Learning is difficult when the world fluctuates randomly and ceaselessly. Classical learning algorithms, such as the delta rule with constant learning rate, are not optimal. Mathematically, the optimal learning rule requires weighting prior knowledge and incoming evidence according to their respective reliabilities. This "confidence weighting" implies the maintenance of an accurate estimate of the reliability of what has been learned. Here, using fMRI and an ideal-observer analysis, we demonstrate that the brain's learning algorithm relies on confidence weighting. While in the fMRI scanner, human adults attempted to learn the transition probabilities underlying an auditory or visual sequence, and reported their confidence in those estimates. They knew that these transition probabilities could change simultaneously at unpredicted moments, and therefore that the learning problem was inherently hierarchical. Subjective confidence reports tightly followed the predictions derived from the ideal observer. In particular, subjects managed to attach distinct levels of confidence to each learned transition probability, as required by Bayes-optimal inference. Distinct brain areas tracked the likelihood of new observations given current predictions, and the confidence in those predictions. Both signals were combined in the right inferior frontal gyrus, where they operated in agreement with the confidence-weighting model. This brain region also presented signatures of a hierarchical process that disentangles distinct sources of uncertainty. Together, our results provide evidence that the sense of confidence is an essential ingredient of probabilistic learning in the human brain, and that the right inferior frontal gyrus hosts a confidence-based statistical learning algorithm for auditory and visual sequences.
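
    A minimal sketch of the contrast drawn above, assuming a Gaussian approximation rather than the authors' exact ideal observer: a delta rule with a constant learning rate versus an update that weights prior and evidence by their precisions, so that the effective learning rate shrinks as confidence grows.

      # Minimal sketch (an assumption, not the authors' ideal-observer model):
      # constant-learning-rate delta rule vs. precision-weighted updating.

      def constant_rate_update(estimate, observation, rate=0.1):
          return estimate + rate * (observation - estimate)

      def precision_weighted_update(mu, var, obs, obs_var):
          """Combine prior (mu, var) with an observation of variance obs_var;
          the effective learning rate var/(var+obs_var) falls as confidence grows."""
          gain = var / (var + obs_var)
          new_mu = mu + gain * (obs - mu)
          new_var = var * obs_var / (var + obs_var)
          return new_mu, new_var

      mu, var = 0.5, 0.25          # uncertain prior about a transition probability
      est = 0.5
      for obs in [1, 1, 0, 1, 1]:
          est = constant_rate_update(est, obs)
          mu, var = precision_weighted_update(mu, var, obs, obs_var=0.25)
          print(f"delta-rule={est:.3f}  weighted={mu:.3f}  confidence (1/var)={1/var:.1f}")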

  6. Proton Exchange Membrane (PEM) Fuel Cells for Space Applications

    NASA Technical Reports Server (NTRS)

    Bradley, Karla

    2004-01-01

    This presentation will provide a summary of the PEM fuel cell development at the National Aeronautics and Space Administration, Johnson Space Center (NASA, JSC) in support of future space applications. Fuel cells have been used for space power generation due to their high energy storage density for multi-day missions. The Shuttle currently utilizes alkaline fuel cell technology, which has proven highly safe and reliable. However, the alkaline technology has a limited life due to its inherent susceptibility to corrosion. PEM fuel cells are under development by industry for transportation, residential and commercial stationary power applications. NASA is trying to incorporate some of this stack technology development in the PEM fuel cells for space. NASA has some unique design and performance parameters which make developing a PEM fuel cell system more challenging. Space fuel cell applications utilize oxygen, rather than air, which yields better performance but increases the hazard level. To reduce the quantity of reactants that need to be flown in space, NASA also utilizes water separation and reactant recirculation. Due to the hazards of utilizing active components for recirculation and water separation, NASA is trying to develop passive recirculation and water separation methods. However, the ability to develop recirculation components and water separators that are gravity-independent and successfully operate over the full range of power levels is one of the greatest challenges to developing a safe and reliable PEM fuel cell system. PEM stack, accessory component, and system tests that have been performed for space power applications will be discussed.

  7. Brain networks for confidence weighting and hierarchical inference during probabilistic learning

    PubMed Central

    Meyniel, Florent; Dehaene, Stanislas

    2017-01-01

    Learning is difficult when the world fluctuates randomly and ceaselessly. Classical learning algorithms, such as the delta rule with constant learning rate, are not optimal. Mathematically, the optimal learning rule requires weighting prior knowledge and incoming evidence according to their respective reliabilities. This “confidence weighting” implies the maintenance of an accurate estimate of the reliability of what has been learned. Here, using fMRI and an ideal-observer analysis, we demonstrate that the brain’s learning algorithm relies on confidence weighting. While in the fMRI scanner, human adults attempted to learn the transition probabilities underlying an auditory or visual sequence, and reported their confidence in those estimates. They knew that these transition probabilities could change simultaneously at unpredicted moments, and therefore that the learning problem was inherently hierarchical. Subjective confidence reports tightly followed the predictions derived from the ideal observer. In particular, subjects managed to attach distinct levels of confidence to each learned transition probability, as required by Bayes-optimal inference. Distinct brain areas tracked the likelihood of new observations given current predictions, and the confidence in those predictions. Both signals were combined in the right inferior frontal gyrus, where they operated in agreement with the confidence-weighting model. This brain region also presented signatures of a hierarchical process that disentangles distinct sources of uncertainty. Together, our results provide evidence that the sense of confidence is an essential ingredient of probabilistic learning in the human brain, and that the right inferior frontal gyrus hosts a confidence-based statistical learning algorithm for auditory and visual sequences. PMID:28439014

  8. The Challenge of Wireless Reliability and Coexistence.

    PubMed

    Berger, H Stephen

    2016-09-01

    Wireless communication plays an increasingly important role in healthcare delivery. This further heightens the importance of wireless reliability, but quantifying wireless reliability is a complex and difficult challenge. Understanding the risks that accompany the many benefits of wireless communication should be a component of overall risk management. The emerging trend of using sensors and other device-to-device communications, as part of the emerging Internet of Things concept, is evident in healthcare delivery. The trend increases both the importance and complexity of this challenge. As with most system problems, finding a solution requires breaking down the problem into manageable steps. Understanding the operational reliability of a new wireless device and its supporting system requires developing solid, quantified answers to three questions: 1) How well can this new device and its system operate in a spectral environment where many other wireless devices are also operating? 2) What is the spectral environment in which this device and its system are expected to operate? Are the risks and reliability in its operating environment acceptable? 3) How might the new device and its system affect other devices and systems already in use? When operated under an insightful risk management process, wireless technology can be safely implemented, resulting in improved delivery of care.

  9. Joint Force Quarterly. Issue 52, 1st Quarter, January 2009

    DTIC Science & Technology

    2009-01-01

    hazard potential; self-contained operations with minimal heat or waste effluents; largely robotic operation; inherently safe operation volume...Moreover, even if a node is destroyed or a link cut, these systems are self-healing, allowing them to continue functioning with no apparent degra...Maxie Y. Davis, and Lee T. Wight 97 Irregular Warfare Is Warfare By Kenneth C. Coons, Jr., and Glenn M. Harned 104 Wired for War? Robots and Military

  10. Space Transportation System Availability Requirement and Its Influencing Attributes Relationships

    NASA Technical Reports Server (NTRS)

    Rhodes, Russell E.; Adams, Timothy C.; McCleskey, Carey M.

    2008-01-01

    It is important that engineering and management accept the need for an availability requirement that is derived together with its influencing attributes. It is the intent of this paper to provide visibility of the relationships of these major attribute drivers (variables) to each other and to the resultant system inherent availability. It is also important to provide bounds on the variables, giving engineering the insight required to control the system's engineering solution; these influencing attributes become design requirements in their own right. These variables drive the need to integrate similar discipline functions, or to select technologies, so that the total parts count can be controlled. Selecting a reliability requirement places a constraint on parts count to achieve a given availability requirement; if the parts count is allowed to increase, the system reliability requirement is driven higher. These relationships also provide an understanding of how mean repair time (or mean downtime) relates to maintainability (accessibility for repair), and of how both mean time between failure (reliability of the hardware) and mean repair time relate to availability. The importance of achieving a strong availability requirement is driven by the need for affordability, by the choice of a two-launch solution for a single space application, and by the need to control the spare parts count needed to support a long stay in orbit or on the surface of the moon. Understanding the requirements before starting the architectural design concept avoids the considerable time and money otherwise required to iterate the design through redesign and reassessment to achieve the results required of the customer's space transportation system. In fact, the schedule impact of delivering a system that meets the customer's needs, goals, and objectives may force the customer to compromise the desired operational goals and objectives, resulting in considerably increased life cycle cost of the fielded space transportation system.
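
    A small numerical illustration of the parts-count constraint described above, under the usual simplifying assumptions that part failure rates are constant and that the system is a series arrangement (any part failure brings the system down); all figures are made up.

      # Illustrative numbers (assumed, not from the paper): how inherent
      # availability responds to parts count when per-part reliability is fixed.

      def inherent_availability(mtbf_hours: float, mttr_hours: float) -> float:
          return mtbf_hours / (mtbf_hours + mttr_hours)

      def system_mtbf(part_mtbf_hours: float, parts_count: int) -> float:
          # constant failure rates in series add, so system MTBF = part MTBF / N
          return part_mtbf_hours / parts_count

      for parts in (1_000, 5_000, 10_000):
          mtbf = system_mtbf(part_mtbf_hours=2_000_000, parts_count=parts)
          print(parts, round(inherent_availability(mtbf, mttr_hours=24), 4))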

  11. Space Transportation System Availability Requirements and Its Influencing Attributes Relationships

    NASA Technical Reports Server (NTRS)

    Rhodes, Russell E.; Adams, Timothy C.; McCleskey, Carey M.

    2008-01-01

    It is important that engineering and management accept the need for an availability requirement that is derived together with its influencing attributes. It is the intent of this paper to provide visibility of the relationships of these major attribute drivers (variables) to each other and to the resultant system inherent availability. It is also important to provide bounds on the variables, giving engineering the insight required to control the system's engineering solution; these influencing attributes become design requirements in their own right. These variables drive the need to integrate similar discipline functions, or to select technologies, so that the total parts count can be controlled. Selecting a reliability requirement places a constraint on parts count to achieve a given availability requirement; if the parts count is allowed to increase, the system reliability requirement is driven higher. These relationships also provide an understanding of how mean repair time (or mean downtime) relates to maintainability (accessibility for repair), and of how both mean time between failure (reliability of the hardware) and mean repair time relate to availability. The importance of achieving a strong availability requirement is driven by the need for affordability, by the choice of a two-launch solution for a single space application, and by the need to control the spare parts count needed to support a long stay in orbit or on the surface of the moon. Understanding the requirements before starting the architectural design concept avoids the considerable time and money otherwise required to iterate the design through redesign and reassessment to achieve the results required of the customer's space transportation system. In fact, the schedule impact of delivering a system that meets the customer's needs, goals, and objectives may force the customer to compromise the desired operational goals and objectives, resulting in considerably increased life cycle cost of the fielded space transportation system.

  12. Criticality Safety Evaluation of the LLNL Inherently Safe Subcritical Assembly (ISSA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Percher, Catherine

    2012-06-19

    The LLNL Nuclear Criticality Safety Division has developed a training center to illustrate criticality safety and reactor physics concepts through hands-on experimental training. The experimental assembly, the Inherently Safe Subcritical Assembly (ISSA), uses surplus highly enriched research reactor fuel configured in a water tank. The training activities will be conducted by LLNL following the requirements of an Integration Work Sheet (IWS) and associated Safety Plan. Students will be allowed to handle the fissile material under the supervision of LLNL instructors. This report provides the technical criticality safety basis for instructional operations with the ISSA experimental assembly.

  13. Enhanced Component Performance Study: Air-Operated Valves 1998-2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroeder, John Alton

    2015-11-01

    This report presents a performance evaluation of air-operated valves (AOVs) at U.S. commercial nuclear power plants. The data used in this study are based on the operating experience failure reports from fiscal year 1998 through 2014 for the component reliability as reported in the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The AOV failure modes considered are failure-to-open/close, failure to operate or control, and spurious operation. The component reliability estimates and the reliability data are trended for the most recent 10-year period, while yearly estimates for reliability are provided for the entire active period. One statistically significant trend was observed in the AOV data: The frequency of demands per reactor year for valves recording the fail-to-open or fail-to-close failure modes, for high-demand valves (those with greater than twenty demands per year), was found to be decreasing. The decrease was about three percent over the ten-year period trended.
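
    As a hedged illustration of the kind of quantities trended in such a study, the sketch below pools synthetic failure and demand counts into a per-demand failure probability and fits a linear trend to the yearly demand frequency. The counts are invented; they are not INPO/ICES data.

      # Hedged sketch (synthetic counts, not ICES data): per-demand failure
      # probability and a simple linear trend in yearly demand frequency.
      import numpy as np

      years = np.arange(2005, 2015)
      demands = np.array([260, 255, 250, 246, 240, 236, 232, 228, 224, 220])  # per reactor-year, assumed
      failures = np.array([2, 1, 3, 2, 1, 2, 1, 2, 1, 1])                     # fail-to-open/close, assumed

      p_fail_per_demand = failures.sum() / demands.sum()
      slope, intercept = np.polyfit(years, demands, 1)
      print(f"pooled failure probability per demand: {p_fail_per_demand:.2e}")
      print(f"demand trend: {slope:+.2f} demands per reactor-year per year")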

  14. Human Factors in Railroad Operations : Initial Studies

    DOT National Transportation Integrated Search

    1972-01-01

    This report summarizes the progress of a year's work in providing support in human factors to the Federal Railroad Administration. The principal topics include: (a) a description of the locomotive engineer's job, particularly with regard to its inher...

  15. Reliability and quality assurance on the MOD 2 wind system

    NASA Technical Reports Server (NTRS)

    Mason, W. E. B.; Jones, B. G.

    1981-01-01

    The Safety, Reliability, and Quality Assurance (R&QA) approach developed for the largest wind turbine generator, the Mod 2, is described. The R&QA approach assures that the machine is not hazardous to the public or to the operating personnel, can be operated unattended on a utility grid, and demonstrates reliable operation, and it helps establish the quality assurance and maintainability requirements for future wind turbine projects. The approach consisted of a failure modes and effects analysis (FMEA) during the design phase, hardware inspections during parts fabrication, and three simple documents to control activities during machine construction and operation.

  16. Materials as stem cell regulators

    PubMed Central

    Murphy, William L.; McDevitt, Todd C.; Engler, Adam J.

    2014-01-01

    The stem cell/material interface is a complex, dynamic microenvironment in which the cell and the material cooperatively dictate one another's fate: the cell by remodelling its surroundings, and the material through its inherent properties (such as adhesivity, stiffness, nanostructure or degradability). Stem cells in contact with materials are able to sense their properties, integrate cues via signal propagation and ultimately translate parallel signalling information into cell fate decisions. However, discovering the mechanisms by which stem cells respond to inherent material characteristics is challenging because of the highly complex, multicomponent signalling milieu present in the stem cell environment. In this Review, we discuss recent evidence that shows that inherent material properties may be engineered to dictate stem cell fate decisions, and overview a subset of the operative signal transduction mechanisms that have begun to emerge. Further developments in stem cell engineering and mechanotransduction are poised to have substantial implications for stem cell biology and regenerative medicine. PMID:24845994

  17. An Overview of a Trajectory-Based Solution for En Route and Terminal Area Self-Spacing to Include Parallel Runway Operations

    NASA Technical Reports Server (NTRS)

    Abbott, Terence S.

    2011-01-01

    This paper presents an overview of an algorithm specifically designed to support NASA's Airborne Precision Spacing concept. This airborne self-spacing concept is trajectory-based, allowing for spacing operations prior to the aircraft being on a common path. This implementation provides the ability to manage spacing against two traffic aircraft, with one of these aircraft operating to a parallel dependent runway. Because this algorithm is trajectory-based, it also has the inherent ability to support required-time-of-arrival (RTA) operations.

  18. Economic assessment and optimal operation of CSP systems with TES in California electricity markets

    NASA Astrophysics Data System (ADS)

    Dowling, Alexander W.; Dyreson, Ana; Miller, Franklin; Zavala, Victor M.

    2017-06-01

    The economics and performance of concentrating solar power (CSP) systems with thermal energy storage (TES) inherently depend on operating policies and on the surrounding weather conditions and electricity markets. We present an integrated economic assessment framework to quantify the maximum possible revenues from simultaneous energy and ancillary services sales by CSP systems. The framework includes both discrete start-up/shutdown restrictions and detailed physical models. Analysis of coincidental historical market and meteorological data reveals that provision of ancillary services increases market revenue by 18% to 37% relative to energy-only participation. Surprisingly, only 53% to 62% of these revenues are available through sole participation in the day-ahead market, indicating significant opportunities at faster timescales. Motivated by water-usage concerns and permitting requirements, we also describe a new nighttime radiative-enhanced dry-cooling system with cold-side storage that consumes no water and offers higher efficiencies than traditional air-cooled designs. Operation of this new system is complicated by the cold-side storage and the inherent coupling between the cooling system and the power plant, further motivating integrated economic analysis.

  19. Approximation Model Building for Reliability & Maintainability Characteristics of Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Morris, W. Douglas; White, Nancy H.; Lepsch, Roger A.; Brown, Richard W.

    2000-01-01

    This paper describes the development of parametric models for estimating operational reliability and maintainability (R&M) characteristics for reusable vehicle concepts, based on vehicle size and technology support level. An R&M analysis tool (RMAT) and response surface methods are utilized to build parametric approximation models for rapidly estimating operational R&M characteristics such as mission completion reliability. These models, which approximate RMAT, can then be utilized for fast analysis of operational requirements, for life-cycle cost estimating, and for multidisciplinary design optimization.
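
    The following sketch shows the response-surface idea in miniature: a quadratic polynomial fitted by least squares to stand in for the slower analysis tool, so that a characteristic such as mission completion reliability can be estimated rapidly from vehicle size and technology support level. The variables, data, and coefficients are illustrative assumptions, not RMAT outputs.

      # Sketch of a quadratic response surface fitted to synthetic "RMAT-like" data.
      import numpy as np

      rng = np.random.default_rng(2)
      size = rng.uniform(50, 150, 40)          # notional vehicle dry mass, Mg
      tech = rng.uniform(0.0, 1.0, 40)         # notional technology support level
      # synthetic response: mission completion reliability (made-up relationship)
      reliability = 0.999 - 0.0002 * size + 0.01 * tech - 0.000001 * size**2 + rng.normal(0, 0.001, 40)

      X = np.column_stack([np.ones_like(size), size, tech, size**2, tech**2, size * tech])
      coef, *_ = np.linalg.lstsq(X, reliability, rcond=None)

      def predict(s, t):
          return np.array([1, s, t, s**2, t**2, s * t]) @ coef

      print(f"predicted reliability at size=100, tech=0.5: {predict(100, 0.5):.4f}")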

  20. A dual-channel flux-switching permanent magnet motor for hybrid electric vehicles

    NASA Astrophysics Data System (ADS)

    Hua, Wei; Wu, Zhongze; Cheng, Ming; Wang, Baoan; Zhang, Jianzhong; Zhou, Shigui

    2012-04-01

    The flux-switching permanent magnet (FSPM) motor is a relatively novel brushless machine having both magnets and concentrated windings in the stator, which exhibits inherently sinusoidal PM flux-linkage and back-EMF waveforms and high torque capability. However, in hybrid electric vehicle applications it is essential to prevent the magnets and armature windings from moving in the radial direction under the vibration possible during operation, and to ensure fault-tolerant capability. Hence, in this paper, based on an original FSPM motor, a dual-channel FSPM (DC-FSPM) motor with a modified structure that fixes both the armature windings and the magnets, and with improved reliability, is proposed for a practical 10 kW integral starter/generator (ISG) in hybrid electric vehicles. The influences of different solutions and of the end effect on the static characteristics are evaluated based on 2D and 3D finite element analysis, respectively. Finally, both the predicted and experimental results, compared with a prototype DC-FSPM motor and an interior PM motor used in the Honda Civic, confirm that a more sinusoidal back-EMF waveform and lower torque ripple can be achieved in the DC-FSPM motor, whereas the torque is smaller under the same coil current.

  1. Bladder tissue engineering through nanotechnology.

    PubMed

    Harrington, Daniel A; Sharma, Arun K; Erickson, Bradley A; Cheng, Earl Y

    2008-08-01

    The field of tissue engineering has developed in phases: initially researchers searched for "inert" biomaterials to act solely as replacement structures in the body. Then, they explored biodegradable scaffolds--both naturally derived and synthetic--for the temporary support of growing tissues. Now, a third phase of tissue engineering has developed, through the subcategory of "regenerative medicine." This renewed focus toward control over tissue morphology and cell phenotype requires proportional advances in scaffold design. Discoveries in nanotechnology have driven both our understanding of cell-substrate interactions, and our ability to influence them. By operating at the size regime of proteins themselves, nanotechnology gives us the opportunity to directly speak the language of cells, through reliable, repeatable creation of nanoscale features. Understanding the synthesis of nanoscale materials, via "top-down" and "bottom-up" strategies, allows researchers to assess the capabilities and limits inherent in both techniques. Urology research as a whole, and bladder regeneration in particular, are well-positioned to benefit from such advances, since our present technology has yet to reach the end goal of functional bladder restoration. In this article, we discuss the current applications of nanoscale materials to bladder tissue engineering, and encourage researchers to explore these interdisciplinary technologies now, or risk playing catch-up in the future.

  2. Early Oscillation Detection for DC/DC Converter Fault Diagnosis

    NASA Technical Reports Server (NTRS)

    Wang, Bright L.

    2011-01-01

    The electrical power system of a spacecraft plays a very critical role in space mission success. Such a modern power system may contain numerous hybrid DC/DC converters, both inside the power system electronics (PSE) units and onboard most of the flight electronics modules. One of the fault conditions for DC/DC converters that poses serious threats to mission safety is the random occurrence of oscillation, related to the inherent instability characteristics of the DC/DC converters and design deficiencies of the power systems. To ensure the highest reliability of the power system, oscillations in any form shall be promptly detected during part-level testing, system integration tests, flight health monitoring, and on-board fault diagnosis. The popular gain/phase margin analysis method is capable of predicting the stability levels of DC/DC converters, but it is limited to verification of designs and to part-level testing on some of the models. This method has to inject noise signals into the control loop circuitry, thus interrupting the DC/DC converter's normal operation and increasing the risk of degrading and damaging the flight unit. A novel technique to detect oscillations at an early stage in flight hybrid DC/DC converters was developed.

  3. A Hybrid Readout Solution for GaN-Based Detectors Using CMOS Technology.

    PubMed

    Padmanabhan, Preethi; Hancock, Bruce; Nikzad, Shouleh; Bell, L Douglas; Kroep, Kees; Charbon, Edoardo

    2018-02-03

    Gallium nitride (GaN) and its alloys are becoming preferred materials for ultraviolet (UV) detectors due to their wide bandgap and tailorable out-of-band cutoff from 3.4 eV to 6.2 eV. GaN-based avalanche photodiodes (APDs) are particularly suitable for their high photon sensitivity and quantum efficiency in the UV region and for their inherent insensitivity to visible wavelengths. Challenges exist, however, for practical utilization. With growing interest in such photodetectors, hybrid readout solutions are becoming prevalent, with CMOS technology being adopted for its maturity, scalability, and reliability. In this paper, we describe our approach to combining GaN APDs with a CMOS readout circuit, comprising a linear array of 1 × 8 capacitive transimpedance amplifiers (CTIAs), implemented in a 0.35 µm high-voltage CMOS technology. Further, we present a simple yet sustainable circuit technique to allow operation of APDs under high reverse biases, up to ≈80 V, with verified measurement results. The readout offers a conversion gain of 0.43 µV/e-, obtaining avalanche gains up to 10³. Several parameters of the CTIA are discussed, followed by a perspective on possible hybridization exploiting the advantages of a 3D-stacked technology.
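
    A back-of-the-envelope check of how the quoted figures combine: with the stated conversion gain of 0.43 µV/e- and an avalanche gain of 10³, an assumed 100 primary photo-electrons would produce roughly a 43 mV output swing. The photon count is the only assumption.

      # Back-of-the-envelope check using the figures quoted in the record
      # (the primary photo-electron count is an assumed example value).
      photons_detected = 100                 # assumed primary photo-electrons
      avalanche_gain = 1e3                   # from the record
      conversion_gain_uV_per_e = 0.43        # CTIA conversion gain, from the record

      output_mV = photons_detected * avalanche_gain * conversion_gain_uV_per_e / 1e3
      print(f"CTIA output swing ~ {output_mV:.0f} mV")   # about 43 mV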

  4. Antiretroviral procurement and supply chain management.

    PubMed

    Ripin, David J; Jamieson, David; Meyers, Amy; Warty, Umesh; Dain, Mary; Khamsi, Cyril

    2014-01-01

    Procurement, the country-level process of ordering antiretrovirals (ARVs), and supply chain management, the mechanism by which they are delivered to health-care facilities, are critical processes required to move ARVs from manufacturers to patients. To provide a glimpse into the ARV procurement and supply chain, the following pages provide an overview of the primary stakeholders, principal operating models, and policies and regulations involved in ARV procurement. Also presented are key challenges that need to be addressed to ensure that the supply chain is not a barrier to the goal of universal coverage. This article will cover the steps necessary to order and distribute ARVs, including different models of delivery, key stakeholders involved, strategic considerations that vary depending on context and policies affecting them. The single drug examples given illustrate the complications inherent in fragmented supply and demand-driven models of procurement and supply chain management, and suggest tools for navigating these hurdles that will ultimately result in more secure and reliable ARV provision. Understanding the dynamics of ARV supply chain is important for the global health community, both to ensure full and efficient treatment of persons living with HIV as well as to inform the supply chain decisions for other public health products.

  5. Design, Development and Pre-Flight Testing of the Communications, Navigation, and Networking Reconfigurable Testbed (Connect) to Investigate Software Defined Radio Architecture on the International Space Station

    NASA Technical Reports Server (NTRS)

    Over, Ann P.; Barrett, Michael J.; Reinhart, Richard C.; Free, James M.; Cikanek, Harry A., III

    2011-01-01

    The Communication Navigation and Networking Reconfigurable Testbed (CoNNeCT) is a NASA-sponsored mission which will investigate the usage of Software Defined Radios (SDRs) as a multi-function communication system for space missions. A software-defined radio system is a communication system in which typical components of the system (e.g., modulators) are implemented in software. The software-defined capability allows flexibility and experimentation with different modulation, coding and other parameters to understand their effects on performance. This flexibility builds inherent redundancy into the system for improved operational efficiency, real-time changes to space missions and enhanced reliability/redundancy. The CoNNeCT Project is a collaboration between industrial radio providers and NASA. The industrial radio providers are providing the SDRs and NASA is designing, building and testing the entire flight system. The flight system will be integrated on the Express Logistics Carrier (ELC) on the International Space Station (ISS) after launch on the H-IIB Transfer Vehicle in 2012. This paper provides an overview of the technology research objectives, payload description, design challenges and pre-flight testing results.

  6. Input-output oriented computation algorithms for the control of large flexible structures

    NASA Technical Reports Server (NTRS)

    Minto, K. D.

    1989-01-01

    An overview is given of work in progress aimed at developing computational algorithms addressing two important aspects in the control of large flexible space structures; namely, the selection and placement of sensors and actuators, and the resulting multivariable control law design problem. The issue of sensor/actuator set selection is particularly crucial to obtaining a satisfactory control design, as clearly a poor choice will inherently limit the degree to which good control can be achieved. With regard to control law design, the researchers are driven by concerns stemming from the practical issues associated with eventual implementation of multivariable control laws, such as reliability, limit protection, multimode operation, sampling rate selection, processor throughput, etc. Naturally, the burden imposed by dealing with these aspects of the problem can be reduced by ensuring that the complexity of the compensator is minimized. Our approach to these problems is based on extensions to input/output oriented techniques that have proven useful in the design of multivariable control systems for aircraft engines. In particular, researchers are exploring the use of relative gain analysis and the condition number as a means of quantifying the process of sensor/actuator selection and placement for shape control of a large space platform.
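
    The two screening measures named above are easy to state concretely. For a square steady-state gain matrix G, the relative gain array is G multiplied element-wise by the transpose of its inverse, and the condition number of G indicates how strongly input errors can be amplified. The matrix in the sketch is hypothetical.

      # Sketch of the two screening measures named in the abstract, applied to a
      # hypothetical 3x3 steady-state gain matrix G (the matrix itself is made up).
      import numpy as np

      def relative_gain_array(G: np.ndarray) -> np.ndarray:
          """RGA = G .* (inv(G))^T ; elements near 1 suggest good input/output pairings."""
          return G * np.linalg.inv(G).T

      G = np.array([[2.0, 0.3, 0.1],
                    [0.4, 1.5, 0.2],
                    [0.1, 0.2, 1.0]])

      print("RGA:\n", np.round(relative_gain_array(G), 3))
      print("condition number of G:", round(np.linalg.cond(G), 2))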

  7. A methodology for achieving high-speed rates for artificial conductance injection in electrically excitable biological cells.

    PubMed

    Butera, R J; Wilson, C G; Delnegro, C A; Smith, J C

    2001-12-01

    We present a novel approach to implementing the dynamic-clamp protocol (Sharp et al., 1993), commonly used in neurophysiology and cardiac electrophysiology experiments. Our approach is based on real-time extensions to the Linux operating system. Conventional PC-based approaches have typically utilized single-cycle computational rates of 10 kHz or slower. In this paper, we demonstrate reliable cycle-to-cycle rates as fast as 50 kHz. Our system, which we call model reference current injection (MRCI; pronounced merci), is also capable of episodic logging of internal state variables and interactive manipulation of model parameters. The limiting factor in achieving high speeds was not processor speed or model complexity, but cycle jitter inherent in the CPU/motherboard performance. We demonstrate these high speeds and this flexibility with two examples: 1) adding action-potential ionic currents to a mammalian neuron under whole-cell patch clamp and 2) altering a cell's intrinsic dynamics via MRCI while simultaneously coupling it via artificial synapses to an internal computational model cell. These higher rates greatly extend the applicability of this technique to the study of fast electrophysiological currents such as fast A-currents and fast excitatory/inhibitory synapses.
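
    The core of any dynamic-clamp cycle is short: read the membrane potential, evaluate the artificial conductance model, and command the resulting current before the next cycle begins. The sketch below is a schematic of that per-cycle computation only, with hardware I/O stubbed out; it is not the MRCI implementation, and the conductance values are assumptions.

      # Schematic of the per-cycle dynamic-clamp computation (not the MRCI code).
      import numpy as np

      def artificial_current(v_m_mV: float, g_nS: float, e_rev_mV: float) -> float:
          """Ohmic conductance injection: I = g * (Vm - Erev), in pA."""
          return g_nS * (v_m_mV - e_rev_mV)

      dt = 1.0 / 50_000                      # 50 kHz cycle rate, as reported
      v_m = -65.0                            # stand-in for an ADC read of Vm (mV)
      for step in range(5):                  # five cycles of the real-time loop
          i_cmd = artificial_current(v_m, g_nS=2.0, e_rev_mV=-80.0)   # artificial K-like conductance
          # a DAC write of i_cmd would happen here; we just log it
          print(f"cycle {step}: Vm={v_m:.1f} mV -> command {i_cmd:.1f} pA")
          v_m += np.random.normal(0, 0.1)    # placeholder for the next measured Vm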

  8. Fuzzy – PI controller to control the velocity parameter of Induction Motor

    NASA Astrophysics Data System (ADS)

    Malathy, R.; Balaji, V.

    2018-04-01

    The major applications of induction motors are in industry, because of their high robustness, reliability, low cost, high efficiency and good self-starting capability. Even though the induction motor has the above-mentioned advantages, it also has some limitations: (1) the standard motor is not a true constant-speed machine; its full-load slip varies by less than 1% (in high-horsepower motors); and (2) it is not inherently capable of providing variable-speed operation. In order to solve these problems, smart motor controls and variable-speed controllers are used. Motor applications involve nonlinear features, which a fuzzy logic controller can handle with high efficiency, acting much like a human operator. This paper presents the modelling of the plant. The fuzzy logic controller (FLC) relies on a set of linguistic if-then rules in a rule-based Mamdani scheme for the closed-loop induction motor model. The motor model is designed and membership functions are chosen according to the parameters of the motor model. Simulation results capture the nonlinearity of the induction motor model. A conventional PI controller is compared in practice with the fuzzy logic controller using Simulink.
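
    To make the comparison concrete, the sketch below pairs a discrete PI speed controller with a simplified fuzzy rule-based controller (triangular memberships on the speed error, weighted-average defuzzification). The rule base, gains, and membership ranges are illustrative assumptions; the paper's Mamdani controller and Simulink motor model are not reproduced here.

      # Illustrative comparison: discrete PI step vs. a simplified fuzzy step.
      import numpy as np

      def pi_step(error, integral, kp=0.8, ki=0.2, dt=0.01):
          integral += error * dt
          return kp * error + ki * integral, integral

      def tri(x, a, b, c):
          """Triangular membership function peaking at b."""
          return max(0.0, min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)))

      def fuzzy_step(error):
          # memberships of the speed error (rad/s) in three linguistic sets
          neg, zero, pos = tri(error, -20, -10, 0), tri(error, -10, 0, 10), tri(error, 0, 10, 20)
          # rules: negative error -> decrease torque, positive error -> increase torque
          outputs = np.array([-1.0, 0.0, 1.0])          # crisp consequents (per-unit torque command)
          weights = np.array([neg, zero, pos])
          # weighted-average defuzzification (a zero-order simplification of Mamdani centroid)
          return float((weights @ outputs) / (weights.sum() + 1e-12))

      integral = 0.0
      for err in [15.0, 8.0, 3.0, -2.0]:
          u_pi, integral = pi_step(err, integral)
          print(f"error={err:+.1f}  PI u={u_pi:+.2f}  fuzzy u={fuzzy_step(err):+.2f}")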

  9. A review of inherent safety characteristics of metal alloy sodium-cooled fast reactor fuel against postulated accidents

    DOE PAGES

    Sofu, Tanju

    2015-04-01

    The thermal, mechanical, and neutronic performance of the metal alloy fast reactor fuel design complements the safety advantages of the liquid metal cooling and the pool-type primary system. Together, these features provide large safety margins in both normal operating modes and for a wide range of postulated accidents. In particular, they maximize the measures of safety associated with inherent reactor response to unprotected, double-fault accidents, and to minimize risk to the public and plant investment. High thermal conductivity and high gap conductance play the most significant role in safety advantages of the metallic fuel, resulting in a flatter radial temperature profile within the pin and much lower normal operation and transient temperatures in comparison to oxide fuel. Despite the big difference in melting point, both oxide and metal fuels have a relatively similar margin to melting during postulated accidents. When the metal fuel cladding fails, it typically occurs below the coolant boiling point and the damaged fuel pins remain coolable. Metal fuel is compatible with sodium coolant, eliminating the potential of energetic fuel--coolant reactions and flow blockages. All these, and the low retained heat leading to a longer grace period for operator action, are significant contributing factors to the inherently benign response of metallic fuel to postulated accidents. This paper summarizes the past analytical and experimental results obtained in past sodium-cooled fast reactor safety programs in the United States, and presents an overview of fuel safety performance as observed in laboratory and in-pile tests.

  10. A review of inherent safety characteristics of metal alloy sodium-cooled fast reactor fuel against postulated accidents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sofu, Tanju

    2015-04-01

    The thermal, mechanical, and neutronic performance of the metal alloy fast reactor fuel design complements the safety advantages of the liquid metal cooling and the pool-type primary system. Together, these features provide large safety margins in both normal operating modes and for a wide range of postulated accidents. In particular, they maximize the measures of safety associated with inherent reactor response to unprotected, double-fault accidents, and to minimize risk to the public and plant investment. High thermal conductivity and high gap conductance play the most significant role in safety advantages of the metallic fuel, resulting in a flatter radial temperature profile within the pin and much lower normal operation and transient temperatures in comparison to oxide fuel. Despite the big difference in melting point, both oxide and metal fuels have a relatively similar margin to melting during postulated accidents. When the metal fuel cladding fails, it typically occurs below the coolant boiling point and the damaged fuel pins remain coolable. Metal fuel is compatible with sodium coolant, eliminating the potential of energetic fuel-coolant reactions and flow blockages. All these, and the low retained heat leading to a longer grace period for operator action, are significant contributing factors to the inherently benign response of metallic fuel to postulated accidents. This paper summarizes the past analytical and experimental results obtained in past sodium-cooled fast reactor safety programs in the United States, and presents an overview of fuel safety performance as observed in laboratory and in-pile tests.

  11. A Meta-Analysis of Factors Affecting Trust in Human-Robot Interaction

    DTIC Science & Technology

    2011-10-01

    directly affects the willingness of people to accept robot-produced information, follow robots' suggestions, and thus benefit from the advantages inherent...perceived complexity of operation). Consequently, if the perceived risk of using the robot exceeds its perceived benefit, practical operators almost...necessary presence of a human caregiver (Graf, Hans, & Schraft, 2004). Other robotic devices, such as wheelchairs (Yanco, 2001) and exoskeletons (e.g

  12. Operation Inherent Resolve

    DTIC Science & Technology

    2015-04-01

    suggestions for reducing this burden, to Washington Headquarters Services, Directorate for Information Operations and Reports, 1215 Jefferson Davis Highway... Service Auditors General to coordinate their oversight and avoid duplication of effort. Section 8L provides a new mandate for the three Lead IG...SUMMARY 7 • Medical Support Service in Iraq (DoS OIG). DoS OIG issued a management assistance report on concerns with oversight of medical support

  13. 16 CFR § 1211.7 - Inherent entrapment protection requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... operator reverses the door a minimum of 2 inches (50.8 mm). (b)(1) A solid object is to be placed on the... the door, a solid object is not required to be located in line with the driving point of the operator. The solid object is to be located at points at the center, and within 1 foot of each end of the door...

  14. A high specific power solar array for low to mid-power spacecraft

    NASA Technical Reports Server (NTRS)

    Jones, P. Alan; White, Stephen F.; Harvey, T. Jeffery; Smith, Brian S.

    1993-01-01

    UltraFlex is the generic term for a solar array system which delivers on-orbit power in the 400 to 6,000 watt per wing sizes with end-of-life specific power performance ranging to 150 watts-per-kilogram. Such performance is accomplished with off-the-shelf solar cells and state-of-the-art materials and processes. Much of the recent work in photovoltaics is centered on advanced solar cell development. Successful as such work has been, no integrated solar array system has emerged which meets NASA's stated goals of 'increasing the end-of-life performance of space solar cells and arrays while minimizing their mass and cost.' This issue is addressed; namely, is there an array design that satisfies the usual requirements for space-rated hardware and that is inherently reliable, inexpensive, easily manufactured and simple, which can be used with both advanced cells currently in development and with inexpensive silicon cells? The answer is yes. The UltraFlex array described incorporates use of a blanket substrate which is thermally compatible with silicon and other materials typical of advanced multi-junction devices. The blanket materials are intrinsically insensitive to atomic oxygen degradation, are space rated, and are compatible with standard cell bonding processes. The deployment mechanism is simple and reliable and the structure is inherently stiff (high natural frequency). Mechanical vibration modes are also readily damped. The basic design is presented as well as supporting analysis and development tests.

  15. A high specific power solar array for low to mid-power spacecraft

    NASA Astrophysics Data System (ADS)

    Jones, P. Alan; White, Stephen F.; Harvey, T. Jeffery; Smith, Brian S.

    1993-05-01

    UltraFlex is the generic term for a solar array system which delivers on-orbit power in the 400 to 6,000 watt per wing sizes with end-of-life specific power performance ranging to 150 watts-per-kilogram. Such performance is accomplished with off-the-shelf solar cells and state-of-the-art materials and processes. Much of the recent work in photovoltaics is centered on advanced solar cell development. Successful as such work has been, no integrated solar array system has emerged which meets NASA's stated goals of 'increasing the end-of-life performance of space solar cells and arrays while minimizing their mass and cost.' This issue is addressed; namely, is there an array design that satisfies the usual requirements for space-rated hardware and that is inherently reliable, inexpensive, easily manufactured and simple, which can be used with both advanced cells currently in development and with inexpensive silicon cells? The answer is yes. The UltraFlex array described incorporates use of a blanket substrate which is thermally compatible with silicon and other materials typical of advanced multi-junction devices. The blanket materials are intrinsically insensitive to atomic oxygen degradation, are space rated, and are compatible with standard cell bonding processes. The deployment mechanism is simple and reliable and the structure is inherently stiff (high natural frequency). Mechanical vibration modes are also readily damped. The basic design is presented as well as supporting analysis and development tests.

  16. 75 FR 71613 - Mandatory Reliability Standards for Interconnection Reliability Operating Limits

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-24

    ... Reliability Standards. The proposed Reliability Standards were designed to prevent instability, uncontrolled... Reliability Standards.\\2\\ The proposed Reliability Standards were designed to prevent instability... the SOLs, which if exceeded, could expose a widespread area of the bulk electric system to instability...

  17. Verification of Ensemble Forecasts for the New York City Operations Support Tool

    NASA Astrophysics Data System (ADS)

    Day, G.; Schaake, J. C.; Thiemann, M.; Draijer, S.; Wang, L.

    2012-12-01

    The New York City water supply system operated by the Department of Environmental Protection (DEP) serves nine million people. It covers 2,000 square miles of portions of the Catskill, Delaware, and Croton watersheds, and it includes nineteen reservoirs and three controlled lakes. DEP is developing an Operations Support Tool (OST) to support its water supply operations and planning activities. OST includes historical and real-time data, a model of the water supply system complete with operating rules, and lake water quality models developed to evaluate alternatives for managing turbidity in the New York City Catskill reservoirs. OST will enable DEP to manage turbidity in its unfiltered system while satisfying its primary objective of meeting the City's water supply needs, in addition to considering secondary objectives of maintaining ecological flows, supporting fishery and recreation releases, and mitigating downstream flood peaks. The current version of OST relies on statistical forecasts of flows in the system based on recent observed flows. To improve short-term decision making, plans are being made to transition to National Weather Service (NWS) ensemble forecasts based on hydrologic models that account for short-term weather forecast skill, longer-term climate information, as well as the hydrologic state of the watersheds and recent observed flows. To ensure that the ensemble forecasts are unbiased and that the ensemble spread reflects the actual uncertainty of the forecasts, a statistical model has been developed to post-process the NWS ensemble forecasts to account for hydrologic model error as well as any inherent bias and uncertainty in initial model states, meteorological data and forecasts. The post-processor is designed to produce adjusted ensemble forecasts that are consistent with the DEP historical flow sequences that were used to develop the system operating rules. A set of historical hindcasts that is representative of the real-time ensemble forecasts is needed to verify that the post-processed forecasts are unbiased, statistically reliable, and preserve the skill inherent in the "raw" NWS ensemble forecasts. A verification procedure and set of metrics will be presented that provide an objective assessment of ensemble forecasts. The procedure will be applied to both raw ensemble hindcasts and to post-processed ensemble hindcasts. The verification metrics will be used to validate proper functioning of the post-processor and to provide a benchmark for comparison of different types of forecasts. For example, current NWS ensemble forecasts are based on climatology, using each historical year to generate a forecast trace. The NWS Hydrologic Ensemble Forecast System (HEFS) under development will utilize output from both the National Oceanic Atmospheric Administration (NOAA) Global Ensemble Forecast System (GEFS) and the Climate Forecast System (CFS). Incorporating short-term meteorological forecasts and longer-term climate forecast information should provide sharper, more accurate forecasts. Hindcasts from HEFS will enable New York City to generate verification results to validate the new forecasts and further fine-tune system operating rules. Project verification results will be presented for different watersheds across a range of seasons, lead times, and flow levels to assess the quality of the current ensemble forecasts.
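
    One standard check of the "statistically reliable" property mentioned above is the rank histogram: for a reliable ensemble, the observation is equally likely to fall in any position among the sorted ensemble members, so the histogram is approximately flat. The sketch below computes it for synthetic hindcasts; it illustrates the metric, not the project's actual verification suite.

      # Rank histogram for synthetic ensemble hindcasts (illustrative data only).
      import numpy as np

      rng = np.random.default_rng(3)
      n_cases, n_members = 1000, 20
      ensembles = rng.gamma(shape=2.0, scale=50.0, size=(n_cases, n_members))   # hindcast flows
      observations = rng.gamma(shape=2.0, scale=50.0, size=n_cases)             # "observed" flows

      # rank of the observation within each sorted ensemble (0..n_members)
      ranks = (ensembles < observations[:, None]).sum(axis=1)
      hist = np.bincount(ranks, minlength=n_members + 1) / n_cases
      print("rank histogram (should be roughly uniform at %.3f):" % (1 / (n_members + 1)))
      print(np.round(hist, 3))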

  18. Techniques used for the screening of hemoglobin levels in blood donors: current insights and future directions.

    PubMed

    Chaudhary, Rajendra; Dubey, Anju; Sonker, Atul

    2017-01-01

    Blood donor hemoglobin (Hb) estimation is an important donation test performed prior to blood donation. It serves the dual purpose of protecting the donors' health against anemia and ensuring good quality of blood components, which has implications for recipients' health. Diverse cutoff criteria have been defined worldwide depending on population characteristics; however, no specific testing methodology or sample requirement has been specified for Hb screening. Besides the technique, there are several physiological and methodological factors that affect the accuracy and reliability of Hb estimation. These include the anatomical source of the blood sample, the posture of the donor, the timing of sampling and several other biological factors. The qualitative copper sulfate gravimetric method is an archaic but time-tested method that is still used in resource-constrained settings. Portable hemoglobinometers are modern quantitative devices that have been further refined with reagent-free cuvettes. More recently, noninvasive spectrophotometry was introduced, mitigating pain to the blood donor and eliminating the risk of infection. Notwithstanding a tremendous evolution in terms of ease of operation, accuracy, mobility, rapidity and cost, a component of inherent variability persists, which may partly be attributed to pre-analytical variables. Hence, blood centers should pay due attention to validation of the test methodology, the competency of operating staff and regular proficiency testing of the outputs. In this article, we review various regulatory guidelines, describe the variables that affect the measurements and compare the validated technologies for Hb screening of blood donors, along with an enumeration of their merits and limitations.

  19. Bacterial taxa abundance pattern in an industrial wastewater treatment system determined by the full rRNA cycle approach.

    PubMed

    Figuerola, Eva L M; Erijman, Leonardo

    2007-07-01

    The description of the diversity and structure of microbial communities through quantification of the constituent populations is one of the major objectives in environmental microbiology. The implications of models for community assembly are practical as well as theoretical, because the extent of biodiversity is thought to influence the function of ecosystems. Current attempts to predict species diversity in different environments derive the numbers of individuals for each operational taxonomic unit (OTU) from the frequency of clones in 16S rDNA gene libraries, which are subject to a number of inherent biases and artefacts. We show that the diversity of the bacterial community present in a complex microbial ensemble can be estimated by fitting the data of the full-cycle rRNA approach to a model of species abundance distribution. Sequences from a 16S rDNA gene library from activated sludge were reliably assigned to OTUs at a genetic distance of 0.04. A group of 17 newly designed rRNA-targeted oligonucleotide probes was used to quantify, by fluorescence in situ hybridization, the OTUs represented by more than three clones in the 16S rDNA clone library. The cell abundance distribution was best described by a geometric series, after the goodness of fit was evaluated by the Kolmogorov-Smirnov test. Although a complete mechanistic understanding of all the ecological processes involved is still not feasible, fitting the distribution pattern of a complex bacterial assemblage to a model can shed light on the way bacterial communities operate.
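
    As an illustration of the fitting step, the sketch below fits a geometric-series abundance model to a set of ranked OTU counts and evaluates the fit with a Kolmogorov-Smirnov-type distance. The counts, the least-squares fitting choice, and the truncation to the observed number of OTUs are assumptions for illustration, not the authors' exact procedure.

      # Minimal sketch: fit a geometric-series abundance model to ranked OTU counts
      # and assess the fit with a Kolmogorov-Smirnov-type distance on the cumulative
      # abundance distribution. Counts are illustrative, not the paper's data.
      import numpy as np
      from scipy.optimize import minimize_scalar

      counts = np.array([420, 250, 130, 80, 45, 30, 18, 12, 8, 5, 2])  # ranked OTU cell counts
      obs_p = counts / counts.sum()
      S = len(counts)
      ranks = np.arange(1, S + 1)

      def geometric_proportions(k):
          """Expected proportions under a geometric series truncated to S OTUs."""
          p = k * (1 - k) ** (ranks - 1)
          return p / p.sum()

      def sse(k):
          return np.sum((geometric_proportions(k) - obs_p) ** 2)

      k_hat = minimize_scalar(sse, bounds=(1e-4, 1 - 1e-4), method="bounded").x
      exp_p = geometric_proportions(k_hat)

      # KS-type statistic: largest distance between observed and expected cumulative proportions.
      D = np.max(np.abs(np.cumsum(obs_p) - np.cumsum(exp_p)))
      print(f"fitted k = {k_hat:.3f}, KS distance D = {D:.3f}")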

  20. Ammonia and ammonium hydroxide sensors for ammonia/water absorption machines: Literature review and data compilation

    NASA Astrophysics Data System (ADS)

    Anheier, N. C., Jr.; McDonald, C. E.; Cuta, J. M.; Cuta, F. M.; Olsen, K. B.

    1995-05-01

    This report describes an evaluation of various sensing techniques for determining the ammonia concentration in the working fluid of ammonia/water absorption cycle systems. The purpose was to determine if any existing sensor technology or instrumentation could provide an accurate, reliable, and cost-effective continuous measure of ammonia concentration in water. The resulting information will be used for design optimization and cycle control in an ammonia-absorption heat pump. Pacific Northwest Laboratory (PNL) researchers evaluated each sensing technology against a set of general requirements characterizing the potential operating conditions within the absorption cycle. The criteria included the physical constraints for in situ operation, sensor characteristics, and sensor application. PNL performed an extensive literature search, which uncovered several promising sensing technologies that might be applicable to this problem. Sixty-two references were investigated, and 33 commercial vendors were identified as having ammonia sensors. The technologies for ammonia sensing are acoustic wave, refractive index, electrode, thermal, ion-selective field-effect transistor (ISFET), electrical conductivity, pH/colorimetric, and optical absorption. Based on information acquired in the literature search, PNL recommends that follow-on activities focus on ISFET devices and a fiber optic evanescent sensor with a colorimetric indicator. The ISFET and fiber optic evanescent sensor are inherently microminiature and capable of in situ measurements. Further, both techniques have been demonstrated to be selective for the ammonium ion (NH4(+)). The primary issue remaining is how to make the sensors sufficiently corrosion-resistant to be useful in practice.

  1. Silicon Carbide Gas Sensors for Propulsion Emissions and Safety Applications

    NASA Technical Reports Server (NTRS)

    Hunter, G. W.; Xu, J.; Neudeck, P. G.; Lukco, D.; Trunek, A.; Spry, D.; Lampard, P.; Androjna, D.; Makel, D.; Ward, B.

    2007-01-01

    Silicon carbide (SiC) based gas sensors have the ability to meet the needs of a range of aerospace propulsion applications including emissions monitoring, leak detection, and hydrazine monitoring. These applications often require sensitive gas detection in a range of environments. An effective sensing approach to meet the needs of these applications is a Schottky diode based on a SiC semiconductor. The primary advantage of using SiC as a semiconductor is its inherent stability and capability to operate at a wide range of temperatures. The complete SiC Schottky diode gas sensing structure includes both the SiC semiconductor and gas sensitive thin film metal layers; reliable operation of the SiC-based gas sensing structure requires good control of the interface between these gas sensitive layers and the SiC. This paper reports on the development of SiC gas sensors. The focus is on two efforts to better control the SiC gas sensitive Schottky diode interface. First, the use of palladium oxide (PdOx) as a barrier layer between the metal and SiC is discussed. Second, the use of atomically flat SiC to provide an improved SiC semiconductor surface for gas sensor element deposition is explored. The use of SiC gas sensors in a multi-parameter detection system is briefly discussed. It is concluded that SiC gas sensors have potential in a range of propulsion system applications, but tailoring of the sensor for each application is necessary.

  2. Chemical Method of Urine Volume Measurement

    NASA Technical Reports Server (NTRS)

    Petrack, P.

    1967-01-01

    A system has been developed and qualified as flight hardware for the measurement of micturition volumes voided by crewmen during Gemini missions. This Chemical Urine Volume Measurement System (CUVMS) is used for obtaining samples of each micturition for post-flight volume determination and laboratory analysis for chemical constituents of physiological interest. The system is versatile with respect to volumes measured, with a capacity beyond the largest micturition expected to be encountered, and with respect to mission duration of inherently indefinite length. The urine sample is used for the measurement of total micturition volume by a tracer dilution technique, in which a fixed, predetermined amount of tritiated water is introduced and mixed into the voided urine, and the resulting concentration of the tracer in the sample is determined with a liquid scintillation spectrometer. The tracer employed does not interfere with the analysis for the chemical constituents of the urine. The CUVMS hardware consists of a four-way selector valve in which an automatically operated tracer metering pump is incorporated, a collection/mixing bag, and tracer storage accumulators. The assembled system interfaces with a urine receiver at the selector valve inlet, sample bags which connect to the side of the selector valve, and a flexible hose which carries the excess urine to the overboard drain connection. Results of testing have demonstrated system volume measurement accuracy within the specification limits of +/-5%, and operating reliability suitable for system use aboard the GT-7 mission, in which it was first used.
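
    The tracer-dilution arithmetic behind the volume measurement is compact enough to state directly; the sketch below uses purely illustrative numbers, not flight data, and assumes the small volume contributed by the tracer is simply subtracted.

      # Minimal sketch of the tracer-dilution volume calculation (illustrative
      # numbers, not flight data): a known amount of tracer is mixed into the voided
      # urine, and total volume follows from the measured tracer concentration.
      def micturition_volume(tracer_amount_uCi, sample_conc_uCi_per_ml, tracer_volume_ml=0.0):
          """Total voided volume in mL; subtracts the small volume added by the tracer."""
          return tracer_amount_uCi / sample_conc_uCi_per_ml - tracer_volume_ml

      # Example: 10 uCi of tritiated water mixed in, sample measured at 0.02 uCi/mL.
      print(f"{micturition_volume(10.0, 0.02, tracer_volume_ml=0.5):.1f} mL")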

  3. Active and passive vibration suppression for space structures

    NASA Technical Reports Server (NTRS)

    Hyland, David C.

    1991-01-01

    The relative benefits of passive and active vibration suppression for large space structures (LSS) are discussed. The intent is to sketch the true ranges of applicability of these approaches using previously published technical results. It was found that the distinction between active and passive vibration suppression approaches is not as sharp as might be thought at first. The relative simplicity, reliability, and cost effectiveness touted for passive measures are vitiated by 'hidden costs' bound up with detailed engineering implementation issues and inherent performance limitations. At the same time, reliability and robustness issues are often cited against active control. It is argued that a continuum of vibration suppression measures offering mutually supporting capabilities is needed. The challenge is to properly orchestrate a spectrum of methods to reap the synergistic benefits of combined advanced materials, passive damping, and active control.

  4. NASA Ares 1 Crew Launch Vehicle Upper Stage Configuration Selection Process

    NASA Technical Reports Server (NTRS)

    Cook, Jerry R.

    2006-01-01

    The Upper Stage Element (USE) of NASA's Ares I Crew Launch Vehicle (CLV) is a "clean-sheet" approach that is being designed and developed in-house, with Element management at MSFC. The USE concept is a self-supporting cylindrical structure, approximately 115 ft long and 216 in. in diameter. While the Reusable Solid Rocket Booster (RSRB) design has changed since the CLV's inception, the Upper Stage Element design has remained essentially a clean-sheet approach. Although a clean-sheet upper stage design inherently carries more risk than a modified design, it does offer many advantages: a design for increased reliability; built-in extensibility to allow for commonality/growth without major redesign; and incorporation of state-of-the-art materials, hardware, and design, fabrication, and test techniques and processes to facilitate a potentially better, more reliable system.

  5. The case against one-shot testing for initial dental licensure.

    PubMed

    Chambers, David W; Dugoni, Arthur A; Paisley, Ian

    2004-03-01

    High-stakes testing is expected to meet standards for cost-effectiveness, fairness, transparency, high reliability, and high validity. It is questionable whether initial licensure examinations in dentistry meet such standards. Decades of piecemeal adjustments in the system have resulted in limited improvement. The essential flaw in the system is reliance on a one-shot sample of a small segment of the skills, understanding, and supporting values needed for today's professional practice of dentistry. The "snapshot" approach to testing produces inherently substandard levels of reliability and validity. A three-step alternative is proposed: boards should (1) define the competencies required of beginning practitioners, (2) establish the psychometric standards needed to make defensible judgments about candidates, and (3) base licensure decisions only on portfolios of evidence that test for defined competencies at established levels of quality.

  6. Time-Tagged Risk/Reliability Assessment Program for Development and Operation of Space System

    NASA Astrophysics Data System (ADS)

    Kubota, Yuki; Takegahara, Haruki; Aoyagi, Junichiro

    We have investigated a new method of risk/reliability assessment for the development and operation of space systems. Evaluating spacecraft risk is difficult because of long operating times, maintenance-free operation, and the difficulty of testing under ground conditions. Conventional methods include FMECA, FTA, and ETA, among others. These are not sufficient to assess chronological anomalies, and sharing information during R&D is problematic. A new method of risk and reliability assessment, T-TRAP (Time-tagged Risk/Reliability Assessment Program), is proposed as a management tool for the development and operation of space systems. T-TRAP, consisting of time-resolved Fault Tree and Criticality Analyses, enables the responsible personnel, upon occurrence of an anomaly in the system, to quickly identify the failure cause and decide on corrective actions. This paper describes the T-TRAP method and its applicability.
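
    The idea of a time-resolved fault tree can be illustrated with a toy example. The sketch below is not the authors' T-TRAP implementation; it assumes exponential component failure laws and a simple AND/OR top event, re-evaluated at successive mission time tags.

      # Illustrative sketch (not the authors' T-TRAP implementation): a time-resolved
      # fault tree in which component failure probabilities grow with mission time and
      # the top-event probability is re-evaluated at each time tag.
      import math

      failure_rates = {"thruster": 2e-6, "valve_A": 5e-6, "valve_B": 5e-6}  # per hour, assumed

      def p_fail(rate, hours):
          return 1.0 - math.exp(-rate * hours)

      def top_event(hours):
          p = {name: p_fail(lam, hours) for name, lam in failure_rates.items()}
          # Top event: thruster fails OR both redundant valves fail (independence assumed).
          p_valves = p["valve_A"] * p["valve_B"]
          return 1.0 - (1.0 - p["thruster"]) * (1.0 - p_valves)

      for t in (1000, 5000, 20000):   # mission time tags in hours
          print(f"t = {t:6d} h  P(top event) = {top_event(t):.3e}")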

  7. Systems Issues In Terrestrial Fiber Optic Link Reliability

    NASA Astrophysics Data System (ADS)

    Spencer, James L.; Lewin, Barry R.; Lee, T. Frank S.

    1990-01-01

    This paper reviews fiber optic system reliability issues from three different viewpoints - availability, operating environment, and evolving technologies. Present availability objectives for interoffice links and for the distribution loop must be re-examined for applications such as the Synchronous Optical Network (SONET), Fiber-to-the-Home (FTTH), and analog services. The hostile operating environments of emerging applications (such as FTTH) must be carefully considered in system design as well as reliability assessments. Finally, evolving technologies might require the development of new reliability testing strategies.

  8. Technical information report: Plasma melter operation, reliability, and maintenance analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendrickson, D.W.

    1995-03-14

    This document provides a technical report on the operability, reliability, and maintenance of a plasma melter for low-level waste vitrification, in support of the Hanford Tank Waste Remediation System (TWRS) Low-Level Waste (LLW) Vitrification Program. A process description is provided for a design that minimizes maintenance and downtime; it includes material and energy balances, equipment sizes and arrangement, startup/operation/maintenance/shutdown cycle descriptions, and the basis for scale-up to a 200 metric ton/day production facility. Operational requirements are provided, including utilities, feeds, labor, and maintenance. Equipment reliability estimates and maintenance requirements are provided, including a list of failure modes, responses, and consequences.

  9. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cetiner, Mustafa Sacit; none,; Flanagan, George F.

    2014-07-30

    An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C+, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.

  10. Supplier selection criteria for sustainable supply chain management in thermal power plant

    NASA Astrophysics Data System (ADS)

    Firoz, Faisal; Narayan Biswal, Jitendra; Satapathy, Suchismita

    2018-02-01

    Supplies are always in great demand in industrial operations. The quality of raw materials and their price, together with sustainability and environmental effects, are a major concern for industrial operators. Supply chain management is the discipline focused on how the supply of different products is carried out, with the aim of optimizing each operation so that the efficiency of the whole chain is improved. This paper deals with the criteria that must be evaluated before selecting a supplier, focusing in particular on thermal power plants. The principal suppliers to a thermal power plant are coal suppliers, and the quality of coal directly determines the efficiency of the whole plant; where coal is concerned, environmental pollution plays a crucial role. The ANP method has been used here to select suppliers for the thermal power sector in the Indian context. After applying ANP to prioritize the sustainable supplier selection criteria, it is found that for thermal power industries the best suppliers are nationalized/state-owned suppliers, imported suppliers rank second, and privately owned suppliers rank last; privately owned suppliers must therefore be more concerned about their performance. To compete in the global market, privatized suppliers have to place more emphasis on the most important criteria, namely sustainability, followed by fuel cost and quality. Some sub-criteria, such as a clean program, environmental issues, quality, reliability, service rate, investment in high technology, green transportation channels, and waste management, still need continuous improvement according to their priority.
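
    The priority-setting step that underlies ANP can be illustrated with a single pairwise-comparison matrix. The sketch below derives a priority vector as the principal eigenvector and checks consistency; the matrix entries and the three criteria are hypothetical, and the full network structure of ANP is not reproduced.

      # Minimal sketch of the pairwise-comparison step that underlies ANP/AHP: derive a
      # priority vector as the principal eigenvector of a hypothetical comparison matrix
      # over three criteria (sustainability, fuel cost, quality) and check consistency.
      import numpy as np

      A = np.array([
          [1.0, 3.0, 5.0],    # sustainability vs. (sustainability, fuel cost, quality)
          [1/3, 1.0, 2.0],
          [1/5, 1/2, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      principal = np.argmax(eigvals.real)
      w = np.abs(eigvecs[:, principal].real)
      weights = w / w.sum()

      # Consistency ratio; Saaty's random index for a 3 x 3 matrix is 0.58.
      CI = (eigvals.real[principal] - 3) / (3 - 1)
      print("weights:", np.round(weights, 3), " CR:", round(CI / 0.58, 3))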

  11. Fiber-Optic Determination of N2, O2, and Fuel Vapor in the Ullage of Liquid-Fuel Tanks

    NASA Technical Reports Server (NTRS)

    Nguyen, Quang-Viet

    2008-01-01

    A fiber-optic sensor system has been developed that can remotely measure the concentration of molecular oxygen (O2), nitrogen (N2), hydrocarbon vapor, and other gases (CO2, CO, H2O, chlorofluorocarbons, etc.) in the ullage of a liquid-fuel tank. The system provides an accurate and quantitative identification of the above gases with an accuracy of better than 1 percent by volume (for O2 or N2) in real-time (5 seconds). In an effort to prevent aircraft fuel tank fires or explosions similar to the tragic TWA Flight 800 explosion in 1996, onboard inert gas generation systems (OBIGGS) are currently being developed for large commercial aircraft to prevent dangerous conditions from forming inside fuel tanks by providing an inerting gas blanket that is low in oxygen, thus preventing the ignition of the fuel/air mixture in the ullage. OBIGGS have been used in military aircraft for many years and are now standard equipment on some newer large commercial aircraft (such as the Boeing 787). Currently, OBIGGS are being developed for retrofitting to existing commercial aircraft fleets in response to pending mandates from the FAA. Most OBIGGS use an air separation module (ASM) that separates O2 from N2 to make nitrogen-enriched air from compressed air flow diverted from the engine (bleed air). Current OBIGGS do not have closed-loop feedback control, in part due to the lack of suitable process sensors that can reliably measure N2 or O2 and, at the same time, do not constitute an inherent source of ignition. Thus, current OBIGGS operate with a high factor of safety dictated by process protocol to ensure adequate fuel-tank inerting. This approach is inherently inefficient as it consumes more engine bleed air than is necessary compared to a closed-loop controlled approach. The reduction of bleed air usage is important as it reduces fuel consumption, which translates to both increased flight range and lower operational costs. Numerous OBIGGS feedback-control sensors have been under development by many research groups and companies. However, the direct measurement of nitrogen (N2) is a challenge to most OBIGGS ullage sensors (such as tunable diode laser absorption) as they cannot measure N2 directly but depend on the measurement of oxygen (O2). The problem with a singular measure of O2 is that as the concentration (number density) of O2 decreases due to the inerting process or due to lower pressures at high altitudes, the precision and accuracy of the O2 measurement decreases. However, measuring O2 density in combination with N2 density (which is more abundant in air and in a N2-inerted fuel tank) can provide a much more accurate and reliable determination of the OBIGGS efficacy.
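
    The advantage of a two-species measurement can be seen in a few lines: a mole fraction formed from the measured O2 and N2 number densities is insensitive to total pressure, whereas an O2-only density reading falls with altitude even at constant composition. The numbers below are notional.

      # Illustrative sketch of why measuring both O2 and N2 number densities helps:
      # the O2 mole fraction formed from the two densities is insensitive to total
      # pressure (altitude), whereas an O2-only density reading falls with pressure
      # even at constant composition. Numbers are notional relative densities.
      def o2_fraction(n_o2, n_n2, n_other=0.0):
          """O2 mole fraction from measured number densities (any consistent units)."""
          return n_o2 / (n_o2 + n_n2 + n_other)

      # The same 10% O2 ullage mixture at sea-level density and at half that density:
      print(o2_fraction(n_o2=0.10, n_n2=0.90))   # 0.10
      print(o2_fraction(n_o2=0.05, n_n2=0.45))   # still 0.10 at half the total pressure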

  12. A Comprehensive Planning Model

    ERIC Educational Resources Information Center

    Temkin, Sanford

    1972-01-01

    Combines elements of the problem solving approach inherent in methods of applied economics and operations research and the structural-functional analysis common in social science modeling to develop an approach for economic planning and resource allocation for schools and other public sector organizations. (Author)

  13. Save It or Spend It?

    ERIC Educational Resources Information Center

    Morrell, Louis R.

    1999-01-01

    Discusses principles for allocation of endowment funds by governing boards, including intergenerational equity, the inherent conflict between an institution's operating budget and its endowment, the importance of achieving financial integrity, and spending policies in volatile markets. Guidelines for board-reviewing policies are offered. (DB)

  14. Method for evaluating the reliability of compressor impeller of turbocharger for vehicle application in plateau area

    NASA Astrophysics Data System (ADS)

    Wang, Zheng; Wang, Zengquan; Wang, A.-na; Zhuang, Li; Wang, Jinwei

    2016-10-01

    As turbocharged diesel engines for vehicle applications are operated in plateau areas, the environmental adaptability of these engines has drawn increasing attention. Existing studies of this environmental adaptability problem focus almost exclusively on optimizing the performance match between the turbocharger and the engine, while the reliability of the turbocharger itself is largely ignored. This work studies the reliability of the turbocharger compressor impeller when vehicle diesel engines operate in plateau areas. First, the relationship between turbocharger rotational speed and altitude is presented, and the potential failure modes of the compressor impeller are analyzed. Then, failure behavior models of the compressor impeller are built, and reliability models for the impeller operating in plateau areas are developed. Finally, the variation of impeller reliability with altitude is studied, and measures for improving the reliability of compressor impellers of turbochargers operating in plateau areas are given. The results indicate that, for a given engine operating speed, the rotational speed of the turbocharger increases with altitude, and the failure risk of the compressor impeller from the failure modes of hub fatigue and blade resonance increases. The reliability of the compressor impeller decreases with increasing altitude, and it also decreases with the number of engine mission profile cycles. The proposed method can be used not only to evaluate the reliability of the compressor impeller when diesel engines operate in plateau areas but also to guide the structural optimization of the compressor impeller.

  15. Reliability evaluation of oil pipelines operating in aggressive environment

    NASA Astrophysics Data System (ADS)

    Magomedov, R. M.; Paizulaev, M. M.; Gebel, E. S.

    2017-08-01

    In connection with modern, more stringent requirements for ecology and safety, the development of a comprehensive set of diagnostic services is obligatory and necessary to ensure the reliable operation of gas transportation infrastructure. The technical condition of oil pipelines should be assessed not only to establish the current values of the equipment's technological parameters in operation, but also to predict the dynamics of changes in the physical and mechanical characteristics of the material, the appearance of defects, etc., so as to ensure reliable and safe operation. In this paper, existing Russian and foreign methods for evaluating oil pipeline reliability are considered, taking into account corrosion, one of the main factors leading to crevice formation in the pipeline material and to changes in the shape of its cross-section. Without compromising the generality of the reasoning, uniform corrosion wear of an initially rectangular cross-section is assumed. As a result, a formula for calculating the probability of failure-free operation is derived. The proposed mathematical model makes it possible to predict emergency situations, as well as to determine optimal operating conditions for oil pipelines.
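
    The paper's formula is not reproduced here; purely as an assumption-labeled illustration of a probability of failure-free operation under uniform corrosion, the sketch below treats the corrosion rate as normally distributed and defines failure as the wall thinning below a minimum allowable thickness.

      # Assumption-labeled illustration (not the authors' formula): probability of
      # failure-free operation when uniform corrosion thins the pipe wall at a normally
      # distributed rate and failure occurs once a minimum allowable thickness is reached.
      from scipy.stats import norm

      t0 = 10.0                  # initial wall thickness, mm (assumed)
      t_min = 6.0                # minimum allowable thickness, mm (assumed)
      mu_v, sd_v = 0.15, 0.04    # corrosion rate mean / std. dev., mm per year (assumed)

      def reliability(years):
          """P(t0 - v * years > t_min) with corrosion rate v ~ Normal(mu_v, sd_v)."""
          allowable_rate = (t0 - t_min) / years
          return norm.cdf((allowable_rate - mu_v) / sd_v)

      for yr in (10, 20, 26, 30):
          print(f"{yr:2d} years: R = {reliability(yr):.3f}")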

  16. Improving the Reliability of Technological Subsystems Equipment for Steam Turbine Unit in Operation

    NASA Astrophysics Data System (ADS)

    Brodov, Yu. M.; Murmansky, B. E.; Aronson, R. T.

    2017-11-01

    The authors present their conception of an integrated approach to improving the reliability of the steam turbine unit (STU), along with examples of its implementation for the various STU technological subsystems. Based on statistical analysis of damage to individual turbine parts and components, on the development and application of modern repair methods and technologies, and on operational monitoring techniques, the critical components and elements of the equipment are identified and priorities are proposed for improving the reliability of STU equipment in operation. Results are presented from an analysis of malfunctions of the equipment of various STU technological subsystems, operating as part of power units and at cross-linked thermal power plants, that result in turbine unit shutdown (failure). Proposals are formulated and justified for adjusting the maintenance and repair of turbine components and parts, condenser unit equipment, the regeneration subsystem, and the oil supply system, which make it possible to increase operational reliability, reduce the cost of STU maintenance and repair, and optimize the timing and scope of repairs.

  17. Consistency of clinical biomechanical measures between three different institutions: implications for multi-center biomechanical and epidemiological research.

    PubMed

    Myer, Gregory D; Wordeman, Samuel C; Sugimoto, Dai; Bates, Nathaniel A; Roewer, Benjamin D; Medina McKeon, Jennifer M; DiCesare, Christopher A; Di Stasi, Stephanie L; Barber Foss, Kim D; Thomas, Staci M; Hewett, Timothy E

    2014-05-01

    Multi-center collaborations provide a powerful alternative to overcome the inherent limitations of single-center investigations. Specifically, multi-center projects can support large-scale prospective, longitudinal studies that investigate relatively uncommon outcomes, such as anterior cruciate ligament injury. This project was conceived to assess within- and between-center reliability of an affordable, clinical nomogram utilizing two-dimensional video methods to screen for risk of knee injury. The authors hypothesized that the two-dimensional screening methods would provide good-to-excellent reliability within and between institutions for assessment of frontal and sagittal plane biomechanics. Nineteen female high school athletes participated. Two-dimensional video kinematics of the lower extremity during a drop vertical jump task were collected on all 19 study participants at each of the three facilities. Within-center and between-center reliability were assessed with intra- and inter-class correlation coefficients. Within-center reliability of the clinical nomogram variables was consistently excellent, but between-center reliability was fair-to-good. The within-center intra-class correlation coefficient for all nomogram variables combined was 0.98, while the combined between-center inter-class correlation coefficient was 0.63. Injury risk screening protocols were reliable within and repeatable between centers. These results demonstrate the feasibility of multi-site biomechanical studies and establish a framework for further dissemination of injury risk screening algorithms. Specifically, multi-center studies may allow for further validation and optimization of two-dimensional video screening tools. Level of evidence: 2b.
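
    For readers unfamiliar with the statistic, the sketch below computes a one-way random-effects intraclass correlation coefficient, ICC(1,1), from synthetic repeated measurements; the data and the choice of ICC form are illustrative and not taken from the study.

      # Minimal sketch of an intraclass correlation coefficient, ICC(1,1), from a
      # one-way random-effects ANOVA: rows are athletes, columns are repeated
      # measurements (e.g., the three centers). Data are synthetic, not the study's.
      import numpy as np

      rng = np.random.default_rng(1)
      true_scores = rng.normal(50, 10, size=19)                  # 19 athletes
      data = true_scores[:, None] + rng.normal(0, 4, (19, 3))    # 3 measurements each

      n, k = data.shape
      grand = data.mean()
      ms_between = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)
      ms_within = np.sum((data - data.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))

      icc_1_1 = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
      print(f"ICC(1,1) = {icc_1_1:.2f}")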

  18. Comparison of ENDF/B-VII.1 and JEFF-3.2 in VVER-1000 operational data calculation

    NASA Astrophysics Data System (ADS)

    Frybort, Jan

    2017-09-01

    Safe operation of a nuclear reactor requires extensive calculational support. Operational data are determined by full-core calculations during the design phase of a fuel loading. The loading pattern and the design of the fuel assemblies are adjusted to meet safety requirements and optimize reactor operation. The nodal diffusion code ANDREA is used for this task in the case of the Czech VVER-1000 reactors. Nuclear data for this diffusion code are prepared regularly with the lattice code HELIOS; these calculations are conducted in 2D at the fuel assembly level. It is also possible to calculate these macroscopic data with the Monte Carlo code Serpent, which can make use of alternative evaluated libraries. All calculations are affected by inherent uncertainties in the nuclear data. It is therefore useful to compare full-core calculations based on two sets of diffusion data obtained from Serpent calculations with the ENDF/B-VII.1 and JEFF-3.2 nuclear data, including the associated decay data and fission yield libraries. The comparison is based directly on the assembly-level macroscopic data and on the resulting operational data. This study illustrates the effect of the evaluated nuclear data library on full-core calculations of a large PWR core; the level of difference that results exclusively from the nuclear data selection helps in understanding the inherent uncertainties of such full-core calculations.

  19. Minimum Control Requirements for Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Boulange, Richard; Jones, Harry; Jones, Harry

    2002-01-01

    Advanced control technologies are not necessary for the safe, reliable and continuous operation of Advanced Life Support (ALS) systems. ALS systems can and are adequately controlled by simple, reliable, low-level methodologies and algorithms. The automation provided by advanced control technologies is claimed to decrease system mass and necessary crew time by reducing buffer size and minimizing crew involvement. In truth, these approaches increase control system complexity without clearly demonstrating an increase in reliability across the ALS system. Unless these systems are as reliable as the hardware they control, there is no savings to be had. A baseline ALS system is presented with the minimal control system required for its continuous safe reliable operation. This baseline control system uses simple algorithms and scheduling methodologies and relies on human intervention only in the event of failure of the redundant backup equipment. This ALS system architecture is designed for reliable operation, with minimal components and minimal control system complexity. The fundamental design precept followed is "If it isn't there, it can't fail".

  20. Reliability Standards of Complex Engineering Systems

    NASA Astrophysics Data System (ADS)

    Galperin, E. M.; Zayko, V. A.; Gorshkalev, P. A.

    2017-11-01

    Production and manufacturing play an important role in modern society. Industrial production is nowadays characterized by increasingly complex interconnections between its parts, so the problem of preventing accidents at a large industrial enterprise becomes especially relevant. In these circumstances, the reliability of enterprise functioning is of particular importance: the potential damage caused by an accident at such an enterprise may lead to substantial material losses and, in some cases, even to the loss of human lives. In terms of their reliability, industrial facilities (objects) are divided into simple and complex. Simple objects are characterized by only two conditions, operable and non-operable, whereas a complex object exists in more than two conditions; the main characteristic here is the stability of its operation. This paper develops a reliability indicator combining set-theory methodology with a state-space method, both of which are widely used to analyze dynamically evolving probabilistic processes, and introduces a set of reliability indicators for complex technical systems.

  1. Integrating High-Reliability Principles to Transform Access and Throughput by Creating a Centralized Operations Center.

    PubMed

    Davenport, Paul B; Carter, Kimberly F; Echternach, Jeffrey M; Tuck, Christopher R

    2018-02-01

    High-reliability organizations (HROs) demonstrate unique and consistent characteristics, including operational sensitivity and control, situational awareness, hyperacute use of technology and data, and actionable process transformation. System complexity and reliance on information-based processes challenge healthcare organizations to replicate HRO processes. This article describes a healthcare organization's 3-year journey to achieve key HRO features to deliver high-quality, patient-centric care via an operations center powered by the principles of high-reliability data and software to impact patient throughput and flow.

  2. A Fundamental Key to Next-Generation Directed-Energy Systems

    DTIC Science & Technology

    2012-01-01

    and be inherently safe to operate. By design, they must minimize or eliminate the risk of hostile attack or collateral damage especially during...bile Construction Battalion (NMCB) 7’s convoy security element are secured following an escort mission from a forward operating base. The Cougar -type...profile, small, lightweight DE systems means: • Less vulnerability to attack • Greater mobility and maneuverability • Simplified logistics with

  3. Naval Classical Thinkers and Operational Art

    DTIC Science & Technology

    2009-01-01

    Principles and Practice of Military Operations on Land, published in 1911, did not attract as much attention as his previous two major works...thinker. He failed, for example, to consider factors such as social and cultural conditions in the rise of sea power; the rise of the English middle...three key ideas: the inherent value of a strategic central or interior position, the principle of concentration, and the close relationship between

  4. Systemic Operational Design: Epistemological Bumpf or the Way Ahead for Operational Design?

    DTIC Science & Technology

    2006-05-25

    facilitating the design of such architectural frames (meta-concepts), they are doomed to be trapped in a simplistic structuralist approach.”1...systems theory and complexity theory . SOD emerged and evolved in response to inherent challenges in the contemporary Israeli security environment...discussed in subsequent chapters. Theory . Theory is critical to this examination of the CEOD approach and SOD because theory underpins and informs

  5. Test-retest reliability of resting-state magnetoencephalography power in sensor and source space.

    PubMed

    Martín-Buro, María Carmen; Garcés, Pilar; Maestú, Fernando

    2016-01-01

    Several studies have reported changes in spontaneous brain rhythms that could be used as clinical biomarkers or in the evaluation of neuropsychological and drug treatments in longitudinal studies using magnetoencephalography (MEG). There is an increasing necessity to use these measures in early diagnosis and pathology progression; however, there is a lack of studies addressing how reliable they are. Here, we provide the first test-retest reliability estimate of MEG power in resting-state at sensor and source space. In this study, we recorded 3 sessions of resting-state MEG activity from 24 healthy subjects with an interval of a week between each session. Power values were estimated at sensor and source space with beamforming for classical frequency bands: delta (2-4 Hz), theta (4-8 Hz), alpha (8-13 Hz), low beta (13-20 Hz), high beta (20-30 Hz), and gamma (30-45 Hz). Then, test-retest reliability was evaluated using the intraclass correlation coefficient (ICC). We also evaluated the relation between source power and the within-subject variability. In general, ICC of theta, alpha, and low beta power was fairly high (ICC > 0.6) while in delta and gamma power was lower. In source space, fronto-posterior alpha, frontal beta, and medial temporal theta showed the most reliable profiles. Signal-to-noise ratio could be partially responsible for reliability as low signal intensity resulted in high within-subject variability, but also the inherent nature of some brain rhythms in resting-state might be driving these reliability patterns. In conclusion, our results described the reliability of MEG power estimates in each frequency band, which could be considered in disease characterization or clinical trials. © 2015 Wiley Periodicals, Inc.
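
    A band-power estimate of the kind whose reliability is being assessed can be sketched with a Welch periodogram; the sampling rate, signal, and band edges below are assumptions for illustration, not the study's MEG pipeline.

      # Minimal sketch of band-power estimation of the kind whose test-retest
      # reliability is quantified above: Welch power spectral density of a synthetic
      # signal, summed over the classical frequency bands.
      import numpy as np
      from scipy.signal import welch

      fs = 600.0                                   # sampling rate in Hz (assumed)
      t = np.arange(0, 60, 1 / fs)
      rng = np.random.default_rng(2)
      signal = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)  # alpha-like 10 Hz + noise

      freqs, psd = welch(signal, fs=fs, nperseg=int(4 * fs))
      df = freqs[1] - freqs[0]

      bands = {"delta": (2, 4), "theta": (4, 8), "alpha": (8, 13),
               "low beta": (13, 20), "high beta": (20, 30), "gamma": (30, 45)}
      for name, (lo, hi) in bands.items():
          band_power = psd[(freqs >= lo) & (freqs < hi)].sum() * df
          print(f"{name:9s}: {band_power:.3f}")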

  6. Reliability of High Power Laser Diode Arrays Operating in Long Pulse Mode

    NASA Technical Reports Server (NTRS)

    Amzajerdian, Farzin; Meadows, Byron L.; Barnes, Bruce W.; Lockard, George E.; Singh, Upendra N.; Kavaya, Michael J.; Baker, Nathaniel R.

    2006-01-01

    Reliability and lifetime of quasi-CW laser diode arrays are greatly influenced by their thermal characteristics. This paper examines the thermal properties of laser diode arrays operating in long pulse duration regime.

  7. Recent GE BWR fuel experience and design evolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, J.E.; Potts, G.A.; Proebstle, R.A.

    1992-01-01

    Reliable fuel operation is essential to safe, reliable, and economic power production by today's commercial nuclear reactors. GE Nuclear Energy is committed to maximizing fuel reliability through the progressive development of improved fuel design features and dedication to providing the maximum quality in the design, fabrication, and operation of GE BWR fuel. Over the last 35 years, GE has designed, fabricated, and placed in operation over 82,000 BWR fuel bundles containing over 5 million fuel rods. This experience includes successful commercial reactor operation of fuel assemblies to greater than 45,000 MWd/MTU bundle average exposure. This paper reports that this extensive experience base has enabled clear identification and characterization of the active failure mechanisms. With this failure mechanism characterization, mitigating actions have been developed and implemented by GE to provide the highest-reliability BWR fuel bundles possible.

  8. Development of SiC Large Tapered Crystal Growth

    NASA Technical Reports Server (NTRS)

    Neudeck, Phil

    2010-01-01

    The majority of the very large potential benefits of wide-band-gap semiconductor power electronics have not been realized, due in large part to the high cost and high defect density of commercial wafers. Despite 20 years of development, the present SiC wafer growth approach has yet to deliver the majority of SiC's inherent performance and cost benefits to power systems. Commercial SiC power devices are significantly de-rated in order to function reliably, owing to the adverse effects of SiC crystal dislocation defects (thousands per square centimeter) in the SiC wafer.

  9. Task-level robot programming: Integral part of evolution from teleoperation to autonomy

    NASA Technical Reports Server (NTRS)

    Reynolds, James C.

    1987-01-01

    An explanation is presented of task-level robot programming and of how it differs from the usual interpretation of task planning for robotics. Most importantly, it is argued that the physical and mathematical basis of task-level robot programming provides inherently greater reliability than efforts to apply better known concepts from artificial intelligence (AI) to autonomous robotics. Finally, an architecture is presented that allows the integration of task-level robot programming within an evolutionary, redundant, and multi-modal framework that spans teleoperation to autonomy.

  10. Eutelsat - Maturity and reliability through high technology and international cooperation in satellite communications

    NASA Astrophysics Data System (ADS)

    Caruso, Andrea

    1987-12-01

    In May 1986, the Eutelsat organization placed a contract for its second generation of communications satellites with a West European consortium. There is a firm order for three spacecraft, with options for five additional units. These Eutelsat II satellites will employ two transponder bandwidths, shaped spot beams for enlarged EIRP coverage, redundant transmit chains, and a large number of alternative antenna configurations for maximum use of 12-GHz band. It is suggested that communications satellites are inherently more flexible than fiber-optic cables.

  11. Nuclear electric propulsion mission engineering study. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Results of a mission engineering analysis of nuclear-thermionic electric propulsion spacecraft for unmanned interplanetary and geocentric missions are summarized. Critical technologies associated with the development of nuclear electric propulsion (NEP) are assessed. Outer planet and comet rendezvous mission analysis, NEP stage design for geocentric and interplanetary missions, NEP system development cost and unit costs, and technology requirements for NEP stage development are studied. The NEP stage design provides both inherent reliability and high payload mass capability. The NEP stage and payload integration was found to be compatible with the space shuttle.

  12. The pedestrian watchmaker: Genetic clocks from engineered oscillators

    PubMed Central

    Cookson, Natalie A.; Tsimring, Lev S.; Hasty, Jeff

    2010-01-01

    The crucial role of time-keeping has required organisms to develop sophisticated regulatory networks to ensure the reliable propagation of periodic behavior. These biological clocks have long been a focus of research; however, a clear understanding of how they maintain oscillations in the face of unpredictable environments and the inherent noise of biological systems remains elusive. Here, we review the current understanding of circadian oscillations using Drosophila melanogaster as a typical example and discuss the utility of an alternative synthetic biology approach to studying these highly intricate systems. PMID:19903483

  13. Mars outpost - System and operations challenges

    NASA Technical Reports Server (NTRS)

    Roberts, Barney; Guerra, Lisa

    1990-01-01

    The paper addresses the challenges inherent in establishing an outpost on the planet Mars. For background purposes, the unique, remote Martian environment and the developmental phases of a settlement in such an environment are discussed. Challenges are identified in terms of surface systems and operations. Due to its importance to habitability, the life support system (LSS) is highlighted with various options identified. Operations for the Mars outpost, earth-based and local, are characterized by a decentralized concept. The challenge of integrating logistics analysis early in system design and operations strategy is also addressed. In order to understand and reduce the system and operations challenges, the application of terrestrial and lunar testbeds is explained.

  14. An approximation formula for a class of Markov reliability models

    NASA Technical Reports Server (NTRS)

    White, A. L.

    1984-01-01

    A way of treating a small but frequently used class of reliability models and of algebraically approximating the system's reliability is shown. The models considered are appropriate for redundant, reconfigurable digital control systems that operate for a short period of time without maintenance; for such systems the method gives a formula in terms of component fault rates, system recovery rates, and system operating time.
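
    A minimal sketch of this class of model is given below, assuming a triplex system with component fault rate lambda, reconfiguration rate delta, and a short unmaintained operating time T; the exact failure probability from the matrix exponential is compared with a common algebraic approximation (near-coincident second fault plus exhaustion of redundancy). The rates and structure are illustrative, not the paper's specific model.

      # Minimal sketch of this model class: a triplex system with component fault rate
      # lam, reconfiguration (recovery) rate delta, and a short unmaintained operating
      # time T. Exact state probabilities come from the matrix exponential of the
      # generator; a common algebraic approximation of system failure is compared.
      import numpy as np
      from scipy.linalg import expm

      lam, delta, T = 1e-4, 3600.0, 10.0   # per hour, per hour, hours (assumed values)

      # States: 0 = three good, 1 = fault present / not yet reconfigured,
      #         2 = two good (reconfigured), 3 = system failure (absorbing).
      Q = np.array([
          [-3 * lam,        3 * lam,            0.0,      0.0],
          [0.0,      -(delta + 2 * lam),      delta,  2 * lam],   # near-coincident 2nd fault fails the system
          [0.0,             0.0,           -2 * lam,  2 * lam],
          [0.0,             0.0,              0.0,      0.0],
      ])

      pT = np.array([1.0, 0.0, 0.0, 0.0]) @ expm(Q * T)
      exact_fail = pT[3]

      # Approximation: near-coincident second fault during recovery + exhaustion of redundancy.
      approx_fail = 3 * lam * T * (2 * lam / delta) + 3 * (lam * T) ** 2
      print(f"exact P(fail) = {exact_fail:.3e}, approximation = {approx_fail:.3e}")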

  15. Study of turboprop systems reliability and maintenance costs

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The overall reliability and maintenance costs (R&MC's) of past and current turboprop systems were examined. Maintenance cost drivers were found to be scheduled overhaul (40%), lack of modularity particularly in the propeller and reduction gearbox, and lack of inherent durability (reliability) of some parts. Comparisons were made between the 501-D13/54H60 turboprop system and the widely used JT8D turbofan. It was found that the total maintenance cost per flight hour of the turboprop was 75% higher than that of the JT8D turbofan. Part of this difference was due to propeller and gearbox costs being higher than those of the fan and reverser, but most of the difference was in the engine core where the older technology turboprop core maintenance costs were nearly 70 percent higher than for the turbofan. The estimated maintenance cost of both the advanced turboprop and advanced turbofan were less than the JT8D. The conclusion was that an advanced turboprop and an advanced turbofan, using similar cores, will have very competitive maintenance costs per flight hour.

  16. A Statistical Simulation Approach to Safe Life Fatigue Analysis of Redundant Metallic Components

    NASA Technical Reports Server (NTRS)

    Matthews, William T.; Neal, Donald M.

    1997-01-01

    This paper introduces a dual active load path fail-safe fatigue design concept analyzed by Monte Carlo simulation. The concept utilizes the inherent fatigue life differences between selected pairs of components for an active dual path system, enhanced by a stress level bias in one component. The design is applied to a baseline design; a safe life fatigue problem studied in an American Helicopter Society (AHS) round robin. The dual active path design is compared with a two-element standby fail-safe system and the baseline design for life at specified reliability levels and weight. The sensitivity of life estimates for both the baseline and fail-safe designs was examined by considering normal and Weibull distribution laws and coefficient of variation levels. Results showed that the biased dual path system lifetimes, for both the first element failure and residual life, were much greater than for standby systems. The sensitivity of the residual life-weight relationship was not excessive at reliability levels up to R = 0.9999 and the weight penalty was small. The sensitivity of life estimates increases dramatically at higher reliability levels.
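
    The Monte Carlo idea can be sketched as follows, under stated assumptions: two active load paths with lognormal fatigue lives, a stress bias that lengthens the median life of one path, and load redistribution after the first failure ignored. Lives at a specified reliability are read off as sample percentiles; distributions and parameters are illustrative, not the AHS round-robin values.

      # Assumption-labeled sketch of the Monte Carlo idea: two active load paths with
      # lognormal fatigue lives, one path biased to a lower stress (longer median life).
      # Load redistribution after the first failure is ignored in this simple sketch.
      import numpy as np

      rng = np.random.default_rng(3)
      n = 200_000

      # Median lives in cycles (illustrative); the bias gives path B a ~40% longer median life.
      life_a = rng.lognormal(mean=np.log(1.0e5), sigma=0.3, size=n)
      life_b = rng.lognormal(mean=np.log(1.4e5), sigma=0.3, size=n)

      first_failure = np.minimum(life_a, life_b)   # system life to first element failure
      residual = np.abs(life_a - life_b)           # survivor's remaining life after that failure

      for R in (0.999, 0.9999):
          q = 100 * (1 - R)                        # percentile corresponding to reliability R
          print(f"R = {R}: first-failure life = {np.percentile(first_failure, q):,.0f} cycles, "
                f"residual life = {np.percentile(residual, q):,.0f} cycles")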

  17. Reasons to Doubt the Reliability of Eyewitness Memory: Commentary on Wixted, Mickes, and Fisher (2018).

    PubMed

    Wade, Kimberley A; Nash, Robert A; Lindsay, D Stephen

    2018-05-01

    Wixted, Mickes, and Fisher (this issue) take issue with the common trope that eyewitness memory is inherently unreliable. They draw on a large body of mock-crime research and a small number of field studies, which indicate that high-confidence eyewitness reports are usually accurate, at least when memory is uncontaminated and suitable interviewing procedures are used. We agree with the thrust of Wixted et al.'s argument and welcome their invitation to confront the mass underselling of eyewitnesses' potential reliability. Nevertheless, we argue that there is a comparable risk of overselling eyewitnesses' reliability. Wixted et al.'s reasoning implies that near-pristine conditions or uncontaminated memories are normative, but there are at least two good reasons to doubt this. First, psychological science does not yet offer a good understanding of how often and when eyewitness interviews might deviate from best practice in ways that compromise the accuracy of witnesses' reports. Second, witnesses may frequently be exposed to preinterview influences that could corrupt reports obtained in best-practice interviews.

  18. Reliability considerations of a fuel cell backup power system for telecom applications

    NASA Astrophysics Data System (ADS)

    Serincan, Mustafa Fazil

    2016-03-01

    A commercial fuel cell backup power unit is tested under real-life operating conditions at a base station of a Turkish telecom operator. The fuel cell system responds successfully to 256 of 260 electric power outages, providing the required power to the base station. The reliability of the fuel cell backup power unit is found to be 98.5% at the system level. In addition, a qualitative reliability analysis at the component level is carried out. Implications of the power management algorithm for reliability are discussed. Moreover, integration of the backup power unit into the base station ecosystem is reviewed in the context of reliability. The impact of inverter design on the stability of the output power is outlined. Significant current harmonics are encountered when a generic inverter is used; however, the ripples are attenuated significantly when a custom-designed inverter is used. Further, fault conditions are considered for real-world case studies such as running out of hydrogen, a malfunction in the system, or an unprecedented operating scheme. Some design guidelines are suggested for hybridization of the backup power unit for uninterrupted operation.
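
    The headline figure follows directly from the demand count; the sketch below reproduces the point estimate and adds an exact (Clopper-Pearson) 95% confidence interval, which is an addition of this note rather than a result reported in the paper.

      # Quick check of the point estimate and its uncertainty: 256 successful responses
      # out of 260 demands, with an exact (Clopper-Pearson) 95% confidence interval.
      from scipy.stats import beta

      k, n, alpha = 256, 260, 0.05
      p_hat = k / n
      lower = beta.ppf(alpha / 2, k, n - k + 1)
      upper = beta.ppf(1 - alpha / 2, k + 1, n - k)
      print(f"demand reliability = {p_hat:.3f}  (95% CI {lower:.3f}-{upper:.3f})")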

  19. Band-Pass Amplifier Without Discrete Reactance Elements

    NASA Technical Reports Server (NTRS)

    Kleinberg, L.

    1984-01-01

    Inherent or "natural" device capacitance exploited. Band-Pass Circuit has input impedance of equivalent circuit at frequencies much greater than operational-amplifier rolloff frequency. Apparent inductance and capacitance arise from combined effects of feedback and reactive component of amplifier gain in frequency range.

  20. 32 CFR 750.23 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... activities. Nonappropriated-fund activities are entities established and operated for the benefit of military... for is deemed to be an “inherently dangerous activity”; (ii) where a nondelegable duty in the employer.... Examples include Navy and Marine Corps Exchanges, officer or enlisted clubs, and recreational services...

  1. Improved Airborne Gravity Results Using New Relative Gravity Sensor Technology

    NASA Astrophysics Data System (ADS)

    Brady, N.

    2013-12-01

    Airborne gravity data has contributed greatly to our knowledge of subsurface geophysics particularly in rugged and otherwise inaccessible areas such as Antarctica. Reliable high quality GPS data has renewed interest in improving the accuracy of airborne gravity systems and recent improvements in the electronic control of the sensor have increased the accuracy and ability of the classic Lacoste and Romberg zero length spring gravity meters to operate in turbulent air conditions. Lacoste and Romberg type gravity meters provide increased sensitivity over other relative gravity meters by utilizing a mass attached to a horizontal beam which is balanced by a 'zero length spring'. This type of dynamic gravity sensor is capable of measuring gravity changes on the order of 0.05 milliGals in laboratory conditions but more commonly 0.7 to 1 milliGal in survey use. The sensor may have errors induced by the electronics used to read the beam position as well as noise induced by unwanted accelerations, commonly turbulence, which moves the beam away from its ideal balance position, otherwise known as the reading line. The sensor relies on a measuring screw controlled by a computer which attempts to bring the beam back to the reading line position. The beam is also heavily damped so that it does not react to most unwanted high frequency accelerations. However this heavily damped system is slow to react, particularly in turns where there are very high Eötvös effects. New sensor technology utilizes magnetic damping of the beam coupled with an active feedback system which acts to effectively keep the beam locked at the reading line position. The feedback system operates over the entire range of the system so there is now no requirement for a measuring screw. The feedback system operates at very high speed so that even large turbulent events have minimal impact on data quality and very little, if any, survey line data is lost because of large beam displacement errors. Airborne testing along with results from ground based van testing and laboratory results have shown that the new sensor provides more consistent gravity data, as measured by repeated line surveys, as well as preserving the inherent sensitivity of the Lacoste and Romberg zero length spring design. The sensor also provides reliability during survey operation as there is no mechanical counter screw. Results will be presented which show the advantages of the new sensor system over the current technology in both data quality and survey productivity. Applications include high resolution geoid mapping, crustal structure investigations and resource mapping of minerals, oil and gas.

  2. Key Reliability Drivers of Liquid Propulsion Engines and A Reliability Model for Sensitivity Analysis

    NASA Technical Reports Server (NTRS)

    Huang, Zhao-Feng; Fint, Jeffry A.; Kuck, Frederick M.

    2005-01-01

    This paper is to address the in-flight reliability of a liquid propulsion engine system for a launch vehicle. We first establish a comprehensive list of system and sub-system reliability drivers for any liquid propulsion engine system. We then build a reliability model to parametrically analyze the impact of some reliability parameters. We present sensitivity analysis results for a selected subset of the key reliability drivers using the model. Reliability drivers identified include: number of engines for the liquid propulsion stage, single engine total reliability, engine operation duration, engine thrust size, reusability, engine de-rating or up-rating, engine-out design (including engine-out switching reliability, catastrophic fraction, preventable failure fraction, unnecessary shutdown fraction), propellant specific hazards, engine start and cutoff transient hazards, engine combustion cycles, vehicle and engine interface and interaction hazards, engine health management system, engine modification, engine ground start hold down with launch commit criteria, engine altitude start (1 in. start), Multiple altitude restart (less than 1 restart), component, subsystem and system design, manufacturing/ground operation support/pre and post flight check outs and inspection, extensiveness of the development program. We present some sensitivity analysis results for the following subset of the drivers: number of engines for the propulsion stage, single engine total reliability, engine operation duration, engine de-rating or up-rating requirements, engine-out design, catastrophic fraction, preventable failure fraction, unnecessary shutdown fraction, and engine health management system implementation (basic redlines and more advanced health management systems).
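
    One of the listed drivers, engine-out design with a catastrophic fraction, lends itself to a simple parametric sketch. The model below is illustrative and not the authors' reliability model: each engine fails with probability q over the burn, a fraction c of failures is catastrophic, and the stage tolerates at most one benign shutdown.

      # Illustrative parametric sketch (not the authors' model): reliability of a stage
      # with n engines and one-engine-out capability. Each engine fails with probability
      # q over the burn; a fraction c of failures is catastrophic (not containable),
      # while a single benign shutdown can be tolerated.
      from math import comb

      def stage_reliability(n, q, c):
          p_benign = q * (1 - c)
          ok = 0.0
          for k in range(2):   # 0 or 1 benign shutdowns tolerated
              # the other n - k engines must succeed outright, which excludes catastrophic failures
              ok += comb(n, k) * (p_benign ** k) * ((1 - q) ** (n - k))
          return ok

      # Example: 5 engines, 0.5% single-engine failure probability, 20% catastrophic fraction.
      print(f"stage reliability = {stage_reliability(5, 0.005, 0.20):.5f}")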

  3. 49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... that have already occurred but were not evident to the operating crew. (b) Components or systems in a... shows decreasing reliability with increasing operating age. An age/time limit may be used to reduce the... maintenance of a component or system to protect the safety and operating capability of the equipment, a number...

  4. 49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... that have already occurred but were not evident to the operating crew. (b) Components or systems in a... shows decreasing reliability with increasing operating age. An age/time limit may be used to reduce the... maintenance of a component or system to protect the safety and operating capability of the equipment, a number...

  5. 49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... that have already occurred but were not evident to the operating crew. (b) Components or systems in a... shows decreasing reliability with increasing operating age. An age/time limit may be used to reduce the... maintenance of a component or system to protect the safety and operating capability of the equipment, a number...

  6. 49 CFR Appendix E to Part 238 - General Principles of Reliability-Based Maintenance Programs

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... that have already occurred but were not evident to the operating crew. (b) Components or systems in a... shows decreasing reliability with increasing operating age. An age/time limit may be used to reduce the... maintenance of a component or system to protect the safety and operating capability of the equipment, a number...

  7. Reliability design and verification for launch-vehicle propulsion systems - Report of an AIAA Workshop, Washington, DC, May 16, 17, 1989

    NASA Astrophysics Data System (ADS)

    Launch vehicle propulsion system reliability considerations during the design and verification processes are discussed. The tools available for predicting and minimizing anomalies or failure modes are described and objectives for validating advanced launch system propulsion reliability are listed. Methods for ensuring vehicle/propulsion system interface reliability are examined and improvements in the propulsion system development process are suggested to improve reliability in launch operations. Also, possible approaches to streamline the specification and procurement process are given. It is suggested that government and industry should define reliability program requirements and manage production and operations activities in a manner that provides control over reliability drivers. Also, it is recommended that sufficient funds should be invested in design, development, test, and evaluation processes to ensure that reliability is not inappropriately subordinated to other management considerations.

  8. The reliability analysis of a separated, dual fail operational redundant strapdown IMU. [inertial measurement unit

    NASA Technical Reports Server (NTRS)

    Motyka, P.

    1983-01-01

    A methodology for quantitatively analyzing the reliability of redundant avionics systems, in general, and the dual, separated Redundant Strapdown Inertial Measurement Unit (RSDIMU), in particular, is presented. The RSDIMU is described and a candidate failure detection and isolation system presented. A Markov reliability model is employed. The operational states of the system are defined and the single-step state transition diagrams discussed. Graphical results, showing the impact of major system parameters on the reliability of the RSDIMU system, are presented and discussed.
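
    A toy version of such a Markov reliability model is sketched below: three aggregated states (fully operational, degraded after a detected and isolated failure, failed), a constant failure rate, and imperfect detection coverage. The rates and coverage are hypothetical placeholders, not the RSDIMU parameters from the paper.

```python
# Toy continuous-time Markov reliability model in the spirit of the analysis above.
# Failure rate and coverage are hypothetical placeholders, not the paper's values.
import numpy as np
from scipy.linalg import expm

LAMBDA = 1e-4     # assumed per-channel failure rate (per hour)
COVERAGE = 0.98   # assumed probability a failure is correctly detected and isolated

# Generator matrix Q for states [full, degraded, failed]; rows sum to zero.
Q = np.array([
    [-2 * LAMBDA,  2 * LAMBDA * COVERAGE,  2 * LAMBDA * (1 - COVERAGE)],
    [0.0,         -LAMBDA,                 LAMBDA],
    [0.0,          0.0,                    0.0],
])

def reliability(t_hours):
    """Probability the system is not in the failed state at time t."""
    p0 = np.array([1.0, 0.0, 0.0])        # start fully operational
    p_t = p0 @ expm(Q * t_hours)          # transient state probabilities
    return 1.0 - p_t[2]

if __name__ == "__main__":
    for t in (10, 100, 1000):
        print(f"R({t} h) = {reliability(t):.6f}")
```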

  9. Magnet reliability in the Fermilab Main Injector and implications for the ILC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tartaglia, M.A.; Blowers, J.; Capista, D.

    2007-08-01

    The International Linear Collider reference design requires over 13000 magnets, of approximately 135 styles, which must operate with very high reliability. The Fermilab Main Injector represents a modern machine with many conventional magnet styles, each of significant quantity, that has now accumulated many hundreds of magnet-years of operation. We review here the performance of the magnets built for this machine, assess their reliability and categorize the failure modes, and discuss implications for reliability of similar magnet styles expected to be used at the ILC.

  10. Transit Reliability Information Program : PATCO-WMATA Propulsion System Reliability/Productivity Analysis

    DOT National Transportation Integrated Search

    1984-10-01

    The Transit Reliability Information Program (TRIP) is a government-initiated program to assist the transit industry in satisfying its need for transit reliability information. TRIP provides this assistance through the operation of a national data ban...

  11. Boundary conditioning of capacitive MEMS devices through fabrication methods and operating environments

    NASA Astrophysics Data System (ADS)

    Muthukumaran, Packirisamy; Stiharu, Ion G.; Bhat, Rama B.

    2003-10-01

    This paper presents and applies the concept of micro-boundary conditioning to the design synthesis of microsystems in order to quantify the influence of inherent limitations of the fabrication process and the operating conditions on both static and dynamic behavior of microsystems. The predicted results on the static and dynamic behavior of a capacitive MEMS device, fabricated through MUMPs process, under the influence of the fabrication limitation and operating environment are presented along with the test results. The comparison between the predicted and experimental results shows a good agreement.

  12. Investigation of the MTC noise estimation with a coupled neutronic/thermal-hydraulic dedicated model - 'Closing the loop'

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demaziere, C.; Larsson, V.

    2012-07-01

    This paper investigates the reliability of different noise estimators aimed at determining the Moderator Temperature Coefficient (MTC) of reactivity in Pressurized Water Reactors. By monitoring the inherent fluctuations in the neutron flux and moderator temperature, an on-line monitoring of the MTC without perturbing reactor operation is possible. In order to get an accurate estimation of the MTC by noise analysis, the point-kinetic component of the neutron noise and the core-averaged moderator temperature noise have to be used. Because of the scarcity of the in-core instrumentation, the determination of these quantities is difficult, and several possibilities thus exist for estimating the MTC by noise analysis. Furthermore, the effect of feedback has to be negligible at the frequency chosen for estimating the MTC in order to get a proper determination of the MTC. By using an integrated neutronic/thermal-hydraulic model specifically developed for estimating the three-dimensional distributions of the fluctuations in neutron flux, moderator properties, and fuel temperature, different approaches for estimating the MTC by noise analysis can be tested individually. It is demonstrated that a reliable MTC estimation can only be provided if the core is equipped with a sufficient number of both neutron detectors and temperature sensors, i.e. if the core contains in-core detectors monitoring both the axial and radial distributions of the fluctuations in neutron flux and moderator temperature. It is further proven that the effect of feedback is negligible for frequencies higher than 0.1 Hz, and thus the MTC noise estimations have to be performed at higher frequencies. (authors)
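
    As a rough illustration of the noise-analysis idea, the sketch below forms an H1-type transfer estimate between a synthetic core-averaged moderator temperature noise signal and a synthetic normalized neutron noise signal, restricted to frequencies above 0.1 Hz. The synthetic signals and the omission of the zero-power reactor transfer function are simplifying assumptions; this is not the estimator evaluated in the paper.

```python
# Simplified frequency-domain coupling estimate between moderator temperature noise and
# normalized neutron noise, evaluated above 0.1 Hz where feedback effects are negligible
# per the paper. Synthetic data and the missing transfer-function scaling are assumptions.
import numpy as np
from scipy.signal import csd, welch

fs = 10.0                               # assumed sampling frequency (Hz)
t = np.arange(0, 2000, 1.0 / fs)
rng = np.random.default_rng(0)

temp_noise = rng.normal(0.0, 0.5, t.size)            # core-averaged dT fluctuations (K)
true_coeff = -0.02                                    # assumed coupling (relative flux per K)
flux_noise = true_coeff * temp_noise + rng.normal(0.0, 0.002, t.size)

f, p_tt = welch(temp_noise, fs=fs, nperseg=1024)                 # APSD of temperature noise
_, p_tf = csd(temp_noise, flux_noise, fs=fs, nperseg=1024)       # CPSD temperature vs flux

band = f > 0.1                                        # use frequencies above 0.1 Hz only
h1 = np.real(p_tf[band] / p_tt[band])                 # H1 transfer estimate in the band
print("estimated coupling:", h1.mean(), "(true:", true_coeff, ")")
```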

  13. Frequency stabilization of diode-laser-pumped solid state lasers

    NASA Technical Reports Server (NTRS)

    Byer, Robert L.

    1988-01-01

    The goal of the NASA Sunlite program is to fly two diode-laser-pumped solid-state lasers on the space shuttle and while doing so to perform a measurement of their frequency stability and temporal coherence. These measurements will be made by combining the outputs of the two lasers on an optical radiation detector and spectrally analyzing the beat note. Diode-laser-pumped solid-state lasers have several characteristics that will make them useful in space borne experiments. First, this laser has high electrical efficiency. Second, it is of a technology that enables scaling to higher powers in the future. Third, the laser can be made extremely reliable, which is crucial for many space based applications. Fourth, they are frequency and amplitude stable and have high temporal coherence. Diode-laser-pumped solid-state lasers are inherently efficient. Recent results have shown 59 percent slope efficiency for a diode-laser-pumped solid-state laser. As for reliability, the laser proposed should be capable of continuous operation. This is possible because the diode lasers can be remote from the solid state gain medium by coupling through optical fibers. Diode lasers are constructed with optical detectors for monitoring their output power built into their mounting case. A computer can actively monitor the output of each diode laser. If it sees any variation in the output power that might indicate a problem, the computer can turn off that diode laser and turn on a backup diode laser. As for stability requirements, it is now generally believed that any laser can be stabilized if the laser has a frequency actuator capable of tuning the laser frequency as far as it is likely to drift in a measurement time.
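
    The monitoring and failover scheme described above can be sketched as a simple watchdog: poll the built-in photodetector of each active pump diode and swap in a backup when the power drifts out of tolerance. The thresholds and the read/enable/disable hooks are hypothetical placeholders.

```python
# Minimal sketch of the pump-diode monitoring/failover logic described above. The
# nominal power, tolerance band, and hardware hooks are hypothetical placeholders.
NOMINAL_POWER_W = 1.0
TOLERANCE = 0.05          # assumed +/-5% acceptable power variation

def monitor_pump_diodes(read_power, disable_diode, enable_diode,
                        active=("diode_A",), backups=("diode_B", "diode_C")):
    """Poll each active pump diode once; swap in a backup if its power is out of band."""
    active, backups = list(active), list(backups)
    for diode in list(active):
        power = read_power(diode)
        if abs(power - NOMINAL_POWER_W) > TOLERANCE * NOMINAL_POWER_W:
            disable_diode(diode)            # take the suspect diode offline
            active.remove(diode)
            if backups:
                spare = backups.pop(0)
                enable_diode(spare)         # bring a spare online to keep pumping the gain medium
                active.append(spare)
    return active, backups

if __name__ == "__main__":
    readings = {"diode_A": 0.90, "diode_B": 1.01}    # simulated photodetector values (W)
    result = monitor_pump_diodes(lambda d: readings.get(d, 1.0),
                                 lambda d: print("disabled", d),
                                 lambda d: print("enabled", d))
    print(result)
```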

  14. System-wide versus component-specific trust using multiple aids.

    PubMed

    Keller, David; Rice, Stephen

    2010-01-01

    Previous research in operator trust toward automated aids has focused primarily on single aids. The current study focuses on how operator trust is affected by the presence of multiple aids. Two competing theories of multiple-trust are presented. A component-specific trust theory predicts that operators will differentially place their trust in automated aids that vary in reliability. A system-wide trust theory predicts that operators will treat multiple imperfect aids as one "system" and merge their trust across aids despite differences in the aids' reliability. A simulated flight task was used to test these theories, whereby operators performed a pursuit tracking task while concurrently monitoring multiple system gauges that were augmented with perfect or imperfect automated aids. The data revealed that a system-wide trust theory best predicted the data; operators merged their trust across both aids, behaving toward a perfectly reliable aid in the same manner as they did towards unreliable aids.

  15. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... operators of the Bulk-Power System, and other interested parties for improvement of the Electric Reliability... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Electric Reliability..., Reliability Standards that provide for an adequate level of reliability of the Bulk-Power System, and (2) Has...

  16. Stability, Nonlinearity and Reliability of Electrostatically Actuated MEMS Devices

    PubMed Central

    Zhang, Wen-Ming; Meng, Guang; Chen, Di

    2007-01-01

    Electrostatic micro-electro-mechanical systems (MEMS) form a special branch with a wide range of applications in sensing and actuating devices in MEMS. This paper provides a detailed survey and analysis of the electrostatic force of importance in MEMS, its physical model, scaling effect, stability, nonlinearity and reliability. It is necessary to understand the effects of electrostatic forces in MEMS so that many phenomena of practical importance, such as pull-in instability and the effects of effective stiffness, dielectric charging, stress gradient, temperature on the pull-in voltage, nonlinear dynamic effects and reliability due to electrostatic forces, can be explained scientifically, and consequently the great potential of MEMS technology can be explored effectively and utilized optimally. A simplified parallel-plate capacitor model is proposed to investigate the resonance response, inherent nonlinearity, stiffness-softening effect and coupled nonlinear effect of typical electrostatically actuated MEMS devices. Many failure modes and mechanisms, as well as various methods and techniques used to analyze and reduce the failures, including materials selection, reasonable design and extending the controllable travel range, are discussed for electrostatically actuated MEMS devices. Numerical simulations and discussions indicate that the effects of instability, nonlinear characteristics and reliability subjected to electrostatic forces cannot be ignored and are in need of further investigation.
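
    For the simplified parallel-plate model mentioned above, the pull-in instability has a closed-form threshold, V_PI = sqrt(8 k g0^3 / (27 eps0 A)). The short worked example below evaluates it for assumed (not paper-specific) values of stiffness, gap, and plate area.

```python
# Worked example for an ideal parallel-plate electrostatic actuator: the classic
# pull-in voltage beyond which the electrostatic force overwhelms the spring restoring
# force. The stiffness, gap, and area values are illustrative, not from the paper.
from math import sqrt

EPS0 = 8.854e-12   # vacuum permittivity (F/m)

def pull_in_voltage(k_spring, gap, area):
    """Pull-in voltage of an ideal parallel-plate actuator (SI units)."""
    return sqrt(8.0 * k_spring * gap**3 / (27.0 * EPS0 * area))

if __name__ == "__main__":
    k = 2.0               # N/m, assumed suspension stiffness
    g0 = 2e-6             # m, assumed initial gap
    A = 100e-6 * 100e-6   # m^2, assumed 100 um x 100 um plate
    print(f"pull-in voltage ~ {pull_in_voltage(k, g0, A):.2f} V")
```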

  17. Synthesizing cognition in neuromorphic electronic systems

    PubMed Central

    Neftci, Emre; Binas, Jonathan; Rutishauser, Ueli; Chicca, Elisabetta; Indiveri, Giacomo; Douglas, Rodney J.

    2013-01-01

    The quest to implement intelligent processing in electronic neuromorphic systems lacks methods for achieving reliable behavioral dynamics on substrates of inherently imprecise and noisy neurons. Here we report a solution to this problem that involves first mapping an unreliable hardware layer of spiking silicon neurons into an abstract computational layer composed of generic reliable subnetworks of model neurons and then composing the target behavioral dynamics as a “soft state machine” running on these reliable subnets. In the first step, the neural networks of the abstract layer are realized on the hardware substrate by mapping the neuron circuit bias voltages to the model parameters. This mapping is obtained by an automatic method in which the electronic circuit biases are calibrated against the model parameters by a series of population activity measurements. The abstract computational layer is formed by configuring neural networks as generic soft winner-take-all subnetworks that provide reliable processing by virtue of their active gain, signal restoration, and multistability. The necessary states and transitions of the desired high-level behavior are then easily embedded in the computational layer by introducing only sparse connections between some neurons of the various subnets. We demonstrate this synthesis method for a neuromorphic sensory agent that performs real-time context-dependent classification of motion patterns observed by a silicon retina. PMID:23878215

  18. CHRR: coordinate hit-and-run with rounding for uniform sampling of constraint-based models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haraldsdóttir, Hulda S.; Cousins, Ben; Thiele, Ines

    In constraint-based metabolic modelling, physical and biochemical constraints define a polyhedral convex set of feasible flux vectors. Uniform sampling of this set provides an unbiased characterization of the metabolic capabilities of a biochemical network. However, reliable uniform sampling of genome-scale biochemical networks is challenging due to their high dimensionality and inherent anisotropy. Here, we present an implementation of a new sampling algorithm, coordinate hit-and-run with rounding (CHRR). This algorithm is based on the provably efficient hit-and-run random walk and crucially uses a preprocessing step to round the anisotropic flux set. CHRR provably converges to a uniform stationary sampling distribution. We apply it to metabolic networks of increasing dimensionality. We show that it converges several times faster than a popular artificial centering hit-and-run algorithm, enabling reliable and tractable sampling of genome-scale biochemical networks.
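
    The core of CHRR, the coordinate hit-and-run walk, is easy to sketch: pick a coordinate direction at random, compute the feasible chord through the current point, and jump to a uniform point on that chord. The sketch below omits the rounding preprocessing and uses a tiny 2-D polytope as a stand-in for a flux space.

```python
# Minimal coordinate hit-and-run sampler over a polytope {x : A x <= b}, the random
# walk at the core of CHRR. The rounding preprocessing step is omitted for brevity.
import numpy as np

def coordinate_hit_and_run(A, b, x0, n_samples, rng=None):
    rng = rng or np.random.default_rng()
    x = np.array(x0, dtype=float)
    samples = []
    for _ in range(n_samples):
        i = rng.integers(x.size)             # pick a random coordinate direction e_i
        # Feasible segment along e_i: A(x + t*e_i) <= b  =>  t * A[:, i] <= b - A x.
        slack = b - A @ x
        coeff = A[:, i]
        with np.errstate(divide="ignore", invalid="ignore"):
            bounds = slack / coeff
        t_max = bounds[coeff > 0].min() if np.any(coeff > 0) else np.inf
        t_min = bounds[coeff < 0].max() if np.any(coeff < 0) else -np.inf
        x[i] += rng.uniform(t_min, t_max)    # uniform step inside the chord
        samples.append(x.copy())
    return np.array(samples)

if __name__ == "__main__":
    # Unit box 0 <= x, y <= 1 written as A x <= b (a toy stand-in for a flux polytope).
    A = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]], dtype=float)
    b = np.array([1, 0, 1, 0], dtype=float)
    pts = coordinate_hit_and_run(A, b, x0=[0.5, 0.5], n_samples=1000)
    print("sample mean (should approach [0.5, 0.5]):", pts.mean(axis=0))
```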

  19. Static test induced loads verification beyond elastic limit

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Harrington, F.

    1996-01-01

    Increasing demands for reliable and least-cost high-performance aerostructures are pressing design analyses, materials, and manufacturing processes to new and narrowly experienced performance and verification technologies. This study assessed the adequacy of current experimental verification of the traditional binding ultimate safety factor which covers rare events in which no statistical design data exist. Because large high-performance structures are inherently very flexible, boundary rotations and deflections under externally applied loads approaching fracture may distort their transmission and unknowingly accept submarginal structures or prematurely fracturing reliable ones. A technique was developed, using measured strains from back-to-back surface mounted gauges, to analyze, define, and monitor induced moments and plane forces through progressive material changes from total-elastic to total-inelastic zones within the structural element cross section. Deviations from specified test loads are identified by the consecutively changing ratios of moment-to-axial load.

  1. Against the proposition: for the diagnosis of viral infections, commercial assays provide more reliable results than do in-house assays.

    PubMed

    James, Vivienne

    2008-01-01

    There are no differences inherent in the design of commercial or in-house assays and their early development is similar. The same principles apply and it is on the same criteria of accuracy, reproducibility and clinical relevance of results that all assays are judged. However, if there is sufficient uptake of a commercial assay, its strengths and any flaws soon become apparent and it will only be the best commercial assays that remain in the market. For the in-house assays it is through comparability studies and external quality assessment (EQA) schemes that the best can be demonstrated, albeit this information is only accessible initially to the EQA provider and the laboratories using the assays. The EQA results described here support my supposition that, for the diagnosis of viral infections, commercial assays do not provide more reliable results than do in-house assays.

  2. Steady-State Somatosensory Evoked Potential for Brain-Computer Interface—Present and Future

    PubMed Central

    Ahn, Sangtae; Kim, Kiwoong; Jun, Sung Chan

    2016-01-01

    Brain-computer interface (BCI) performance has achieved continued improvement over recent decades, and sensorimotor rhythm-based BCIs that use motor function have been popular subjects of investigation. However, it remains problematic to introduce them to the public market because of their low reliability. As an alternative resolution to this issue, visual-based BCIs that use P300 or steady-state visually evoked potentials (SSVEPs) seem promising; however, the inherent visual fatigue that occurs with these BCIs may be unavoidable. For these reasons, steady-state somatosensory evoked potential (SSSEP) BCIs, which are based on tactile selective attention, have gained increasing attention recently. These may reduce the fatigue induced by visual attention and overcome the low reliability of motor activity. In this literature survey, recent findings on SSSEP and its methodological uses in BCI are reviewed. Further, existing limitations of SSSEP BCI and potential future directions for the technique are discussed. PMID:26834611

  3. Shock and vibration effects on performance reliability and mechanical integrity of proton exchange membrane fuel cells: A critical review and discussion

    NASA Astrophysics Data System (ADS)

    Haji Hosseinloo, Ashkan; Ehteshami, Mohsen Mousavi

    2017-10-01

    Performance reliability and mechanical integrity are the main bottlenecks in mass commercialization of PEMFCs for applications with inherent harsh environment such as automotive and aerospace applications. Imparted shock and vibration to the fuel cell in such applications could bring about numerous issues including clamping torque loosening, gas leakage, increased electrical resistance, and structural damage and breakage. Here, we provide a comprehensive review and critique of the literature focusing on the effects of mechanically harsh environment on PEMFCs, and at the end, we suggest two main future directions in FC technology research that need immediate attention: (i) developing a generic and adequately accurate dynamic model of PEMFCs to assess the dynamic response of FC devices, and (ii) designing effective and robust shock and vibration protection systems based on the developed models in (i).

  4. Reliability and utility of citizen science reef monitoring data collected by Reef Check Australia, 2002-2015.

    PubMed

    Done, Terence; Roelfsema, Chris; Harvey, Andrew; Schuller, Laura; Hill, Jocelyn; Schläppy, Marie-Lise; Lea, Alexandra; Bauer-Civiello, Anne; Loder, Jennifer

    2017-04-15

    Reef Check Australia (RCA) has collected data on benthic composition and cover at >70 sites along >1000 km of Australia's Queensland coast from 2002 to 2015. This paper quantifies the accuracy, precision and power of RCA benthic composition data, to guide its application and interpretation. A simulation study established that the inherent accuracy of the Reef Check point sampling protocol is high (<±7% error absolute), in the range of estimates of benthic cover from 1% to 50%. A field study at three reef sites indicated that, despite minor observer- and deployment-related biases, the protocol does reliably document moderate ecological changes in coral communities. The error analyses were then used to guide the interpretation of inter-annual variability and long term trends at three study sites in RCA's major 2002-2015 data series for the Queensland coast. Copyright © 2017 Elsevier Ltd. All rights reserved.
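
    The accuracy argument can be reproduced in miniature with a quick binomial simulation of point-intercept sampling: score a fixed number of random points per transect and look at how far the estimated cover strays from the true cover. The point count, trial count, and error quantile below are assumptions for illustration, not Reef Check's actual protocol parameters.

```python
# Quick simulation in the spirit of the accuracy study described above: how far can a
# point-intercept estimate of benthic cover drift from the true cover? The protocol
# parameters here (160 points per transect, 5000 trials) are assumptions.
import random

def error_quantile(true_cover, n_points=160, n_trials=5000, q=0.95, rng=None):
    rng = rng or random.Random(1)
    errs = []
    for _ in range(n_trials):
        hits = sum(rng.random() < true_cover for _ in range(n_points))
        errs.append(abs(hits / n_points - true_cover))   # absolute cover-estimate error
    errs.sort()
    return errs[int(q * n_trials) - 1]

if __name__ == "__main__":
    for cover in (0.01, 0.10, 0.25, 0.50):
        print(f"true cover {cover:.0%}: 95th-percentile sampling error "
              f"{error_quantile(cover):.1%}")
```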

  5. Standards for Environmental Measurement Using GIS: Toward a Protocol for Protocols.

    PubMed

    Forsyth, Ann; Schmitz, Kathryn H; Oakes, Michael; Zimmerman, Jason; Koepp, Joel

    2006-02-01

    Interdisciplinary research regarding how the built environment influences physical activity has recently increased. Many research projects conducted jointly by public health and environmental design professionals are using geographic information systems (GIS) to objectively measure the built environment. Numerous methodological issues remain, however, and environmental measurements have not been well documented with accepted, common definitions of valid, reliable variables. This paper proposes how to create and document standardized definitions for measures of environmental variables using GIS with the ultimate goal of developing reliable, valid measures. Inherent problems with software and data that hamper environmental measurement can be offset by protocols combining clear conceptual bases with detailed measurement instructions. Examples demonstrate how protocols can more clearly translate concepts into specific measurement. This paper provides a model for developing protocols to allow high quality comparative research on relationships between the environment and physical activity and other outcomes of public health interest.

  6. Transformation of cell-derived microparticles into quantum-dot-labeled nanovectors for antitumor siRNA delivery.

    PubMed

    Chen, Gang; Zhu, Jun-Yi; Zhang, Zhi-Ling; Zhang, Wei; Ren, Jian-Gang; Wu, Min; Hong, Zheng-Yuan; Lv, Cheng; Pang, Dai-Wen; Zhao, Yi-Fang

    2015-01-12

    Cell-derived microparticles (MPs) have been recently recognized as critical intercellular information conveyors. However, further understanding of their biological behavior and potential application has been hampered by the limitations of current labeling techniques. Herein, a universal donor-cell-assisted membrane biotinylation strategy was proposed for labeling MPs by skillfully utilizing the natural membrane phospholipid exchange of their donor cells. This innovative strategy conveniently led to specific, efficient, reproducible, and biocompatible quantum dot (QD) labeling of MPs, thereby reliably conferring valuable traceability on MPs. By further loading with small interfering RNA, QD-labeled MPs that had inherent cell-targeting and biomolecule-conveying ability were successfully employed for combined bioimaging and tumor-targeted therapy. This study provides the first reliable and biofriendly strategy for transforming biogenic MPs into functionalized nanovectors. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. CHRR: coordinate hit-and-run with rounding for uniform sampling of constraint-based models

    DOE PAGES

    Haraldsdóttir, Hulda S.; Cousins, Ben; Thiele, Ines; ...

    2017-01-31

    In constraint-based metabolic modelling, physical and biochemical constraints define a polyhedral convex set of feasible flux vectors. Uniform sampling of this set provides an unbiased characterization of the metabolic capabilities of a biochemical network. However, reliable uniform sampling of genome-scale biochemical networks is challenging due to their high dimensionality and inherent anisotropy. Here, we present an implementation of a new sampling algorithm, coordinate hit-and-run with rounding (CHRR). This algorithm is based on the provably efficient hit-and-run random walk and crucially uses a preprocessing step to round the anisotropic flux set. CHRR provably converges to a uniform stationary sampling distribution. We apply it to metabolic networks of increasing dimensionality. We show that it converges several times faster than a popular artificial centering hit-and-run algorithm, enabling reliable and tractable sampling of genome-scale biochemical networks.

  8. Athermal metal optics made of nickel plated AlSi40

    NASA Astrophysics Data System (ADS)

    Gebhardt, Andreas; Kinast, Jan; Rohloff, Ralf-Rainer; Seifert, Walter; Beier, Matthias; Scheiding, Sebastian; Peschel, Thomas

    2017-11-01

    Metal optics has been an inherent part of space instrumentation for years. Diamond-turned aluminum (Al6061) mirrors are widely used for applications in the mid- and near-infrared (mid-IR and NIR, respectively) spectral range. Aluminum mirrors plated with electroless nickel (NiP) expand the field of application towards multispectral operating instruments down to ultraviolet wavelengths. Due to the significant mismatch in the coefficient of thermal expansion (CTE) between aluminum and NiP, however, this advantage comes at the cost of bimetallic bending. Challenging requirements can be met by using bare beryllium or aluminum-beryllium composites (AlBeMet) as a CTE-tailored substrate material and amorphous NiP as a polishable layer. For health reasons, the use of beryllium complicates the process chain, so the beryllium approach is restricted to specific applications only. Metal optics has proven advantageous in that conventional CNC and ultra-precision fabrication methods can be used to realize complex and light-weighted instrument structures. Moreover, the mirror designs can be effectively optimized for deterministic system assembly and optimization. Limitations in terms of dimensional stability over temperature and time are mainly given by the inherent material properties (figures of merit) of the substrate material in interaction with the polishing layer. To find an optimal compromise, a thermally matched aluminum-silicon alloy (silicon content ≈ 40 wt%) plated with NiP (AlSi40/NiP) was investigated in a joint project of the Max Planck Institute for Astronomy MPIA and the Fraunhofer Institute for Applied Optics and Precision Engineering IOF. The main tasks of the project were the minimization of the bimetallic bending, the development of reliable stabilizing and aging procedures, and the establishment of a proven fabrication method. This paper describes fundamental results regarding the optimization of the athermal material combination. Furthermore, the production chain developed for high-quality freeform mirrors made of AlSi40/NiP is described.

  9. Surface tension confined liquid cryogen cooler

    NASA Technical Reports Server (NTRS)

    Castles, Stephen H. (Inventor); Schein, Michael E. (Inventor)

    1989-01-01

    A cryogenic cooler is provided for use in craft such as launch, orbital, and space vehicles subject to substantial vibration, changes in orientation, and weightlessness. The cooler contains a small pore, large free volume, low density material to restrain a cryogen through surface tension effects during launch and zero-g operations and maintains instrumentation within the temperature range of 10 to 140 K. The cooler operation is completely passive, with no inherent vibration or power requirements.

  10. Linearly Polarized Single-Frequency Oscillations of Laser-Diode-Pumped Microchip Ceramic Nd:YAG Lasers with Forced Ince-Gaussian Mode Operations

    NASA Astrophysics Data System (ADS)

    Otsuka, Kenju; Nemoto, Kana; Kamikariya, Koji; Miyasaka, Yoshihiko; Chu, Shu-Chun

    2007-09-01

    Detailed oscillation spectra and polarization properties have been examined in laser-diode-pumped (LD-pumped) microchip ceramic (i.e., polycrystalline) Nd:YAG lasers and the inherent segregation of lasing patterns into local modes possessing different polarization states was observed. Single-frequency linearly-polarized stable oscillations were realized by forcing the laser to Ince-Gaussian mode operations by adjusting azimuthal cavity symmetry.

  11. Cephalometric study of facial growth in children after combined pushback and pharyngeal flap operations.

    PubMed

    Pearl, R M; Kaplan, E N

    1976-04-01

    Linear and angular cephalometric measurements of children who had had combined palatal pushbacks and superiorly-based pharyngeal flaps do not show later growth retardation of the face. There was an inherent tendency for children with overt clefts of the secondary palate, classic submucous clefts, or occult submucous clefts to demonstrate pre-operatively a narrow SNA and SNB--but the difference between these angles (ANB) was normal.

  12. 23 CFR 635.109 - Standardized changed condition clauses.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... anticipated, customary, or inherent to the construction industry) and the contractor believes that additional... OPERATIONS CONSTRUCTION AND MAINTENANCE Contract Procedures § 635.109 Standardized changed condition clauses... clauses shall be made part of, and incorporated in, each highway construction project approved under 23 U...

  13. Open Education as a "Heterotopia of Desire"

    ERIC Educational Resources Information Center

    Gourlay, Lesley

    2015-01-01

    The movement towards "openness" in education has tended to position itself as inherently democratising, radical, egalitarian and critical of powerful gatekeepers to learning. While "openness" is often positioned as a critique, I will argue that its mainstream discourses--while appearing to oppose large-scale operations of…

  14. Transit Reliability Information Program : Reliability Verification Demonstration Plan for Rapid Rail Vehicles

    DOT National Transportation Integrated Search

    1981-08-01

    The Transit Reliability Information Program (TRIP) is a government-initiated program to assist the transit industry in satisfying its need for transit reliability information. TRIP provides this assistance through the operation of a national Data Ban...

  15. Ten Year Operating Test Results and Post-Test Analysis of a 1/10 Segment Stirling Sodium Heat Pipe, Phase III

    NASA Technical Reports Server (NTRS)

    Rosenfeld, John, H; Minnerly, Kenneth, G; Dyson, Christopher, M.

    2012-01-01

    High-temperature heat pipes are being evaluated for use in energy conversion applications such as fuel cells, gas turbine re-combustors, and Stirling cycle heat sources, and, with the resurgence of space nuclear power, both as reactor heat removal elements and as radiator elements. Long operating life and reliable performance are critical requirements for these applications. Accordingly, long-term materials compatibility is being evaluated through the use of high-temperature life test heat pipes. Thermacore, Inc., has carried out a sodium heat pipe 10-year life test to establish long-term operating reliability. Sodium heat pipes have demonstrated favorable materials compatibility and heat transport characteristics at high operating temperatures in air over long time periods. A representative one-tenth segment Stirling Space Power Converter heat pipe with an Inconel 718 envelope and a stainless steel screen wick has operated for over 87,000 hr (10 yr) at nearly 700 C. These life test results have demonstrated the potential for high-temperature heat pipes to serve as reliable energy conversion system components for power applications that require long operating lifetime with high reliability. Detailed design specifications, operating history, and post-test analysis of the heat pipe and sodium working fluid are described.

  16. Robust Statistical Fusion of Image Labels

    PubMed Central

    Landman, Bennett A.; Asman, Andrew J.; Scoggins, Andrew G.; Bogovic, John A.; Xing, Fangxu; Prince, Jerry L.

    2011-01-01

    Image labeling and parcellation (i.e. assigning structure to a collection of voxels) are critical tasks for the assessment of volumetric and morphometric features in medical imaging data. The process of image labeling is inherently error prone as images are corrupted by noise and artifacts. Even expert interpretations are subject to subjectivity and the precision of the individual raters. Hence, all labels must be considered imperfect with some degree of inherent variability. One may seek multiple independent assessments to both reduce this variability and quantify the degree of uncertainty. Existing techniques have exploited maximum a posteriori statistics to combine data from multiple raters and simultaneously estimate rater reliabilities. Although quite successful, wide-scale application has been hampered by unstable estimation with practical datasets, for example, with label sets with small or thin objects to be labeled or with partial or limited datasets. As well, these approaches have required each rater to generate a complete dataset, which is often impossible given both human foibles and the typical turnover rate of raters in a research or clinical environment. Herein, we propose a robust approach to improve estimation performance with small anatomical structures, allow for missing data, account for repeated label sets, and utilize training/catch trial data. With this approach, numerous raters can label small, overlapping portions of a large dataset, and rater heterogeneity can be robustly controlled while simultaneously estimating a single, reliable label set and characterizing uncertainty. The proposed approach enables many individuals to collaborate in the construction of large datasets for labeling tasks (e.g., human parallel processing) and reduces the otherwise detrimental impact of rater unavailability. PMID:22010145

  17. Study of complete interconnect reliability for a GaAs MMIC power amplifier

    NASA Astrophysics Data System (ADS)

    Lin, Qian; Wu, Haifeng; Chen, Shan-ji; Jia, Guoqing; Jiang, Wei; Chen, Chao

    2018-05-01

    By combining finite element analysis (FEA) and artificial neural network (ANN) techniques, the complete prediction of interconnect reliability for a monolithic microwave integrated circuit (MMIC) power amplifier (PA) under both direct current (DC) and alternating current (AC) operating conditions is achieved effectively in this article. As an example, an MMIC PA is modelled to study the electromigration failure of its interconnects. This is the first study of interconnect reliability for an MMIC PA under DC and AC operating conditions simultaneously. By training on data from the FEA, a high-accuracy ANN model for PA reliability is constructed. The reliability database obtained from the ANN model then provides important guidance for improving the reliability design of the IC.
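
    A hedged sketch of the FEA-plus-ANN surrogate idea: fit a small neural network to (operating condition, stress metric) samples of the kind an FEA run would produce, then query the surrogate cheaply for sensitivity studies. The synthetic data, input variables, and network size are placeholders, not results from the paper.

```python
# Sketch of an ANN surrogate trained on FEA-like samples. The synthetic training data
# (bias voltage, RF drive level, ambient temperature -> a normalized electromigration
# stress metric) and the network size are placeholders, not results from the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical FEA samples: 400 operating points over assumed parameter ranges.
X = rng.uniform([4.0, 0.0, 25.0], [10.0, 1.0, 85.0], size=(400, 3))
y = 0.02 * X[:, 0] ** 2 + 0.5 * X[:, 1] + 0.01 * X[:, 2] + rng.normal(0, 0.02, 400)

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
surrogate.fit(X[:300], y[:300])                       # train on 300 samples
print("held-out R^2:", round(surrogate.score(X[300:], y[300:]), 3))
print("predicted stress at 8 V, 0.7 drive, 60 C:",
      surrogate.predict([[8.0, 0.7, 60.0]])[0])
```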

  18. Modeling Imperfect Generator Behavior in Power System Operation Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krad, Ibrahim

    A key component in power system operations is the use of computer models to quickly study and analyze different operating conditions and futures in an efficient manner. The output of these models is sensitive to the data used in them as well as the assumptions made during their execution. One typical assumption is that generators and load assets perfectly follow operator control signals. While this is a valid simulation assumption, generators may not always accurately follow control signals. This imperfect response of generators could impact cost and reliability metrics. This paper proposes a generator model that captures this imperfect behavior and examines its impact on production costs and reliability metrics using a steady-state power system operations model. Preliminary analysis shows that while costs remain relatively unchanged, there could be significant impacts on reliability metrics.
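
    One simple way to capture imperfect generator response, sketched below, is to limit each unit by a ramp rate and add a small random response error so the delivered output deviates from the dispatch signal. The ramp limit and error magnitude are illustrative assumptions, not the model used in the paper.

```python
# Tiny sketch of an imperfect-response generator: instead of jumping to the dispatch
# setpoint, the unit is ramp-rate limited and adds a small response error. Values are
# illustrative assumptions only.
import random

def follow_setpoints(setpoints_mw, ramp_limit_mw=5.0, error_std_mw=1.0, seed=42):
    rng = random.Random(seed)
    output = setpoints_mw[0]
    delivered = []
    for target in setpoints_mw:
        move = max(-ramp_limit_mw, min(ramp_limit_mw, target - output))  # ramp-rate limit
        output += move + rng.gauss(0.0, error_std_mw)                    # imperfect response
        delivered.append(output)
    return delivered

if __name__ == "__main__":
    dispatch = [100, 100, 120, 120, 90, 90]          # operator control signal (MW)
    for sp, out in zip(dispatch, follow_setpoints(dispatch)):
        print(f"setpoint {sp:5.1f} MW -> delivered {out:6.1f} MW")
```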

  19. Effects of extended lay-off periods on performance and operator trust under adaptable automation.

    PubMed

    Chavaillaz, Alain; Wastell, David; Sauer, Jürgen

    2016-03-01

    Little is known about the long-term effects of system reliability when operators do not use a system during an extended lay-off period. To examine threats to skill maintenance, 28 participants operated twice a simulation of a complex process control system for 2.5 h, with an 8-month retention interval between sessions. Operators were provided with an adaptable support system, which operated at one of the following reliability levels: 60%, 80% or 100%. Results showed that performance, workload, and trust remained stable at the second testing session, but operators lost self-confidence in their system management abilities. Finally, the effects of system reliability observed at the first testing session were largely found again at the second session. The findings overall suggest that adaptable automation may be a promising means to support operators in maintaining their performance at the second testing session. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  20. Development of an Extreme High Temperature n-type Ohmic Contact to Silicon Carbide

    NASA Technical Reports Server (NTRS)

    Evans, Laura J.; Okojie, Robert S.; Lukco, Dorothy

    2011-01-01

    We report on the initial demonstration of a tungsten-nickel (75:25 at. %) ohmic contact to silicon carbide (SiC) that performed for up to fifteen hours of heat treatment in argon at 1000 C. The transfer length method (TLM) test structure was used to evaluate the contacts. Samples showed consistent ohmic behavior, with specific contact resistance values averaging 5 x 10-4 ohm-cm2. The development of this contact metallization should allow silicon carbide devices to operate more reliably at the present maximum operating temperature of 600 C while potentially extending operations to 1000 C. Silicon carbide is widely recognized as one of the materials of choice for high-temperature, harsh-environment sensors and electronics due to its ability to survive and continue normal operation in such environments [1]. Sensors and electronics in SiC have been developed that are capable of operating at temperatures of 600 C. However, operating these devices at the upper reliability temperature threshold increases the potential for early degradation. Therefore, it is important to raise the reliability temperature ceiling higher, which would assure increased device reliability when operated at nominal temperature. There are also instances that require devices to operate and survive for prolonged periods of time above 600 C [2, 3]. This is specifically needed in the area of hypersonic flight, where robust sensors are needed to monitor vehicle performance at temperatures greater than 1000 C, as well as for use in the thermomechanical characterization of high-temperature materials (e.g., ceramic matrix composites). While SiC alone can withstand these temperatures, a major challenge is to develop reliable electrical contacts to the device itself in order to facilitate signal extraction.

  1. Risk Management for Sodium Fast Reactors.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denman, Matthew R.; Groth, Katrina; Cardoni, Jeffrey N.

    2015-01-01

    Accident management is an important component of maintaining risk at acceptable levels for all complex systems, such as nuclear power plants. With the introduction of self-correcting, or inherently safe, reactor designs, the focus has shifted from management by operators to allowing the system's design to manage the accident. While inherently and passively safe designs are laudable, extreme boundary conditions can interfere with the design attributes which facilitate inherent safety, thus resulting in unanticipated and undesirable end states. This report examines an inherently safe and small sodium fast reactor experiencing a beyond-design-basis seismic event with the intent of exploring two issues: (1) can human intervention either improve or worsen the potential end states, and (2) can a Bayesian Network be constructed to infer the state of the reactor to inform (1). ACKNOWLEDGEMENTS: The authors would like to acknowledge the U.S. Department of Energy's Office of Nuclear Energy for funding this research through Work Package SR-14SN100303 under the Advanced Reactor Concepts program. The authors also acknowledge the PRA teams at Argonne National Laboratory, Oak Ridge National Laboratory, and Idaho National Laboratory for their continued contributions to the advanced reactor PRA mission area.

  2. Overview of RICOR's reliability theoretical analysis, accelerated life demonstration test results and verification by field data

    NASA Astrophysics Data System (ADS)

    Vainshtein, Igor; Baruch, Shlomi; Regev, Itai; Segal, Victor; Filis, Avishai; Riabzev, Sergey

    2018-05-01

    The growing demand for EO applications that work around the clock, 24 hours a day and 7 days a week, such as border surveillance systems, emphasizes the need for a highly reliable cryocooler with increased operational availability and an optimized Integrated Logistic Support (ILS) concept. In order to meet this need, RICOR developed linear and rotary cryocoolers which successfully achieved this goal. Cryocooler MTTF was analyzed by theoretical reliability evaluation methods, demonstrated by normal and accelerated life tests at the cryocooler level, and finally verified by field data analysis derived from cryocoolers operating at the system level. The following paper reviews theoretical reliability analysis methods together with reliability test results derived from standard and accelerated life demonstration tests performed at RICOR's advanced reliability laboratory. As a summary of the work process, reliability verification data will be presented as feedback from fielded systems.

  3. Inherent Contrast in Magnetic Resonance Imaging and the Potential for Contrast Enhancement

    PubMed Central

    Brasch, Robert C.

    1985-01-01

    Magnetic resonance (MR) imaging is emerging as a powerful new diagnostic tool valued for its apparent lack of adverse effects. The excellent inherent contrast between biologic tissues and fluids afforded by MR imaging is one of the foremost characteristics of this technique and depends on physicochemical properties such as hydrogen density and T1 and T2 relaxation rates, on magnetic field strength and on operator-chosen factors for acquiring the MR imaging signal. Pharmaceutical contrast-enhancing agents shorten the MR imaging process and improve sensitivity and diagnostic accuracy. PMID:2992172

  4. Surface Acoustic Wave Monitor for Deposition and Analysis of Ultra-Thin Films

    NASA Technical Reports Server (NTRS)

    Hines, Jacqueline H. (Inventor)

    2015-01-01

    A surface acoustic wave (SAW) based thin film deposition monitor device and system for monitoring the deposition of ultra-thin films and nanomaterials and the analysis thereof is characterized by acoustic wave device embodiments that include differential delay line device designs, and which can optionally have integral reference devices fabricated on the same substrate as the sensing device, or on a separate device in thermal contact with the film monitoring/analysis device, in order to provide inherently temperature compensated measurements. These deposition monitor and analysis devices can include inherent temperature compensation, higher sensitivity to surface interactions than quartz crystal microbalance (QCM) devices, and the ability to operate at extreme temperatures.

  5. Parent Praise to 1-3 Year-Olds Predicts Children’s Motivational Frameworks 5 Years Later

    PubMed Central

    Gunderson, Elizabeth A.; Gripshover, Sarah J.; Romero, Carissa; Dweck, Carol S.; Goldin-Meadow, Susan; Levine, Susan C.

    2013-01-01

    In laboratory studies, praising children’s effort encourages them to adopt incremental motivational frameworks—they believe ability is malleable, attribute success to hard work, enjoy challenges, and generate strategies for improvement. In contrast, praising children’s inherent abilities encourages them to adopt fixed-ability frameworks. Does the praise parents spontaneously give children at home show the same effects? Although parents’ early praise of inherent characteristics was not associated with children’s later fixed-ability frameworks, parents’ praise of children’s effort at 14-38 months (N=53) did predict incremental frameworks at 7-8 years, suggesting that causal mechanisms identified in experimental work may be operating in home environments. PMID:23397904

  6. Budgeting for Efficiency and Effectiveness

    ERIC Educational Resources Information Center

    Pereus, Steven C.

    2012-01-01

    For most districts, budgeting has become a cost-cutting exercise designed to close the gap between revenues and expenses. During this process, decision makers inherently assume that existing operations are efficient and effective--an assumption that is rarely validated by facts. Cutting programs and services balances budgets but does not…

  7. Deep Space Telecommunications

    NASA Technical Reports Server (NTRS)

    Kuiper, T. B. H.; Resch, G. M.

    2000-01-01

    The increasing load on NASA's Deep Space Network, the new capabilities for deep space missions inherent in a next-generation radio telescope, and the potential of new telescope technology for reducing construction and operation costs suggest a natural marriage between radio astronomy and deep space telecommunications in developing advanced radio telescope concepts.

  8. Mitigation of multipacting, enhanced by gas condensation on the high power input coupler of a superconducting RF module, by comprehensive warm aging

    NASA Astrophysics Data System (ADS)

    Wang, Chaoen; Chang, Lung-Hai; Chang, Mei-Hsia; Chen, Ling-Jhen; Chung, Fu-Tsai; Lin, Ming-Chyuan; Liu, Zong-Kai; Lo, Chih-Hung; Tsai, Chi-Lin; Yeh, Meng-Shu; Yu, Tsung-Chi

    2017-11-01

    Excitation of multipacting, enhanced by gas condensation on cold surfaces of the high power input coupler in an SRF module, poses the greatest challenge for reliable SRF operation under high average RF power. This could prevent the light source SRF module from being operated with the desired high beam current. Off-line long-term reliability tests have been conducted for the newly constructed 500-MHz SRF KEKB-type modules at an accelerating RF voltage of 1.6 MV to enable prediction of their operational reliability in the 3-GeV Taiwan Photon Source (TPS), since prediction from production performance alone by conventional horizontal tests is presently unreliable. As expected, operational difficulties resulting from multipacting, enhanced by gas condensation, have been identified in the course of the long-term reliability test. Our present hypothesis is that gas condensation can be slowed down by keeping the vacuum pressure at the power coupler close to that reached just after its cool-down to liquid helium temperatures. This is achievable by reducing the power coupler out-gassing rate through comprehensive warm aging. Its feasibility and effectiveness have been experimentally verified in a second long-term reliability test. Our success opens the possibility of operating the SRF module free of multipacting trouble and points to a new direction for improving the operational performance of next-generation SRF modules in light sources with high beam currents.

  9. Reliable multicast protocol specifications protocol operations

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd; Whetten, Brian

    1995-01-01

    This appendix contains the complete state tables for Reliable Multicast Protocol (RMP) Normal Operation, Multi-RPC Extensions, Membership Change Extensions, and Reformation Extensions. First the event types are presented. Afterwards, each RMP operation state, normal and extended, is presented individually and its events shown. Events in the RMP specification are one of several things: (1) arriving packets, (2) expired alarms, (3) user events, (4) exceptional conditions.
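
    The event-driven structure of such state tables can be illustrated with a small dispatch skeleton: an event enumeration matching the four categories above and a lookup table mapping (state, event) pairs to next states. The state names and transitions below are hypothetical placeholders, not the actual RMP tables.

```python
# Illustrative skeleton of a state-table dispatch for the event categories listed in
# the appendix (arriving packets, expired alarms, user events, exceptional conditions).
# The states and transitions here are hypothetical placeholders, not the RMP tables.
from enum import Enum, auto

class Event(Enum):
    PACKET_ARRIVED = auto()
    ALARM_EXPIRED = auto()
    USER_EVENT = auto()
    EXCEPTIONAL_CONDITION = auto()

class State(Enum):
    NORMAL_OPERATION = auto()
    MEMBERSHIP_CHANGE = auto()
    REFORMATION = auto()

def handle(state, event):
    """Return the next state for (state, event); unlisted pairs keep the current state."""
    table = {
        (State.NORMAL_OPERATION, Event.EXCEPTIONAL_CONDITION): State.REFORMATION,
        (State.NORMAL_OPERATION, Event.USER_EVENT): State.MEMBERSHIP_CHANGE,
        (State.MEMBERSHIP_CHANGE, Event.PACKET_ARRIVED): State.NORMAL_OPERATION,
        (State.REFORMATION, Event.ALARM_EXPIRED): State.NORMAL_OPERATION,
    }
    return table.get((state, event), state)

if __name__ == "__main__":
    s = State.NORMAL_OPERATION
    for e in (Event.PACKET_ARRIVED, Event.EXCEPTIONAL_CONDITION, Event.ALARM_EXPIRED):
        s = handle(s, e)
        print(e.name, "->", s.name)
```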

  10. Medical Logistics Lessons Observed During Operations Enduring Freedom and Iraqi Freedom.

    PubMed

    Dole, Mark J; Kissane, Jonathan M

    2016-01-01

    Medical Logistics (MEDLOG) is a function of the Army's integrated System for Health that provides the medical products and specialized logistics services required to deliver health protection and care under all operational conditions. In unified land operations, MEDLOG is an inherent function of Health Service Support (HSS), which also includes casualty care and medical evacuation. This paper focuses on a few key lessons observed during Operations Enduring Freedom and Iraqi Freedom with direct implications for the support of HSS in future operations as envisioned in the Army Operating Concept and the Joint Concept for Health Services. It also examines a few key enablers that helped mitigate these challenges that are not yet fully acknowledged in Army Medical Department doctrine, policy, and planning.

  11. Independent Space Operators: Gaining a Voice in Design for Operability

    NASA Technical Reports Server (NTRS)

    McCleskey, Carey M.; Claybaugh, William R., II

    2006-01-01

    Affordable and sustainable space exploration remains an elusive goal. We explore the competitive advantages of evolving towards independent operators for space transportation in our economy. We consider the pros and cons of evolving business organizations that operate and maintain space transportation system assets independently from flight system manufacturers and from host spaceports. The case is made that a more competitive business climate for creating inherently operable, dependable, and supportable space transportation systems can evolve out of today's traditional vertical business model, a model within which the voice of the operator is often heard, but rarely acted upon, during crucial design commitments and critical design processes. Thus new business models may be required, driven less by hardware consumption and more by space system utilization.

  12. 40 CFR 75.42 - Reliability criteria.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Reliability criteria. 75.42 Section 75...) CONTINUOUS EMISSION MONITORING Alternative Monitoring Systems § 75.42 Reliability criteria. To demonstrate reliability equal to or better than the continuous emission monitoring system, the owner or operator shall...

  13. Control of a 30 cm diameter mercury bombardment thruster

    NASA Technical Reports Server (NTRS)

    Terdan, F. F.; Bechtel, R. T.

    1973-01-01

    Increased thruster performance has made closed-loop automatic control more difficult than previously. Specifically, high-perveance optics tend to make reliable recycling more difficult. Control logic functions were established for three automatic modes of operation of a 30-cm thruster using a power conditioner console with flight-like characteristics. The three modes provide (1) automatic startup to reach thermal stability, (2) steady-state closed-loop control, and (3) reliable recycling of the high voltages following an arc breakdown to reestablish normal operation. Power supply impedance characteristics necessary for stable operation and the effect of the magnetic baffle on reliable recycling were studied.

  14. A computational framework for modeling targets as complex adaptive systems

    NASA Astrophysics Data System (ADS)

    Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh

    2017-05-01

    Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and more importantly various emergent behaviors, displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, it provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets with a focus on methodologies for quantifying NCO performance metrics.

  15. Ammonia and ammonium hydroxide sensors for ammonia/water absorption machines: Literature review and data compilation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anheier, N.C. Jr.; McDonald, C.E.; Cuta, J.M.

    1995-05-01

    This report describes an evaluation of various sensing techniques for determining the ammonia concentration in the working fluid of ammonia/water absorption cycle systems. The purpose of this work was to determine if any existing sensor technology or instrumentation could provide an accurate, reliable, and cost-effective continuous measure of ammonia concentration in water. The resulting information will be used for design optimization and cycle control in an ammonia-absorption heat pump. PNL researchers evaluated each sensing technology against a set of general requirements characterizing the potential operating conditions within the absorption cycle. The criteria included the physical constraints for in situ operation, sensor characteristics, and sensor application. PNL performed an extensive literature search, which uncovered several promising sensing technologies that might be applicable to this problem. Sixty-two references were investigated, and 33 commercial vendors were identified as having ammonia sensors. The technologies for ammonia sensing are acoustic wave, refractive index, electrode, thermal, ion-selective field-effect transistor (ISFET), electrical conductivity, pH/colorimetric, and optical absorption. Based on information acquired in the literature search, PNL recommends that follow-on activities focus on ISFET devices and a fiber optic evanescent sensor with a colorimetric indicator. The ISFET and fiber optic evanescent sensor are inherently microminiature and capable of in situ measurements. Further, both techniques have been demonstrated to be selective to the ammonium ion (NH{sub 4}{sup +}). The primary issue remaining is how to make the sensors sufficiently corrosion-resistant to be useful in practice.

  16. Ignition and Inertial Confinement Fusion at The National Ignition Facility

    NASA Astrophysics Data System (ADS)

    Moses, Edward I.

    2016-10-01

    The National Ignition Facility (NIF), the world's largest and most powerful laser system for inertial confinement fusion (ICF) and for studying high-energy-density (HED) science, is now operational at Lawrence Livermore National Laboratory (LLNL). The NIF is now conducting experiments to commission the laser drive, the hohlraum and the capsule and to develop the infrastructure needed to begin the first ignition experiments in FY 2010. Demonstration of ignition and thermonuclear burn in the laboratory is a major NIF goal. NIF will achieve this by concentrating the energy from the 192 beams into a mm³-sized target and igniting a deuterium-tritium mix, liberating more energy than is required to initiate the fusion reaction. NIF's ignition program is a national effort managed via the National Ignition Campaign (NIC). The NIC has two major goals: execution of DT ignition experiments starting in FY2010, with the goal of demonstrating ignition and a reliable, repeatable ignition platform by the conclusion of the NIC at the end of FY2012. The NIC will also develop the infrastructure and the processes required to operate NIF as a national user facility. The achievement of ignition at NIF will demonstrate the scientific feasibility of ICF and focus worldwide attention on laser fusion as a viable energy option. A laser fusion-based energy concept that builds on NIF, known as LIFE (Laser Inertial Fusion Energy), is currently under development. LIFE is inherently safe and can provide a global carbon-free energy generation solution in the 21st century. This paper describes recent progress on NIF, NIC, and the LIFE concept.

  17. Identification of Rotorcraft Structural Dynamics from Flight and Wind Tunnel Data

    NASA Technical Reports Server (NTRS)

    McKillip, Robert M., Jr.

    1997-01-01

    Excessive vibration remains one of the most difficult problems facing the helicopter industry today, affecting all production helicopters at some phase of their development. Vibrations in rotating structures may arise from external periodic dynamic airloads whose frequencies are close to the natural frequencies of the rotating system itself. The goal for the structures engineer would thus be to design a structure as free from resonance effects as possible. In the case of a helicopter rotor blade these dynamic loads are a consequence of the asymmetric airload distribution on the rotor blade in forward flight, leading to a rich collection of higher harmonic airloads that force rotor and airframe response. Accurate prediction of the dynamic characteristics of a helicopter rotor blade will provide the opportunity to favorably affect noise intensity, vibration level, durability, reliability and operating costs by reducing objectionable frequencies or moving them to a different frequency range, thus providing a lower-vibration rotor. In fact, the dynamic characteristics tend to define the operating limits of a rotorcraft. As computing power has increased greatly over the last decade, researchers and engineers have turned to analyzing the vibrational characteristics of aerospace structures at the design and development stage of the production of an aircraft. Modern rotor blade construction methods lead to products with low mass and low inherent damping, so careful design and analysis are required to avoid resonance and undesirable dynamic performance. In addition, accurate modal analysis is necessary for several current approaches in elastic system identification and active control.

  18. Global optimization algorithms to compute thermodynamic equilibria in large complex systems with performance considerations

    DOE PAGES

    Piro, M. H. A.; Simunovic, S.

    2016-03-17

    Several global optimization methods are reviewed that attempt to ensure that the integral Gibbs energy of a closed isothermal isobaric system is a global minimum to satisfy the necessary and sufficient conditions for thermodynamic equilibrium. In particular, the integral Gibbs energy function of a multicomponent system containing non-ideal phases may be highly non-linear and non-convex, which makes finding a global minimum a challenge. Consequently, a poor numerical approach may lead one to the false belief of equilibrium. Furthermore, confirming that one reaches a global minimum and that this is achieved with satisfactory computational performance becomes increasingly more challenging in systems containing many chemical elements and a correspondingly large number of species and phases. Several numerical methods that have been used for this specific purpose are reviewed with a benchmark study of three of the more promising methods using five case studies of varying complexity. A modification of the conventional Branch and Bound method is presented that is well suited to a wide array of thermodynamic applications, including complex phases with many constituents and sublattices, and ionic phases that must adhere to charge neutrality constraints. Also, a novel method is presented that efficiently solves the system of linear equations by exploiting the unique structure of the Hessian matrix, which reduces the calculation from an O(N³) operation to an O(N) operation. As a result, this combined approach demonstrates efficiency, reliability and capabilities that are favorable for integration of thermodynamic computations into multi-physics codes with inherent performance considerations.
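
    For context, the equilibrium problem referred to above can be stated in standard form (conventional notation, not necessarily the authors' own): minimize the integral Gibbs energy over the species mole numbers subject to element mass balances,

      \min_{n_i \ge 0} \; G = \sum_i n_i \, \mu_i(T, P, \mathbf{n})
      \qquad \text{subject to} \qquad
      \sum_i a_{ji} \, n_i = b_j \quad \forall j,

    where a_{ji} is the number of atoms of element j in species i and b_j is the total amount of element j (charge neutrality enters as an additional linear constraint for ionic phases). Because the chemical potentials of non-ideal phases make G non-convex in n, a local minimizer can satisfy these conditions without being the global minimum, which is the "false belief of equilibrium" the abstract warns about.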

  19. ICS logging solution for network-based attacks using Gumstix technology

    NASA Astrophysics Data System (ADS)

    Otis, Jeremy R.; Berman, Dustin; Butts, Jonathan; Lopez, Juan

    2013-05-01

    Industrial Control Systems (ICS) monitor and control operations associated with the national critical infrastructure (e.g., the electric power grid, oil and gas pipelines and water treatment facilities). These systems rely on technologies and architectures that were designed for system reliability and availability. Security associated with ICS was never an inherent concern, primarily due to the protections afforded by network isolation. However, a trend in ICS operations is to migrate to commercial networks via TCP/IP in order to leverage commodity benefits and cost savings. As a result, system vulnerabilities are now exposed to the online community. Indeed, recent research has demonstrated that many exposed ICS devices are being discovered using readily available applications (e.g., the ShodanHQ search engine and Google-esque queries). Due to the lack of security and logging capabilities for ICS, most knowledge about attacks is derived from real-world incidents after an attack has already been carried out and the damage has been done. This research provides a method for introducing sensors into the ICS environment that collect information about network-based attacks. The sensors are developed using an inexpensive Gumstix platform that can be deployed and incorporated with production systems. Data obtained from the sensors provide insight into attack tactics (e.g., port scans, Nessus scans, Metasploit modules, and zero-day exploits) and characteristics (e.g., attack origin, frequency, and level of persistence). Findings enable security professionals to develop an accurate, real-time awareness of the threats against ICS devices and help shift the security posture from reactionary to preventative.

  20. Global optimization algorithms to compute thermodynamic equilibria in large complex systems with performance considerations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piro, M. H. A.; Simunovic, S.

    Several global optimization methods are reviewed that attempt to ensure that the integral Gibbs energy of a closed isothermal isobaric system is a global minimum to satisfy the necessary and sufficient conditions for thermodynamic equilibrium. In particular, the integral Gibbs energy function of a multicomponent system containing non-ideal phases may be highly non-linear and non-convex, which makes finding a global minimum a challenge. Consequently, a poor numerical approach may lead one to the false belief of equilibrium. Furthermore, confirming that one reaches a global minimum and that this is achieved with satisfactory computational performance becomes increasingly more challenging in systems containing many chemical elements and a correspondingly large number of species and phases. Several numerical methods that have been used for this specific purpose are reviewed with a benchmark study of three of the more promising methods using five case studies of varying complexity. A modification of the conventional Branch and Bound method is presented that is well suited to a wide array of thermodynamic applications, including complex phases with many constituents and sublattices, and ionic phases that must adhere to charge neutrality constraints. Also, a novel method is presented that efficiently solves the system of linear equations by exploiting the unique structure of the Hessian matrix, which reduces the calculation from an O(N³) operation to an O(N) operation. As a result, this combined approach demonstrates efficiency, reliability and capabilities that are favorable for integration of thermodynamic computations into multi-physics codes with inherent performance considerations.

  1. AMTEC: Current status and vision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levy, G.C.; Hunt, T.K.; Sievers, R.K.

    1997-12-31

    The recent history of alkali metal thermal-to-electric conversion (AMTEC) has been tantalizing as technical advances have struck down most of the remaining barriers to realization of practical applications. AMTEC has always offered promise with its inherently noise-free, vibration-free, and high-efficiency operation. Today's AMTEC cells are also compact, lightweight and reliable, achieving near 20% conversion efficiency. Pathways have been defined that should lead to efficiencies of 30% or higher within two years. Prototype AMTEC devices are being built today for applications ranging from powering deep space probes (100--150 W) to residential appliance cogeneration (350--500 W) to remote and portable power units (10--500 W). Multi-kilowatt systems may be only two years away. Current designs have power densities of 100--200 W/kg. Where is AMTEC technology at the start of the new millennium? Performance will exceed the numbers given above, with power capacity reaching 10 kW or more. These high power systems will also provide 100 volts or more when desired. Some AMTEC devices may be designed to operate at input temperatures well below that required today (800--900 C), providing more flexibility in the choice of heat source. Realization of industrial and consumer applications for AMTEC will depend on manufacturing economies achieved through simplification of cell fabrication and high volume production. Advanced Modular Power Systems, Inc. is developing AMTEC manufacturing technology which may lead to costs under $25/watt within two years and under $1/watt eventually. At this cost, AMTEC devices will find broad consumer and industrial applications.

  2. Development of YAG:Dy Thermographic Phosphor Coatings for Turbine Engine Applications

    NASA Technical Reports Server (NTRS)

    Eldridge, J. I.; Jenkins, T. P.; Allison, S. W.; Wolfe, D. E.; Jordan, E. H.

    2012-01-01

    The selection and development of thermographic phosphor coatings were pursued to meet the objective of demonstrating luminescence-decay-based temperature measurements up to 1300 °C on the surface of a vane in an operating demonstrator turbine engine. To meet this objective, YAG:Dy was selected based on the desirable luminescence performance observed for YAG:Dy powder: (1) excellent temperature sensitivity and intensity at operating turbine engine temperatures, (2) an emission peak at the relatively short wavelength of 456 nm, where the interference from background blackbody radiation is fairly low, and (3) its nearly single-exponential decay, which makes for a simple, reliable temperature calibration. However, implementation of YAG:Dy for surface temperature measurements required application of YAG:Dy as a coating onto the surface of a superalloy component with a preexisting yttria-stabilized zirconia (YSZ) thermal barrier coating (TBC). An inherent dilemma in producing a YAG:Dy coating is that coating processing is constrained to temperatures the superalloy component can safely tolerate (less than 1200 °C), much lower than the temperatures used to produce high-quality crystalline powder. Therefore, YAG:Dy coatings tend to exhibit lower luminescence performance compared to well-prepared YAG:Dy powder, and the luminescence performance of the coating will depend on the method of coating deposition. In this presentation, the luminescence performance of YAG:Dy coatings prepared by (1) application of a binder-based YAG:Dy-containing paint, (2) solution precursor plasma spray (SPPS), and (3) electron-beam physical vapor deposition (EB-PVD), along with the effect of post-deposition heat treatments, will be discussed.

  3. The role of emotions in moral case deliberation: theory, practice, and methodology.

    PubMed

    Molewijk, Bert; Kleinlugtenbelt, Dick; Widdershoven, Guy

    2011-09-01

    In clinical moral decision making, emotions often play an important role. However, many clinical ethicists are ignorant, suspicious or even critical of the role of emotions in making moral decisions and in reflecting on them. This raises practical and theoretical questions about the understanding and use of emotions in clinical ethics support services. This paper presents an Aristotelian view on emotions and describes its application in the practice of moral case deliberation. According to Aristotle, emotions are an original and integral part of (virtue) ethics. Emotions are an inherent part of our moral reasoning and being, and therefore they should be an inherent part of any moral deliberation. Based on Aristotle's view, we examine five specific aspects of emotions: the description of emotions, the attitude towards emotions, the thoughts present in emotions, the reliability of emotions, and the reasonable principle that guides an emotion. We then discuss three ways of dealing with emotions in the process of moral case deliberation. Finally, we present an Aristotelian conversation method, and present practical experiences using this method. © 2011 Blackwell Publishing Ltd.

  4. Can reliable values of Young's modulus be deduced from Fisher's (1971) spinning lens measurements?

    PubMed

    Burd, H J; Wilde, G S; Judge, S J

    2006-04-01

    The current textbook view of the causes of presbyopia rests very largely on a series of experiments reported by R.F. Fisher some three decades ago, and in particular on the values of lens Young's modulus inferred from the deformation caused by spinning excised lenses about their optical axis (Fisher 1971). We studied the extent to which inferred values of Young's modulus are influenced by assumptions inherent in the mathematical procedures used by Fisher to interpret the test, and we investigated several alternative interpretation methods. The results suggest that modelling assumptions inherent in Fisher's original method may have led to systematic errors in the determination of the Young's modulus of the cortex and nucleus. Fisher's conclusion that the cortex is stiffer than the nucleus, particularly in middle age, may be an artefact associated with these systematic errors. Moreover, none of the models we explored are able to account for Fisher's claim that the removal of the capsule has only a modest effect on the deformations induced in the spinning lens.

  5. Displaying contextual information reduces the costs of imperfect decision automation in rapid retasking of ISR assets.

    PubMed

    Rovira, Ericka; Cross, Austin; Leitch, Evan; Bonaceto, Craig

    2014-09-01

    The impact of a decision support tool designed to embed contextual mission factors was investigated. Contextual information may enable operators to infer the appropriateness of data underlying the automation's algorithm. Research has shown that the costs of imperfect automation are more detrimental than those of perfectly reliable automation when operators are provided with decision support tools. Operators may trust and rely on the automation more appropriately if they understand the automation's algorithm. The need to develop decision support tools that are understandable to the operator provides the rationale for the current experiment. A total of 17 participants performed a simulated rapid retasking of intelligence, surveillance, and reconnaissance (ISR) assets task under manual control, decision automation, or contextual decision automation, at two levels of task demand: low or high. Automation reliability was set at 80%, resulting in participants experiencing a mixture of reliable and automation-failure trials. Dependent variables included ISR coverage and response time for replanning routes. Reliable automation significantly improved ISR coverage when compared with manual performance. Although performance suffered under imperfect automation, contextual decision automation helped to reduce some of the decrements in performance. Contextual information helps overcome the costs of imperfect decision automation. Designers may mitigate some of the performance decrements experienced with imperfect automation by providing operators with interfaces that display contextual information, that is, the state of factors that affect the reliability of the automation's recommendation.

  6. Accountability: To Whom--For What?

    ERIC Educational Resources Information Center

    Sharp, Rachel

    1991-01-01

    Australian higher education has an elitist disregard for the needs, interests, and concerns of the rest of society, and operates under a system that is inherently unaccountable in any democratic sense. Dawkinism does little to change this, and the academic community appears unable or unwilling to question the status quo. (MSE)

  7. 46 CFR 195.35-5 - General.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 1014 (incorporated by reference, see § 195.01-3). (d) All lifelines shall be of steel or bronze wire rope. Steel wire rope shall be either inherently corrosion-resistant, or made so by galvanizing or... breaking strength of 1,500 pounds. (e) All equipment shall be maintained in an operative condition, and it...

  8. 46 CFR 195.35-5 - General.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 1014 (incorporated by reference, see § 195.01-3). (d) All lifelines shall be of steel or bronze wire rope. Steel wire rope shall be either inherently corrosion-resistant, or made so by galvanizing or... breaking strength of 1,500 pounds. (e) All equipment shall be maintained in an operative condition, and it...

  9. 46 CFR 195.35-5 - General.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 1014 (incorporated by reference, see § 195.01-3). (d) All lifelines shall be of steel or bronze wire rope. Steel wire rope shall be either inherently corrosion-resistant, or made so by galvanizing or... breaking strength of 1,500 pounds. (e) All equipment shall be maintained in an operative condition, and it...

  10. Prospective Mathematics Teachers' Sense Making of Polynomial Multiplication and Factorization Modeled with Algebra Tiles

    ERIC Educational Resources Information Center

    Caglayan, Günhan

    2013-01-01

    This study is about prospective secondary mathematics teachers' understanding and sense making of representational quantities generated by algebra tiles, the quantitative units (linear vs. areal) inherent in the nature of these quantities, and the quantitative addition and multiplication operations--referent preserving versus referent…

  11. Next generation molten NaI batteries for grid scale energy storage

    NASA Astrophysics Data System (ADS)

    Small, Leo J.; Eccleston, Alexis; Lamb, Joshua; Read, Andrew C.; Robins, Matthew; Meaders, Thomas; Ingersoll, David; Clem, Paul G.; Bhavaraju, Sai; Spoerke, Erik D.

    2017-08-01

    Robust, safe, and reliable grid-scale energy storage continues to be a priority for improved energy surety, expanded integration of renewable energy, and greater system agility required to meet modern dynamic and evolving electrical energy demands. We describe here a new sodium-based battery based on a molten sodium anode, a sodium iodide/aluminum chloride (NaI/AlCl3) cathode, and a high conductivity NaSICON (Na1+xZr2SixP3-xO12) ceramic separator. This NaI battery operates at intermediate temperatures (120-180 °C) and boasts an energy density of >150 Wh kg-1. The energy-dense NaI-AlCl3 ionic liquid catholyte avoids lifetime-limiting plating and intercalation reactions, and the use of earth-abundant elements minimizes materials costs and eliminates economic uncertainties associated with lithium metal. Moreover, the inherent safety of this system under internal mechanical failure is characterized by negligible heat or gas production and benign reaction products (Al, NaCl). Scalability in design is exemplified through evolution from 0.85 to 10 Ah (28 Wh) form factors, displaying lifetime average Coulombic efficiencies of 99.45% and energy efficiencies of 81.96% over dynamic testing lasting >3000 h. This demonstration promises a safe, cost-effective, and long-lifetime technology as an attractive candidate for grid scale storage.

  12. The effects of taboo-related distraction on driving performance.

    PubMed

    Chan, Michelle; Madan, Christopher R; Singhal, Anthony

    2016-07-01

    Roadside billboards containing negative and positive emotional content have been shown to influence driving performance; however, the impact of highly arousing taboo information is unknown. Taboo information more reliably evokes emotional arousal and can lead to greater attentional capture due to its inherent 'shock value.' The objective of the present study was to examine driver distraction associated with four types of information presented on roadside billboards: highly arousing taboo words, moderately arousing positive and negative words, and non-arousing neutral words. Participants viewed blocks of taboo, positive, negative and neutral words presented on roadside billboards while operating a driving simulator. They also responded to target (household-related) words by pressing a button on the steering wheel. At the end of the session, a surprise recall task was completed for all the words they saw while driving. Results showed that taboo words captured the most attention, as revealed by better memory recall compared to all the other word types. Interestingly, taboo words were associated with better lane control compared to the other word types. We suggest that taboo-related arousal can enhance attentional focus during a complex task like simulated driving. That is, in a highly arousing situation, attention is selectively narrowed to the road ahead, resulting in better lane control. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Infrared sensors for Earth observation missions

    NASA Astrophysics Data System (ADS)

    Ashcroft, P.; Thorne, P.; Weller, H.; Baker, I.

    2007-10-01

    SELEX S&AS is developing a family of infrared sensors for earth observation missions. The spectral bands cover shortwave infrared (SWIR) channels from around 1μm to long-wave infrared (LWIR) channels up to 15μm. Our mercury cadmium telluride (MCT) technology has enabled a sensor array design that can satisfy the requirements of all of the SWIR and medium-wave infrared (MWIR) bands with near-identical arrays. This is made possible by the combination of a set of existing technologies that together enable a high degree of flexibility in the pixel geometry, sensitivity, and photocurrent integration capacity. The solution employs a photodiode array under the control of a readout integrated circuit (ROIC). The ROIC allows flexible geometries and in-pixel redundancy to maximise operability and reliability, by combining the photocurrent from a number of photodiodes into a single pixel. Defective or inoperable diodes (or "sub-pixels") can be deselected with tolerable impact on the overall pixel performance. The arrays will be fabricated using the "loophole" process in MCT grown by liquid-phase epitaxy (LPE). These arrays are inherently robust, offer high quantum efficiencies and have been used in previous space programs. The use of loophole arrays also offers access to SELEX's avalanche photodiode (APD) technology, allowing low-noise, highly uniform gain at the pixel level where photon flux is very low.
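
    As a purely illustrative sketch of the in-pixel redundancy idea described above (not the SELEX ROIC design; the function name, rescaling rule, and example numbers are assumptions), a pixel can sum the photocurrent of only its non-defective sub-pixels:

      # Illustrative sketch of sub-pixel redundancy: sum photocurrent from the
      # diodes feeding one pixel, skipping any flagged defective, and rescale so
      # the pixel responsivity stays approximately nominal.

      def pixel_signal(subpixel_currents, defect_map):
          """subpixel_currents: photocurrents of the diodes feeding one pixel.
          defect_map: booleans, True where a diode has been deselected."""
          good = [i for i, bad in zip(subpixel_currents, defect_map) if not bad]
          if not good:
              return 0.0  # fully inoperable pixel
          return sum(good) * len(subpixel_currents) / len(good)

      # Example: a 2x2 sub-pixel group with one defective (saturated) diode.
      print(pixel_signal([1.0, 1.1, 25.0, 0.9], [False, False, True, False]))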

  14. Forecasting natural hazards, performance of scientists, ethics, and the need for transparency

    PubMed Central

    Guzzetti, Fausto

    2016-01-01

    Landslides are one of several natural hazards. Like other natural hazards, landslides are difficult to predict, and their forecasts are uncertain. The uncertainty depends on the poor understanding of the phenomena that control the slope failures, and on the inherent complexity and chaotic nature of the landslides. This is similar to other natural hazards, including hurricanes, earthquakes, volcanic eruptions, floods, and droughts. Due to the severe impact of landslides on the population, the environment, and the economy, forecasting landslides is of scientific interest and of societal relevance, and scientists attempting to forecast landslides face known and new problems intrinsic to the multifaceted interactions between science, decision-making, and society. The problems include deciding on the authority and reliability of individual scientists and groups of scientists, and evaluating the performances of individual scientists, research teams, and their institutions. Related problems lie in the increasing subordination of research scientists to politics and decision-makers, and in the conceptual and operational models currently used to organize and pay for research, based on apparently objective criteria and metrics, considering science as any other human endeavor, and favoring science that produces results of direct and immediate application. The paper argues that the consequences of these problems have not been considered fully. PMID:27695154

  15. Forecasting natural hazards, performance of scientists, ethics, and the need for transparency.

    PubMed

    Guzzetti, Fausto

    2016-10-20

    Landslides are one of several natural hazards. Like other natural hazards, landslides are difficult to predict, and their forecasts are uncertain. The uncertainty depends on the poor understanding of the phenomena that control the slope failures, and on the inherent complexity and chaotic nature of the landslides. This is similar to other natural hazards, including hurricanes, earthquakes, volcanic eruptions, floods, and droughts. Due to the severe impact of landslides on the population, the environment, and the economy, forecasting landslides is of scientific interest and of societal relevance, and scientists attempting to forecast landslides face known and new problems intrinsic to the multifaceted interactions between science, decision-making, and society. The problems include deciding on the authority and reliability of individual scientists and groups of scientists, and evaluating the performances of individual scientists, research teams, and their institutions. Related problems lie in the increasing subordination of research scientists to politics and decision-makers, and in the conceptual and operational models currently used to organize and pay for research, based on apparently objective criteria and metrics, considering science as any other human endeavor, and favoring science that produces results of direct and immediate application. The paper argues that the consequences of these problems have not been considered fully.

  16. Modeling man: the monkey colony at the Carnegie Institution of Washington's Department of Embryology, 1925-1971.

    PubMed

    Wilson, Emily K

    2012-01-01

    Though better recognized for its immediate endeavors in human embryo research, the Carnegie Department of Embryology also employed a breeding colony of rhesus macaques for the purposes of studying human reproduction. This essay follows the course of the first enterprise in maintaining a primate colony for laboratory research and the overlapping scientific, social, and political circumstances that tolerated and cultivated the colony's continued operation from 1925 until 1971. Despite a new-found priority for reproductive sciences in the United States, by the early 1920s an unfertilized human ovum had not yet been seen and even the timing of ovulation remained unresolved. Progress would require an organized research approach that could extend beyond the limitations of working with scant and inherently restrictive human subjects or with common lab mammals like mice. In response, the Department of Embryology, under the Carnegie Institution of Washington (CIW), instituted a novel methodology using a particular primate species as a surrogate in studying normal human reproductive physiology. Over more than 40 years the monkey colony followed an unpremeditated trajectory that would contribute fundamentally to discoveries in human reproduction, early embryo development, reliable birth control methods, and to the establishment of the rhesus macaque as a common model organism.

  17. TASK ALLOCATION IN GEO-DISTRIBUTED CYBER-PHYSICAL SYSTEMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aggarwal, Rachit; Smidts, Carol

    This paper studies the task allocation algorithm for a distributed test facility (DTF), which aims to assemble geo-distributed cyber (software) and physical (hardware-in-the-loop) components into a prototype cyber-physical system (CPS). This allows low-cost testing on an early conceptual prototype (ECP) of the ultimate CPS (UCPS) to be developed. The DTF provides an instrumentation interface for carrying out reliability experiments remotely, such as fault propagation analysis and in-situ testing of hardware and software components in a simulated environment. Unfortunately, the geo-distribution introduces an overhead that is not inherent to the UCPS, i.e., a significant time delay in communication that threatens the stability of the ECP and is not an appropriate representation of the behavior of the UCPS. This can be mitigated by implementing a task allocation algorithm to find a suitable configuration and assign the software components to appropriate computational locations dynamically. This would allow the ECP to operate more efficiently with less probability of being unstable due to the delays introduced by geo-distribution. The task allocation algorithm proposed in this work uses a Monte Carlo approach along with Dynamic Programming to identify the optimal network configuration to keep the time delays to a minimum.
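
    The abstract does not give implementation details; the following is only a minimal sketch of the Monte Carlo portion (random assignments scored by worst-case link delay; all names, the delay model, and the example numbers are assumptions, not the paper's algorithm):

      import random

      # Illustrative Monte Carlo task-allocation sketch: randomly assign software
      # components to geo-distributed nodes and keep the assignment with the
      # smallest worst-case inter-component communication delay.

      def allocate(components, nodes, delay, links, trials=10000, seed=0):
          """delay[a][b]: communication delay between nodes a and b.
          links: pairs of components that exchange data at run time."""
          rng = random.Random(seed)
          best_assign, best_cost = None, float("inf")
          for _ in range(trials):
              assign = {c: rng.choice(nodes) for c in components}
              cost = max(delay[assign[u]][assign[v]] for u, v in links)
              if cost < best_cost:
                  best_assign, best_cost = assign, cost
          return best_assign, best_cost

      # Example: three components, two sites, one slow wide-area link.
      d = {"siteA": {"siteA": 1, "siteB": 50}, "siteB": {"siteA": 50, "siteB": 1}}
      print(allocate(["ctrl", "plant", "logger"], ["siteA", "siteB"], d,
                     [("ctrl", "plant"), ("plant", "logger")]))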

  18. A Hybrid Readout Solution for GaN-Based Detectors Using CMOS Technology †

    PubMed Central

    Hancock, Bruce; Nikzad, Shouleh; Bell, L. Douglas; Kroep, Kees; Charbon, Edoardo

    2018-01-01

    Gallium nitride (GaN) and its alloys are becoming preferred materials for ultraviolet (UV) detectors due to their wide bandgap and tailorable out-of-band cutoff from 3.4 eV to 6.2 eV. GaN-based avalanche photodiodes (APDs) are particularly suitable for their high photon sensitivity and quantum efficiency in the UV region and for their inherent insensitivity to visible wavelengths. Challenges exist, however, for practical utilization. With growing interest in such photodetectors, hybrid readout solutions are becoming prevalent, with CMOS technology being adopted for its maturity, scalability, and reliability. In this paper, we describe our approach to combine GaN APDs with a CMOS readout circuit, comprising a linear array of 1 × 8 capacitive transimpedance amplifiers (CTIAs), implemented in a 0.35 µm high voltage CMOS technology. Further, we present a simple, yet sustainable circuit technique to allow operation of APDs under high reverse biases, up to ≈80 V, with verified measurement results. The readout offers a conversion gain of 0.43 µV/e−, obtaining avalanche gains up to 10³. Several parameters of the CTIA are discussed, followed by a perspective on possible hybridization, exploiting the advantages of a 3D-stacked technology. PMID:29401655
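
    As a rough consistency check (an inference from the quoted numbers, not a value stated in the paper), the standard CTIA relation G_c = q / C_f implies a feedback capacitance of about

      C_f \approx \frac{q}{G_c}
          = \frac{1.602 \times 10^{-19}\,\mathrm{C}}{0.43\,\mu\mathrm{V}/e^-}
          \approx 0.37\,\mathrm{pF}.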

  19. 18 CFR 40.2 - Mandatory Reliability Standards.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...-POWER SYSTEM § 40.2 Mandatory Reliability Standards. (a) Each applicable user, owner or operator of the Bulk-Power System must comply with Commission-approved Reliability Standards developed by the Electric... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Mandatory Reliability...

  20. Wind Energy Forecasting: A Collaboration of the National Center for Atmospheric Research (NCAR) and Xcel Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parks, K.; Wan, Y. H.; Wiener, G.

    2011-10-01

    The focus of this report is the wind forecasting system developed during this contract period, with results of performance through the end of 2010. The report is intentionally high-level, with technical details disseminated at various conferences and in academic papers. At the end of 2010, Xcel Energy managed the output of 3372 megawatts of installed wind energy. The wind plants span three operating companies, serving customers in eight states, and three market structures. The great majority of the wind energy is contracted through power purchase agreements (PPAs). The remainder is utility owned, Qualifying Facilities (QF), distributed resources (i.e., 'behind the meter'), or merchant entities within Xcel Energy's Balancing Authority footprints. Regardless of the contractual or ownership arrangements, the output of the wind energy is balanced by Xcel Energy's generation resources, which include fossil, nuclear, and hydro-based facilities that are owned or contracted via PPAs. These facilities are committed and dispatched or bid into day-ahead and real-time markets by Xcel Energy's Commercial Operations department. Wind energy complicates the short- and long-term planning goals of least-cost, reliable operations. Due to the uncertainty of wind energy production, the inherently suboptimal commitment and dispatch associated with imperfect wind forecasts drives up costs. For example, a gas combined-cycle unit may be turned on, or committed, in anticipation of low winds. In reality the winds stayed high, forcing this unit and others to run, or be dispatched, to sub-optimal loading positions. In addition, commitment decisions are frequently irreversible due to minimum up- and down-time constraints. That is, a dispatcher lives with inefficient decisions made in prior periods. In general, uncertainty contributes to conservative operations: committing more units and keeping them on longer than may have been necessary for purposes of maintaining reliability. The downside is that costs are higher. In organized electricity markets, units that are committed for reliability reasons are paid their offer price even when prevailing market prices are lower. Often, these uplift charges are allocated to market participants that caused the inefficient dispatch in the first place. Thus, wind energy facilities are burdened with their share of costs proportional to their forecast errors. For Xcel Energy, wind energy uncertainty costs manifest differently depending on specific market structures. In the Public Service of Colorado (PSCo), inefficient commitment and dispatch caused by wind uncertainty increases fuel costs. Wind resources participating in the Midwest Independent System Operator (MISO) footprint make substantial payments in the real-time markets to true up their day-ahead positions and are additionally burdened with deviation charges called a Revenue Sufficiency Guarantee (RSG) to cover out-of-market costs associated with operations. Southwest Public Service (SPS) wind plants cause commitment inefficiencies and are also charged Southwest Power Pool (SPP) imbalance payments due to wind uncertainty and variability. Wind energy forecasting helps mitigate these costs. Wind integration studies for the PSCo and Northern States Power (NSP) operating companies have projected increasing costs due to forecast error as more wind is installed on the system. It follows that reducing forecast error would reduce these costs.
This is echoed by large-scale studies in neighboring regions and states that have recommended adoption of state-of-the-art wind forecasting tools in day-ahead and real-time planning and operations. Further, Xcel Energy concluded that a reduction of the normalized mean absolute error by one percent would have reduced costs in 2008 by over $1 million annually in PSCo alone. The value of reducing forecast error prompted Xcel Energy to make substantial investments in wind energy forecasting research and development.
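
    For reference, the normalized mean absolute error cited above is conventionally defined (a standard definition, not necessarily the exact metric used in the report) as

      \mathrm{NMAE} = \frac{100\%}{N \, P_{\mathrm{cap}}}
                      \sum_{t=1}^{N} \left| P^{\mathrm{fcst}}_t - P^{\mathrm{obs}}_t \right|,

    where P^{fcst}_t and P^{obs}_t are the forecast and observed wind power in interval t and P_cap is the installed capacity; on this scale, the one-percent reduction mentioned above corresponds to the quoted savings of over $1 million per year in PSCo.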

  1. Time-domain diffuse optical tomography using silicon photomultipliers: feasibility study.

    PubMed

    Di Sieno, Laura; Zouaoui, Judy; Hervé, Lionel; Pifferi, Antonio; Farina, Andrea; Martinenghi, Edoardo; Derouard, Jacques; Dinten, Jean-Marc; Mora, Alberto Dalla

    2016-11-01

    Silicon photomultipliers (SiPMs) have been very recently introduced as the most promising detectors in the field of diffuse optics, in particular due to their inherent low cost and large active area. Here we demonstrate the suitability of SiPMs for time-domain diffuse optical tomography (DOT). The study is based on both simulations and experimental measurements. Results clearly show excellent performance in terms of spatial localization of an absorbing perturbation, thus opening the way to the use of SiPMs for DOT, with the possibility to conceive a new generation of low-cost and reliable multichannel tomographic systems.

  2. Post-Test Analysis of a 10-Year Sodium Heat Pipe Life Test

    NASA Technical Reports Server (NTRS)

    Rosenfeld, John H.; Locci, Ivan E.; Sanzi, James L.; Hull, David R.; Geng, Steven M.

    2011-01-01

    High-temperature heat pipes are being evaluated for use in energy conversion applications such as fuel cells, gas turbine re-combustors, Stirling cycle heat sources; and with the resurgence of space nuclear power both as reactor heat removal elements and as radiator elements. Long operating life and reliable performance are critical requirements for these applications. Accordingly, long-term materials compatibility is being evaluated through the use of high-temperature life test heat pipes. Thermacore, Inc., has carried out a sodium heat pipe 10-year life test to establish long-term operating reliability. Sodium heat pipes have demonstrated favorable materials compatibility and heat transport characteristics at high operating temperatures in air over long time periods. A representative one-tenth segment Stirling Space Power Converter heat pipe with an Inconel 718 envelope and a stainless steel screen wick has operated for over 87,000 hr (10 years) at nearly 700 C. These life test results have demonstrated the potential for high-temperature heat pipes to serve as reliable energy conversion system components for power applications that require long operating lifetime with high reliability. Detailed design specifications, operating history, and post-test analysis of the heat pipe and sodium working fluid are described. Lessons learned and future life test plans are also discussed.

  3. Highly-reliable operation of 638-nm broad stripe laser diode with high wall-plug efficiency for display applications

    NASA Astrophysics Data System (ADS)

    Yagi, Tetsuya; Shimada, Naoyuki; Nishida, Takehiro; Mitsuyama, Hiroshi; Miyashita, Motoharu

    2013-03-01

    Laser-based displays, from pico projectors to cinema projectors, have gathered much attention because of their wide gamut, low power consumption, and other advantages. Laser light sources for these displays are operated mainly in CW mode, so heat management is one of the big issues and highly efficient operation is required. The light sources are also required to be highly reliable. A 638 nm broad-stripe laser diode (LD) was newly developed for high-efficiency, highly reliable operation. An AlGaInP/GaAs red LD suffers from low wall-plug efficiency (WPE) due to electron overflow from the active layer to the p-cladding layer. A large optical confinement factor (Γ) design with AlInP cladding layers was adopted to improve the WPE. This design is a disadvantage for reliable operation because the large Γ raises the optical density and can bring on catastrophic optical degradation (COD) at the front facet. To overcome this disadvantage, a window-mirror structure was also adopted in the LD. The LD shows a WPE of 35% at 25°C, a record-high value, and highly stable operation at 35°C and 550 mW for up to 8,000 hours without any catastrophic optical degradation.

  4. CERTS: Consortium for Electric Reliability Technology Solutions - Research Highlights

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eto, Joseph

    2003-07-30

    Historically, the U.S. electric power industry was vertically integrated, and utilities were responsible for system planning, operations, and reliability management. As the nation moves to a competitive market structure, these functions have been disaggregated, and no single entity is responsible for reliability management. As a result, new tools, technologies, systems, and management processes are needed to manage the reliability of the electricity grid. However, a number of simultaneous trends prevent electricity market participants from pursuing development of these reliability tools: utilities are preoccupied with restructuring their businesses, research funding has declined, and the formation of Independent System Operators (ISOs) and Regional Transmission Organizations (RTOs) to operate the grid means that control of transmission assets is separate from ownership of these assets; at the same time, business uncertainty and changing regulatory policies have created a climate in which needed investment for transmission infrastructure and tools for reliability management has dried up. To address the resulting emerging gaps in reliability R&D, CERTS has undertaken much-needed public interest research on reliability technologies for the electricity grid. CERTS' vision is to: (1) Transform the electricity grid into an intelligent network that can sense and respond automatically to changing flows of power and emerging problems; (2) Enhance reliability management through market mechanisms, including transparency of real-time information on the status of the grid; (3) Empower customers to manage their energy use and reliability needs in response to real-time market price signals; and (4) Seamlessly integrate distributed technologies--including those for generation, storage, controls, and communications--to support the reliability needs of both the grid and individual customers.

  5. Report: Eleven Years After Agreement, EPA Has Not Developed Reliable Emission Estimation Methods to Determine Whether Animal Feeding Operations Comply With Clean Air Act and Other Statutes

    EPA Pesticide Factsheets

    Report #17-P-0396, September 19, 2017. Until the EPA develops sound methods to estimate emissions, the agency cannot reliably determine whether animal feeding operations comply with applicable Clean Air Act requirements.

  6. Transit Reliability Information Program (TRIP) : Final Technical Report

    DOT National Transportation Integrated Search

    1984-05-01

    The Transit Reliability Information Program (TRIP) is a government-initiated program to assist the transit industry in satisfying its need for rail transit car subsystem reliability information. TRIP provided this assistance through the operation of ...

  7. Transit Reliability Information Program (TRIP) Phase I Report

    DOT National Transportation Integrated Search

    1981-06-01

    The Transit Reliability Information Program (TRIP) is a government initiated program to assist the transit industry in satisfying its need for transit reliability information. TRIP provides this assistance through the operation of a national reliabil...

  8. An Overview of a Trajectory-Based Solution for En Route and Terminal Area Self-Spacing: Fourth Revision

    NASA Technical Reports Server (NTRS)

    Abbott, Terence S.

    2013-01-01

    This paper presents an overview of the fourth major revision to an algorithm specifically designed to support NASA's Airborne Precision Spacing concept. This airborne self-spacing concept is trajectory-based, allowing for spacing operations prior to the aircraft being on a common path. Because this algorithm is trajectory-based, it also has the inherent ability to support required-time-of-arrival (RTA) operations. This algorithm was also designed specifically to support a standalone, non-integrated implementation in the spacing aircraft. Revisions to this algorithm were based on a change to the expected operational environment.

  9. Fuel cycle cost reduction through Westinghouse fuel design and core management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frank, F.J.; Scherpereel, L.R.

    1985-11-01

    This paper describes advances in Westinghouse nuclear fuel and their impact on fuel cycle cost. Recent fabrication development has been aimed at maintaining high integrity, increased operating flexibility, longer operating cycles, and improved core margins. Development efforts at Westinghouse toward meeting these goals have culminated in VANTAGE 5 fuel. The current trend toward longer operating cycles provides a further incentive to minimize the resulting inherent increase in fuel cycle costs through further increases in region discharge burnup. Westinghouse studies indicate the capability of currently offered products to meet cycle lengths of up to 24 months.

  10. Multichannel, Active Low-Pass Filters

    NASA Technical Reports Server (NTRS)

    Lev, James J.

    1989-01-01

    Multichannel integrated circuits cascaded to obtain matched characteristics. Gain and phase characteristics of channels of multichannel, multistage, active, low-pass filter matched by making filter of cascaded multichannel integrated-circuit operational amplifiers. Concept takes advantage of inherent equality of electrical characteristics of nominally-identical circuit elements made on same integrated-circuit chip. Characteristics of channels vary identically with changes in temperature. If additional matched channels needed, chips containing more than two operational amplifiers apiece (e.g., commercial quad operational amplifiers) used. Concept applicable to variety of equipment requiring matched gain and phase in multiple channels - radar, test instruments, communication circuits, and equipment for electronic countermeasures.
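
    As an illustrative formulation (standard filter theory, not drawn from the brief), each channel built as a cascade of n active second-order stages has

      H_{\mathrm{ch}}(s) = \prod_{k=1}^{n} H_k(s), \qquad
      H_k(s) = \frac{\omega_{0,k}^2}{s^2 + (\omega_{0,k}/Q_k)\, s + \omega_{0,k}^2},

    so gain and phase match between channels to the extent that the corresponding ω_{0,k} and Q_k match; fabricating the operational amplifiers of all channels on the same chip makes those parameters, and their temperature drift, track together.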

  11. Safety and IVHM

    NASA Technical Reports Server (NTRS)

    Goebel, Kai

    2012-01-01

    When we address safety in a book on the business case for IVHM, the question arises whether safety isn't inherently in conflict with the need of operators to run their systems as efficiently (and as cost-effectively) as possible. The answer may be that the system needs to be just as safe as needed, but not significantly more. That begs the next question: How safe is safe enough? Several regulatory bodies provide guidelines for operational safety, but irrespective of that, operators do not want their systems to be known as lacking safety. We illuminate the role of safety within the context of IVHM.

  12. Virtual Ultrasound Guidance for Inexperienced Operators

    NASA Technical Reports Server (NTRS)

    Caine, Timothy; Martin, Davis

    2012-01-01

    Medical ultrasound or echocardiographic studies are highly operator-dependent and generally require lengthy training and internship to perfect. To obtain quality echocardiographic images in remote environments, such as on-orbit, remote guidance of studies has been employed. This technique involves minimal training for the user, coupled with remote guidance from an expert. When real-time communication or expert guidance is not available, a more autonomous system of guiding an inexperienced operator through an ultrasound study is needed. One example would be missions beyond low Earth orbit, in which the time delay inherent with communication will make remote guidance impractical.

  13. openECA Platform and Analytics Alpha Test Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Russell

    The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.

  14. openECA Platform and Analytics Beta Demonstration Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Russell

    The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.

  15. PRELIMINARY HAZARDS SUMMARY REPORT FOR THE VALLECITOS SUPERHEAT REACTOR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murray, J.L.

    1961-02-01

    The Vallecitos Superheat Reactor (VSR) is a light-water-moderated, thermal-spectrum reactor, cooled by a combination of moderator boiling and forced convection cooling with saturated steam. The reactor core consists of 32 fuel bundles containing 5300 lb of UO/sub 2/ enriched in U/sub 235/ to 3.6%. The fuel elements are arranged in individual process tubes that direct the cooling steam flow and separate the steam from the water moderator. The reactor vessel is designed for 1250 psig and operates at 960 to 1000 psig. With the reactor operating at 12.5 Mw(t), the maximum fuel cladding temperature is 1250 deg F and the cooling steam is superheated to an average temperature of about 810 deg F at 905 psig. Nuclear operation of the reactor is controlled by 12 control rods, actuated by drives mounted on the bottom of the reactor vessel. The water moderator recirculates inside the reactor vessel and through the core region by natural convection. Inherent safety features of the reactor include the negative core reactivity effects upon heating the UO/sub 2/ fuel (Doppler effect), upon increasing the temperature or void content of the moderator in the operating condition, and upon unflooding the fuel process tubes in the hot condition. Safety features designed into the reactor and plant systems include a system of sensors and devices to detect potentially unsafe operating conditions and to initiate automatically the appropriate countermeasures, a set of fast and reliable control rods for scramming the reactor if a potentially unsafe condition occurs, a manually actuated liquid neutron poison system, and an emergency cooling system to provide continued steam flow through the reactor core in the event the reactor becomes isolated from either its normal source of steam supply or discharge. The release of radioactivity to unrestricted areas is maintained within permissible limits by monitoring the radioactivity of wastes and controlling their release. The reactor and many of its auxiliaries are housed within a high-integrity, essentially leak-tight containment vessel. (auth)

  16. Balancing low cost with reliable operation in the rotordynamic design of the ALS Liquid Hydrogen Fuel Turbopump

    NASA Technical Reports Server (NTRS)

    Greenhill, L. M.

    1990-01-01

    The Air Force/NASA Advanced Launch System (ALS) Liquid Hydrogen Fuel Turbopump (FTP) has primary design goals of low cost and high reliability, with performance and weight having less importance. This approach is atypical compared with other rocket engine turbopump design efforts, such as on the Space Shuttle Main Engine (SSME), which emphasized high performance and low weight. Similar to the SSME turbopumps, the ALS FTP operates supercritically, which implies that stability and bearing loads strongly influence the design. In addition, the use of low cost/high reliability features in the ALS FTP such as hydrostatic bearings, relaxed seal clearances, and unshrouded turbine blades also have a negative influence on rotordynamics. This paper discusses the analysis conducted to achieve a balance between low cost and acceptable rotordynamic behavior, to ensure that the ALS FTP will operate reliably without subsynchronous instabilities or excessive bearing loads.

  17. [Evaluation of the reliability of freight elevator operators].

    PubMed

    Gosk, A; Borodulin-Nadzieja, L; Janocha, A; Salomon, E

    1991-01-01

    The study involved 58 workers employed at winding machines. Their reliability was estimated from the results of psychomotor precision tests, the condition of the vegetative nervous system, and the results of psychological tests. The tests were carried out in the laboratory and at the workplaces, with all distractive factors and the functional connections of the work process present. We have found that the reliability of the workers may be affected by a variety of factors. Among the winding machine operators, work monotony can lead to a "monotony syndrome". Among the signalists, the awareness of great responsibility can lead to unpredictable and inadequate reactions. In both groups, persons displaying lower-than-average precision were identified. All those persons demonstrated a reckless attitude, and the opinion of their superiors about them was poor. Those persons constitute a potential risk for the reliable operation of the discussed team.

  18. Leveraging accelerated testing of LED drivers to model the reliability of two-stage and multi-channel drivers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Lynn; Perkins, Curtis; Smith, Aaron

    The next wave of LED lighting technology is likely to be tunable white lighting (TWL) devices which can adjust the colour of the emitted light between warm white (~ 2700 K) and cool white (~ 6500 K). This type of lighting system uses LED assemblies of two or more colours each controlled by separate driver channels that independently adjust the current levels to achieve the desired lighting colour. Drivers used in TWL devices are inherently more complex than those found in simple SSL devices, due to the number of electrical components in the driver required to achieve this level of control. The reliability of such lighting systems can only be studied using accelerated stress tests (AST) that accelerate the aging process to time frames that can be accommodated in laboratory testing. This paper describes AST methods and findings developed from AST data that provide insights into the lifetime of the main components of one-channel and multi-channel LED devices. The use of AST protocols to confirm product reliability is necessary to ensure that the technology can meet the performance and lifetime requirements of the intended application.
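
    A common way to translate accelerated-test hours into a use-condition lifetime estimate is a temperature-driven Arrhenius acceleration factor. The record does not give the study's stress model or parameters, so the sketch below uses a hypothetical activation energy, use temperature, stress temperature and test duration purely to illustrate the arithmetic:

      import math

      K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

      def arrhenius_af(e_a_ev, t_use_c, t_stress_c):
          """Acceleration factor between a stress temperature and a use temperature."""
          t_use_k = t_use_c + 273.15
          t_stress_k = t_stress_c + 273.15
          return math.exp((e_a_ev / K_BOLTZMANN_EV) * (1.0 / t_use_k - 1.0 / t_stress_k))

      # Hypothetical values: 0.7 eV activation energy, 45 C use, 85 C stress chamber.
      af = arrhenius_af(0.7, 45.0, 85.0)
      projected_hours = 6000.0 * af  # 6000 h of AST data, also hypothetical
      print(f"acceleration factor ~{af:.1f}, projected use-condition life ~{projected_hours:.0f} h")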

  19. Binary phase locked loops for Omega receivers

    NASA Technical Reports Server (NTRS)

    Chamberlin, K.

    1974-01-01

    An all-digital phase-locked loop (PLL) is considered because of a number of problems inherent in the use of analog PLLs; the digital PLL design presented here solves these problems. A single loop measures all eight Omega time slots with the aid of memory, hence the name memory-aided phase-locked loop (MAPLL). Basic operating principles are discussed, and the superiority of the MAPLL over a conventional digital phase-locked loop in operational efficiency for Omega applications is demonstrated.

  20. Cell electrophoresis for diagnostic purposes. II. Critical evaluation of conventional cytopherometry.

    PubMed Central

    Hoffmann, W.; Kaufmann, R.; Steiner, R.; Werner, W.

    1981-01-01

    Determination of the electrophoretic mobility of test cells has been widely used in an attempt to detect so-called lymphokines in a laboratory test for cancer, but operational difficulties are inherent in conventional cytopherometers. This study therefore investigates the technical and operational aspects of cell electrophoresis, using the Zeiss cytopherometer; e.g. influence of electro-osmosis, focus uncertainty, movement due to convection and other sources of error. Implications and possible improvements in the test are discussed. PMID:7248145

  1. Optical RISC computer

    NASA Astrophysics Data System (ADS)

    Guilfoyle, Peter S.; Stone, Richard V.; Hessenbruch, John M.; Zeise, Frederick F.

    1993-07-01

    A second generation digital optical computer (DOC II) has been developed which utilizes a RISC based operating system as its host. This 32-bit, high-performance (12.8 GByte/sec) computing platform demonstrates a number of basic principles inherent to parallel free-space optical interconnects, such as speed (up to 10^12 bit operations per second) and low power (1.2 fJ per bit). Although DOC II is a general purpose machine, special purpose applications have been developed and are currently being evaluated on the optical platform.

  2. The Other End of the Spear: The Tooth-to-Tail Ratio (T3R) in Modern Military Operations

    DTIC Science & Technology

    2007-01-01

    Such a vehicle gave infantry much more of the firepower and survivability inherent in heavy (mechanized infantry and armored) units. Published as The Long War Series Occasional Paper 23.

  3. Modeling water resources as a constraint in electricity capacity expansion models

    NASA Astrophysics Data System (ADS)

    Newmark, R. L.; Macknick, J.; Cohen, S.; Tidwell, V. C.; Woldeyesus, T.; Martinez, A.

    2013-12-01

    In the United States, the electric power sector is the largest withdrawer of freshwater in the nation. The primary demand for water from the electricity sector is for thermoelectric power plant cooling. Areas likely to see the largest near-term growth in population and energy usage, the Southwest and the Southeast, are also facing freshwater scarcity and have experienced water-related power reliability issues in the past decade. Lack of water may become a barrier for new conventionally-cooled power plants, and alternative cooling systems will impact technology cost and performance. Although water is integral to electricity generation, it has long been neglected as a constraint in future electricity system projections. Assessing the impact of water resource scarcity on energy infrastructure development is critical, both for conventional and renewable energy technologies. Efficiently utilizing all water types, including wastewater and brackish sources, or utilizing dry-cooling technologies, will be essential for transitioning to a low-carbon electricity system. This work provides the first demonstration of a national electric system capacity expansion model that incorporates water resources as a constraint on the current and future U.S. electricity system. The Regional Electricity Deployment System (ReEDS) model was enhanced to represent multiple cooling technology types and limited water resource availability in its optimization of electricity sector capacity expansion to 2050. The ReEDS model has high geographic and temporal resolution, making it a suitable model for incorporating water resources, which are inherently seasonal and watershed-specific. Cooling system technologies were assigned varying costs (capital, operations and maintenance), and performance parameters, reflecting inherent tradeoffs in water impacts and operating characteristics. Water rights supply curves were developed for each of the power balancing regions in ReEDS. Supply curves include costs and availability of freshwater (surface and groundwater) and alternative water resources (municipal wastewater and brackish groundwater). In each region, a new power plant must secure sufficient water rights for operation before being built. Water rights constraints thus influence the type of power plant, cooling system, or location of new generating capacity. Results indicate that the aggregate national generating capacity by fuel type and associated carbon dioxide emissions change marginally with the inclusion of water rights. Water resource withdrawals and consumption, however, can vary considerably. Regional water resource dynamics indicate substantial differences in the location where power plant-cooling system technology combinations are built. These localized impacts highlight the importance of considering water resources as a constraint in the electricity sector when evaluating costs, transmission infrastructure needs, and externalities. Further scenario evaluations include assessments of how climate change could affect the availability of water resources, and thus the development of the electricity sector.
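
    The gating logic described above (a new plant must first secure sufficient water rights from a regional supply curve spanning fresh, brackish and wastewater sources) can be illustrated with a toy allocator. This is a minimal sketch with hypothetical prices, volumes and plant demand, not the ReEDS formulation:

      def procure_water(supply_curve, demand_af):
          """Try to secure `demand_af` acre-ft/yr; return (feasible, total cost, updated curve)."""
          remaining = demand_af
          cost = 0.0
          updated = []
          for unit_cost, available in supply_curve:
              take = min(available, remaining)
              cost += take * unit_cost
              remaining -= take
              updated.append((unit_cost, available - take))
          return remaining <= 1e-9, cost, updated

      region_curve = [          # (USD per acre-ft, acre-ft/yr available) -- hypothetical
          (50.0, 20_000.0),     # freshwater (surface and groundwater) rights
          (180.0, 15_000.0),    # brackish groundwater, treatment cost included
          (300.0, 40_000.0),    # municipal wastewater
      ]

      # Candidate plant: a recirculating-cooled unit needing ~3000 acre-ft/yr (hypothetical).
      ok, water_cost, region_curve = procure_water(region_curve, 3_000.0)
      print("buildable:", ok, "annual water cost: $%.0f" % water_cost)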

  4. Big data analytics for the Future Circular Collider reliability and availability studies

    NASA Astrophysics Data System (ADS)

    Begy, Volodimir; Apollonio, Andrea; Gutleber, Johannes; Martin-Marquez, Manuel; Niemi, Arto; Penttinen, Jussi-Pekka; Rogova, Elena; Romero-Marin, Antonio; Sollander, Peter

    2017-10-01

    Responding to the European Strategy for Particle Physics update 2013, the Future Circular Collider study explores scenarios of circular frontier colliders for the post-LHC era. One branch of the study assesses industrial approaches to model and simulate the reliability and availability of the entire particle collider complex based on the continuous monitoring of CERN’s accelerator complex operation. The modelling is based on an in-depth study of the CERN injector chain and LHC, and is carried out as a cooperative effort with the HL-LHC project. The work so far has revealed that a major challenge is obtaining accelerator monitoring and operational data with sufficient quality, to automate the data quality annotation and calculation of reliability distribution functions for systems, subsystems and components where needed. A flexible data management and analytics environment that permits integrating the heterogeneous data sources, the domain-specific data quality management algorithms and the reliability modelling and simulation suite is a key enabler to complete this accelerator operation study. This paper describes the Big Data infrastructure and analytics ecosystem that has been put in operation at CERN, serving as the foundation on which reliability and availability analysis and simulations can be built. This contribution focuses on data infrastructure and data management aspects and presents case studies chosen for its validation.

  5. Applications of Human Performance Reliability Evaluation Concepts and Demonstration Guidelines

    DTIC Science & Technology

    1977-03-15

    ...ship stops dead in the water and the AN/SQS-26 operator recommends a new heading (000°). At T + 14 minutes, the target ship begins a hard turn to... List-of-tables excerpts: ...Various Simulated Conditions; Human Reliability for Each Simulated Operator (Baseline Run); Human and Equipment Availability under...

  6. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Brunett, Acacia J.; Passerini, Stefano

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory (Argonne) participated in a two year collaboration to modernize and update the probabilistic risk assessment (PRA) for the PRISM sodium fast reactor. At a high level, the primary outcome of the project was the development of a next-generation PRA that is intended to enable risk-informed prioritization of safety- and reliability-focused research and development. A central Argonne task during this project was a reliability assessment of passive safety systems, which included the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedbacks of the metal fuel core. Both systems were examined utilizing a methodology derived from the Reliability Method for Passive Safety Functions (RMPS), with an emphasis on developing success criteria based on mechanistic system modeling while also maintaining consistency with the Fuel Damage Categories (FDCs) of the mechanistic source term assessment. This paper provides an overview of the reliability analyses of both systems, including highlights of the FMEAs, the construction of best-estimate models, uncertain parameter screening and propagation, and the quantification of system failure probability. In particular, special focus is given to the methodologies to perform the analysis of uncertainty propagation and the determination of the likelihood of violating FDC limits. Additionally, important lessons learned are also reviewed, such as optimal sampling methodologies for the discovery of low likelihood failure events and strategies for the combined treatment of aleatory and epistemic uncertainties.

  8. 18 CFR 292.308 - Standards for operating reliability.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... reliability. 292.308 Section 292.308 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... SMALL POWER PRODUCTION AND COGENERATION Arrangements Between Electric Utilities and Qualifying... may establish reasonable standards to ensure system safety and reliability of interconnected...

  9. Effect of read-mapping biases on detecting allele-specific expression from RNA-sequencing data

    PubMed Central

    Degner, Jacob F.; Marioni, John C.; Pai, Athma A.; Pickrell, Joseph K.; Nkadori, Everlyne; Gilad, Yoav; Pritchard, Jonathan K.

    2009-01-01

    Motivation: Next-generation sequencing has become an important tool for genome-wide quantification of DNA and RNA. However, a major technical hurdle lies in the need to map short sequence reads back to their correct locations in a reference genome. Here, we investigate the impact of SNP variation on the reliability of read-mapping in the context of detecting allele-specific expression (ASE). Results: We generated 16 million 35 bp reads from mRNA of each of two HapMap Yoruba individuals. When we mapped these reads to the human genome we found that, at heterozygous SNPs, there was a significant bias toward higher mapping rates of the allele in the reference sequence, compared with the alternative allele. Masking known SNP positions in the genome sequence eliminated the reference bias but, surprisingly, did not lead to more reliable results overall. We find that even after masking, ∼5–10% of SNPs still have an inherent bias toward more effective mapping of one allele. Filtering out inherently biased SNPs removes 40% of the top signals of ASE. The remaining SNPs showing ASE are enriched in genes previously known to harbor cis-regulatory variation or known to show uniparental imprinting. Our results have implications for a variety of applications involving detection of alternate alleles from short-read sequence data. Availability: Scripts, written in Perl and R, for simulating short reads, masking SNP variation in a reference genome and analyzing the simulation output are available upon request from JFD. Raw short read data were deposited in GEO (http://www.ncbi.nlm.nih.gov/geo/) under accession number GSE18156. Contact: jdegner@uchicago.edu; marioni@uchicago.edu; gilad@uchicago.edu; pritch@uchicago.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19808877

  10. Estimating the production, consumption and export of cannabis: The Dutch case.

    PubMed

    van der Giessen, Mark; van Ooyen-Houben, Marianne M J; Moolenaar, Debora E G

    2016-05-01

    Quantifying an illegal phenomenon like a drug market is inherently complex due to its hidden nature and the limited availability of reliable information. This article presents findings from a recent estimate of the production, consumption and export of Dutch cannabis and discusses the opportunities provided by, and limitations of, mathematical models for estimating the illegal cannabis market. The data collection consisted of a comprehensive literature study, secondary analyses on data from available registrations (2012-2014) and previous studies, and expert opinion. The cannabis market was quantified with several mathematical models. The data analysis included a Monte Carlo simulation to come to a 95% interval estimate (IE) and a sensitivity analysis to identify the most influential indicators. The annual production of Dutch cannabis was estimated to be between 171 and 965 tons (95% IE of 271-613 tons). The consumption was estimated to be between 28 and 119 tons, depending on the inclusion or exclusion of non-residents (95% IE of 51-78 tons or 32-49 tons respectively). The export was estimated to be between 53 and 937 tons (95% IE of 206-549 tons or 231-573 tons, respectively). Mathematical models are valuable tools for the systematic assessment of the size of illegal markets and determining the uncertainty inherent in the estimates. The estimates required the use of many assumptions and the availability of reliable indicators was limited. This uncertainty is reflected in the wide ranges of the estimates. The estimates are sensitive to 10 of the 45 indicators. These 10 account for 86-93% of the variation found. Further research should focus on improving the variables and the independence of the mathematical models. Copyright © 2016 Elsevier B.V. All rights reserved.
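
    The interval-estimate mechanics can be sketched with a small Monte Carlo simulation: sample the uncertain indicators, propagate them through the market equations, and report the 2.5th and 97.5th percentiles. The distributions below are hypothetical stand-ins (the study's 45 indicators are not reproduced here), chosen only to land in roughly the same order of magnitude as the figures quoted above:

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000  # Monte Carlo draws

      # Hypothetical stand-in distributions for a few aggregate indicators.
      plants_millions = rng.triangular(8, 15, 25, n)        # plants harvested per year, millions
      yield_g_per_plant = rng.triangular(20, 28, 38, n)     # grams of saleable product per plant
      consumption_tons = rng.triangular(30, 55, 115, n)     # domestic consumption, tons/year

      production_tons = plants_millions * yield_g_per_plant  # (1e6 plants * g) equals tons
      export_tons = production_tons - consumption_tons

      lo, hi = np.percentile(export_tons, [2.5, 97.5])
      print(f"export: median {np.median(export_tons):.0f} t, 95% interval estimate {lo:.0f}-{hi:.0f} t")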

  11. The Use of Race-Related Variables in Counseling Research

    ERIC Educational Resources Information Center

    Strom, Thad Q.; Lee, D. John; Trahan, Emily; Kaufman, Aimee; Pritchett, Tiffany

    2009-01-01

    This study provides a detailed analysis of all race-related articles published in prominent counseling journals between 1995 and 2004. Findings indicate that 75% of articles did not define race variables and in the absence of an operational definition, authors tended to conceptualize race as an inherent biological variable. (Contains 3 tables.)

  12. Developing an aviation exposure index to inform risk-based fire management decisions

    Treesearch

    Crystal S. Stonesifer; David E. Calkin; Matthew P. Thompson; Jeffrey D. Kaiden

    2014-01-01

    Wildland firefighting is an inherently dangerous activity, and aviation-related accidents in particular comprise a large share of firefighter fatalities. Due to limited understanding of operational factors that lead to aviation accidents, it is unclear how local decisionmakers, responsible for requesting aviation support, can mitigate the risk of an aviation accident...

  13. Assessing bioenergy harvest risks: Geospatially explicit tools for maintaining soil productivity in western US forests

    Treesearch

    Mark Kimsey; Deborah Page-Dumroese; Mark Coleman

    2011-01-01

    Biomass harvesting for energy production and forest health can impact the soil resource by altering inherent chemical, physical and biological properties. These impacts raise concern about damaging sensitive forest soils, even with the prospect of maintaining vigorous forest growth through biomass harvesting operations. Current forest biomass harvesting research...

  14. Green chemistry for chemical synthesis

    PubMed Central

    Li, Chao-Jun; Trost, Barry M.

    2008-01-01

    Green chemistry for chemical synthesis addresses our future challenges in working with chemical processes and products by inventing novel reactions that can maximize the desired products and minimize by-products, designing new synthetic schemes and apparati that can simplify operations in chemical productions, and seeking greener solvents that are inherently environmentally and ecologically benign. PMID:18768813

  15. Policies and Practices in the Bibliographic Control of United States Government Publications.

    ERIC Educational Resources Information Center

    Crowers, Clifford P., Ed.

    1974-01-01

    In an attempt to clarify the indexing and announcing controls for government documents, this issue of the Drexel Library Quarterly presents background information on several of the information controlling and access agencies, describes their operations, and points out their inherent problems and weaknesses. The agencies covered are the Government…

  16. Social Reproduction in Non-Formal Adult Education: The Case of Rural Mozambique

    ERIC Educational Resources Information Center

    Straubhaar, Rolf

    2014-01-01

    Using fieldnotes from the non-formal adult education classes run by a non-profit international education organization with ground operations in rural Mozambique, this article documents how the comments made by class facilitators and class participants in those classes reflect inherent power inequalities between non-profit staff and local participants. These…

  17. Management of sickle cell disease in patients undergoing cardiac surgery.

    PubMed

    Crawford, Todd C; Carter, Michael V; Patel, Rina K; Suarez-Pierre, Alejandro; Lin, Sophie Z; Magruder, Jonathan Trent; Grimm, Joshua C; Cameron, Duke E; Baumgartner, William A; Mandal, Kaushik

    2017-02-01

    Sickle cell disease is a life-limiting inherited hemoglobinopathy that poses inherent risk for surgical complications following cardiac operations. In this review, we discuss preoperative considerations, intraoperative decision-making, and postoperative strategies to optimize the care of a patient with sickle cell disease undergoing cardiac surgery. © 2017 Wiley Periodicals, Inc.

  18. A New Variable Weighting and Selection Procedure for K-Means Cluster Analysis

    ERIC Educational Resources Information Center

    Steinley, Douglas; Brusco, Michael J.

    2008-01-01

    A variance-to-range ratio variable weighting procedure is proposed. We show how this weighting method is theoretically grounded in the inherent variability found in data exhibiting cluster structure. In addition, a variable selection procedure is proposed to operate in conjunction with the variable weighting technique. The performances of these…

  19. Single-stage interpolation flaps in facial reconstruction.

    PubMed

    Hollmig, S Tyler; Leach, Brian C; Cook, Joel

    2014-09-01

    Relatively deep and complex surgical defects, particularly when adjacent to or involving free margins, present significant reconstructive challenges. When the use of local flaps is precluded by native anatomic restrictions, interpolation flaps may be modified to address these difficult wounds in a single operative session. To provide a framework to approach difficult soft tissue defects arising near or involving free margins and to demonstrate appropriate design and execution of single-stage interpolation flaps for reconstruction of these wounds. Examination of our utilization of these flaps based on an anatomic region and surgical approach. A region-based demonstration of flap conceptualization, design, and execution is provided. Tunneled, transposed, and deepithelialized variations of single-stage interpolation flaps provide versatile options for reconstruction of a variety of defects encroaching on or involving free margins. The inherently robust vascularity of these flaps supports importation of necessary tissue bulk while allowing aggressive contouring to restore an intricate native topography. Critical flap design allows access to distant tissue reservoirs and placement of favorable incision lines while preserving the inherent advantages of a single operative procedure.

  20. Validity, Reliability, and Performance Determinants of a New Job-Specific Anaerobic Work Capacity Test for the Norwegian Navy Special Operations Command.

    PubMed

    Angeltveit, Andreas; Paulsen, Gøran; Solberg, Paul A; Raastad, Truls

    2016-02-01

    Operators in Special Operations Forces (SOF) have a particularly demanding profession where physical and psychological capacities can be challenged to the extremes. The diversity of physical capacities needed depends on the mission. Consequently, tests used to monitor SOF operators' physical fitness should cover a broad range of physical capacities. Whereas tests for strength and aerobic endurance are established, there is no test for specific anaerobic work capacity described in the literature. The purpose of this study was therefore to evaluate the reliability and validity of, and to identify the performance determinants of, a new test developed for testing specific anaerobic work capacity in SOF operators. Nineteen active young students were included in the concurrent validity part of the study. The students performed the evacuation (EVAC) test 3 times and the results were compared for reliability and with performance in the Wingate cycle test, 300-m sprint, and a maximal accumulated oxygen deficit (MAOD) test. In part II of the study, 21 Norwegian Navy Special Operations Command operators conducted the EVAC test, anthropometric measurements, a dual x-ray absorptiometry scan, leg press, isokinetic knee extensions, maximal oxygen uptake test, and countermovement jump (CMJ) test. The EVAC test showed good reliability after 1 familiarization trial (intraclass correlation = 0.89; coefficient of variation = 3.7%). The EVAC test correlated well with the Wingate test (r = -0.68), 300-m sprint time (r = 0.51), and 300-m mean power (W) (r = -0.67). No significant correlation was found with the MAOD test. In part II of the study, height, body mass, lean body mass, isokinetic knee extension torque, maximal oxygen uptake, and maximal power in a CMJ were significantly correlated with performance in the EVAC test. The EVAC test is a reliable and valid test for anaerobic work capacity for SOF operators, and muscle mass, leg strength, and leg power seem to be the most important determinants of performance.

  1. Reliability and minimal detectable change of physical performance measures in individuals with pre-manifest and manifest Huntington disease.

    PubMed

    Quinn, Lori; Khalil, Hanan; Dawes, Helen; Fritz, Nora E; Kegelmeyer, Deb; Kloos, Anne D; Gillard, Jonathan W; Busse, Monica

    2013-07-01

    Clinical intervention trials in people with Huntington disease (HD) have been limited by a lack of reliable and appropriate outcome measures. The purpose of this study was to determine the reliability and minimal detectable change (MDC) of various outcome measures that are potentially suitable for evaluating physical functioning in individuals with HD. This was a multicenter, prospective, observational study. Participants with pre-manifest and manifest HD (early, middle, and late stages) were recruited from 8 international sites to complete a battery of physical performance and functional measures at 2 assessments, separated by 1 week. Test-retest reliability (using intraclass correlation coefficients) and MDC values were calculated for all measures. Seventy-five individuals with HD (mean age=52.12 years, SD=11.82) participated in the study. Test-retest reliability was very high (>.90) for participants with manifest HD for the Six-Minute Walk Test (6MWT), 10-Meter Walk Test, Timed "Up & Go" Test (TUG), Berg Balance Scale (BBS), Physical Performance Test (PPT), Barthel Index, Rivermead Mobility Index, and Tinetti Mobility Test (TMT). Many MDC values suggested a relatively high degree of inherent variability, particularly in the middle stage of HD. Minimum detectable change values for participants with manifest HD that were relatively low across disease stages were found for the BBS (5), PPT (5), and TUG (2.98). For individuals with pre-manifest HD (n=11), the 6MWT and Four Square Step Test had high reliability and low MDC values. The sample size for the pre-manifest HD group was small. The BBS, PPT, and TUG appear most appropriate for clinical trials aimed at improving physical functioning in people with manifest HD. Further research in people with pre-manifest HD is necessary.
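
    For readers unfamiliar with the two statistics, a minimal sketch of how a test-retest ICC and MDC are typically computed follows. The data are synthetic, and the SEM convention shown (SD × sqrt(1 − ICC), with MDC95 = 1.96 × sqrt(2) × SEM) is one common choice rather than necessarily the one used in this study:

      import numpy as np

      def icc_2_1(x):
          """Two-way random effects, absolute agreement, single measure ICC(2,1).
          x: (n_subjects, k_sessions) array."""
          n, k = x.shape
          grand = x.mean()
          ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
          ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)
          resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0, keepdims=True) + grand
          ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
          return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

      # Synthetic test-retest data, e.g. a timed mobility score (seconds), 20 subjects x 2 sessions.
      rng = np.random.default_rng(1)
      true_score = rng.normal(12.0, 3.0, 20)
      sessions = np.column_stack([true_score + rng.normal(0, 1.0, 20),
                                  true_score + rng.normal(0, 1.0, 20)])

      icc = icc_2_1(sessions)
      sem = sessions.std(ddof=1) * np.sqrt(1.0 - icc)   # one common SEM convention
      mdc95 = 1.96 * np.sqrt(2.0) * sem
      print(f"ICC(2,1)={icc:.2f}  SEM={sem:.2f} s  MDC95={mdc95:.2f} s")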

  2. Measurement of fetal head descent using the 'angle of progression' on transperineal ultrasound imaging is reliable regardless of fetal head station or ultrasound expertise.

    PubMed

    Dückelmann, A M; Bamberg, C; Michaelis, S A M; Lange, J; Nonnenmacher, A; Dudenhausen, J W; Kalache, K D

    2010-02-01

    To assess whether ultrasound experience or fetal head station affects the reliability of measurement of fetal head descent using the angle of progression on intrapartum ultrasound images obtained by a single experienced operator, and to determine reliability of measurements when images were acquired by different operators with variable ultrasound experience. One experienced obstetrician performed 44 transperineal ultrasound examinations of women at term and in prolonged second stage of labor with the fetus in the occipitoanterior position. Three midwives without ultrasound experience, three obstetricians with < 5 years' experience and three obstetricians with > 10 years' experience measured fetal head descent based on the angle of progression in the images obtained. The angle of progression was measured by two obstetricians in independent ultrasound examinations of 24 laboring women at term with the fetus in the cephalic position to allow assessment of the reliability of image acquisition. Intraclass correlation coefficients (ICCs) with 95% confidence interval (CI) were used to evaluate interobserver reliability and Bland-Altman analysis was used to assess interobserver agreement. In total, 444 measurements were performed and compared. Interobserver reliability with respect to offline image analysis was substantial (overall ICC, 0.72; 95% CI, 0.63-0.81). ICCs were 0.82 (95% CI, 0.70-0.89), 0.81 (95% CI, 0.71-0.88) and 0.61 (95% CI, 0.43-0.74) for observers with > 10 years', < 5 years' and no ultrasound experience, respectively. There were no significant differences between ICCs among observer groups according to ultrasound experience. Fetal head station did not affect reliability. Bland-Altman analysis indicated reasonable agreement between measurements obtained by two different operators with > 10 years' and < 5 years' ultrasound experience (bias, -1.09 degrees; 95% limits of agreement, -8.76 to 6.58). The reliability of measurement of the angle of progression following separate image acquisition by two experienced operators was similar to the reliability of offline image analysis (ICC, 0.86; 95% CI, 0.70-0.93). Measurement of the angle of progression on transperineal ultrasound imaging is reliable regardless of fetal head station or the clinician's level of ultrasound experience.
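
    The Bland-Altman summary used above reduces to the mean of the paired differences (the bias) and bias ± 1.96 × SD of the differences (the 95% limits of agreement); a minimal sketch on synthetic paired angle readings:

      import numpy as np

      def bland_altman(a, b):
          """Bias and 95% limits of agreement between two sets of paired measurements."""
          diff = np.asarray(a, float) - np.asarray(b, float)
          bias = diff.mean()
          spread = 1.96 * diff.std(ddof=1)
          return bias, (bias - spread, bias + spread)

      # Synthetic paired angle-of-progression readings (degrees) from two observers.
      rng = np.random.default_rng(2)
      truth = rng.uniform(95, 150, 24)
      obs_a = truth + rng.normal(0, 3, 24)
      obs_b = truth + rng.normal(-1, 3, 24)   # observer B reads ~1 degree lower on average

      bias, (lo, hi) = bland_altman(obs_a, obs_b)
      print(f"bias {bias:.2f} deg, 95% limits of agreement {lo:.2f} to {hi:.2f} deg")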

  3. Reliability and economy -- Hydro electricity for Iran

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jahromi-Shirazi, M.J.; Zarbakhsh, M.H.

    1998-12-31

    Reliability is the probability that a device or system will perform its function adequately, for the period of time intended, under the operating conditions intended. Reliability and economy are two important factors in operating any system, especially in power generation. Due to the high rate of population growth in Iran, experts have estimated that the demand for electricity will be about 63,000 MW in the next 25 years; the installed capacity is now about 26,000 MW. Therefore, the energy policy decision made in Iran is to pursue power generation by hydroelectric plants because of reliability, the availability of water resources and the economics of hydroelectric power.

  4. Reliability of Phase Velocity Measurements of Flexural Acoustic Waves in the Human Tibia In-Vivo.

    PubMed

    Vogl, Florian; Schnüriger, Karin; Gerber, Hans; Taylor, William R

    2016-01-01

    Axial-transmission acoustics have been shown to be a promising technique to measure individual bone properties and detect bone pathologies. With the ultimate goal being the in-vivo application of such systems, quantification of the key aspects governing the reliability is crucial to bring this method towards clinical use. This work presents a systematic reliability study quantifying the sources of variability in in-vivo measurements, and their magnitudes, using axial-transmission acoustics. 42 healthy subjects were measured by an experienced operator twice per week, over a four-month period, resulting in over 150000 wave measurements. In a complementary study to assess the influence of different operators performing the measurements, 10 novice operators were trained, and each measured 5 subjects on a single occasion, using the same measurement protocol as in the first part of the study. The estimated standard error for the measurement protocol used to collect the study data was ∼ 17 m/s (∼ 4% of the grand mean) and the index of dependability, as a measure of reliability, was Φ = 0.81. It was shown that the method is suitable for multi-operator use and that the reliability can be improved efficiently by additional measurements with device repositioning, while additional measurements without repositioning cannot improve the reliability substantially. Phase velocity values were found to be significantly higher in males than in females (p < 10^-5) and an intra-class correlation coefficient of r = 0.70 was found between the legs of each subject. The high reliability of this non-invasive approach and its intrinsic sensitivity to mechanical properties opens perspectives for the rapid and inexpensive clinical assessment of bone pathologies, as well as for monitoring programmes without any radiation exposure for the patient.
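
    The design trade-off reported above (extra measurements help mainly when the probe is repositioned) can be illustrated with a simple variance-component calculation: the repositioning component of the absolute error shrinks only with the number of probe positions, not with repeats at a fixed position, and the index of dependability is Φ = σ²_subject / (σ²_subject + absolute error variance). The variance components below are hypothetical, chosen only to be in the same ballpark as the figures quoted above:

      # Hypothetical variance components (m/s)^2 for a phase-velocity protocol:
      var_subject = 900.0       # true between-subject variance
      var_reposition = 250.0    # variance introduced each time the probe is repositioned
      var_residual = 150.0      # wave-to-wave measurement noise

      def protocol_error(n_positions, n_repeats_per_position):
          """Absolute error variance when averaging the whole protocol."""
          return (var_reposition / n_positions
                  + var_residual / (n_positions * n_repeats_per_position))

      def dependability(n_positions, n_repeats_per_position):
          err = protocol_error(n_positions, n_repeats_per_position)
          return var_subject / (var_subject + err), err ** 0.5  # (phi, standard error in m/s)

      for positions, repeats in [(1, 5), (1, 50), (3, 5), (6, 5)]:
          phi, se = dependability(positions, repeats)
          print(f"{positions} position(s) x {repeats} repeats: phi={phi:.2f}, SE={se:.1f} m/s")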

  5. Factors that Affect Operational Reliability of Turbojet Engines

    NASA Technical Reports Server (NTRS)

    1956-01-01

    The problem of improving operational reliability of turbojet engines is studied in a series of papers. Failure statistics for this engine are presented, the theory and experimental evidence on how engine failures occur are described, and the methods available for avoiding failure in operation are discussed. The individual papers of the series are Objectives, Failure Statistics, Foreign-Object Damage, Compressor Blades, Combustor Assembly, Nozzle Diaphragms, Turbine Buckets, Turbine Disks, Rolling Contact Bearings, Engine Fuel Controls, and Summary Discussion.

  6. Operation of Reliability Analysis Center (FY88)

    DTIC Science & Technology

    1989-10-01

    Contents excerpts: 4.1 Current Projects; 4.2 Completed Projects; 6.0 Financial Summary FY88; 7.0 Information from IAC Users; 7.1 User Feedback on IAC Services. Text excerpts: Operating expenditures for carrying out the Reliability Analysis Center's on-going operational functions and satisfying... Because the RAC does not stand to benefit from either a favorable or unfavorable appraisal of any contractor's design, an unbiased analysis can result.

  7. Spaceflight tracking and data network operational reliability assessment for Skylab

    NASA Technical Reports Server (NTRS)

    Seneca, V. I.; Mlynarczyk, R. H.

    1974-01-01

    Data on the spaceflight communications equipment status during the Skylab mission were subjected to an operational reliability assessment. Reliability models were revised to reflect pertinent equipment changes accomplished prior to the beginning of the Skylab missions. Appropriate adjustments were made to fit the data to the models. The availabilities are based on the failure events resulting in the station's inability to support a function or functions, and the MTBFs are based on all events, including 'can support' and 'cannot support'. Data were received from eleven land-based stations and one ship.

  8. Sensitivity analysis by approximation formulas - Illustrative examples. [reliability analysis of six-component architectures

    NASA Technical Reports Server (NTRS)

    White, A. L.

    1983-01-01

    This paper examines the reliability of three architectures for six components. For each architecture, the probabilities of the failure states are given by algebraic formulas involving the component fault rate, the system recovery rate, and the operating time. The dominant failure modes are identified, and the change in reliability is considered with respect to changes in fault rate, recovery rate, and operating time. The major conclusions concern the influence of system architecture on failure modes and parameter requirements. Without this knowledge, a system designer may pick an inappropriate structure.
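
    The record's architecture-specific failure-state formulas are not reproduced here, but the sensitivity calculation itself is generic: evaluate the failure probability at perturbed parameter values and form normalized sensitivities (elasticities). The sketch below applies central differences to a stand-in failure-probability expression with hypothetical parameter values:

      import math

      def p_fail(fault_rate, recovery_rate, hours):
          """Stand-in failure-probability formula for a redundant architecture
          (NOT the paper's formulas): the system is lost if a second fault arrives
          during the mean recovery window of a first fault."""
          return 1.0 - math.exp(-3.0 * fault_rate * hours * (2.0 * fault_rate / recovery_rate))

      def elasticity(f, params, name, rel_step=1e-4):
          """d(ln f)/d(ln param) by central differences."""
          lo, hi = dict(params), dict(params)
          lo[name] *= (1.0 - rel_step)
          hi[name] *= (1.0 + rel_step)
          return (math.log(f(**hi)) - math.log(f(**lo))) / (2.0 * rel_step)

      base = {"fault_rate": 1e-4, "recovery_rate": 3600.0, "hours": 10.0}  # hypothetical values
      for p in base:
          print(f"elasticity of P(fail) w.r.t. {p}: {elasticity(p_fail, base, p):+.2f}")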

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Logan, Jeffrey S.; Paranhos, Elizabeth; Kozak, Tracy G.

    This study focuses on onshore natural gas operations and examines the extent to which oil and gas firms have embraced certain organizational characteristics that lead to 'high reliability' - understood here as strong safety and reliability records over extended periods of operation. The key questions that motivated this study include whether onshore oil and gas firms engaged in exploration and production (E&P) and midstream (i.e., natural gas transmission and storage) are implementing practices characteristic of high reliability organizations (HROs) and the extent to which any such practices are being driven by industry innovations and standards and/or regulatory requirements.

  10. 76 FR 23470 - Version One Regional Reliability Standard for Transmission Operations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-27

    ... (BPA), WECC, and San Diego Gas & Electric Company (SDG&E). II. Discussion 16. The Commission approves... unknown operating state. 23. Similarly, BPA states that it is unnecessary to carry over from TOP-STD-007-0... as TOP-004-2. BPA also notes that the continent-wide Reliability Standard, TOP-007-0, does not...

  11. Evaluating the Impact of the 2017 Solar Eclipse on U.S. Western Interconnection Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veda, Santosh; Zhang, Yingchen; Tan, Jin

    With support from the U.S. Department of Energy (DOE) Solar Energy Technologies Office (SETO), the National Renewable Energy Laboratory (NREL) partnered with Peak Reliability to evaluate the impact of the August 21, 2017 total solar eclipse on the reliability and grid operations in the Western Electricity Coordinating Council (WECC) territory.

  12. An assessment of thermodynamic merits for current and potential future engine operating strategies

    DOE PAGES

    Wissink, Martin L.; Splitter, Derek A.; Dempsey, Adam B.; ...

    2017-02-01

    The present work compares the fundamental thermodynamic underpinnings (i.e., working fluid properties and heat release profile) of various combustion strategies with engine measurements. The approach employs a model that separately tracks the impacts on efficiency due to differences in rate of heat addition, volume change, mass addition, and molecular weight change for a given combination of working fluid, heat release profile, and engine geometry. Comparative analysis between measured and modeled efficiencies illustrates fundamental sources of efficiency reductions or opportunities inherent to various combustion regimes. Engine operating regimes chosen for analysis include stoichiometric spark-ignited combustion and lean compression-ignited combustion including HCCI, SA-HCCI, RCCI, GCI, and CDC. Within each combustion regime, effects such as engine load, combustion duration, combustion phasing, combustion chamber geometry, fuel properties, and charge dilution are explored. Model findings illustrate that even in the absence of losses such as heat transfer or incomplete combustion, the maximum possible thermal efficiency inherent to each operating strategy varies to a significant degree. Additionally, the experimentally measured losses are observed to be unique within a given operating strategy. The findings highlight the fact that in order to create a roadmap for future directions in ICE technologies, it is important to not only compare the absolute real-world efficiency of a given combustion strategy, but to also examine the measured efficiency in context of what is thermodynamically possible with the working fluid and boundary conditions prescribed by a strategy.

  13. An assessment of thermodynamic merits for current and potential future engine operating strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wissink, Martin L.; Splitter, Derek A.; Dempsey, Adam B.

    The present work compares the fundamental thermodynamic underpinnings (i.e., working fluid properties and heat release profile) of various combustion strategies with engine measurements. The approach employs a model that separately tracks the impacts on efficiency due to differences in rate of heat addition, volume change, mass addition, and molecular weight change for a given combination of working fluid, heat release profile, and engine geometry. Comparative analysis between measured and modeled efficiencies illustrates fundamental sources of efficiency reductions or opportunities inherent to various combustion regimes. Engine operating regimes chosen for analysis include stoichiometric spark-ignited combustion and lean compression-ignited combustion including HCCI, SA-HCCI, RCCI, GCI, and CDC. Within each combustion regime, effects such as engine load, combustion duration, combustion phasing, combustion chamber geometry, fuel properties, and charge dilution are explored. Model findings illustrate that even in the absence of losses such as heat transfer or incomplete combustion, the maximum possible thermal efficiency inherent to each operating strategy varies to a significant degree. Additionally, the experimentally measured losses are observed to be unique within a given operating strategy. The findings highlight the fact that in order to create a roadmap for future directions in ICE technologies, it is important to not only compare the absolute real-world efficiency of a given combustion strategy, but to also examine the measured efficiency in context of what is thermodynamically possible with the working fluid and boundary conditions prescribed by a strategy.

  14. Improving the Reliability and Modal Stability of High Power 870 nm AlGaAs CSP Laser Diodes for Applications to Free Space Communication Systems

    NASA Technical Reports Server (NTRS)

    Connolly, J. C.; Alphonse, G. A.; Carlin, D. B.; Ettenberg, M.

    1991-01-01

    The operating characteristics (power-current, beam divergence, etc.) and reliability assessment of high-power CSP lasers are discussed. The emission wavelength of these lasers was optimized at 860 to 880 nm. The operational characteristics of a new laser, the inverse channel substrate planar (ICSP) laser, grown by metalorganic chemical vapor deposition (MOCVD), are discussed, and the reliability assessment of this laser is reported. The highlights of this study include a reduction in the threshold current value for the laser to 15 mA and a degradation rate of less than 2 kW/hr for the lasers operating at 60 mW of peak output power.

  15. On-orbit spacecraft reliability

    NASA Technical Reports Server (NTRS)

    Bloomquist, C.; Demars, D.; Graham, W.; Henmi, P.

    1978-01-01

    Operational and historic data for 350 spacecraft from 52 U.S. space programs were analyzed for on-orbit reliability. Failure rate estimates are made for on-orbit operation of spacecraft subsystems, components, and piece parts, as well as estimates of failure probability for the same elements during launch. Confidence intervals for both parameters are also given. The results indicate that: (1) the success of spacecraft operation is only slightly affected by most reported incidents of anomalous behavior; (2) the occurrence of the majority of anomalous incidents could have been prevented prior to launch; (3) no detrimental effect of spacecraft dormancy is evident; (4) cycled components in general are not demonstrably less reliable than uncycled components; and (5) application of product assurance elements is conducive to spacecraft success.

  16. Inter- and intra-operator reliability and repeatability of shear wave elastography in the liver: a study in healthy volunteers.

    PubMed

    Hudson, John M; Milot, Laurent; Parry, Craig; Williams, Ross; Burns, Peter N

    2013-06-01

    This study assessed the reproducibility of shear wave elastography (SWE) in the liver of healthy volunteers. Intra- and inter-operator reliability and repeatability were quantified in three different liver segments in a sample of 15 subjects, scanned during four independent sessions (two scans on day 1, two scans 1 wk later) by two operators. A total of 1440 measurements were made. Reproducibility was assessed using the intra-class correlation coefficient (ICC) and a repeated measures analysis of variance. The shear wave speed was measured and used to estimate Young's modulus using the SuperSonic Imagine Aixplorer. The median Young's modulus measured through the inter-costal space was 5.55 ± 0.74 kPa. The intra-operator reliability was better for same-day evaluations (ICC = 0.91) than the inter-operator reliability (ICC = 0.78). Intra-observer agreement decreased when scans were repeated on a different day. Inter-session repeatability was between 3.3% and 9.9% for intra-day repeated scans, compared with 6.5%-12% for inter-day repeated scans. No significant difference was observed in subjects with a body mass index greater or less than 25 kg/m2. Copyright © 2013 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  17. APPLICATION OF TRAVEL TIME RELIABILITY FOR PERFORMANCE ORIENTED OPERATIONAL PLANNING OF EXPRESSWAYS

    NASA Astrophysics Data System (ADS)

    Mehran, Babak; Nakamura, Hideki

    Evaluation of the impacts of congestion improvement schemes on travel time reliability is very significant for road authorities, since travel time reliability represents the operational performance of expressway segments. In this paper, a methodology is presented to estimate travel time reliability prior to implementation of congestion relief schemes, based on modeling travel time variation as a function of demand, capacity, weather conditions and road accidents. For subject expressway segments, traffic conditions are modeled over a whole year considering demand and capacity as random variables. Patterns of demand and capacity are generated for each five-minute interval by applying a Monte Carlo simulation technique, and accidents are randomly generated based on a model that links accident rate to traffic conditions. A whole-year analysis is performed by comparing demand and available capacity for each scenario, and queue length is estimated through shockwave analysis for each time interval. Travel times are estimated from refined speed-flow relationships developed for intercity expressways, and the buffer time index is consequently estimated as a measure of travel time reliability. For validation, estimated reliability indices are compared with measured values from empirical data, and it is shown that the proposed method is suitable for operational evaluation and planning purposes.
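
    The reliability index itself is straightforward once travel times have been simulated: the buffer time index is (95th-percentile travel time − mean travel time) / mean travel time. The sketch below is a toy stand-in for the paper's method (no shockwave analysis or refined speed-flow relationships), using a BPR-style delay function and hypothetical demand, capacity and incident parameters:

      import numpy as np

      rng = np.random.default_rng(3)
      n = 105_120                      # five-minute intervals in one year
      free_flow_min = 30.0             # free-flow travel time of the segment, minutes (hypothetical)

      demand = rng.normal(1700, 350, n).clip(200)          # veh/h/lane, hypothetical
      capacity = np.where(rng.random(n) < 0.02,            # weather/incident capacity drops
                          rng.uniform(900, 1400, n),
                          rng.normal(2000, 100, n))

      # Toy BPR-style delay as a stand-in for the refined speed-flow relationships.
      vc = demand / capacity
      travel_time = free_flow_min * (1.0 + 0.15 * vc ** 4)

      mean_tt = travel_time.mean()
      tt95 = np.percentile(travel_time, 95)
      buffer_time_index = (tt95 - mean_tt) / mean_tt
      print(f"mean {mean_tt:.1f} min, 95th pct {tt95:.1f} min, buffer time index {buffer_time_index:.2f}")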

  18. Reliability modelling and analysis of thermal MEMS

    NASA Astrophysics Data System (ADS)

    Muratet, Sylvaine; Lavu, Srikanth; Fourniols, Jean-Yves; Bell, George; Desmulliez, Marc P. Y.

    2006-04-01

    This paper presents a MEMS reliability study methodology based on the novel concept of 'virtual prototyping'. This methodology can be used for the development of reliable sensors or actuators and also to characterize their behaviour in specific use conditions and applications. The methodology is demonstrated on the U-shaped micro electro thermal actuator used as a test vehicle. To demonstrate this approach, a 'virtual prototype' has been developed with the modeling tools MATLAB and VHDL-AMS. A best-practice FMEA (Failure Mode and Effect Analysis) is applied to the thermal MEMS to investigate and assess the failure mechanisms. The reliability study is performed by injecting the identified faults into the 'virtual prototype'. The reliability characterization methodology predicts the evolution of the behavior of these MEMS as a function of the number of cycles of operation and specific operational conditions.
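
    A drastically simplified stand-in for the fault-injection idea: a toy linear model of actuator deflection with one hypothetical FMEA-identified degradation mechanism (hot-arm resistance drift per actuation cycle) injected, run until the response leaves a tolerance band. Neither the model nor the numbers come from the paper's MATLAB/VHDL-AMS prototype:

      # Toy stand-in for the 'virtual prototype' idea: a simplified electrothermal
      # actuator model with a single injected degradation mechanism. The model and
      # numbers are hypothetical.

      NOMINAL_R = 1_000.0          # hot-arm resistance, ohms
      DRIVE_V = 5.0                # drive voltage, volts
      GAIN_UM_PER_MW = 0.8         # deflection per milliwatt of dissipated power (toy linear model)
      DRIFT_PER_CYCLE = 2e-6       # injected fault: fractional resistance increase per cycle
      TOLERANCE_UM = 1.0           # allowed deviation from the nominal deflection

      def deflection_um(resistance_ohm):
          power_mw = (DRIVE_V ** 2 / resistance_ohm) * 1e3
          return GAIN_UM_PER_MW * power_mw

      nominal = deflection_um(NOMINAL_R)
      resistance = NOMINAL_R
      cycles = 0
      while abs(deflection_um(resistance) - nominal) < TOLERANCE_UM:
          resistance *= (1.0 + DRIFT_PER_CYCLE)
          cycles += 1

      print(f"deflection leaves the +/-{TOLERANCE_UM} um band after ~{cycles} cycles")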

  19. Key Residential Building Equipment Technologies for Control and Grid Support PART I (Residential)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Starke, Michael R; Onar, Omer C; DeVault, Robert C

    2011-09-01

    Electrical energy consumption of the residential sector is a crucial area of research that has in the past primarily focused on increasing the efficiency of household devices such as water heaters, dishwashers, air conditioners, and clothes washer and dryer units. However, the focus of this research is shifting as objectives such as developing the smart grid and ensuring that the power system remains reliable come to the fore, along with the increasing need to reduce energy use and costs. Load research has started to focus on mechanisms to support the power system through demand reduction and/or reliability services. The power system relies on matching generation and load, and day-ahead and real-time energy markets capture most of this need. However, a separate set of grid services exists to address the discrepancies in load and generation arising from contingencies and operational mismatches, and to ensure that the transmission system is available for delivery of power from generation to load. Currently, these grid services are mostly provided by generation resources. The addition of renewable resources with their inherent variability can complicate the issue of power system reliability and lead to the increased need for grid services. Using load as a resource, through demand response programs, can fill the additional need for flexible resources and even reduce costly energy peaks. Loads have been shown to have response that is equal to or better than generation in some cases. Furthermore, price-incentivized demand response programs have been shown to reduce the peak energy requirements, thereby affecting the wholesale market efficiency and overall energy prices. The residential sector is not only the largest consumer of electrical energy in the United States, but also has the highest potential to provide demand reduction and power system support, as technological advancements in load control, sensor technologies, and communication are made. The prevailing loads based on the largest electrical energy consumers in the residential sector are space heating and cooling, washer and dryer, water heating, lighting, computers and electronics, dishwasher and range, and refrigeration. As the largest loads, these loads provide the highest potential for delivering demand response and reliability services. Many residential loads have inherent flexibility that is related to the purpose of the load. Depending on the load type, electric power consumption levels can either be ramped, changed in a step-change fashion, or completely removed. Loads with only on-off capability (such as clothes washers and dryers) provide less flexibility than resources that can be ramped or step-changed. Add-on devices may be able to provide extra demand response capabilities. Still, operating residential loads effectively requires awareness of the delicate balance of occupants' health and comfort and electrical energy consumption. This report is Phase I of a series of reports aimed at identifying gaps in automated home energy management systems for incorporation of building appliances, vehicles, and renewable adoption into a smart grid, specifically with the intent of examining demand response and load factor control for power system support. The objective is to capture existing gaps in load control, energy management systems, and sensor technology with consideration of PHEV and renewable technologies to establish areas of research for the Department of Energy.
    In this report, (1) data are collected and examined from state-of-the-art homes to characterize the primary residential loads as well as PHEVs and photovoltaics for potential adoption into energy management control strategies; and (2) demand response rules and requirements across the various demand response programs are examined for potential participation of residential loads. This report will be followed by a Phase II report aimed at identifying the current state of technology of energy management systems, sensors, and communication technologies for demand response and load factor control applications in the residential sector. The purpose is to cover the gaps that exist in the information captured by the sensors for the energy management system to be able to provide demand response and load factor control. The vision is the development of an energy management system or other controlling enterprise hardware and software that is not only able to control loads, PHEVs, and renewable generation for demand response and load factor control, but also to do so with consumer comfort in mind and in an optimal fashion.

  20. Data reduction of isotope-resolved LC-MS spectra.

    PubMed

    Du, Peicheng; Sudha, Rajagopalan; Prystowsky, Michael B; Angeletti, Ruth Hogue

    2007-06-01

    Data reduction of liquid chromatography-mass spectrometry (LC-MS) spectra can be a challenge due to the inherent complexity of biological samples, noise and non-flat baseline. We present a new algorithm, LCMS-2D, for reliable data reduction of LC-MS proteomics data. LCMS-2D can reliably reduce LC-MS spectra with multiple scans to a list of elution peaks, and subsequently to a list of peptide masses. It is capable of noise removal, and deconvoluting peaks that overlap in m/z, in retention time, or both, by using a novel iterative peak-picking step, a 'rescue' step, and a modified variable selection method. LCMS-2D performs well with three sets of annotated LC-MS spectra, yielding results that are better than those from PepList, msInspect and the vendor software BioAnalyst. The software LCMS-2D is available under the GNU general public license from http://www.bioc.aecom.yu.edu/labs/angellab/ as a standalone C program running on LINUX.
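
    The iterative peak-picking, 'rescue' and variable-selection steps of LCMS-2D are not reproduced here; the sketch below only illustrates the most basic form of the data-reduction task, collapsing a noisy spectrum to a peak list using a robust noise floor and local-maximum detection:

      import numpy as np

      def pick_peaks(mz, intensity, snr=5.0):
          """Reduce a single spectrum to (m/z, intensity) peaks: local maxima above a noise floor."""
          intensity = np.asarray(intensity, float)
          noise = np.median(np.abs(intensity - np.median(intensity))) * 1.4826  # robust sigma via MAD
          floor = np.median(intensity) + snr * max(noise, 1e-12)
          peaks = []
          for i in range(1, len(intensity) - 1):
              if intensity[i] >= floor and intensity[i] > intensity[i - 1] and intensity[i] >= intensity[i + 1]:
                  peaks.append((mz[i], intensity[i]))
          return peaks

      # Synthetic spectrum: two peptide-like peaks on a noisy baseline (hypothetical values).
      mz = np.linspace(400, 410, 2000)
      rng = np.random.default_rng(4)
      signal = 800 * np.exp(-((mz - 402.2) / 0.01) ** 2) + 500 * np.exp(-((mz - 405.7) / 0.01) ** 2)
      spectrum = signal + rng.normal(0, 10, mz.size) + 50.0
      print(pick_peaks(mz, spectrum)[:5])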

  1. Agreement and Reliability of Tinnitus Loudness Matching and Pitch Likeness Rating

    PubMed Central

    Hoare, Derek J.; Edmondson-Jones, Mark; Gander, Phillip E.; Hall, Deborah A.

    2014-01-01

    The ability to reproducibly match tinnitus loudness and pitch is important to research and clinical management. Here we examine agreement and reliability of tinnitus loudness matching and pitch likeness ratings when using a computer-based method to measure the tinnitus spectrum and estimate a dominant tinnitus pitch, using tonal or narrowband sounds. Group-level data indicated a significant effect of time between test sessions 1 and 2 for loudness matching, likely reflecting procedural or perceptual learning, which needs to be accounted for in study design. Pitch likeness rating across multiple frequencies appeared inherently more variable, with no systematic effect of time. Dominant pitch estimates reached a level of clinical acceptability when sessions were spaced two weeks apart. However, when dominant tinnitus pitch assessments were separated by three months, acceptable agreement was achieved only for group mean data, not for individual estimates. This has implications for prescription of some sound-based interventions that rely on accurate measures of individual dominant tinnitus pitch. PMID:25478690

  2. Context-Aided Sensor Fusion for Enhanced Urban Navigation

    PubMed Central

    Martí, Enrique David; Martín, David; García, Jesús; de la Escalera, Arturo; Molina, José Manuel; Armingol, José María

    2012-01-01

    The deployment of Intelligent Vehicles in urban environments requires reliable estimation of positioning for urban navigation. The inherent complexity of this kind of environments fosters the development of novel systems which should provide reliable and precise solutions to the vehicle. This article details an advanced GNSS/IMU fusion system based on a context-aided Unscented Kalman filter for navigation in urban conditions. The constrained non-linear filter is here conditioned by a contextual knowledge module which reasons about sensor quality and driving context in order to adapt it to the situation, while at the same time it carries out a continuous estimation and correction of INS drift errors. An exhaustive analysis has been carried out with available data in order to characterize the behavior of available sensors and take it into account in the developed solution. The performance is then analyzed with an extensive dataset containing representative situations. The proposed solution suits the use of fusion algorithms for deploying Intelligent Transport Systems in urban environments. PMID:23223080

  3. Characterizing cognitive control abilities in children with 16p11.2 deletion using adaptive 'video game' technology: a pilot study.

    PubMed

    Anguera, J A; Brandes-Aitken, A N; Rolle, C E; Skinner, S N; Desai, S S; Bower, J D; Martucci, W E; Chung, W K; Sherr, E H; Marco, E J

    2016-09-20

    Assessing cognitive abilities in children is challenging for two primary reasons: lack of testing engagement can lead to low testing sensitivity, and performance is inherently variable. Here we sought to explore whether an engaging, adaptive digital cognitive platform built to look and feel like a video game would reliably measure attention-based abilities in children with and without neurodevelopmental disabilities related to a known genetic condition, 16p11.2 deletion. We assessed 20 children with 16p11.2 deletion, a genetic variation implicated in attention deficit/hyperactivity disorder and autism, as well as 16 siblings without the deletion and 75 neurotypical age-matched children. Deletion carriers showed significantly slower response times and greater response variability when compared with all non-carriers; by comparison, traditional non-adaptive selective attention assessments were unable to discriminate group differences. This phenotypic characterization highlights the potential power of tools that embed adaptive psychophysics in video-game-style mechanics to achieve robust, reliable measurements.
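
    The two group-level measures reported, slower response times and greater response variability, reduce to a mean and a dispersion statistic per participant; the sketch below uses the coefficient of variation as one plausible variability measure, with response times invented for illustration.

        # Mean response time and response-time variability (coefficient of
        # variation) for two illustrative participants; values are invented.
        import statistics

        def rt_summary(rts_ms):
            mean_rt = statistics.mean(rts_ms)
            return mean_rt, statistics.stdev(rts_ms) / mean_rt

        carrier = [620, 710, 680, 590, 750, 705, 660]
        non_carrier = [480, 510, 495, 505, 470, 500, 490]
        for label, rts in [("deletion carrier", carrier), ("non-carrier", non_carrier)]:
            mean_rt, cv = rt_summary(rts)
            print(f"{label:17s} mean RT {mean_rt:.0f} ms, CV {cv:.2f}")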

  4. New approaches to trials in glomerulonephritis.

    PubMed

    Craig, Jonathan C; Tong, Allison; Strippoli, Giovanni F M

    2017-01-01

    Randomized controlled trials are required to reliably identify interventions that improve outcomes for people with glomerulonephritis (GN). Unfortunately, although easier to conduct, observational studies are inherently unreliable, even though the findings of the two study designs agree most of the time. Currently there are ∼790 trials in GN, but suboptimal design and reporting, together with small sample sizes, mean that they may not be reliable for decision making. If the history is somewhat bleak, the future looks bright, with recent initiatives to improve the quality, size, and relevance of clinical trials in nephrology, including greater patient engagement, trial networks, core outcome sets, registry-based trials, and adaptive designs. Given the current state of the evidence informing the care of people with GN, disruptive technologies and pervasive culture change are required if the potential of trials to improve the health of people with this complex condition is to be realized. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.

  5. Context-aided sensor fusion for enhanced urban navigation.

    PubMed

    Martí, Enrique David; Martín, David; García, Jesús; de la Escalera, Arturo; Molina, José Manuel; Armingol, José María

    2012-12-06

    The deployment of Intelligent Vehicles in urban environments requires reliable estimation of positioning for urban navigation. The inherent complexity of this kind of environment fosters the development of novel systems which should provide reliable and precise solutions to the vehicle. This article details an advanced GNSS/IMU fusion system based on a context-aided Unscented Kalman filter for navigation in urban conditions. The constrained non-linear filter is conditioned by a contextual knowledge module which reasons about sensor quality and driving context in order to adapt the filter to the situation, while at the same time carrying out continuous estimation and correction of INS drift errors. An exhaustive analysis has been carried out with the available data in order to characterize the behavior of the available sensors and take it into account in the developed solution. The performance is then analyzed with an extensive dataset containing representative situations. The proposed solution supports the use of fusion algorithms for deploying Intelligent Transport Systems in urban environments.

  6. Assessing segment- and corridor-based travel-time reliability on urban freeways : final report.

    DOT National Transportation Integrated Search

    2016-09-01

    Travel time and its reliability are intuitive performance measures for freeway traffic operations. The objective of this project was to quantify segment-based and corridor-based travel time reliability measures on urban freeways. To achieve this obje...

  7. Human Reliability Analysis in Support of Risk Assessment for Positive Train Control

    DOT National Transportation Integrated Search

    2003-06-01

    This report describes an approach to evaluating the reliability of human actions that are modeled in a probabilistic risk assessment : (PRA) of train control operations. This approach to human reliability analysis (HRA) has been applied in the case o...

  8. 76 FR 16240 - Mandatory Reliability Standards for Interconnection Reliability Operating Limits

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-23

    ... Standards. The Reliability Standards were designed to prevent instability, uncontrolled separation, or cascading outages that adversely impact the ... See NERC Glossary, available at http://www.nerc...

  9. Research on the application of PPP model in the Chinese construction and operation of new energy vehicle charging facilities

    NASA Astrophysics Data System (ADS)

    Zhu, Liping

    2017-05-01

    New energy vehicle charging infrastructure underpins the development and popularization of new energy vehicles and has the character of a quasi-public good. Because charging construction projects are numerous, widely distributed, and capital intensive, they require large amounts of funding. The PPP (public-private partnership) mode is a new financing model with an inherent driving force for innovation in ideas, technology, and institutions. Under it, government and private partners cooperate on the basis of the spirit of contract, thereby achieving benefit sharing. This mode can effectively improve the operating efficiency of new energy vehicle charging facilities.

  10. Quasi-CW Laser Diode Bar Life Tests

    NASA Technical Reports Server (NTRS)

    Stephen, Mark A.; Krainak, Michael A.; Dallas, Joseph L.

    1997-01-01

    NASA's Goddard Space Flight Center is developing technology for satellite-based, high-peak-power LIDAR transmitters requiring 3-5 years of reliable operation. Semiconductor laser diodes provide high-efficiency pumping of solid-state lasers with the promise of long-lived, reliable operation. 100-watt quasi-CW laser diode bars have been baselined for the next-generation laser altimeters. Multi-billion-shot lifetimes are required. The authors have monitored the performance of several diodes for billions of shots and investigated operational modes for improving diode lifetime.
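
    A rough calculation shows why a 3-5 year mission translates into multi-billion-shot lifetimes; the 40 Hz repetition rate below is assumed purely for illustration, as the abstract does not state a rate.

        # Total shots = repetition rate x mission duration (rate is an assumption).
        rep_rate_hz = 40
        seconds_per_year = 365.25 * 24 * 3600
        for years in (3, 5):
            shots = rep_rate_hz * seconds_per_year * years
            print(f"{years} years at {rep_rate_hz} Hz -> {shots:.2e} shots")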

  11. Experience with modified aerospace reliability and quality assurance method for wind turbines

    NASA Technical Reports Server (NTRS)

    Klein, W. E.

    1982-01-01

    The SR&QA approach assures that the machine is not hazardous to the public or operating personnel, can operate unattended on a utility grid, and demonstrates reliable operation, and it helps establish the quality assurance and maintainability requirements for future wind turbine projects. The approach consisted of a modified failure modes and effects analysis (FMEA) during the design phase, minimal hardware inspection during parts fabrication, and three simple documents to control activities during machine construction and operation. Five years' experience shows that this low-cost approach works well enough that it should be considered by others for similar projects.

  12. Nearest-neighbor guided evaluation of data reliability and its applications.

    PubMed

    Boongoen, Tossapon; Shen, Qiang

    2010-12-01

    The intuition of data reliability has recently been incorporated into the mainstream of research on ordered weighted averaging (OWA) operators. Instead of relying on human-guided variables, the aggregation behavior is determined in accordance with the underlying characteristics of the data being aggregated. However, data-oriented operators such as the dependent OWA (DOWA) rely on centralized data structures to generate reliable weights. Despite their simplicity, these operators entirely neglect any local data structure that represents a strong agreement or consensus. To address this issue, the cluster-based OWA (Clus-DOWA) operator has been proposed. It employs a cluster-based reliability measure that is effective in differentiating the accountability of different input arguments, yet its practical application is constrained by its high computational requirements. This paper presents a more efficient nearest-neighbor-based reliability assessment for which an expensive clustering process is not required. The proposed measure can be perceived as a stress function, from which the OWA weights and associated decision-support explanations can be generated. To illustrate its potential, the measure is applied to both information aggregation for alias detection and unsupervised feature selection (in which unreliable features are excluded from the actual learning process). Experimental results demonstrate that these techniques usually outperform their conventional state-of-the-art counterparts.
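
    The general idea of nearest-neighbor-guided reliability can be sketched as follows: an argument that lies close to its nearest neighbor is treated as more reliable and receives a larger aggregation weight, while an outlying argument is down-weighted. This is only an illustration of the concept, not the stress function or weight derivation proposed in the paper, and the exponential decay and scale are arbitrary choices.

        # Weight each argument by an exponentially decaying function of its
        # distance to its nearest neighbor, then aggregate with the normalized
        # weights (illustrative, not the paper's formulation).
        import math

        def nn_reliability_weights(values, scale=0.1):
            weights = []
            for i, v in enumerate(values):
                d_nn = min(abs(v - u) for j, u in enumerate(values) if j != i)
                weights.append(math.exp(-d_nn / scale))
            total = sum(weights)
            return [w / total for w in weights]

        def aggregate(values, scale=0.1):
            return sum(w * v for w, v in zip(nn_reliability_weights(values, scale), values))

        readings = [0.52, 0.49, 0.51, 0.95]     # 0.95 is an outlier
        print(nn_reliability_weights(readings)) # outlier gets a near-zero weight
        print(aggregate(readings))              # ~0.51 versus a plain mean of ~0.62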

  13. General Aviation Aircraft Reliability Study

    NASA Technical Reports Server (NTRS)

    Pettit, Duane; Turnbull, Andrew; Roelant, Henk A. (Technical Monitor)

    2001-01-01

    This reliability study was performed in order to provide the aviation community with an estimate of Complex General Aviation (GA) Aircraft System reliability. To successfully improve the safety and reliability for the next generation of GA aircraft, a study of current GA aircraft attributes was prudent. This was accomplished by benchmarking the reliability of operational Complex GA Aircraft Systems. Specifically, Complex GA Aircraft System reliability was estimated using data obtained from the logbooks of a random sample of the Complex GA Aircraft population.

  14. Mission Control Technologies: A New Way of Designing and Evolving Mission Systems

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Walton, Joan; Saddler, Harry

    2006-01-01

    Current mission operations systems are built as a collection of monolithic software applications. Each application serves the needs of a specific user base associated with a discipline or functional role. Built to accomplish specific tasks, each application embodies specialized functional knowledge and has its own data storage, data models, programmatic interfaces, user interfaces, and customized business logic. In effect, each application creates its own walled-off environment. While individual applications are sometimes reused across multiple missions, it is expensive and time consuming to maintain these systems, and both costly and risky to upgrade them in the light of new requirements or modify them for new purposes. It is even more expensive to achieve new integrated activities across a set of monolithic applications. These problems impact the lifecycle cost (especially design, development, testing, training, maintenance, and integration) of each new mission operations system. They also inhibit system innovation and evolution. This in turn hinders NASA's ability to adopt new operations paradigms, including increasingly automated space systems, such as autonomous rovers, autonomous onboard crew systems, and integrated control of human and robotic missions. Hence, in order to achieve NASA's vision affordably and reliably, we need to consider and mature new ways to build mission control systems that overcome the problems inherent in systems of monolithic applications. The keys to the solution are modularity and interoperability. Modularity will increase extensibility (evolution), reusability, and maintainability. Interoperability will enable composition of larger systems out of smaller parts, and enable the construction of new integrated activities that tie together, at a deep level, the capabilities of many of the components. Modularity and interoperability together contribute to flexibility. The Mission Control Technologies (MCT) Project, a collaboration of multiple NASA Centers, led by NASA Ames Research Center, is building a framework to enable software to be assembled from flexible collections of components and services.

  15. The engineered biofiltration system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pisotti, D.A.

    1997-12-31

    For years, biofiltration has meant compost, peat, bark, leaf mulch, or some combination of these as the substrate housing the microorganisms. This has led to a number of operational and maintenance problems, including compaction, channeling, anaerobic zones, dry spots, pressure drop, and media degradation, all of which reduce efficiency and increase maintenance and operating costs. For these reasons, inert media, including plastic beads and low-grade carbons, have been added to provide buffering capacity, resist compaction and channeling, and increase efficiency. This has led to a search for a more reliable and sturdy medium. The medium the authors chose was activated carbon. Pelletized activated carbon was the ideal candidate due to its uniform size and shape, its inherent hardness, its adsorptive capacity, and its ability to withstand microbial degradation. The pressure drop of the system remains constant after microbial growth occurs because the media bed can be washed. Carbon allows excess biomass to be removed, which cannot be done with organic media; that inability is one of the problems leading to media degradation, with too many microbes and not enough food (i.e., VOCs). Carbon also allows spike or increased loads to be treated without performance suffering, and its tremendous surface area supports more microorganisms in a smaller volume, reducing the overall size of the biofilter vessel. This paper discusses the findings of a pilot test that used activated carbon as the medium for microbial growth, presenting the performance of the carbon-based biofilter system with respect to pressure drop, residence time, removal efficiency, microbial populations, temperature, moisture, and water requirements. The pilot unit was rated at 350 acfm and operated for 4 months on an air stream in which contaminant concentrations varied greatly every few minutes.
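
    Two of the performance measures listed, residence time and removal efficiency, reduce to simple ratios. In the sketch below the 350 acfm air flow comes from the abstract, while the bed volume and VOC concentrations are assumed for illustration only.

        # Empty-bed residence time and removal efficiency for assumed numbers.
        flow_acfm = 350.0        # air flow from the abstract (actual cubic feet per minute)
        bed_volume_ft3 = 180.0   # assumed media bed volume
        c_in_ppmv = 120.0        # assumed inlet VOC concentration
        c_out_ppmv = 9.0         # assumed outlet VOC concentration

        ebrt_s = bed_volume_ft3 / flow_acfm * 60.0           # empty-bed residence time, seconds
        removal_eff = (c_in_ppmv - c_out_ppmv) / c_in_ppmv   # fractional removal efficiency

        print(f"EBRT = {ebrt_s:.1f} s, removal efficiency = {removal_eff:.1%}")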

  16. On the Reliability of Photovoltaic Short-Circuit Current Temperature Coefficient Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osterwald, Carl R.; Campanelli, Mark; Kelly, George J.

    2015-06-14

    The changes in short-circuit current of photovoltaic (PV) cells and modules with temperature are routinely modeled through a single parameter, the temperature coefficient (TC). This parameter is vital for the translation equations used in system sizing, yet in practice is very difficult to measure. In this paper, we discuss these inherent problems and demonstrate how they can introduce unacceptably large errors in PV ratings. A method for quantifying the spectral dependence of TCs is derived, and then used to demonstrate that databases of module parameters commonly contain values that are physically unreasonable. Possible ways to reduce measurement errors are also discussed.
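
    The role of the coefficient in translation can be made concrete with the commonly used linear relation Isc(T) = Isc(Tref) * (1 + alpha * (T - Tref)); the numerical values below are illustrative, not measurements from the paper.

        # Translate short-circuit current from a reference temperature using an
        # assumed temperature coefficient (typical order of magnitude only).
        def translate_isc(isc_ref, alpha_per_c, t_ref_c, t_c):
            """Isc at temperature t_c, given Isc at t_ref_c and a linear TC."""
            return isc_ref * (1.0 + alpha_per_c * (t_c - t_ref_c))

        isc_stc = 9.20     # A, short-circuit current at 25 C (assumed)
        alpha = 0.0005     # 1/C, i.e. +0.05 %/C (assumed)
        print(translate_isc(isc_stc, alpha, 25.0, 60.0))   # ~9.36 A at 60 C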

  17. Design Considerations of a Solid State Thermal Energy Storage

    NASA Astrophysics Data System (ADS)

    Janbozorgi, Mohammad; Houssainy, Sammy; Thacker, Ariana; Ip, Peggy; Ismail, Walid; Kavehpour, Pirouz

    2016-11-01

    With growing governmental restrictions on carbon emissions, renewable energies are becoming more prevalent. Reliable use of a renewable source, however, requires built-in storage to overcome the inherently intermittent nature of the available energy. The thermal design of a solid-state energy storage system has been investigated for optimal performance. The impact of flow regime, laminar vs. turbulent, on the design and sizing of the system is also studied. The implications of the low thermal conductivity of the storage material are discussed, and a design that maximizes the round-trip efficiency is presented. This study was supported by Award No. EPC-14-027 granted by the California Energy Commission (CEC).
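
    Round-trip efficiency, the quantity the design seeks to maximize, is simply the energy recovered on discharge divided by the energy supplied during charge; the values below are placeholders rather than results from the study.

        # Round-trip efficiency = energy out on discharge / energy in on charge.
        e_in_kwh = 100.0   # energy supplied while charging (assumed)
        e_out_kwh = 62.0   # energy recovered while discharging (assumed)
        print(f"round-trip efficiency = {e_out_kwh / e_in_kwh:.0%}")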

  18. Design of testbed and emulation tools

    NASA Technical Reports Server (NTRS)

    Lundstrom, S. F.; Flynn, M. J.

    1986-01-01

    The research summarized was concerned with the design of testbed and emulation tools suitable to assist in projecting, with reasonable accuracy, the expected performance of highly concurrent computing systems on large, complete applications. Such testbed and emulation tools are intended for the eventual use of those exploring new concurrent system architectures and organizations, either as users or as designers of such systems. While a range of alternatives was considered, a software-based set of hierarchical tools was chosen to provide maximum flexibility, to ease moving to new computers as technology improves, and to take advantage of the inherent reliability and availability of commercially available computing systems.

  19. Immune Response to Mycobacterial Infection: Lessons from Flow Cytometry

    PubMed Central

    Rovina, Nikoletta; Panagiotou, Marios; Koulouris, Nikolaos G.

    2013-01-01

    Detecting and treating active and latent tuberculosis are pivotal elements for effective infection control; yet, due to their significant inherent limitations, the diagnostic means for these two stages of tuberculosis (TB) to date remain suboptimal. This paper reviews the current diagnostic tools for mycobacterial infection and focuses on the application of flow cytometry as a promising method for rapid and reliable diagnosis of mycobacterial infection as well as discrimination between active and latent TB: it summarizes diagnostic biomarkers distinguishing the two states of infection and also features of the distinct immune response against Mycobacterium tuberculosis (Mtb) at certain stages of infection as revealed by flow cytometry to date. PMID:24376464

  20. Immune response to mycobacterial infection: lessons from flow cytometry.

    PubMed

    Rovina, Nikoletta; Panagiotou, Marios; Pontikis, Konstantinos; Kyriakopoulou, Magdalini; Koulouris, Nikolaos G; Koutsoukou, Antonia

    2013-01-01

    Detecting and treating active and latent tuberculosis are pivotal elements for effective infection control; yet, due to their significant inherent limitations, the diagnostic means for these two stages of tuberculosis (TB) to date remain suboptimal. This paper reviews the current diagnostic tools for mycobacterial infection and focuses on the application of flow cytometry as a promising method for rapid and reliable diagnosis of mycobacterial infection as well as discrimination between active and latent TB: it summarizes diagnostic biomarkers distinguishing the two states of infection and also features of the distinct immune response against Mycobacterium tuberculosis (Mtb) at certain stages of infection as revealed by flow cytometry to date.
