Sample records for enable full reliability

  1. Reliability-Based Control Design for Uncertain Systems

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.

    2005-01-01

    This paper presents a robust control design methodology for systems with probabilistic parametric uncertainty. Control design is carried out by solving a reliability-based multi-objective optimization problem where the probability of violating design requirements is minimized. Simultaneously, failure domains are optimally enlarged to enable global improvements in the closed-loop performance. To enable an efficient numerical implementation, a hybrid approach for estimating reliability metrics is developed. This approach, which integrates deterministic sampling and asymptotic approximations, greatly reduces the numerical burden associated with complex probabilistic computations without compromising the accuracy of the results. Examples using output-feedback and full-state feedback with state estimation are used to demonstrate the ideas proposed.
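    The reliability metric at the heart of this approach, the probability of violating a design requirement, can be estimated by sampling. A minimal sketch, with a made-up performance metric `g(p)` and parameter distribution standing in for the paper's actual closed-loop requirements (all values here are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical closed-loop requirement: a performance metric g(p) must stay
# below a limit; p is an uncertain plant parameter with an assumed distribution.
LIMIT = 1.2

def g(p):
    # Placeholder metric standing in for a closed-loop simulation result.
    return 1.0 + 0.5 * p**2

# Sample the uncertain parameter and count requirement violations.
samples = rng.normal(loc=0.0, scale=0.5, size=100_000)
p_fail = np.mean(g(samples) > LIMIT)  # estimated probability of violation
print(f"Estimated failure probability: {p_fail:.4f}")
```

    The paper's hybrid approach replaces part of this brute-force sampling with asymptotic approximations to cut the number of closed-loop evaluations.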

  2. Enabling High-Energy, High-Voltage Lithium-Ion Cells: Standardization of Coin-Cell Assembly, Electrochemical Testing, and Evaluation of Full Cells

    DOE PAGES

    Long, Brandon R.; Rinaldo, Steven G.; Gallagher, Kevin G.; ...

    2016-11-09

    Coin-cells are often the test format of choice for laboratories engaged in battery research and development, as they provide a convenient platform for rapid testing of new materials on a small scale. However, obtaining reliable, reproducible data in the coin-cell format is inherently difficult, particularly in the full-cell configuration. In addition, statistical evaluation to prove the consistency and reliability of such data is often neglected. Herein we report on several studies aimed at formalizing physical process parameters and coin-cell construction related to full cells. Statistical analysis and performance benchmarking approaches are advocated as a means to more confidently track changes in cell performance. Finally, we show that trends in the electrochemical data obtained from coin-cells can be reliable and informative when standardized approaches are implemented in a consistent manner.

  3. The use of video clips in teleconsultation for preschool children with movement disorders.

    PubMed

    Gorter, Hetty; Lucas, Cees; Groothuis-Oudshoorn, Karin; Maathuis, Carel; van Wijlen-Hempel, Rietje; Elvers, Hans

    2013-01-01

    To investigate the reliability and validity of video clips in assessing movement disorders in preschool children. The study group included 27 children with neuromotor concerns. The explorative validity group included children with motor problems (n = 21) or with typical development (n = 9). Hempel screening was used for live observation of the child, the full recording, and short video clips. The explorative study tested the validity of the clinical classifications "typical" and "suspect." Agreement between live observation and the full recording was almost perfect, and agreement for the clinical classification "typical" or "suspect" was substantial. Agreement between the full recording and short video clips was substantial to moderate. The explorative validity study, based on short video clips and the presence of a neuromotor developmental disorder, showed substantial agreement. Hempel screening enables reliable and valid observation of video clips, but further research is necessary to demonstrate the predictive value.
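    Labels such as "almost perfect" and "substantial" are the conventional Landis-Koch benchmarks for the kappa statistic, a chance-corrected measure of agreement between two raters. A minimal sketch with hypothetical ratings (not the study's data):

```python
import numpy as np

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two categorical ratings."""
    a = np.asarray(rater_a)
    b = np.asarray(rater_b)
    labels = np.union1d(a, b)
    p_obs = np.mean(a == b)  # observed proportion of agreement
    # Expected agreement if the two raters were independent.
    p_exp = sum(np.mean(a == c) * np.mean(b == c) for c in labels)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical classifications from live observation vs. a short video clip.
live  = ["typical", "suspect", "typical", "typical", "suspect", "typical"]
video = ["typical", "suspect", "typical", "suspect", "suspect", "typical"]
print(round(cohens_kappa(live, video), 3))
```

    On the Landis-Koch scale, 0.61-0.80 is "substantial" and 0.81-1.00 "almost perfect" agreement.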

  4. Structural Testing at the NWTC Helps Improve Blade Design and Increase System Reliability; NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2015-08-01

    Since 1990, the National Renewable Energy Laboratory's (NREL's) National Wind Technology Center (NWTC) has tested more than 150 wind turbine blades. NWTC researchers can test full-scale and subcomponent articles, conduct data analyses, and provide engineering expertise on best design practices. Structural testing of wind turbine blades enables designers, manufacturers, and owners to validate designs and assess structural performance under specific load conditions. Rigorous structural testing can reveal design and manufacturing problems at an early stage of development, leading to overall improvements in design and increased system reliability.

  5. PCR Amplification Strategies towards full-length HIV-1 Genome sequencing.

    PubMed

    Liu, Chao Chun; Ji, Hezhao

    2018-06-26

    The advent of next-generation sequencing has enabled greater resolution of viral diversity and improved the feasibility of full viral genome sequencing, making routine HIV-1 full genome sequencing possible in both research and diagnostic settings. Regardless of the sequencing platform selected, successful PCR amplification of the HIV-1 genome is essential for sequencing template preparation. As such, full HIV-1 genome amplification is a crucial step in determining successful and reliable downstream sequencing. Here we review existing PCR protocols leading to HIV-1 full genome sequencing. In addition to basic considerations in relevant PCR design, the advantages as well as the pitfalls of published protocols are reviewed.

  6. Transmission overhaul estimates for partial and full replacement at repair

    NASA Technical Reports Server (NTRS)

    Savage, M.; Lewicki, D. G.

    1991-01-01

    Timely transmission overhauls raise in-flight service reliability above the calculated design reliabilities of the individual aircraft transmission components. Although necessary for aircraft safety, transmission overhauls contribute significantly to aircraft expense. Predictions of a transmission's maintenance needs at the design stage should enable the development of more cost-effective and reliable transmissions in the future. The frequency of overhaul is estimated along with the number of transmissions or components needed to support the overhaul schedule. Two methods based on the two-parameter Weibull statistical distribution for component life are used to estimate the time between transmission overhauls. These methods predict transmission lives for maintenance schedules that repair the transmission either with a complete system replacement or by replacing only the failed components. An example illustrates the methods.
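    The core of the two-parameter Weibull approach can be sketched as follows: given a characteristic life eta and shape parameter beta, the overhaul interval is the service time at which component reliability falls to a chosen target. The parameter values below are hypothetical, not from the report:

```python
import math

def time_to_reliability(eta_hours, beta, target_reliability):
    """Service time at which a two-parameter Weibull component's
    reliability R(t) = exp(-(t/eta)**beta) falls to the target."""
    return eta_hours * (-math.log(target_reliability)) ** (1.0 / beta)

# Hypothetical gear component: characteristic life 9,000 h, shape 2.5.
t_overhaul = time_to_reliability(9000.0, 2.5, 0.99)
print(f"Overhaul interval for 99% component reliability: {t_overhaul:.0f} h")
```

    For a full transmission, the system reliability is the product of the component reliabilities, so the system-level interval is shorter than any single component's.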

  7. High-power VCSEL systems and applications

    NASA Astrophysics Data System (ADS)

    Moench, Holger; Conrads, Ralf; Deppe, Carsten; Derra, Guenther; Gronenborn, Stephan; Gu, Xi; Heusler, Gero; Kolb, Johanna; Miller, Michael; Pekarski, Pavel; Pollmann-Retsch, Jens; Pruijmboom, Armand; Weichmann, Ulrich

    2015-03-01

    Easy system design, compactness, and a uniform power distribution define the basic advantages of high-power VCSEL systems. Full addressability in space and time adds new dimensions for optimization and enables "digital photonic production". Many thermal processes benefit from the improved control, i.e., heat is applied exactly where and when it is needed. The compact VCSEL systems can be integrated into most manufacturing equipment, replacing batch processes that use large furnaces and reducing energy consumption. This paper presents how recent technological development of high-power VCSEL systems will extend the efficiency and flexibility of thermal processes and replace not only laser systems, lamps, and furnaces but also enable new ways of production. High-power VCSEL systems are made from many VCSEL chips, each comprising thousands of low-power VCSELs. Systems scalable in power from watts to tens of kilowatts and with various form factors utilize a common modular building-block concept. Designs for reliable high-power VCSEL arrays and systems can be developed and tested at each building-block level and benefit from the low power density and excellent reliability of the VCSELs. Furthermore, advanced assembly concepts aim to reduce the number of individual processes and components, making the whole system even simpler and more reliable.

  8. In-Orbit Servicing: The Master Enabler

    NASA Technical Reports Server (NTRS)

    Reed, Benjamin B.; Kienlen, Michael; Naasz, Bo; Roberts, Brian; Deweese, Keith

    2015-01-01

    Some of the most noteworthy missions in space exploration have occurred in the last two decades and owe their success to on-orbit servicing. The tremendously successful Hubble Space Telescope repair and upgrade missions, as well as the completed assembly of the International Space Station (ISS) and its full utilization, lead us to the next chapter and set of challenges. These include fully exploiting the many space systems already launched, assembling large structures in situ thereby enabling new scientific discoveries, and providing systems that reliably and cost-effectively support the next steps in space exploration. In-orbit servicing is a tool--a tool that can serve as the master enabler to create space architectures that would otherwise be unattainable. This paper will survey how NASA's satellite-servicing technology development efforts are being applied to the planning and execution of two such ambitious missions, specifically asteroid capture and the in-space assembly of a very large life-finding telescope.

  9. The Master Enabler: In Orbit Servicing

    NASA Technical Reports Server (NTRS)

    Reed, Benjamin B.; Kienlen, Michael; Naasz, Bo; Roberts, Brian; Deweese, Keith; Cassidy, Justin

    2015-01-01

    Some of the most noteworthy missions in space exploration have occurred in the last two decades and owe their success to on-orbit servicing. The tremendously successful Hubble Space Telescope repair and upgrade missions, as well as the completed assembly of the International Space Station (ISS) and its full utilization, lead us to the next chapter and set of challenges. These include fully exploiting the many space systems already launched, assembling large structures in situ thereby enabling new scientific discoveries, and providing systems that reliably and cost-effectively support the next steps in space exploration. In-orbit servicing is a tool--a tool that can serve as the master enabler to create space architectures that would otherwise be unattainable. This paper will survey how NASA's satellite-servicing technology development efforts are being applied to the planning and execution of two such ambitious missions, specifically asteroid capture and the in-space assembly of a very large life-finding telescope.

  10. The "Master Enabler" - In-Orbit Servicing

    NASA Technical Reports Server (NTRS)

    Reed, Benjamin; Kienlen, Michael; Naasz, Bo; Roberts, Brian; Deweese, Keith; Cassidy, Justin

    2015-01-01

    Some of the most noteworthy missions in space exploration have occurred in the last two decades and owe their success to on-orbit servicing. The tremendously successful Hubble Space Telescope repair and upgrade missions, as well as the completed assembly of the International Space Station (ISS) and its full utilization, lead us to the next chapter and set of challenges. These include fully exploiting the many space systems already launched, assembling large structures in situ thereby enabling new scientific discoveries, and providing systems that reliably and cost-effectively support the next steps in space exploration. In-orbit servicing is a tool-a tool that can serve as the master enabler to create space architectures that would otherwise be unattainable. This paper will survey how NASA's satellite-servicing technology development efforts are being applied to the planning and execution of two such ambitious missions, specifically asteroid capture and the in-space assembly of a very large life-finding telescope.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, L.; Britt, J.; Birkmire, R.

    ITN Energy Systems, Inc., and Global Solar Energy, Inc., assisted by NREL's PV Manufacturing R&D program, have continued to advance CIGS production technology by developing trajectory-oriented predictive/control models, fault-tolerance control, control platform development, in-situ sensors, and process improvements. Modeling activities included developing physics-based and empirical models for CIGS and sputter-deposition processing, implementing model-based control, and applying predictive models to the construction of new evaporation sources and for control. Model-based control is enabled by implementing reduced or empirical models into a control platform. Reliability improvement activities include implementing preventive maintenance schedules; detecting failed sensors/equipment and reconfiguring to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which in turn has been enabled by control and reliability improvements due to this PV Manufacturing R&D program.

  12. RF-MEMS capacitive switches with high reliability

    DOEpatents

    Goldsmith, Charles L.; Auciello, Orlando H.; Carlisle, John A.; Sampath, Suresh; Sumant, Anirudha V.; Carpick, Robert W.; Hwang, James; Mancini, Derrick C.; Gudeman, Chris

    2013-09-03

    A reliable, long-life RF-MEMS capacitive switch is provided with a dielectric comprising a "fast discharge diamond dielectric layer" that enables rapid switch recovery and efficient, effective charging and discharging of the dielectric, allowing RF-MEMS switch operation to 100 billion cycles or more.

  13. Subscale and Full-Scale Testing of Buckling-Critical Launch Vehicle Shell Structures

    NASA Technical Reports Server (NTRS)

    Hilburger, Mark W.; Haynie, Waddy T.; Lovejoy, Andrew E.; Roberts, Michael G.; Norris, Jeffery P.; Waters, W. Allen; Herring, Helen M.

    2012-01-01

    New analysis-based shell buckling design factors (aka knockdown factors), along with associated design and analysis technologies, are being developed by NASA for the design of launch vehicle structures. Preliminary design studies indicate that implementation of these new knockdown factors can enable significant reductions in mass and mass growth in these vehicles and can help mitigate some of NASA's launch vehicle development and performance risks by reducing the reliance on testing, providing high-fidelity estimates of structural performance, reliability, and robustness, and enabling increased payload capability. However, in order to validate any new analysis-based design data or methods, a series of carefully designed and executed structural tests is required at both the subscale and full-scale level. This paper describes recent buckling test efforts at NASA on two different orthogrid-stiffened metallic cylindrical shell test articles. One test article was an 8-ft-diameter orthogrid-stiffened cylinder subjected to an axial compression load. The second was a 27.5-ft-diameter Space Shuttle External Tank-derived cylinder subjected to combined internal pressure and axial compression.
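    The heritage lower-bound approach these analysis-based factors aim to improve on can be illustrated with the empirical NASA SP-8007 knockdown for axially compressed unstiffened cylinders (the orthogrid-stiffened test articles above use different, analysis-derived factors; the geometry and material values below are illustrative only):

```python
import math

def sp8007_knockdown(radius, thickness):
    """Empirical lower-bound knockdown factor for axially compressed
    unstiffened cylinders (NASA SP-8007)."""
    phi = math.sqrt(radius / thickness) / 16.0
    return 1.0 - 0.901 * (1.0 - math.exp(-phi))

def classical_buckling_stress(E, nu, radius, thickness):
    """Classical axisymmetric buckling stress of a thin cylinder."""
    return E * thickness / (radius * math.sqrt(3.0 * (1.0 - nu**2)))

# Hypothetical aluminum barrel: R = 1.2 m, t = 3 mm.
gamma = sp8007_knockdown(1.2, 0.003)
sigma = gamma * classical_buckling_stress(71e9, 0.33, 1.2, 0.003)
print(f"knockdown = {gamma:.3f}, design buckling stress = {sigma/1e6:.1f} MPa")
```

    Note how severe the empirical penalty is (here the classical stress is reduced by roughly two thirds); this conservatism is exactly what the new analysis-based factors seek to recover.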

  14. Reusable Solid Rocket Motor - Accomplishment, Lessons, and a Culture of Success

    NASA Technical Reports Server (NTRS)

    Moore, D. R.; Phelps, W. J.

    2011-01-01

    The Reusable Solid Rocket Motor (RSRM) represents the largest solid rocket motor (SRM) ever flown and the only human-rated solid motor. The high reliability of the RSRM has been the result of challenges addressed and lessons learned. Advancements have resulted from applying attention to process control, testing, and postflight assessment, together with timely and thorough communication on all issues. A structured and disciplined approach was taken to identify and disposition all concerns. Careful consideration and application of alternate opinions were embraced. Focus was placed on process control, ground test programs, and postflight assessment. Process control is mandatory for an SRM, because an acceptance test of the delivered product is not feasible. The RSRM program maintained both full-scale and subscale test articles, which enabled continuous improvement of the design and evaluation of process control and material behavior. Additionally, RSRM reliability was achieved through attention to detail in postflight assessment to observe any shift in performance. The postflight analyses and inspections provided invaluable reliability data, as they enable observation of actual flight performance, most of which would not be available if the motors were not recovered. RSRM reusability offered unique opportunities to learn about the hardware. NASA is moving forward with the Space Launch System, which incorporates propulsion systems that take advantage of the heritage Shuttle and Ares solid motor programs. These unique challenges, features of the RSRM, materials and manufacturing issues, and design improvements will be discussed in the paper.

  15. Local Debonding and Fiber Breakage in Composite Materials Modeled Accurately

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2001-01-01

    A prerequisite for full utilization of composite materials in aerospace components is accurate design and life prediction tools that enable the assessment of component performance and reliability. Such tools assist both structural analysts, who design and optimize structures composed of composite materials, and materials scientists who design and optimize the composite materials themselves. NASA Glenn Research Center's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) software package (http://www.grc.nasa.gov/WWW/LPB/mac) addresses this need for composite design and life prediction tools by providing a widely applicable and accurate approach to modeling composite materials. Furthermore, MAC/GMC serves as a platform for incorporating new local models and capabilities that are under development at NASA, thus enabling these new capabilities to progress rapidly to a stage in which they can be employed by the code's end users.

  16. Synthesis, Characterization And Modeling Of Functionally Graded Multifunctional Hybrid Composites For Extreme Environments

    DTIC Science & Technology

    2017-04-04

    The research thrust areas are designed to enable the development of reliable, damage-tolerant, lightweight structures with excellent thermal management.

  17. Travel reliability inventory for Chicago.

    DOT National Transportation Integrated Search

    2013-04-01

    The overarching goal of this research project is to enable state DOTs to document and monitor the reliability performance of their highway networks. To this end, a computer tool, TRIC, was developed to produce travel reliability inventories from ...

  18. Power Electronics Packaging Reliability | Transportation Research | NREL

    Science.gov Websites

    High-temperature bonded interface materials are a key enabling technology for compact, lightweight, low-cost, and reliable power electronics packaging.

  19. Electrochemical disinfection of repeatedly recycled blackwater in a free-standing, additive-free toilet.

    PubMed

    Hawkins, Brian T; Sellgren, Katelyn L; Klem, Ethan J D; Piascik, Jeffrey R; Stoner, Brian R

    2017-11-01

    Decentralized, energy-efficient waste water treatment technologies enabling water reuse are needed to sustainably address sanitation needs in water- and energy-scarce environments. Here, we describe the effects of repeated recycling of disinfected blackwater (as flush liquid) on the energy required to achieve full disinfection with an electrochemical process in a prototype toilet system. The recycled liquid rapidly reached a steady state with total solids reliably ranging between 0.50 and 0.65% and conductivity between 20 and 23 mS/cm through many flush cycles over 15 weeks. The increase in accumulated solids was associated with increased energy demand and wide variation in the free chlorine contact time required to achieve complete disinfection. Further studies on the system at steady state revealed that running at higher voltage modestly improves energy efficiency, and established running parameters that reliably achieve disinfection at fixed run times. These results will guide prototype testing in the field.

  20. The Development of a Motor-Free Short-Form of the Wechsler Intelligence Scale for Children-Fifth Edition.

    PubMed

    Piovesana, Adina M; Harrison, Jessica L; Ducat, Jacob J

    2017-12-01

    This study aimed to develop a motor-free short-form of the Wechsler Intelligence Scale for Children-Fifth Edition (WISC-V) that allows clinicians to estimate the Full Scale Intelligence Quotients of youths with motor impairments. Using the reliabilities and intercorrelations of six WISC-V motor-free subtests, psychometric methodologies were applied to develop look-up tables for four Motor-free Short-form indices: Verbal Comprehension Short-form, Perceptual Reasoning Short-form, Working Memory Short-form, and a Motor-free Intelligence Quotient. Index-level discrepancy tables were developed using the same methods to allow clinicians to statistically compare visual, verbal, and working memory abilities. The short-form indices had excellent reliabilities (r = .92-.97), comparable to the original WISC-V. This motor-free short-form of the WISC-V is a reliable alternative for the assessment of intellectual functioning in youths with motor impairments. Clinicians are provided with user-friendly look-up tables, index-level discrepancy tables, and base rates, displayed similarly to those in the WISC-V manuals, to enable interpretation of assessment results.
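    Deriving a composite index's reliability from subtest reliabilities and intercorrelations is classically done with Mosier's formula for a unit-weighted sum of standardized subtests. A minimal sketch with hypothetical values (not the published WISC-V figures):

```python
import numpy as np

def composite_reliability(reliabilities, intercorrelations):
    """Mosier's formula for the reliability of a unit-weighted composite
    of standardized subtests.

    reliabilities     : per-subtest reliability coefficients r_ii
    intercorrelations : full k x k correlation matrix (ones on diagonal)
    """
    r = np.asarray(reliabilities, dtype=float)
    R = np.asarray(intercorrelations, dtype=float)
    composite_var = R.sum()       # variance of the summed z-scores
    error_var = np.sum(1.0 - r)   # summed error variance of the subtests
    return 1.0 - error_var / composite_var

# Hypothetical values for a three-subtest short-form index.
rel = [0.90, 0.88, 0.92]
corr = np.array([[1.00, 0.55, 0.60],
                 [0.55, 1.00, 0.50],
                 [0.60, 0.50, 1.00]])
print(round(composite_reliability(rel, corr), 3))
```

    The composite is more reliable than any single subtest because correlated true-score variance accumulates faster than error variance.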

  1. Software-defined optical network for metro-scale geographically distributed data centers.

    PubMed

    Samadi, Payman; Wen, Ke; Xu, Junjie; Bergman, Keren

    2016-05-30

    The emergence of cloud computing and big data has rapidly increased the deployment of small and mid-sized data centers. Enterprises and cloud providers require an agile network among these data centers to empower application reliability and flexible scalability. We present a software-defined inter data center network to enable on-demand scale out of data centers on a metro-scale optical network. The architecture consists of a combined space/wavelength switching platform and a Software-Defined Networking (SDN) control plane equipped with a wavelength and routing assignment module. It enables establishing transparent and bandwidth-selective connections from L2/L3 switches, on-demand. The architecture is evaluated in a testbed consisting of 3 data centers, 5-25 km apart. We successfully demonstrated end-to-end bulk data transfer and Virtual Machine (VM) migrations across data centers with less than 100 ms connection setup time and close to full link capacity utilization.

  2. Thermo-electrochemical instrumentation of cylindrical Li-ion cells

    NASA Astrophysics Data System (ADS)

    McTurk, Euan; Amietszajew, Tazdin; Fleming, Joe; Bhagat, Rohit

    2018-03-01

    The performance evaluation and optimisation of commercially available lithium-ion cells is typically based upon their full cell potential and surface temperature measurements, despite these parameters not being fully representative of the electrochemical processes taking place in the core of the cell or at each electrode. Several methods were devised to obtain the cell core temperature and electrode-specific potential profiles of cylindrical Li-ion cells. Optical fibres with Bragg gratings were found to produce reliable core temperature data, while their small mechanical profile allowed for a low-impact instrumentation method. A pure metallic lithium reference electrode insertion method was identified, avoiding interference with other elements of the cell while ensuring good contact, enabling in-situ observations of the per-electrode electrochemical responses. Our thermo-electrochemical instrumentation technique has enabled us to collect unprecedented cell data, and has subsequently been used in advanced studies exploring the real-world performance limits of commercial cells.

  3. Measure of Truck Delay and Reliability at the Corridor Level

    DOT National Transportation Integrated Search

    2018-04-01

    Freight transportation provides a significant contribution to our nation's economy. A reliable and accessible freight network enables businesses in the Twin Cities to be more competitive in the Upper Midwest region. Accurate and reliable freight data...

  4. The Healthy Brain Network Serial Scanning Initiative: a resource for evaluating inter-individual differences and their reliabilities across scan conditions and sessions

    PubMed Central

    O’Connor, David; Potler, Natan Vega; Kovacs, Meagan; Xu, Ting; Ai, Lei; Pellman, John; Vanderwal, Tamara; Parra, Lucas C.; Cohen, Samantha; Ghosh, Satrajit; Escalera, Jasmine; Grant-Villegas, Natalie; Osman, Yael; Bui, Anastasia; Craddock, R. Cameron

    2017-01-01

    Abstract Background: Although typically measured during the resting state, a growing literature is illustrating the ability to map intrinsic connectivity with functional MRI during task and naturalistic viewing conditions. These paradigms are drawing excitement due to their greater tolerability in clinical and developing populations and because they enable a wider range of analyses (e.g., inter-subject correlations). To be clinically useful, the test-retest reliability of connectivity measured during these paradigms needs to be established. This resource provides data for evaluating test-retest reliability for full-brain connectivity patterns detected during each of four scan conditions that differ with respect to level of engagement (rest, abstract animations, movie clips, flanker task). Data are provided for 13 participants, each scanned in 12 sessions with 10 minutes for each scan of the four conditions. Diffusion kurtosis imaging data was also obtained at each session. Findings: Technical validation and demonstrative reliability analyses were carried out at the connection-level using the Intraclass Correlation Coefficient and at network-level representations of the data using the Image Intraclass Correlation Coefficient. Variation in intrinsic functional connectivity across sessions was generally found to be greater than that attributable to scan condition. Between-condition reliability was generally high, particularly for the frontoparietal and default networks. Between-session reliabilities obtained separately for the different scan conditions were comparable, though notably lower than between-condition reliabilities. Conclusions: This resource provides a test-bed for quantifying the reliability of connectivity indices across subjects, conditions and time. The resource can be used to compare and optimize different frameworks for measuring connectivity and data collection parameters such as scan length. 
Additionally, investigators can explore the unique perspectives of the brain's functional architecture offered by each of the scan conditions. PMID:28369458
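    The connection-level reliability analysis described above typically uses ICC(2,1), the two-way random-effects, absolute-agreement, single-measurement Intraclass Correlation Coefficient. A minimal from-scratch sketch on toy subject-by-session data (the matrix below is invented for illustration):

```python
import numpy as np

def icc2_1(Y):
    """ICC(2,1): two-way random effects, absolute agreement, single measure.
    Y is an (n subjects) x (k sessions) matrix of one connection's strength."""
    Y = np.asarray(Y, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    row_means = Y.mean(axis=1)   # per-subject means
    col_means = Y.mean(axis=0)   # per-session means
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)  # between subjects
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)  # between sessions
    resid = Y - row_means[:, None] - col_means[None, :] + grand
    mse = np.sum(resid ** 2) / ((n - 1) * (k - 1))        # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Toy data: 5 subjects x 3 sessions; between-subject spread dominates.
Y = np.array([[0.51, 0.49, 0.53],
              [0.30, 0.28, 0.33],
              [0.72, 0.70, 0.69],
              [0.45, 0.47, 0.44],
              [0.60, 0.63, 0.61]])
print(round(icc2_1(Y), 3))
```

    High ICC values here mean the measure distinguishes individuals consistently across sessions, which is the property a clinically useful connectivity index needs.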

  5. Physician Enabling Skills Questionnaire

    PubMed Central

    Hudon, Catherine; Lambert, Mireille; Almirall, José

    2015-01-01

    Abstract Objective To evaluate the reliability and validity of the newly developed Physician Enabling Skills Questionnaire (PESQ) by assessing its internal consistency, test-retest reliability, concurrent validity with patient-centred care, and predictive validity with patient activation and patient enablement. Design Validation study. Setting Saguenay, Que. Participants One hundred patients with at least 1 chronic disease who presented in a waiting room of a regional health centre family medicine unit. Main outcome measures Family physicians’ enabling skills, measured with the PESQ at 2 points in time (ie, while in the waiting room at the family medicine unit and 2 weeks later through a mail survey); patient-centred care, assessed with the Patient Perception of Patient-Centredness instrument; patient activation, assessed with the Patient Activation Measure; and patient enablement, assessed with the Patient Enablement Instrument. Results The internal consistency of the 6 subscales of the PESQ was adequate (Cronbach α = .69 to .92). The test-retest reliability was very good (r = 0.90; 95% CI 0.84 to 0.93). Concurrent validity with the Patient Perception of Patient-Centredness instrument was good (r = −0.67; 95% CI −0.78 to −0.53; P < .001). The PESQ accounts for 11% of the total variance with the Patient Activation Measure (r2 = 0.11; P = .002) and 19% of the variance with the Patient Enablement Instrument (r2 = 0.19; P < .001). Conclusion The newly developed PESQ presents good psychometric properties, allowing for its use in practice and research. PMID:26889507
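    The internal-consistency figures reported for the PESQ subscales are Cronbach's alpha coefficients, which can be computed directly from an item-response matrix. A minimal sketch with invented 5-point responses (not the study's data):

```python
import numpy as np

def cronbach_alpha(X):
    """Cronbach's alpha for an (n respondents) x (k items) score matrix."""
    X = np.asarray(X, dtype=float)
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()    # sum of item variances
    total_var = X.sum(axis=1).var(ddof=1)     # variance of the total score
    return (k / (k - 1)) * (1.0 - item_var / total_var)

# Toy 5-point responses from 6 patients on a 4-item subscale.
X = np.array([[4, 5, 4, 5],
              [3, 3, 4, 3],
              [5, 5, 5, 4],
              [2, 2, 3, 2],
              [4, 4, 4, 5],
              [3, 4, 3, 3]])
print(round(cronbach_alpha(X), 2))
```

    Values of roughly .70 and above are conventionally considered adequate internal consistency, which matches the subscale range (.69 to .92) reported above.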

  6. 77 FR 53877 - Commission Information Collection Activities (FERC-715); Comment Request; Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-04

    ...; A detailed description of the transmission planning reliability criteria used to evaluate system... reliability criteria are applied and the steps taken in performing transmission planning studies); and A... reliability criteria using its stated assessment practices. The FERC-715 enables the Commission to use the...

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, L.

    ITN Energy Systems, Inc., and Global Solar Energy, Inc., with the assistance of NREL's PV Manufacturing R&D program, have continued the advancement of CIGS production technology through the development of trajectory-oriented predictive/control models, fault-tolerance control, control-platform development, in-situ sensors, and process improvements. Modeling activities to date include the development of physics-based and empirical models for CIGS and sputter-deposition processing, implementation of model-based control, and application of predictive models to the construction of new evaporation sources and for control. Model-based control is enabled through implementation of reduced or empirical models into a control platform. Reliability improvement activities include implementation of preventive maintenance schedules; detection of failed sensors/equipment and reconfiguration to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which, in turn, has been enabled by control and reliability improvements due to this PV Manufacturing R&D program. This has resulted in substantial improvements of flexible CIGS PV module performance and efficiency.

  8. Electrochemical disinfection of repeatedly recycled blackwater in a free‐standing, additive‐free toilet

    PubMed Central

    Sellgren, Katelyn L.; Klem, Ethan J. D.; Piascik, Jeffrey R.; Stoner, Brian R.

    2017-01-01

    Decentralized, energy‐efficient waste water treatment technologies enabling water reuse are needed to sustainably address sanitation needs in water‐ and energy‐scarce environments. Here, we describe the effects of repeated recycling of disinfected blackwater (as flush liquid) on the energy required to achieve full disinfection with an electrochemical process in a prototype toilet system. The recycled liquid rapidly reached a steady state with total solids reliably ranging between 0.50 and 0.65% and conductivity between 20 and 23 mS/cm through many flush cycles over 15 weeks. The increase in accumulated solids was associated with increased energy demand and wide variation in the free chlorine contact time required to achieve complete disinfection. Further studies on the system at steady state revealed that running at higher voltage modestly improves energy efficiency, and established running parameters that reliably achieve disinfection at fixed run times. These results will guide prototype testing in the field. PMID:29242713

  9. Developmental Validation of Short Tandem Repeat Reagent Kit for Forensic DNA Profiling of Canine Biological Materials

    PubMed Central

    Dayton, Melody; Koskinen, Mikko T; Tom, Bradley K; Mattila, Anna-Maria; Johnston, Eric; Halverson, Joy; Fantin, Dennis; DeNise, Sue; Budowle, Bruce; Smith, David Glenn; Kanthaswamy, Sree

    2009-01-01

    Aim: To develop a reagent kit that enables multiplex polymerase chain reaction (PCR) amplification of 18 short tandem repeats (STR) and the canine sex-determining Zinc Finger marker. Methods: Validation studies to determine the robustness and reliability of this multiplex assay in forensic DNA typing included sensitivity testing, reproducibility studies, intra- and inter-locus color balance studies, annealing temperature and cycle number studies, peak height ratio determination, characterization of artifacts such as stutter percentages and dye blobs, mixture analyses, species specificity, case-type sample analyses, and population studies. Results: The kit robustly amplified domesticated dog samples and consistently generated full 19-locus profiles from as little as 125 pg of dog DNA. In addition, wolf DNA samples could be analyzed with the kit. Conclusion: The kit, which produces robust, reliable, and reproducible results, will be made available to the forensic research community after modifications based on this study's evaluation to comply with the quality standards expected for forensic casework. PMID:19480022

  10. Reliability and precision of pellet-group counts for estimating landscape-level deer density

    Treesearch

    David S. deCalesta

    2013-01-01

    This study provides hitherto unavailable methodology for reliably and precisely estimating deer density within forested landscapes, enabling quantitative rather than qualitative deer management. Reliability and precision of the deer pellet-group technique were evaluated in 1 small and 2 large forested landscapes. Density estimates, adjusted to reflect deer harvest and...
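    The adjustment the abstract alludes to is, at bottom, simple arithmetic: a pellet-group count becomes a deer density once an assumed defecation rate and the deposition interval are divided out. A minimal sketch, where the function name and the default rate of 13 groups per deer per day are illustrative assumptions rather than values from the study:

```python
def deer_density(pellet_groups, area_km2, interval_days, rate_per_day=13.0):
    """Deer per km^2 from a cleared-plot pellet-group count.

    rate_per_day is an assumed defecation rate (groups per deer per day);
    a field study would calibrate this locally.
    """
    groups_per_km2 = pellet_groups / area_km2
    return groups_per_km2 / (rate_per_day * interval_days)
```

For example, 1,300 groups over 1 km^2 deposited in 10 days at the assumed rate corresponds to 10 deer/km^2.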

  11. Managing Reliability in the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dellin, T.A.

    1998-11-23

    The rapid pace of change at the end of the 20th Century should continue unabated well into the 21st Century. The driver will be the marketplace imperative of "faster, better, cheaper." This imperative has already stimulated a revolution-in-engineering in design and manufacturing. In contrast, to date, reliability engineering has not undergone a similar level of change. It is critical that we implement a corresponding revolution-in-reliability-engineering as we enter the new millennium. If we are still using 20th Century reliability approaches in the 21st Century, then reliability issues will be the limiting factor in faster, better, and cheaper. At the heart of this reliability revolution will be a science-based approach to reliability engineering. Science-based reliability will enable building-in reliability, application-specific products, virtual qualification, and predictive maintenance. The purpose of this paper is to stimulate a dialogue on the future of reliability engineering. We will try to gaze into the crystal ball and predict some key issues that will drive reliability programs in the new millennium. In the 21st Century, we will demand more of our reliability programs. We will need the ability to make accurate reliability predictions that will enable optimizing cost, performance and time-to-market to meet the needs of every market segment. We will require that all of these new capabilities be in place prior to the start of a product development cycle. The management of reliability programs will be driven by quantifiable metrics of value added to the organization's business objectives.

  12. NCBI2RDF: enabling full RDF-based access to NCBI databases.

    PubMed

    Anguita, Alberto; García-Remesal, Miguel; de la Iglesia, Diana; Maojo, Victor

    2013-01-01

    RDF has become the standard technology for enabling interoperability among heterogeneous biomedical databases. The NCBI provides access to a large set of life sciences databases through a common interface called Entrez. However, the latter does not provide RDF-based access to such databases, and, therefore, they cannot be integrated with other RDF-compliant databases and accessed via SPARQL query interfaces. This paper presents the NCBI2RDF system, aimed at providing RDF-based access to the complete NCBI data repository. This API creates a virtual endpoint for servicing SPARQL queries over different NCBI repositories and presenting the query results to users in SPARQL results format, thus enabling this data to be integrated and/or stored with other RDF-compliant repositories. SPARQL queries are dynamically resolved, decomposed, and forwarded to the NCBI-provided E-utilities programmatic interface to access the NCBI data. Furthermore, we show how our approach increases the expressiveness of the native NCBI querying system, allowing several databases to be accessed simultaneously. This feature significantly boosts productivity when working with complex queries and saves time and effort for biomedical researchers. Our approach has been validated with a large number of SPARQL queries, thus proving its reliability and enhanced capabilities in biomedical environments.
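    The decomposition step can be pictured as mapping each sub-query onto an E-utilities call. A minimal sketch of building one such ESearch request; the mediator logic is hypothetical, while the endpoint and the `db`/`term`/`retmode` parameters are the documented E-utilities ones:

```python
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def esearch_url(db, term, retmax=20):
    # One decomposed sub-query, expressed as an ESearch request URL.
    # The caller would fetch it and merge results into the SPARQL answer set.
    params = {"db": db, "term": term, "retmax": retmax, "retmode": "json"}
    return f"{EUTILS}/esearch.fcgi?{urlencode(params)}"
```

A federated query touching both PubMed and Gene would simply issue one such URL per repository.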

  13. Integrating Nuclear and Renewable Electricity in a Low-Carbon World: MIT-Japan Future of Nuclear Power Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haratyk, Geoffrey; Komiyama, Ryoichi; Forsberg, Charles

    Affordable reliable energy made possible a large middle class in the industrial world. Concerns about climate change require a transition to nuclear, wind, and solar—but these energy sources in current forms do not have the capability to meet the requirements for variable affordable energy. Researchers from the Massachusetts Institute of Technology, the University of Tokyo, the Tokyo Institute of Technology and the Institute for Energy Economics are undertaking a series of studies to address how to make this transition to a low carbon world. Three areas are being investigated. The first area is the development of electricity grid models to understand the impacts of different choices of technologies and different limits on greenhouse gas emissions. The second area is the development of technologies to enable variable electricity to the grid while capital-intensive nuclear, wind and solar generating plants operate at full capacity to minimize costs. Technologies to enable meeting variable electricity demand while operating plants at high-capacity factors include use of heat and hydrogen storage. The third area is the development of electricity market rules to enable transition to a low-carbon grid.

  14. NASA's Orbital Space Plane Risk Reduction Strategy

    NASA Technical Reports Server (NTRS)

    Dumbacher, Dan

    2003-01-01

    This paper documents the transformation of NASA's Space Launch Initiative (SLI) Second Generation Reusable Launch Vehicle Program under the revised Integrated Space Transportation Plan, announced November 2002. Outlining the technology development approach followed by the original SLI, this paper gives insight into the current risk-reduction strategy that will enable confident development of the Nation's first orbital space plane (OSP). The OSP will perform an astronaut and contingency cargo transportation function, with an early crew rescue capability, thus enabling increased crew size and enhanced science operations aboard the International Space Station. The OSP design chosen for full-scale development will take advantage of the latest innovations American industry has to offer. The OSP Program identifies critical technologies that must be advanced to field a safe, reliable, affordable space transportation system for U.S. access to the Station and low-Earth orbit. OSP flight demonstrators will test crew safety features, validate autonomous operations, and mature thermal protection systems. Additional enabling technologies may be identified during the OSP design process as part of an overall risk-management strategy. The OSP Program uses a comprehensive and evolutionary systems acquisition approach, while applying appropriate lessons learned.

  15. High-reliability computing for the smarter planet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinn, Heather M; Graham, Paul; Manuzzato, Andrea

    2010-01-01

    The geometric rate of improvement of transistor size and integrated circuit performance, known as Moore's Law, has been an engine of growth for our economy, enabling new products and services, creating new value and wealth, increasing safety, and removing menial tasks from our daily lives. Affordable, highly integrated components have enabled both life-saving technologies and rich entertainment applications. Anti-lock brakes, insulin monitors, and GPS-enabled emergency response systems save lives. Cell phones, internet appliances, virtual worlds, realistic video games, and mp3 players enrich our lives and connect us together. Over the past 40 years of silicon scaling, the increasing capabilities of inexpensive computation have transformed our society through automation and ubiquitous communications. In this paper, we will present the concept of the smarter planet, how reliability failures affect current systems, and methods that can be used to increase the reliable adoption of new automation in the future. We will illustrate these issues using a number of different electronic devices in a couple of different scenarios. Recently IBM has been presenting the idea of a 'smarter planet.' In smarter planet documents, IBM discusses increased computer automation of roadways, banking, healthcare, and infrastructure, as automation could create more efficient systems. A necessary component of the smarter planet concept is to ensure that these new systems have very high reliability. Even extremely rare reliability problems can easily escalate to problematic scenarios when implemented at very large scales. For life-critical systems, such as automobiles, infrastructure, medical implantables, and avionic systems, unmitigated failures could be dangerous. As more automation moves into these types of critical systems, reliability failures will need to be managed.
As computer automation continues to increase in our society, so does the need for greater radiation reliability. Already, critical infrastructure is failing too frequently. In this paper, we will introduce the Cross-Layer Reliability concept for designing more reliable computer systems.

  16. Solid State Inflation Balloon Active Deorbiter: Scalable Low-Cost Deorbit System for Small Satellites

    NASA Technical Reports Server (NTRS)

    Huang, Adam

    2016-01-01

    The goal of the Solid State Inflation Balloon Active Deorbiter project is to develop and demonstrate a scalable, simple, reliable, and low-cost active deorbiting system capable of controlling the downrange point of impact for the full-range of small satellites from 1 kg to 180 kg. The key enabling technology being developed is the Solid State Gas Generator (SSGG) chip, generating pure nitrogen gas from sodium azide (NaN3) micro-crystals. Coupled with a metalized nonelastic drag balloon, the complete Solid State Inflation Balloon (SSIB) system is capable of repeated inflation/deflation cycles. The SSGG minimizes size, weight, electrical power, and cost when compared to the current state of the art.

  17. Phase 1 Space Fission Propulsion System Design Considerations

    NASA Technical Reports Server (NTRS)

    Houts, Mike; VanDyke, Melissa; Godfroy, Tom; Pedersen, Kevin; Martin, James; Carter, Robert; Dickens, Ricky; Salvail, Pat; Hrbud, Ivana; Rodgers, Stephen L. (Technical Monitor)

    2001-01-01

    Fission technology can enable rapid, affordable access to any point in the solar system. If fission propulsion systems are to be developed to their full potential, however, near-term customers must be identified and initial fission systems successfully developed, launched, and operated. Studies conducted in fiscal year 2001 (IISTP, 2001) show that fission electric propulsion (FEP) systems operating at 80 kWe or above could enhance or enable numerous robotic outer solar system missions of interest. At these power levels it is possible to develop safe, affordable systems that meet mission performance requirements. In selecting the system design to pursue, seven evaluation criteria were identified: safety, reliability, testability, specific mass, cost, schedule, and programmatic risk. A top-level comparison of three potential concepts was performed: an SP-100 based pumped liquid lithium system, a direct gas cooled system, and a heatpipe cooled system. For power levels up to at least 500 kWt (enabling electric power levels of 125-175 kWe, given 25-35% power conversion efficiency) the heatpipe system has advantages related to several criteria and is competitive with respect to all. Hardware-based research and development has further increased confidence in the heatpipe approach. Successful development and utilization of a "Phase 1" fission electric propulsion system will enable advanced Phase 2 and Phase 3 systems capable of providing rapid, affordable access to any point in the solar system.

  18. Phase 1 space fission propulsion system design considerations

    NASA Astrophysics Data System (ADS)

    Houts, Mike; van Dyke, Melissa; Godfroy, Tom; Pedersen, Kevin; Martin, James; Dickens, Ricky; Salvail, Pat; Hrbud, Ivana; Carter, Robert

    2002-01-01

    Fission technology can enable rapid, affordable access to any point in the solar system. If fission propulsion systems are to be developed to their full potential, however, near-term customers must be identified and initial fission systems successfully developed, launched, and operated. Studies conducted in fiscal year 2001 (IISTP, 2001) show that fission electric propulsion (FEP) systems operating at 80 kWe or above could enhance or enable numerous robotic outer solar system missions of interest. At these power levels it is possible to develop safe, affordable systems that meet mission performance requirements. In selecting the system design to pursue, seven evaluation criteria were identified: safety, reliability, testability, specific mass, cost, schedule, and programmatic risk. A top-level comparison of three potential concepts was performed: an SP-100 based pumped liquid lithium system, a direct gas cooled system, and a heatpipe cooled system. For power levels up to at least 500 kWt (enabling electric power levels of 125-175 kWe, given 25-35% power conversion efficiency) the heatpipe system has advantages related to several criteria and is competitive with respect to all. Hardware-based research and development has further increased confidence in the heatpipe approach. Successful development and utilization of a "Phase 1" fission electric propulsion system will enable advanced Phase 2 and Phase 3 systems capable of providing rapid, affordable access to any point in the solar system.

  19. The Healthy Brain Network Serial Scanning Initiative: a resource for evaluating inter-individual differences and their reliabilities across scan conditions and sessions.

    PubMed

    O'Connor, David; Potler, Natan Vega; Kovacs, Meagan; Xu, Ting; Ai, Lei; Pellman, John; Vanderwal, Tamara; Parra, Lucas C; Cohen, Samantha; Ghosh, Satrajit; Escalera, Jasmine; Grant-Villegas, Natalie; Osman, Yael; Bui, Anastasia; Craddock, R Cameron; Milham, Michael P

    2017-02-01

    Although typically measured during the resting state, a growing literature is illustrating the ability to map intrinsic connectivity with functional MRI during task and naturalistic viewing conditions. These paradigms are drawing excitement due to their greater tolerability in clinical and developing populations and because they enable a wider range of analyses (e.g., inter-subject correlations). To be clinically useful, the test-retest reliability of connectivity measured during these paradigms needs to be established. This resource provides data for evaluating test-retest reliability for full-brain connectivity patterns detected during each of four scan conditions that differ with respect to level of engagement (rest, abstract animations, movie clips, flanker task). Data are provided for 13 participants, each scanned in 12 sessions with 10 minutes for each scan of the four conditions. Diffusion kurtosis imaging data was also obtained at each session. Technical validation and demonstrative reliability analyses were carried out at the connection-level using the Intraclass Correlation Coefficient and at network-level representations of the data using the Image Intraclass Correlation Coefficient. Variation in intrinsic functional connectivity across sessions was generally found to be greater than that attributable to scan condition. Between-condition reliability was generally high, particularly for the frontoparietal and default networks. Between-session reliabilities obtained separately for the different scan conditions were comparable, though notably lower than between-condition reliabilities. This resource provides a test-bed for quantifying the reliability of connectivity indices across subjects, conditions and time. The resource can be used to compare and optimize different frameworks for measuring connectivity and data collection parameters such as scan length. 
Additionally, investigators can explore the unique perspectives of the brain's functional architecture offered by each of the scan conditions. © The Author 2017. Published by Oxford University Press.
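    The connection-level analyses above rest on the Intraclass Correlation Coefficient. As a generic sketch (not the resource's exact pipeline), ICC(2,1) for a two-way random-effects, absolute-agreement, single-measures design can be computed from the ANOVA mean squares of a subjects-by-sessions score matrix:

```python
def icc_2_1(scores):
    """ICC(2,1) for a matrix: rows = subjects, columns = sessions/raters."""
    n, k = len(scores), len(scores[0])
    grand = sum(map(sum, scores)) / (n * k)
    row_means = [sum(r) / k for r in scores]
    col_means = [sum(c) / n for c in zip(*scores)]
    # Mean squares for rows (subjects), columns (sessions), and error.
    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msc = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)
    sst = sum((x - grand) ** 2 for r in scores for x in r)
    mse = (sst - msr * (n - 1) - msc * (k - 1)) / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

Perfect between-session agreement yields an ICC of 1; a constant session offset lowers it, since ICC(2,1) penalizes absolute disagreement.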

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kisner, R.; Melin, A.; Burress, T.

    The objective of this project is to demonstrate improved reliability and increased performance made possible by deeply embedding instrumentation and controls (I&C) in nuclear power plant (NPP) components and systems. The project is employing a highly instrumented canned rotor, magnetic bearing, fluoride salt pump as its I&C technology demonstration platform. I&C is intimately part of the basic millisecond-by-millisecond functioning of the system; treating I&C as an integral part of the system design is innovative and will allow significant improvement in capabilities and performance. As systems become more complex and greater performance is required, traditional I&C design techniques become inadequate and more advanced I&C needs to be applied. New I&C techniques enable optimal and reliable performance and tolerance of noise and uncertainties in the system rather than merely monitoring quasistable performance. Traditionally, I&C has been incorporated in NPP components after the design is nearly complete; adequate performance was obtained through over-design. By incorporating I&C at the beginning of the design phase, the control system can provide superior performance and reliability and enable designs that are otherwise impossible. This report describes the progress and status of the project and provides a conceptual design overview for the platform to demonstrate the performance and reliability improvements enabled by advanced embedded I&C.

  1. A new approach to power quality and electricity reliability monitoring-case study illustrations of the capabilities of the I-GridTM system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Divan, Deepak; Brumsickle, William; Eto, Joseph

    2003-04-01

    This report describes a new approach for collecting information on power quality and reliability and making it available in the public domain. Making this information readily available in a form that is meaningful to electricity consumers is necessary for enabling more informed private and public decisions regarding electricity reliability. The system dramatically reduces the cost (and expertise) needed for customers to obtain information on the most significant power quality events, called voltage sags and interruptions. The system also offers widespread access to information on power quality collected from multiple sites and the potential for capturing information on the impacts of power quality problems, together enabling a wide variety of analysis and benchmarking to improve system reliability. Six case studies demonstrate selected functionality and capabilities of the system, including: Linking measured power quality events to process interruption and downtime; Demonstrating the ability to correlate events recorded by multiple monitors to narrow and confirm the causes of power quality events; and Benchmarking power quality and reliability on a firm and regional basis.

  2. 76 FR 16277 - System Restoration Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-23

    ... system restoration process. The Commission also approves the NERC's proposal to retire four existing EOP... prepare personnel to enable effective coordination of the system restoration process. The Commission also..., through the Reliability Standard development process, a modification to EOP-005-1 that identifies time...

  3. Developing Confidence Limits For Reliability Of Software

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.

    1991-01-01

    Technique developed for estimating reliability of software by use of Moranda geometric de-eutrophication model. Pivotal method enables straightforward construction of exact bounds with associated degree of statistical confidence about reliability of software. Confidence limits thus derived provide precise means of assessing quality of software. Limits take into account number of bugs found while testing and effects of sampling variation associated with random order of discovering bugs.
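    In the Moranda geometric de-eutrophication model, the program's failure intensity drops by a constant factor with each bug removed: lambda_i = D * k**(i - 1) with 0 < k < 1. A minimal sketch of that intensity sequence (parameter names are illustrative; the paper's pivotal confidence bounds are not reproduced here):

```python
def failure_intensity(D, k, i):
    # Hazard rate before the i-th failure; each bug fix scales it by k.
    return D * k ** (i - 1)

def expected_interfailure_time(D, k, i):
    # Inter-failure times are exponential, so the mean is 1 / lambda_i.
    return 1.0 / failure_intensity(D, k, i)
```

With D = 2 failures per unit time and k = 0.5, the intensity before the third failure is 0.5, and the expected wait between failures grows geometrically as testing proceeds.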

  4. Topics in Measurement: Reliability and Validity.

    ERIC Educational Resources Information Center

    Dick, Walter; Hagerty, Nancy

    This text was developed on an autoinstructional basis to familiarize the reader with the various interpretations of reliability and validity, their measurement and evaluation, and factors influencing their measurement. The text enables those with prior knowledge of statistics to increase their understanding of variance and correlation. Review…

  5. Oral health care in remote Kimberley Aboriginal communities: the characteristics and perceptions of dental volunteers.

    PubMed

    Patel, J; Hearn, L; Slack-Smith, L M

    2015-09-01

    Aboriginal Australians face significant disparities in oral health and this is particularly the case in remote communities where access to dental services can be difficult. Using volunteers to provide dental care in the remote Kimberley region of Western Australia is a novel approach. This study comprised an anonymous online survey of volunteers working with the Kimberley Dental Team (KDT). The survey had a response fraction of 66% and explored volunteer demographic characteristics, factors that motivated their involvement, perceptions of oral health among Aboriginal communities, and barriers and enablers to oral health in remote Aboriginal communities. Volunteers were more likely to be female, middle-aged and engaged in full-time employment. The two most common reasons reported for volunteering were to assist the community and visit the Kimberley region. Education and access to reliable, culturally appropriate care were perceived as enablers to good oral health for Aboriginal people in the Kimberley while limited access to services, poor nutrition and lack of government support were cited as barriers. Volunteers providing dental services to remote areas in Western Australia had a diverse demographic profile. However, they share similar motivating factors and views on the current barriers and enablers to good oral health in remote Aboriginal communities. © 2015 Australian Dental Association.

  6. NCBI2RDF: Enabling Full RDF-Based Access to NCBI Databases

    PubMed Central

    Anguita, Alberto; García-Remesal, Miguel; de la Iglesia, Diana; Maojo, Victor

    2013-01-01

    RDF has become the standard technology for enabling interoperability among heterogeneous biomedical databases. The NCBI provides access to a large set of life sciences databases through a common interface called Entrez. However, the latter does not provide RDF-based access to such databases, and, therefore, they cannot be integrated with other RDF-compliant databases and accessed via SPARQL query interfaces. This paper presents the NCBI2RDF system, aimed at providing RDF-based access to the complete NCBI data repository. This API creates a virtual endpoint for servicing SPARQL queries over different NCBI repositories and presenting to users the query results in SPARQL results format, thus enabling this data to be integrated and/or stored with other RDF-compliant repositories. SPARQL queries are dynamically resolved, decomposed, and forwarded to the NCBI-provided E-utilities programmatic interface to access the NCBI data. Furthermore, we show how our approach increases the expressiveness of the native NCBI querying system, allowing several databases to be accessed simultaneously. This feature significantly boosts productivity when working with complex queries and saves time and effort to biomedical researchers. Our approach has been validated with a large number of SPARQL queries, thus proving its reliability and enhanced capabilities in biomedical environments. PMID:23984425

  7. Single chip lidar with discrete beam steering by digital micromirror device.

    PubMed

    Smith, Braden; Hellman, Brandon; Gin, Adley; Espinoza, Alonzo; Takashima, Yuzuru

    2017-06-26

    A novel method of beam steering enables a large field of view and reliable single-chip light detection and ranging (lidar) by utilizing a mass-produced digital micromirror device (DMD). Using a short pulsed laser, the micromirrors' rotation is frozen in mid-transition, which forms a programmable blazed grating. The blazed grating efficiently redistributes the light to a single diffraction order among several. We demonstrated time-of-flight measurements for five discrete angles using this beam steering method with a nanosecond 905 nm laser and a Si avalanche diode. A distance accuracy of < 1 cm over a 1 m distance range, a 48° full field of view, and a measurement rate of 3.34k points/s are demonstrated.
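    The ranging itself is a time-of-flight computation: the pulse covers the target distance twice, so range is half the round-trip time multiplied by the speed of light. A sketch of that conversion; note that the quoted 1 cm accuracy corresponds to resolving roughly 67 ps of round-trip time:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range_m(round_trip_s):
    # The pulse travels out and back, hence the factor of two.
    return C * round_trip_s / 2.0
```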

  8. FPGA-based trigger system for the LUX dark matter experiment

    NASA Astrophysics Data System (ADS)

    Akerib, D. S.; Araújo, H. M.; Bai, X.; Bailey, A. J.; Balajthy, J.; Beltrame, P.; Bernard, E. P.; Bernstein, A.; Biesiadzinski, T. P.; Boulton, E. M.; Bradley, A.; Bramante, R.; Cahn, S. B.; Carmona-Benitez, M. C.; Chan, C.; Chapman, J. J.; Chiller, A. A.; Chiller, C.; Currie, A.; Cutter, J. E.; Davison, T. J. R.; de Viveiros, L.; Dobi, A.; Dobson, J. E. Y.; Druszkiewicz, E.; Edwards, B. N.; Faham, C. H.; Fiorucci, S.; Gaitskell, R. J.; Gehman, V. M.; Ghag, C.; Gibson, K. R.; Gilchriese, M. G. D.; Hall, C. R.; Hanhardt, M.; Haselschwardt, S. J.; Hertel, S. A.; Hogan, D. P.; Horn, M.; Huang, D. Q.; Ignarra, C. M.; Ihm, M.; Jacobsen, R. G.; Ji, W.; Kazkaz, K.; Khaitan, D.; Knoche, R.; Larsen, N. A.; Lee, C.; Lenardo, B. G.; Lesko, K. T.; Lindote, A.; Lopes, M. I.; Malling, D. C.; Manalaysay, A. G.; Mannino, R. L.; Marzioni, M. F.; McKinsey, D. N.; Mei, D.-M.; Mock, J.; Moongweluwan, M.; Morad, J. A.; Murphy, A. St. J.; Nehrkorn, C.; Nelson, H. N.; Neves, F.; O`Sullivan, K.; Oliver-Mallory, K. C.; Ott, R. A.; Palladino, K. J.; Pangilinan, M.; Pease, E. K.; Phelps, P.; Reichhart, L.; Rhyne, C.; Shaw, S.; Shutt, T. A.; Silva, C.; Skulski, W.; Solovov, V. N.; Sorensen, P.; Stephenson, S.; Sumner, T. J.; Szydagis, M.; Taylor, D. J.; Taylor, W.; Tennyson, B. P.; Terman, P. A.; Tiedt, D. R.; To, W. H.; Tripathi, M.; Tvrznikova, L.; Uvarov, S.; Verbus, J. R.; Webb, R. C.; White, J. T.; Whitis, T. J.; Witherell, M. S.; Wolfs, F. L. H.; Yin, J.; Young, S. K.; Zhang, C.

    2016-05-01

    LUX is a two-phase (liquid/gas) xenon time projection chamber designed to detect nuclear recoils resulting from interactions with dark matter particles. Signals from the detector are processed with an FPGA-based digital trigger system that analyzes the incoming data in real time, with just a few microseconds' latency. The system enables first-pass selection of events of interest based on their pulse shape characteristics and 3D localization of the interactions. It has been shown to be > 99% efficient in triggering on S2 signals induced by only a few extracted liquid electrons. It has been operating continuously and reliably since its full underground deployment in early 2013. This document is an overview of the system's capabilities, its inner workings, and its performance.

  9. The modified patient enablement instrument: a Portuguese cross-cultural adaptation, validity and reliability study.

    PubMed

    Remelhe, Mafalda; Teixeira, Pedro M; Lopes, Irene; Silva, Luís; Correia de Sousa, Jaime

    2017-01-12

    Enabling patients with asthma to obtain the knowledge, confidence and skills they need in order to assume a major role in the management of their disease is cost effective. It should be an integral part of any plan for long-term control of asthma. The modified Patient Enablement Instrument (mPEI) is an easily administered questionnaire that was adapted in the United Kingdom to measure patient enablement in asthma, but its applicability in Portugal is not known. Validity and reliability of questionnaires should be tested before use in settings different from those of the original version. The purpose of this study was to test the applicability of the mPEI to Portuguese asthma patients after translation and cross-cultural adaptation, and to verify the structural validity, internal consistency and reproducibility of the instrument. The mPEI was translated to Portuguese and back translated to English. Its content validity was assessed by a debriefing interview with 10 asthma patients. The translated instrument was then administered to a random sample of 142 patients with persistent asthma. Structural validity and internal consistency were assessed. For reproducibility analysis, 86 patients completed the instrument again 7 days later. Item-scale correlations and exploratory factor analysis were used to assess structural validity. Cronbach's alpha was used to test internal consistency, and the intra-class correlation coefficient was used for the analysis of reproducibility. All items of the Portuguese version of the mPEI were found to be equivalent to the original English version. There were strong item-scale correlations that confirmed construct validity, with a one component structure and good internal consistency (Cronbach's alpha >0.8) as well as high test-retest reliability (ICC=0.85). 
The mPEI showed sound psychometric properties for the evaluation of enablement in patients with asthma, making it a reliable instrument for use in research and clinical practice in Portugal. Further studies are needed to confirm its responsiveness.
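
    The two reliability statistics reported above, Cronbach's alpha for internal consistency and the intra-class correlation coefficient (ICC) for test-retest reproducibility, can be computed directly from raw item scores. The following is a minimal sketch using the standard textbook formulas (with a one-way random-effects ICC(1,1)); the function names and data are illustrative, not taken from the study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item variance
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

def icc_test_retest(time1: np.ndarray, time2: np.ndarray) -> float:
    """One-way random-effects ICC(1,1) for two administrations of a scale."""
    scores = np.column_stack([time1, time2]).astype(float)
    n, k = scores.shape
    grand_mean = scores.mean()
    # Between-subjects and within-subjects mean squares from one-way ANOVA
    ms_between = k * ((scores.mean(axis=1) - grand_mean) ** 2).sum() / (n - 1)
    ms_within = ((scores - scores.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```

    A scale whose items agree perfectly yields alpha = 1 and ICC = 1, consistent with the intuition that alpha > 0.8 and ICC = 0.85 indicate good internal consistency and test-retest reliability.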

  10. Software For Computing Reliability Of Other Software

    NASA Technical Reports Server (NTRS)

    Nikora, Allen; Antczak, Thomas M.; Lyu, Michael

    1995-01-01

    Computer Aided Software Reliability Estimation (CASRE) is a computer program developed for use in measuring the reliability of other software. It is easier for non-specialists in reliability to use than many other currently available programs developed for the same purpose. CASRE incorporates the mathematical modeling capabilities of the public-domain Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) computer program and runs in the Windows software environment. It provides a menu-driven command interface; enabling and disabling of menu options guides the user through (1) selection of a set of failure data, (2) execution of a mathematical model, and (3) analysis of results from the model. It is written in the C language.
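
    The reliability-growth models that tools like CASRE execute can be illustrated with one classic example. The sketch below implements the Goel-Okumoto NHPP mean-value function, a standard software reliability model (whether this exact model ships with CASRE is not confirmed here); the parameter values are illustrative assumptions.

```python
import math

def goel_okumoto_mu(t: float, a: float, b: float) -> float:
    """Expected cumulative failures by time t under the Goel-Okumoto NHPP
    model: mu(t) = a * (1 - exp(-b * t)), where a is the total expected
    number of failures and b is the per-failure detection rate."""
    return a * (1.0 - math.exp(-b * t))

def failure_intensity(t: float, a: float, b: float) -> float:
    """Instantaneous failure rate lambda(t) = mu'(t) = a * b * exp(-b * t)."""
    return a * b * math.exp(-b * t)

# Illustrative parameters: 100 latent faults, detection rate 0.1 per unit time.
remaining = 100.0 - goel_okumoto_mu(10.0, 100.0, 0.1)  # faults still expected
```

    Fitting a and b to observed failure times (e.g., by maximum likelihood) is what such tools automate; the model then predicts the expected number of remaining failures.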

  11. A new technique in the global reliability of cyclic communications network

    NASA Technical Reports Server (NTRS)

    Sjogren, Jon A.

    1989-01-01

    The global reliability of a communications network is the probability that given any pair of nodes, there exists a viable path between them. A characterization of connectivity, for a given class of networks, can enable one to find this reliability. Such a characterization is described for a useful class of undirected networks called daisy-chained or braided networks. This leads to a new method of quickly computing the global reliability of these networks. Asymptotic behavior in terms of component reliability is related to geometric properties of the given graph. Generalization of the technique is discussed.
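
    The global (all-terminal) reliability defined above can always be estimated by brute-force Monte Carlo simulation, which makes a useful baseline against which a fast closed-form characterization like the paper's can be checked. The sketch below assumes independent edge failures with a common survival probability p; it is a generic baseline, not the paper's method.

```python
import random

def all_terminal_reliability(nodes, edges, p_edge, trials=20000, seed=1):
    """Monte Carlo estimate of the probability that the graph formed by
    surviving edges connects all nodes, assuming each edge independently
    survives with probability p_edge."""
    rng = random.Random(seed)
    connected_count = 0
    for _ in range(trials):
        parent = {v: v for v in nodes}  # union-find forest

        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]  # path halving
                v = parent[v]
            return v

        for u, v in edges:
            if rng.random() < p_edge:  # edge survived this trial
                ru, rv = find(u), find(v)
                if ru != rv:
                    parent[ru] = rv
        if len({find(v) for v in nodes}) == 1:  # one component => connected
            connected_count += 1
    return connected_count / trials
```

    For a triangle with p = 0.9 the exact value is p^3 + 3p^2(1 - p) = 0.972, which the estimator approaches as the trial count grows.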

  12. In vitro culture of embryonic mouse intestinal epithelium: cell differentiation and introduction of reporter genes.

    PubMed

    Quinlan, Jonathan M; Yu, Wei-Yuan; Hornsey, Mark A; Tosh, David; Slack, Jonathan M W

    2006-05-25

    Study of the normal development of the intestinal epithelium has been hampered by a lack of suitable model systems, in particular ones that enable the introduction of exogenous genes. Production of such a system would advance our understanding of normal epithelial development and help to shed light on the pathogenesis of intestinal neoplasia. The criteria for a reliable culture system include the ability to perform real-time observations and manipulations in vitro, the preparation of wholemounts for immunostaining and the potential for introducing genes. The new culture system involves growing mouse embryo intestinal explants on fibronectin-coated coverslips in basal Eagle's medium + 20% fetal bovine serum. Initially the cultures maintain expression of the intestinal transcription factor Cdx2 together with columnar epithelial (cytokeratin 8) and mesenchymal (smooth muscle actin) markers. Over a few days of culture, differentiation markers appear characteristic of absorptive epithelium (sucrase-isomaltase), goblet cells (Periodic Acid Schiff positive), enteroendocrine cells (chromogranin A) and Paneth cells (lysozyme). Three different approaches were tested to express genes in the developing cultures: transfection, electroporation and adenoviral infection. All could introduce genes into the mesenchyme, but only to a small extent into the epithelium. However, the efficiency of adenovirus infection can be greatly improved by a limited enzyme digestion, which makes accessible the lateral faces of cells bearing the Coxsackie and Adenovirus Receptor. This enables reliable delivery of genes into epithelial cells. We describe a new in vitro culture system for the small intestine of the mouse embryo that recapitulates its normal development. The system both provides a model for studying normal development of the intestinal epithelium and also allows for the manipulation of gene expression. 
The explants can be cultured for up to two weeks, they form the full repertoire of intestinal epithelial cell types (enterocytes, goblet cells, Paneth cells and enteroendocrine cells) and the method for gene introduction into the epithelium is efficient and reliable.

  13. Developing diagnostic SNP panels for the identification of true fruit flies (Diptera: Tephritidae) within the limits of COI-based species delimitation

    PubMed Central

    2013-01-01

    Background Rapid and reliable identification of quarantine pests is essential for plant inspection services to prevent the introduction of invasive species. For insects, this may be a serious problem when dealing with morphologically similar cryptic species complexes and early developmental stages that lack distinctive characters useful for taxonomic identification. DNA-based barcoding could solve many of these problems. The standard barcode fragment, an approximately 650-base-pair sequence at the 5′ end of the mitochondrial cytochrome oxidase I (COI) gene, enables differentiation of a very wide range of arthropods. However, problems remain in some taxa, such as Tephritidae, where recent genetic differentiation among some of the described species hinders accurate molecular discrimination. Results In order to explore the full species discrimination potential of COI, we sequenced the barcoding region of the COI gene of a range of economically important Tephritid species and complemented these data with all GenBank and BOLD entries for the systematic group available as of January 2012. We explored the limits of species delimitation of this barcode fragment among 193 putative Tephritid species and established operational taxonomic units (OTUs), between which discrimination is reliably possible. Furthermore, to enable future development of rapid diagnostic assays based on this sequence information, we characterized all single nucleotide polymorphisms (SNPs) and established “near-minimal” sets of SNPs that differentiate among all included OTUs with at least three and four SNPs, respectively. Conclusions We found that although several species cannot be differentiated based on the genetic diversity observed in COI and hence form composite OTUs, 85% of all OTUs correspond to described species. 
Because our SNP panels are developed based on all currently available sequence information and rely on a minimal pairwise difference of three SNPs, they are highly reliable and hence represent an important resource for developing taxon-specific diagnostic assays. For selected cases, possible explanations that may cause composite OTUs are discussed. PMID:23718854
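
    The panel property described above, every pair of OTUs separated by at least three (or four) SNPs, is easy to state as a check over aligned sequences, and a greedy set-cover-style heuristic gives one way to build a "near-minimal" panel. The sketch below is illustrative only: the toy sequences and function names are assumptions, not the paper's data or algorithm.

```python
from itertools import combinations

def panel_separates(seqs, panel, min_diff=3):
    """True if every pair of aligned sequences differs at >= min_diff
    of the chosen alignment positions (0-based indices)."""
    return all(
        sum(a[i] != b[i] for i in panel) >= min_diff
        for a, b in combinations(seqs.values(), 2)
    )

def greedy_panel(seqs, min_diff=3):
    """Greedily add the position that separates the most still-unsatisfied
    pairs until every pair differs at >= min_diff panel positions."""
    length = len(next(iter(seqs.values())))
    pairs = list(combinations(seqs.values(), 2))
    panel = []
    while not panel_separates(seqs, panel, min_diff):
        def gain(pos):
            # count only pairs that have not yet reached min_diff
            return sum(
                a[pos] != b[pos]
                for a, b in pairs
                if sum(a[i] != b[i] for i in panel) < min_diff
            )
        panel.append(max((p for p in range(length) if p not in panel), key=gain))
    return panel
```

    On three toy 8-base "OTUs" the greedy panel achieves the required pairwise separation with a handful of positions; a real panel would be built over the full COI alignment.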

  14. The use of evidence-based guidance to enable reliable and accurate measurements of the home environment

    PubMed Central

    Atwal, Anita; McIntyre, Anne

    2017-01-01

    Introduction High quality guidance in home strategies is needed to enable older people to measure their home environment and become involved in the provision of assistive devices and to promote consistency among professionals. This study aims to investigate the reliability of such guidance and its ability to promote accuracy of results when measurements are taken by both older people and professionals. Method Twenty-five health professionals and 26 older people participated in a within-group design to test the accuracy of measurements taken (that is, person’s popliteal height, baths, toilets, beds, stairs and chairs). Data were analysed with descriptive analysis and the Wilcoxon test. The intra-rater reliability was assessed by correlating measurements taken at two different times with guidance use. Results The intra-rater reliability analysis revealed statistical significance (P < 0.05) for all measurements except for the bath internal width. The guidance enabled participants to take 90% of measurements that they were not able to complete otherwise, 80.55% of which lay within the acceptable suggested margin of variation. Accuracy was supported by the significant reduction in the standard deviation of the actual measurements and accuracy scores. Conclusion This evidence-based guidance can be used in its current format by older people and professionals to facilitate appropriate measurements. Yet, some users might need help from carers or specialists depending on their impairments. PMID:29386701

  15. The use of evidence-based guidance to enable reliable and accurate measurements of the home environment.

    PubMed

    Spiliotopoulou, Georgia; Atwal, Anita; McIntyre, Anne

    2018-01-01

    High quality guidance in home strategies is needed to enable older people to measure their home environment and become involved in the provision of assistive devices and to promote consistency among professionals. This study aims to investigate the reliability of such guidance and its ability to promote accuracy of results when measurements are taken by both older people and professionals. Twenty-five health professionals and 26 older people participated in a within-group design to test the accuracy of measurements taken (that is, person's popliteal height, baths, toilets, beds, stairs and chairs). Data were analysed with descriptive analysis and the Wilcoxon test. The intra-rater reliability was assessed by correlating measurements taken at two different times with guidance use. The intra-rater reliability analysis revealed statistical significance (P < 0.05) for all measurements except for the bath internal width. The guidance enabled participants to take 90% of measurements that they were not able to complete otherwise, 80.55% of which lay within the acceptable suggested margin of variation. Accuracy was supported by the significant reduction in the standard deviation of the actual measurements and accuracy scores. This evidence-based guidance can be used in its current format by older people and professionals to facilitate appropriate measurements. Yet, some users might need help from carers or specialists depending on their impairments.

  16. Interventions to assist health consumers to find reliable online health information: a comprehensive review.

    PubMed

    Lee, Kenneth; Hoti, Kreshnik; Hughes, Jeffery D; Emmerton, Lynne M

    2014-01-01

    Health information on the Internet is ubiquitous, and its use by health consumers is prevalent. Finding and understanding relevant online health information, and determining content reliability, pose real challenges for many health consumers. To identify the types of interventions that have been implemented to assist health consumers to find reliable online health information, and where possible, describe and compare the types of outcomes studied. PubMed, PsycINFO, CINAHL Plus and Cochrane Library databases; WorldCat and Scirus 'gray literature' search engines; and manual review of reference lists of selected publications. Publications were selected by first screening the title and abstract, and then the full text. Seven publications met the inclusion criteria and were summarized in a data extraction form. The form incorporated the PICOS (Population, Intervention, Comparators, Outcomes and Study Design) Model. Two eligible gray literature papers were also reported. Relevant data from included studies were tabulated to enable descriptive comparison. A brief critique of each study was included in the tables. This review was unable to follow systematic review methods due to the paucity of research and humanistic interventions reported. While extensive, the gray literature search may have had limited reach in some countries. The paucity of research on this topic limits conclusions that may be drawn. The few eligible studies predominantly adopted a didactic approach to assisting health consumers, whereby consumers were either taught how to find credible websites, or how to use the Internet. Common types of outcomes studied include knowledge and skills pertaining to Internet use and searching for reliable health information. These outcomes were predominantly self-assessed by participants. There is potential for further research to explore other avenues for assisting health consumers to find reliable online health information, and to assess outcomes via objective measures.

  17. ENABLING COMMERCIALIZATION OF A LEAD-FREE COATING MANUFACTURING PROCESS - PHASE I

    EPA Science Inventory

    This Phase I SBIR program addresses the need for a manufacturing process that enables high reliability Pb-free tin coatings. Pb-free tin solders used in electronics applications have demonstrated whisker growth, due in part to compressive stresses within the deposit, causing ...

  18. A Vision for Spaceflight Reliability: NASA's Objectives Based Strategy

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Evans, John; Hall, Tony

    2015-01-01

    In defining the direction for a new Reliability and Maintainability standard, OSMA has extracted the essential objectives that our programs need to undertake a reliable mission. These objectives have been structured to lead mission planning through construction of an objective hierarchy, which defines the critical approaches for achieving high reliability and maintainability (R&M). Creating a hierarchy as a basis for assurance implementation is a proven approach; yet it holds the opportunity to enable new directions as NASA moves forward in tackling the challenges of space exploration.

  19. Intrauterine Pressure (IUP) Telemetry in Pregnant and Parturient Rats: Potential Applications for Spacecraft and Centrifugation Studies

    NASA Technical Reports Server (NTRS)

    Ronca, A. E.; Baer, L. A.; Wade, C. E.

    2003-01-01

    Rats exposed to spaceflight or centrifugation from mid- to late pregnancy undergo either more or fewer labor contractions at birth, respectively, as compared to those in normal Earth gravity (1-g). In this paper, we report the development and validation of a new telemetric method for quantifying intrauterine pressure (IUP) in freely-moving, late-pregnant and parturient rats. We plan to utilize this technique for studies of labor in altered gravity, specifically to ascertain the forces of uterine contractions during birth, which we believe may be changed in micro- and hypergravity. The technique we describe yields precise, reliable measures of the forces experienced by rat fetuses during parturition. A small, surgically implantable telemetric pressure sensor was fitted within a fluid-filled balloon. The total volume of the sensor-balloon assembly matched that of a full-term rat fetus. Real-time video recordings of sensor-implanted rat dams and non-implanted control dams enabled us to characterize effects of the intrauterine implant on behavioral aspects of parturition. Contraction frequency, duration, pup-to-pup birth intervals and pup-oriented activities of the dams measured during the peri-birth period were unaffected by the sensor implant. These findings establish intrauterine telemetry as a reliable, non-invasive technique for quantifying intrauterine pressures associated with parturition on Earth and in altered gravity environments. This new technology, readily amenable to spaceflight and centrifugation platforms, will enable us to answer key questions regarding the role of altered labor contraction frequency in the adaptation of newborn mammals to hypo- and hypergravity.

  20. Shape memory alloy actuation for a variable area fan nozzle

    NASA Astrophysics Data System (ADS)

    Rey, Nancy; Tillman, Gregory; Miller, Robin M.; Wynosky, Thomas; Larkin, Michael J.; Flamm, Jeffrey D.; Bangert, Linda S.

    2001-06-01

    The ability to control fan nozzle exit area is an enabling technology for next generation high-bypass-ratio turbofan engines. Performance benefits for such designs are estimated at up to 9% in thrust specific fuel consumption (TSFC) relative to current fixed-geometry engines. Conventionally actuated variable area fan nozzle (VAN) concepts tend to be heavy and complicated, with significant aircraft integration, reliability and packaging issues. The goal of this effort was to eliminate these undesirable features and formulate a design that meets or exceeds leakage, durability, reliability, maintenance and manufacturing cost goals. A Shape Memory Alloy (SMA) bundled cable actuator acting to move an array of flaps around the fan nozzle annulus is a concept that meets these requirements. The SMA bundled cable actuator developed by the United Technologies Corporation (Patents Pending) provides significant work output (greater than 2200 in-lb per flap, through the range of motion) in a compact package and minimizes system complexity. Results of a detailed design study indicate substantial engine performance, weight, and range benefits. The SMA- based actuation system is roughly two times lighter than a conventional mechanical system, with significant aircraft direct operating cost savings (2-3%) and range improvements (5-6%) relative to a fixed-geometry nozzle geared turbofan. A full-scale sector model of this VAN system was built and then tested at the Jet Exit Test Facility at NASA Langley to demonstrate the system's ability to achieve 20% area variation of the nozzle under full scale aerodynamic loads. The actuator exceeded requirements, achieving repeated actuation against full-scale loads representative of typical cruise as well as greater than worst-case (ultimate) aerodynamic conditions. Based on these encouraging results, work is continuing with the goal of a flight test on a C-17 transport aircraft.

  1. Structural Testing of the Blade Reliability Collaborative Effect of Defect Wind Turbine Blades

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Desmond, M.; Hughes, S.; Paquette, J.

    Two 8.3-meter (m) wind turbine blades intentionally constructed with manufacturing flaws were tested to failure at the National Wind Technology Center (NWTC) at the National Renewable Energy Laboratory (NREL) south of Boulder, Colorado. One blade was manufactured with a fiberglass spar cap and the second blade was manufactured with a carbon fiber spar cap. Test loading primarily consisted of flap fatigue loading of the blades, with one quasi-static ultimate load case applied to the carbon fiber spar cap blade. Results of the test program were intended to provide the full-scale test data needed for validation of model and coupon test results of the effect of defects in wind turbine blade composite materials. Testing was part of the Blade Reliability Collaborative (BRC) led by Sandia National Laboratories (SNL). The BRC seeks to develop a deeper understanding of the causes of unexpected blade failures (Paquette 2012), and to develop methods to enable blades to survive to their expected operational lifetime. Recent work in the BRC includes examining and characterizing flaws and defects known to exist in wind turbine blades from manufacturing processes (Riddle et al. 2011). Recent results from reliability databases show that wind turbine rotor blades continue to be a leading contributor to turbine downtime (Paquette 2012).

  2. Generalizability and Decision Studies to Inform Observational and Experimental Research in Classroom Settings

    ERIC Educational Resources Information Center

    Bottema-Beutel, Kristen; Lloyd, Blair; Carter, Erik W.; Asmus, Jennifer M.

    2014-01-01

    Attaining reliable estimates of observational measures can be challenging in school and classroom settings, as behavior can be influenced by multiple contextual factors. Generalizability (G) studies can enable researchers to estimate the reliability of observational data, and decision (D) studies can inform how many observation sessions are…

  3. Reliable Assessment with CyberTutor, a Web-Based Homework Tutor.

    ERIC Educational Resources Information Center

    Pritchard, David E.; Morote, Elsa-Sofia

    This paper demonstrates that an electronic tutoring program can collect data that enables a far more reliable assessment of students' skills than a standard examination. CyberTutor, a Socratic electronic homework tutor, can effectively integrate instruction and assessment. CyberTutor assessment has about 62 times less variance due to random test…

  4. TOXNET and Beyond - Using the NLMs Environmental Health and Toxicology Portal-February

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Templin-Branner, W.

    2010-02-24

    The purpose of this training is to familiarize participants with reliable online environmental health and toxicology information from the National Library of Medicine and other reliable sources. Skills and knowledge acquired in this training class will enable participants to access, utilize, and refer others to environmental health and toxicology information.

  5. Complete polarization characterization of single plasmonic nanoparticle enabled by a novel Dark-field Mueller matrix spectroscopy system

    PubMed Central

    Chandel, Shubham; Soni, Jalpa; Ray, Subir kumar; Das, Anwesh; Ghosh, Anirudha; Raj, Satyabrata; Ghosh, Nirmalya

    2016-01-01

    Information on the polarization properties of scattered light from plasmonic systems is of paramount importance due to fundamental interest and potential applications. However, such studies are severely compromised by the experimental difficulties in recording the full polarization response of plasmonic nanostructures. Here, we report on a novel Mueller matrix spectroscopic system capable of acquiring complete polarization information from a single isolated plasmonic nanoparticle/nanostructure. The outstanding issues pertaining to reliable measurements of full 4 × 4 spectroscopic scattering Mueller matrices from single nanoparticles/nanostructures are overcome by integrating an efficient Mueller matrix measurement scheme and a robust eigenvalue calibration method with a dark-field microscopic spectroscopy arrangement. Feasibility of quantitative Mueller matrix polarimetry and its potential utility is illustrated on a simple plasmonic system, that of gold nanorods. The demonstrated ability to record full polarization information over a broad wavelength range and to quantify the intrinsic plasmon polarimetry characteristics via Mueller matrix inverse analysis should lead to a novel route towards quantitative understanding, analysis and interpretation of a number of intricate plasmonic effects, and may also prove useful towards the development of novel polarization-controlled sensing schemes. PMID:27212687
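
    As background to the record above: a Mueller matrix is a 4 × 4 real matrix that maps an input Stokes vector (I, Q, U, V) to the output Stokes vector. The sketch below uses the textbook Mueller matrix of an ideal linear polarizer; it illustrates the Stokes-Mueller algebra only and is unrelated to the paper's eigenvalue calibration or dark-field instrument.

```python
import numpy as np

def linear_polarizer(theta: float) -> np.ndarray:
    """Textbook Mueller matrix of an ideal linear polarizer whose
    transmission axis is at angle theta (radians)."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return 0.5 * np.array([
        [1.0,     c,     s, 0.0],
        [c,   c * c, c * s, 0.0],
        [s,   c * s, s * s, 0.0],
        [0.0,   0.0,   0.0, 0.0],
    ])

# Unpolarized light (I=1) through a horizontal polarizer: half the
# intensity survives and the output is fully horizontally polarized.
stokes_in = np.array([1.0, 0.0, 0.0, 0.0])
stokes_out = linear_polarizer(0.0) @ stokes_in
```

    Measuring the full 16-element matrix, as the paper does spectroscopically, requires probing with multiple input polarization states and analyzing multiple output projections.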

  6. Capacitive micromachined ultrasonic transducers (CMUTs) with isolation posts.

    PubMed

    Huang, Yongli; Zhuang, Xuefeng; Haeggstrom, Edward O; Ergun, A Sanli; Cheng, Ching-Hsiang; Khuri-Yakub, Butrus T

    2008-03-01

    In this paper, an improved design of a capacitive micromachined ultrasonic transducer (CMUT) is presented. The design improvement aims to address the reliability issues of a CMUT and to extend the device operation beyond the contact (collapse) voltage. The major design novelty is the isolation posts in the vacuum cavities of the CMUT cells instead of the full-coverage insulation layers in conventional CMUTs. This eliminates drift in the contact voltage due to charging caused by the insulation layer, and enables repeatable CMUT operation in the post-contact regime. Ultrasonic tests of the CMUTs with isolation posts (PostCMUTs) in air (electrical input impedance and capacitance vs. bias voltage) and immersion (transmission and reception) indicate acoustic performance similar to that obtained from conventional CMUTs, while no undesired side effects of this new design are observed.

  7. FPGA-based trigger system for the LUX dark matter experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akerib, D. S.; Araújo, H. M.; Bai, X.

    LUX is a two-phase (liquid/gas) xenon time projection chamber designed to detect nuclear recoils resulting from interactions with dark matter particles. Signals from the detector are processed with an FPGA-based digital trigger system that analyzes the incoming data in real-time, with just a few microseconds of latency. The system enables first-pass selection of events of interest based on their pulse shape characteristics and 3D localization of the interactions. It has been shown to be >99% efficient in triggering on S2 signals induced by only a few extracted liquid electrons. It has been operating continuously and reliably since its full underground deployment in early 2013. This document is an overview of the system's capabilities, its inner workings, and its performance.

  8. LC-MS Data Processing with MAVEN: A Metabolomic Analysis and Visualization Engine

    PubMed Central

    Clasquin, Michelle F.; Melamud, Eugene; Rabinowitz, Joshua D.

    2014-01-01

    MAVEN is an open-source software program for interactive processing of LC-MS-based metabolomics data. MAVEN enables rapid and reliable metabolite quantitation from multiple reaction monitoring data or high-resolution full-scan mass spectrometry data. It automatically detects and reports peak intensities for isotope-labeled metabolites. Menu-driven, click-based navigation allows visualization of raw and analyzed data. Here we provide a User Guide for MAVEN. Step-by-step instructions are provided for data import, peak alignment across samples, identification of metabolites that differ strongly between biological conditions, quantitation and visualization of isotope-labeling patterns, and export of tables of metabolite-specific peak intensities. Together, these instructions describe a workflow that allows efficient processing of raw LC-MS data into a form ready for biological analysis. PMID:22389014

  9. Conservative restoration of a traumatically involved central incisor.

    PubMed

    Bassett, Joyce

    2012-04-01

    The use of a direct composite material known for excellent polishability, polish retention, and wear resistance is described in this case of a fractured central incisor restoration. The method used enabled the clinician to conserve tooth structure and maintain full control of the outcome while creating an esthetically imperceptible, reliable, and durable restoration for a young male patient. Emphasized in this case are the techniques of layering, contouring, and polishing of a nanocomposite used to maximize esthetics and meet patient expectations. To further ensure imperceptibility, the author recommends first facilitating color shade selection for both body and dentin, especially in two-shade or multiple-shade restorations, by placing the composite in its planned area of the restoration and curing it at its proper thickness to allow a preview and recipe map.

  10. FPGA-based trigger system for the LUX dark matter experiment

    DOE PAGES

    Akerib, D. S.; Araújo, H. M.; Bai, X.; ...

    2016-02-17

    LUX is a two-phase (liquid/gas) xenon time projection chamber designed to detect nuclear recoils resulting from interactions with dark matter particles. Signals from the detector are processed with an FPGA-based digital trigger system that analyzes the incoming data in real-time, with just a few microseconds of latency. The system enables first-pass selection of events of interest based on their pulse shape characteristics and 3D localization of the interactions. It has been shown to be >99% efficient in triggering on S2 signals induced by only a few extracted liquid electrons. It has been operating continuously and reliably since its full underground deployment in early 2013. Finally, this document is an overview of the system's capabilities, its inner workings, and its performance.

  11. LC-MS data processing with MAVEN: a metabolomic analysis and visualization engine.

    PubMed

    Clasquin, Michelle F; Melamud, Eugene; Rabinowitz, Joshua D

    2012-03-01

    MAVEN is an open-source software program for interactive processing of LC-MS-based metabolomics data. MAVEN enables rapid and reliable metabolite quantitation from multiple reaction monitoring data or high-resolution full-scan mass spectrometry data. It automatically detects and reports peak intensities for isotope-labeled metabolites. Menu-driven, click-based navigation allows visualization of raw and analyzed data. Here we provide a User Guide for MAVEN. Step-by-step instructions are provided for data import, peak alignment across samples, identification of metabolites that differ strongly between biological conditions, quantitation and visualization of isotope-labeling patterns, and export of tables of metabolite-specific peak intensities. Together, these instructions describe a workflow that allows efficient processing of raw LC-MS data into a form ready for biological analysis.

  12. FPGA-based trigger system for the LUX dark matter experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akerib, D. S.; Araújo, H. M.; Bai, X.

    LUX is a two-phase (liquid/gas) xenon time projection chamber designed to detect nuclear recoils resulting from interactions with dark matter particles. Signals from the detector are processed with an FPGA-based digital trigger system that analyzes the incoming data in real-time, with just a few microseconds of latency. The system enables first-pass selection of events of interest based on their pulse shape characteristics and 3D localization of the interactions. It has been shown to be >99% efficient in triggering on S2 signals induced by only a few extracted liquid electrons. It has been operating continuously and reliably since its full underground deployment in early 2013. Finally, this document is an overview of the system's capabilities, its inner workings, and its performance.

  13. Fraternity as "Enabling Environment:" Does Membership Lead to Gambling Problems?

    ERIC Educational Resources Information Center

    Biddix, J. Patrick; Hardy, Thomas W.

    2008-01-01

    Researchers have suggested that fraternity membership is the most reliable predictor of gambling and gambling problems on campus. The purpose of this study was to determine if problematic gambling could be linked to specific aspects of fraternity membership. Though the null hypothesis (no enabling environment) was not rejected, descriptive…

  14. Combating the Reliability Challenge of GPU Register File at Low Supply Voltage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Jingweijia; Song, Shuaiwen; Yan, Kaige

    Supply voltage reduction is an effective approach to significantly reduce GPU energy consumption. As the largest on-chip storage structure, the GPU register file becomes the reliability hotspot that prevents further supply voltage reduction below the safe limit (Vmin) due to process variation effects. This work addresses the reliability challenge of the GPU register file at low supply voltages, which is an essential first step for aggressive supply voltage reduction of the entire GPU chip. We propose GR-Guard, an architectural solution that leverages long register dead time to enable reliable operation of an unreliable register file at low voltages.

  15. Solvent-free optical recording of structural colours on pre-imprinted photocrosslinkable nanostructures

    NASA Astrophysics Data System (ADS)

    Jiang, Hao; Rezaei, Mohamad; Abdolahi, Mahssa; Kaminska, Bozena

    2017-09-01

    Optical digital information storage media, despite their ever-increasing storage capacity and data transfer rate, are vulnerable to the potential risk of becoming inaccessible. For this reason, long-term eye-readable full-colour optical archival storage is in high demand for preserving valuable information from cultural, intellectual, and scholarly resources. However, the concurrent requirements of recording colours inexpensively and precisely, and of preserving colours for the very long term (at least 100 years), have not yet been met by existing storage techniques. Structural colours hold the promise to overcome such challenges. However, there is still a lack of an inexpensive, rapid, reliable, and solvent-free optical patterning technique for recording structural colours. In this paper, we introduce an enabling technique based on optical and thermal patterning of nanoimprinted SU-8 nanocone arrays. Using photocrosslinking and thermoplastic flow of SU-8, diffractive structural colours of nanocone arrays are recorded using ultraviolet (UV) exposure followed by thermal development and reshaping of the nanocones. Different thermal treatment procedures for reshaping nanocones are investigated and compared, and two-step progressive baking is found to allow controllable reshaping of the nanocones. The height of the nanocones and the brightness of the diffractive colours are modulated by varying the UV exposure dose to enable grey-scale patterning. An example of a full-colour image recorded through half-tone patterning is also demonstrated. The presented technique requires only low-power continuous-wave UV light and is a promising candidate for professional and consumer archival storage applications.

  16. Validity and reliability of the Japanese version of the FIM + FAM in patients with cerebrovascular accident.

    PubMed

    Miki, Emi; Yamane, Shingo; Yamaoka, Mai; Fujii, Hiroe; Ueno, Hiroka; Kawahara, Toshie; Tanaka, Keiko; Tamashiro, Hiroaki; Inoue, Eiji; Okamoto, Takatsugu; Kuriyama, Masaru

    2016-09-01

    The study aim was to investigate the validity and reliability of the Functional Independence Measure and Functional Assessment Measure (FIM + FAM), which is unfamiliar in Japan, by using its Japanese version (FIM + FAM-j) in patients with cerebrovascular accident (CVA). Forty-two CVA patients participated. Criterion validity was examined by correlating the full scale and subscales of FIM + FAM-j with several well-established measurements using Spearman's correlation coefficient. Reliability was evaluated by internal consistency (tested by Cronbach's alpha coefficient) and intra-rater reliability (tested by Kendall's tau correlation coefficient). Good-to-excellent criterion validity was found between the full scale and motor subscales of the FIM + FAM-j and the Barthel Index, National Institutes of Health Stroke Scale, modified Rankin Scale, and lower extremity Brunnstrom Recovery Stage. High internal consistency was observed within the full-scale FIM + FAM-j and the motor and cognitive subscales (Cronbach's alphas were 0.968, 0.954, and 0.948, respectively). Additionally, good intra-rater reliability was observed within the full scale and motor subscales, and excellent reliability for the cognitive subscales (taus were 0.83, 0.80, and 0.98, respectively). This study showed that the FIM + FAM-j demonstrated acceptable levels of validity and reliability when used for CVA as a measure of disability.
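    The internal-consistency figures quoted above (Cronbach's alphas of 0.968, 0.954, and 0.948) come from the standard alpha formula applied to a persons-by-items score matrix. A minimal sketch of that computation, using toy data rather than the study's scores:

    ```python
    from statistics import variance

    def cronbach_alpha(scores):
        """Cronbach's alpha for a persons-by-items score matrix.

        scores: list of rows, one per respondent; each row holds that
        respondent's item scores.  Sample variances are used throughout.
        """
        k = len(scores[0])                      # number of items
        items = list(zip(*scores))              # transpose: one tuple per item
        sum_item_vars = sum(variance(col) for col in items)
        total_var = variance([sum(row) for row in scores])
        return k / (k - 1) * (1 - sum_item_vars / total_var)

    # Toy data: two items that rank three patients identically, so
    # internal consistency is perfect.
    print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # -> 1.0
    ```

    Alpha rises toward 1 as the items covary more strongly relative to their individual variances, which is why the near-0.95 values above indicate high internal consistency.
    
    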

  17. A Quantitative Risk Analysis Framework for Evaluating and Monitoring Operational Reliability of Cloud Computing

    ERIC Educational Resources Information Center

    Islam, Muhammad Faysal

    2013-01-01

    Cloud computing offers the advantage of on-demand, reliable and cost efficient computing solutions without the capital investment and management resources to build and maintain in-house data centers and network infrastructures. Scalability of cloud solutions enable consumers to upgrade or downsize their services as needed. In a cloud environment,…

  18. openECA Platform and Analytics Alpha Test Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Russell

    The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.

  19. openECA Platform and Analytics Beta Demonstration Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Russell

    The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.

  20. Anatomy of Data Integration

    PubMed Central

    Brazhnik, Olga; Jones, John F.

    2007-01-01

    Producing reliable information is the ultimate goal of data processing. The ocean of data created with the advances of science and technologies calls for integration of data coming from heterogeneous sources that are diverse in their purposes, business rules, underlying models and enabling technologies. Reference models, Semantic Web, standards, ontology, and other technologies enable fast and efficient merging of heterogeneous data, while the reliability of produced information is largely defined by how well the data represent the reality. In this paper we initiate a framework for assessing the informational value of data that includes data dimensions; aligning data quality with business practices; identifying authoritative sources and integration keys; merging models; uniting updates of varying frequency and overlapping or gapped data sets. PMID:17071142

  1. Phase 1 Space Fission Propulsion Energy Source Design

    NASA Technical Reports Server (NTRS)

    Houts, Mike; VanDyke, Melissa; Godfroy, Tom; Pedersen, Kevin; Martin, James; Dickens, Ricky; Salvail, Pat; Hrbud, Ivana; Carter, Robert; Rodgers, Stephen L. (Technical Monitor)

    2002-01-01

    Fission technology can enable rapid, affordable access to any point in the solar system. If fission propulsion systems are to be developed to their full potential, however, near-term customers must be identified and initial fission systems successfully developed, launched, and operated. Studies conducted in fiscal year 2001 (IISTP, 2001) show that fission electric propulsion (FEP) systems with a specific mass at or below 50 kg/kW-jet could enhance or enable numerous robotic outer solar system missions of interest. At the required specific mass, it is possible to develop safe, affordable systems that meet mission requirements. To help select the system design to pursue, eight evaluation criteria were identified: system integration, safety, reliability, testability, specific mass, cost, schedule, and programmatic risk. A top-level comparison of four potential concepts was performed: a Testable, Passive, Redundant Reactor (TPRR), a Testable Multi-Cell In-Core Thermionic Reactor (TMCT), a Direct Gas Cooled Reactor (DGCR), and a Pumped Liquid Metal Reactor (PLMR). Development of any of the four systems appears feasible. However, for power levels up to at least 500 kWt (enabling electric power levels of 125-175 kWe, given 25-35% power conversion efficiency) the TPRR has advantages related to several criteria and is competitive with respect to all. Hardware-based research and development has further increased confidence in the TPRR approach. Successful development and utilization of a "Phase 1" fission electric propulsion system will enable advanced Phase 2 and Phase 3 systems capable of providing rapid, affordable access to any point in the solar system.

  2. Reliability analysis applied to structural tests

    NASA Technical Reports Server (NTRS)

    Diamond, P.; Payne, A. O.

    1972-01-01

    The application of reliability theory to predicting, from structural fatigue test data, the risk of failure of a structure in service as its load-carrying capability is progressively reduced by the extension of a fatigue crack is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by current life estimation procedures. This comparison shows that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to interpreting the results of full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.

  3. The multidriver: A reliable multicast service using the Xpress Transfer Protocol

    NASA Technical Reports Server (NTRS)

    Dempsey, Bert J.; Fenton, John C.; Weaver, Alfred C.

    1990-01-01

    A reliable multicast facility extends traditional point-to-point virtual circuit reliability to one-to-many communication. Such services can provide more efficient use of network resources, a powerful distributed name binding capability, and reduced latency in multidestination message delivery. These benefits will be especially valuable in real-time environments where reliable multicast can enable new applications and increase the availability and the reliability of data and services. We present a unique multicast service that exploits features in the next-generation, real-time transfer layer protocol, the Xpress Transfer Protocol (XTP). In its reliable mode, the service offers error, flow, and rate-controlled multidestination delivery of arbitrary-sized messages, with provision for the coordination of reliable reverse channels. Performance measurements on a single-segment Proteon ProNET-4 4 Mbps 802.5 token ring with heterogeneous nodes are discussed.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Joyce Jihyun; Schetrit, Oren; Yin, Rongxin

    Demand response (DR) – allowing customers to respond to reliability requests and market prices by changing electricity use from their normal consumption pattern – continues to be seen as an attractive means of demand-side management and a fundamental smart-grid improvement that links supply and demand. From October 2011 to December 2013, the Demand Response Research Center at Lawrence Berkeley National Laboratory, the New York State Energy Research and Development Authority, and partners Honeywell and Akuacom conducted a demonstration project enabling Automated Demand Response (Auto-DR) in large commercial buildings located in New York City using Open Automated Demand Response (OpenADR) communication protocols. In particular, this project focuses on demonstrating how the OpenADR platform, enabled by Akuacom, can automate and simplify interactions between buildings and various stakeholders in New York State and enable the automation of customers' price response to yield bill savings under dynamic pricing. In this paper, we present the cost control opportunities under day-ahead hourly pricing and Auto-DR control strategies for four demonstration buildings; break down the Auto-DR enablement costs; summarize the field test results and their load impact; and show potential bill savings from automated price response under Consolidated Edison's Mandatory Hourly Pricing (MHP) tariff. For one of the sites, the potential bill savings at the site's current retail rate are shown. Facility managers were given granular equipment-level opt-out capability to ensure full control of the sites during the Auto-DR implementation. The expected bill savings ranged from 1.1% to 8.0% of the total MHP bill. The automation and enablement costs ranged from $70 to $725 per kW shed. The results show that OpenADR can facilitate the automation of price response and deliver savings to customers, while the opt-out capability retains facility managers' control of the sites.

  5. Guidelines for the functional annotation of microRNAs using the Gene Ontology

    PubMed Central

    D'Eustachio, Peter; Smith, Jennifer R.; Zampetaki, Anna

    2016-01-01

    MicroRNA regulation of developmental and cellular processes is a relatively new field of study, and the available research data have not been organized to enable its inclusion in pathway and network analysis tools. The association of gene products with terms from the Gene Ontology is an effective method to analyze functional data, but until recently there has been no substantial effort dedicated to applying Gene Ontology terms to microRNAs. Consequently, when performing functional analysis of microRNA data sets, researchers have had to rely instead on the functional annotations associated with the genes encoding microRNA targets. In consultation with experts in the field of microRNA research, we have created comprehensive recommendations for the Gene Ontology curation of microRNAs. This curation manual will enable provision of a high-quality, reliable set of functional annotations for the advancement of microRNA research. Here we describe the key aspects of the work, including development of the Gene Ontology to represent this data, standards for describing the data, and guidelines to support curators making these annotations. The full microRNA curation guidelines are available on the GO Consortium wiki (http://wiki.geneontology.org/index.php/MicroRNA_GO_annotation_manual). PMID:26917558

  6. Behavioral and Physiological Analyses of Parturition In Pregnant Rats: Insights Derived from Intrauterine Telemetry

    NASA Technical Reports Server (NTRS)

    Villareal, J.; Mallery, E.; Lynch, A.; Mills, N.; Baer, L.; Wade, C.; Ronca, A.; Dalton, Donnie (Technical Monitor)

    2002-01-01

    During labor and birth, fetuses are exposed to considerable physical stimulation associated with labor contractions and expulsion from the womb. These forces are important for the neonates' adaptation to the extrauterine environment. To further our understanding of the relationship between labor and postpartum outcome, we developed a novel method for measuring intrauterine pressure (IUP) in freely-moving, late pregnant and parturient rats that enables us to make precise, reliable measures of the forces experienced by rat fetuses during parturition. A small (1.25 x 4 cm) telemetric blood pressure sensor was fitted within a fluid-filled balloon, similar in size to a full-term rat fetus. On Gestational day (G) 19 of the rats' 22/23-day pregnancy, each dam was anesthetized and a balloon/sensor unit surgically implanted within the uterus following removal of two fetuses. Comparisons were made between sensor-implanted dams (IMPL) and two control conditions: 1) LAP-R, laparotomy with two fetuses removed, or 2) LAP-NR, laparotomy with no fetuses removed. IUP signals were sampled at 10-s intervals from the IMPL dams during labor and birth. Dams in all three conditions were videorecorded, enabling us to analyze the effect of the implant on behavioral expressions of parturition. Contraction frequency, duration, pup-to-pup birth intervals, and pup-oriented activities of the dams, measured from one hour prior to the first pup birth until the birth of the third pup, were unaffected by the sensor implant. Intrauterine telemetry of freely-moving dams offers significant advantages over conventional hardwired IUP measurement techniques. These findings establish and validate intrauterine telemetry as a reliable, non-invasive technique for quantifying pressures associated with parturition.

  7. Flexible Electrostatic Technologies for Capture and Handling, Phase 1

    NASA Technical Reports Server (NTRS)

    Bryan, Thomas

    2015-01-01

    Fundamental to many of NASA's in-space transportation missions is the capture and handling of various objects and vehicles in various orbits for servicing, debris disposal, sample retrieval, and assembly without the benefit of sufficient grapple fixtures and docking ports. To perform similar material handling tasks on Earth, pincher grippers, suction grippers, or magnetic chucks are used, but are unable to reliably grip aluminum and composite spacecraft, insulation, radiators, solar arrays, or extra-terrestrial objects in the vacuum of outer space without dedicated handles in the right places. The electronic Flexible Electrostatic Technologies for space Capture and Handling (FETCH) will enable reliable and compliant gripping (soft dock) of practically any object in various orbits or surfaces without dedicated mechanical features, very low impact capture, and built-in proximity sensing without any conventional actuators. Originally developed to handle semiconductor and glass wafers during vacuum chamber processing without contamination, the normal rigid wafer handling chucks are replaced with thin metal foil segments laminated in flexible insulation driven by commercial off-the-shelf solid state, high-voltage power supplies. Preliminary testing in NASA Marshall Space Flight Center's (MSFC's) Flat Floor Robotics Lab demonstrated compliant alignment and gripping with a full-sized, 150-lb microsat mockup and translation before a clean release with a flip of a switch. The flexible electrostatic gripper pads can be adapted to various space applications with different sizes, shapes, and foil electrode layouts even with openings through the gripper pads for addition of guidance sensors or injection of permanent adhesives. 
With gripping forces estimated between 0.5 and 2.5 lb/in2 or 70-300 lb/ft2 of surface contact, the FETCH can turn on and off rapidly and repeatedly to enable sample handling, soft docking, in-space assembly, precision relocation, and surface translation for accurate anchoring.

  8. Developing a Science Process Skills Test for Secondary Students: Validity and Reliability Study

    ERIC Educational Resources Information Center

    Feyzioglu, Burak; Demirdag, Baris; Akyildiz, Murat; Altun, Eralp

    2012-01-01

    Science process skills are claimed to enable an individual to improve their own life visions and give a scientific view/literacy as a standard of their understanding about the nature of science. The main purpose of this study was to develop a test for measuring a valid, reliable and practical test for Science Process Skills (SPS) in secondary…

  9. The importance of data quality for generating reliable distribution models for rare, elusive, and cryptic species

    Treesearch

    Keith B. Aubry; Catherine M. Raley; Kevin S. McKelvey

    2017-01-01

    The availability of spatially referenced environmental data and species occurrence records in online databases enable practitioners to easily generate species distribution models (SDMs) for a broad array of taxa. Such databases often include occurrence records of unknown reliability, yet little information is available on the influence of data quality on SDMs generated...

  10. Probabilistic Prediction of Lifetimes of Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.

    2006-01-01

    ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.
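    CARES/Life's treatment of stochastic strength rests on Weibull statistics; as an illustration of the underlying two-parameter form (a sketch only, with hypothetical stress values, not the software's API):

    ```python
    import math

    def weibull_failure_probability(stress, scale, modulus):
        """Two-parameter Weibull probability of failure at a given stress.

        scale   -- characteristic strength sigma_0 (the 63.2% failure point)
        modulus -- Weibull modulus m (larger m = less scatter in strength)
        """
        return 1.0 - math.exp(-((stress / scale) ** modulus))

    # At stress == scale the failure probability is 1 - 1/e, about 0.632,
    # regardless of the modulus (hypothetical MPa values).
    print(weibull_failure_probability(300.0, 300.0, 10.0))  # ~= 0.632
    ```

    A high modulus makes the failure probability rise very steeply around the characteristic strength, which is why brittle ceramics are designed well below it.
    
    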

  11. Implementation of a personnel reliability program as a facilitator of biosafety and biosecurity culture in BSL-3 and BSL-4 laboratories.

    PubMed

    Higgins, Jacki J; Weaver, Patrick; Fitch, J Patrick; Johnson, Barbara; Pearl, R Marene

    2013-06-01

    In late 2010, the National Biodefense Analysis and Countermeasures Center (NBACC) implemented a Personnel Reliability Program (PRP) with the goal of enabling active participation by its staff to drive and improve the biosafety and biosecurity culture at the organization. A philosophical keystone for accomplishment of NBACC's scientific mission is simultaneous excellence in operations and outreach. Its personnel reliability program builds on this approach to: (1) enable and support a culture of responsibility based on human performance principles, (2) maintain compliance with regulations, and (3) address the risk associated with the insider threat. Recently, the Code of Federal Regulations (CFR) governing use and possession of biological select agents and toxins (BSAT) was amended to require a pre-access suitability assessment and ongoing evaluation for staff accessing Tier 1 BSAT. These 2 new requirements are in addition to the already required Federal Bureau of Investigation (FBI) Security Risk Assessment (SRA). Two years prior to the release of these guidelines, NBACC developed its PRP to supplement the SRA requirement as a means to empower personnel and foster an operational environment where any and all work with BSAT is conducted in a safe, secure, and reliable manner.

  12. Electroencephalography as a post-stroke assessment method: An updated review.

    PubMed

    Monge-Pereira, E; Molina-Rueda, F; Rivas-Montero, F M; Ibáñez, J; Serrano, J I; Alguacil-Diego, I M; Miangolarra-Page, J C

    Given that stroke is currently a serious problem in the population, employing more reliable and objective techniques for determining diagnosis and prognosis is necessary in order to enable effective clinical decision-making. EEG is a simple, low-cost, non-invasive tool that can provide information about the changes occurring in the cerebral cortex during the recovery process after stroke. EEG provides data on the evolution of cortical activation patterns which can be used to establish a prognosis geared toward harnessing each patient's full potential. This strategy can be used to prevent compensation and maladaptive plasticity, redirect treatments, and develop new interventions that will let stroke patients reach their new maximum motor levels. Copyright © 2014 Sociedad Española de Neurología. Published by Elsevier España, S.L.U. All rights reserved.

  13. Automated robotic equipment for ultrasonic inspection of pressurizer heater wells

    DOEpatents

    Nachbar, Henry D.; DeRossi, Raymond S.; Mullins, Lawrence E.

    1993-01-01

    A robotic device for remotely inspecting pressurizer heater wells is provided which has the advantages of quickly, precisely, and reliably acquiring data at reasonable cost while also reducing radiation exposure of an operator. The device comprises a probe assembly including a probe which enters a heater well, gathers data regarding the condition of the heater well, and transmits a signal carrying that data; a mounting device for mounting the probe assembly at the opening of the heater well so that the probe can enter the heater well; a first motor mounted on the mounting device for providing movement of the probe assembly in an axial direction; and a second motor mounted on the mounting device for providing rotation of the probe assembly. This arrangement enables full inspection of the heater well to be carried out.

  14. Overview of Probabilistic Methods for SAE G-11 Meeting for Reliability and Uncertainty Quantification for DoD TACOM Initiative with SAE G-11 Division

    NASA Technical Reports Server (NTRS)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting during October 6-8 at the Best Western Sterling Inn, Sterling Heights (Detroit), Michigan is co-sponsored by US Army Tank-automotive & Armaments Command (TACOM). The meeting will provide an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members including members with national/international standing, the mission of the G-11's Probabilistic Methods Committee is to "enable/facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries by better, faster, greener, smarter, affordable and reliable product development."

  15. Generalizability and decision studies to inform observational and experimental research in classroom settings.

    PubMed

    Bottema-Beutel, Kristen; Lloyd, Blair; Carter, Erik W; Asmus, Jennifer M

    2014-11-01

    Attaining reliable estimates of observational measures can be challenging in school and classroom settings, as behavior can be influenced by multiple contextual factors. Generalizability (G) studies can enable researchers to estimate the reliability of observational data, and decision (D) studies can inform how many observation sessions are necessary to achieve a criterion level of reliability. We conducted G and D studies using observational data from a randomized control trial focusing on social and academic participation of students with severe disabilities in inclusive secondary classrooms. Results highlight the importance of anchoring observational decisions to reliability estimates from existing or pilot data sets. We outline steps for conducting G and D studies and address options when reliability estimates are lower than desired.
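    The D-study logic described above follows the standard one-facet generalizability model, in which averaging over more observation sessions shrinks the session-related error variance. A minimal sketch with hypothetical variance components:

    ```python
    def projected_reliability(var_person, var_residual, n_sessions):
        """D-study projection: generalizability coefficient when scores
        are averaged over n_sessions observation sessions.

        var_person   -- variance component for persons (true-score variance)
        var_residual -- single-session error variance
        """
        return var_person / (var_person + var_residual / n_sessions)

    # With equal person and error variance (hypothetical components),
    # one session yields 0.5; averaging four sessions raises it to 0.8.
    print(projected_reliability(4.0, 4.0, 1))  # -> 0.5
    print(projected_reliability(4.0, 4.0, 4))  # -> 0.8

    # How many sessions are needed to reach a criterion reliability of 0.85?
    n = 1
    while projected_reliability(4.0, 4.0, n) < 0.85:
        n += 1
    print(n)  # -> 6
    ```

    The loop at the end is exactly the decision-study question the abstract poses: how many sessions are necessary to achieve a criterion level of reliability.
    
    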

  16. Independent predictors of reliability between full time employee-dependent acquisition of functional outcomes compared to non-full time employee-dependent methodologies: a prospective single institutional study.

    PubMed

    Adogwa, Owoicho; Elsamadicy, Aladine A; Cheng, Joseph; Bagley, Carlos

    2016-03-01

    The prospective acquisition of reliable patient-reported outcomes (PROs) measures demonstrating the effectiveness of spine surgery, or lack thereof, remains a challenge. The aims of this study are to compare the reliability of functional outcomes metrics obtained using full time employee (FTE) vs. non-FTE-dependent methodologies and to determine the independent predictors of response reliability using non-FTE-dependent methodologies. One hundred and nineteen adult patients (male: 65, female: 54) undergoing one- and two-level lumbar fusions at Duke University Medical Center were enrolled in this prospective study. Enrollment criteria included available demographic, clinical and baseline functional outcomes data. All patients were administered two similar sets of baseline questionnaires: (I) phone interviews (FTE-dependent) and (II) hardcopy in clinic (patient self-survey, non-FTE-dependent). All patients had at least a two-week washout period between phone interviews and in-clinic self-surveys to minimize the effect of recall. Questionnaires included the Oswestry disability index (ODI) and the Visual Analog Back and Leg Pain Scale (VAS-BP/LP). Reliability was assessed by the degree to which patient responses to baseline questionnaires differed between the two time points. About 26.89% of patients had a history of an anxiety disorder and 28.57% reported a history of depression. At least 97.47% of patients had a High School Diploma or GED, with 49.57% attaining a 4-year college degree or post-graduate degree. 29.94% reported full-time employment and 14.28% were on disability. There was a very high correlation between baseline PRO data captured with FTE-dependent compared to non-FTE-dependent methodologies (r=0.89). In a multivariate logistic regression model, the absence of anxiety and depression, higher levels of education (college or greater), and full-time employment were independently associated with high response reliability using non-FTE-dependent methodologies. Our study suggests that capturing health-related quality of life data using non-FTE-dependent methodologies is highly reliable and may be a more cost-effective alternative. Well-educated patients who are employed full-time appear to be the most reliable.
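    The headline reliability figure above (r = 0.89) is a plain correlation between paired responses from the two administration methods; a minimal sketch with hypothetical paired ODI scores:

    ```python
    import math

    def pearson_r(x, y):
        """Pearson correlation between two paired score lists."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Hypothetical paired baseline ODI scores: phone interview vs.
    # in-clinic self-survey for five patients.
    phone  = [20, 34, 48, 52, 66]
    clinic = [22, 30, 50, 55, 70]
    print(round(pearson_r(phone, clinic), 3))  # -> 0.99
    ```

    A coefficient this close to 1 indicates that the two acquisition methods rank and scale patients almost identically, which is the study's argument for the cheaper self-survey approach.
    
    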

  17. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.

    1984-01-01

    A long term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10 year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. The numerous factors that can potentially degrade system reliability are discussed, along with the ways in which those factors peculiar to highly reliable fault-tolerant systems are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  18. Creating Highly Reliable Accountable Care Organizations.

    PubMed

    Vogus, Timothy J; Singer, Sara J

    2016-12-01

    Accountable Care Organizations' (ACOs) pursuit of the triple aim of higher quality, lower cost, and improved population health has met with mixed results. To improve the design and implementation of ACOs, we look to organizations that manage similarly complex, dynamic, and tightly coupled conditions while sustaining exceptional performance, known as high-reliability organizations. We describe the key processes through which organizations achieve reliability, the leadership and organizational practices that enable it, and the role that professionals can play when charged with enacting it. Specifically, we present concrete practices and processes from health care organizations pursuing high-reliability and from early ACOs to illustrate how the triple aim may be met by cultivating mindful organizing, practicing reliability-enhancing leadership, and identifying and supporting reliability professionals. We conclude by proposing a set of research questions to advance the study of ACOs and high-reliability research. © The Author(s) 2016.

  19. User's guide to the Reliability Estimation System Testbed (REST)

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
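    The modularization approach described above rests on the usual composition rules for independent components: reliabilities multiply in series, while redundant modules fail only if all of them fail. A minimal sketch of those rules (plain Python, not RML syntax):

    ```python
    from functools import reduce

    def series_reliability(module_reliabilities):
        """All modules must survive: reliabilities multiply."""
        return reduce(lambda a, b: a * b, module_reliabilities, 1.0)

    def parallel_reliability(module_reliabilities):
        """Redundant modules: the system fails only if every module fails."""
        unreliability = reduce(lambda a, b: a * b,
                               (1.0 - r for r in module_reliabilities), 1.0)
        return 1.0 - unreliability

    # Two 0.9-reliable components: chaining them hurts, duplicating helps.
    print(round(series_reliability([0.9, 0.9]), 4))    # -> 0.81
    print(round(parallel_reliability([0.9, 0.9]), 4))  # -> 0.99
    ```

    Composing per-module results like this is what lets an analyst test modules separately and then combine them into a total system reliability, as the abstract describes.
    
    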

  20. Space Flight Middleware: Remote AMS over DTN for Delay-Tolerant Messaging

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott

    2011-01-01

    This paper describes a technique for implementing scalable, reliable, multi-source multipoint data distribution in space flight communications -- Delay-Tolerant Reliable Multicast (DTRM) -- that is fully supported by the "Remote AMS" (RAMS) protocol of the Asynchronous Message Service (AMS) proposed for standardization within the Consultative Committee for Space Data Systems (CCSDS). The DTRM architecture enables applications to easily "publish" messages that will be reliably and efficiently delivered to an arbitrary number of "subscribing" applications residing anywhere in the space network, whether in the same subnet or in a subnet on a remote planet or vehicle separated by many light minutes of interplanetary space. The architecture comprises multiple levels of protocol, each included for a specific purpose and allocated specific responsibilities: "application AMS" traffic performs end-system data introduction and delivery subject to access control; underlying "remote AMS" directs this application traffic to populations of recipients at remote locations in a multicast distribution tree, enabling the architecture to scale up to large networks; further underlying Delay-Tolerant Networking (DTN) Bundle Protocol (BP) advances RAMS protocol data units through the distribution tree using delay-tolerant store-and-forward methods; and further underlying reliable "convergence-layer" protocols ensure successful data transfer over each segment of the end-to-end route. The result is scalable, reliable, delay-tolerant multi-source multicast that is largely self-configuring.
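The multicast-tree scaling idea above can be illustrated with a toy model. This is a sketch of the concept only, not the CCSDS AMS/RAMS protocol; all node names are invented. A gateway forwards each published message once per branch, so traffic scales with tree fan-out rather than with the total number of subscribers.

```python
# Toy distribution tree: each node delivers locally, then forwards
# one copy of the message down each child branch.

class Node:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []
        self.delivered = []

    def publish(self, message):
        self.delivered.append(message)   # local delivery
        for child in self.children:      # one copy per branch
            child.publish(message)

lander = Node("mars-lander")
orbiter = Node("mars-orbiter")
relay = Node("mars-relay", [lander, orbiter])   # gateway on the remote side
earth = Node("earth-gateway", [relay])
earth.publish("telemetry-request")   # reaches every subscriber in the tree
```

The earth-side sender emits a single copy toward the relay regardless of how many subscribers sit behind it, which is the scaling property the RAMS tree provides.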

  1. Fly-by-Wire Systems Enable Safer, More Efficient Flight

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Using the ultra-reliable Apollo Guidance Computer that enabled the Apollo Moon missions, Dryden Flight Research Center engineers, in partnership with industry leaders such as Cambridge, Massachusetts-based Draper Laboratory, demonstrated that digital computers could be used to fly aircraft. Digital fly-by-wire systems have since been incorporated into large airliners, military jets, revolutionary new aircraft, and even cars and submarines.

  2. Thermal Management and Reliability of Automotive Power Electronics and Electric Machines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Narumanchi, Sreekant V; Bennion, Kevin S; Cousineau, Justine E

    Low-cost, high-performance thermal management technologies are helping meet aggressive power density, specific power, cost, and reliability targets for power electronics and electric machines. The National Renewable Energy Laboratory is working closely with numerous industry and research partners to help influence development of components that meet aggressive performance and cost targets through development and characterization of cooling technologies, and thermal characterization and improvements of passive stack materials and interfaces. Thermomechanical reliability and lifetime estimation models are important enablers for industry in cost- and time-effective design.

  3. Studies of Physical Education in the United States Using SOFIT: A Review.

    PubMed

    McKenzie, Thomas L; Smith, Nicole J

    2017-12-01

    An objective database for physical education (PE) is important for policy and practice decisions, and the System for Observing Fitness Instruction Time (SOFIT) has been identified as an appropriate surveillance tool for PE across the nation. The purpose of this review was to assess peer-reviewed studies using SOFIT to study K-12 PE in U.S. schools. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses informed the review, and 10 databases were searched for English-language articles published through 2016. A total of 704 records identifying SOFIT were located, and 137 full texts were read. Two authors reviewed full-text articles, and a data extraction tool was developed to select studies and main topics for synthesis. Twenty-nine studies that included direct observations of 12,256 PE lessons met inclusion criteria; 17 were conducted in elementary schools, 9 in secondary schools, and 3 in combined-level schools. Reporting was inconsistent among studies; for example, not all identified the number of classes and teachers involved. All studies reported student physical activity, but fewer reported observer reliabilities (88%), lesson context (76%), teacher behavior (38%), and PE dosage (34%). The most frequently analyzed independent variables were teacher preparation (48%), lesson location (38%), and student gender (31%). SOFIT can be used reliably in diverse settings. Inconsistent reporting about study procedures and variables analyzed, however, limited comparisons among studies. Adherence to an established protocol and more consistent reporting would more fully enable the development of a viable database for PE in U.S. schools.

  4. Feasibility of Using Neural Network Models to Accelerate the Testing of Mechanical Systems

    NASA Technical Reports Server (NTRS)

    Fusaro, Robert L.

    1998-01-01

    Verification testing is an important aspect of the design process for mechanical mechanisms, and full-scale, full-length life testing is typically used to qualify any new component for use in space. However, as the required life specification is increased, full-length life tests become more costly and lengthen the development time. At the NASA Lewis Research Center, we theorized that neural network systems may be able to model the operation of a mechanical device. If so, the resulting neural network models could simulate long-term mechanical testing with data from a short-term test. This combination of computer modeling and short-term mechanical testing could then be used to verify the reliability of mechanical systems, thereby eliminating the costs associated with long-term testing. Neural network models could also enable designers to predict the performance of mechanisms at the conceptual design stage by entering the critical parameters as input and running the model to predict performance. The purpose of this study was to assess the potential of using neural networks to predict the performance and life of mechanical systems. To do this, we generated a neural network system to model wear obtained from three accelerated testing devices: 1) A pin-on-disk tribometer; 2) A line-contact rub-shoe tribometer; 3) A four-ball tribometer.

  5. 76 FR 34086 - Mandatory Guidelines for Federal Workplace Drug Testing Programs; Request for Information...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-10

    ... standards that require the use of the best available technology for ensuring the full reliability and... available technology for ensuring the full reliability and accuracy of urine drug tests, while reflecting..., cutoffs, specimen validity, collection, collection devices, and testing. II. Solicitation of Comments: As...

  6. A data management system to enable urgent natural disaster computing

    NASA Astrophysics Data System (ADS)

    Leong, Siew Hoon; Kranzlmüller, Dieter; Frank, Anton

    2014-05-01

    Civil protection, in particular natural disaster management, is very important to most nations and civilians in the world. When disasters like flash floods, earthquakes and tsunamis are expected or have taken place, it is of utmost importance to make timely decisions for managing the affected areas and reduce casualties. Computer simulations can generate information and provide predictions to facilitate this decision making process. Getting the data to the required resources is a critical requirement to enable the timely computation of the predictions. An urgent data management system to support natural disaster computing is thus necessary to effectively carry out data activities within a stipulated deadline. Since the trigger of a natural disaster is usually unpredictable, it is not always possible to prepare required resources well in advance. As such, an urgent data management system for natural disaster computing has to be able to work with any type of resources. Additional requirements include the need to manage deadlines and huge volume of data, fault tolerance, reliable, flexibility to changes, ease of usage, etc. The proposed data management platform includes a service manager to provide a uniform and extensible interface for the supported data protocols, a configuration manager to check and retrieve configurations of available resources, a scheduler manager to ensure that the deadlines can be met, a fault tolerance manager to increase the reliability of the platform and a data manager to initiate and perform the data activities. These managers will enable the selection of the most appropriate resource, transfer protocol, etc. such that the hard deadline of an urgent computation can be met for a particular urgent activity, e.g. data staging or computation. We associated 2 types of deadlines [2] with an urgent computing system. 
Soft-firm deadline: missing a soft-firm deadline renders the computation less useful, resulting in a cost that can have severe consequences. Hard deadline: missing a hard deadline renders the computation useless and results in catastrophic consequences. A prototype of this system has a REST-based service manager. The REST-based implementation provides a uniform interface that is easy to use. New and upcoming file transfer protocols can easily be added and accessed via the service manager. The service manager interacts with the other four managers to coordinate the data activities so that the fundamental natural disaster urgent computing requirement, i.e. the deadline, can be fulfilled in a reliable manner. A data activity can include data staging, data archiving and data storing. Reliability is ensured by the choice of a network-of-managers organisation model [1], the configuration manager and the fault tolerance manager. With this proposed design, an easy-to-use, resource-independent data management system that can support and fulfill the computation of a natural disaster prediction within stipulated deadlines can thus be realised. References [1] H. G. Hegering, S. Abeck, and B. Neumair, Integrated management of networked systems - concepts, architectures, and their operational application, Morgan Kaufmann Publishers, 340 Pine Street, Sixth Floor, San Francisco, CA 94104-3205, USA, 1999. [2] H. Kopetz, Real-time systems: design principles for distributed embedded applications, second edition, Springer, 233 Spring Street, New York, NY 10013, USA, 2011. [3] S. H. Leong, A. Frank, and D. Kranzlmüller, Leveraging e-infrastructures for urgent computing, Procedia Computer Science 18 (2013), 2177-2186, 2013 International Conference on Computational Science. [4] N. Trebon, Enabling urgent computing within the existing distributed computing infrastructure, Ph.D. thesis, University of Chicago, August 2011, http://people.cs.uchicago.edu/~ntrebon/docs/dissertation.pdf.
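The deadline-driven selection that the scheduler and configuration managers perform can be sketched roughly as follows. The resource names, bandwidths, setup times, and the scoring rule are invented for illustration; the actual platform negotiates these through its managers.

```python
# Sketch: pick a transfer resource that can move the data before the
# (hard) deadline, preferring the most reliable feasible candidate.

def pick_resource(data_bytes, deadline_s, resources):
    """Return the most reliable resource able to finish in time,
    or None if the hard deadline cannot be met at all."""
    feasible = [r for r in resources
                if data_bytes / r["bandwidth_Bps"] + r["setup_s"] <= deadline_s]
    if not feasible:
        return None  # missing a hard deadline makes the result useless
    return max(feasible, key=lambda r: r["reliability"])

resources = [
    {"name": "gridftp-siteA", "bandwidth_Bps": 50e6, "setup_s": 20, "reliability": 0.99},
    {"name": "http-siteB",    "bandwidth_Bps": 10e6, "setup_s": 2,  "reliability": 0.95},
]
choice = pick_resource(4e9, 120, resources)   # stage 4 GB within 2 minutes
```

With a 120 s deadline only the first candidate finishes in time (80 s transfer plus 20 s setup), so it is chosen despite its longer setup.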

  7. Advanced Sea Base Enabler (ASE) Capstone Design Project

    DTIC Science & Technology

    2009-09-21

    Additionally, a study that examines a potential fleet architecture, which looks at a combination of sea base enabler platforms in order to close current... This change in premise spawned a post-Cold War naval intellectual renaissance, reflected in several Department of the Navy (DON) "white papers"... information collected regarding the various systems is reliable. 3. Primary Areas of Focus Detailed engineering analyses, naval architecture or other

  8. Construction of reliable protein-protein interaction networks with a new interaction generality measure.

    PubMed

    Saito, Rintaro; Suzuki, Harukazu; Hayashizaki, Yoshihide

    2003-04-12

    Recent screening techniques have made large amounts of protein-protein interaction data available, from which biologically important information such as the function of uncharacterized proteins, the existence of novel protein complexes, and novel signal-transduction pathways can be discovered. However, experimental data on protein interactions contain many false positives, making these discoveries difficult. Therefore computational methods of assessing the reliability of each candidate protein-protein interaction are urgently needed. We developed a new 'interaction generality' measure (IG2) to assess the reliability of protein-protein interactions using only the topological properties of their interaction-network structure. Using yeast protein-protein interaction data, we showed that reliable protein-protein interactions had significantly lower IG2 values than less-reliable interactions, suggesting that IG2 values can be used to evaluate and filter interaction data to enable the construction of reliable protein-protein interaction networks.
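As a rough illustration of scoring interaction reliability from network topology alone, one can rank each interaction by how many partners its two proteins share. This is a simplified stand-in for the intuition behind such measures, not the published IG2 formula; the proteins and edges are invented.

```python
# Toy topological support score: interactions embedded in locally
# dense neighborhoods (shared partners) tend to be more reliable
# than isolated ones, which is the intuition such measures exploit.

from collections import defaultdict

def shared_partner_scores(edges):
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    # score each interaction by the number of partners in common
    return {(a, b): len(adj[a] & adj[b]) for a, b in edges}

edges = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "D")]
scores = shared_partner_scores(edges)
# ("A","B") is supported by shared partner "C"; ("A","D") by none
```

Filtering out low-scoring edges would then yield a sparser but more trustworthy network, mirroring the filtering role the abstract describes for IG2.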

  9. NASA Advanced Exploration Systems: Advancements in Life Support Systems

    NASA Technical Reports Server (NTRS)

    Shull, Sarah A.; Schneider, Walter F.

    2016-01-01

    The NASA Advanced Exploration Systems (AES) Life Support Systems (LSS) project strives to develop reliable, energy-efficient, and low-mass spacecraft systems to provide environmental control and life support systems (ECLSS) critical to enabling long duration human missions beyond low Earth orbit (LEO). Highly reliable, closed-loop life support systems are among the capabilities required for the longer duration human space exploration missions assessed by NASA’s Habitability Architecture Team.

  10. Interphase Thermomechanical Reliability and Optimization for High-Performance Ti Metal Laminates

    DTIC Science & Technology

    2011-12-19

    Interphase Thermomechanical Reliability and Optimization for High-Performance Ti Metal Laminates. Grant FA9550-08-1-0015; Program Manager: Dr. Joycelyn Harrison; Report OSR-VA-TR-2012-0202. Abstract: Hybrid laminated composites such as titanium-graphite (TiGr) laminates are an emerging class of structural materials with the potential to enable a new generation of efficient, high-performance

  11. Implementation of a Personnel Reliability Program as a Facilitator of Biosafety and Biosecurity Culture in BSL-3 and BSL-4 Laboratories

    PubMed Central

    Weaver, Patrick; Fitch, J. Patrick; Johnson, Barbara; Pearl, R. Marene

    2013-01-01

    In late 2010, the National Biodefense Analysis and Countermeasures Center (NBACC) implemented a Personnel Reliability Program (PRP) with the goal of enabling active participation by its staff to drive and improve the biosafety and biosecurity culture at the organization. A philosophical keystone for accomplishment of NBACC's scientific mission is simultaneous excellence in operations and outreach. Its personnel reliability program builds on this approach to: (1) enable and support a culture of responsibility based on human performance principles, (2) maintain compliance with regulations, and (3) address the risk associated with the insider threat. Recently, the Code of Federal Regulations (CFR) governing use and possession of biological select agents and toxins (BSAT) was amended to require a pre-access suitability assessment and ongoing evaluation for staff accessing Tier 1 BSAT. These 2 new requirements are in addition to the already required Federal Bureau of Investigation (FBI) Security Risk Assessment (SRA). Two years prior to the release of these guidelines, NBACC developed its PRP to supplement the SRA requirement as a means to empower personnel and foster an operational environment where any and all work with BSAT is conducted in a safe, secure, and reliable manner. PMID:23745523

  12. Extended version of the "Sniffin' Sticks" identification test: test-retest reliability and validity.

    PubMed

    Sorokowska, A; Albrecht, E; Haehner, A; Hummel, T

    2015-03-30

    The extended, 32-item version of the Sniffin' Sticks identification test was developed in order to create a precise tool enabling repeated, longitudinal testing of individual olfactory subfunctions. Odors of the previous test version had to be changed for technical reasons, and the odor identification test needed re-investigation in terms of reliability, validity, and normative values. In our study we investigated the olfactory abilities of a group of 100 patients with olfactory dysfunction and 100 controls. We reconfirmed the high test-retest reliability of the extended version of the Sniffin' Sticks identification test and high correlations between the new and the original part of this tool. In addition, we confirmed the validity of the test as it discriminated clearly between controls and patients with olfactory loss. The additional set of 16 odor identification sticks can be either included in the current olfactory test, thus creating a more detailed diagnosis tool, or it can be used separately, enabling olfactory function to be followed over time. Additionally, the normative values presented in our paper might provide useful guidelines for interpretation of the extended identification test results. The revised version of the Sniffin' Sticks 32-item odor identification test is a reliable and valid tool for the assessment of olfactory function. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 3: HARP Graphics Oriented (GO) input user's guide

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Mittal, Nitin; Koppen, Sandra Howell

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of highly reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical preprocessor Graphics Oriented (GO) program. GO is a graphical user interface for the HARP engine that enables the drawing of reliability/availability models on a monitor. A mouse is used to select fault tree gates or Markov graphical symbols from a menu for drawing.

  14. Autonomous navigation system based on GPS and magnetometer data

    NASA Technical Reports Server (NTRS)

    Julie, Thienel K. (Inventor); Richard, Harman R. (Inventor); Bar-Itzhack, Itzhack Y. (Inventor)

    2004-01-01

    This invention is drawn to an autonomous navigation system using the Global Positioning System (GPS) and magnetometers for low Earth orbit satellites. Because a magnetometer is reliable and always provides information on spacecraft attitude, rate, and orbit, the magnetometer-GPS configuration solves the GPS initialization problem, decreasing the convergence time of the navigation estimate and improving the overall accuracy. Eventually the magnetometer-GPS configuration enables the system to avoid a costly and inherently less reliable gyro for rate estimation. Being autonomous, this invention provides black-box spacecraft navigation, producing attitude, orbit, and rate estimates without any ground input, with high accuracy and reliability.

  15. Cold Climate Foundation Retrofit Experimental Hygrothermal Performance. Cloquet Residential Research Facility Laboratory Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, Louise F.; Harmon, Anna C.

    2015-04-09

    This project was funded jointly by the National Renewable Energy Laboratory (NREL) and Oak Ridge National Laboratory (ORNL). ORNL focused on developing a full basement wall system experimental database to enable others to validate hygrothermal simulation codes. NREL focused on testing the moisture durability of practical basement wall interior insulation retrofit solutions for cold climates. The project has produced a physically credible and reliable long-term hygrothermal performance database for retrofit foundation wall insulation systems in zone 6 and 7 climates that are fully compliant with the performance criteria in the 2009 Minnesota Energy Code. These data currently span the period from November 10, 2012 through May 31, 2014 and are anticipated to be extended through November 2014. The experimental data were configured into a standard format that can be published online and that is compatible with standard commercially available spreadsheet and database software.

  16. Moles: Tool-Assisted Environment Isolation with Closures

    NASA Astrophysics Data System (ADS)

    de Halleux, Jonathan; Tillmann, Nikolai

    Isolating test cases from environment dependencies is often desirable, as it increases test reliability and reduces test execution time. However, code that calls non-virtual methods or consumes sealed classes is often impossible to test in isolation. Moles is a new lightweight framework which addresses this problem. For any .NET method, Moles allows test-code to provide alternative implementations, given as .NET delegates, for which C# provides very concise syntax while capturing local variables in a closure object. Using code instrumentation, the Moles framework will redirect calls to provided delegates instead of the original methods. The Moles framework is designed to work together with the dynamic symbolic execution tool Pex to enable automated test generation. In a case study, testing code programmed against the Microsoft SharePoint Foundation API, we achieved full code coverage while running tests in isolation without an actual SharePoint server. The Moles framework integrates with .NET and Visual Studio.
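Moles itself is a .NET framework; the detour idea can be mimicked in Python for illustration. All names below are invented: the test swaps an environment-dependent call for a closure over local test state, analogous to assigning a delegate to a mole type.

```python
# Python analogue of a Moles detour: redirect a call the code under
# test cannot avoid (here, the wall clock) to a test-supplied closure,
# so the test runs deterministically and in isolation.

import time

class Clock:
    @staticmethod
    def now_hour():
        return time.localtime().tm_hour   # environment dependency

def is_business_hours():
    return 9 <= Clock.now_hour() < 17

# Detour: the alternative implementation is a closure over test state,
# much like the delegate a Moles-based test would assign.
fake_hour = 10
Clock.now_hour = staticmethod(lambda: fake_hour)
open_now = is_business_hours()   # True regardless of the real time
```

The same pattern extends to file systems, network services, or (in the SharePoint case study above) an entire server API, which is what makes isolated unit testing possible.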

  17. A Space Station robot walker and its shared control software

    NASA Technical Reports Server (NTRS)

    Xu, Yangsheng; Brown, Ben; Aoki, Shigeru; Yoshida, Tetsuji

    1994-01-01

    In this paper, we first briefly overview the updated self-mobile space manipulator (SMSM) configuration and testbed. The new robot is capable of positioning cameras anywhere on the interior or exterior of Space Station Freedom (SSF), and will be an ideal tool for inspecting connectors, structures, and other facilities on SSF. Experiments have been performed under two gravity compensation systems with a full-scale model of a segment of SSF. This paper presents a real-time shared control architecture that enables the robot to coordinate autonomous locomotion and teleoperation input for reliable walking on SSF. Autonomous locomotion can be executed based on a CAD model and off-line trajectory planning, or can be guided by a vision system with neural network identification. Teleoperation control can be specified by a real-time graphical interface and a free-flying hand controller. SMSM will be a valuable assistant for astronauts in inspection and other EVA missions.

  18. A Comparative Study of Power Supply Architectures In Wireless Electric Vehicle Charging Systems

    NASA Astrophysics Data System (ADS)

    Esteban, Bryan

    Wireless inductive power transfer is a transformational and disruptive technology that enables the reliable and efficient transfer of electrical power over large air gaps for a host of unique applications. One such application that is now gaining much momentum worldwide is the wireless charging of electric vehicles (EVs). This thesis examines two of the primary power supply topologies predominantly used for EV charging, namely the SLC and the LCL resonant full-bridge inverter topologies. The study of both topologies is presented in the context of designing a 3 kW, primary-side-controlled wireless EV charger with a nominal centre frequency of 30 kHz and a coupling range in the neighborhood of 0.18-0.26. A comparison of both topologies is made in terms of their complexity, cost, efficiency, and power quality. The aim of the study is to determine which topology is better for wireless EV charging.

  19. Practical Applications of Cables and Ropes in the ISS Countermeasures System

    NASA Technical Reports Server (NTRS)

    Moore, Cherice; Svetlik, Randall; Williams, Antony

    2017-01-01

    As spaceflight durations have increased over the last four decades, the effects of weightlessness on the human body are far better understood, as are the countermeasures. A combination of aerobic and resistive exercise devices contribute to countering the losses in muscle strength, aerobic fitness, and bone strength of today's astronauts and cosmonauts that occur during their missions on the International Space Station. Creation of these systems has been a dynamically educational experience for designers and engineers. The ropes and cables in particular have experienced a wide range of challenges, providing a full set of lessons learned that have already enabled improvements in on-orbit reliability by initiating system design improvements. This paper examines the on-orbit experience of ropes and cables in several exercise devices and discusses the lessons learned from these hardware items, with the goal of informing future system design.

  20. DPSSL and FL pumps based on 980-nm telecom pump laser technology: changing the industry

    NASA Astrophysics Data System (ADS)

    Lichtenstein, Norbert; Schmidt, Berthold E.; Fily, Arnaud; Weiss, Stefan; Arlt, Sebastian; Pawlik, Susanne; Sverdlov, Boris; Muller, Jurgen; Harder, Christoph S.

    2004-06-01

    Diode-pumped solid-state lasers (DPSSL) and fiber lasers (FL) are expected to become the dominant very-high-power laser systems in the industrial environment. Today, ranging from 100 W to 5-10 kW in light output power, their fields of application spread from biomedical and sensing to material processing. A key driver for the wide adoption of such systems is a competitive ratio of cost, performance, and reliability. High-power, highly reliable broad-area laser diodes and laser diode bars with excellent performance at the relevant wavelengths can further optimize this ratio. In this communication we show that this can be achieved by leveraging the tremendous improvements in reliability and performance, together with the high-volume, low-cost manufacturing capacity, established during the "telecom bubble." From today's generations of 980-nm narrow-stripe laser diodes, 1.8 W of maximum CW output power can be obtained while fulfilling stringent telecom reliability requirements at operating conditions. Single-emitter broad-area lasers deliver in excess of 11 W CW, while from similar 940-nm laser bars more than 160 W output power (CW) can be obtained at 200 A. In addition, introducing telecom-grade AuSn-solder mounting technology on expansion-matched subassemblies enables excellent reliability performance. Degradation rates of less than 1% over 1000 h at 60 A are observed for both 808-nm and 940-nm laser bars, even under harsh intermittent operation conditions.

  1. Development of a Brief Questionnaire to Assess Contraceptive Intent

    PubMed Central

    Raine-Bennett, Tina R; Rocca, Corinne H

    2015-01-01

    Objective: We sought to develop and validate an instrument that can enable providers to identify young women who may be at risk of contraceptive non-adherence. Methods: Item response theory-based methods were used to evaluate the psychometric properties of the Contraceptive Intent Questionnaire, a 15-item self-administered questionnaire, based on theory and prior qualitative and quantitative research. The questionnaire was administered to 200 women aged 15-24 years who were initiating contraceptives. We assessed item fit to the item response model, internal consistency, internal structure validity, and differential item functioning. Results: All items fit a one-dimensional model. The separation reliability coefficient was 0.73. Participants' overall scores covered the full range of the scale (0-15), and items appropriately matched the range of participants' contraceptive intent. Items met the criteria for internal structure validity and most items functioned similarly between groups of women. Conclusion: The Contraceptive Intent Questionnaire appears to be a reliable and valid tool. Future testing is needed to assess predictive ability and clinical utility. Practice Implications: The Contraceptive Intent Questionnaire may serve as a valid tool to help providers identify women who may have problems with contraceptive adherence, as well as to pinpoint areas in which counseling may be directed. PMID:26104994

  2. Development of a brief questionnaire to assess contraceptive intent.

    PubMed

    Raine-Bennett, Tina R; Rocca, Corinne H

    2015-11-01

    We sought to develop and validate an instrument that can enable providers to identify young women who may be at risk of contraceptive non-adherence. Item response theory based methods were used to evaluate the psychometric properties of the Contraceptive Intent Questionnaire, a 15-item self-administered questionnaire, based on theory and prior qualitative and quantitative research. The questionnaire was administered to 200 women aged 15-24 years who were initiating contraceptives. We assessed item fit to the item response model, internal consistency, internal structure validity, and differential item functioning. All items fit a one-dimensional model. The separation reliability coefficient was 0.73. Participants' overall scores covered the full range of the scale (0-15), and items appropriately matched the range of participants' contraceptive intent. Items met the criteria for internal structure validity and most items functioned similarly between groups of women. The Contraceptive Intent Questionnaire appears to be a reliable and valid tool. Future testing is needed to assess predictive ability and clinical utility. The Contraceptive Intent Questionnaire may serve as a valid tool to help providers identify women who may have problems with contraceptive adherence, as well as to pinpoint areas in which counseling may be directed. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  3. Properties and Performance Attributes of Novel Co-Extruded Polyolefin Battery Separator Materials. Part 1; Mechanical Properties

    NASA Technical Reports Server (NTRS)

    Baldwin, Richard S.; Guzik, Monica; Skierski, Michael

    2011-01-01

    As NASA prepares for its next era of manned spaceflight missions, advanced energy storage technologies are being developed and evaluated to address future mission needs and technical requirements and to provide new mission-enabling technologies. Cell-level components for advanced lithium-ion batteries possessing higher energy, more reliable performance and enhanced, inherent safety characteristics are actively under development within the NASA infrastructure. A key component for safe and reliable cell performance is the cell separator, which separates the two energetic electrodes and functions to prevent the occurrence of an internal short-circuit while enabling ionic transport. Recently, a new generation of co-extruded separator films has been developed by ExxonMobil Chemical and introduced into their battery separator product portfolio. Several grades of this new separator material have been evaluated with respect to dynamic mechanical properties and safety-related performance attributes. This paper presents the results of these evaluations in comparison to a current state-ofthe-practice separator material. The results are discussed with respect to potential opportunities to enhance the inherent safety characteristics and reliability of future, advanced lithium-ion cell chemistries.

  4. Reliability of lower limb alignment measures using an established landmark-based method with a customized computer software program

    PubMed Central

    Sled, Elizabeth A.; Sheehy, Lisa M.; Felson, David T.; Costigan, Patrick A.; Lam, Miu; Cooke, T. Derek V.

    2010-01-01

    The objective of the study was to evaluate the reliability of frontal plane lower limb alignment measures using a landmark-based method by (1) comparing inter- and intra-reader reliability between measurements of alignment obtained manually with those using a computer program, and (2) determining inter- and intra-reader reliability of computer-assisted alignment measures from full-limb radiographs. An established method for measuring alignment was used, involving selection of 10 femoral and tibial bone landmarks. 1) To compare manual and computer methods, we used digital images and matching paper copies of five alignment patterns simulating healthy and malaligned limbs drawn using AutoCAD. Seven readers were trained in each system. Paper copies were measured manually and repeat measurements were performed daily for 3 days, followed by a similar routine with the digital images using the computer. 2) To examine the reliability of computer-assisted measures from full-limb radiographs, 100 images (200 limbs) were selected as a random sample from 1,500 full-limb digital radiographs which were part of the Multicenter Osteoarthritis (MOST) Study. Three trained readers used the software program to measure alignment twice from the batch of 100 images, with two or more weeks between batch handling. Manual and computer measures of alignment showed excellent agreement (intraclass correlations [ICCs] 0.977 – 0.999 for computer analysis; 0.820 – 0.995 for manual measures). The computer program applied to full-limb radiographs produced alignment measurements with high inter- and intra-reader reliability (ICCs 0.839 – 0.998). In conclusion, alignment measures using a bone landmark-based approach and a computer program were highly reliable between multiple readers. PMID:19882339
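Agreement statistics like the ICCs reported above can be computed directly from a subjects-by-readers rating matrix. Below is a minimal sketch of a two-way random-effects intraclass correlation, ICC(2,1); the ratings are invented for illustration and are not data from this study.

```python
import numpy as np

# Sketch of a two-way random-effects intraclass correlation, ICC(2,1),
# the kind of agreement statistic reported above. `ratings` is an
# (n subjects x k readers) array; the data below are invented.

def icc2_1(ratings):
    n, k = ratings.shape
    grand = ratings.mean()
    ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()
    ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()
    ss_err = ((ratings - grand) ** 2).sum() - ss_rows - ss_cols
    ms_r = ss_rows / (n - 1)                 # between-subjects
    ms_c = ss_cols / (k - 1)                 # between-readers
    ms_e = ss_err / ((n - 1) * (k - 1))      # residual
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

perfect = np.array([[90.0, 90.0], [100.0, 100.0], [110.0, 110.0]])
noisy = perfect + np.array([[0.0, 1.0], [1.0, -1.0], [-1.0, 0.0]])
print(icc2_1(perfect))           # 1.0
print(round(icc2_1(noisy), 3))   # 0.989
```

Perfect agreement between readers yields an ICC of exactly 1; small reader disagreements pull the value below 1, which is the behavior behind ranges such as 0.839-0.998 above.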

  5. ICAROUS - Integrated Configurable Algorithms for Reliable Operations Of Unmanned Systems

    NASA Technical Reports Server (NTRS)

    Consiglio, María; Muñoz, César; Hagen, George; Narkawicz, Anthony; Balachandran, Swee

    2016-01-01

    NASA's Unmanned Aerial System (UAS) Traffic Management (UTM) project aims at enabling near-term, safe operations of small UAS vehicles in uncontrolled airspace, i.e., Class G airspace. A far-term goal of UTM research and development is to accommodate the expected rise in small UAS traffic density throughout the National Airspace System (NAS) at low altitudes for beyond visual line-of-sight operations. This paper describes a new capability referred to as ICAROUS (Integrated Configurable Algorithms for Reliable Operations of Unmanned Systems), which is being developed under the UTM project. ICAROUS is a software architecture comprised of highly assured algorithms for building safety-centric, autonomous, unmanned aircraft applications. Central to the development of the ICAROUS algorithms is the use of well-established formal methods to guarantee higher levels of safety assurance by monitoring and bounding the behavior of autonomous systems. The core autonomy-enabling capabilities in ICAROUS include constraint conformance monitoring and contingency control functions. ICAROUS also provides a highly configurable user interface that enables the modular integration of mission-specific software components.
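The constraint conformance monitoring described above can be illustrated with a toy keep-in geofence check. This is a generic point-in-polygon sketch, not the formally verified ICAROUS algorithms; the fence and track coordinates are invented.

```python
# Illustrative sketch of constraint conformance monitoring (not the
# formally verified ICAROUS code): flag position fixes that leave a
# keep-in geofence, using a ray-casting point-in-polygon test.
# Fence and track coordinates are invented.

def inside(point, polygon):
    """Even-odd ray-casting test: cast a ray toward +x, count crossings."""
    x, y = point
    crossings = 0
    n = len(polygon)
    for i in range(n):
        (x1, y1), (x2, y2) = polygon[i], polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):                      # edge spans the ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                crossings += 1
    return crossings % 2 == 1

fence = [(0, 0), (10, 0), (10, 10), (0, 10)]   # square keep-in zone
track = [(5, 5), (9, 9), (11, 9)]              # successive position fixes
alerts = [p for p in track if not inside(p, fence)]
print(alerts)  # [(11, 9)]
```

A real monitor would of course also bound the response (e.g. trigger a contingency maneuver); the sketch only shows the detection step.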

  6. Diverse Redundant Systems for Reliable Space Life Support

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2015-01-01

    Reliable life support systems are required for deep space missions. The probability of a fatal life support failure should be less than one in a thousand in a multi-year mission. It is far too expensive to develop a single system with such high reliability. Using three redundant units would require only that each have a failure probability of one in ten over the mission. Since system development cost scales inversely with failure probability, this would cut cost by a factor of one hundred. Using replaceable subsystems instead of full systems would further cut cost. Using full sets of replaceable components improves reliability more than using complete systems as spares, since a set of components could repair many different failures instead of just one. Replaceable components would require more tools, space, and planning than full systems or replaceable subsystems. However, identical system redundancy cannot be relied on in practice. Common cause failures can disable all the identical redundant systems. Typical levels of common cause failures will defeat redundancy greater than two. Diverse redundant systems are required for reliable space life support. Three, four, or five diverse redundant systems could be needed for sufficient reliability. One system with lower level repair could be substituted for two diverse systems to save cost.
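The redundancy arithmetic above, and the way common-cause failures defeat it, can be sketched numerically. The beta-factor common-cause model and the 10% beta value below are illustrative assumptions, not figures from the paper.

```python
# Sketch of the redundancy arithmetic above, plus a simple beta-factor
# common-cause model. The 10% beta fraction is an illustrative
# assumption, not a figure from the paper.

def independent_failure(p_unit, n_units):
    """System fails only if all n identical, independent units fail."""
    return p_unit ** n_units

def with_common_cause(p_unit, n_units, beta):
    """Beta-factor model: a fraction `beta` of unit failures strikes
    every redundant copy at once; the remainder fail independently."""
    return beta * p_unit + ((1 - beta) * p_unit) ** n_units

# Three units at p = 0.1 each: one-in-a-thousand loss if independent.
print(round(independent_failure(0.1, 3), 6))      # 0.001
# A 10% common-cause fraction largely defeats the triple redundancy.
print(round(with_common_cause(0.1, 3, 0.10), 6))  # 0.010729
```

With any appreciable common-cause fraction the system failure probability is dominated by the `beta * p` term, which is why diverse rather than identical redundancy is argued for above.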

  7. Rapid Mapping of Lithiation Dynamics in Transition Metal Oxide Particles with Operando X-ray Absorption Spectroscopy.

    PubMed

    Nowack, Lea; Grolimund, Daniel; Samson, Vallerie; Marone, Federica; Wood, Vanessa

    2016-02-24

    Since the commercialization of lithium ion batteries (LIBs), layered transition metal oxides (LiMO2, where M = Co, Mn, Ni, or mixtures thereof) have been materials of choice for LIB cathodes. During cycling, the transition metals change their oxidation states, an effect that can be tracked by detecting energy shifts in the X-ray absorption near edge structure (XANES) spectrum. X-ray absorption spectroscopy (XAS) can therefore be used to visualize and quantify lithiation kinetics in transition metal oxide cathodes; however, in-situ measurements are often constrained by temporal resolution and X-ray dose, necessitating compromises in the electrochemistry cycling conditions used or the materials examined. We report a combined approach to reduce measurement time and X-ray exposure for operando XAS studies of lithium ion batteries. A highly discretized energy resolution coupled with advanced post-processing enables rapid yet reliable identification of the oxidation state. A full-field microscopy setup provides sub-particle resolution over a large area of battery electrode, enabling the oxidation state within many transition metal oxide particles to be tracked simultaneously. Here, we apply this approach to gain insights into the lithiation kinetics of a commercial, mixed-metal oxide cathode material, nickel cobalt aluminium oxide (NCA), during (dis)charge and its degradation during overcharge.
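The edge-shift post-processing idea can be sketched in a few lines: locate the absorption edge as the half-height crossing of a normalized spectrum, then interpolate an oxidation state between reference edge energies. The reference energies and the synthetic spectrum below are placeholders, not values from the paper.

```python
import numpy as np

# Toy sketch of XANES edge-shift post-processing: find the half-height
# crossing of a normalized spectrum, then interpolate an oxidation
# state between two reference edges. Reference energies and the
# synthetic spectrum are placeholders, not values from the paper.

def edge_position(energy, mu_norm, level=0.5):
    """Energy at which the normalized absorption first crosses `level`."""
    idx = int(np.argmax(mu_norm >= level))
    e0, e1 = energy[idx - 1], energy[idx]
    m0, m1 = mu_norm[idx - 1], mu_norm[idx]
    return e0 + (level - m0) * (e1 - e0) / (m1 - m0)

def oxidation_state(edge_ev, refs):
    """Linear interpolation between (state, edge energy) references."""
    (s0, e0), (s1, e1) = sorted(refs)
    return s0 + (edge_ev - e0) * (s1 - s0) / (e1 - e0)

energy = np.linspace(8320.0, 8350.0, 301)       # 0.1 eV grid
mu = 1.0 / (1.0 + np.exp(-(energy - 8334.0)))   # synthetic edge at 8334 eV
refs = [(2.0, 8333.0), (3.0, 8335.5)]           # placeholder references
edge = edge_position(energy, mu)
print(round(edge, 2))                           # 8334.0
print(round(oxidation_state(edge, refs), 2))    # 2.4
```

Applied per pixel of a full-field image stack, this kind of cheap estimator is what makes tracking many particles simultaneously tractable.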

  8. Rapid Mapping of Lithiation Dynamics in Transition Metal Oxide Particles with Operando X-ray Absorption Spectroscopy

    PubMed Central

    Nowack, Lea; Grolimund, Daniel; Samson, Vallerie; Marone, Federica; Wood, Vanessa

    2016-01-01

    Since the commercialization of lithium ion batteries (LIBs), layered transition metal oxides (LiMO2, where M = Co, Mn, Ni, or mixtures thereof) have been materials of choice for LIB cathodes. During cycling, the transition metals change their oxidation states, an effect that can be tracked by detecting energy shifts in the X-ray absorption near edge structure (XANES) spectrum. X-ray absorption spectroscopy (XAS) can therefore be used to visualize and quantify lithiation kinetics in transition metal oxide cathodes; however, in-situ measurements are often constrained by temporal resolution and X-ray dose, necessitating compromises in the electrochemistry cycling conditions used or the materials examined. We report a combined approach to reduce measurement time and X-ray exposure for operando XAS studies of lithium ion batteries. A highly discretized energy resolution coupled with advanced post-processing enables rapid yet reliable identification of the oxidation state. A full-field microscopy setup provides sub-particle resolution over a large area of battery electrode, enabling the oxidation state within many transition metal oxide particles to be tracked simultaneously. Here, we apply this approach to gain insights into the lithiation kinetics of a commercial, mixed-metal oxide cathode material, nickel cobalt aluminium oxide (NCA), during (dis)charge and its degradation during overcharge. PMID:26908198

  9. Rapid Mapping of Lithiation Dynamics in Transition Metal Oxide Particles with Operando X-ray Absorption Spectroscopy

    NASA Astrophysics Data System (ADS)

    Nowack, Lea; Grolimund, Daniel; Samson, Vallerie; Marone, Federica; Wood, Vanessa

    2016-02-01

    Since the commercialization of lithium ion batteries (LIBs), layered transition metal oxides (LiMO2, where M = Co, Mn, Ni, or mixtures thereof) have been materials of choice for LIB cathodes. During cycling, the transition metals change their oxidation states, an effect that can be tracked by detecting energy shifts in the X-ray absorption near edge structure (XANES) spectrum. X-ray absorption spectroscopy (XAS) can therefore be used to visualize and quantify lithiation kinetics in transition metal oxide cathodes; however, in-situ measurements are often constrained by temporal resolution and X-ray dose, necessitating compromises in the electrochemistry cycling conditions used or the materials examined. We report a combined approach to reduce measurement time and X-ray exposure for operando XAS studies of lithium ion batteries. A highly discretized energy resolution coupled with advanced post-processing enables rapid yet reliable identification of the oxidation state. A full-field microscopy setup provides sub-particle resolution over a large area of battery electrode, enabling the oxidation state within many transition metal oxide particles to be tracked simultaneously. Here, we apply this approach to gain insights into the lithiation kinetics of a commercial, mixed-metal oxide cathode material, nickel cobalt aluminium oxide (NCA), during (dis)charge and its degradation during overcharge.

  10. Lessons Learned from the Development and Implementation of the Atmosphere Resource Recovery and Environmental Monitoring Project

    NASA Technical Reports Server (NTRS)

    Roman, Monsi C.; Perry, Jay L.; Howard, David F.

    2014-01-01

    The Advanced Exploration Systems (AES) Program's Atmosphere Resource Recovery and Environmental Monitoring (ARREM) Project has been developing atmosphere revitalization and environmental monitoring subsystem architectures suitable for enabling sustained crewed exploration missions beyond low Earth orbit (LEO). Using the International Space Station state-of-the-art (SOA) as the technical basis, the ARREM Project has contributed to technical advances that improve affordability, reliability, and functional efficiency while reducing dependence on a ground-based logistics resupply model. Functional demonstrations have merged new process technologies and concepts with existing ISS developmental hardware and operated them in a controlled environment simulating various crew metabolic loads. The ARREM Project's strengths include access to a full complement of existing developmental hardware that performs all the core atmosphere revitalization functions, unique testing facilities to evaluate subsystem performance, and a coordinated partnering effort among six NASA field centers and industry partners to provide the innovative expertise necessary to succeed. A project overview is provided, and the project management strategies that have enabled a multidisciplinary engineering team to work efficiently across project, NASA field center, and industry boundaries to achieve the project's technical goals are discussed. Lessons learned and best practices relating to the project are presented and discussed.

  11. Synthetic spike-in standards for high-throughput 16S rRNA gene amplicon sequencing

    PubMed Central

    Tourlousse, Dieter M.; Yoshiike, Satowa; Ohashi, Akiko; Matsukura, Satoko; Noda, Naohiro

    2017-01-01

    High-throughput sequencing of 16S rRNA gene amplicons (16S-seq) has become a widely deployed method for profiling complex microbial communities but technical pitfalls related to data reliability and quantification remain to be fully addressed. In this work, we have developed and implemented a set of synthetic 16S rRNA genes to serve as universal spike-in standards for 16S-seq experiments. The spike-ins represent full-length 16S rRNA genes containing artificial variable regions with negligible identity to known nucleotide sequences, permitting unambiguous identification of spike-in sequences in 16S-seq read data from any microbiome sample. Using defined mock communities and environmental microbiota, we characterized the performance of the spike-in standards and demonstrated their utility for evaluating data quality on a per-sample basis. Further, we showed that staggered spike-in mixtures added at the point of DNA extraction enable concurrent estimation of absolute microbial abundances suitable for comparative analysis. Results also underscored that template-specific Illumina sequencing artifacts may lead to biases in the perceived abundance of certain taxa. Taken together, the spike-in standards represent a novel bioanalytical tool that can substantially improve 16S-seq-based microbiome studies by enabling comprehensive quality control along with absolute quantification. PMID:27980100
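The absolute-quantification idea reduces to a simple scaling: the known number of spiked copies calibrates reads per copy, which converts taxon read counts into absolute abundances. A minimal sketch with invented numbers (not the paper's protocol details):

```python
# Minimal sketch of spike-in-based absolute quantification: a known
# number of synthetic 16S copies added at DNA extraction calibrates
# reads per copy, so taxon read counts convert to absolute abundances.
# All numbers are invented.

def absolute_abundance(taxon_reads, spikein_reads, spikein_copies_added):
    """Estimated copies of the taxon in the sample."""
    copies_per_read = spikein_copies_added / spikein_reads
    return taxon_reads * copies_per_read

# 50,000 taxon reads; 5,000 spike-in reads from 1e6 spiked copies.
print(absolute_abundance(50_000, 5_000, 1e6))   # 10000000.0
```

Because the spike-in passes through extraction, PCR, and sequencing alongside the sample, the scaling factor absorbs sample-to-sample yield differences that relative abundances cannot.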

  12. Enabling the environmentally clean air transportation of the future: a vision of computational fluid dynamics in 2030

    PubMed Central

    Slotnick, Jeffrey P.; Khodadoust, Abdollah; Alonso, Juan J.; Darmofal, David L.; Gropp, William D.; Lurie, Elizabeth A.; Mavriplis, Dimitri J.; Venkatakrishnan, Venkat

    2014-01-01

    As global air travel expands rapidly to meet demand generated by economic growth, it is essential to continue to improve the efficiency of air transportation to reduce its carbon emissions and address concerns about climate change. Future transports must be ‘cleaner’ and designed to include technologies that will continue to lower engine emissions and reduce community noise. The use of computational fluid dynamics (CFD) will be critical to enable the design of these new concepts. In general, the ability to simulate aerodynamic and reactive flows using CFD has progressed rapidly during the past several decades and has fundamentally changed the aerospace design process. Advanced simulation capabilities not only enable reductions in ground-based and flight-testing requirements, but also provide added physical insight, and enable superior designs at reduced cost and risk. In spite of considerable success, reliable use of CFD has remained confined to a small region of the operating envelope due, in part, to the inability of current methods to reliably predict turbulent, separated flows. Fortunately, the advent of much more powerful computing platforms provides an opportunity to overcome a number of these challenges. This paper summarizes the findings and recommendations from a recent NASA-funded study that provides a vision for CFD in the year 2030, including an assessment of critical technology gaps and needed development, and identifies the key CFD technology advancements that will enable the design and development of much cleaner aircraft in the future. PMID:25024413

  13. 76 FR 45804 - Agency Information Collection Request; 60-Day Public Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ... an algorithm that enables reliable prediction of a certain event. A responder could submit the correct algorithm, but without the methodology, the evaluation process could not be adequately performed...

  14. Fully-Enclosed Ceramic Micro-burners Using Fugitive Phase and Powder-based Processing

    NASA Astrophysics Data System (ADS)

    Do, Truong; Shin, Changseop; Kwon, Patrick; Yeom, Junghoon

    2016-08-01

    Ceramic-based microchemical systems (μCSs) are more suitable for operation under harsh environments such as high temperature and corrosive reactants compared to the more conventional μCS materials such as silicon and polymers. With the recent renewed interest in chemical manufacturing and process intensification, simple, inexpensive, and reliable ceramic manufacturing technologies are needed. The main objective of this paper is to introduce a new powder-based fabrication framework, which is a one-pot, cost-effective, and versatile process for ceramic μCS components. The proposed approach employs the compaction of metal-oxide sub-micron powders with a graphite fugitive phase that is burned out to create internal cavities and microchannels before full sintering. Pure alumina powder has been used without any binder phase, enabling more precise dimensional control and less structure shrinkage upon sintering. The key process steps such as powder compaction, graphite burnout during partial sintering, machining in a conventional machine tool, and final densification have been studied to characterize the process. This near-full-density ceramic structure with the combustion chamber and various internal channels was fabricated to be used as a micro-burner for gas sensing applications.

  15. Fully-Enclosed Ceramic Micro-burners Using Fugitive Phase and Powder-based Processing

    PubMed Central

    Do, Truong; Shin, Changseop; Kwon, Patrick; Yeom, Junghoon

    2016-01-01

    Ceramic-based microchemical systems (μCSs) are more suitable for operation under harsh environments such as high temperature and corrosive reactants compared to the more conventional μCS materials such as silicon and polymers. With the recent renewed interest in chemical manufacturing and process intensification, simple, inexpensive, and reliable ceramic manufacturing technologies are needed. The main objective of this paper is to introduce a new powder-based fabrication framework, which is a one-pot, cost-effective, and versatile process for ceramic μCS components. The proposed approach employs the compaction of metal-oxide sub-micron powders with a graphite fugitive phase that is burned out to create internal cavities and microchannels before full sintering. Pure alumina powder has been used without any binder phase, enabling more precise dimensional control and less structure shrinkage upon sintering. The key process steps such as powder compaction, graphite burnout during partial sintering, machining in a conventional machine tool, and final densification have been studied to characterize the process. This near-full-density ceramic structure with the combustion chamber and various internal channels was fabricated to be used as a micro-burner for gas sensing applications. PMID:27546059

  16. Comparison of weighting techniques for acoustic full waveform inversion

    NASA Astrophysics Data System (ADS)

    Jeong, Gangwon; Hwang, Jongha; Min, Dong-Joo

    2017-12-01

    To reconstruct long-wavelength structures in full waveform inversion (FWI), wavefield-damping and weighting techniques have been used to synthesize and emphasize low-frequency data components in frequency-domain FWI. However, these methods have some weak points: the application of the wavefield-damping method to filtered data fails to synthesize reliable low-frequency data, and the optimization formula obtained by introducing the weighting technique is not theoretically complete, because it is not directly derived from the objective function. In this study, we address these weak points and show how to overcome them. We demonstrate that source estimation in FWI using damped wavefields fails when the data used in the FWI process do not satisfy the causality condition, which occurs when a non-causal filter is applied to the data. We overcome this limitation by designing a causal filter. We also modify the conventional weighting technique so that its optimization formula is directly derived from the objective function, retaining its original characteristic of emphasizing the low-frequency data components. Numerical results show that the newly designed causal filter enables the recovery of long-wavelength structures using low-frequency data components synthesized by damping wavefields in frequency-domain FWI, and that the proposed weighting technique enhances the inversion results.

  17. Robust mode space approach for atomistic modeling of realistically large nanowire transistors

    NASA Astrophysics Data System (ADS)

    Huang, Jun Z.; Ilatikhameneh, Hesameddin; Povolotskyi, Michael; Klimeck, Gerhard

    2018-01-01

    Nanoelectronic transistors have reached 3D length scales in which the number of atoms is countable. Truly atomistic device representations are needed to capture the essential functionalities of the devices. Atomistic quantum transport simulations of realistically extended devices are, however, computationally very demanding. The widely used mode space (MS) approach can significantly reduce the numerical cost, but a good MS basis is usually very hard to obtain for atomistic full-band models. In this work, a robust and parallel algorithm is developed to optimize the MS basis for atomistic nanowires. This enables engineering-level, reliable tight binding non-equilibrium Green's function simulation of nanowire metal-oxide-semiconductor field-effect transistor (MOSFET) with a realistic cross section of 10 nm × 10 nm using a small computer cluster. This approach is applied to compare the performance of InGaAs and Si nanowire n-type MOSFETs (nMOSFETs) with various channel lengths and cross sections. Simulation results with full-band accuracy indicate that InGaAs nanowire nMOSFETs have no drive current advantage over their Si counterparts for cross sections up to about 10 nm × 10 nm.

  18. Cervical collagen imaging for determining preterm labor risks using a colposcope with full Mueller matrix capability

    NASA Astrophysics Data System (ADS)

    Stoff, Susan; Chue-Sang, Joseph; Holness, Nola A.; Gandjbakhche, Amir; Chernomordik, Viktor; Ramella-Roman, Jessica

    2016-02-01

    Preterm birth is a worldwide health issue and the number one cause of infant mortality and neurological disorders. Although it affects nearly 10% of all births, an accurate, reliable diagnostic method for preterm birth has yet to be developed. The primary constituent of the cervix, collagen, provides the structural support and mechanical strength to maintain cervical closure, through specific organization, during fetal gestation. As pregnancy progresses, the cervical collagen becomes disorganized to allow eventual cervical pliability so the baby can be birthed through the cervical opening. This disorganization of collagen affects the mechanical properties of the cervix and, if the changes occur prematurely, may be a significant factor leading to preterm birth. The organization of collagen can be analyzed through Mueller matrix polarimetric imaging of the characteristic birefringence of collagen. In this research, we have built a full Mueller matrix polarimetry attachment to a standard colposcope to enable imaging of human cervixes during standard prenatal exams at various stages of fetal gestation. Analysis of the polarimetric images provides information on the quantity and organization of cervical collagen at specific gestational stages of pregnancy. This quantitative information may provide an indication of risk of preterm birth.

  19. A 100-Year Review: Cheese production and quality.

    PubMed

    Johnson, M E

    2017-12-01

    In the beginning, cheese making in the United States was all art, but embracing science and technology was necessary to make progress in producing a higher quality cheese. Traditional cheese making could not keep up with the demand for cheese, and the development of the factory system was necessary. Cheese quality suffered because of poor-quality milk, but 3 major innovations changed that: refrigeration, commercial starters, and the use of pasteurized milk for cheese making. Although by all accounts cold storage improved cheese quality, it was the improvement of milk quality, pasteurization of milk, and the use of reliable cultures for fermentation that had the biggest effect. Together with use of purified commercial cultures, pasteurization enabled cheese production to be conducted on a fixed time schedule. Fundamental research on the genetics of starter bacteria greatly increased the reliability of fermentation, which in turn made automation feasible. Demand for functionality, machinability, application in baking, and more emphasis on nutritional aspects (low fat and low sodium) of cheese took us back to the fundamental principles of cheese making and resulted in renewed vigor for scientific investigations into the chemical, microbiological, and enzymatic changes that occur during cheese making and ripening. As milk production increased, cheese factories needed to become more efficient. Membrane concentration and separation of milk offered a solution and greatly enhanced plant capacity. Full implementation of membrane processing and use of its full potential have yet to be achieved. Implementation of new technologies, the science of cheese making, and the development of further advances will require highly trained personnel at both the academic and industrial levels. This will be a great challenge to address and overcome. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  20. Power Electronics Thermal Management Research: Annual Progress Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreno, Gilberto

    The objective of this project is to develop thermal management strategies to enable efficient and high-temperature wide-bandgap (WBG)-based power electronic systems (e.g., emerging inverters and DC-DC converters). Reliable WBG devices are capable of operating at elevated temperatures (≥ 175 °C). However, packaging WBG devices within an automotive inverter and operating them at higher junction temperatures will expose other system components (e.g., capacitors and electrical boards) to temperatures that may exceed their safe operating limits. This creates challenges for thermal management and reliability. In this project, system-level thermal analyses are conducted to determine the effect of elevated device temperatures on inverter components. Thermal modeling work is then conducted to evaluate various thermal management strategies that will enable the use of highly efficient WBG devices with automotive power electronic systems.

  1. Reliability assessment and improvement for a fast corrector power supply in TPS

    NASA Astrophysics Data System (ADS)

    Liu, Kuo-Bin; Liu, Chen-Yao; Wang, Bao-Sheng; Wong, Yong Seng

    2018-07-01

    A Fast Orbit Feedback System (FOFB) can be installed in a synchrotron light source to eliminate undesired disturbances and to improve the stability of the beam orbit. The design and implementation of an accurate and reliable Fast Corrector Power Supply (FCPS) is essential to realize the effectiveness and availability of the FOFB. A reliability assessment for the FCPSs in the FOFB of the Taiwan Photon Source (TPS), taking the MOSFETs' temperatures into account, is presented in this paper. The FCPS is composed of a full-bridge topology and a low-pass filter. A Hybrid Pulse Width Modulation (HPWM) scheme, which requires two MOSFETs in the full-bridge circuit to be operated at high frequency and the other two at the output frequency, is adopted to control the implemented FCPS. Due to this characteristic of HPWM, the conduction and switching losses of the MOSFETs in the FCPS are not the same: two of the MOSFETs in the full-bridge circuit suffer higher temperatures, and the circuit reliability of the FCPS is therefore reduced. A Modified PWM Scheme (MPWMS), designed to balance the MOSFETs' temperatures and to improve circuit reliability, is proposed in this paper. The MOSFETs' temperatures of the FCPS controlled by HPWM and by the proposed MPWMS were measured experimentally, and the reliability indices under the different PWM controls were then assessed. From the experimental results, it can be observed that the reliability of the FCPS using the proposed MPWMS is improved because the MOSFETs' temperatures are closer together. Since the reliability of the FCPS can be enhanced, the availability of the FOFB can also be improved.
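The loss-balancing intent of such a modified scheme can be illustrated with a toy model in which the high-frequency switching role alternates between the two bridge legs. This is not the paper's thermal or circuit model; the loss units are arbitrary.

```python
# Toy illustration (not the paper's thermal model) of why alternating
# the high-frequency switching role between the two full-bridge legs
# equalizes accumulated losses. Loss units are arbitrary.

def accumulated_loss(cycles, scheme):
    """Per-MOSFET loss units for Q1..Q4 of a full bridge.
    'hpwm':  Q1/Q2 always take the high-frequency role (hot pair).
    'mpwms': the high-frequency role alternates between leg pairs."""
    loss = {"Q1": 0, "Q2": 0, "Q3": 0, "Q4": 0}
    for c in range(cycles):
        if scheme == "hpwm" or c % 2 == 0:
            hot = ("Q1", "Q2")
        else:
            hot = ("Q3", "Q4")
        for q in loss:
            loss[q] += 10 if q in hot else 1   # arbitrary units
    return loss

print(accumulated_loss(1000, "hpwm"))    # Q1/Q2 accumulate 10x Q3/Q4
print(accumulated_loss(1000, "mpwms"))   # all four devices equal
```

Equalizing accumulated loss equalizes junction temperatures, which is the mechanism by which the reliability indices improve.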

  2. The reliability of a VISION COACH task as a measure of psychomotor skills.

    PubMed

    Xi, Yubin; Rosopa, Patrick J; Mossey, Mary; Crisler, Matthew C; Drouin, Nathalie; Kopera, Kevin; Brooks, Johnell O

    2014-10-01

    The VISION COACH™ interactive light board is designed to test and enhance participants' psychomotor skills. The primary goal of this study was to examine the test-retest reliability of the Full Field 120 VISION COACH task. One hundred eleven male and 131 female adult participants completed six trials where they responded to 120 randomly distributed lights displayed on the VISION COACH interactive light board. The mean time required for a participant to complete a trial was 101 seconds. Intraclass correlation coefficients ranging from 0.962 to 0.987 suggest that the VISION COACH Full Field 120 task was a reliable task. Cohen's d's of adjacent pairs of trials suggest learning effects did not negatively affect reliability after the third trial.

  3. Algorithm For Solution Of Subset-Regression Problems

    NASA Technical Reports Server (NTRS)

    Verhaegen, Michel

    1991-01-01

    Reliable and flexible algorithm for solution of subset-regression problem performs QR decomposition with new column-pivoting strategy, enables selection of subset directly from originally defined regression parameters. This feature, in combination with number of extensions, makes algorithm very flexible for use in analysis of subset-regression problems in which parameters have physical meanings. Also extended to enable joint processing of columns contaminated by noise with those free of noise, without using scaling techniques.

  4. A phylogenetic comparison of urease-positive thermophilic Campylobacter (UPTC) and urease-negative (UN) C. lari.

    PubMed

    Hirayama, Junichi; Tazumi, Akihiro; Hayashi, Kyohei; Tasaki, Erina; Kuribayashi, Takashi; Moore, John E; Millar, Beverley C; Matsuda, Motoo

    2011-06-01

    In the present study, the reliability of full-length gene sequence information for several genes, including 16S rRNA, was examined for the discrimination of the two representative Campylobacter lari taxa, namely urease-negative (UN) C. lari and urease-positive thermophilic Campylobacter (UPTC). As previously described, 16S rRNA gene sequences are not reliable for the molecular discrimination of UN C. lari from UPTC organisms, using either the unweighted pair group method with arithmetic means (UPGMA) or the neighbor-joining (NJ) method. Of the seven gene loci examined, three composite full-length gene sequences (ciaB, flaC and vacJ) were reliable for the molecular discrimination between UN C. lari and UPTC organisms, as well as among four thermophilic Campylobacter species, when dendrograms were constructed by the UPGMA method. None of the NJ phylogenetic trees constructed from the gene sequence information were reliable for the discrimination. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
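The UPGMA dendrogram construction used for this kind of discrimination can be sketched in a few lines: repeatedly merge the closest clusters, averaging distances weighted by cluster size. The distance matrix below is invented for illustration, not derived from real C. lari sequences.

```python
# Toy UPGMA sketch on a hand-made distance matrix for four taxa
# (distances are illustrative, not real C. lari sequence distances).

def upgma(dist, labels):
    """Return a nested-tuple tree by repeatedly merging the closest
    clusters, averaging distances weighted by cluster size (UPGMA)."""
    clusters = {i: (labels[i], 1) for i in range(len(labels))}
    d = {(i, j): dist[i][j] for i in clusters for j in clusters if i < j}
    next_id = len(labels)
    while len(clusters) > 1:
        a, b = min(d, key=d.get)                 # closest pair
        (la, na), (lb, nb) = clusters.pop(a), clusters.pop(b)
        # Size-weighted average distance to every remaining cluster.
        new_d = {}
        for c in clusters:
            dac = d[(min(a, c), max(a, c))]
            dbc = d[(min(b, c), max(b, c))]
            new_d[(c, next_id)] = (na * dac + nb * dbc) / (na + nb)
        d = {k: v for k, v in d.items() if a not in k and b not in k}
        d.update(new_d)
        clusters[next_id] = ((la, lb), na + nb)
        next_id += 1
    return next(iter(clusters.values()))[0]

taxa = ["UN_lari_1", "UN_lari_2", "UPTC_1", "UPTC_2"]
dmat = [[0, 2, 8, 8],
        [2, 0, 8, 8],
        [8, 8, 0, 3],
        [8, 8, 3, 0]]
print(upgma(dmat, taxa))   # groups the two UN and the two UPTC taxa
```

On this toy matrix the two UN taxa and the two UPTC taxa pair up first, which is the kind of clean separation the reliable loci (ciaB, flaC, vacJ) are reported to produce.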

  5. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL also provides two novel features. The first is a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows the sample size for GSA to be increased progressively while maintaining the required sample distributional properties. The second is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features, in conjunction with bootstrapping, enable the user to monitor the stability, robustness, and convergence of GSA as the sample size increases for any given case study. VARS-TOOL has been shown to achieve robust and stable results with sample sizes (numbers of model runs) one to two orders of magnitude smaller than those required by alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
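Of the three metric families named above, the derivative-based elementary effects are the simplest to sketch. The following is a simplified one-at-a-time variant (random base points rather than full Morris trajectories), with an illustrative linear test model; it is not the VARS-TOOL implementation.

```python
import random

# Simplified sketch of derivative-based elementary effects (in the
# spirit of Morris screening, one of the three metric families above).
# One-at-a-time perturbations from random base points, not full Morris
# trajectories; the test model is an illustrative linear function.

def elementary_effects(model, n_params, n_traj=50, delta=0.1, seed=0):
    """Mean absolute elementary effect (mu*) per parameter."""
    rng = random.Random(seed)
    totals = [0.0] * n_params
    for _ in range(n_traj):
        x = [rng.random() * (1 - delta) for _ in range(n_params)]
        base = model(x)
        for i in range(n_params):
            xp = list(x)
            xp[i] += delta                      # perturb one factor
            totals[i] += abs(model(xp) - base) / delta
    return [t / n_traj for t in totals]

# y = 5*x0 + 0.5*x1 (x2 inert): mu* recovers the linear sensitivities.
mu_star = elementary_effects(lambda x: 5 * x[0] + 0.5 * x[1], 3)
print([round(m, 2) for m in mu_star])   # [5.0, 0.5, 0.0]
```

For a linear model mu* equals the coefficient magnitudes exactly; for nonlinear models it serves as an inexpensive screen for influential parameters.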

  6. MorphDB: Prioritizing Genes for Specialized Metabolism Pathways and Gene Ontology Categories in Plants.

    PubMed

    Zwaenepoel, Arthur; Diels, Tim; Amar, David; Van Parys, Thomas; Shamir, Ron; Van de Peer, Yves; Tzfadia, Oren

    2018-01-01

    Recent times have seen an enormous growth of "omics" data, of which high-throughput gene expression data are arguably the most important from a functional perspective. Despite huge improvements in computational techniques for the functional classification of gene sequences, common similarity-based methods often fall short of providing full and reliable functional information. Recently, the combination of comparative genomics with approaches in functional genomics has received considerable interest for gene function analysis, leveraging both gene expression based guilt-by-association methods and annotation efforts in closely related model organisms. Besides the identification of missing genes in pathways, these methods also typically enable the discovery of biological regulators (i.e., transcription factors or signaling genes). A previously built guilt-by-association method is MORPH, which was proven to be an efficient algorithm that performs particularly well in identifying and prioritizing missing genes in plant metabolic pathways. Here, we present MorphDB, a resource where MORPH-based candidate genes for large-scale functional annotations (Gene Ontology, MapMan bins) are integrated across multiple plant species. Besides a gene centric query utility, we present a comparative network approach that enables researchers to efficiently browse MORPH predictions across functional gene sets and species, facilitating efficient gene discovery and candidate gene prioritization. MorphDB is available at http://bioinformatics.psb.ugent.be/webtools/morphdb/morphDB/index/. We also provide a toolkit, named "MORPH bulk" (https://github.com/arzwa/morph-bulk), for running MORPH in bulk mode on novel data sets, enabling researchers to apply MORPH to their own species of interest.

  7. Multi-hop routing mechanism for reliable sensor computing.

    PubMed

    Chen, Jiann-Liang; Ma, Yi-Wei; Lai, Chia-Ping; Hu, Chia-Cheng; Huang, Yueh-Min

    2009-01-01

Current research on routing in wireless sensor computing concentrates on increasing the service lifetime, enabling scalability for a large number of sensors and supporting fault tolerance for battery exhaustion and broken nodes. A sensor node is naturally exposed to various sources of unreliable communication channels and node failures. Sensor nodes have many failure modes, and each failure degrades the network performance. This work develops a novel mechanism, called Reliable Routing Mechanism (RRM), based on a hybrid cluster-based routing protocol to specify the most reliable routing path for sensor computing. Table-driven intra-cluster routing and on-demand inter-cluster routing are combined by changing the relationship between clusters for sensor computing. Applying a reliable routing mechanism in sensor computing can improve routing reliability, maintain low packet loss, minimize management overhead and reduce energy consumption. Simulation results indicate that the reliability of the proposed RRM mechanism is around 25% higher than that of the Dynamic Source Routing (DSR) and ad hoc On-demand Distance Vector routing (AODV) mechanisms.

  8. The Effect of Reading Duration on the Reliability and Validity of Middle School Students' ORF Performance

    ERIC Educational Resources Information Center

    Barth, Amy E.; Stuebing, Karla K.; Fletcher, Jack M.; Denton, Carolyn A.; Vaughn, Sharon; Francis, David

    2014-01-01

    We evaluated the technical adequacy of oral reading fluency (ORF) probes in which 1,472 middle school students with and without reading difficulties read fluency probes for 60 s versus reading the full passage. Results suggested that the reliability of 60-s probes (rs = 0.75) was not substantively different than full passage probes (rs = 0.77)…
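As an illustration of how such alternate-form reliability coefficients (rs) can be computed, the sketch below calculates a Spearman rank correlation between 60-s probe scores and full-passage scores for the same students; all data are invented for the example.

```python
def ranks(values):
    """Rank scores (1 = lowest); ties are not handled in this simple sketch."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(a, b):
    """Spearman rank correlation via the classic d^2 formula (no ties)."""
    ra, rb = ranks(a), ranks(b)
    n = len(a)
    d2 = sum((x - y) ** 2 for x, y in zip(ra, rb))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical words-correct scores for the same students on a 60-s probe
# and on the full passage.
probe_60s = [112, 95, 130, 88, 101, 140, 76, 119]
full_pass = [115, 90, 128, 92, 99, 138, 80, 117]

print(round(spearman(probe_60s, full_pass), 2))
```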

  9. Modern Methods of Rail Welding

    NASA Astrophysics Data System (ADS)

    Kozyrev, Nikolay A.; Kozyreva, Olga A.; Usoltsev, Aleksander A.; Kryukov, Roman E.; Shevchenko, Roman A.

    2017-10-01

This article surveys existing rail welding methods that enable the production of continuous welded rail track. Analysis of the existing welding methods allows the issue of continuous rail track to be considered in detail. Metallurgical and welding technologies of rail welding, as well as process technologies that reduce the aftereffects of temperature exposure, are important factors determining the quality and reliability of the continuous rail track. Analysis of the existing methods of rail welding makes it possible to identify a line of research for solving this problem.

  10. Bridge deterioration models to support Indiana's bridge management system.

    DOT National Transportation Integrated Search

    2016-02-01

An effective bridge management system that is equipped with reliable deterioration models enables agency engineers to carry out monitoring and long-term programming of bridge repair actions. At the project level, deterioration models help the agenc...

  11. Measurement and evaluation of transit travel time reliability

    DOT National Transportation Integrated Search

    2011-01-01

Transportation system customers need consistency in their daily travel times to enable them to plan their daily activities, whether that is a commuter on their way to work, a company setting up delivery schedules for just-in-time manufacturin...

  12. COTS-Based Fault Tolerance in Deep Space: Qualitative and Quantitative Analyses of a Bus Network Architecture

    NASA Technical Reports Server (NTRS)

    Tai, Ann T.; Chau, Savio N.; Alkalai, Leon

    2000-01-01

Using COTS products, standards and intellectual properties (IPs) for all the system and component interfaces is a crucial step toward significant reduction of both system cost and development cost, as the COTS interfaces enable other COTS products and IPs to be readily accommodated by the target system architecture. With respect to the long-term survivable systems for deep-space missions, the major challenge for us is, under stringent power and mass constraints, to achieve ultra-high reliability of the system comprising COTS products and standards that are not developed for mission-critical applications. The spirit of our solution is to exploit the pertinent standard features of a COTS product to circumvent its shortcomings, though these standard features may not be originally designed for highly reliable systems. In this paper, we discuss our experiences and findings on the design of an IEEE 1394 compliant fault-tolerant COTS-based bus architecture. We first derive and qualitatively analyze a "stack-tree topology" that not only complies with IEEE 1394 but also enables the implementation of a fault-tolerant bus architecture without node redundancy. We then present a quantitative evaluation that demonstrates significant reliability improvement from the COTS-based fault tolerance.

  13. An Enhanced Backbone-Assisted Reliable Framework for Wireless Sensor Networks

    PubMed Central

    Tufail, Ali; Khayam, Syed Ali; Raza, Muhammad Taqi; Ali, Amna; Kim, Ki-Hyung

    2010-01-01

    An extremely reliable source to sink communication is required for most of the contemporary WSN applications especially pertaining to military, healthcare and disaster-recovery. However, due to their intrinsic energy, bandwidth and computational constraints, Wireless Sensor Networks (WSNs) encounter several challenges in reliable source to sink communication. In this paper, we present a novel reliable topology that uses reliable hotlines between sensor gateways to boost the reliability of end-to-end transmissions. This reliable and efficient routing alternative reduces the number of average hops from source to the sink. We prove, with the help of analytical evaluation, that communication using hotlines is considerably more reliable than traditional WSN routing. We use reliability theory to analyze the cost and benefit of adding gateway nodes to a backbone-assisted WSN. However, in hotline assisted routing some scenarios where source and the sink are just a couple of hops away might bring more latency, therefore, we present a Signature Based Routing (SBR) scheme. SBR enables the gateways to make intelligent routing decisions, based upon the derived signature, hence providing lesser end-to-end delay between source to the sink communication. Finally, we evaluate our proposed hotline based topology with the help of a simulation tool and show that the proposed topology provides manifold increase in end-to-end reliability. PMID:22294890
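The analytical argument above can be illustrated by modeling a route as a series system, where end-to-end reliability is the product of the per-hop link reliabilities, so fewer hops over more reliable hotline links raise the overall figure. The hop counts and link reliabilities below are hypothetical, not taken from the paper.

```python
from math import prod

def path_reliability(link_reliabilities):
    """End-to-end reliability of a multi-hop path modeled as a series system."""
    return prod(link_reliabilities)

# Hypothetical numbers: a 6-hop traditional route with 0.95-reliable links
# versus a 3-hop route that uses a 0.99-reliable gateway hotline for two hops.
traditional = path_reliability([0.95] * 6)
hotline     = path_reliability([0.95, 0.99, 0.99])

print(round(traditional, 3), round(hotline, 3))
print(hotline > traditional)
```

The same series model also shows why hotline routing can add latency when source and sink are only a couple of hops apart, motivating the signature-based routing decision described above.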

  14. Enhanced CARES Software Enables Improved Ceramic Life Prediction

    NASA Technical Reports Server (NTRS)

    Janosik, Lesley A.

    1997-01-01

The NASA Lewis Research Center has developed award-winning software that enables American industry to establish the reliability and life of brittle material (e.g., ceramic, intermetallic, graphite) structures in a wide variety of 21st century applications. The CARES (Ceramics Analysis and Reliability Evaluation of Structures) series of software is successfully used by numerous engineers in industrial, academic, and government organizations as an essential element of the structural design and material selection processes. The latest version of this software, CARES/Life, provides a general-purpose design tool that predicts the probability of failure of a ceramic component as a function of its time in service. CARES/Life was recently enhanced by adding new modules designed to improve functionality and user-friendliness. In addition, a beta version of the newly developed CARES/Creep program (for determining the creep life of monolithic ceramic components) has just been released to selected organizations.

  15. Overview of RICOR tactical cryogenic refrigerators for space missions

    NASA Astrophysics Data System (ADS)

    Riabzev, Sergey; Filis, Avishai; Livni, Dorit; Regev, Itai; Segal, Victor; Gover, Dan

    2016-05-01

Cryogenic refrigerators represent a significant enabling technology for Earth and Space science enterprises. Many space instruments require cryogenic refrigeration to enable the use of advanced detectors to explore a wide range of phenomena from space. This paper overviews RICOR refrigerators involved in various space missions, starting with the "Clementine" Moon mission in 1994 through the latest ExoMars mission launched in 2016. RICOR tactical rotary refrigerators have been incorporated in many space instruments after passing qualification, lifetime, and thermal management testing and flight acceptance. The tactical-to-space customization framework includes an extensive characterization and qualification test program to validate reliability, the design of thermal interfacing with a detector, vibration export control, efficient heat dissipation in a vacuum environment, robustness, mounting design, compliance with outgassing requirements and strict performance screening. Current RICOR development is focused on a dedicated, ultra-long-life, highly reliable space cryogenic refrigerator based on a pulse tube design.

  16. Inter-rater reliability and aspects of validity of the parent-infant relationship global assessment scale (PIR-GAS)

    PubMed Central

    2013-01-01

Background The Parent-Infant Relationship Global Assessment Scale (PIR-GAS) signifies a conceptually relevant development in the multi-axial, developmentally sensitive classification system DC:0-3R for preschool children. However, information about the reliability and validity of the PIR-GAS is rare. A review of the available empirical studies suggests that in research, PIR-GAS ratings can be based on a ten-minute videotaped interaction sequence. The qualification of raters may be very heterogeneous across studies. Methods To test whether the use of the PIR-GAS still allows for a reliable assessment of the parent-infant relationship, our study compared PIR-GAS ratings based on a full-information procedure across multiple settings with ratings based on a ten-minute video by two doctoral candidates of medicine. For each mother-child dyad at a family day hospital (N = 48), we obtained two video ratings and one full-information rating at admission to therapy and at discharge. This pre-post design allowed for a replication of our findings across the two measurement points. We focused on the inter-rater reliability between the video coders, as well as between the video and full-information procedures, including mean differences and correlations between the raters. Additionally, we examined aspects of the validity of video and full-information ratings based on their correlations with measures of child and maternal psychopathology. Results Our results showed that ten-minute video and full-information PIR-GAS ratings were not interchangeable. Most results at admission could be replicated by the data obtained at discharge. We concluded that a higher degree of standardization of the assessment procedure should increase the reliability of the PIR-GAS, and a more thorough theoretical foundation of the manual should increase its validity. PMID:23705962

  17. Fast Dynamic Simulation-Based Small Signal Stability Assessment and Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acharya, Naresh; Baone, Chaitanya; Veda, Santosh

    2014-12-31

Power grid planning and operation decisions are made based on simulation of the dynamic behavior of the system. Enabling substantial energy savings while increasing the reliability of the aging North American power grid through improved utilization of existing transmission assets hinges on the adoption of wide-area measurement systems (WAMS) for power system stabilization. However, adoption of WAMS alone will not suffice if the power system is to reach its full entitlement in stability and reliability. It is necessary to enhance predictability with "faster than real-time" dynamic simulations that will enable assessment of dynamic stability margins, proactive real-time control, and improved grid resiliency to fast time-scale phenomena such as cascading network failures. Present-day dynamic simulations are performed only during offline planning studies, considering only worst-case conditions such as summer peak and winter peak days. With widespread deployment of renewable generation, controllable loads, energy storage devices, and plug-in hybrid electric vehicles expected in the near future, and greater integration of cyber infrastructure (communications, computation and control), monitoring and controlling the dynamic performance of the grid in real time will become increasingly important. State-of-the-art dynamic simulation tools have limited computational speed and are not suitable for real-time applications, given the large set of contingency conditions to be evaluated. These tools are optimized for best performance on single-processor computers, but the simulation is still several times slower than real time due to its computational complexity. With recent significant advances in numerical methods and computational hardware, expectations have been rising toward more efficient and faster techniques to be implemented in power system simulators. This is a natural expectation, given that the core solution algorithms of most commercial simulators were developed decades ago, when High Performance Computing (HPC) resources were not commonly available.

  18. Genomic prediction using imputed whole-genome sequence data in Holstein Friesian cattle.

    PubMed

    van Binsbergen, Rianne; Calus, Mario P L; Bink, Marco C A M; van Eeuwijk, Fred A; Schrooten, Chris; Veerkamp, Roel F

    2015-09-17

    In contrast to currently used single nucleotide polymorphism (SNP) panels, the use of whole-genome sequence data is expected to enable the direct estimation of the effects of causal mutations on a given trait. This could lead to higher reliabilities of genomic predictions compared to those based on SNP genotypes. Also, at each generation of selection, recombination events between a SNP and a mutation can cause decay in reliability of genomic predictions based on markers rather than on the causal variants. Our objective was to investigate the use of imputed whole-genome sequence genotypes versus high-density SNP genotypes on (the persistency of) the reliability of genomic predictions using real cattle data. Highly accurate phenotypes based on daughter performance and Illumina BovineHD Beadchip genotypes were available for 5503 Holstein Friesian bulls. The BovineHD genotypes (631,428 SNPs) of each bull were used to impute whole-genome sequence genotypes (12,590,056 SNPs) using the Beagle software. Imputation was done using a multi-breed reference panel of 429 sequenced individuals. Genomic estimated breeding values for three traits were predicted using a Bayesian stochastic search variable selection (BSSVS) model and a genome-enabled best linear unbiased prediction model (GBLUP). Reliabilities of predictions were based on 2087 validation bulls, while the other 3416 bulls were used for training. Prediction reliabilities ranged from 0.37 to 0.52. BSSVS performed better than GBLUP in all cases. Reliabilities of genomic predictions were slightly lower with imputed sequence data than with BovineHD chip data. Also, the reliabilities tended to be lower for both sequence data and BovineHD chip data when relationships between training animals were low. No increase in persistency of prediction reliability using imputed sequence data was observed. Compared to BovineHD genotype data, using imputed sequence data for genomic prediction produced no advantage. 
To investigate the putative advantage of genomic prediction using (imputed) sequence data, a training set with a larger number of individuals that are distantly related to each other and genomic prediction models that incorporate biological information on the SNPs or that apply stricter SNP pre-selection should be considered.

  19. Enabling the environmentally clean air transportation of the future: a vision of computational fluid dynamics in 2030.

    PubMed

    Slotnick, Jeffrey P; Khodadoust, Abdollah; Alonso, Juan J; Darmofal, David L; Gropp, William D; Lurie, Elizabeth A; Mavriplis, Dimitri J; Venkatakrishnan, Venkat

    2014-08-13

    As global air travel expands rapidly to meet demand generated by economic growth, it is essential to continue to improve the efficiency of air transportation to reduce its carbon emissions and address concerns about climate change. Future transports must be 'cleaner' and designed to include technologies that will continue to lower engine emissions and reduce community noise. The use of computational fluid dynamics (CFD) will be critical to enable the design of these new concepts. In general, the ability to simulate aerodynamic and reactive flows using CFD has progressed rapidly during the past several decades and has fundamentally changed the aerospace design process. Advanced simulation capabilities not only enable reductions in ground-based and flight-testing requirements, but also provide added physical insight, and enable superior designs at reduced cost and risk. In spite of considerable success, reliable use of CFD has remained confined to a small region of the operating envelope due, in part, to the inability of current methods to reliably predict turbulent, separated flows. Fortunately, the advent of much more powerful computing platforms provides an opportunity to overcome a number of these challenges. This paper summarizes the findings and recommendations from a recent NASA-funded study that provides a vision for CFD in the year 2030, including an assessment of critical technology gaps and needed development, and identifies the key CFD technology advancements that will enable the design and development of much cleaner aircraft in the future. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  20. Advancement of a 30 kW Solar Electric Propulsion System Capability for NASA Human and Robotic Exploration Missions

    NASA Technical Reports Server (NTRS)

    Smith, Bryan K.; Nazario, Margaret L.; Manzella, David H.

    2012-01-01

Solar Electric Propulsion (SEP) has evolved into a demonstrated operational capability performing station keeping for geosynchronous satellites, enabling challenging deep-space science missions, and assisting in the transfer of satellites from an elliptical Geostationary Transfer Orbit (GTO) to a Geostationary Earth Orbit (GEO). Advancing higher power SEP systems will enable numerous future applications for human, robotic, and commercial missions. These missions are enabled either by the increased performance of the SEP system or by the cost reductions when compared to conventional chemical propulsion systems. Higher power SEP systems that provide very high payload for robotic missions also trade favorably for the advancement of human exploration beyond low Earth orbit. Demonstrated reliable systems are required for human space flight, and due to their successful present-day widespread use and inherent high reliability, SEP systems have progressively become a viable entrant into these future human exploration architectures. NASA studies have identified a 30 kW-class SEP capability as the next appropriate evolutionary step, applicable to a wide range of both human and robotic missions. This paper describes the planning options, mission applications, and technology investments for representative 30 kW-class SEP mission concepts under consideration by NASA.

  1. Design and Analysis of a Flexible, Reliable Deep Space Life Support System

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2012-01-01

    This report describes a flexible, reliable, deep space life support system design approach that uses either storage or recycling or both together. The design goal is to provide the needed life support performance with the required ultra reliability for the minimum Equivalent System Mass (ESM). Recycling life support systems used with multiple redundancy can have sufficient reliability for deep space missions but they usually do not save mass compared to mixed storage and recycling systems. The best deep space life support system design uses water recycling with sufficient water storage to prevent loss of crew if recycling fails. Since the amount of water needed for crew survival is a small part of the total water requirement, the required amount of stored water is significantly less than the total to be consumed. Water recycling with water, oxygen, and carbon dioxide removal material storage can achieve the high reliability of full storage systems with only half the mass of full storage and with less mass than the highly redundant recycling systems needed to achieve acceptable reliability. Improved recycling systems with lower mass and higher reliability could perform better than systems using storage.
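The mass trade described above can be sketched with a simple accounting model: full storage carries all water for the mission, while a hybrid design carries recycling hardware, makeup water for recycling losses, and a survival-sized reserve in case recycling fails. Every number below (mission length, water rates, hardware mass, recovery fraction) is a hypothetical placeholder, not data from the report.

```python
# Illustrative mass accounting for a deep space life support trade study.
# All figures are invented for the sketch.
mission_days  = 900
crew          = 4
use_per_day   = 10.0    # kg water per crew-member per day, total use (incl. hygiene)
survival_rate = 2.0     # kg per crew-member per day needed for crew survival
recovery      = 0.90    # fraction of water recovered by the recycler
recycler_mass = 1500.0  # kg of recycling hardware

full_storage   = mission_days * crew * use_per_day                  # store everything
makeup_water   = mission_days * crew * use_per_day * (1 - recovery) # recycling losses
survival_store = mission_days * crew * survival_rate                # reserve if recycling fails
hybrid         = recycler_mass + makeup_water + survival_store

print(round(full_storage), round(hybrid))
```

With these placeholder numbers the hybrid system comes in well under half the full-storage mass, consistent with the qualitative claim above; a real ESM comparison would also weigh power, volume, and crew time.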

  2. Engine System Model Development for Nuclear Thermal Propulsion

    NASA Technical Reports Server (NTRS)

    Nelson, Karl W.; Simpson, Steven P.

    2006-01-01

    In order to design, analyze, and evaluate conceptual Nuclear Thermal Propulsion (NTP) engine systems, an improved NTP design and analysis tool has been developed. The NTP tool utilizes the Rocket Engine Transient Simulation (ROCETS) system tool and many of the routines from the Enabler reactor model found in Nuclear Engine System Simulation (NESS). Improved non-nuclear component models and an external shield model were added to the tool. With the addition of a nearly complete system reliability model, the tool will provide performance, sizing, and reliability data for NERVA-Derived NTP engine systems. A new detailed reactor model is also being developed and will replace Enabler. The new model will allow more flexibility in reactor geometry and include detailed thermal hydraulics and neutronics models. A description of the reactor, component, and reliability models is provided. Another key feature of the modeling process is the use of comprehensive spreadsheets for each engine case. The spreadsheets include individual worksheets for each subsystem with data, plots, and scaled figures, making the output very useful to each engineering discipline. Sample performance and sizing results with the Enabler reactor model are provided including sensitivities. Before selecting an engine design, all figures of merit must be considered including the overall impacts on the vehicle and mission. Evaluations based on key figures of merit of these results and results with the new reactor model will be performed. The impacts of clustering and external shielding will also be addressed. Over time, the reactor model will be upgraded to design and analyze other NTP concepts with CERMET and carbide fuel cores.

  3. Reusable Solid Rocket Motor - Accomplishments, Lessons, and a Culture of Success

    NASA Technical Reports Server (NTRS)

    Moore, Dennis R.; Phelps, Willie J.

    2011-01-01

The Reusable Solid Rocket Motor represents the largest solid rocket motor ever flown and the only human-rated solid motor. Each Reusable Solid Rocket Motor (RSRM) provides approximately 3-million lb of thrust to lift the integrated Space Shuttle vehicle from the launch pad. The motors burn out approximately 2 minutes later, separate from the vehicle and are recovered and refurbished. The size of the motor and the need for high reliability were challenges. Thrust shaping, via shaping of the propellant grain, was needed to limit structural loads during ascent. The motor design evolved through several block upgrades to increase performance and to increase safety and reliability. A major redesign occurred after STS-51L with the Redesigned Solid Rocket Motor. Significant improvements in the joint sealing systems were added. Design improvements continued throughout the Program via block changes, with a number of innovations including development of low-temperature o-ring materials and incorporation of a unique carbon fiber rope thermal barrier material. Recovery of the motors and postflight inspection improved understanding of hardware performance and led to key design improvements. Because of the multi-decade program duration, material obsolescence was addressed, and requalification of materials and vendors was sometimes needed. Thermal protection systems and ablatives were used to protect the motor cases and nozzle structures. Significant understanding of design and manufacturing features of the ablatives was developed during the program, resulting in optimization of design features and processing parameters. The project advanced technology in eliminating ozone-depleting materials in manufacturing processes and the development of an asbestos-free case insulation. Manufacturing processes for the large motor components were unique, and safety in the manufacturing environment was a special concern. 
Transportation and handling approaches were also needed for the large hardware segments. The reusable solid rocket motor achieved significant reliability via process control, ground test programs, and postflight assessment. Process control is mandatory for a solid rocket motor, as an acceptance test of the delivered product is not feasible. Process control included process failure modes and effects analysis, statistical process control, witness panels, and process product integrity audits. Material controls and inspections were maintained throughout the sub-tier vendors. Material fingerprinting was employed to assess any drift in delivered material properties. The RSRM maintained both full-scale and sub-scale test articles. These enabled continuous improvement of design and evaluation of process control and material behavior. Additionally, RSRM reliability was achieved through attention to detail in postflight assessment to observe any shift in performance. The postflight analysis and inspections provided invaluable reliability data, as they enabled observation of actual flight performance, most of which would not be available if the motors were not recovered. These unique challenges, features of the reusable solid rocket motor, materials and manufacturing issues, and design improvements will be discussed in the paper.
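As a minimal illustration of the statistical process control mentioned above, the sketch below computes Shewhart-style control limits (mean plus or minus three standard deviations) from batch measurements and flags out-of-control values. The data, units, and limits are invented for the example, not RSRM process data.

```python
from statistics import mean, stdev

def control_limits(samples, k=3.0):
    """Shewhart-style control limits: sample mean +/- k sample standard deviations."""
    m, s = mean(samples), stdev(samples)
    return m - k * s, m + k * s

def out_of_control(x, lo, hi):
    """A point outside the control limits signals a process shift to investigate."""
    return x < lo or x > hi

# Hypothetical propellant-property measurements from successive batches.
batches = [101.2, 99.8, 100.5, 100.1, 99.6, 100.9, 100.3, 99.9]
lo, hi = control_limits(batches)

print(out_of_control(100.4, lo, hi), out_of_control(104.0, lo, hi))
```

Real SPC practice adds run rules and rational subgrouping, but the limit calculation itself is this simple.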

  4. Scientific Data Management (SDM) Center for Enabling Technologies. Final Report, 2007-2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ludascher, Bertram; Altintas, Ilkay

    Our contributions to advancing the State of the Art in scientific workflows have focused on the following areas: Workflow development; Generic workflow components and templates; Provenance collection and analysis; and, Workflow reliability and fault tolerance.

  5. Using Toxicological Evidence from QSAR Models in Practice

    EPA Science Inventory

    The new generation of QSAR models provides supporting documentation in addition to the predicted toxicological value. Such information enables the toxicologist to explore the properties of chemical substances and to review and increase the reliability of toxicity predictions. Thi...

  6. Closing the Loop on Space Waste

    NASA Astrophysics Data System (ADS)

    Meier, A. J.; Hintze, P. E.

    2018-02-01

This work presents a heat transfer study of mission mixed waste streams in a reactor hot zone, along with solid, tar, and water recovery. The research improves the reliability and benefit of waste conversion systems for managing our environmental impact, both on and off Earth.

  7. Solar-cell interconnect design for terrestrial photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1984-01-01

    Useful solar cell interconnect reliability design and life prediction algorithms are presented, together with experimental data indicating that the classical strain cycle (fatigue) curve for the interconnect material does not account for the statistical scatter that is required in reliability predictions. This shortcoming is presently addressed by fitting a functional form to experimental cumulative interconnect failure rate data, which thereby yields statistical fatigue curves enabling not only the prediction of cumulative interconnect failures during the design life of an array field, but also the quantitative interpretation of data from accelerated thermal cycling tests. Optimal interconnect cost reliability design algorithms are also derived which may allow the minimization of energy cost over the design life of the array field.
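A minimal sketch of the fitting step described above: a two-parameter Weibull curve is fit to cumulative failure fractions by linear regression on the standard Weibull transform, yielding a statistical curve that predicts cumulative interconnect failures at any cycle count. The cycle counts and failure fractions are invented, and the linearized fit shown is a generic textbook method, not necessarily the functional form used in the paper.

```python
from math import log, exp

# Hypothetical cumulative failure fractions F observed at thermal-cycle counts N.
cycles   = [200, 500, 1000, 2000, 5000]
fraction = [0.02, 0.10, 0.30, 0.60, 0.95]

# Weibull linearization: ln(-ln(1 - F)) = beta*ln(N) - beta*ln(eta)
xs = [log(n) for n in cycles]
ys = [log(-log(1.0 - f)) for f in fraction]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
beta = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
       sum((x - xbar) ** 2 for x in xs)          # shape parameter (slope)
eta = exp(xbar - ybar / beta)                     # scale parameter (63.2% life)

def cum_failures(n_cycles):
    """Predicted cumulative failure fraction after n_cycles thermal cycles."""
    return 1.0 - exp(-(n_cycles / eta) ** beta)

print(round(beta, 2), round(eta))
```

The fitted curve can then be evaluated at the design life to estimate the expected failure fraction, and at accelerated-test cycle counts to interpret test data quantitatively.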

  8. Solar-cell interconnect design for terrestrial photovoltaic modules

    NASA Astrophysics Data System (ADS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1984-11-01

    Useful solar cell interconnect reliability design and life prediction algorithms are presented, together with experimental data indicating that the classical strain cycle (fatigue) curve for the interconnect material does not account for the statistical scatter that is required in reliability predictions. This shortcoming is presently addressed by fitting a functional form to experimental cumulative interconnect failure rate data, which thereby yields statistical fatigue curves enabling not only the prediction of cumulative interconnect failures during the design life of an array field, but also the quantitative interpretation of data from accelerated thermal cycling tests. Optimal interconnect cost reliability design algorithms are also derived which may allow the minimization of energy cost over the design life of the array field.

  9. Genomic predictions can accelerate selection for resistance against Piscirickettsia salmonis in Atlantic salmon (Salmo salar).

    PubMed

    Bangera, Rama; Correa, Katharina; Lhorente, Jean P; Figueroa, René; Yáñez, José M

    2017-01-31

Salmon Rickettsial Syndrome (SRS) caused by Piscirickettsia salmonis is a major disease affecting the Chilean salmon industry. Genomic selection (GS) is a method wherein genome-wide markers and phenotype information of full-sibs are used to predict genomic EBV (GEBV) of selection candidates; it is expected to offer greater accuracy and response to selection than traditional pedigree-based Best Linear Unbiased Prediction (PBLUP). Widely used GS methods such as genomic BLUP (GBLUP), SNP-BLUP, Bayes C, and Bayesian Lasso may perform differently with respect to the accuracy of GEBV prediction. Our aim was to compare the accuracy, in terms of reliability of genome-enabled prediction, of different GS methods with PBLUP for resistance to SRS in an Atlantic salmon breeding program. Number of days to death (DAYS), binary survival status (STATUS) phenotypes, and 50 K SNP array genotypes were obtained from 2601 smolts challenged with P. salmonis. The reliability of different GS methods at different SNP densities, with and without pedigree, was compared to PBLUP using a five-fold cross-validation scheme. Heritability estimated from GS methods was significantly higher than that from PBLUP. Pearson's correlation between predicted GEBV from PBLUP and GS models ranged from 0.79 to 0.91 for DAYS and from 0.79 to 0.95 for STATUS. The relative increase in reliability from different GS methods with 50 K SNP ranged from 8 to 25% for DAYS and from 27 to 30% for STATUS. All GS methods outperformed PBLUP at all marker densities. DAYS and STATUS showed superior reliability over PBLUP even at the lowest marker densities of 3 K and 500 SNP, respectively. 20 K SNP gave close to maximal reliability for both traits, with little improvement at higher densities. These results indicate that genomic predictions can accelerate genetic progress for SRS resistance in Atlantic salmon, and implementation of this approach will contribute to the control of SRS in Chile.
We recommend GBLUP for routine GS evaluation because it is computationally faster and its results are very similar to those of the other GS methods. The use of lower-density SNP panels, or of low-density SNP combined with an imputation strategy, may help reduce genotyping costs without compromising the gain in reliability.
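GBLUP is usually computed from a genomic relationship matrix; its marker-effects equivalent (SNP-BLUP, i.e. ridge regression on genotype codes) is easier to sketch. The toy genotype matrix and tiny solver below are illustrative, not the study's implementation:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def snp_blup(X, y, lam):
    """Ridge-regression marker effects: solve (X'X + lam*I) beta = X'y.
    X holds SNP genotype codes (rows = animals, columns = markers)."""
    rows, p = len(X), len(X[0])
    XtX = [[sum(X[r][i] * X[r][j] for r in range(rows)) + (lam if i == j else 0.0)
            for j in range(p)] for i in range(p)]
    Xty = [sum(X[r][i] * y[r] for r in range(rows)) for i in range(p)]
    return solve(XtX, Xty)

def gebv(X, beta):
    """Genomic breeding values for candidates from estimated marker effects."""
    return [sum(xi * bi for xi, bi in zip(row, beta)) for row in X]
```

In practice the shrinkage parameter `lam` is tied to the trait's heritability and marker count; here it is just a free input.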

  10. SOFIA: a flexible source finder for 3D spectral line data

    NASA Astrophysics Data System (ADS)

    Serra, Paolo; Westmeier, Tobias; Giese, Nadine; Jurek, Russell; Flöer, Lars; Popping, Attila; Winkel, Benjamin; van der Hulst, Thijs; Meyer, Martin; Koribalski, Bärbel S.; Staveley-Smith, Lister; Courtois, Hélène

    2015-04-01

    We introduce SOFIA, a flexible software application for the detection and parametrization of sources in 3D spectral line data sets. SOFIA combines for the first time in a single piece of software a set of new source-finding and parametrization algorithms developed on the way to future H I surveys with ASKAP (WALLABY, DINGO) and APERTIF. It is designed to enable the general use of these new algorithms by the community on a broad range of data sets. The key advantages of SOFIA are the ability to: search for line emission on multiple scales to detect 3D sources in a complete and reliable way, taking into account noise level variations and the presence of artefacts in a data cube; estimate the reliability of individual detections; look for signal in arbitrarily large data cubes using a catalogue of 3D coordinates as a prior; provide a wide range of source parameters and output products which facilitate further analysis by the user. We highlight the modularity of SOFIA, which makes it a flexible package allowing users to select and apply only the algorithms useful for their data and science questions. This modularity makes it also possible to easily expand SOFIA in order to include additional methods as they become available. The full SOFIA distribution, including a dedicated graphical user interface, is publicly available for download.

  11. Issues in designing transport layer multicast facilities

    NASA Technical Reports Server (NTRS)

    Dempsey, Bert J.; Weaver, Alfred C.

    1990-01-01

Multicasting denotes a facility in a communications system for providing efficient delivery from a message's source to some well-defined set of locations using a single logical address. While modern network hardware supports multidestination delivery, first-generation Transport Layer protocols (e.g., the DoD Transmission Control Protocol (TCP) (15) and ISO TP-4 (41)) did not anticipate the changes over the past decade in underlying network hardware, transmission speeds, and communication patterns that have enabled and driven the interest in reliable multicast. Much recent research has focused on integrating the underlying hardware multicast capability with the reliable services of Transport Layer protocols. Here, we explore the communication issues surrounding the design of such a reliable multicast mechanism. Approaches and solutions from the literature are discussed, and four experimental Transport Layer protocols that incorporate reliable multicast are examined.

  12. Overview of Future of Probabilistic Methods and RMSL Technology and the Probabilistic Methods Education Initiative for the US Army at the SAE G-11 Meeting

    NASA Technical Reports Server (NTRS)

    Singhal, Surendra N.

    2003-01-01

The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting, sponsored by the Picatinny Arsenal during March 1-3, 2004 at the Westin Morristown, will report progress on projects for probabilistic assessment of Army systems and launch an initiative for probabilistic education. The meeting features several Army and industry senior executives and an Ivy League professor, providing an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members, including members of national/international standing, the mission of the G-11's Probabilistic Methods Committee is to enable and facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries through better, faster, greener, smarter, more affordable, and more reliable product development.

  13. Synthesis of Natural and Unnatural Cyclooligomeric Depsipeptides Enabled by Flow Chemistry

    PubMed Central

    Lücke, Daniel; Dalton, Toryn; Ley, Steven V.

    2016-01-01

    Abstract Flow chemistry has been successfully integrated into the synthesis of a series of cyclooligomeric depsipeptides of three different ring sizes including the natural products beauvericin (1 a), bassianolide (2 b) and enniatin C (1 b). A reliable flow chemistry protocol was established for the coupling and macrocyclisation to form challenging N‐methylated amides. This flexible approach has allowed the rapid synthesis of both natural and unnatural depsipeptides in high yields, enabling further exploration of their promising biological activity. PMID:26844421

  14. Bootstrap study of genome-enabled prediction reliabilities using haplotype blocks across Nordic Red cattle breeds.

    PubMed

    Cuyabano, B C D; Su, G; Rosa, G J M; Lund, M S; Gianola, D

    2015-10-01

    This study compared the accuracy of genome-enabled prediction models using individual single nucleotide polymorphisms (SNP) or haplotype blocks as covariates when using either a single breed or a combined population of Nordic Red cattle. The main objective was to compare predictions of breeding values of complex traits using a combined training population with haplotype blocks, with predictions using a single breed as training population and individual SNP as predictors. To compare the prediction reliabilities, bootstrap samples were taken from the test data set. With the bootstrapped samples of prediction reliabilities, we built and graphed confidence ellipses to allow comparisons. Finally, measures of statistical distances were used to calculate the gain in predictive ability. Our analyses are innovative in the context of assessment of predictive models, allowing a better understanding of prediction reliabilities and providing a statistical basis to effectively calibrate whether one prediction scenario is indeed more accurate than another. An ANOVA indicated that use of haplotype blocks produced significant gains mainly when Bayesian mixture models were used but not when Bayesian BLUP was fitted to the data. Furthermore, when haplotype blocks were used to train prediction models in a combined Nordic Red cattle population, we obtained up to a statistically significant 5.5% average gain in prediction accuracy, over predictions using individual SNP and training the model with a single breed. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
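The bootstrap assessment of prediction reliabilities described above can be sketched as follows. The percentile-interval construction is a simplification of the confidence-ellipse comparison used in the paper, and the data in the usage example are hypothetical:

```python
import random

def pearson(a, b):
    """Pearson correlation between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def bootstrap_reliability(gebv, phenotype, n_boot=500, seed=7):
    """Point estimate and 95% percentile interval for the prediction
    reliability (squared correlation between GEBV and phenotype),
    obtained by resampling test animals with replacement."""
    rng = random.Random(seed)
    n = len(gebv)
    stats = []
    for _ in range(n_boot):
        s = [rng.randrange(n) for _ in range(n)]
        stats.append(pearson([gebv[i] for i in s],
                             [phenotype[i] for i in s]) ** 2)
    stats.sort()
    return (pearson(gebv, phenotype) ** 2,
            (stats[int(0.025 * n_boot)], stats[int(0.975 * n_boot)]))
```

Comparing the bootstrap distributions from two prediction scenarios (e.g. single-breed SNP vs. combined-population haplotype blocks) then gives a statistical basis for claiming one is more accurate than the other.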

  15. [Validation and adhesion to GESIDA quality indicators in patients with HIV infection].

    PubMed

    Riera, Melchor; Esteban, Herminia; Suarez, Ignacio; Palacios, Rosario; Lozano, Fernando; Blanco, Jose R; Valencia, Eulalia; Ocampo, Antonio; Amador, Concha; Frontera, Guillem; vonWichmann-de Miguel, Miguel Angel

    2016-01-01

The objective of the study is to validate the relevant GESIDA quality indicators for HIV infection, assessing their reliability and feasibility and the adherence to them. Reliability was evaluated using the reproducibility of 6 indicators in peer review, with the second observer being an outsider. The feasibility and measurement of the level of adherence to the 22 indicators were conducted with annual fragmented retrospective collection of information from specific databases or the clinical charts of the nine participating hospitals. Reliability was very high, with interobserver agreement levels higher than 95% in 5 of the 6 indicators. The median time to obtain the indicators ranged between 5 and 600 minutes, but they could be obtained progressively from specific databases, enabling automatic collection. Good adherence was found for the indicators related to the initial evaluation of patients, the indication and suitability of ART guidelines, adherence to ART, follow-up in clinics, and achievement of an undetectable HIV viral load by PCR at week 48 of ART. For the quality indicators related to the prevention of opportunistic infections and control of comorbidities, the standards set were not achieved, and significant heterogeneity was observed between hospitals. The GESIDA quality indicators of HIV infection enabled the relevant indicators to be measured feasibly and reliably, and should be collected in all units that care for patients with HIV infection. Copyright © 2015 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.

  16. Incorporating driver behaviors into connected and automated vehicle simulation.

    DOT National Transportation Integrated Search

    2016-05-24

    The adoption of connected vehicle (CV) technology is anticipated at various levels of development and deployment over the next decade. One primary challenge with these new technologies is the lack of platform to enable a robust and reliable evaluatio...

  17. 77 FR 38706 - Agency Information Collection Activities: Request for Comments for a New Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-28

    ... requested. For instance, a prize may be awarded to the solution of a challenge to develop an algorithm that enables reliable prediction of a certain event. A responder could submit the correct algorithm, but...

  18. Demographic Planning: An Action Approach

    ERIC Educational Resources Information Center

    Finch, Harold L.; Smith, Joyce

    1974-01-01

    Community colleges are in a good position to obtain reliable long-term forecasts of future demand. An approach developed at Johnson County Community College in Overland Park, Kansas, has enabled the college to assist other community institutions in their parallel planning efforts. (Author/MLF)

  19. Synthetic spike-in standards for high-throughput 16S rRNA gene amplicon sequencing.

    PubMed

    Tourlousse, Dieter M; Yoshiike, Satowa; Ohashi, Akiko; Matsukura, Satoko; Noda, Naohiro; Sekiguchi, Yuji

    2017-02-28

    High-throughput sequencing of 16S rRNA gene amplicons (16S-seq) has become a widely deployed method for profiling complex microbial communities but technical pitfalls related to data reliability and quantification remain to be fully addressed. In this work, we have developed and implemented a set of synthetic 16S rRNA genes to serve as universal spike-in standards for 16S-seq experiments. The spike-ins represent full-length 16S rRNA genes containing artificial variable regions with negligible identity to known nucleotide sequences, permitting unambiguous identification of spike-in sequences in 16S-seq read data from any microbiome sample. Using defined mock communities and environmental microbiota, we characterized the performance of the spike-in standards and demonstrated their utility for evaluating data quality on a per-sample basis. Further, we showed that staggered spike-in mixtures added at the point of DNA extraction enable concurrent estimation of absolute microbial abundances suitable for comparative analysis. Results also underscored that template-specific Illumina sequencing artifacts may lead to biases in the perceived abundance of certain taxa. Taken together, the spike-in standards represent a novel bioanalytical tool that can substantially improve 16S-seq-based microbiome studies by enabling comprehensive quality control along with absolute quantification. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
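The absolute-quantification step enabled by the spike-in standards can be illustrated with a minimal sketch. The simple proportional scaling rule (reads recovered proportional to copies present) is our assumption of the simplest possible model, not the paper's full calibration:

```python
def absolute_abundances(taxon_counts, spikein_counts, spikein_copies_added):
    """Convert relative 16S read counts to absolute copy estimates using
    spike-in standards added at the point of DNA extraction. Assumes
    reads recovered are proportional to copies present."""
    total_spike_reads = sum(spikein_counts.values())
    # copies represented per sequencing read, calibrated from the
    # known number of spike-in copies added to the sample
    copies_per_read = spikein_copies_added / total_spike_reads
    return {taxon: reads * copies_per_read
            for taxon, reads in taxon_counts.items()}
```

Because the scale factor is computed per sample, the resulting absolute estimates are comparable across samples even when sequencing depth varies.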

  20. Scaling Impacts in Life Support Architecture and Technology Selection

    NASA Technical Reports Server (NTRS)

    Lange, Kevin

    2016-01-01

    For long-duration space missions outside of Earth orbit, reliability considerations will drive higher levels of redundancy and/or on-board spares for life support equipment. Component scaling will be a critical element in minimizing overall launch mass while maintaining an acceptable level of system reliability. Building on an earlier reliability study (AIAA 2012-3491), this paper considers the impact of alternative scaling approaches, including the design of technology assemblies and their individual components to maximum, nominal, survival, or other fractional requirements. The optimal level of life support system closure is evaluated for deep-space missions of varying duration using equivalent system mass (ESM) as the comparative basis. Reliability impacts are included in ESM by estimating the number of component spares required to meet a target system reliability. Common cause failures are included in the analysis. ISS and ISS-derived life support technologies are considered along with selected alternatives. This study focusses on minimizing launch mass, which may be enabling for deep-space missions.
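The spares-to-reliability calculation mentioned above can be illustrated with a standard Poisson spares model; the constant-failure-rate assumption is ours, not necessarily the model used in the study:

```python
import math

def spares_for_target(failure_rate, mission_time, target_reliability):
    """Smallest number of spares k such that the probability of at most
    k component failures during the mission (Poisson with mean
    rate * time) meets the target reliability."""
    mean_failures = failure_rate * mission_time
    term = math.exp(-mean_failures)  # P(exactly 0 failures)
    cdf, k = term, 0
    while cdf < target_reliability:
        k += 1
        term *= mean_failures / k   # Poisson recurrence P(k) from P(k-1)
        cdf += term
    return k
```

Summing spare masses computed this way across all components is one way reliability enters an equivalent-system-mass comparison between architectures.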

  1. New generation lidar systems for eye safe full time observations

    NASA Technical Reports Server (NTRS)

    Spinhirne, James D.

    1995-01-01

The traditional lidar over the last thirty years has typically been a big-pulse, low-repetition-rate system. Pulse energies are in the 0.1 to 1.0 J range and repetition rates from 0.1 to 10 Hz. While such systems have proven to be good research tools, they have a number of limitations that prevent them from moving beyond lidar research to operational, application-oriented instruments. These problems include a lack of eye safety, very low efficiency, poor reliability, lack of ruggedness, and high development and operating costs. Recent advances in solid-state lasers, detectors, and data systems have enabled the development of a new generation of lidar technology that meets the need for routine, application-oriented instruments. In this paper the new approaches to operational lidar systems will be discussed. Micro pulse lidar (MPL) systems are currently in use, and their technology is highlighted. The basis and current development of continuous wave (CW) lidar and the potential of other technical approaches are presented.

  2. Kinetic turbulence simulations at extreme scale on leadership-class systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Bei; Ethier, Stephane; Tang, William

    2013-01-01

Reliable predictive simulation capability addressing confinement properties in magnetically confined fusion plasmas is critically important for ITER, a 20 billion dollar international burning plasma device under construction in France. The complex study of kinetic turbulence, which can severely limit the energy confinement and impact the economic viability of fusion systems, requires simulations at extreme scale for such an unprecedented device size. Our newly optimized, global, ab initio particle-in-cell code solving the nonlinear equations underlying gyrokinetic theory achieves excellent performance with respect to "time to solution" at the full capacity of the IBM Blue Gene/Q on the 786,432 cores of Mira at ALCF and recently on the 1,572,864 cores of Sequoia at LLNL. Recent multithreading and domain decomposition optimizations in the new GTC-P code represent critically important software advances for modern, low-memory-per-core systems by enabling routine simulations at unprecedented size (130 million grid points, ITER-scale) and resolution (65 billion particles).

  3. An association account of false belief understanding.

    PubMed

    De Bruin, L C; Newen, A

    2012-05-01

The elicited-response false belief task has traditionally been considered as reliably indicating that children acquire an understanding of false belief around 4 years of age. However, recent investigations using spontaneous-response tasks suggest that false belief understanding emerges much earlier. This leads to a developmental paradox: if young infants already understand false belief, then why do they fail the elicited-response false belief task? We postulate two systems to account for the development of false belief understanding: an association module, which provides infants with the capacity to register congruent associations between agents and objects, and an operating system, which allows them to transform these associations into incongruent associations through a process of inhibition, selection and representation. The interaction between the association module and the operating system enables infants to register increasingly complex associations on the basis of another agent's movements, visual perspective and propositional attitudes. This allows us to account for the full range of findings on false belief understanding. Copyright © 2012 Elsevier B.V. All rights reserved.

  4. GridPix detectors: Production and beam test results

    NASA Astrophysics Data System (ADS)

    Koppert, W. J. C.; van Bakel, N.; Bilevych, Y.; Colas, P.; Desch, K.; Fransen, M.; van der Graaf, H.; Hartjes, F.; Hessey, N. P.; Kaminski, J.; Schmitz, J.; Schön, R.; Zappon, F.

    2013-12-01

The innovative GridPix detector is a Time Projection Chamber (TPC) that is read out with a Timepix-1 pixel chip. By using wafer post-processing techniques, an aluminium grid is placed on top of the chip. When operated, the electric field between the grid and the chip is sufficient to create electron-induced avalanches which are detected by the pixels. The time-to-digital converter (TDC) records the drift time, enabling the reconstruction of high-precision 3D track segments. Recently, GridPixes were produced at full wafer scale to meet the demand for more reliable and cheaper devices in large quantities. In a recent beam test, the contributions of both diffusion and time walk to the spatial and angular resolutions of a GridPix detector with a 1.2 mm drift gap were studied in detail. In addition, long-term tests show that in a significant fraction of the chips the protection layer successfully quenches discharges, preventing harm to the chip.

  5. Programming Light-Harvesting Efficiency Using DNA Origami

    PubMed Central

    2016-01-01

The remarkable performance and quantum efficiency of biological light-harvesting complexes have prompted a multidisciplinary interest in engineering biologically inspired antenna systems as a possible route to novel solar cell technologies. Key to the effectiveness of biological “nanomachines” in light capture and energy transport is their highly ordered nanoscale architecture of photoactive molecules. Recently, DNA origami has emerged as a powerful tool for organizing multiple chromophores with base-pair accuracy and full geometric freedom. Here, we present a programmable antenna array on a DNA origami platform that enables the implementation of rationally designed antenna structures. We systematically analyze the light-harvesting efficiency with respect to the number of donors and interdye distances of a ring-like antenna using ensemble and single-molecule fluorescence spectroscopy and detailed Förster modeling. This comprehensive study demonstrates exquisite and reliable structural control over multichromophoric geometries and points to DNA origami as a highly versatile platform for testing design concepts in artificial light-harvesting networks. PMID:26906456

  6. Federated software defined network operations for LHC experiments

    NASA Astrophysics Data System (ADS)

    Kim, Dongkyun; Byeon, Okhwan; Cho, Kihyeon

    2013-09-01

    The most well-known high-energy physics collaboration, the Large Hadron Collider (LHC), which is based on e-Science, has been facing several challenges presented by its extraordinary instruments in terms of the generation, distribution, and analysis of large amounts of scientific data. Currently, data distribution issues are being resolved by adopting an advanced Internet technology called software defined networking (SDN). Stability of the SDN operations and management is demanded to keep the federated LHC data distribution networks reliable. Therefore, in this paper, an SDN operation architecture based on the distributed virtual network operations center (DvNOC) is proposed to enable LHC researchers to assume full control of their own global end-to-end data dissemination. This may achieve an enhanced data delivery performance based on data traffic offloading with delay variation. The evaluation results indicate that the overall end-to-end data delivery performance can be improved over multi-domain SDN environments based on the proposed federated SDN/DvNOC operation framework.

  7. Paper Capillary Enables Effective Sampling for Microfluidic Paper Analytical Devices.

    PubMed

    Shangguan, Jin-Wen; Liu, Yu; Wang, Sha; Hou, Yun-Xuan; Xu, Bi-Yi; Xu, Jing-Juan; Chen, Hong-Yuan

    2018-06-06

Paper capillary is introduced to enable effective sampling on microfluidic paper analytical devices. By coupling the macroscale capillary force of the paper capillary with the microscale capillary forces of native paper, fluid transport can be flexibly tailored with proper design. Subsequently, a hybrid-fluid-mode paper capillary device was proposed, which enables fast and reliable sampling in an arrayed form, with less surface adsorption and bias for different components. The resulting device thus well supports high-throughput, quantitative, and repeatable assays operated entirely by hand. With all these merits, multiplex analyses of ions, proteins, and microbes have all been realized on this platform, which has paved the way to level-up analysis on μPADs.

  8. Reliability analysis of laminated CMC components through shell subelement techniques

    NASA Technical Reports Server (NTRS)

    Starlinger, Alois; Duffy, Stephen F.; Gyekenyesi, John P.

    1992-01-01

    An updated version of the integrated design program Composite Ceramics Analysis and Reliability Evaluation of Structures (C/CARES) was developed for the reliability evaluation of ceramic matrix composites (CMC) laminated shell components. The algorithm is now split into two modules: a finite-element data interface program and a reliability evaluation algorithm. More flexibility is achieved, allowing for easy implementation with various finite-element programs. The interface program creates a neutral data base which is then read by the reliability module. This neutral data base concept allows easy data transfer between different computer systems. The new interface program from the finite-element code Matrix Automated Reduction and Coupling (MARC) also includes the option of using hybrid laminates (a combination of plies of different materials or different layups) and allows for variations in temperature fields throughout the component. In the current version of C/CARES, a subelement technique was implemented, enabling stress gradients within an element to be taken into account. The noninteractive reliability function is now evaluated at each Gaussian integration point instead of using averaging techniques. As a result of the increased number of stress evaluation points, considerable improvements in the accuracy of reliability analyses were realized.

  9. We need more replication research - A case for test-retest reliability.

    PubMed

    Leppink, Jimmie; Pérez-Fuster, Patricia

    2017-06-01

Following debates in psychology on the importance of replication research, we have also started to see pleas for a more prominent role for replication research in medical education. To enable replication research, it is of paramount importance to carefully study the reliability of the instruments we use. Cronbach's alpha has been the most widely used estimator of reliability in the field of medical education, notably as a kind of quality label of test or questionnaire scores based on multiple items or of the reliability of assessment across exam stations. However, as this narrative review outlines, Cronbach's alpha or alternative reliability statistics may complement but not replace psychometric methods such as factor analysis. Moreover, multiple-item measurements should be preferred over single-item measurements, and when using single-item measurements, coefficients such as Cronbach's alpha should not be interpreted as indicators of the reliability of a single item when that item is administered after fundamentally different activities, such as learning tasks that differ in content. Finally, if we want to follow up on recent pleas for more replication research, we have to start studying the test-retest reliability of the instruments we use.
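For reference, Cronbach's alpha for a k-item scale is k/(k-1) · (1 − Σ item variances / variance of respondent totals); a minimal sketch:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a multi-item scale. `items` is a list of
    per-item score lists, aligned across respondents."""
    k = len(items)
    n = len(items[0])
    def var(xs):
        # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    sum_item_vars = sum(var(item) for item in items)
    # total score per respondent across all items
    totals = [sum(items[i][r] for i in range(k)) for r in range(n)]
    return k / (k - 1) * (1.0 - sum_item_vars / var(totals))
```

Perfectly consistent items give alpha = 1; the review's point is that a high alpha alone does not establish what factor analysis or test-retest designs can.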

  10. A Delay-Aware and Reliable Data Aggregation for Cyber-Physical Sensing

    PubMed Central

    Zhang, Jinhuan; Long, Jun; Zhang, Chengyuan; Zhao, Guihu

    2017-01-01

Physical information sensed by various sensors in a cyber-physical system should be collected for further operation. In many applications, data aggregation should take reliability and delay into consideration. To address these problems, a novel Tiered Structure Routing-based Delay-Aware and Reliable Data Aggregation scheme named TSR-DARDA for spherical physical objects is proposed. By dividing the spherical network constructed by dispersed sensor nodes into circular tiers with specifically designed widths and cells, TSR-DARDA tries to enable as many nodes as possible to transmit data simultaneously. In order to ensure transmission reliability, lost packets are retransmitted. Moreover, to minimize the latency while maintaining reliability for data collection, in-network aggregation and broadcast techniques are adopted to deal with the transmission between data-collecting nodes in the outer layer and their parent data-collecting nodes in the inner layer. Thus, the optimization problem is transformed to minimizing the delay under reliability constraints by controlling the system parameters. To demonstrate the effectiveness of the proposed scheme, we have conducted extensive theoretical analysis and comparisons to evaluate the performance of TSR-DARDA. The analysis and simulations show that TSR-DARDA leads to lower delay with reliability satisfaction. PMID:28218668

  11. Composite Stress Rupture: A New Reliability Model Based on Strength Decay

    NASA Technical Reports Server (NTRS)

    Reeder, James R.

    2012-01-01

A model is proposed to estimate reliability for stress rupture of composite overwrap pressure vessels (COPVs) and similar composite structures. This new reliability model is generated by assuming a strength degradation (or decay) over time. The model suggests that most of the strength decay occurs late in life. The strength decay model will be shown to predict a response similar to that predicted by a traditional reliability model for stress rupture based on tests at a single stress level. In addition, the model predicts that even though there is strength decay due to proof loading, a significant overall increase in reliability is gained by eliminating any weak vessels, which would fail early. The model predicts that there should be significant periods of safe life following proof loading, because time is required for the strength to decay from the proof stress level to the subsequent loading level. Suggestions for testing the strength decay reliability model have been made. If the strength decay reliability model predictions are shown through testing to be accurate, COPVs may be designed to carry a higher level of stress than is currently allowed, which will enable the production of lighter structures.
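A minimal sketch of the strength-decay idea follows. The power-law decay form and its exponent are illustrative assumptions, not the paper's model; they merely reproduce the qualitative behavior described (decay concentrated late in life, proof loading screening out weak vessels):

```python
def strength_at(t, s0, t_fail, p=8):
    """Strength decay that stays near the initial strength s0 until late
    in life; a large exponent p pushes the decay toward t_fail."""
    return s0 * (1.0 - (t / t_fail) ** p)

def surviving_fraction(initial_strengths, proof_stress, service_stress, t, t_fail):
    """Fraction of proof-screened vessels whose decayed strength still
    exceeds the service stress at time t."""
    # proof loading eliminates vessels weaker than the proof stress
    screened = [s for s in initial_strengths if s > proof_stress]
    alive = [s for s in screened if strength_at(t, s, t_fail) > service_stress]
    return len(alive) / len(screened)
```

Because strength stays near its initial value until late in life, the screened population survives well past proof loading before failures begin, matching the safe-life prediction in the abstract.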

  12. Deciphering the distance to antibiotic resistance for the pneumococcus using genome sequencing data

    PubMed Central

    Mobegi, Fredrick M.; Cremers, Amelieke J. H.; de Jonge, Marien I.; Bentley, Stephen D.; van Hijum, Sacha A. F. T.; Zomer, Aldert

    2017-01-01

    Advances in genome sequencing technologies and genome-wide association studies (GWAS) have provided unprecedented insights into the molecular basis of microbial phenotypes and enabled the identification of the underlying genetic variants in real populations. However, utilization of genome sequencing in clinical phenotyping of bacteria is challenging due to the lack of reliable and accurate approaches. Here, we report a method for predicting microbial resistance patterns using genome sequencing data. We analyzed whole genome sequences of 1,680 Streptococcus pneumoniae isolates from four independent populations using GWAS and identified probable hotspots of genetic variation which correlate with phenotypes of resistance to essential classes of antibiotics. With the premise that accumulation of putative resistance-conferring SNPs, potentially in combination with specific resistance genes, precedes full resistance, we retrogressively surveyed the hotspot loci and quantified the number of SNPs and/or genes, which if accumulated would confer full resistance to an otherwise susceptible strain. We name this approach the ‘distance to resistance’. It can be used to identify the creep towards complete antibiotics resistance in bacteria using genome sequencing. This approach serves as a basis for the development of future sequencing-based methods for predicting resistance profiles of bacterial strains in hospital microbiology and public health settings. PMID:28205635

  13. 78 FR 37431 - Expanding America's Leadership in Wireless Innovation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-20

    ... Policy Team, in consultation with the Department of Justice, the National Archives and Records... making recommendations to the President regarding market-based or other approaches that could give... conditions that promote a reliable secondary market for spectrum, including provisions enabling negotiated...

  14. Energy Advantages for Green Schools

    ERIC Educational Resources Information Center

    Griffin, J. Tim

    2012-01-01

    Because of many advantages associated with central utility systems, school campuses, from large universities to elementary schools, have used district energy for decades. District energy facilities enable thermal and electric utilities to be generated with greater efficiency and higher system reliability, while requiring fewer maintenance and…

  15. System reliability of randomly vibrating structures: Computational modeling and laboratory testing

    NASA Astrophysics Data System (ADS)

    Sundar, V. S.; Ammanagi, S.; Manohar, C. S.

    2015-09-01

    The problem of determination of system reliability of randomly vibrating structures arises in many application areas of engineering. We discuss in this paper approaches based on Monte Carlo simulations and laboratory testing to tackle problems of time-variant system reliability estimation. The strategy we adopt is based on applying Girsanov's transformation to the governing stochastic differential equations, which enables estimation of the probability of failure with a significantly smaller number of samples than is needed in a direct simulation study. Notably, we show that the ideas from Girsanov-transformation-based Monte Carlo simulation can be extended to laboratory testing, so that the system reliability of engineering structures can be assessed with a reduced number of samples and hence reduced testing times. Illustrative examples include computational studies on a 10-degree-of-freedom nonlinear system model and laboratory/computational investigations on the road load response of an automotive system tested on a four-post test rig.
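
    As a rough illustration of the Girsanov-based variance reduction described above (a generic sketch, not the authors' implementation), the snippet below estimates a small barrier-crossing probability for a scalar linear SDE by adding a drift that pushes sample paths toward the barrier, then reweighting each path by the corresponding Radon-Nikodym derivative. All parameter values are arbitrary.

```python
import numpy as np

# Importance sampling for P(max_t X_t > b), with dX = -a*X dt + s*dW.
# Under the new measure we simulate dX = (-a*X + s*u) dt + s*dW~ and
# weight each path by exp(-u*W~_T - 0.5*u^2*T) (Girsanov correction).

def failure_prob_is(a=1.0, s=1.0, b=3.0, u=2.0, T=1.0, n_steps=200,
                    n_samples=5000, seed=0):
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.zeros(n_samples)
    log_lr = np.zeros(n_samples)            # log likelihood ratio per path
    crossed = np.zeros(n_samples, dtype=bool)
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), n_samples)  # increments, new measure
        x += (-a * x + s * u) * dt + s * dw           # drift-shifted dynamics
        log_lr += -u * dw - 0.5 * u**2 * dt           # Girsanov correction
        crossed |= x > b
    return np.mean(crossed * np.exp(log_lr))          # unbiased estimator

p = failure_prob_is()
```

    With a well-chosen shift `u`, many paths reach the barrier, so a few thousand samples suffice for an event that crude Monte Carlo would almost never observe.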

  16. Techniques for control of long-term reliability of complex integrated circuits. I - Reliability assurance by test vehicle qualification.

    NASA Technical Reports Server (NTRS)

    Van Vonno, N. W.

    1972-01-01

    Development of an alternate approach to the conventional methods of reliability assurance for large-scale integrated circuits. The product treated is a large-scale T squared L array designed for space applications. The concept used is that of qualification of product by evaluation of the basic processing used in fabricating the product, providing an insight into its potential reliability. Test vehicles are described which enable evaluation of device characteristics, surface condition, and various parameters of the two-level metallization system used. Evaluation of these test vehicles is performed on a lot qualification basis, with the lot consisting of one wafer. Assembled test vehicles are evaluated by high temperature stress at 300 C for short time durations. Stressing at these temperatures provides a rapid method of evaluation and permits a go/no go decision to be made on the wafer lot in a timely fashion.

  17. Recent GE BWR fuel experience and design evolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, J.E.; Potts, G.A.; Proebstle, R.A.

    1992-01-01

    Reliable fuel operation is essential to safe, reliable, and economic power production by today's commercial nuclear reactors. GE Nuclear Energy is committed to maximizing fuel reliability through the progressive development of improved fuel design features and dedication to providing the maximum quality in the design, fabrication, and operation of GE BWR fuel. Over the last 35 years, GE has designed, fabricated, and placed in operation over 82,000 BWR fuel bundles containing over 5 million fuel rods. This experience includes successful commercial reactor operation of fuel assemblies to greater than 45,000 MWd/MTU bundle average exposure. This paper reports that this extensive experience base has enabled clear identification and characterization of the active failure mechanisms. With this failure mechanism characterization, mitigating actions have been developed and implemented by GE to provide the highest reliability BWR fuel bundles possible.

  18. Enabling reliability assessments of pre-commercial perovskite photovoltaics with lessons learned from industrial standards

    NASA Astrophysics Data System (ADS)

    Snaith, Henry J.; Hacke, Peter

    2018-06-01

    Photovoltaic modules are expected to operate in the field for more than 25 years, so reliability assessment is critical for the commercialization of new photovoltaic technologies. In early development stages, understanding and addressing the device degradation mechanisms are the priorities. However, any technology targeting large-scale deployment must eventually pass industry-standard qualification tests and undergo reliability testing to validate the module lifetime. In this Perspective, we review the methodologies used to assess the reliability of established photovoltaics technologies and to develop standardized qualification tests. We present the stress factors and stress levels for degradation mechanisms currently identified in pre-commercial perovskite devices, along with engineering concepts for mitigation of those degradation modes. Recommendations for complete and transparent reporting of stability tests are given, to facilitate future inter-laboratory comparisons and to further the understanding of field-relevant degradation mechanisms, which will benefit the development of accelerated stress tests.

  19. Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks.

    PubMed

    Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo

    2017-11-05

    Power consumption is a primary concern in Wireless Sensor Networks (WSNs), and a large number of strategies have been proposed to evaluate it. However, those approaches usually consider neither reliability issues nor the power consumption of applications executing in the network. A central concern is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and the network stack while also considering their reliabilities. To solve this problem, we introduce a fully automatic solution for designing power-consumption-aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy to select WSN configurations, and a toolbox named EDEN to fully support the proposed methodology. This solution allows accurate, automated estimation of the power consumption of WSN applications and the network stack.
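
    A minimal sketch of the kind of integrated power/reliability evaluation described above: combine a per-hop link-reliability model with per-operation energy costs to estimate both the expected energy to deliver one application message and the probability it arrives at all. The energy figures and link probabilities are invented for illustration; they are not from the paper's models.

```python
# Toy integrated power/reliability model for a multihop WSN path.
# Retransmissions follow a truncated geometric model: at most
# max_retries retransmissions per hop after the initial attempt.

def expected_delivery_energy(hops, e_tx, e_rx, max_retries=5):
    """hops: per-hop delivery probabilities; e_tx/e_rx: energy per attempt.
    Uses E[min(G, R+1)] = sum_{k=0..R} (1-p)^k for the expected attempts."""
    total = 0.0
    for p in hops:
        attempts = sum((1 - p) ** k for k in range(max_retries + 1))
        total += attempts * (e_tx + e_rx)
    return total

def path_reliability(hops, max_retries=5):
    """Probability the message survives every hop within the retry budget."""
    r = 1.0
    for p in hops:
        r *= 1 - (1 - p) ** (max_retries + 1)
    return r

path = [0.95, 0.90, 0.99]      # hypothetical per-hop success probabilities
energy = expected_delivery_energy(path, e_tx=50e-6, e_rx=30e-6)  # joules
rel = path_reliability(path)
```

    Evaluating such a pair of metrics over candidate configurations is one simple way to trade energy against reliability, in the spirit of the sensitivity analysis the abstract mentions.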

  20. Systems Integration Fact Sheet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2016-06-01

    This fact sheet is an overview of the Systems Integration subprogram at the U.S. Department of Energy SunShot Initiative. The Systems Integration subprogram enables the widespread deployment of safe, reliable, and cost-effective solar energy technologies by addressing the associated technical and non-technical challenges. These include timely and cost-effective interconnection procedures, optimal system planning, accurate prediction of solar resources, monitoring and control of solar power, maintaining grid reliability and stability, and many more. To address the challenges associated with interconnecting and integrating hundreds of gigawatts of solar power onto the electricity grid, the Systems Integration program funds research, development, and demonstration projects in four broad, interrelated focus areas: grid performance and reliability, dispatchability, power electronics, and communications.

  1. Cavitating Propeller Performance in Inclined Shaft Conditions with OpenFOAM: PPTC 2015 Test Case

    NASA Astrophysics Data System (ADS)

    Gaggero, Stefano; Villa, Diego

    2018-05-01

    In this paper, we present our analysis of the non-cavitating and cavitating unsteady performances of the Potsdam Propeller Test Case (PPTC) in oblique flow. For our calculations, we used the Reynolds-averaged Navier-Stokes equation (RANSE) solver from the open-source OpenFOAM libraries. We selected the homogeneous mixture approach to solve for multiphase flow with phase change, using the volume of fluid (VoF) approach to solve the multiphase flow and modeling the mass transfer between vapor and water with the Schnerr-Sauer model. Comparing the model results with the experimental measurements collected during the Second Workshop on Cavitation and Propeller Performance - SMP'15 enabled our assessment of the reliability of the open-source calculations. Comparisons with the numerical data collected during the workshop enabled further analysis of the reliability of different flow solvers from which we produced an overview of recommended guidelines (mesh arrangements and solver setups) for accurate numerical prediction even in off-design conditions. Lastly, we propose a number of calculations using the boundary element method developed at the University of Genoa for assessing the reliability of this dated but still widely adopted approach for design and optimization in the preliminary stages of very demanding test cases.

  2. Rapid Focused Ion Beam Milling Based Fabrication of Plasmonic Nanoparticles and Assemblies via "Sketch and Peel" Strategy.

    PubMed

    Chen, Yiqin; Bi, Kaixi; Wang, Qianjin; Zheng, Mengjie; Liu, Qing; Han, Yunxin; Yang, Junbo; Chang, Shengli; Zhang, Guanhua; Duan, Huigao

    2016-12-27

    Focused ion beam (FIB) milling is a versatile maskless and resistless patterning technique and has been widely used for the fabrication of inverse plasmonic structures such as nanoholes and nanoslits for various applications. However, due to its subtractive milling nature, it is an impractical method to fabricate isolated plasmonic nanoparticles and assemblies which are more commonly adopted in applications. In this work, we propose and demonstrate an approach to reliably and rapidly define plasmonic nanoparticles and their assemblies using FIB milling via a simple "sketch and peel" strategy. Systematic experimental investigations and mechanism studies reveal that the high reliability of this fabrication approach is enabled by a conformally formed sidewall coating due to the ion-milling-induced redeposition. Particularly, we demonstrated that this strategy is also applicable to the state-of-the-art helium ion beam milling technology, with which high-fidelity plasmonic dimers with tiny gaps could be directly and rapidly prototyped. Because the proposed approach enables rapid and reliable patterning of arbitrary plasmonic nanostructures that are not feasible to fabricate via conventional FIB milling process, our work provides the FIB milling technology an additional nanopatterning capability and thus could greatly increase its popularity for utilization in fundamental research and device prototyping.

  3. Assessing Reliability of Cold Spray Sputter Targets in Photovoltaic Manufacturing

    NASA Astrophysics Data System (ADS)

    Hardikar, Kedar; Vlcek, Johannes; Bheemreddy, Venkata; Juliano, Daniel

    2017-10-01

    Cold spray has been used to manufacture more than 800 Cu-In-Ga (CIG) sputter targets for deposition of high-efficiency photovoltaic thin films. It is a preferred technique since it enables high deposit purity and transfer of non-equilibrium alloy states to the target material. In this work, an integrated approach to reliability assessment of such targets with deposit weight in excess of 50 lb. is undertaken, involving thermal-mechanical characterization of the material in as-deposited condition, characterization of the interface adhesion on cylindrical substrate in as-deposited condition, and developing means to assess target integrity under thermal-mechanical loads during the physical vapor deposition (PVD) sputtering process. Mechanical characterization of cold spray deposited CIG alloy is accomplished through the use of indentation testing and adaptation of Brazilian disk test. A custom lever test was developed to characterize adhesion along the cylindrical interface between the CIG deposit and cylindrical substrate, overcoming limitations of current standards. A cohesive zone model for crack initiation and propagation at the deposit interface is developed and validated using the lever test and later used to simulate the potential catastrophic target failure in the PVD process. It is shown that this approach enables reliability assessment of sputter targets and improves robustness.

  4. A prototype for unsupervised analysis of tissue microarrays for cancer research and diagnostics.

    PubMed

    Chen, Wenjin; Reiss, Michael; Foran, David J

    2004-06-01

    The tissue microarray (TMA) technique enables researchers to extract small cylinders of tissue from histological sections and arrange them in a matrix configuration on a recipient paraffin block such that hundreds can be analyzed simultaneously. TMA offers several advantages over traditional specimen preparation by maximizing limited tissue resources and providing a highly efficient means for visualizing molecular targets. By enabling researchers to reliably determine the protein expression profile for specific types of cancer, it may be possible to elucidate the mechanism by which healthy tissues are transformed into malignancies. Currently, the primary methods used to evaluate arrays involve the interactive review of TMA samples while they are viewed under a microscope, subjectively evaluated, and scored by a technician. This process is extremely slow, tedious, and prone to error. In order to facilitate large-scale, multi-institutional studies, a more automated and reliable means for analyzing TMAs is needed. We report here a web-based prototype which features automated imaging, registration, and distributed archiving of TMAs in multiuser network environments. The system utilizes a principal color decomposition approach to identify and characterize the predominant staining signatures of specimens in color space. This strategy was shown to be reliable for detecting and quantifying the immunohistochemical expression levels for TMAs.

  5. Health management and controls for Earth-to-orbit propulsion systems

    NASA Astrophysics Data System (ADS)

    Bickford, R. L.

    1995-03-01

    Avionics and health management technologies increase the safety and reliability while decreasing the overall cost for Earth-to-orbit (ETO) propulsion systems. New ETO propulsion systems will depend on highly reliable fault tolerant flight avionics, advanced sensing systems and artificial intelligence aided software to ensure critical control, safety and maintenance requirements are met in a cost effective manner. Propulsion avionics consist of the engine controller, actuators, sensors, software and ground support elements. In addition to control and safety functions, these elements perform system monitoring for health management. Health management is enhanced by advanced sensing systems and algorithms which provide automated fault detection and enable adaptive control and/or maintenance approaches. Aerojet is developing advanced fault tolerant rocket engine controllers which provide very high levels of reliability. Smart sensors and software systems which significantly enhance fault coverage and enable automated operations are also under development. Smart sensing systems, such as flight capable plume spectrometers, have reached maturity in ground-based applications and are suitable for bridging to flight. Software to detect failed sensors has reached similar maturity. This paper will discuss fault detection and isolation for advanced rocket engine controllers as well as examples of advanced sensing systems and software which significantly improve component failure detection for engine system safety and health management.

  6. Cognitive Decline in Down Syndrome: A Validity/Reliability Study of the Test for Severe Impairment.

    ERIC Educational Resources Information Center

    Cosgrave, Mary P.; McCarron, Mary; Anderson, Mary; Tyrrell, Janette; Gill, Michael; Lawlor, Brian A.

    1998-01-01

    The utility of the Test for Severe Impairment was studied with 60 older persons who had Down Syndrome. Construct validity, test-retest reliability, and interrater reliability were established for the full study group and for subgroups based on degree of mental retardation and dementia status. Some possible applications and limitations of the test…

  7. Reliability of Three Benton Judgment of Line Orientation Short Forms in Idiopathic Parkinson’s Disease

    PubMed Central

    Gullett, Joseph M.; Price, Catherine C.; Nguyen, Peter; Okun, Michael S.; Bauer, Russell M.; Bowers, Dawn

    2013-01-01

    Individuals with Parkinson’s disease (PD) often exhibit deficits in visuospatial functioning throughout the course of their disease. These deficits should be carefully assessed as they may have implications for patient safety and disease severity. One of the most commonly administered tests of visuospatial ability, the Benton Judgment of Line Orientation (JLO), consists of 30 pairs of lines requiring the patient to match the orientation of two lines to an array of 11 lines on a separate page. Reliable short forms have been constructed out of the full JLO form, but the reliability of these forms in PD has yet to be examined. Recent functional MRI studies examining the JLO demonstrate right parietal and occipital activation, as well as bilateral frontal activation and PD is known to adversely affect these pathways. We compared the reliability of the original full form to three unique short forms in a sample of 141 non-demented, idiopathic PD patients and 56 age and education matched controls. Results indicated that a two-thirds length short form can be used with high reliability and classification accuracy in patients with idiopathic PD. The other short forms performed in a similar, though slightly less reliable manner. PMID:23957375

  8. A self-learning camera for the validation of highly variable and pseudorandom patterns

    NASA Astrophysics Data System (ADS)

    Kelley, Michael

    2004-05-01

    Reliable and productive manufacturing operations have depended on people to quickly detect and solve problems whenever they appear. Over the last 20 years, more and more manufacturing operations have embraced machine vision systems to increase productivity, reliability and cost-effectiveness, including reducing the number of human operators required. Although machine vision technology has long been capable of solving simple problems, it has still not been broadly implemented. The reason is that until now, no machine vision system has been designed to meet the unique demands of complicated pattern recognition. The ZiCAM family was specifically developed to be the first practical hardware to meet these needs. To be able to address non-traditional applications, the machine vision industry must include smart camera technology that meets its users' demands for lower costs, better performance and the ability to address applications of irregular lighting, patterns and color. The next-generation smart cameras will need to evolve as a fundamentally different kind of sensor, with new technology that behaves like a human but performs like a computer. Neural network based systems, coupled with self-taught, n-space, non-linear modeling, promise to be the enabler of the next generation of machine vision equipment. Image processing technology is now available that enables a system to match an operator's subjectivity. A Zero-Instruction-Set-Computer (ZISC) powered smart camera allows high-speed fuzzy-logic processing, without the need for computer programming. This can address applications of validating highly variable and pseudo-random patterns. A hardware-based implementation of a neural network, the Zero-Instruction-Set-Computer, enables a vision system to "think" and "inspect" like a human, with the speed and reliability of a machine.

  9. TruMicro Series 2000 sub-400 fs class industrial fiber lasers: adjustment of laser parameters to process requirements

    NASA Astrophysics Data System (ADS)

    Kanal, Florian; Kahmann, Max; Tan, Chuong; Diekamp, Holger; Jansen, Florian; Scelle, Raphael; Budnicki, Aleksander; Sutter, Dirk

    2017-02-01

    The matchless properties of ultrashort laser pulses, such as the enabling of cold processing and non-linear absorption, pave the way to numerous novel applications. Ultrafast lasers arrived in the last decade at a level of reliability suitable for the industrial environment [1]. Within the next years many industrial manufacturing processes in several markets will be replaced by laser-based processes due to their well-known benefits: non-contact wear-free processing, higher process accuracy, an increase of processing speed and often improved economic efficiency compared to conventional processes. Furthermore, new processes will arise with novel sources, addressing previously unsolved challenges. One technical requirement for these exciting new applications will be to optimize the large number of available parameters to the requirements of the application. In this work we present an ultrafast laser system distinguished by its capability to combine high flexibility and real-time, process-inherent adjustment of the parameters with industry-ready reliability. This reliability is ensured by long experience in designing and building ultrashort-pulse lasers, combined with rigorous optimization of the mechanical construction, optical components and the entire laser head for continuous performance. By introducing a new generation of mechanical design in the last few years, TRUMPF enabled its ultrashort-laser platforms to fulfill the very demanding requirements for passively coupling high-energy single-mode radiation into a hollow-core transport fiber. The laser architecture presented here is based on the all-fiber MOPA (master oscillator power amplifier) CPA (chirped pulse amplification) technology. The pulses are generated in a high-repetition-rate mode-locked fiber oscillator, which also enables flexible pulse bursts (groups of multiple pulses) with 20 ns intra-burst pulse separation. An external acousto-optic modulator (XAOM) enables linearization and multi-level quad-loop stabilization of the output power of the laser [2]. In addition to the well-established platform, the latest developments addressed single-pulse energies up to 50 μJ and made femtosecond pulse durations available for the TruMicro Series 2000. Beyond these stabilization aspects, this laser architecture, together with other optical modules and combined with smart laser control software, enables process-driven adjustment of the parameters (e.g. repetition rate, multi-pulse functionalities, pulse energy, pulse duration) by external signals, which will be presented in this work.

  10. Solar Energy Grid Integration Systems (SEGIS): adding functionality while maintaining reliability and economics

    NASA Astrophysics Data System (ADS)

    Bower, Ward

    2011-09-01

    An overview is provided of the activities and progress made under the US DOE Solar Energy Grid Integration Systems (SEGIS) solicitation, whose goal was adding functionality while maintaining reliability and economics. The SEGIS R&D opened pathways for interconnecting PV systems to intelligent utility grids and micro-grids of the future. In addition to new capabilities, the hardware offers "value added" features. The new hardware designs resulted in smaller, less material-intensive products that are being viewed by utilities as enabling dispatchable generation and not just unpredictable negative loads. The technical solutions enable "advanced integrated system" concepts and "smart grid" processes to move forward in a faster and focused manner. The advanced integrated inverters/controllers can now incorporate energy management functionality, intelligent electrical grid support features and a multiplicity of communication technologies. Portals for energy flow and two-way communications have been implemented. SEGIS hardware was developed for the utility grid of today, which was designed for one-way power flow, for intermediate grid scenarios, AND for the grid of tomorrow, which will seamlessly accommodate managed two-way power flows as required by large-scale deployment of solar and other distributed generation. The SEGIS hardware and control developed for today meets existing standards and codes AND provides for future connection to a "smart grid" mode that enables utility control and optimized performance.

  11. A stepwise, multi-objective, multi-variable parameter optimization method for the APEX model

    USDA-ARS?s Scientific Manuscript database

    Proper parameterization enables hydrological models to make reliable estimates of non-point source pollution for effective control measures. The automatic calibration of hydrologic models requires significant computational power limiting its application. The study objective was to develop and eval...

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Youssef, Tarek A.; Elsayed, Ahmed T.; Mohammed, Osama A.

    This study presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurement and control network. The advantages of the data-centric over the message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamically participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure are developed in order to facilitate interoperability and remote access to the testbed. This interface allows control, monitoring, and performing of experiments remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).
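
    The data-centric idea above can be illustrated with a toy in-process bus: publishers and subscribers never address each other directly, they only declare interest in named topics, so nodes can join or fail independently with no single application-layer point of failure. This is a deliberately simplified sketch, not the DDS API; real DDS adds QoS policies, network transport, and peer discovery.

```python
from collections import defaultdict

# Toy topic-keyed data bus illustrating data-centric decoupling.
class DataBus:
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> callbacks
        self._last_value = {}                  # topic -> latest sample

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)
        if topic in self._last_value:          # late-joiner behavior:
            callback(self._last_value[topic])  # replay the latest sample

    def publish(self, topic, sample):
        self._last_value[topic] = sample
        for cb in self._subscribers[topic]:
            cb(sample)

bus = DataBus()
readings = []
bus.publish("feeder1/voltage", 231.8)             # published before any listener
bus.subscribe("feeder1/voltage", readings.append)  # late joiner still sees it
bus.publish("feeder1/voltage", 229.4)
```

    The late-joiner replay loosely mirrors what DDS durability QoS provides: a node added to the grid at runtime immediately receives the current state of the topics it cares about.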

  13. A Probabilistic System Analysis of Intelligent Propulsion System Technologies

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2007-01-01

    NASA's Intelligent Propulsion System Technology (Propulsion 21) project focuses on developing adaptive technologies that will enable commercial gas turbine engines to produce fewer emissions and less noise while increasing reliability. It features adaptive technologies that have included active tip-clearance control for turbine and compressor, active combustion control, turbine aero-thermal and flow control, and enabling technologies such as sensors which are reliable at high operating temperatures and are minimally intrusive. A probabilistic system analysis is performed to evaluate the impact of these technologies on aircraft CO2 (directly proportional to fuel burn) and LTO (landing and takeoff) NO(x) reductions. A 300-passenger aircraft, with two 396-kN thrust (85,000-pound) engines, is chosen for the study. The results show that NASA's Intelligent Propulsion System technologies have the potential to significantly reduce CO2 and NO(x) emissions. The results are used to support informed decision-making on the development of the intelligent propulsion system technology portfolio for CO2 and NO(x) reductions.

  14. Three-dimensional assessment of the asymptomatic and post-stroke shoulder: intra-rater test-retest reliability and within-subject repeatability of the palpation and digitization approach.

    PubMed

    Pain, Liza A M; Baker, Ross; Sohail, Qazi Zain; Richardson, Denyse; Zabjek, Karl; Mogk, Jeremy P M; Agur, Anne M R

    2018-03-23

    Altered three-dimensional (3D) joint kinematics can contribute to shoulder pathology, including post-stroke shoulder pain. Reliable assessment methods enable comparative studies between asymptomatic shoulders of healthy subjects and painful shoulders of post-stroke subjects, and could inform treatment planning for post-stroke shoulder pain. The study purpose was to establish intra-rater test-retest reliability and within-subject repeatability of a palpation/digitization protocol, which assesses 3D clavicular/scapular/humeral rotations, in asymptomatic and painful post-stroke shoulders. Repeated measurements of 3D clavicular/scapular/humeral joint/segment rotations were obtained using palpation/digitization in 32 asymptomatic and six painful post-stroke shoulders during four reaching postures (rest/flexion/abduction/external rotation). Intra-class correlation coefficients (ICCs), standard error of the measurement and 95% confidence intervals were calculated. All ICC values indicated high to very high test-retest reliability (≥0.70), with lower reliability for scapular anterior/posterior tilt during external rotation in asymptomatic subjects, and scapular medial/lateral rotation, humeral horizontal abduction/adduction and axial rotation during abduction in post-stroke subjects. All standard error of measurement values demonstrated within-subject repeatability error ≤5° for all clavicular/scapular/humeral joint/segment rotations (asymptomatic ≤3.75°; post-stroke ≤5.0°), except for humeral axial rotation (asymptomatic ≤5°; post-stroke ≤15°). This noninvasive, clinically feasible palpation/digitization protocol was reliable and repeatable in asymptomatic shoulders, and in a smaller sample of painful post-stroke shoulders. Implications for Rehabilitation In the clinical setting, a reliable and repeatable noninvasive method for assessment of three-dimensional (3D) clavicular/scapular/humeral joint orientation and range of motion (ROM) is currently required. 
The established reliability and repeatability of this proposed palpation/digitization protocol will enable comparative 3D ROM studies between asymptomatic and post-stroke shoulders, which will further inform treatment planning. Intra-rater test-retest repeatability, which is measured by the standard error of the measure, indicates the range of error associated with a single test measure. Therefore, clinicians can use the standard error of the measure to determine the "true" differences between pre-treatment and post-treatment test scores.
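
    The reliability statistics reported above follow standard psychometric definitions. As a generic textbook sketch (not the authors' exact analysis pipeline), an intra-class correlation can be computed from test-retest data with a one-way ANOVA, and the standard error of measurement as SEM = SD × sqrt(1 − ICC); the measurement values below are invented for illustration.

```python
import math

# One-way random-effects intra-class correlation, ICC(1,1), and the
# standard error of measurement (SEM) from test-retest data.

def icc_1_1(data):
    """data: list of [session1, session2, ...] measurements per subject."""
    n = len(data)
    k = len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    subj_means = [sum(row) / k for row in data]
    ms_between = k * sum((m - grand) ** 2 for m in subj_means) / (n - 1)
    ms_within = sum((x - m) ** 2 for row, m in zip(data, subj_means)
                    for x in row) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

def sem(data, icc):
    """SEM = sample SD of all measurements * sqrt(1 - ICC)."""
    flat = [x for row in data for x in row]
    mean = sum(flat) / len(flat)
    sd = math.sqrt(sum((x - mean) ** 2 for x in flat) / (len(flat) - 1))
    return sd * math.sqrt(1 - icc)

# Invented test-retest angles (degrees) for four subjects, two sessions.
scores = [[30.0, 31.0], [42.0, 40.5], [25.5, 26.0], [38.0, 38.5]]
icc = icc_1_1(scores)
error = sem(scores, icc)
```

    A clinician can then read the SEM exactly as the abstract suggests: pre/post-treatment differences smaller than the SEM are within measurement error rather than "true" change.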

  15. The effectiveness of using the calculated braking current for longitudinal differential protection of 110 - 750 kV shunt reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vdovin, S. A.; Shalimov, A. S.

    2013-05-15

    The use of calculated current braking in the longitudinal differential protection of shunt reactors to offset current surges is considered; this enables the sensitivity of the differential protection to be increased for short circuits with low fault currents. It is shown that the use of the calculated braking characteristic increases the reliability with which the protection is restrained during transients when the reactor is connected, a process accompanied by the flow of asymmetric currents containing an aperiodic component.
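
    The braking (restraint) characteristic discussed above can be illustrated with the classical percentage-restraint trip rule: the relay operates only when the differential current exceeds a pickup level plus a slope times the braking current, which desensitizes it to through currents and inrush while preserving sensitivity to low-current internal faults. The settings below are arbitrary illustrative values, not the paper's.

```python
# Percentage-restraint (braking) differential characteristic sketch.
# Currents are in per-unit; pickup and slope are illustrative settings.

def differential_trip(i_side1, i_side2, pickup=0.2, slope=0.3):
    """Trip when I_diff > pickup + slope * I_brake."""
    i_diff = abs(i_side1 - i_side2)                 # operating current
    i_brake = 0.5 * (abs(i_side1) + abs(i_side2))   # braking (restraint) current
    return i_diff > pickup + slope * i_brake

# Through current: large, nearly equal currents -> heavily restrained, no trip.
through = differential_trip(5.0, 4.8)
# Internal fault with low fault current -> small restraint, relay still trips.
internal = differential_trip(0.9, 0.1)
```

    The key property is visible in the two cases: the same 0.2 p.u. mismatch that is ignored during a through fault would trip the relay at low load, because the restraint term scales with the through current.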

  16. Synthesis of Natural and Unnatural Cyclooligomeric Depsipeptides Enabled by Flow Chemistry.

    PubMed

    Lücke, Daniel; Dalton, Toryn; Ley, Steven V; Wilson, Zoe E

    2016-03-14

    Flow chemistry has been successfully integrated into the synthesis of a series of cyclooligomeric depsipeptides of three different ring sizes including the natural products beauvericin (1 a), bassianolide (2 b) and enniatin C (1 b). A reliable flow chemistry protocol was established for the coupling and macrocyclisation to form challenging N-methylated amides. This flexible approach has allowed the rapid synthesis of both natural and unnatural depsipeptides in high yields, enabling further exploration of their promising biological activity. © 2016 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  17. Propulsion controls

    NASA Technical Reports Server (NTRS)

    Harkney, R. D.

    1980-01-01

    Increased system requirements and functional integration with the aircraft have placed an increased demand on control system capability and reliability. To provide these at an affordable cost and weight and because of the rapid advances in electronic technology, hydromechanical systems are being phased out in favor of digital electronic systems. The transition is expected to be orderly from electronic trimming of hydromechanical controls to full authority digital electronic control. Future propulsion system controls will be highly reliable full authority digital electronic with selected component and circuit redundancy to provide the required safety and reliability. Redundancy may include a complete backup control of a different technology for single engine applications. The propulsion control will be required to communicate rapidly with the various flight and fire control avionics as part of an integrated control concept.

  18. Composite Reliability and Standard Errors of Measurement for a Seven-Subtest Short Form of the Wechsler Adult Intelligence Scale-Revised.

    ERIC Educational Resources Information Center

    Schretlen, David; And Others

    1994-01-01

    Composite reliability and standard errors of measurement were computed for prorated Verbal, Performance, and Full-Scale intelligence quotient (IQ) scores from a seven-subtest short form of the Wechsler Adult Intelligence Scale-Revised. Results with 1,880 adults (standardization sample) indicate that this form is as reliable as the complete test.…

  19. Qualification and Reliability for MEMS and IC Packages

    NASA Technical Reports Server (NTRS)

    Ghaffarian, Reza

    2004-01-01

Advanced IC electronic packages are moving toward miniaturization from two key approaches, front-end and back-end processes, each with its own challenges. Moving more of the back-end process to the front end, e.g., microelectromechanical systems (MEMS) wafer-level packaging (WLP), enables reductions in size and cost. Use of direct flip-chip die is the most efficient approach if and when the issues of known good die and board assembly are resolved. Wafer-level packaging solves the known-good-die issue by enabling package-level test, but it has its own limitations, e.g., I/O count, additional cost, and reliability. From the back-end approach, system-in-a-package (SIAP/SIP) development is a response to an increasing demand for package and die integration of different functions into one unit to reduce size and cost and improve functionality. MEMS add another challenging dimension to electronic packaging since they include moving mechanical elements. Conventional qualification and reliability testing must be modified and expanded in most cases in order to detect new, unknown failure modes. This paper reviews four standards, already released or in development, that specifically address qualification and reliability of assembled packages. Exposure to thermal cycles, monotonic bend testing, mechanical shock, and drop are covered in these specifications. Finally, mechanical and thermal-cycle qualification data generated for a MEMS accelerometer are presented. The MEMS device was an element of an inertial measurement unit (IMU) qualified for NASA's Mars Exploration Rovers (MERs), Spirit and Opportunity, which are currently roving the Martian surface.

  20. “Retention Projection” Enables Reliable Use of Shared Gas Chromatographic Retention Data Across Labs, Instruments, and Methods

    PubMed Central

    Barnes, Brian B.; Wilson, Michael B.; Carr, Peter W.; Vitha, Mark F.; Broeckling, Corey D.; Heuberger, Adam L.; Prenni, Jessica; Janis, Gregory C.; Corcoran, Henry; Snow, Nicholas H.; Chopra, Shilpi; Dhandapani, Ramkumar; Tawfall, Amanda; Sumner, Lloyd W.; Boswell, Paul G.

    2014-01-01

    Gas chromatography-mass spectrometry (GC-MS) is a primary tool used to identify compounds in complex samples. Both mass spectra and GC retention times are matched to those of standards, but it is often impractical to have standards on hand for every compound of interest, so we must rely on shared databases of MS data and GC retention information. Unfortunately, retention databases (e.g. linear retention index libraries) are experimentally restrictive, notoriously unreliable, and strongly instrument dependent, relegating GC retention information to a minor, often negligible role in compound identification despite its potential power. A new methodology called “retention projection” has great potential to overcome the limitations of shared chromatographic databases. In this work, we tested the reliability of the methodology in five independent laboratories. We found that even when each lab ran nominally the same method, the methodology was 3-fold more accurate than retention indexing because it properly accounted for unintentional differences between the GC-MS systems. When the labs used different methods of their own choosing, retention projections were 4- to 165-fold more accurate. More importantly, the distribution of error in the retention projections was predictable across different methods and labs, thus enabling automatic calculation of retention time tolerance windows. Tolerance windows at 99% confidence were generally narrower than those widely used even when physical standards are on hand to measure their retention. With its high accuracy and reliability, the new retention projection methodology makes GC retention a reliable, precise tool for compound identification, even when standards are not available to the user. PMID:24205931

  1. Method for reworkable packaging of high speed, low electrical parasitic power electronics modules through gate drive integration

    DOEpatents

    Passmore, Brandon; Cole, Zach; Whitaker, Bret; Barkley, Adam; McNutt, Ty; Lostetter, Alexander

    2016-08-02

A multichip power module directly connecting the busboard to a printed-circuit board attached to the power substrate, enabling extremely low loop inductance for extreme environments such as high-temperature operation. Wire-bond interconnections are taught from the power die directly to the busboard, further enabling low-parasitic interconnections. Integration of on-board high-frequency bus capacitors provides extremely low loop inductance. An extreme-environment gate driver board allows close physical proximity of the gate driver and power stage to reduce overall volume and reduce impedance in the control circuit. Parallel spring-loaded pin connections to the gate driver PCB allow a reliable and reworkable power-module-to-gate-driver interconnection.

  2. Tracking reliability for space cabin-borne equipment in development by Crow model.

    PubMed

    Chen, J D; Jiao, S J; Sun, H L

    2001-12-01

Objective. To study and track the reliability growth of manned spaceflight cabin-borne equipment in the course of its development. Method. A new technique of reliability growth estimation and prediction, composed of the Crow model and a test data conversion (TDC) method, was used. Result. The estimated and predicted reliability growth values conformed to expectations. Conclusion. The method can dynamically estimate and predict the reliability of the equipment by making full use of the various test information generated in the course of its development. It offers not only the possibility of tracking the equipment's reliability growth, but also a reference for quality control in the design and development of manned spaceflight cabin-borne equipment.
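The Crow (AMSAA / NHPP power-law) model referenced above has standard closed-form maximum-likelihood estimators for a time-truncated test. This sketch uses invented failure times, not the cabin-borne equipment data:

```python
import math

def crow_amsaa_mle(failure_times, total_time):
    """Maximum-likelihood estimates for the Crow (AMSAA power-law)
    reliability growth model, time-truncated test.

    failure_times: cumulative times of the n observed failures.
    total_time: total test time T (>= the last failure time).
    Returns (beta, lam); beta < 1 indicates reliability growth.
    """
    n = len(failure_times)
    beta = n / sum(math.log(total_time / t) for t in failure_times)
    lam = n / total_time ** beta
    return beta, lam

def instantaneous_mtbf(beta, lam, t):
    """Current MTBF at time t: reciprocal of the failure intensity
    rho(t) = lam * beta * t**(beta - 1)."""
    return 1.0 / (lam * beta * t ** (beta - 1))

# Illustrative cumulative failure times (hours) over a 500 h test
beta, lam = crow_amsaa_mle([10, 50, 150, 400], 500)
print(f"beta = {beta:.3f}")  # beta = 0.523, i.e. reliability growth
```

Because failures here space out over time, the fitted beta is well below 1, which is the model's signature of growth; tracking beta across the development program is what enables the kind of monitoring the abstract describes.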

  3. Entrepreneurship Education. OPTIONS. Expanding Educational Services for Adults.

    ERIC Educational Resources Information Center

    Belcher, James O.; Warmbrod, Catharine P.

    This monograph is part of OPTIONS, a packaged set of materials developed to provide postsecondary administrators, program planners, curriculum developers, counselors, and instructors with up-to-date, reliable information. This volume and two other monographs are intended to enable counselors and instructors to establish and conduct special…

  4. Generalizability Theory and Classical Test Theory

    ERIC Educational Resources Information Center

    Brennan, Robert L.

    2011-01-01

    Broadly conceived, reliability involves quantifying the consistencies and inconsistencies in observed scores. Generalizability theory, or G theory, is particularly well suited to addressing such matters in that it enables an investigator to quantify and distinguish the sources of inconsistencies in observed scores that arise, or could arise, over…

  5. Physical activity problem-solving inventory for adolescents: Development and initial validation

    USDA-ARS?s Scientific Manuscript database

    Youth encounter physical activity barriers, often called problems. The purpose of problem-solving is to generate solutions to overcome the barriers. Enhancing problem-solving ability may enable youth to be more physically active. Therefore, a method for reliably assessing physical activity problem-s...

  6. Innovations in Mission Architectures for Human and Robotic Exploration Beyond Low Earth Orbit

    NASA Technical Reports Server (NTRS)

    Cooke, Douglas R.; Joosten, B. Kent; Lo, Martin W.; Ford, Ken; Hansen, Jack

    2002-01-01

    Through the application of advanced technologies, mission concepts, and new ideas in combining capabilities, architectures for missions beyond Earth orbit have been dramatically simplified. These concepts enable a stepping stone approach to discovery driven, technology enabled exploration. Numbers and masses of vehicles required are greatly reduced, yet enable the pursuit of a broader range of objectives. The scope of missions addressed range from the assembly and maintenance of arrays of telescopes for emplacement at the Earth-Sun L2, to Human missions to asteroids, the moon and Mars. Vehicle designs are developed for proof of concept, to validate mission approaches and understand the value of new technologies. The stepping stone approach employs an incremental buildup of capabilities; allowing for decision points on exploration objectives. It enables testing of technologies to achieve greater reliability and understanding of costs for the next steps in exploration.

  7. [The assessment of family resources and need for help: Construct validity and reliability of the Systematic Exploration and Process Inventory for health professionals in early childhood intervention services (SEVG)].

    PubMed

    Scharmanski, Sara; Renner, Ilona

    2016-12-01

    Health professionals in early childhood intervention and prevention make an important contribution by helping burdened families with young children cope with everyday life and child raising issues. A prerequisite for success is the health professionals' ability to tailor their services to the specific needs of families. The "Systematic Exploration and Process Inventory for health professionals in early childhood intervention services (SEVG)" can be used to identify each family's individual resources and needs, enabling a valid, reliable and objective assessment of the conditions and the process of counseling service. The present paper presents the statistical analyses that were used to confirm the reliability of the inventory. Based on the results of the reliability analysis and principal component analysis (PCA), the SEVG seems to be a reliable and objective inventory for assessing families' need for support. It also allows for calculation of average values of each scale. The development of valid and reliable assessments is essential to quality assurance and the professionalization of interventions in early childhood service. Copyright © 2016. Published by Elsevier GmbH.

  8. Large Liquid Rocket Testing: Strategies and Challenges

    NASA Technical Reports Server (NTRS)

    Rahman, Shamim A.; Hebert, Bartt J.

    2005-01-01

    Rocket propulsion development is enabled by rigorous ground testing in order to mitigate the propulsion systems risks that are inherent in space flight. This is true for virtually all propulsive devices of a space vehicle including liquid and solid rocket propulsion, chemical and non-chemical propulsion, boost stage and in-space propulsion and so forth. In particular, large liquid rocket propulsion development and testing over the past five decades of human and robotic space flight has involved a combination of component-level testing and engine-level testing to first demonstrate that the propulsion devices were designed to meet the specified requirements for the Earth to Orbit launchers that they powered. This was followed by a vigorous test campaign to demonstrate the designed propulsion articles over the required operational envelope, and over robust margins, such that a sufficiently reliable propulsion system is delivered prior to first flight. It is possible that hundreds of tests, and on the order of a hundred thousand test seconds, are needed to achieve a high-reliability, flight-ready, liquid rocket engine system. This paper overviews aspects of earlier and recent experience of liquid rocket propulsion testing at NASA Stennis Space Center, where full scale flight engines and flight stages, as well as a significant amount of development testing has taken place in the past decade. The liquid rocket testing experience discussed includes testing of engine components (gas generators, preburners, thrust chambers, pumps, powerheads), as well as engine systems and complete stages. The number of tests, accumulated test seconds, and years of test stand occupancy needed to meet varying test objectives, will be selectively discussed and compared for the wide variety of ground test work that has been conducted at Stennis for subscale and full scale liquid rocket devices. 
Since rocket propulsion is a crucial long-lead element of any space system acquisition or development, the appropriate plan and strategy must be put in place at the outset of the development effort. A deferment of this test planning, or inattention to strategy, will compromise the ability of the development program to achieve its systems reliability requirements and/or its development milestones. It is important for the government leadership and support team, as well as the vehicle and propulsion development team, to give early consideration to this aspect of space propulsion and space transportation work.

  9. HPLC-MRM relative quantification analysis of fatty acids based on a novel derivatization strategy.

    PubMed

    Cai, Tie; Ting, Hu; Xin-Xiang, Zhang; Jiang, Zhou; Jin-Lan, Zhang

    2014-12-07

    Fatty acids (FAs) are associated with a series of diseases including tumors, diabetes, and heart diseases. As potential biomarkers, FAs have attracted increasing attention from both biological researchers and the pharmaceutical industry. However, poor ionization efficiency, extreme diversity, strict dependence on internal standards and complicated multiple reaction monitoring (MRM) optimization protocols have challenged efforts to quantify FAs. In this work, a novel derivatization strategy based on 2,4-bis(diethylamino)-6-hydrazino-1,3,5-triazine was developed to enable quantification of FAs. The sensitivity of FA detection was significantly enhanced as a result of the derivatization procedure. FA quantities as low as 10 fg could be detected by high-performance liquid chromatography coupled with triple-quadrupole mass spectrometry. General MRM conditions were developed for any FA, which facilitated the quantification and extended the application of the method. The FA quantification strategy based on HPLC-MRM was carried out using deuterated derivatization reagents. "Heavy" derivatization reagents were used as internal standards (ISs) to minimize matrix effects. Prior to statistical analysis, amounts of each FA species were normalized by their corresponding IS, which guaranteed the accuracy and reliability of the method. FA changes in plasma induced by ageing were studied using this strategy. Several FA species were identified as potential ageing biomarkers. The sensitivity, accuracy, reliability, and full coverage of the method ensure that this strategy has strong potential for both biomarker discovery and lipidomic research.

  10. Test-retest reliability and practice effects of a rapid screen of mild traumatic brain injury.

    PubMed

    De Monte, Veronica Eileen; Geffen, Gina Malke; Kwapil, Karleigh

    2005-07-01

Test-retest reliabilities and practice effects of measures from the Rapid Screen of Concussion (RSC), in addition to the Digit Symbol Substitution Test (Digit Symbol), were examined. Twenty-five male participants were tested three times, with testing sessions scheduled a week apart. The test-retest reliability estimates for most measures were reasonably good, ranging from .79 to .97. An exception was the delayed word recall test, which had a reliability estimate of .66 for the first retest and .59 for the second retest. Practice effects were evident from Time 1 to Time 2 on the sentence comprehension and delayed recall subtests of the RSC, Digit Symbol, and a composite score. A practice effect of the same magnitude was also found from Time 2 to Time 3 on Digit Symbol, delayed recall, and the composite score. Statistics for both the first and second retest intervals, with associated practice effects, are presented to enable the calculation of reliable change indices (RCIs). The RCI may be used to assess any improvement in cognitive functioning after mild traumatic brain injury.
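The reliable change index mentioned above is conventionally computed from the baseline SD and test-retest reliability, optionally corrected for practice effects (the Jacobson-Truax form). The scores below are hypothetical; the .79 reliability is simply the lower bound reported above:

```python
import math

def rci(x1, x2, sd_baseline, reliability, practice_effect=0.0):
    """Jacobson-Truax reliable change index, optionally adjusted by the
    mean practice effect observed over the same retest interval.
    |RCI| > 1.96 suggests change beyond measurement error (95% level)."""
    sem = sd_baseline * math.sqrt(1.0 - reliability)
    se_diff = sem * math.sqrt(2.0)
    return (x2 - x1 - practice_effect) / se_diff

# Hypothetical scores: a 6-point gain, SD = 10, r = .79, and a mean
# practice gain of 3 points over the same interval
print(round(rci(50, 56, 10, 0.79, practice_effect=3.0), 2))  # 0.46
```

In this hypothetical case the adjusted RCI falls well under 1.96, so the apparent improvement would not be distinguishable from practice effects plus measurement error.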

  11. Sustainable, Reliable Mission-Systems Architecture

    NASA Technical Reports Server (NTRS)

    O'Neil, Graham; Orr, James K.; Watson, Steve

    2005-01-01

A mission-systems architecture, based on a highly modular infrastructure utilizing open-standards hardware and software interfaces as the enabling technology, is essential for affordable and sustainable space exploration programs. This mission-systems architecture requires (a) robust communication between heterogeneous systems, (b) high reliability, (c) minimal mission-to-mission reconfiguration, (d) affordable development, system integration, and verification of systems, and (e) minimal sustaining engineering. This paper proposes such an architecture. Lessons learned from the Space Shuttle program and Earthbound complex engineered systems are applied to define the model. Technology projections reaching out 5 years are made to refine model details.

  12. A Sustainable, Reliable Mission-Systems Architecture that Supports a System of Systems Approach to Space Exploration

    NASA Technical Reports Server (NTRS)

    Watson, Steve; Orr, Jim; O'Neil, Graham

    2004-01-01

    A mission-systems architecture based on a highly modular "systems of systems" infrastructure utilizing open-standards hardware and software interfaces as the enabling technology is absolutely essential for an affordable and sustainable space exploration program. This architecture requires (a) robust communication between heterogeneous systems, (b) high reliability, (c) minimal mission-to-mission reconfiguration, (d) affordable development, system integration, and verification of systems, and (e) minimum sustaining engineering. This paper proposes such an architecture. Lessons learned from the space shuttle program are applied to help define and refine the model.

  13. Improved ATIR concentrator photovoltaic module

    NASA Astrophysics Data System (ADS)

    Adriani, Paul M.; Mao, Erwang

    2013-09-01

    Novel aggregated total internal reflection (ATIR) concentrator photovoltaic module design comprises 2-D shaped primary and secondary optics that effectively combine optical efficiency, low profile, convenient range of acceptance angles, reliability, and manufacturability. This novel optical design builds upon previous investigations by improving the shapes of primary and secondary optics to enable improved long-term reliability and manufacturability. This low profile, low concentration (5x to 10x) design fits well with one-axis trackers that are often used for flat plate crystalline silicon photovoltaic modules in large scale ground mount installations. Standard mounting points, materials, and procedures apply without changes from flat plate modules.

  14. Dielectric Spectroscopic Detection of Early Failures in 3-D Integrated Circuits.

    PubMed

    Obeng, Yaw; Okoro, C A; Ahn, Jung-Joon; You, Lin; Kopanski, Joseph J

The commercial introduction of three-dimensional integrated circuits (3D-ICs) has been hindered by reliability challenges, such as stress-related failures, resistivity changes, and unexplained early failures. In this paper, we discuss a new RF-based metrology, based on dielectric spectroscopy, for detecting and characterizing electrically active defects in fully integrated 3D devices. These defects are traceable to the chemistry of the isolation dielectrics used in through-silicon via (TSV) construction. We show that these defects may be responsible for some of the unexplained early reliability failures observed in TSV-enabled 3D devices.

  15. Sustainable, Reliable Mission-Systems Architecture

    NASA Technical Reports Server (NTRS)

    O'Neil, Graham; Orr, James K.; Watson, Steve

    2007-01-01

A mission-systems architecture, based on a highly modular infrastructure utilizing open-standards hardware and software interfaces as the enabling technology, is essential for affordable and sustainable space exploration programs. This mission-systems architecture requires (a) robust communication between heterogeneous systems, (b) high reliability, (c) minimal mission-to-mission reconfiguration, (d) affordable development, system integration, and verification of systems, and (e) minimal sustaining engineering. This paper proposes such an architecture. Lessons learned from the Space Shuttle program and Earthbound complex engineered systems are applied to define the model. Technology projections reaching out 5 years are made to refine model details.

  16. Evaluation, Use, and Refinement of Knowledge Representations through Acquisition Modeling

    ERIC Educational Resources Information Center

    Pearl, Lisa

    2017-01-01

    Generative approaches to language have long recognized the natural link between theories of knowledge representation and theories of knowledge acquisition. The basic idea is that the knowledge representations provided by Universal Grammar enable children to acquire language as reliably as they do because these representations highlight the…

  17. EPAct 2005: A Roadmap for Open Access

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrison, Jay A.

    After nine years of negotiation characterized by significant philosophical swings, Congress came together in the middle to support a moderate vision of open access intended primarily to enable load-serving entities to obtain the transmission service they need to meet the long-term needs of their consumers reliably and economically.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    NETL's Hybrid Performance, or Hyper, facility is a one-of-a-kind laboratory built to develop control strategies for the reliable operation of fuel cell/turbine hybrids and enable the simulation, design, and implementation of commercial equipment. The Hyper facility provides a unique opportunity for researchers to explore issues related to coupling fuel cell and gas turbine technologies.

  19. Atmosphere to Electrons: Enabling the Wind Plant of Tomorrow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Office of Energy Efficiency and Renewable Energy

    2015-11-01

    The U.S. Department of Energy’s Atmosphere to Electrons research initiative is focused on improving the performance and reliability of wind plants by establishing an unprecedented understanding of how the Earth’s atmosphere interacts with the wind plants and developing innovative technologies to maximize energy extraction from the wind.

  20. Development of Key-Enabling Technologies for a Variable-blend Natural Gas Vehicle

    DOT National Transportation Integrated Search

    2017-12-01

    A portable, economic and reliable sensor for the Natural Gas (NG) fuel quality has been developed. Both Wobbe Index (WI) and Methane Indexes (MI) as well as inert gas content (inert%) of the NG fuel can be measured in real time within 5% accuracy. Th...

  1. Human Research Program Opportunities

    NASA Technical Reports Server (NTRS)

    Kundrot, Craig E.

    2014-01-01

    The goal of HRP is to provide human health and performance countermeasures, knowledge, technologies, and tools to enable safe, reliable, and productive human space exploration. The Human Research Program was designed to meet the needs of human space exploration, and understand and reduce the risk to crew health and performance in exploration missions.

  2. 78 FR 48422 - Agency Information Collection Activities: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-08

    ... quantitative data through surveys with working-age (age 18-61) and older American (age 62 and older) consumers in order to develop and refine survey instruments that will enable the CFPB to reliably and... conducting research to identify methods and strategies to educate and counsel seniors, and developing goals...

  3. Reliable High Performance Peta- and Exa-Scale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronevetsky, G

    2012-04-02

As supercomputers become larger and more powerful, they are growing increasingly complex. This is reflected both in the exponentially increasing numbers of components in HPC systems (LLNL is currently installing the 1.6 million core Sequoia system) as well as the wide variety of software and hardware components that a typical system includes. At this scale it becomes infeasible to make each component sufficiently reliable to prevent regular faults somewhere in the system or to account for all possible cross-component interactions. The resulting faults and instability cause HPC applications to crash, perform sub-optimally or even produce erroneous results. As supercomputers continue to approach Exascale performance and full system reliability becomes prohibitively expensive, we will require novel techniques to bridge the gap between the lower reliability provided by hardware systems and users' unchanging need for consistent performance and reliable results. Previous research on HPC system reliability has developed various techniques for tolerating and detecting various types of faults. However, these techniques have seen very limited real applicability because of our poor understanding of how real systems are affected by complex faults such as soft fault-induced bit flips or performance degradations. Prior work on such techniques has had very limited practical utility because it has generally focused on analyzing the behavior of entire software/hardware systems both during normal operation and in the face of faults. Because such behaviors are extremely complex, such studies have only produced coarse behavioral models of limited sets of software/hardware system stacks. Since this provides little insight into the many different system stacks and applications used in practice, this work has had little real-world impact.
My project addresses this problem by developing a modular methodology to analyze the behavior of applications and systems during both normal and faulty operation. By synthesizing models of individual components into whole-system behavior models, my work is making it possible to automatically understand the behavior of arbitrary real-world systems and enable them to tolerate a wide range of system faults. My project is following a multi-pronged research strategy. Section II discusses my work on modeling the behavior of existing applications and systems. Section II.A discusses resilience in the face of soft faults and Section II.B looks at techniques to tolerate performance faults. Finally, Section III presents an alternative approach that studies how a system should be designed from the ground up to make resilience natural and easy.

  4. Validation of the French translation-adaptation of the impact of cancer questionnaire version 2 (IOCv2) in a breast cancer survivor population.

    PubMed

    Blanchin, Myriam; Dauchy, Sarah; Cano, Alejandra; Brédart, Anne; Aaronson, Neil K; Hardouin, Jean-Benoit

    2015-07-29

The Impact of Cancer version 2 (IOCv2) was designed to assess the physical and psychosocial health experience of cancer survivors through its positive and negative impacts. Although the IOCv2 is available in English and Dutch, it has not yet been validated for use in French-speaking populations. The current study was undertaken to provide a comprehensive assessment of the reliability and validity of the French language version of the IOCv2 in a sample of breast cancer survivors. An adapted French version of the IOCv2, together with demographic and medical information, was completed by 243 women to validate the factor structure, convergent/divergent validity, and reliability. Concurrent validity was assessed by correlating the IOCv2 scales with measures from the SF-12, the Posttraumatic Growth Inventory, and the Fear of Cancer Recurrence Inventory. The French version of the IOCv2 supports the structure of the original version, with four positive impact dimensions and four negative impact dimensions. This result was supported by the good fit of the confirmatory factor analysis and the adequate reliability revealed by Cronbach's alpha coefficients and other psychometric indices. The concurrent validity analysis revealed patterns of association between IOCv2 scale scores and the other measures. Unlike in the original version, a structure with a Positive Impact domain consisting of the IOCv2 positive dimensions and a Negative Impact domain consisting of the negative ones was not clearly evidenced in this study. The conditional dimensions Employment Concerns and Relationship Concerns, which apply only when the patient is employed or partnered, saw limited practical use; it was not possible to provide evidence of their validity and reliability, as the corresponding subsamples were not large enough. The scores of these conditional dimensions must therefore be used with full awareness of this limitation of the study.
Integrating the IOCv2 into studies will help evaluate the psychosocial health experience of the growing population of cancer survivors, enabling a better understanding of the multi-dimensional impact of cancer.

  5. Thread-Like CMOS Logic Circuits Enabled by Reel-Processed Single-Walled Carbon Nanotube Transistors via Selective Doping.

    PubMed

    Heo, Jae Sang; Kim, Taehoon; Ban, Seok-Gyu; Kim, Daesik; Lee, Jun Ho; Jur, Jesse S; Kim, Myung-Gil; Kim, Yong-Hoon; Hong, Yongtaek; Park, Sung Kyu

    2017-08-01

The realization of large-area electronics with full integration of 1D thread-like devices may open up a new era for ultraflexible and human-adaptable electronic systems because of their potential advantages in demonstrating scalable complex circuitry by a simply integrated weaving technology. More importantly, the thread-like fiber electronic devices can be achieved using a simple reel-to-reel process, which is strongly required for low-cost and scalable manufacturing technology. Here, high-performance reel-processed complementary metal-oxide-semiconductor (CMOS) integrated circuits are reported on 1D fiber substrates by using selectively chemical-doped single-walled carbon nanotube (SWCNT) transistors. With the introduction of selective n-type doping and a nonrelief photochemical patterning process, p- and n-type SWCNT transistors are successfully implemented on cylindrical fiber substrates under air ambient, enabling high-performance and reliable thread-like CMOS inverter circuits. In addition, it is noteworthy that the optimized reel-coating process can facilitate improvement in the arrangement of SWCNTs, building uniformly well-aligned SWCNT channels, and enhancement of the electrical performance of the devices. The p- and n-type SWCNT transistors exhibit field-effect mobilities of 4.03 and 2.15 cm² V⁻¹ s⁻¹, respectively, with relatively narrow distribution. Moreover, the SWCNT CMOS inverter circuits demonstrate a gain of 6.76 and relatively good dynamic operation at a supply voltage of 5.0 V. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. A reliability and mass perspective of SP-100 Stirling cycle lunar-base powerplant designs

    NASA Technical Reports Server (NTRS)

    Bloomfield, Harvey S.

    1991-01-01

    The purpose was to obtain reliability and mass perspectives on the selection of space power system conceptual designs based on the SP-100 reactor and Stirling-cycle power-generation subsystems. The approach taken was to: (1) develop a criterion for acceptable overall reliability risk as a function of the expected range of emerging-technology subsystem unit reliabilities; (2) conduct reliability and mass analyses for a diverse matrix of 800-kWe lunar-base design configurations employing single and multiple powerplants with both full and partial subsystem redundancy combinations; and (3) derive reliability and mass perspectives on the selection of conceptual design configurations that meet an acceptable reliability criterion with the minimum system mass increase relative to the reference powerplant design. The developed perspectives provided valuable insight into the considerations required to identify and characterize high-reliability, low-mass lunar-base powerplant conceptual designs.
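    The redundancy arithmetic behind such configuration trades can be sketched in a few lines. The following is a hypothetical illustration, not the report's model: subsystems in series, each optionally backed by parallel redundant units.

    ```python
    from math import prod

    def unit_parallel(r, k):
        """Reliability of k identical redundant units in parallel:
        the group fails only if every unit fails."""
        return 1.0 - (1.0 - r) ** k

    def plant_reliability(unit_reliabilities, redundancy):
        """Series reliability of subsystems, each with its own
        redundancy level: the plant works only if every group works."""
        return prod(unit_parallel(r, k)
                    for r, k in zip(unit_reliabilities, redundancy))

    # Hypothetical 4-subsystem powerplant, 0.95 unit reliability each
    units = [0.95] * 4
    print(round(plant_reliability(units, [1, 1, 1, 1]), 4))  # no redundancy
    print(round(plant_reliability(units, [2, 2, 2, 2]), 4))  # dual redundancy
    ```

    Even modest redundancy in the weakest subsystems dominates the reliability-versus-mass trade, which is why the report evaluates full and partial redundancy combinations separately.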

  7. Reliability and validity analysis of the transfer assessment instrument.

    PubMed

    McClure, Laura A; Boninger, Michael L; Ozawa, Haishin; Koontz, Alicia

    2011-03-01

    To describe the development and evaluate the reliability and validity of a newly created outcome measure, the Transfer Assessment Instrument (TAI), to assess the quality of transfers performed by full-time wheelchair users. Repeated measures. 2009 National Veterans Wheelchair Games in Spokane, WA. A convenience sample of full-time wheelchair users (N=40) who perform sitting pivot or standing pivot transfers. Not applicable. Intraclass correlation coefficients (ICCs) for reliability and Spearman correlation coefficients for concurrent validity between the TAI and a global assessment scale (0-100 visual analog scale [VAS]). No adverse events occurred during testing. Intrarater ICCs for 3 raters ranged between .35 and .89, and the interrater ICC was .642. Correlations between the TAI and the global assessment VAS ranged between .19 (P=.285) and .69 (P<.001). Item analyses of the tool found a wide range of results, from weak to good reliability. Evaluators found the TAI to be safe and able to be completed in a short time. The TAI is a safe, quick outcome measure that uses equipment typically found in a clinical setting and does not ask participants to perform new skills. Reliability and validity testing found the TAI to have acceptable interrater reliability and a wide range of intrarater reliability. Future work should include continued refinement, such as removal or modification of items found to have low reliability, improved education for clinicians, and further reliability and validity analysis with a more diverse subject population. The TAI has the potential to fill a void in the assessment of transfers. Copyright © 2011 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  8. Resting-state fMRI correlations: From link-wise unreliability to whole brain stability.

    PubMed

    Pannunzi, Mario; Hindriks, Rikkert; Bettinardi, Ruggero G; Wenger, Elisabeth; Lisofsky, Nina; Martensson, Johan; Butler, Oisin; Filevich, Elisa; Becker, Maxi; Lochstet, Martyna; Kühn, Simone; Deco, Gustavo

    2017-08-15

    The functional architecture of spontaneous BOLD fluctuations has been characterized in detail by numerous studies, demonstrating its potential relevance as a biomarker. However, the systematic investigation of its consistency is still in its infancy. Here, we analyze within- and between-subject variability and test-retest reliability of resting-state functional connectivity (FC) in a unique data set comprising multiple fMRI scans (42) from 5 subjects, and 50 single scans from 50 subjects. We adopt a statistical framework that enables us to identify different sources of variability in FC. We show that the low reliability of single links can be significantly improved by using multiple scans per subject. Moreover, in contrast to earlier studies, we show that spatial heterogeneity in FC reliability is not significant. Finally, we demonstrate that despite the low reliability of individual links, the information carried by the whole-brain FC matrix is robust and can be used as a functional fingerprint to identify individual subjects from the population. Copyright © 2017 Elsevier Inc. All rights reserved.
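    The gain from using multiple scans per subject follows classical test theory. The Spearman-Brown formula below is a standard illustration of the effect of averaging repeated measurements, not the paper's own statistical framework:

    ```python
    def spearman_brown(r1, k):
        """Reliability of the average of k parallel measurements,
        given single-measurement reliability r1 (Spearman-Brown)."""
        return k * r1 / (1.0 + (k - 1) * r1)

    # A weakly reliable single-scan FC link (r = 0.3) averaged over 8 scans
    print(spearman_brown(0.3, 1))  # unchanged for a single scan
    print(spearman_brown(0.3, 8))  # reliability of the 8-scan average
    ```

    An unreliable single link (r = 0.3) reaches roughly 0.77 when eight scans are averaged, which is the intuition behind the paper's finding that low link-wise reliability improves significantly with multiple scans per subject.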

  9. Using Facility Condition Assessments to Identify Actions Related to Infrastructure

    NASA Technical Reports Server (NTRS)

    Rubert, Kennedy F.

    2010-01-01

    To support cost effective, quality research it is essential that laboratory and testing facilities are maintained in a continuous and reliable state of availability at all times. NASA Langley Research Center (LaRC) and its maintenance contractor, Jacobs Technology, Inc. Research Operations, Maintenance, and Engineering (ROME) group, are in the process of implementing a combined Facility Condition Assessment (FCA) and Reliability Centered Maintenance (RCM) program to improve asset management and overall reliability of testing equipment in facilities such as wind tunnels. Specific areas are being identified for improvement, the deferred maintenance cost is being estimated, and priority is being assigned against facilities where conditions have been allowed to deteriorate. This assessment serves to assist in determining where to commit available funds on the Center. RCM methodologies are being reviewed and enhanced to assure that appropriate preventive, predictive, and facilities/equipment acceptance techniques are incorporated to prolong lifecycle availability and assure reliability at minimum cost. The results from the program have been favorable, better enabling LaRC to manage assets prudently.

  10. State recovery and lockstep execution restart in a system with multiprocessor pairing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gara, Alan; Gschwind, Michael K; Salapura, Valentina

    System, method and computer program product for a multiprocessing system to offer selective pairing of processor cores for increased processing reliability. A selective pairing facility is provided that selectively connects, i.e., pairs, multiple microprocessor or processor cores to provide one highly reliable thread (or thread group). Each pair of cores providing one highly reliable thread connects with system components such as a memory "nest" (or memory hierarchy), an optional system controller, an optional interrupt controller, and optional I/O or peripheral devices. The memory nest is attached to the selective pairing facility via a switch or a bus. Each selectively paired processor core includes a transactional execution facility, wherein the system is configured to enable processor rollback to a previous state and reinitialization of lockstep execution in order to recover when an incorrect execution has been detected by the selective pairing facility.
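    The detect-rollback-re-execute cycle can be illustrated with a toy model. This is a hypothetical sketch, not the patented design: two paired cores accumulate the same input stream, and any divergence triggers restoration of the last verified checkpoint followed by re-execution.

    ```python
    def run_lockstep(inputs, fault_step=None):
        """Toy paired-core model: both cores compute state += x in
        lockstep; a transient fault on core B at fault_step corrupts
        its result, the mismatch is detected by the pairing facility,
        the pair rolls back to the checkpoint and re-executes."""
        state_a = state_b = 0
        checkpoint = 0
        rollbacks = 0
        for i, x in enumerate(inputs):
            a = state_a + x
            b = state_b + x + (1 if i == fault_step else 0)
            if a != b:                        # divergence detected
                state_a = state_b = checkpoint  # restore verified state
                rollbacks += 1
                a = b = checkpoint + x          # re-execute (fault was transient)
            state_a, state_b = a, b
            checkpoint = state_a                # commit verified state
        return state_a, rollbacks

    print(run_lockstep([1, 2, 3, 4]))                 # (10, 0)
    print(run_lockstep([1, 2, 3, 4], fault_step=2))   # (10, 1)
    ```

    The faulty run reaches the same final state as the fault-free run at the cost of one rollback, which is the essence of recovering correct execution through checkpointed lockstep restart.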

  11. Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks

    PubMed Central

    Dâmaso, Antônio; Maciel, Paulo

    2017-01-01

    Power consumption is a primary concern in Wireless Sensor Networks (WSNs), and a large number of strategies have been proposed to evaluate it. However, those approaches usually consider neither reliability issues nor the power consumption of applications executing in the network. A central concern is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and the network stack while also considering their reliabilities. To solve this problem, we introduce a fully automatic solution for designing power-consumption-aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy to select WSN configurations, and a toolbox named EDEN to fully support the proposed methodology. This solution allows the power consumption of WSN applications and the network stack to be estimated accurately and in an automated way. PMID:29113078
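    The coupling between reliability and power consumption that the paper models formally can be shown with a much simpler sketch, using hypothetical numbers rather than the EDEN models: per-link delivery probabilities determine both the end-to-end reliability and the expected retransmission energy of a multi-hop path.

    ```python
    from math import prod

    def path_reliability(link_reliabilities):
        """End-to-end delivery probability over a multi-hop path
        without retransmissions: the product of per-link reliabilities."""
        return prod(link_reliabilities)

    def expected_energy(link_reliabilities, tx_cost=1.0):
        """Expected transmission energy with per-hop ACK/retransmit:
        each hop needs on average 1/r attempts (geometric distribution),
        so less reliable links cost proportionally more energy."""
        return sum(tx_cost / r for r in link_reliabilities)

    links = [0.9, 0.8, 0.95]  # hypothetical per-link delivery probabilities
    print(round(path_reliability(links), 3))
    print(round(expected_energy(links), 3))
    ```

    Improving a weak link raises delivery probability and lowers expected energy at once, which is why the paper argues that power consumption and reliability must be evaluated together rather than in isolation.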

  12. Examining the Measurement Precision and Invariance of the Revised Get Ready to Read!

    PubMed Central

    Farrington, Amber L.; Lonigan, Christopher J.

    2016-01-01

    Children's emergent literacy skills are highly predictive of later reading abilities. To determine which children have weaker emergent literacy skills and are in need of intervention, it is necessary to assess emergent literacy skills accurately and reliably. In this study, 1,351 children were administered the Revised Get Ready to Read! (GRTR-R), and an item response theory analysis was used to evaluate the item-level reliability of the measure. Differential item functioning (DIF) analyses were conducted to examine whether items function similarly between subpopulations of children. The GRTR-R had acceptable reliability for children whose ability level was just below the mean. DIF for a small number of items was present for only two comparisons—children who were older versus younger and children who were White versus African American. These results demonstrate that the GRTR-R has acceptable reliability and limited DIF, enabling the screener to identify those at risk for developing reading problems. PMID:23851136

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreno, Gilbert

    The objective of this project is to develop thermal management strategies to enable efficient, high-temperature, wide-bandgap (WBG)-based power electronic systems (e.g., emerging inverters and DC-DC converters). Device- and system-level thermal analyses are conducted to determine the thermal limitations of current automotive power modules under elevated device temperature conditions. Additionally, novel cooling concepts and material selections will be evaluated to enable high-temperature silicon and WBG devices in power electronics components. WBG devices (silicon carbide [SiC], gallium nitride [GaN]) promise to increase efficiency but will be driven as hard as possible, which creates challenges for thermal management and reliability.

  14. The behaviour change wheel: a new method for characterising and designing behaviour change interventions.

    PubMed

    Michie, Susan; van Stralen, Maartje M; West, Robert

    2011-04-23

    Improving the design and implementation of evidence-based practice depends on successful behaviour change interventions. This requires an appropriate method for characterising interventions and linking them to an analysis of the targeted behaviour. There exists a plethora of frameworks of behaviour change interventions, but it is not clear how well they serve this purpose. This paper evaluates these frameworks, and develops and evaluates a new framework aimed at overcoming their limitations. A systematic search of electronic databases and consultation with behaviour change experts were used to identify frameworks of behaviour change interventions. These were evaluated according to three criteria: comprehensiveness, coherence, and a clear link to an overarching model of behaviour. A new framework was developed to meet these criteria. The reliability with which it could be applied was examined in two domains of behaviour change: tobacco control and obesity. Nineteen frameworks were identified covering nine intervention functions and seven policy categories that could enable those interventions. None of the frameworks reviewed covered the full range of intervention functions or policies, and only a minority met the criteria of coherence or linkage to a model of behaviour. At the centre of a proposed new framework is a 'behaviour system' involving three essential conditions: capability, opportunity, and motivation (what we term the 'COM-B system'). This forms the hub of a 'behaviour change wheel' (BCW) around which are positioned the nine intervention functions aimed at addressing deficits in one or more of these conditions; around this are placed seven categories of policy that could enable those interventions to occur. The BCW was used reliably to characterise interventions within the English Department of Health's 2010 tobacco control strategy and the National Institute of Health and Clinical Excellence's guidance on reducing obesity. 
Interventions and policies to change behaviour can be usefully characterised by means of a BCW comprising: a 'behaviour system' at the hub, encircled by intervention functions and then by policy categories. Research is needed to establish how far the BCW can lead to more efficient design of effective interventions.

  15. The behaviour change wheel: A new method for characterising and designing behaviour change interventions

    PubMed Central

    2011-01-01

    Background Improving the design and implementation of evidence-based practice depends on successful behaviour change interventions. This requires an appropriate method for characterising interventions and linking them to an analysis of the targeted behaviour. There exists a plethora of frameworks of behaviour change interventions, but it is not clear how well they serve this purpose. This paper evaluates these frameworks, and develops and evaluates a new framework aimed at overcoming their limitations. Methods A systematic search of electronic databases and consultation with behaviour change experts were used to identify frameworks of behaviour change interventions. These were evaluated according to three criteria: comprehensiveness, coherence, and a clear link to an overarching model of behaviour. A new framework was developed to meet these criteria. The reliability with which it could be applied was examined in two domains of behaviour change: tobacco control and obesity. Results Nineteen frameworks were identified covering nine intervention functions and seven policy categories that could enable those interventions. None of the frameworks reviewed covered the full range of intervention functions or policies, and only a minority met the criteria of coherence or linkage to a model of behaviour. At the centre of a proposed new framework is a 'behaviour system' involving three essential conditions: capability, opportunity, and motivation (what we term the 'COM-B system'). This forms the hub of a 'behaviour change wheel' (BCW) around which are positioned the nine intervention functions aimed at addressing deficits in one or more of these conditions; around this are placed seven categories of policy that could enable those interventions to occur. The BCW was used reliably to characterise interventions within the English Department of Health's 2010 tobacco control strategy and the National Institute of Health and Clinical Excellence's guidance on reducing obesity. 
Conclusions Interventions and policies to change behaviour can be usefully characterised by means of a BCW comprising: a 'behaviour system' at the hub, encircled by intervention functions and then by policy categories. Research is needed to establish how far the BCW can lead to more efficient design of effective interventions. PMID:21513547

  16. Multiple-Color Optical Activation, Silencing, and Desynchronization of Neural Activity, with Single-Spike Temporal Resolution

    PubMed Central

    Han, Xue; Boyden, Edward S.

    2007-01-01

    The quest to determine how precise neural activity patterns mediate computation, behavior, and pathology would be greatly aided by a set of tools for reliably activating and inactivating genetically targeted neurons, in a temporally precise and rapidly reversible fashion. Having earlier adapted a light-activated cation channel, channelrhodopsin-2 (ChR2), for allowing neurons to be stimulated by blue light, we searched for a complementary tool that would enable optical neuronal inhibition, driven by light of a second color. Here we report that targeting the codon-optimized form of the light-driven chloride pump halorhodopsin from the archaebacterium Natronomonas pharaonis (hereafter abbreviated Halo) to genetically-specified neurons enables them to be silenced reliably, and reversibly, by millisecond-timescale pulses of yellow light. We show that trains of yellow and blue light pulses can drive high-fidelity sequences of hyperpolarizations and depolarizations in neurons simultaneously expressing yellow light-driven Halo and blue light-driven ChR2, allowing for the first time manipulations of neural synchrony without perturbation of other parameters such as spiking rates. The Halo/ChR2 system thus constitutes a powerful toolbox for multichannel photoinhibition and photostimulation of virally or transgenically targeted neural circuits without need for exogenous chemicals, enabling systematic analysis and engineering of the brain, and quantitative bioengineering of excitable cells. PMID:17375185

  17. Using Colaizzi's method of data analysis to explore the experiences of nurse academics teaching on satellite campuses.

    PubMed

    Wirihana, Lisa; Welch, Anthony; Williamson, Moira; Christensen, Martin; Bakon, Shannon; Craft, Judy

    2018-03-16

    Phenomenology is a useful methodological approach in qualitative nursing research. It enables researchers to put aside their perceptions of a phenomenon and give meaning to a participant's experiences. Exploring the experiences of others enables previously unavailable insights to be discovered. To delineate the implementation of Colaizzi's (1978) method of data analysis in descriptive phenomenological nursing research. The use of Colaizzi's method of data analysis enabled new knowledge to be revealed and provided insights into the experiences of nurse academics teaching on satellite campuses. Local adaptation of the nursing curriculum and additional unnoticed responsibilities had not been identified previously and warrant further research. Colaizzi's (1978) method of data analysis is rigorous and robust, and is therefore a qualitative method that ensures the credibility and reliability of its results. It allows researchers to reveal emergent themes and their interwoven relationships. Researchers using a descriptive phenomenological approach should consider using this method as a clear and logical process through which the fundamental structure of an experience can be explored. Colaizzi's phenomenological methodology can be used reliably to understand people's experiences. This may prove beneficial in the development of therapeutic policy and the provision of patient-centred care. ©2018 RCN Publishing Company Ltd. All rights reserved. Not to be copied, transmitted or recorded in any way, in whole or part, without prior permission of the publishers.

  18. A Briefing on Metrics and Risks for Autonomous Decision-Making in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Frost, Susan; Goebel, Kai Frank; Galvan, Jose Ramon

    2012-01-01

    Significant technology advances will enable future aerospace systems to safely and reliably make decisions autonomously, or without human interaction. The decision-making may result in actions that enable an aircraft or spacecraft in an off-nominal state or with slightly degraded components to achieve mission performance and safety goals while reducing or avoiding damage to the aircraft or spacecraft. Some key technology enablers for autonomous decision-making include: continuous state awareness through the maturation of the prognostics and health management field, novel sensor development, and the considerable gains made in computation power and data processing bandwidth relative to system size. Sophisticated algorithms and physics-based models coupled with these technological advances allow reliable assessment of a system, subsystem, or components. Decisions that balance mission objectives and constraints with remaining-useful-life predictions can be made autonomously to maintain safety requirements and optimal performance and to ensure mission objectives are met. This autonomous approach to decision-making will come with new risks and benefits, some of which are examined in this paper. To start, an account of previous work to categorize or quantify autonomy in aerospace systems is presented. In addition, a survey of perceived risks in autonomous decision-making, in the context of piloted aircraft and remotely piloted or completely autonomous unmanned aircraft systems (UAS), is presented based on interviews conducted with individuals from industry, academia, and government.

  19. Design and development of an active Gurney flap for rotorcraft

    NASA Astrophysics Data System (ADS)

    Freire Gómez, Jon; Booker, Julian D.; Mellor, Phil H.

    2013-03-01

    The EU's Green Rotorcraft programme will develop an Active Gurney Flap (AGF) for a full-scale helicopter main rotor blade as part of its 'smart adaptive rotor blade' technology demonstrators. AGFs can be utilized to provide a localized and variable lift enhancement on the rotor, enabling a redistribution of loading on the rotor blade around the rotor azimuth. Further advantages include the possibility of using AGFs to allow a rotor speed reduction, which subsequently provides acoustic benefits. Designed to be integrable into a commercial helicopter blade, and thereby capable of withstanding real in-flight centrifugal loading, blade vibrations and aerodynamic loads, the demonstrator is expected to achieve a high technology readiness level (TRL). The AGF will be validated initially by a constant blade section 2D wind tunnel test and latterly by full blade 3D whirl tower testing. This paper presents the methodology adopted for the AGF concept topology selection, based on a series of both qualitative and quantitative performance criteria. Two different AGF candidate mechanisms are compared, both powered by a small commercial electromagnetic actuator. In both topologies, the link between the actuator and the control surface consists of two rotating torque bars, pivoting on flexure bearings. This provides the required reliability and precision, while making the design virtually frictionless. The engineering analysis presented suggests that both candidates would perform satisfactorily in a 2D wind tunnel test, but that equally, both have design constraints which limit their potential to be further taken into a whirl tower test under full scale centrifugal and inertial loads.

  20. Parallel production and verification of protein products using a novel high-throughput screening method.

    PubMed

    Tegel, Hanna; Yderland, Louise; Boström, Tove; Eriksson, Cecilia; Ukkonen, Kaisa; Vasala, Antti; Neubauer, Peter; Ottosson, Jenny; Hober, Sophia

    2011-08-01

    Protein production and analysis in a parallel fashion is today applied in laboratories worldwide and there is a great need to improve the techniques and systems used for this purpose. In order to save time and money, a fast and reliable screening method for analysis of protein production and also verification of the protein product is desired. Here, a micro-scale protocol for the parallel production and screening of 96 proteins in plate format is described. Protein capture was achieved using immobilized metal affinity chromatography and the product was verified using matrix-assisted laser desorption ionization time-of-flight MS. In order to obtain sufficiently high cell densities and product yield in the small-volume cultivations, the EnBase® cultivation technology was applied, which enables cultivation in as small volumes as 150 μL. Here, the efficiency of the method is demonstrated by producing 96 human, recombinant proteins, both in micro-scale and using a standard full-scale protocol and comparing the results in regard to both protein identity and sample purity. The results obtained are highly comparable to those acquired through employing standard full-scale purification protocols, thus validating this method as a successful initial screening step before protein production at a larger scale. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Point Positioning Service for Natural Hazard Monitoring

    NASA Astrophysics Data System (ADS)

    Bar-Sever, Y. E.

    2014-12-01

    In an effort to improve natural hazard monitoring, JPL has invested in updating and enlarging its global real-time GNSS tracking network, and has launched a unique service - real-time precise positioning for natural hazard monitoring, entitled GREAT Alert (GNSS Real-Time Earthquake and Tsunami Alert). GREAT Alert leverages the full technological and operational capability of JPL's Global Differential GPS System [www.gdgps.net] to offer owners of real-time dual-frequency GNSS receivers: sub-5 cm (3D RMS) real-time absolute positioning in ITRF08, regardless of location; under-5-second turnaround time; full covariance information; and optional estimates of ancillary parameters (such as troposphere). This service enables GNSS network operators to instantly access the most accurate and reliable real-time positioning solutions for their sites, and also those of the hundreds of participating sites globally, assuring inter-consistency and uniformity across all solutions. Local authorities with limited technical and financial resources can now access the best technology and share environmental data to the benefit of the entire Pacific region. We will describe the specialized precise point positioning techniques employed by the GREAT Alert service, optimized for natural hazard monitoring and in particular earthquake monitoring. We address three fundamental aspects of these applications: (1) small and infrequent motion, (2) the availability of data at a central location, and (3) the need for refined solutions at several time scales.

  2. Neck motion kinematics: an inter-tester reliability study using an interactive neck VR assessment in asymptomatic individuals.

    PubMed

    Sarig Bahat, Hilla; Sprecher, Elliot; Sela, Itamar; Treleaven, Julia

    2016-07-01

    The use of virtual reality (VR) for assessment and intervention in neck pain has previously been explored and shown reliable for cervical range-of-motion measures. Neck VR enables analysis of task-oriented neck movement by stimulating responsive movements to external stimuli. The purpose of this study was therefore to establish the inter-tester reliability of neck kinematic measures so that the system can be used as a reliable assessment and treatment tool between clinicians. This reliability study included 46 asymptomatic participants, who were assessed using the neck VR system, which displayed an interactive VR scenario via a head-mounted device controlled by neck movements. The objective of the interactive assessment was to hit 16 targets, randomly appearing in four directions, as fast as possible. Each participant was tested twice by two different testers. Good reliability was found for neck motion kinematic measures in flexion, extension, and rotation (intraclass correlations 0.64-0.93). High reliability was shown for peak velocity globally (0.93), in left rotation (0.9), right rotation and extension (0.88), and flexion (0.86). Mean velocity had good global reliability (0.84), except for left-rotation-directed movement, which showed moderate reliability (0.68). Minimal detectable change for peak velocity ranged from 41 to 53 °/s, and for mean velocity from 20 to 25 °/s. The results suggest high reliability for peak and mean velocity as measured by the interactive Neck VR assessment of neck motion kinematics. VR appears to provide a reliable and more ecologically valid method of cervical motion evaluation than previous conventional methodologies.
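    Minimal detectable change figures like those reported are conventionally derived from the reliability coefficient via MDC95 = 1.96 · √2 · SEM, where SEM = SD · √(1 − ICC). The sketch below uses a hypothetical standard deviation, since the sample SDs are not reproduced here:

    ```python
    from math import sqrt

    def mdc95(sd, icc):
        """Minimal detectable change at 95% confidence:
        MDC95 = 1.96 * sqrt(2) * SEM, with SEM = sd * sqrt(1 - icc)."""
        sem = sd * sqrt(1.0 - icc)
        return 1.96 * sqrt(2.0) * sem

    # Hypothetical peak-velocity SD of 70 deg/s with the reported ICC of 0.93
    print(round(mdc95(70.0, 0.93), 1))
    ```

    With these assumed inputs the formula yields an MDC of roughly 51 °/s, inside the reported 41-53 °/s range; higher reliability shrinks the change a clinician can confidently call real.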

  3. Lifetime Reliability Prediction of Ceramic Structures Under Transient Thermomechanical Loads

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Jadaan, Osama J.; Gyekenyesi, John P.

    2005-01-01

    An analytical methodology is developed to predict the probability of survival (reliability) of ceramic components subjected to harsh thermomechanical loads that can vary with time (transient reliability analysis). This capability enables more accurate prediction of ceramic component integrity against fracture in situations such as turbine startup and shutdown, operational vibrations, atmospheric reentry, or other rapid heating or cooling situations (thermal shock). The transient reliability analysis methodology developed herein incorporates the following features: fast-fracture transient analysis (reliability analysis without slow crack growth, SCG); transient analysis with SCG (reliability analysis with time-dependent damage due to SCG); a computationally efficient algorithm to compute the reliability for components subjected to repeated transient loading (block loading); cyclic fatigue modeling using a combined SCG and Walker fatigue law; proof testing for transient loads; and Weibull and fatigue parameters that are allowed to vary with temperature or time. Component-to-component variation in strength (stochastic strength response) is accounted for with the Weibull distribution, and either the principle of independent action or the Batdorf theory is used to predict the effect of multiaxial stresses on reliability. The reliability analysis can be performed either as a function of the component surface (for surface-distributed flaws) or component volume (for volume-distributed flaws). The transient reliability analysis capability has been added to the NASA CARES/ Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. CARES/Life was also updated to interface with commercially available finite element analysis software, such as ANSYS, when used to model the effects of transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
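    At the core of such a fast-fracture analysis is the weakest-link two-parameter Weibull model. The sketch below is a minimal uniaxial, uniform-stress illustration with hypothetical parameters, not the CARES/Life implementation:

    ```python
    from math import exp

    def weibull_survival(sigma, sigma0, m, volume=1.0):
        """Weakest-link probability of survival for a uniformly
        stressed volume under the two-parameter Weibull model:
        P_s = exp(-V * (sigma / sigma0)**m)."""
        return exp(-volume * (sigma / sigma0) ** m)

    # Hypothetical ceramic: scale parameter 400 MPa, Weibull modulus m = 10
    print(weibull_survival(300.0, 400.0, 10.0))  # survival at 300 MPa
    print(weibull_survival(200.0, 400.0, 10.0))  # survival at 200 MPa
    ```

    The steep dependence on the Weibull modulus m captures the component-to-component strength scatter the abstract describes; transient analysis extends this by integrating the damage accumulated over a time-varying load history.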

  4. Psychometric qualities of a tetrad WAIS-III short form for use in individuals with mild to borderline intellectual disability.

    PubMed

    van Duijvenbode, Neomi; Didden, Robert; van den Hazel, Teunis; Engels, Rutger C M E

    2016-01-01

    To investigate the reliability and validity of a Wechsler Abbreviated Scale of Intelligence-based Wechsler Adult Intelligence Scale - third edition (WAIS-III) short form (SF) in a sample of individuals with mild to borderline intellectual disability (MBID) (N = 117; M(IQ) = 71.34; SD(IQ) = 8.00, range: 52-85). A full WAIS-III was administered as a standard procedure in the diagnostic process. The results indicate excellent reliability (r = 0.96) and a strong, positive correlation with the full WAIS-III (r = 0.89). The SF correctly identified ID in general, and the correct IQ category more specifically, in the majority of cases (97.4% and 86.3% of cases, respectively). In addition, 82.1% of the full-scale IQ (FSIQ) estimates fell within the 95% confidence interval of the original score. We conclude that the SF is a reliable and valid measure for estimating FSIQ. It can be used in clinical and research settings when global estimates of intelligence are sufficient.

  5. Empirical evaluation of the Process Overview Measure for assessing situation awareness in process plants.

    PubMed

    Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd

    2016-03-01

    The Process Overview Measure is a query-based measure developed to assess operator situation awareness (SA) in monitoring process plants. A companion paper describes how the measure was developed according to process plant properties and operator cognitive work. The Process Overview Measure demonstrated practicality, sensitivity, validity and reliability in two full-scope simulator experiments investigating dramatically different operational concepts. Practicality was assessed based on qualitative feedback from participants and researchers. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA in full-scope simulator settings based on data collected from process experts. Thus, full-scope simulator studies can employ the Process Overview Measure to reveal the impact of new control room technology and operational concepts on the monitoring of process plants. Practitioner Summary: The Process Overview Measure is a query-based measure that demonstrated practicality, sensitivity, validity and reliability for assessing operator situation awareness (SA) in monitoring process plants in representative settings.

  6. Adaptation of ATI-R Scale to Turkish Samples: Validity and Reliability Analyses

    ERIC Educational Resources Information Center

    Tezci, Erdogan

    2017-01-01

    Teachers' teaching approaches have become an important issue in the search of quality in education and teaching because of their effect on students' learning. Improvements in teachers' knowledge and awareness of their own teaching approaches enable them to adopt teaching process in accordance with their students' learning styles. The Approaches to…

  7. Atlas Centaur Rocket With Reusable Booster Engines

    NASA Technical Reports Server (NTRS)

    Martin, James A.

    1993-01-01

A proposed modification of the Atlas Centaur enables reuse of its booster engines. It includes replacement of the current booster engines with an engine of new design in which hydrogen is used both for cooling and for generation of power. Use of hydrogen in the new engine eliminates coking and clogging and improves performance significantly. Primary advantages: reduced cost, increased reliability, and increased payload.

  8. Benefits for Health; NASA

    NASA Technical Reports Server (NTRS)

    Perchonok, Michele

    2014-01-01

    The goal of HRP is to provide human health and performance countermeasures, knowledge, technologies, and tools to enable safe, reliable, and productive human space exploration. Presentation discusses (1) Bone Health: Vitamin D, Fish Consumption and Exercise (2) Medical Support in Remote Areas (3) ISS Ultrasound 4) Dry electrode EKG System (5) Environmental Factors and Psychological Health.

  9. Wildlife monitoring across multiple spatial scales using grid-based sampling

    Treesearch

    Kevin S. McKelvey; Samuel A. Cushman; Michael K. Schwartz; Leonard F. Ruggiero

    2009-01-01

    Recently, noninvasive genetic sampling has become the most effective way to reliably sample occurrence of many species. In addition, genetic data provide a rich data source enabling the monitoring of population status. The combination of genetically based animal data collected at known spatial coordinates with vegetation, topography, and other available covariates...

  10. Three-Dimensional Space to Assess Cloud Interoperability

    DTIC Science & Technology

    2013-03-01

1. Portability and Mobility ...collection of network-enabled services that guarantees to provide a scalable, easily accessible, reliable, and personalized computing infrastructure, based on...are used in research to describe cloud models, such as SaaS (Software as a Service), PaaS (Platform as a Service), and IaaS (Infrastructure as a Service)...

  11. Developments in Cylindrical Shell Stability Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Starnes, James H., Jr.

    1998-01-01

    Today high-performance computing systems and new analytical and numerical techniques enable engineers to explore the use of advanced materials for shell design. This paper reviews some of the historical developments of shell buckling analysis and design. The paper concludes by identifying key research directions for reliable and robust methods development in shell stability analysis and design.

  12. Use of mycelium and detached leaves in bioassays for assessing resistance to boxwood blight

    USDA-ARS?s Scientific Manuscript database

    Boxwood blight caused by Calonectria pseudonaviculata is a newly emergent disease of boxwood (Buxus L.) in the United States that causes leaf drop, stem lesions, and plant death. A rapid and reliable laboratory assay that enables screening hundreds of boxwood genotypes for resistance to boxwood blig...

  13. True-personality-assisted self-awareness expert system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laleuf, M.

Based on psychoanalytic theory, the Who am I expert system explains in simple terms the individual's true personality, even in its unconscious or hidden aspects. Our overt personality traits are deeply rooted. The Who am I expert system gives access to an individual's primary personality, starting from his habitual everyday-life behavior: (1) describes the individual's basic personality, (2) explains this personality through the individual's deeply rooted experience and motivation, and (3) makes links with other people with a similar profile. The following are the primary features of the system: easy individual access, results in <20 minutes, and guaranteed confidentiality. Business applications include the following: (1) Individual training: Self-awareness improves a person's ability to fit in and to succeed within the group. (2) Communication: a homogeneous team has a better chance of success. (3) Human reliability: A close-knit team remains reliable even when faced with serious difficulties. (4) Recruitment: This technique enables the selection of individuals who will fit an existing homogeneous team. The system also enables a psychological diagnosis to be confirmed.

  14. MPNACK: an optical switching scheme enabling the buffer-less reliable transmission

    NASA Astrophysics Data System (ADS)

    Yu, Xiaoshan; Gu, Huaxi; Wang, Kun; Xu, Meng; Guo, Yantao

    2016-01-01

Optical data center networks are becoming an increasingly promising solution to the bottlenecks faced by electrical networks, such as low transmission bandwidth, high wiring complexity, and unaffordable power consumption. However, the optical circuit switching (OCS) network is not flexible enough to carry bursty traffic, while the optical packet switching (OPS) network cannot resolve packet contention efficiently. To this end, an improved switching strategy named OPS with multi-hop Negative Acknowledgement (MPNACK) is proposed. This scheme uses a feedback mechanism, rather than a buffering structure, to handle optical packet contention. A collided packet is treated as a NACK packet and sent back to the source server. When the sender receives this NACK packet, it knows that a collision has occurred along the transmission path, and a retransmission procedure is triggered. Overall, the MPNACK scheme enables reliable transmission in a buffer-less optical network. Furthermore, with this scheme, the expensive and energy-hungry elements, optical or electrical buffers, can be removed from the optical interconnects, so a more scalable and cost-efficient network can be constructed for cloud computing data centers.
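The feedback loop summarized in this record can be caricatured in a few lines. Everything below (function name, retry cap, collision probability) is an illustrative assumption for exposition, not part of the MPNACK proposal itself:

```python
import random

def send_with_nack(packet, max_tries=5, p_collide=0.3, rng=None):
    """Toy model of NACK-based retransmission without buffers.

    On a simulated collision the packet is returned to the sender as a
    NACK, triggering a retransmission; no optical buffer is modeled.
    Returns the attempt number on success, or None if every try collided.
    """
    rng = rng or random.Random(0)  # deterministic for the sketch
    for attempt in range(1, max_tries + 1):
        if rng.random() >= p_collide:  # packet traverses without contention
            return attempt
        # else: NACK received at the sender; loop retransmits the packet
    return None
```

The point of the sketch is only that contention is resolved end-to-end by the sender, which is what lets the buffers be removed from the switches.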

  15. A Benefit Analysis of Infusing Wireless into Aircraft and Fleet Operations - Report to Seedling Project Efficient Reconfigurable Cockpit Design and Fleet Operations Using Software Intensive, Network Enabled, Wireless Architecture (ECON)

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia; Holmes, Bruce J.; Hahn, Andrew S.

    2016-01-01

We report on an examination of potential benefits of infusing wireless technologies into various areas of aircraft and airspace operations. The analysis is done in support of a NASA seedling project, Efficient Reconfigurable Cockpit Design and Fleet Operations Using Software Intensive, Network Enabled Wireless Architecture (ECON). The study has two objectives. First, we investigate one of the main benefit hypotheses of the ECON proposal: that the replacement of wired technologies with wireless would lead to significant weight reductions on an aircraft, among other benefits. Second, we advance a list of wireless technology applications and discuss their system benefits. With regard to the primary hypothesis, we conclude that the promise of weight reduction is premature. Specificity of the system domain and aircraft, criticality of components, reliability of wireless technologies, the weight of replacement or augmentation equipment, and the cost of infusion must all be taken into account, among other considerations, to produce a reliable estimate of weight savings or increase.

  16. Tracking Progress in Improving Diagnosis: A Framework for Defining Undesirable Diagnostic Events.

    PubMed

    Olson, Andrew P J; Graber, Mark L; Singh, Hardeep

    2018-01-29

    Diagnostic error is a prevalent, harmful, and costly phenomenon. Multiple national health care and governmental organizations have recently identified the need to improve diagnostic safety as a high priority. A major barrier, however, is the lack of standardized, reliable methods for measuring diagnostic safety. Given the absence of reliable and valid measures for diagnostic errors, we need methods to help establish some type of baseline diagnostic performance across health systems, as well as to enable researchers and health systems to determine the impact of interventions for improving the diagnostic process. Multiple approaches have been suggested but none widely adopted. We propose a new framework for identifying "undesirable diagnostic events" (UDEs) that health systems, professional organizations, and researchers could further define and develop to enable standardized measurement and reporting related to diagnostic safety. We propose an outline for UDEs that identifies both conditions prone to diagnostic error and the contexts of care in which these errors are likely to occur. Refinement and adoption of this framework across health systems can facilitate standardized measurement and reporting of diagnostic safety.

  17. Data distribution service-based interoperability framework for smart grid testbed infrastructure

    DOE PAGES

    Youssef, Tarek A.; Elsayed, Ahmed T.; Mohammed, Osama A.

    2016-03-02

This study presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurement and control network. The advantages of utilizing the data-centric over the message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamically participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure were developed in order to facilitate interoperability and remote access to the testbed. This interface allows experiments to be controlled, monitored, and performed remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).

  18. In-Vivo Human Skin to Textiles Friction Measurements

    NASA Astrophysics Data System (ADS)

    Pfarr, Lukas; Zagar, Bernhard

    2017-10-01

We report on a measurement system to determine highly reliable and accurate friction properties of textiles, as needed for example as input to garment simulation software. Our investigations led to a set-up that allows characterization of not just textile-to-textile but also textile-to-skin (in vivo) tribological properties, and thus yields fundamental knowledge about genuine wearer interaction with garments. The test method conveyed in this paper measures concurrently, and in a highly time-resolved manner, the normal force as well as the resulting shear force caused by a friction subject sliding out of the static friction regime and into the dynamic regime on a test bench. Deeper analysis of various influences is enabled by extending the simple model following Coulomb's law for rigid-body friction to include further essential parameters such as contact force, predominance in the yarn's orientation, and skin hydration. This easy-to-use system enables reliable and reproducible measurement of both static and dynamic friction for a variety of friction partners, including human skin with all its variability.
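The analysis in this record starts from Coulomb's basic ratio of shear to normal force. The helper below is a generic illustration, not the authors' procedure, of reading a static (breakaway) and a dynamic coefficient off synchronized force traces; all names and the averaging choice are assumptions:

```python
def friction_coefficients(normal, shear):
    """Estimate static and dynamic friction coefficients from
    synchronized normal- and shear-force samples (a sketch that
    assumes a single breakaway peak followed by steady sliding)."""
    # breakaway: the static coefficient is read at the shear-force peak
    peak = max(range(len(shear)), key=lambda i: shear[i])
    mu_static = shear[peak] / normal[peak]
    # dynamic: average shear/normal ratio over the sliding portion
    sliding = [s / n for s, n in zip(shear[peak + 1:], normal[peak + 1:]) if n > 0]
    mu_dynamic = sum(sliding) / len(sliding)
    return mu_static, mu_dynamic
```

A real rig would also need filtering and a criterion for where sliding begins, which is exactly the kind of detail the extended model in the record addresses.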

  19. Using subject-specific three-dimensional (3D) anthropometry data in digital human modelling: case study in hand motion simulation.

    PubMed

    Tsao, Liuxing; Ma, Liang

    2016-11-01

    Digital human modelling enables ergonomists and designers to consider ergonomic concerns and design alternatives in a timely and cost-efficient manner in the early stages of design. However, the reliability of the simulation could be limited due to the percentile-based approach used in constructing the digital human model. To enhance the accuracy of the size and shape of the models, we proposed a framework to generate digital human models using three-dimensional (3D) anthropometric data. The 3D scan data from specific subjects' hands were segmented based on the estimated centres of rotation. The segments were then driven in forward kinematics to perform several functional postures. The constructed hand models were then verified, thereby validating the feasibility of the framework. The proposed framework helps generate accurate subject-specific digital human models, which can be utilised to guide product design and workspace arrangement. Practitioner Summary: Subject-specific digital human models can be constructed under the proposed framework based on three-dimensional (3D) anthropometry. This approach enables more reliable digital human simulation to guide product design and workspace arrangement.

  20. On the importance of electron impact processes in excimer-pumped alkali laser-induced plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Markosyan, Aram H.

The excimer-pumped alkali laser (XPAL) system has recently been demonstrated in several different mixtures of alkali vapor and rare gas. Without special preventive measures, plasma formation during operation of XPAL is unavoidable. Recent advancements in the availability of reliable data for electron impact collisions with atoms and molecules have enabled development of a complete reaction mechanism to investigate XPAL-induced plasmas. Here, we report on pathways leading to plasma formation in an Ar/C2H6/Cs XPAL sustained at different cell temperatures. We find that, depending on the operating conditions, the contribution of electron impact processes can be as little as bringing the excitation of Cs(²P) states to higher-level Cs** states, and can be as high as bringing Cs(²P) excited states to full ionization. Increasing the input pumping power or cell temperature, or decreasing the C2H6 mole fraction, leads to electron impact processes dominating plasma formation over the energy pooling mechanisms previously reported in the literature.

  1. Slice-thickness evaluation in CT and MRI: an alternative computerised procedure.

    PubMed

    Acri, G; Tripepi, M G; Causa, F; Testagrossa, B; Novario, R; Vermiglio, G

    2012-04-01

The efficient use of computed tomography (CT) and magnetic resonance imaging (MRI) equipment necessitates establishing adequate quality-control (QC) procedures. In particular, assessing the accuracy of slice thickness (ST) requires scan exploration of phantoms containing test objects (plane, cone or spiral). To simplify such procedures, a novel phantom and a computerised LabVIEW-based procedure have been devised, enabling determination of full width at half maximum (FWHM) in real time. The phantom consists of a polymethyl methacrylate (PMMA) box, diagonally crossed by a PMMA septum dividing the box into two sections. The phantom images were acquired and processed using the LabVIEW-based procedure. The LabVIEW (LV) results were compared with those obtained by processing the same phantom images with commercial software, and the Fisher exact test (F test) was conducted on the resulting data sets to validate the proposed methodology. In all cases, there was no statistically significant variation between the two procedures; the LV procedure can therefore be proposed as a valuable alternative to other commonly used procedures and be reliably used on any CT or MRI scanner.
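Determining full width at half maximum from a measured profile, the quantity this record computes in real time, is straightforward to sketch. The function below is a generic illustration with assumed names, not the record's LabVIEW procedure, and it assumes a single-peaked profile with monotone flanks:

```python
def fwhm(profile, dx=1.0):
    """Full width at half maximum of a 1-D intensity profile.

    Linear interpolation locates the half-maximum crossings on
    either side of the peak; dx is the sample spacing.
    """
    y = [float(v) for v in profile]
    half = max(y) / 2.0
    above = [i for i, v in enumerate(y) if v >= half]  # above-half run
    left, right = float(above[0]), float(above[-1])
    i, j = above[0], above[-1]
    if i > 0:  # fractional crossing on the rising flank
        left = i - (y[i] - half) / (y[i] - y[i - 1])
    if j < len(y) - 1:  # fractional crossing on the falling flank
        right = j + (y[j] - half) / (y[j] - y[j + 1])
    return (right - left) * dx
```

For ST measurement, `profile` would be the intensity sampled across the imaged septum and `dx` the pixel size.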

  2. Physics-based multiscale coupling for full core nuclear reactor simulation

    DOE PAGES

    Gaston, Derek R.; Permann, Cody J.; Peterson, John W.; ...

    2015-10-01

Numerical simulation of nuclear reactors is a key technology in the quest for improvements in efficiency, safety, and reliability of both existing and future reactor designs. Historically, simulation of an entire reactor was accomplished by linking together multiple existing codes that each simulated a subset of the relevant multiphysics phenomena. Recent advances in the MOOSE (Multiphysics Object Oriented Simulation Environment) framework have enabled a new approach: multiple domain-specific applications, all built on the same software framework, are efficiently linked to create a cohesive application. This is accomplished with a flexible coupling capability that allows for a variety of different data exchanges to occur simultaneously on high-performance parallel computational hardware. Examples based on the KAIST-3A benchmark core, as well as a simplified Westinghouse AP-1000 configuration, demonstrate the power of this new framework for tackling, in a coupled, multiscale manner, crucial reactor phenomena such as CRUD-induced power shift and fuel shuffle. © 2014 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-SA license.

  3. Deployable System for Crash-Load Attenuation

    NASA Technical Reports Server (NTRS)

    Kellas, Sotiris; Jackson, Karen E.

    2007-01-01

    An externally deployable honeycomb structure is investigated with respect to crash energy management for light aircraft. The new concept utilizes an expandable honeycomb-like structure to absorb impact energy by crushing. Distinguished by flexible hinges between cell wall junctions that enable effortless deployment, the new energy absorber offers most of the desirable features of an external airbag system without the limitations of poor shear stability, system complexity, and timing sensitivity. Like conventional honeycomb, once expanded, the energy absorber is transformed into a crush efficient and stable cellular structure. Other advantages, afforded by the flexible hinge feature, include a variety of deployment options such as linear, radial, and/or hybrid deployment methods. Radial deployment is utilized when omnidirectional cushioning is required. Linear deployment offers better efficiency, which is preferred when the impact orientation is known in advance. Several energy absorbers utilizing different deployment modes could also be combined to optimize overall performance and/or improve system reliability as outlined in the paper. Results from a series of component and full scale demonstration tests are presented as well as typical deployment techniques and mechanisms. LS-DYNA analytical simulations of selected tests are also presented.

  4. On the importance of electron impact processes in excimer-pumped alkali laser-induced plasmas

    DOE PAGES

    Markosyan, Aram H.

    2017-10-18

The excimer-pumped alkali laser (XPAL) system has recently been demonstrated in several different mixtures of alkali vapor and rare gas. Without special preventive measures, plasma formation during operation of XPAL is unavoidable. Recent advancements in the availability of reliable data for electron impact collisions with atoms and molecules have enabled development of a complete reaction mechanism to investigate XPAL-induced plasmas. Here, we report on pathways leading to plasma formation in an Ar/C2H6/Cs XPAL sustained at different cell temperatures. We find that, depending on the operating conditions, the contribution of electron impact processes can be as little as bringing the excitation of Cs(²P) states to higher-level Cs** states, and can be as high as bringing Cs(²P) excited states to full ionization. Increasing the input pumping power or cell temperature, or decreasing the C2H6 mole fraction, leads to electron impact processes dominating plasma formation over the energy pooling mechanisms previously reported in the literature.

  5. FPGA-Based X-Ray Detection and Measurement for an X-Ray Polarimeter

    NASA Technical Reports Server (NTRS)

    Gregory, Kyle; Hill, Joanne; Black, Kevin; Baumgartner, Wayne

    2013-01-01

This technology enables detection and measurement of x-rays in an x-ray polarimeter using a field-programmable gate array (FPGA). The technology was developed for the Gravitational and Extreme Magnetism Small Explorer (GEMS) mission. It performs precision energy and timing measurements, as well as rejection of non-x-ray events. It enables the GEMS polarimeter to detect precisely when an event has taken place so that additional measurements can be made. The technology also enables this function to be performed in an FPGA using limited resources so that mass and power can be minimized while reliability for a space application is maximized and precise real-time operation is achieved. This design requires a low-noise, charge-sensitive preamplifier; a high-speed analog-to-digital converter (ADC); and an x-ray detector with a cathode terminal. It functions by computing a sum of differences for time samples whose difference exceeds a programmable threshold. A state machine advances through states as a programmable number of consecutive samples exceeds or fails to exceed this threshold. The pulse height is recorded as the accumulated sum. The track length is also measured, based on the time from the start to the end of accumulation. For track lengths longer than a certain length, the algorithm estimates the barycenter of the charge deposit by comparing the accumulator value at the midpoint to the final accumulator value. The design also employs a number of techniques for rejecting background events. This innovation enables the function to be performed in space, where it can operate autonomously with a rapid response time. This implementation combines advantages of computing-system-based approaches with those of pure analog approaches. The result is an implementation that is highly reliable, performs in real time, rejects background events, and consumes minimal power.
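The thresholded sum-of-differences scheme the record describes can be sketched in software. The following is a hypothetical illustration of the idea only, not the GEMS flight logic; the function name, the end-of-pulse count, and the trailing-sample cleanup are all assumptions:

```python
def detect_pulse(samples, threshold, n_end=3):
    """Toy sum-of-differences pulse detector.

    Starts accumulating when a sample-to-sample difference exceeds
    `threshold`; declares the pulse over after `n_end` consecutive
    differences fail it. Pulse height is the accumulated sum, track
    length the start-to-end duration of accumulation.
    """
    acc = 0.0
    history = []  # accumulator value at each sample while the pulse is open
    start = None
    quiet = 0
    for i in range(1, len(samples)):
        diff = samples[i] - samples[i - 1]
        if diff > threshold:
            if start is None:
                start = i           # pulse begins at first over-threshold diff
            acc += diff             # pulse height = accumulated sum of diffs
            history.append(acc)
            quiet = 0
        elif start is not None:
            quiet += 1
            history.append(acc)
            if quiet >= n_end:      # n_end consecutive quiet samples end it
                history = history[:-n_end]  # drop the trailing quiet samples
                break
    if start is None or not history:
        return None
    midpoint = history[len(history) // 2]
    return {
        "height": acc,
        "track_len": len(history),          # start-to-end of accumulation
        # crude barycenter: did the first half collect most of the charge?
        "early_heavy": midpoint > acc / 2.0,
    }
```

In the FPGA this is a fixed-point state machine fed by the ADC; the Python version only mirrors the control flow.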

  6. Conducting Real-Time Videofluoroscopic Swallow Study via Telepractice: A Preliminary Feasibility and Reliability Study.

    PubMed

    Burns, Clare L; Ward, Elizabeth C; Hill, Anne J; Phillips, Nick; Porter, Linda

    2016-06-01

A small number of studies have examined the feasibility of conducting videofluoroscopic swallow studies (VFSS) via telepractice. While the results have confirmed this potential, the systems tested to date have either reported issues that impacted the ability to analyze/interpret the VFSS recordings in real time, or were not designed to enable real-time interpretation. Further system design is needed to establish a telepractice model that enables the VFSS assessment to be both guided and interpreted live in real time. The aim of this study was to test the feasibility and reliability of using a telepractice system to enable live VFSS assessment. Twenty adult patients underwent a VFSS assessment directed by a telepractice speech-language pathologist (SLP) with competency in VFSS, located in another room of the hospital. The telepractice clinician led the sessions using a C20 Cisco TelePresence System. This was linked in real time via a secure telehealth network (at 4 Mbit/s) to a C60 Cisco TelePresence System located in a fluoroscopy suite, connected to the digital fluoroscopy system. Levels of agreement were calculated between the telepractice clinician and a face-to-face clinician who simultaneously rated the VFSS in real time. High levels of agreement for swallowing parameters (range = 75-100 %; k = -0.34 to 1.0) and management decisions (range = 70-100 %, k = 0.64-1.0) were found. A post-session questionnaire revealed that clinicians agreed the telepractice system enabled successful remote assessment of VFSS. The findings support the potential to conduct live VFSS assessment via a telepractice model.

  7. A novel paraplegia model in awake behaving macaques.

    PubMed

    Krucoff, Max O; Zhuang, Katie; MacLeod, David; Yin, Allen; Byun, Yoon Woo; Manson, Roberto Jose; Turner, Dennis A; Oliveira, Laura; Lebedev, Mikhail A

    2017-09-01

    Lower limb paralysis from spinal cord injury (SCI) or neurological disease carries a poor prognosis for recovery and remains a large societal burden. Neurophysiological and neuroprosthetic research have the potential to improve quality of life for these patients; however, the lack of an ethical and sustainable nonhuman primate model for paraplegia hinders their advancement. Therefore, our multidisciplinary team developed a way to induce temporary paralysis in awake behaving macaques by creating a fully implantable lumbar epidural catheter-subcutaneous port system that enables easy and reliable targeted drug delivery for sensorimotor blockade. During treadmill walking, aliquots of 1.5% lidocaine with 1:200,000 epinephrine were percutaneously injected into the ports of three rhesus macaques while surface electromyography (EMG) recorded muscle activity from their quadriceps and gastrocnemii. Diminution of EMG amplitude, loss of voluntary leg movement, and inability to bear weight were achieved for 60-90 min in each animal, followed by a complete recovery of function. The monkeys remained alert and cooperative during the paralysis trials and continued to take food rewards, and the ports remained functional after several months. This technique will enable recording from the cortex and/or spinal cord in awake behaving nonhuman primates during the onset, maintenance, and resolution of paraplegia for the first time, thus opening the door to answering basic neurophysiological questions about the acute neurological response to spinal cord injury and recovery. It will also negate the need to permanently injure otherwise high-value research animals for certain experimental paradigms aimed at developing and testing neural interface decoding algorithms for patients with lower extremity dysfunction. 
NEW & NOTEWORTHY A novel implantable lumbar epidural catheter-subcutaneous port system enables targeted drug delivery and induction of temporary paraplegia in awake, behaving nonhuman primates. Three macaques displayed loss of voluntary leg movement for 60-90 min after injection of lidocaine with epinephrine, followed by a full recovery. This technique for the first time will enable ethical live recording from the proximal central nervous system during the acute onset, maintenance, and resolution of paraplegia. Copyright © 2017 the American Physiological Society.

  8. Wireless monitoring of reconstructed 12-lead ECG in atrial fibrillation patients enables differential diagnosis of recurrent arrhythmias.

    PubMed

    Vukajlovic, Dejan; Gussak, Ihor; George, Samuel; Simic, Goran; Bojovic, Bosko; Hadzievski, Ljupco; Stojanovic, Bojan; Angelkov, Lazar; Panescu, Dorin

    2011-01-01

    Differential diagnosis of symptomatic events in post-ablation atrial fibrillation (AF) patients (pts) is important; in particular, accurate, reliable detection of AF or atrial flutter (AFL) is essential. However, existing remote monitoring devices usually require attached leads and are not suitable for prolonged monitoring; moreover, most do not provide sufficient information to assess atrial activity, since they generally monitor only 1-3 ECG leads and rely on RR interval variability for AF diagnosis. A new hand-held, wireless, symptom-activated event monitor (CardioBip; CB) does not require attached leads and hence can be conveniently used for extended periods. Moreover, CB provides data that enables remote reconstruction of full 12-lead ECG data including atrial signal information. We hypothesized that these CB features would enable accurate remote differential diagnosis of symptomatic arrhythmias in post-ablation AF pts. 21 pts who underwent catheter ablation for AF were instructed to make a CB transmission (TX) whenever palpitations, lightheadedness, or similar symptoms occurred, and at multiple times daily when asymptomatic, during a 60 day post-ablation time period. CB transmissions (TXs) were analyzed blindly by 2 expert readers, with differences adjudicated by consensus. 7 pts had no symptomatic episodes during the monitoring period. 14 of 21 pts had symptomatic events and made a total of 1699 TX, 164 of which were during symptoms. TX quality was acceptable for rhythm diagnosis and atrial activity in 96%. 118 TX from 10 symptomatic pts showed AF (96 TX from 10 pts) or AFL (22 TX from 3 pts), and 46 TX from 9 pts showed frequent PACs or PVCs. No other arrhythmias were detected. Five pts made symptomatic TX during AF/AFL and also during PACs/PVCs. Use of CB during symptomatic episodes enabled detection and differential diagnosis of symptomatic arrhythmias. 
The ability of CB to provide accurate reconstruction of 12 L ECGs including atrial activity, combined with its ease of use, makes it suitable for long-term surveillance for recurrent AF in post-ablation patients.

  9. Psychometric properties of the Swedish PedsQL, Pediatric Quality of Life Inventory 4.0 generic core scales.

    PubMed

    Petersen, Solveig; Hägglöf, Bruno; Stenlund, Hans; Bergström, Erik

    2009-09-01

To study the psychometric performance of the Swedish version of the Pediatric Quality of Life Inventory (PedsQL) 4.0 generic core scales in a general child population in Sweden, PedsQL forms were distributed to 2403 schoolchildren and 888 parents in two different school settings. Reliability and validity were studied for self-reports and proxy reports, full forms and short forms. Confirmatory factor analysis tested the factor structure, and multigroup confirmatory factor analysis tested measurement invariance between boys and girls. Test-retest reliability was demonstrated for all scales, and internal consistency reliability was shown with alpha values exceeding 0.70 for all scales but one (self-report short form: social functioning). Child-parent agreement was low to moderate. The four-factor structure of the PedsQL and factorial invariance across sex subgroups were confirmed for the self-report forms and for the proxy short form, while model fit indices suggested improvement of several proxy full-form scales. The Swedish PedsQL 4.0 generic core scales are a reliable and valid tool for health-related quality of life (HRQoL) assessment in Swedish child populations. The proxy full form, however, should be used with caution. The study also supports continued use of the PedsQL as a four-factor model, capable of revealing meaningful HRQoL differences between boys and girls.

  10. Standard Isotherm Fit Information for Dry CO2 on Sorbents for 4-Bed Molecular Sieve

    NASA Technical Reports Server (NTRS)

    Cmarik, G. E.; Son, K. N.; Knox, J. C.

    2017-01-01

    Onboard the ISS, one of the systems tasked with removal of metabolic carbon dioxide (CO2) is a 4-bed molecular sieve (4BMS) system. In order to enable a 4-person mission to succeed, systems for removal of metabolic CO2 must reliably operate for several years while minimizing power, mass, and volume requirements. This minimization can be achieved through system redesign and/or changes to the separation material(s). A material screening process has identified the most reliable sorbent materials for the next 4BMS. Sorbent characterization will provide the information necessary to guide system design by providing inputs for computer simulations.

  11. Adaptive vehicle motion estimation and prediction

    NASA Astrophysics Data System (ADS)

    Zhao, Liang; Thorpe, Chuck E.

    1999-01-01

    Accurate motion estimation and reliable maneuver prediction enable an automated car to react quickly and correctly to the rapid maneuvers of the other vehicles, and so allow safe and efficient navigation. In this paper, we present a car tracking system which provides motion estimation, maneuver prediction and detection of the tracked car. The three strategies employed - adaptive motion modeling, adaptive data sampling, and adaptive model switching probabilities - result in an adaptive interacting multiple model algorithm (AIMM). The experimental results on simulated and real data demonstrate that our tracking system is reliable, flexible, and robust. The adaptive tracking makes the system intelligent and useful in various autonomous driving tasks.
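    The model-switching step at the heart of an interacting multiple model tracker can be illustrated with the standard model-probability update (predict with the Markov switching matrix, then correct with each model's measurement likelihood); the transition matrix and likelihood values below are illustrative placeholders, not the paper's AIMM parameters:

```python
def update_model_probs(mu, trans, likelihoods):
    """One IMM model-probability update: propagate mode probabilities through
    the Markov switching matrix, then reweight by measurement likelihoods."""
    n = len(mu)
    predicted = [sum(trans[i][j] * mu[i] for i in range(n)) for j in range(n)]
    unnorm = [likelihoods[j] * predicted[j] for j in range(n)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# Two hypothetical models: constant-velocity vs. maneuvering
mu = [0.9, 0.1]                         # current mode probabilities
trans = [[0.95, 0.05], [0.10, 0.90]]    # switching probabilities (AIMM adapts these)
probs = update_model_probs(mu, trans, likelihoods=[0.2, 1.5])
```

    A high likelihood for the maneuvering model quickly shifts probability toward it, which is what lets the tracker react to rapid maneuvers.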

  12. Solid-Body Fuse Developed for High- Voltage Space Power Missions

    NASA Technical Reports Server (NTRS)

    Dolce, James L.; Baez, Anastacio N.

    2001-01-01

    AEM Incorporated has completed the development, under a NASA Glenn Research Center contract, of a solid-body fuse for high-voltage power systems of satellites and spacecraft systems. High-reliability fuses presently defined by MIL-PRF-23419 do not meet the increased voltage and amperage requirements for the next generation of spacecraft. Solid-body fuses exhibit electrical and mechanical attributes that enable these fuses to perform reliably in the vacuum and high-vibration and -shock environments typically present in spacecraft applications. The construction and screening techniques for solid-body fuses described by MIL-PRF-23419/12 offer an excellent roadmap for the development of high-voltage solid-body fuses.

  13. Final Technical Report for Automated Manufacturing of Innovative CPV/PV Modules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Okawa, David

    Cogenra’s Dense Cell Interconnect system was designed to use traditional front-contact cells and string them together into high efficiency and high reliability “supercells”. This novel stringer allows one to take advantage of the ~100 GW/year of existing cell production capacity and create a solar product for the customer that will produce more power and last longer than traditional PV products. The goal for this program was for Cogenra Solar to design and develop a first-of-kind automated solar manufacturing line that produces strings of overlapping cells or “supercells” based on Cogenra’s Dense Cell Interconnect (DCI) technology for their Low Concentration Photovoltaic (LCPV) systems. This will enable the commercialization of DCI technology to improve the efficiency, reliability and economics for their Low Concentration Photovoltaic systems. In this program, Cogenra Solar very successfully designed, developed, built, installed, and started up the ground-breaking manufacturing tools required to assemble supercells. Cogenra then successfully demonstrated operation of the integrated line at high yield and throughput far exceeding expectations. The development of a supercell production line represents a critical step toward a high volume and low cost Low Concentration Photovoltaic Module with Dense Cell Interconnect technology and has enabled the evaluation of the technology for reliability and yield. Unfortunately, performance and cost headwinds on Low Concentration Photovoltaics systems including lack of diffuse capture (10-15% hit) and more expensive tracker requirements resulted in a move away from LCPV technology.
Fortunately, the versatility of Dense Cell Interconnect technology allows for application to flat plate module technology as well, and Cogenra has worked with the DOE to utilize the learning from this grant to commercialize DCI technology for the solar market through the on-going grant: Catalyzing PV Manufacturing in the US With Cogenra Solar’s Next-Generation Dense Cell Interconnect PV Module Manufacturing Technology. This program is now very successfully building off of this work and commercializing the technology to enable increased solar adoption.

  14. Test-retest reliability and agreement of the SPI-Questionnaire to detect symptoms of digital ischemia in elite volleyball players.

    PubMed

    van de Pol, Daan; Zacharian, Tigran; Maas, Mario; Kuijer, P Paul F M

    2017-06-01

    The Shoulder posterior circumflex humeral artery Pathology and digital Ischemia - questionnaire (SPI-Q) has been developed to enable periodic surveillance of elite volleyball players, who are at risk for digital ischemia. Prior to implementation, assessing reliability is mandatory. Therefore, the test-retest reliability and agreement of the SPI-Q were evaluated among the population at risk. A questionnaire survey was performed with a 2-week interval among 65 elite male volleyball players assessing symptoms of cold, pale and blue digits in the dominant hand during or after practice or competition using a 4-point Likert scale (never, sometimes, often and always). Kappa (κ) and percentage of agreement (POA) were calculated for individual symptoms, and to distinguish symptomatic and asymptomatic players. For the individual symptoms, κ ranged from "poor" (0.25) to "good" (0.63), and POA ranged from "moderate" (78%) to "good" (97%). To classify symptomatic players, the SPI-Q showed "good" reliability (κ = 0.83; 95%CI 0.69-0.97) and "good" agreement (POA = 92%). The current study has proven the SPI-Q to be reliable for detecting elite male indoor volleyball players with symptoms of digital ischemia.
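    The two agreement statistics reported above, Cohen's kappa and percentage of agreement, can be sketched as follows; the paired test/retest classifications are hypothetical, not the study data:

```python
def kappa_and_poa(a, b):
    """Cohen's kappa and percentage of agreement for two paired ratings."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n                    # observed agreement
    cats = set(a) | set(b)
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)   # chance agreement
    return (po - pe) / (1 - pe), 100 * po

# Hypothetical test/retest classifications (1 = symptomatic, 0 = asymptomatic)
test1 = [1, 1, 0, 0, 0, 1, 0, 0, 1, 0]
test2 = [1, 1, 0, 0, 0, 1, 0, 1, 1, 0]
k, poa = kappa_and_poa(test1, test2)   # kappa = 0.8, POA = 90%
```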

  15. Update on MTTF figures for linear and rotary coolers of Thales Cryogenics

    NASA Astrophysics Data System (ADS)

    van de Groep, W.; van der Weijden, H.; van Leeuwen, R.; Benschop, T.; Cauquil, J. M.; Griot, R.

    2012-06-01

    Thales Cryogenics has an extensive background in delivering linear and rotary coolers for military, civil and space programs. In recent years, several technical improvements have increased the lifetime of all Thales coolers, resulting in significantly higher Mean Time To Failure (MTTF) figures. In this paper, we present updated MTTF values for most of the products in our portfolio and explain the methodology used to arrive at these reliability figures. The differences between rotary and linear coolers will be highlighted, including the different failure modes influencing the lifetime under operational conditions. These updated reliability figures are based on extensive test results for both rotary and linear coolers as well as Weibull analysis, failure mode identifications, various types of lifetime testing and field results of operational coolers. The impact of the cooler selection for typical applications will be outlined. This updated reliability approach will enable an improved tradeoff for cooler selection in applications where MTTF and a correct reliability assessment are key. Improved cooler selection and increased insight into cooler reliability will result in higher uptime and operability of equipment, less risk of unexpected failures and lower cost of ownership.
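    For a two-parameter Weibull life model of the kind used in such analyses, the MTTF follows in closed form as the scale parameter times a gamma-function factor; the shape and scale values below are illustrative, not Thales figures:

```python
from math import gamma

def weibull_mttf(eta_hours, beta):
    """MTTF of a two-parameter Weibull life distribution:
    MTTF = eta * Gamma(1 + 1/beta), where eta is the characteristic life
    (scale) and beta the shape parameter."""
    return eta_hours * gamma(1 + 1 / beta)

# Illustrative parameters: beta > 1 indicates wear-out-dominated failures
mttf = weibull_mttf(eta_hours=30000, beta=2.0)   # ~26,600 hours
```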

  16. Innovations in mission architectures for exploration beyond low Earth orbit

    NASA Technical Reports Server (NTRS)

    Cooke, D. R.; Joosten, B. J.; Lo, M. W.; Ford, K. M.; Hansen, R. J.

    2003-01-01

    Through the application of advanced technologies and mission concepts, architectures for missions beyond Earth orbit have been dramatically simplified. These concepts enable a stepping stone approach to science-driven, technology-enabled human and robotic exploration. Numbers and masses of vehicles required are greatly reduced, yet the pursuit of a broader range of science objectives is enabled. The scope of human missions considered ranges from the assembly and maintenance of large aperture telescopes for emplacement at the Sun-Earth libration point L2, to human missions to asteroids, the moon and Mars. The vehicle designs are developed for proof of concept, to validate mission approaches and understand the value of new technologies. The stepping stone approach employs an incremental buildup of capabilities, which allows for future decision points on exploration objectives. It enables testing of technologies to achieve greater reliability and understanding of costs for the next steps in exploration. ©2003 American Institute of Aeronautics and Astronautics. Published by Elsevier Science Ltd. All rights reserved.

  17. Reliability models: the influence of model specification in generation expansion planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stremel, J.P.

    1982-10-01

    This paper is a critical evaluation of reliability methods used for generation expansion planning. It is shown that the methods for treating uncertainty are critical for determining the relative reliability value of expansion alternatives. It is also shown that the specification of the reliability model will not favor all expansion options equally. Consequently, the model is biased. In addition, reliability models should be augmented with an economic value of reliability (such as the cost of emergency procedures or energy not served). Generation expansion evaluations which ignore the economic value of excess reliability can be shown to be inconsistent. The conclusions are that, in general, a reliability model simplifies generation expansion planning evaluations. However, for a thorough analysis, the expansion options should be reviewed for candidates which may be unduly rejected because of the bias of the reliability model. And this implies that for a consistent formulation in an optimization framework, the reliability model should be replaced with a full economic optimization which includes the costs of emergency procedures and interruptions in the objective function.

  18. Screening for depression in clinical practice: reliability and validity of a five-item subset of the CES-Depression.

    PubMed

    Bohannon, Richard W; Maljanian, Rose; Goethe, John

    2003-12-01

    Individuals with chronic disease are not screened routinely for depression. Availability of an abbreviated test with demonstrated reliability and validity might encourage screening, so we explored the reliability and validity of a 5-item subset of the 20-item Center for Epidemiological Studies Depression Scale among inner-city outpatients with chronic asthma or diabetes. Most patients were female (73.1%) and Hispanic (61.8%). Acceptable reliability was shown by Cronbach alpha (.76) for the subset of 5 items. Validity was supported by the high correlation of .91 between patients' scores on the 5-item subset and the full 20 items. The 5 items reflected a single factor (eigenvalue = 2.66). Receiver operating characteristic curve analysis identified cut-points for the 5 items that were sensitive (> .84) and specific (≥ .80) in identifying patients classified as depressed by the full 20 items. The reduced patient and clinician burden of the subset of 5 items, as well as its desirable psychometric properties, support broader application of this subset as a screening tool for depression.

  19. Bumper and grille airbags concept for enhanced vehicle compatibility in side impact: phase II.

    PubMed

    Barbat, Saeed; Li, Xiaowei; Prasad, Priya

    2013-01-01

    Fundamental physics and numerous field studies have shown a higher injury and fatality risk for occupants in smaller and lighter vehicles when struck by heavier, taller and higher vehicles. The consensus is that the significant parameters influencing compatibility in front-to-side crashes are geometric interaction, vehicle stiffness, and vehicle mass. The objective of this research is to develop a concept of deployable bumper and grille airbags for improved vehicle compatibility in side impact. The external airbags, deployed upon signals from sensors, may help mitigate the effect of weight, geometry and stiffness differences and reduce side intrusions. However, a highly reliable pre-crash sensing system is required to enable the reliable deployment, which is currently not technologically feasible. Analytical and numerical methods and hardware testing were used to help develop the deployable external airbags concept. Various Finite Element (FE) models at different stages were developed and an extensive number of iterations were conducted to help optimize airbag and inflator parameters to achieve desired targets. The concept development was executed and validated in two phases. This paper covers Phase II ONLY, which includes: (1) Re-design of the airbag geometry, pressure, and deployment strategies; (2) Further validation using a Via sled test of a 48 kph perpendicular side impact of an SUV-type impactor against a stationary car with US-SID-H3 crash dummy in the struck side; (3) Design of the reaction surface necessary for the bumper airbag functionality. The concept was demonstrated through live deployment of external airbags with a reaction surface in a full-scale perpendicular side impact of an SUV against a stationary passenger car at 48 kph. This research investigated only the concept of the inflatable devices since pre-crash sensing development was beyond the scope of this research. 
The concept design parameters of the bumper and grille airbags are presented in this paper. Full vehicle-to-vehicle crash test results, Via sled test, and simulation results are also presented. Head peak acceleration, Head Injury Criteria (HIC), Thoracic Trauma Index (TTI), and Pelvic acceleration for the SID-H3 dummy and structural intrusion profiles were used as performance metrics for the bumper and grille airbags. Results obtained from the Via sled tests and the full vehicle-to-vehicle tests with bumper and grille airbags were compared to those of baseline test results with no external airbags.

  20. Calculating system reliability with SRFYDO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morzinski, Jerome; Anderson - Cook, Christine M; Klamann, Richard M

    2010-01-01

    SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.
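    For a series system of the kind SRFYDO targets, the system works only if every component works, so system reliability is the product of component reliabilities; a minimal sketch with hypothetical component estimates (not SRFYDO output):

```python
def series_reliability(component_rels):
    """Reliability of a series system: the product of the reliabilities of
    its (assumed independent) components."""
    r = 1.0
    for rc in component_rels:
        r *= rc
    return r

# Three hypothetical component reliability estimates
r_sys = series_reliability([0.99, 0.97, 0.995])   # ~0.955
```

    Note how the weakest component dominates: the system can never be more reliable than its least reliable part.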

  1. Reliable gene expression analysis by reverse transcription-quantitative PCR: reporting and minimizing the uncertainty in data accuracy.

    PubMed

    Remans, Tony; Keunen, Els; Bex, Geert Jan; Smeets, Karen; Vangronsveld, Jaco; Cuypers, Ann

    2014-10-01

    Reverse transcription-quantitative PCR (RT-qPCR) has been widely adopted to measure differences in mRNA levels; however, biological and technical variation strongly affects the accuracy of the reported differences. RT-qPCR specialists have warned that, unless researchers minimize this variability, they may report inaccurate differences and draw incorrect biological conclusions. The Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines describe procedures for conducting and reporting RT-qPCR experiments. The MIQE guidelines enable others to judge the reliability of reported results; however, a recent literature survey found low adherence to these guidelines. Additionally, even experiments that use appropriate procedures remain subject to individual variation that statistical methods cannot correct. For example, since ideal reference genes do not exist, the widely used method of normalizing RT-qPCR data to reference genes generates background noise that affects the accuracy of measured changes in mRNA levels. However, current RT-qPCR data reporting styles ignore this source of variation. In this commentary, we direct researchers to appropriate procedures, outline a method to present the remaining uncertainty in data accuracy, and propose an intuitive way to select reference genes to minimize uncertainty. Reporting the uncertainty in data accuracy also serves for quality assessment, enabling researchers and peer reviewers to confidently evaluate the reliability of gene expression data. © 2014 American Society of Plant Biologists. All rights reserved.
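    The reference-gene normalization discussed above is commonly carried out with the 2^-ΔΔCt method; a sketch with illustrative Ct values, assuming ~100% amplification efficiency (the idealization whose violation contributes to the noise the commentary describes):

```python
def relative_expression(ct_target_treated, ct_ref_treated,
                        ct_target_control, ct_ref_control):
    """Fold change by the 2^-ddCt method: normalize the target gene to a
    reference gene within each condition, then compare conditions.
    Ct values and the efficiency assumption are illustrative."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    return 2 ** -(d_ct_treated - d_ct_control)

fold = relative_expression(22.0, 18.0, 25.0, 18.5)   # ~5.7-fold up-regulation
```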

  2. DOE-NE Proliferation and Terrorism Risk Assessment: FY12 Plans Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadasivan, Pratap

    2012-06-21

    This presentation provides background information on FY12 plans for the DOE Office of Nuclear Energy Proliferation and Terrorism Risk Assessment program. Program plans, organization, and individual project elements are described. Research objectives are: (1) Develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the life of current reactors; (2) Develop improvements in the affordability of new reactors to enable nuclear energy; (3) Develop Sustainable Nuclear Fuel Cycles; and (4) Understand and minimize the risks of nuclear proliferation and terrorism, with the goal of enabling the use of risk information to inform NE R&D program planning.

  3. PHM Enabled Autonomous Propellant Loading Operations

    NASA Technical Reports Server (NTRS)

    Walker, Mark; Figueroa, Fernando

    2017-01-01

    The utility of Prognostics and Health Management (PHM) software capability applied to Autonomous Operations (AO) remains an active research area within aerospace applications. The ability to gain insight into which assets and subsystems are functioning properly, along with the derivation of confident predictions concerning future ability, reliability, and availability, are important enablers for making sound mission planning decisions. When coupled with software that fully supports mission planning and execution, an integrated solution can be developed that leverages state assessment and estimation for the purposes of delivering autonomous operations. The authors have been applying this integrated, model-based approach to the autonomous loading of cryogenic spacecraft propellants at Kennedy Space Center.

  4. Enabling aspects of fiber optic acoustic sensing in harsh environments

    NASA Astrophysics Data System (ADS)

    Saxena, Indu F.

    2013-05-01

    The advantages of optical fiber sensing in harsh electromagnetic and physical-stress environments make fiber-optic sensors uniquely suited for structural health monitoring and non-destructive testing. In addition to aerospace applications, they are making a strong footprint in geophysical monitoring and exploration applications for higher temperature and pressure environments, due to the high temperature resilience of fused silica glass sensors. Deeper oil searches and geothermal exploration and harvesting are possible with these novel capabilities. Progress in components and technologies that are enabling these systems to be fieldworthy is reviewed, and emerging techniques are summarized that could leapfrog system performance and reliability.

  5. A vector matching method for analysing logic Petri nets

    NASA Astrophysics Data System (ADS)

    Du, YuYue; Qi, Liang; Zhou, MengChu

    2011-11-01

    Batch processing function and passing value indeterminacy in cooperative systems can be described and analysed by logic Petri nets (LPNs). To directly analyse the properties of LPNs, the concept of transition enabling vector sets is presented and a vector matching method used to judge the enabling transitions is proposed in this article. The incidence matrix of LPNs is defined; an equation about marking change due to a transition's firing is given; and a reachable tree is constructed. The state space explosion is mitigated to a certain extent from directly analysing LPNs. Finally, the validity and reliability of the proposed method are illustrated by an example in electronic commerce.

  6. Smart phones: platform enabling modular, chemical, biological, and explosives sensing

    NASA Astrophysics Data System (ADS)

    Finch, Amethist S.; Coppock, Matthew; Bickford, Justin R.; Conn, Marvin A.; Proctor, Thomas J.; Stratis-Cullum, Dimitra N.

    2013-05-01

    Reliable, robust, and portable technologies are needed for the rapid identification and detection of chemical, biological, and explosive (CBE) materials. A key to addressing the persistent threat to U.S. troops in the current war on terror is the rapid detection and identification of the precursor materials used in development of improvised explosive devices, homemade explosives, and bio-warfare agents. However, a universal methodology for detection and prevention of CBE materials in the use of these devices has proven difficult. Herein, we discuss our efforts towards the development of a modular, robust, inexpensive, pervasive, archival, and compact platform (android based smart phone) enabling the rapid detection of these materials.

  7. Quantitative Glycomics Strategies*

    PubMed Central

    Mechref, Yehia; Hu, Yunli; Desantos-Garcia, Janie L.; Hussein, Ahmed; Tang, Haixu

    2013-01-01

    The correlations between protein glycosylation and many biological processes and diseases are increasing the demand for quantitative glycomics strategies enabling sensitive monitoring of changes in the abundance and structure of glycans. This is currently attained through multiple strategies employing several analytical techniques such as capillary electrophoresis, liquid chromatography, and mass spectrometry. The detection and quantification of glycans often involve labeling with ionic and/or hydrophobic reagents. This step is needed in order to enhance detection in spectroscopic and mass spectrometric measurements. Recently, labeling with stable isotopic reagents has also been presented as a very viable strategy enabling relative quantitation. The different strategies available for reliable and sensitive quantitative glycomics are herein described and discussed. PMID:23325767

  8. Drilling into seismogenic zones of M2.0 - M5.5 earthquakes in deep South African gold mines (DSeis)

    NASA Astrophysics Data System (ADS)

    Ogasawara, Hiroshi; Durrheim, Raymond; Yabe, Yasuo; Ito, Takatoshi; van Aswegen, Gerrie; Cichowicz, Artur; Onstott, Tullis; Kieft, Tom; Boettcher, Margaret; Wiemer, Stefan; Ziegler, Martin; Janssen, Christoph; Shapiro, Serge; Gupta, Harsh; Dight, Phil

    2016-04-01

    Several times a year, mining-induced earthquakes with magnitudes equal to or larger than 2 take place only a few tens of meters away from active workings in South African gold mines at depths of up to 3.4 km. The largest event recorded in mining regions, a M5.5 earthquake, took place near Orkney, South Africa on 5 August 2014, with the upper edge of the activated fault being only a few hundred meters below the nearest mine workings (3.0 km depth). This is one of the rare events for which detailed seismological data are available, both from surface and underground seismometers and strainmeters, allowing for a detailed seismological analysis and comparison with in-situ observed data. Therefore, this earthquake calls for drilling to investigate the seismogenic zones before aftershocks diminish. Such a project will have a significantly better spatial coverage (including nuclei of ruptures, strong motion sources, asperities, and rupture edges) than drilling in seismogenic zones of natural large earthquakes and will be possible with a lower risk and at much smaller costs. In seismogenic zones in a critical state of stress, it is difficult to reliably delineate the local spatial variation in both the directions and magnitudes of the principal stresses (the full 3D stress tensor). However, we have overcome this problem. We are able to numerically model stress better than before, enabling us to orient boreholes so that the chance of stress-induced damage during stress measurement is minimized, and enabling us to measure the full 3D stress tensor successively in a hole within reasonable time even when stresses are as large as those expected in seismogenic zones. Better recovery of cores with less stress-induced damage during drilling is also feasible. These will allow us to address key scientific questions in earthquake science and associated deep biosphere activities which have remained elusive.
We held a 4-day workshop sponsored by ICDP and Ritsumeikan University in October/November 2015, which confirmed the great scientific value as well as technical feasibility, flexibility, and cost-effectiveness of drilling into the targets which have already been well seismologically probed. The value will be maximized if we combine outcomes from a limited number of holes drilled from 3 km depth into the M5.5 seismogenic zones (~ 4 km depth) with larger number of boreholes from mining horizons into the other targets (M~2 faults) already extensively exhumed by mining or which will be in future. We could have additional inputs during the 2015 AGU Fall Meeting period. We intend to start drilling before the M5.5 aftershocks diminish or mining around the M2.8 fault starts to alter stress considerably.

  9. Enhancing Critical Infrastructure and Key Resources (CIKR) Level-0 Physical Process Security Using Field Device Distinct Native Attribute Features

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lopez, Juan; Liefer, Nathan C.; Busho, Colin R.

    Here, the need for improved Critical Infrastructure and Key Resource (CIKR) security is unquestioned and there has been minimal emphasis on Level-0 (PHY Process) improvements. Wired Signal Distinct Native Attribute (WS-DNA) Fingerprinting is investigated here as a non-intrusive PHY-based security augmentation to support an envisioned layered security strategy. Results are based on experimental response collections from Highway Addressable Remote Transducer (HART) Differential Pressure Transmitter (DPT) devices from three manufacturers (Yokogawa, Honeywell, Endress+Hauser) installed in an automated process control system. Device discrimination is assessed using Time Domain (TD) and Slope-Based FSK (SB-FSK) fingerprints input to Multiple Discriminant Analysis, Maximum Likelihood (MDA/ML) and Random Forest (RndF) classifiers. For 12 different classes (two devices per manufacturer at two distinct set points), both classifiers performed reliably and achieved an arbitrary performance benchmark of average cross-class percent correct of %C > 90%. The least challenging cross-manufacturer results included near-perfect %C ≈ 100%, while the more challenging like-model (serial number) discrimination results included 90% < %C < 100%, with TD Fingerprinting marginally outperforming SB-FSK Fingerprinting; SB-FSK benefits from having less stringent response alignment and registration requirements. The RndF classifier was most beneficial and enabled reliable selection of dimensionally reduced fingerprint subsets that minimize data storage and computational requirements. The RndF-selected feature sets contained 15% of the full-dimensional feature sets and only suffered a worst-case %CΔ = 3% to 4% performance degradation.

  10. Requirements for diagnosis of malaria at different levels of the laboratory network in Africa.

    PubMed

    Long, Earl G

    2009-06-01

    The rapid increase of resistance to cheap, reliable antimalarials, the increasing cost of effective drugs, and the low specificity of clinical diagnosis have increased the need for more reliable diagnostic methods for malaria. The most commonly used and most reliable method remains microscopic examination of stained blood smears, but this technique requires skilled personnel, precision instruments, and ideally a source of electricity. Microscopy has the advantage of enabling the examiner to identify the species, stage, and density of an infection. An alternative to microscopy is the rapid diagnostic test (RDT), which uses a labeled monoclonal antibody to detect circulating parasitic antigens. This test is most commonly used to detect Plasmodium falciparum infections and is available in a plastic cassette format. Both microscopy and RDTs should be available at all levels of laboratory service in endemic areas, but in peripheral laboratories with minimally trained staff, the RDT may be a more practical diagnostic method.

  11. Interconnect fatigue design for terrestrial photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1982-01-01

    The results of a comprehensive investigation of interconnect fatigue that has led to the definition of useful reliability-design and life-prediction algorithms are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To address this shortcoming, the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter) which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous--i.e., quantitative--interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.
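    The idea of pairing a mean strain-cycle curve with a statistical scatter distribution can be sketched as follows; the Coffin-Manson constants and Weibull shape here are illustrative placeholders, not the paper's fitted interconnect values:

```python
from math import exp

def coffin_manson_cycles(strain_range, eps_f=0.3, c=-0.6):
    """Mean cycles to failure from a Coffin-Manson strain-cycle law,
    strain_range / 2 = eps_f * (2N)**c. Constants are illustrative."""
    two_n = (strain_range / (2 * eps_f)) ** (1 / c)
    return two_n / 2

def failure_fraction(cycles, mean_cycles, beta=3.0):
    """Weibull CDF spreading statistical scatter about the mean fatigue
    curve, giving the cumulative failure probability at `cycles`."""
    return 1 - exp(-((cycles / mean_cycles) ** beta))

n_mean = coffin_manson_cycles(strain_range=0.001)
# ~daily thermal cycles over a 20-year design life
p_20yr = failure_fraction(cycles=7300, mean_cycles=n_mean)
```

    The scatter distribution is what turns a single mean curve into the family of failure-probability curves the abstract describes.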

  12. Interconnect fatigue design for terrestrial photovoltaic modules

    NASA Astrophysics Data System (ADS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1982-03-01

    The results of a comprehensive investigation of interconnect fatigue that has led to the definition of useful reliability-design and life-prediction algorithms are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To address this shortcoming, the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter) which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous--i.e., quantitative--interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.

  13. Hybrid propulsion technology program

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Technology was identified which will enable application of hybrid propulsion to manned and unmanned space launch vehicles. Two design concepts are proposed. The first is a hybrid propulsion system using the classical method of regression (classical hybrid) resulting from the flow of oxidizer across a fuel grain surface. The second system uses a self-sustaining gas generator (gas generator hybrid) to produce a fuel-rich exhaust that is mixed with oxidizer in a separate combustor. Both systems offer cost and reliability improvements over the existing solid rocket booster and proposed liquid boosters. The designs were evaluated using life cycle cost and reliability. The program consisted of: (1) identification and evaluation of candidate oxidizers and fuels; (2) preliminary evaluation of booster design concepts; (3) preparation of a detailed point design including life cycle costs and reliability analyses; (4) identification of those hybrid-specific technologies needing improvement; and (5) preparation of a technology acquisition plan and large scale demonstration plan.

  14. Psychometric testing of the modified Care Dependency Scale among hospitalized school-aged children in Germany.

    PubMed

    Tork, Hanan; Lohrmann, Christa; Dassen, Theo

    2008-03-01

    The objectives of this study were to examine the psychometric properties of the modified Care Dependency Scale in a pediatric setting and to explore the extent of dependency of school-aged children regarding their self-care. The data were collected from 130 hospitalized children, aged 6-12 years. The reliability was determined by Cronbach's alpha, which showed a high level of consistency. The subsequent inter-rater reliability revealed moderate-to-substantial agreement. The criterion-related validity was tested by comparing the sum scores of the Care Dependency Scale for Paediatrics and the Visual Analog Scale. Factor analysis was used to investigate the construct validity and resulted in a one-factor solution. In conclusion, this study provides evidence that the Care Dependency Scale for Paediatrics is a valid and reliable measure that offers a comprehensive assessment from a nursing perspective and enables nurses to help children acquire independence.

  15. Health On the Net's 20 Years of Transparent and Reliable Health Information.

    PubMed

    Boyer, Célia; Appel, Ron D; Ball, Marion J; van Bemmel, Jan H; Bergmans, Jean-Paul; Carpentier, Michel; Hochstrasser, Denis; Lindberg, Donald; Miller, Randolph; Peterschmitt, Jean-Claude; Safran, Charlie; Thonnet, Michèle; Geissbühler, Antoine

    2016-01-01

    The Health On the Net Foundation (HON) was born in 1996, in the early days of the World Wide Web, from a collective decision by health specialists, led by the late Jean-Raoul Scherrer, who anticipated the need for trustworthy online health information. Because the Internet is a free space that everyone shares, a search for quality information is like a shot in the dark: it rarely hits its target. Thus, HON was created to promote the deployment of useful and reliable online health information, and to enable its appropriate and efficient use. Two decades on, HON is the oldest and most valued quality marker for online health information. The organization has maintained its reputation through dynamic measures, innovative endeavors, and dedication to upholding key values and goals. This paper provides an overview of the HON Foundation and its activities, challenges, and achievements over the years.

  16. A Robust Compositional Architecture for Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Deney, Ewen; Farrell, Kimberley; Giannakopoulos, Dimitra; Jonsson, Ari; Frank, Jeremy; Bobby, Mark; Carpenter, Todd; Estlin, Tara

    2006-01-01

    Space exploration applications can benefit greatly from autonomous systems. Great distances, limited communications, and high costs make direct operations impossible while mandating operations reliability and efficiency beyond what traditional commanding can provide. Autonomous systems can improve reliability and enhance spacecraft capability significantly. However, there is reluctance to utilize autonomous systems. In part this is due to general hesitation about new technologies, but a more tangible concern is the reliability and predictability of autonomous software. In this paper, we describe ongoing work aimed at increasing the robustness and predictability of autonomous software, with the ultimate goal of building trust in such systems. The work combines state-of-the-art technologies and capabilities in autonomous systems with advanced validation and synthesis techniques. The focus of this paper is on the autonomous system architecture that has been defined, and on how it enables the application of validation techniques to the resulting autonomous systems.

  17. CASTOR: Widely Distributed Scalable Infospaces

    DTIC Science & Technology

    2008-11-01

    Progress against planned objectives: Enable nimble apps that react fast as...generation of scalable, reliable, ultra-fast event notification in Linux data centers. • Maelstrom, a spin-off from Ricochet, offers a powerful new option...out potential enhancements to WS-EVENTING and WS-NOTIFICATION based on our work. Potential impact for the warfighter: QSM achieves extremely fast

  18. Feasibility of fiberglass-reinforced bolted wood connections

    Treesearch

    D. F. Windorski; L. A. Soltis; R. J. Ross

    Bolted connections often fail by a shear plug or a splitting beneath the bolt caused by tension perpendicular-to-grain stresses as the bolt wedges its way through the wood. Preventing this type of failure would enhance the capacity and reliability of the bolted connection, thus increasing the overall integrity of a timber structure and enabling wood to compete...

  19. 75 FR 57761 - North American Electric Reliability Corporation; Notice of Filing September 14, 2010.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-22

    ... request for approval of implementation plans for Generator Owners and Generator Operators of nuclear power..., 888 First Street, NE., Washington, DC 20426. This filing is accessible on-line at http://www.ferc.gov... Washington, DC. There is an ``eSubscription'' link on the Web site that enables subscribers to receive e-mail...

  20. Home Economics. Sample Test Items. Levels I and II.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Bureau of Elementary and Secondary Educational Testing.

    A sample of behavioral objectives and related test items that could be developed for content modules in Home Economics levels I and II, this book is intended to enable teachers to construct more valid and reliable test materials. Forty-eight one-page modules are presented, and opposite each module are listed two to seven specific behavioral…

  1. NETL's Hybrid Performance, or Hyper, facility

    ScienceCinema

    None

    2018-02-13

    NETL's Hybrid Performance, or Hyper, facility is a one-of-a-kind laboratory built to develop control strategies for the reliable operation of fuel cell/turbine hybrids and enable the simulation, design, and implementation of commercial equipment. The Hyper facility provides a unique opportunity for researchers to explore issues related to coupling fuel cell and gas turbine technologies.

  2. A simple reliable procedure for obtaining metaphases from human leukemic bone-marrow aspirates suitable for Giemsa banding.

    PubMed

    Srivastava, A K; Smith, R D

    1980-02-01

    Short incubation of heparinized human leukemic bone-marrow cells in phosphate buffered saline containing colcemid and overnight chilling of fixed cells yields metaphases with elongated and well-spread chromosomes. This technique enables us to do trypsin-Giemsa banding of chromosomes obtained from leukemic marrow cells otherwise difficult to band.

  3. Summary: High Temperature Downhole Motor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raymond, David W.

    2017-10-01

    Directional drilling can be used to enable multi-lateral completions from a single well pad to improve well productivity and decrease environmental impact. Downhole rotation is typically developed with a motor in the Bottom Hole Assembly (BHA) that develops the drilling power (speed and torque) necessary to drive rock reduction mechanisms (i.e., the bit), apart from the rotation developed by the surface rig. Historically, wellbore deviation has been introduced by a “bent sub,” located in the BHA, that introduces a small angular deviation, typically less than 3 degrees, to allow the bit to drill off-axis, with orientation of the BHA controlled at the surface. The development of a high temperature downhole motor would allow reliable use of bent subs for geothermal directional drilling. Sandia National Laboratories is pursuing the development of a high temperature motor that will operate on either drilling fluid (water-based mud) or compressed air to enable drilling of high temperature, high strength, fractured rock. The project consists of designing a power section based upon geothermal drilling requirements; modeling and analysis of potential solutions; and design, development, and testing of prototype hardware to validate the concept. Drilling costs contribute substantially to geothermal electricity production costs. The present development will result in more reliable access to deep, hot geothermal resources and allow preferential wellbore trajectories to be achieved. This will enable the development of geothermal wells with multi-lateral completions, resulting in improved geothermal resource recovery, decreased environmental impact, and enhanced well construction economics.

  4. Microarray Я US: a user-friendly graphical interface to Bioconductor tools that enables accurate microarray data analysis and expedites comprehensive functional analysis of microarray results.

    PubMed

    Dai, Yilin; Guo, Ling; Li, Meng; Chen, Yi-Bu

    2012-06-08

    Microarray data analysis presents a significant challenge to researchers who are unable to use the powerful Bioconductor and its numerous tools due to their lack of knowledge of the R language. Among the few existing software programs that offer a graphic user interface to Bioconductor packages, none has implemented a comprehensive strategy to address the accuracy and reliability issues of microarray data analysis arising from the well-known probe design problems associated with many widely used microarray chips. There is also a lack of tools that would expedite the functional analysis of microarray results. We present Microarray Я US, an R-based graphical user interface that implements over a dozen popular Bioconductor packages to offer researchers a streamlined workflow for routine differential microarray expression data analysis without the need to learn the R language. In order to enable more accurate analysis and interpretation of microarray data, we incorporated the latest custom probe re-definitions and re-annotations for Affymetrix and Illumina chips. A versatile microarray results output utility tool was also implemented for easy and fast generation of input files for over 20 of the most widely used functional analysis software programs. Coupled with a well-designed user interface, Microarray Я US leverages cutting-edge Bioconductor packages for researchers with no knowledge of the R language. It also enables more reliable and accurate microarray data analysis and expedites downstream functional analysis of microarray results.

  5. The future of drug discovery: enabling technologies for enhancing lead characterization and profiling therapeutic potential.

    PubMed

    Janero, David R

    2014-08-01

    Technology often serves as a handmaiden and catalyst of invention. The discovery of safe, effective medications depends critically upon experimental approaches capable of providing high-impact information on the biological effects of drug candidates early in the discovery pipeline. This information can enable reliable lead identification, pharmacological compound differentiation and successful translation of research output into clinically useful therapeutics. The shallow preclinical profiling of candidate compounds promulgates a minimalistic understanding of their biological effects and undermines the level of value creation necessary for finding quality leads worth moving forward within the development pipeline with efficiency and prognostic reliability sufficient to help remediate the current pharma-industry productivity drought. Three specific technologies discussed herein, in addition to experimental areas intimately associated with contemporary drug discovery, appear to hold particular promise for strengthening the preclinical valuation of drug candidates by deepening lead characterization. These are: i) hydrogen-deuterium exchange mass spectrometry for characterizing structural and ligand-interaction dynamics of disease-relevant proteins; ii) activity-based chemoproteomics for profiling the functional diversity of mammalian proteomes; and iii) nuclease-mediated precision gene editing for developing more translatable cellular and in vivo models of human diseases. When applied in an informed manner congruent with the clinical understanding of disease processes, technologies such as these that span levels of biological organization can serve as valuable enablers of drug discovery and potentially contribute to reducing the current, unacceptably high rates of compound clinical failure.

  6. The Reliability and Validity of the Thin Slice Technique: Observational Research on Video Recorded Medical Interactions

    ERIC Educational Resources Information Center

    Foster, Tanina S.

    2014-01-01

    Introduction: Observational research using the thin slice technique has been routinely incorporated in observational research methods; however, there is limited evidence supporting the use of this technique compared to full interaction coding. The purpose of this study was to determine if this technique could be reliably coded, if ratings are…

  7. Reliability evaluation of oil pipelines operating in aggressive environment

    NASA Astrophysics Data System (ADS)

    Magomedov, R. M.; Paizulaev, M. M.; Gebel, E. S.

    2017-08-01

    In connection with modern, increased requirements for ecology and safety, the development of a comprehensive set of diagnostic services is necessary to ensure the reliable operation of the gas transportation infrastructure. Estimation of the technical condition of oil pipelines should be carried out not only to establish the current values of the equipment's technological parameters in operation, but also to predict the dynamics of changes in the physical and mechanical characteristics of the material, the appearance of defects, etc., to ensure reliable and safe operation. In this paper, existing Russian and foreign methods for evaluating oil pipeline reliability are considered, taking into account one of the main factors leading to the appearance of crevices in the pipeline material and to changes in the shape of its cross-section: corrosion. Without compromising the generality of the reasoning, the assumption of uniform corrosion wear of the initially rectangular cross-section has been made. As a result, a formula for calculating the probability of failure-free operation was derived. The proposed mathematical model makes it possible to predict emergency situations, as well as to determine optimal operating conditions for oil pipelines.
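    The abstract does not give the derived formula, but the general stress-strength approach it describes can be sketched as follows. The use of Barlow's hoop-stress formula, a linear corrosion rate, and a normally distributed material strength are my illustrative assumptions, not the authors' model; all parameter values below are hypothetical.

```python
import math

def reliability(t_years, wall0_mm, corr_rate_mm_per_yr,
                pressure_mpa, diameter_mm,
                strength_mean_mpa, strength_sd_mpa):
    """Probability of failure-free operation at time t, assuming uniform
    corrosion thins the wall linearly and the hoop stress (Barlow's
    formula) must stay below a normally distributed material strength."""
    wall = wall0_mm - corr_rate_mm_per_yr * t_years
    if wall <= 0:
        return 0.0                                  # wall fully corroded
    hoop = pressure_mpa * diameter_mm / (2.0 * wall)  # Barlow hoop stress
    z = (strength_mean_mpa - hoop) / strength_sd_mpa
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # P(strength > stress)

# Reliability declines as corrosion thins the wall over time.
print(reliability(0, 10, 0.2, 5, 500, 300, 60) >
      reliability(20, 10, 0.2, 5, 500, 300, 60))  # True
```

Plotting this function over the design life is one way such a model can be used to anticipate emergency situations, as the abstract suggests.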

  8. Reliability of the Fox-walk test in patients with rheumatoid arthritis.

    PubMed

    Verberkt, Cornelia Antonia; Fridén, Cecilia; Grooten, Wilhelmus Johannes Andreas; Opava, Christina H

    2012-01-01

    The Fox-walk test is a new method used to estimate aerobic capacity outside a clinical environment, which may be useful in the implementation of daily health-enhancing physical activity. The aim of our study was to investigate the reliability of the test in people with rheumatoid arthritis (RA). Fifteen participants performed the Fox-walk test three times at weekly intervals. The intraclass correlation coefficient (ICC), the standard error of measurement (SEM), and the smallest detectable change (SDC) were used to estimate the reliability. General health perception, lower limb pain, and fatigue were measured to determine their potential influence on the reliability. There were no systematic differences between the three test occasions (p = 0.190) and the reliability was almost perfect (ICC = 0.982). None of the covariates influenced the reliability. The SEM was 0.999 ml/kg/min, or 3.4%, and the SDC was 2.769 ml/kg/min, or 9.4%. These findings demonstrate that the Fox-walk test is reliable in people with RA and enables both differentiation between people with RA and monitoring of their progress. The validity of the test among people with RA is still to be determined. • The Fox-walk test is a new method to estimate aerobic capacity and can be performed walking or running. • The test is self-administered, requires no expensive equipment, and is available in 150 public places in Sweden and several other European countries. • The Fox-walk test is reliable for use among people with rheumatoid arthritis to monitor the progress of their physical activity.
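    The SEM and SDC reported above follow from the ICC by the standard formulas SEM = SD·√(1−ICC) and SDC = 1.96·√2·SEM. A quick check is consistent with the reported values; the between-subject SD of ≈7.45 ml/kg/min used below is inferred from the reported numbers, not stated in the abstract.

```python
import math

def sem(sd, icc):
    """Standard error of measurement from between-subject SD and ICC."""
    return sd * math.sqrt(1.0 - icc)

def sdc(sem_value):
    """Smallest detectable change at 95% confidence for a single measure."""
    return 1.96 * math.sqrt(2.0) * sem_value

s = sem(7.45, 0.982)   # ≈ 0.999 ml/kg/min, matching the abstract
d = sdc(s)             # ≈ 2.77 ml/kg/min, close to the reported 2.769
```

A change smaller than the SDC between two test occasions cannot be distinguished from measurement error, which is why the SDC is the relevant threshold when monitoring an individual's progress.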

  9. A reliable sewage quality abnormal event monitoring system.

    PubMed

    Li, Tianling; Winnel, Melissa; Lin, Hao; Panther, Jared; Liu, Chang; O'Halloran, Roger; Wang, Kewen; An, Taicheng; Wong, Po Keung; Zhang, Shanqing; Zhao, Huijun

    2017-09-15

    With the closing of the water loop through purified recycled water, wastewater becomes a part of source water, requiring a reliable wastewater quality monitoring system (WQMS) to manage wastewater sources and mitigate potential health risks. However, the development of a reliable WQMS is severely constrained by the heavy contamination and biofouling of sensors in the hostile analytical environment of wastewaters, especially raw sewage, which challenges the limits of existing sensing technologies. In this work, we report a technological solution that enables the development of a WQMS for real-time abnormal event detection with high reliability and practicality. A vectored high-flow hydrodynamic self-cleaning approach and a dual-sensor self-diagnostic concept are adopted for the WQMS to effectively counter critical sensor-failure issues caused by contamination and biofouling and to ensure the integrity of sensing data. The performance of the WQMS has been evaluated over a 3-year trial period at different sewage catchment sites across three Australian states. It has been demonstrated that the developed WQMS is capable of continuously operating in raw sewage for a prolonged period of up to 24 months without maintenance or failure, signifying its high reliability and practicality. The demonstrated capability of the WQMS to reliably acquire real-time wastewater quality information advances the development of effective wastewater source management systems. The reported self-cleaning and self-diagnostic concepts should be applicable to other online water quality monitoring systems, opening a new way to counter the common reliability and stability issues caused by sensor contamination and biofouling. Copyright © 2017 Elsevier Ltd. All rights reserved.
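    A dual-sensor self-diagnostic concept of the kind mentioned above can be sketched as a simple cross-check: two co-located sensors measure the same parameter, and a sustained divergence between them indicates a probable sensor fault rather than a genuine water-quality event. The tolerance, persistence window, and labels below are illustrative assumptions, not the authors' design.

```python
def diagnose(stream_a, stream_b, tol=0.1, window=5):
    """Compare two co-located sensor streams sample by sample.
    Readings agreeing within a relative tolerance are 'ok'; a divergence
    persisting for `window` consecutive samples is flagged as a probable
    sensor fault (a real quality event should appear on both sensors)."""
    status, run = [], 0
    for a, b in zip(stream_a, stream_b):
        if abs(a - b) > tol * max(abs(a), abs(b), 1e-9):
            run += 1
        else:
            run = 0
        status.append("sensor-fault?" if run >= window else "ok")
    return status

# Sensor B drifts away after sample 4; the fault flag appears only once
# the divergence has persisted for the full window.
flags = diagnose([1.0] * 10, [1.0] * 4 + [2.0] * 6)
```

The persistence window is what separates transient disagreement (noise, a single bad sample) from a genuine sensor failure.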

  10. Reliability and validity of a talent identification test battery for seated and standing Paralympic throws.

    PubMed

    Spathis, Jemima Grace; Connick, Mark James; Beckman, Emma Maree; Newcombe, Peter Anthony; Tweedy, Sean Michael

    2015-01-01

    Paralympic throwing events for athletes with physical impairments comprise seated and standing javelin, shot put, discus and seated club throwing. Identification of talented throwers would enable prediction of future success and promote participation; however, a valid and reliable talent identification battery for Paralympic throwing has not been reported. This study evaluates the reliability and validity of a talent identification battery for Paralympic throws. Participants were non-disabled so that impairment would not confound analyses, and results would provide an indication of normative performance. Twenty-eight non-disabled participants (13 M; 15 F) aged 23.6 years (±5.44) performed five kinematically distinct criterion throws (three seated, two standing) and nine talent identification tests (three anthropometric, six motor); 23 were tested a second time to evaluate test-retest reliability. Talent identification test-retest reliability was evaluated using Intra-class Correlation Coefficient (ICC) and Bland-Altman plots (Limits of Agreement). Spearman's correlation assessed strength of association between criterion throws and talent identification tests. Reliability was generally acceptable (mean ICC = 0.89), but two seated talent identification tests require more extensive familiarisation. Correlation strength (mean rs = 0.76) indicated that the talent identification tests can be used to validly identify individuals with competitively advantageous attributes for each of the five kinematically distinct throwing activities. Results facilitate further research in this understudied area.
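    Test-retest ICCs of the kind reported above are conventionally computed from a two-way ANOVA decomposition. A minimal consistency-type ICC (in the style of ICC(3,1)) is sketched below; the abstract does not specify which ICC form was used, so this is an illustration of the general technique, not the study's exact method.

```python
def icc_consistency(data):
    """Consistency ICC (ICC(3,1)-style) for test-retest data.
    data: one row per subject; columns are repeated test sessions."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    subj = [sum(row) / k for row in data]                       # subject means
    sess = [sum(row[j] for row in data) / n for j in range(k)]  # session means
    msb = k * sum((m - grand) ** 2 for m in subj) / (n - 1)     # between-subjects MS
    mse = sum((data[i][j] - subj[i] - sess[j] + grand) ** 2
              for i in range(n) for j in range(k)) / ((n - 1) * (k - 1))  # error MS
    return (msb - mse) / (msb + (k - 1) * mse)

# Perfect agreement across sessions yields ICC = 1.
print(icc_consistency([[1, 1], [2, 2], [3, 3]]))  # 1.0
```

Values near 1 indicate that between-subject differences dominate session-to-session noise, which is what the mean ICC of 0.89 above reflects.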

  11. Affordable Launch Services using the Sport Orbit Transfer System

    NASA Astrophysics Data System (ADS)

    Goldstein, D. J.

    2002-01-01

    Despite many advances in small satellite technology, a low-cost, reliable method is needed to place spacecraft in their desired orbits. AeroAstro has developed the Small Payload ORbit Transfer (SPORT™) system to provide a flexible, low-cost orbit transfer capability, enabling small payloads to use low-cost secondary launch opportunities and still reach their desired final orbits. This capability allows small payloads to effectively use a wider variety of launch opportunities, including numerous under-utilized GTO slots. Its use, in conjunction with growing opportunities for secondary launches, enables increased access to space using proven technologies and highly reliable launch vehicles such as the Ariane family and the Starsem launcher. SPORT uses a suite of innovative technologies that are packaged in a simple, reliable, modular system. The command, control, and data handling of SPORT is provided by the AeroAstro Bitsy™ core electronics module. The Bitsy module also provides power regulation for the batteries and optional solar arrays. The primary orbital maneuvering capability is provided by a nitrous oxide monopropellant propulsion system. This system exploits the unique features of nitrous oxide, which include self-pressurization, good performance, and safe handling, to provide a lightweight, low-cost, and reliable propulsion capability. When transferring from a higher-energy orbit to a lower-energy orbit (i.e., GTO to LEO), SPORT uses aerobraking technology. After using the propulsion system to lower the orbit perigee, the aerobrake gradually slows SPORT via atmospheric drag. After the orbit apogee is reduced to the target level, an apogee burn raises the perigee and ends the aerobraking. At the conclusion of the orbit transfer maneuver, either the aerobrake or SPORT can be shed, as desired by the payload. SPORT uses a simple design for high reliability and a modular architecture for maximum mission flexibility.
This paper will discuss the launch system and its application to small satellite launch without increasing risk. It will also discuss relevant issues such as aerobraking operations and radiation issues, as well as existing partnerships and patents for the system.

  12. A digital miniature x-ray tube with a high-density triode carbon nanotube field emitter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeong, Jin-Woo; Kang, Jun-Tae; Choi, Sungyoul

    2013-01-14

    We have fabricated a digital miniature x-ray tube (6 mm in diameter and 32 mm in length) with a high-density triode carbon nanotube (CNT) field emitter for special x-ray applications. The triode CNT emitter was densely formed within a diameter of below 4 mm with the focusing-functional gate. The brazing process enables us to obtain and maintain a desired vacuum level for the reliable electron emission from the CNT emitters after the vacuum packaging. The miniature x-ray tube exhibited a stable and reliable operation over 250 h in a pulse mode at an anode voltage of above 25 kV.

  13. Absolute accuracy of the Cyberware WB4 whole-body scanner

    NASA Astrophysics Data System (ADS)

    Daanen, Hein A. M.; Taylor, Stacie E.; Brunsman, Matthew A.; Nurre, Joseph H.

    1997-03-01

    The Cyberware WB4 whole body scanner is one of the first scanning systems in the world that generates a high resolution data set of the outer surface of the human body. The Computerized Anthropometric Research and Design (CARD) Laboratory of Wright-Patterson AFB intends to use the scanner to enable quick and reliable acquisition of anthropometric data. For this purpose, a validation study was initiated to check the accuracy, reliability and errors of the system. A calibration object, consisting of two boxes and a cylinder, was scanned in several locations in the scanning space. The object dimensions in the resulting scans compared favorably to the actual dimensions of the calibration object.

  14. Immunogenicity testing of therapeutic antibodies in ocular fluids after intravitreal injection.

    PubMed

    Wessels, Uwe; Zadak, Markus; Reiser, Astrid; Brockhaus, Janis; Ritter, Mirko; Abdolzade-Bavil, Afsaneh; Heinrich, Julia; Stubenrauch, Kay

    2018-04-11

    High drug concentrations in ocular fluids after intravitreal administration preclude the use of drug-sensitive immunoassays. A drug-tolerant immunoassay is therefore desirable for immunogenicity testing in ophthalmology. Immune complex (IC) antidrug antibody (ADA) assays were established for two species. The assays were compared with the bridging assay in ocular and plasma samples from two preclinical studies. The IC assays showed high drug tolerance, which enabled reliable ADA detection in ocular fluids after intravitreal administration. The IC assays were superior to the bridging assay in the analysis of ocular fluids with high drug concentrations. The IC assay allows reliable ADA detection in matrices with high drug concentrations, such as ocular fluids.

  15. Development of a Nondestructive Evaluation Technique for Degraded Thermal Barrier Coatings Using Microwave

    NASA Astrophysics Data System (ADS)

    Sayar, M.; Ogawa, K.; Shoji, T.

    2008-02-01

    Thermal barrier coatings (TBCs) have been widely used in gas turbine engines to protect the substrate metal alloy against high temperatures and to enhance turbine efficiency. Currently, there are no reliable nondestructive techniques available to monitor TBC integrity over the lifetime of the coating. Hence, to detect top coating (TC) and thermally grown oxide (TGO) thicknesses, a microwave nondestructive technique that utilizes a rectangular waveguide was developed. The phase of the reflection coefficient at the interface of the TC and the waveguide varies for different TGO and TC thicknesses. Therefore, measuring the phase of the reflection coefficient enables us to accurately calculate these thicknesses. Finally, a theoretical analysis was used to evaluate the reliability of the experimental results.

  16. From Ions to Wires to the Grid: The Transformational Science of LANL Research in High-Tc Superconducting Tapes and Electric Power Applications

    ScienceCinema

    Marken, Ken

    2018-01-09

    The Department of Energy (DOE) Office of Electricity Delivery and Energy Reliability (OE) has been tasked to lead national efforts to modernize the electric grid, enhance security and reliability of the energy infrastructure, and facilitate recovery from disruptions to energy supplies. LANL has pioneered the development of coated conductors – high-temperature superconducting (HTS) tapes – which permit dramatically greater current densities than conventional copper cable, and enable new technologies to secure the national electric grid. Sustained world-class research from concept, demonstration, transfer, and ongoing industrial support has moved this idea from the laboratory to the commercial marketplace.

  17. NASA's Aeronautics Vision

    NASA Technical Reports Server (NTRS)

    Tenney, Darrel R.

    2004-01-01

    Six long-term technology focus areas are: 1. Environmentally Friendly, Clean Burning Engines. Focus: Develop innovative technologies to enable intelligent turbine engines that significantly reduce harmful emissions while maintaining high performance and increasing reliability. 2. New Aircraft Energy Sources and Management. Focus: Discover new energy sources and intelligent management techniques directed towards zero emissions and enable new vehicle concepts for public mobility and new science missions. 3. Quiet Aircraft for Community Friendly Service. Focus: Develop and integrate noise reduction technology to enable unrestricted air transportation service to all communities. 4. Aerodynamic Performance for Fuel Efficiency. Focus: Improve aerodynamic efficiency, structures and materials technologies, and design tools and methodologies to reduce fuel burn and minimize environmental impact and enable new vehicle concepts and capabilities for public mobility and new science missions. 5. Aircraft Weight Reduction and Community Access. Focus: Develop ultralight smart materials and structures, aerodynamic concepts, and lightweight subsystems to increase vehicle efficiency, leading to high altitude long endurance vehicles, planetary aircraft, advanced vertical and short takeoff and landing vehicles and beyond. 6. Smart Aircraft and Autonomous Control. Focus: Enable aircraft to fly with reduced or no human intervention, to optimize flight over multiple regimes, and to provide maintenance on demand towards the goal of a feeling, seeing, sensing, sentient air vehicle.

  18. MEAs and 3D nanoelectrodes: electrodeposition as tool for a precisely controlled nanofabrication.

    PubMed

    Weidlich, Sabrina; Krause, Kay J; Schnitker, Jan; Wolfrum, Bernhard; Offenhäusser, Andreas

    2017-01-31

    Microelectrode arrays (MEAs) are gaining increasing importance for the investigation of signaling processes between electrogenic cells. However, efficient cell-chip coupling for robust and long-term electrophysiological recording and stimulation still remains a challenge. A possible approach for the improvement of the cell-electrode contact is the utilization of three-dimensional structures. In recent years, various 3D electrode geometries have been developed, but we are still lacking a fabrication approach that enables the formation of different 3D structures on a single chip in a controlled manner. This, however, is needed to enable a direct and reliable comparison of the recording capabilities of the different structures. Here, we present a method for a precisely controlled deposition of nanoelectrodes, enabling the fabrication of multiple, well-defined types of structures on our 64 electrode MEAs towards a rapid-prototyping approach to 3D electrodes.

  19. [A reliability growth assessment method and its application in the development of equipment in space cabin].

    PubMed

    Chen, J D; Sun, H L

    1999-04-01

    Objective. To assess and predict the reliability of equipment dynamically by making full use of the various test information generated during the development of products. Method. A new reliability growth assessment method based on the Army Materiel Systems Analysis Activity (AMSAA) model was developed. The method combines the AMSAA model with test data conversion technology. Result. The assessment and prediction results for a space-borne equipment conform to expectations. Conclusion. It is suggested that this method be further researched and popularized.
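    The AMSAA (Crow) model represents cumulative failures as N(t) = λt^β; for time-truncated test data the maximum-likelihood estimates are β̂ = n / Σ ln(T/tᵢ) and λ̂ = n / T^β̂, with β̂ < 1 indicating reliability growth. A minimal sketch (the failure times below are illustrative, not from the paper):

```python
import math

def amsaa_fit(failure_times, total_time):
    """MLE fit of the AMSAA/Crow model N(t) = lam * t**beta for
    time-truncated test data. beta < 1 implies reliability growth."""
    n = len(failure_times)
    beta = n / sum(math.log(total_time / t) for t in failure_times)
    lam = n / total_time ** beta
    # Current (instantaneous) MTBF is the reciprocal of the failure
    # intensity rho(T) = lam * beta * T**(beta - 1).
    mtbf = 1.0 / (lam * beta * total_time ** (beta - 1.0))
    return beta, lam, mtbf

beta, lam, mtbf = amsaa_fit([10, 50, 120, 200, 400], total_time=500)
print(beta < 1)  # True: lengthening gaps between failures indicate growth
```

The fitted intensity can then be extrapolated to predict reliability at the end of development, which is the kind of dynamic assessment the abstract describes.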

  20. Fifteen-Minute Comprehensive Alcohol Risk Survey: Reliability and Validity Across American Indian and White Adolescents

    PubMed Central

    Komro, Kelli A; Livingston, Melvin D; Kominsky, Terrence K; Livingston, Bethany J; Garrett, Brady A; Molina, Mildred Maldonado; Boyd, Misty L

    2015-01-01

    Objective: American Indians (AIs) suffer from significant alcohol-related health disparities, and increased risk begins early. This study examined the reliability and validity of measures to be used in a preventive intervention trial. Reliability and validity across racial/ethnic subgroups are crucial to evaluate intervention effectiveness and promote culturally appropriate evidence-based practice. Method: To assess reliability and validity, we used three baseline surveys of high school students participating in a preventive intervention trial within the jurisdictional service area of the Cherokee Nation in northeastern Oklahoma. The 15-minute alcohol risk survey included 16 multi-item scales and one composite score measuring key proximal, primary, and moderating variables. Forty-four percent of the students indicated that they were AI (of whom 82% were Cherokee), including 23% who reported being AI only (n = 435) and 18% both AI and White (n = 352). Forty-seven percent reported being White only (n = 901). Results: Scales were adequately reliable for the full sample and across race/ethnicity defined by AI, AI/White, and White subgroups. Among the full sample, all scales had acceptable internal consistency, with minor variation across race/ethnicity. All scales had extensive to exemplary test–retest reliability and showed minimal variation across race/ethnicity. The eight proximal and two primary outcome scales were each significantly associated with the frequency of alcohol use during the past month in both the cross-sectional and the longitudinal models, providing support for both criterion validity and predictive validity. For most scales, interpretation of the strength of association and statistical significance did not differ between the racial/ethnic subgroups. 
Conclusions: The results support the reliability and validity of scales of a brief questionnaire measuring risk and protective factors for alcohol use among AI adolescents, primarily members of the Cherokee Nation. PMID:25486402

  1. Reliability and Validity of Survey Instruments to Measure Work-Related Fatigue in the Emergency Medical Services Setting: A Systematic Review.

    PubMed

    Patterson, P Daniel; Weaver, Matthew D; Fabio, Anthony; Teasley, Ellen M; Renn, Megan L; Curtis, Brett R; Matthews, Margaret E; Kroemer, Andrew J; Xun, Xiaoshuang; Bizhanova, Zhadyra; Weiss, Patricia M; Sequeira, Denisse J; Coppler, Patrick J; Lang, Eddy S; Higgins, J Stephen

    2018-02-15

This study sought to systematically search the literature to identify reliable and valid survey instruments for fatigue measurement in the Emergency Medical Services (EMS) occupational setting. A systematic review design was used to search six databases, including one website. The research question guiding the search was developed a priori and registered with the PROSPERO database of systematic reviews: "Are there reliable and valid instruments for measuring fatigue among EMS personnel?" (2016:CRD42016040097). The primary outcome of interest was criterion-related validity. Important outcomes of interest included reliability (e.g., internal consistency) and indicators of sensitivity and specificity. Members of the research team independently screened records from the databases. Full-text articles were evaluated by adapting the Bolster and Rourke system for categorizing findings of systematic reviews, and the data abstracted from the body of literature were rated as favorable, unfavorable, mixed/inconclusive, or no impact. The Grading of Recommendations, Assessment, Development and Evaluation (GRADE) methodology was used to evaluate the quality of evidence. The search strategy yielded 1,257 unique records. Thirty-four unique experimental and non-experimental studies were determined relevant following full-text review. Nineteen studies reported on the reliability and/or validity of ten different fatigue survey instruments. Eighteen different studies evaluated the reliability and/or validity of four different sleepiness survey instruments. None of the retained studies reported sensitivity or specificity. Evidence quality was rated as very low across all outcomes. In this systematic review, limited evidence of the reliability and validity of 14 different survey instruments for assessing the fatigue and/or sleepiness status of EMS personnel and related shift-worker groups was identified.

  2. An FEC Adaptive Multicast MAC Protocol for Providing Reliability in WLANs

    NASA Astrophysics Data System (ADS)

    Basalamah, Anas; Sato, Takuro

For wireless multicast applications such as multimedia conferencing, voice over IP, and video/audio streaming, reliable packet delivery within a short delay is needed. Moreover, reliability is crucial to the performance of error-intolerant applications like file transfer, distributed computing, chat, and whiteboard sharing. Forward Error Correction (FEC) is frequently used in wireless multicast to enhance Packet Error Rate (PER) performance, but it cannot assure full reliability unless coupled with Automatic Repeat Request (ARQ), forming what is known as Hybrid-ARQ. While reliable FEC can be deployed at different levels of the protocol stack, it cannot be deployed on the MAC layer of the unreliable IEEE 802.11 WLAN because that layer cannot exchange ACKs with multiple recipients. In this paper, we propose a multicast MAC protocol that enhances WLAN reliability by using adaptive FEC and study its performance through mathematical analysis and simulation. Our results show that our protocol can deliver high reliability and throughput performance.
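The Hybrid-ARQ idea the abstract builds on can be sketched in a few lines: a single XOR parity packet lets a receiver repair one lost packet locally, and a retransmission (ARQ) is requested only when FEC cannot repair the loss. This toy sketch, with made-up packet contents, illustrates the general scheme rather than the proposed protocol:

```python
# Toy erasure-coding sketch: one XOR parity packet over a block of k
# data packets repairs any single lost packet; more losses fall back
# to an ARQ retransmission request.

def xor_parity(packets):
    parity = bytes(len(packets[0]))
    for p in packets:
        parity = bytes(a ^ b for a, b in zip(parity, p))
    return parity

def receive(received, parity):
    """received: list of packets, with None marking a lost packet."""
    lost = [i for i, p in enumerate(received) if p is None]
    if not lost:
        return received, "ok"
    if len(lost) == 1:  # FEC can repair a single erasure
        repaired = xor_parity([p for p in received if p is not None] + [parity])
        out = list(received)
        out[lost[0]] = repaired
        return out, "repaired"
    return received, "retransmit"   # ARQ fallback: too many losses

data = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
par = xor_parity(data)

fixed, status = receive([data[0], None, data[2], data[3]], par)
print(status, fixed[1])  # the lost packet is rebuilt from the parity
```

Adaptive FEC in the sense of the abstract would vary the amount of parity with the observed loss rate; the fixed single-parity block here is the simplest fixed-rate case.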

  3. Brief oral stimulation, but especially oral fat exposure, elevates serum triglycerides in humans

    PubMed Central

    Mattes, Richard D.

    2009-01-01

    Oral exposure to dietary fat results in an early initial spike, followed by a prolonged elevation, of serum triglycerides in humans. The physiological and pathophysiological implications remain unknown. This study sought to determine the incidence of the effect, the required fat exposure duration, and its reliability. Thirty-four healthy adults participated in four to six response-driven trials held at least a week apart. They reported to the laboratory after an overnight fast, a catheter was placed in an antecubital vein, and a blood sample was obtained. Participants then ingested 50 g of safflower oil in capsules with 500 ml of water within 15 min to mimic a high fat meal but without oral fat exposure. Blood was collected 0, 10, 20, 30, 40, 50, 60, 120, 240, 360, and 480 min after capsule ingestion with different forms (full fat, nonfat, none) and durations of oral fat exposures (10 s, 5 min, 20 min, and/or 2 h). A triglyceride response (increase of triglyceride >10 mg/dl within 30 min) was observed in 88.2%, 70.5%, and 50% of participants with full-fat, nonfat, and no oral exposure, respectively. Test-retest reliability was 75% with full-fat exposure but only 45.4% with nonfat exposure. Full-fat and nonfat exposures led to comparable significant elevations of triglyceride over no oral stimulation with 10-s exposures, but full fat led to a greater rise than nonfat with 20 min of exposure. These data indicate that nutritionally relevant oral fat exposures reliably elevate serum triglyceride concentrations in most people. PMID:19074638

  4. An Argument from Acquisition: Comparing English Metrical Stress Representations by How Learnable They Are from Child-Directed Speech

    ERIC Educational Resources Information Center

    Pearl, Lisa; Ho, Timothy; Detrano, Zephyr

    2017-01-01

    It has long been recognized that there is a natural dependence between theories of knowledge representation and theories of knowledge acquisition, with the idea that the right knowledge representation enables acquisition to happen as reliably as it does. Given this, a reasonable criterion for a theory of knowledge representation is that it be…

  5. A Social Learning Management System Supporting Feedback for Incorrect Answers Based on Social Network Services

    ERIC Educational Resources Information Center

    Son, Jiseong; Kim, Jeong-Dong; Na, Hong-Seok; Baik, Doo-Kwon

    2016-01-01

    In this research, we propose a Social Learning Management System (SLMS) enabling real-time and reliable feedback for incorrect answers by learners using a social network service (SNS). The proposed system increases the accuracy of learners' assessment results by using a confidence scale and a variety of social feedback that is created and shared…

  6. A Simple and Reliable Method for Hybridization of Homothallic Wine Strains of Saccharomyces cerevisiae

    PubMed Central

    Ramírez, Manuel; Peréz, Francisco; Regodón, José A.

    1998-01-01

    A procedure was developed for the hybridization and improvement of homothallic industrial wine yeasts. Killer cycloheximide-sensitive strains were crossed with killer-sensitive cycloheximide-resistant strains to get killer cycloheximide-resistant hybrids, thereby enabling hybrid selection and identification. This procedure also allows backcrossing of spore colonies from the hybrids with parental strains. PMID:9835605

  7. Validation of an Instructional Observation Instrument for Teaching English as a Foreign Language in Spain

    ERIC Educational Resources Information Center

    Gomez-Garcia, Maria

    2011-01-01

    The design and validation of a classroom observation instrument to provide formative feedback for teachers of EFL in Spain is the overarching purpose of this study. This study proposes that a valid and reliable classroom observation instrument, based on effective practice in teaching EFL, can be developed and used in Spain to enable teachers to…

  8. Using Digital and Paper Diaries for Learning and Assessment Purposes in Higher Education: A Comparative Study of Feasibility and Reliability

    ERIC Educational Resources Information Center

    Gleaves, Alan; Walker, Caroline; Grey, John

    2007-01-01

    The incorporation of diaries and journals as learning and assessment vehicles into programmes of study within higher education has enabled the further growth of reflection, creative writing, critical thinking and meta-cognitive processes of students' learning. However, there is currently little research that aims to compare how different types of…

  9. Embedded Diagnostic/Prognostic Reasoning and Information Continuity for Improved Avionics Maintenance

    DTIC Science & Technology

    2006-01-01

    enabling technologies such as built-in-test, advanced health monitoring algorithms, reliability and component aging models, prognostics methods, and ... deployment and acceptance. This framework and vision is consistent with the onboard PHM (Prognostic and Health Management) as well as advanced ... monitored. In addition to the prognostic forecasting capabilities provided by monitoring system power, multiple confounding errors by electronic

  10. Technology as the Enabler of a New Wave of Active Learning

    ERIC Educational Resources Information Center

    Rollag, Keith; Billsberry, Jon

    2012-01-01

    Education has always been slow on the uptake of new technology. Instructors have established time-worn methods of teaching, and the performance nature of the job puts an emphasis on reliability and predictability. The last thing an instructor wants to be doing is fumbling around trying to make something work in front of an audience of 200…

  11. A light for anaesthetists.

    PubMed

    Flowerdew, R M

    1976-11-01

    A new type of lighting device is incorporated into the ether screen cross-bar, enabling better illumination of the patient's face. A 12-V DC power supply is used. The temperature, with vacuum-assisted cooling, should not cause burns to an unconscious patient. Assessment of patient colour was found to be reliable. The lamp must not be used with flammable anaesthetic agents.

  12. Large space telescope engineering scale model optical design

    NASA Technical Reports Server (NTRS)

    Facey, T. A.

    1973-01-01

    The objective is to develop the detailed design and tolerance data for the LST engineering scale model optical system. This will enable MSFC to move forward to the optical element procurement phase and also to evaluate tolerances, manufacturing requirements, assembly/checkout procedures, reliability, operational complexity, stability requirements of the structure and thermal system, and the flexibility to change and grow.

  13. [Integral evaluation of immune homeostasis in rockets liquidators and role of this evaluation for prophylaxis].

    PubMed

    2010-01-01

    Long-standing clinical and immunologic monitoring and integral evaluation of immune homeostasis (through a generalized parameter) in personnel of the Center for liquid-fuel rocket liquidation demonstrated diagnostically reliable immunity parameters that enable forecasting of changes in the workers' health state. The authors defined boundary values of the generalized parameter to form risk groups for the development of specific disease entities.

  14. Diagnostic multiplex PCR for toxin genotyping of Clostridium perfringens isolates.

    PubMed

    Baums, Christoph G; Schotte, Ulrich; Amtsberg, Gunter; Goethe, Ralph

    2004-05-20

    In this study we provide a protocol for genotyping Clostridium perfringens with a new multiplex PCR. This PCR enables reliable and specific detection of the toxin genes cpa, cpb, etx, iap, cpe and cpb2 from heat lysed bacterial suspensions. The efficiency of the protocol was demonstrated by typing C. perfringens reference strains and isolates from veterinary bacteriological routine diagnostic specimens.

  15. PV Module Reliability Experts Gather for DuraMAT Workshop | News | NREL

    Science.gov Websites

    DuraMAT Workshop, June 20, 2017. On May 22 and 23, 2017, the Bay Area Photovoltaic Consortium (BAPVC) and with the photovoltaic and supply-chain industries to discover, develop, de-risk, and enable the commercialization of new materials and designs for photovoltaic modules, with the potential for a levelized cost of

  16. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  17. Further examination of the temporal stability of alcohol demand.

    PubMed

    Acuff, Samuel F; Murphy, James G

    2017-08-01

    Demand, or the amount of a substance consumed as a function of price, is a central dependent measure in behavioral economic research and represents the relative valuation of a substance. Although demand is often utilized as an index of substance use severity and is assumed to be relatively stable, recent experimental and clinical research has identified conditions in which demand can be manipulated, such as through craving and stress inductions, and treatment. Our study examines the 1-month reliability of the alcohol purchase task in a sample of heavy-drinking college students. We also analyzed reliability in subgroups of individuals whose consumption decreased, increased, or stayed the same over the 1-month period, and in individuals with moderate/severe Alcohol Use Disorder (AUD) vs. those with no/mild AUD. Reliability was moderate in the full sample, high in the group with stable consumption, and did not differ appreciably between AUD groups. Observed indices and indices derived from an exponentiated equation (Koffarnus et al., 2015) were generally comparable, although observed Pmax had very low reliability. Area under the curve, derived Omax, and essential value showed the greatest reliability in the full sample (rs = 0.75-0.77). These results provide evidence for the relative stability of demand over time and across AUD groups, particularly in those whose consumption remains stable. Copyright © 2017 Elsevier B.V. All rights reserved.
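The exponentiated demand equation cited in the abstract (Koffarnus et al., 2015) has the form Q = Q0 * 10^(k(e^(-alpha*Q0*C) - 1)), where Q0 is intensity (consumption at zero price), alpha indexes the rate of change in elasticity, and C is price. A minimal sketch of evaluating the curve and one derived index (peak expenditure, Omax); the parameter values are illustrative, not fitted estimates from this study:

```python
import math

# Exponentiated demand curve: Q = Q0 * 10**(k * (exp(-alpha*Q0*C) - 1)).
# k is a span constant; q0 and alpha below are made-up example values.

def demand(price, q0, alpha, k=2.0):
    return q0 * 10 ** (k * (math.exp(-alpha * q0 * price) - 1))

def o_max(q0, alpha, k=2.0, prices=None):
    """Peak expenditure (Omax), found by scanning a price grid."""
    prices = prices or [p / 100 for p in range(1, 2001)]
    return max(p * demand(p, q0, alpha, k) for p in prices)

q0, alpha = 10.0, 0.005           # intensity and rate of change in elasticity
print(demand(0.0, q0, alpha))     # at zero price, consumption equals Q0
peak = o_max(q0, alpha)
```

Derived indices such as Omax and essential value come from the fitted curve rather than raw responses, which is the distinction the abstract draws between "derived" and "observed" measures.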

  18. Development of PZT-excited stroboscopic shearography for full-field nondestructive evaluation.

    PubMed

    Asemani, Hamidreza; Park, Jinwoo; Lee, Jung-Ryul; Soltani, Nasser

    2017-05-01

    Nondestructive evaluation using shearography requires a way to stress the inspection target. This technique is able to directly measure the displacement gradient distribution on the object surface. Shearography visualizes internal structural damage as an anomalous pattern in the shearographic fringe pattern. A piezoelectric (PZT) excitation system is able to generate loadings in the vibrational, acoustic, and ultrasonic regimes. In this paper, we propose a PZT-excited stroboscopic shearography. The PZT excitation could generate vibrational loading, a stationary wavefield, and a nonstationary propagating wave to fulfill the external loading requirement of shearography. Sweeping the PZT excitation frequency, forming a standing wave, and applying a small shearing to suppress the incident wave were powerful, controllable tools for detecting defects. Sweeping the PZT excitation frequency enabled us to determine one of the defect-sensitive frequencies almost in real time. In addition, because the defect-sensitive frequencies always existed in wide and plural ranges, the risk of a defect being overlooked by the inspector could be alleviated. The results of evaluation using stroboscopic shearography showed that an artificial 20 mm-diameter defect could be visualized at excitation frequencies in the 5-8 kHz and 12.5-15.5 kHz ranges. This technique provided full-field, reliable, and repeatable inspection results. Additionally, the proposed method overcame an important drawback of time-averaged shearography: the need to identify the resonance vibration frequency sensitive to the defect.

  19. Application of computational fluid dynamics to closed-loop bioreactors: I. Characterization and simulation of fluid-flow pattern and oxygen transfer.

    PubMed

    Littleton, Helen X; Daigger, Glen T; Strom, Peter F

    2007-06-01

    A full-scale, closed-loop bioreactor (Orbal oxidation ditch, Envirex brand technologies, Siemens, Waukesha, Wisconsin), previously examined for simultaneous biological nutrient removal (SBNR), was further evaluated using computational fluid dynamics (CFD). A CFD model was developed first by imparting the known momentum (calculated by tank fluid velocity and mass flowrate) to the fluid at the aeration disc region. Oxygen source (aeration) and sink (consumption) terms were introduced, and statistical analysis was applied to the CFD simulation results. The CFD model was validated with field data obtained from a test tank and a full-scale tank. The results indicated that CFD could predict the mixing pattern in closed-loop bioreactors. This enables visualization of the flow pattern, both with regard to flow velocity and dissolved-oxygen-distribution profiles. The velocity and oxygen-distribution gradients suggested that the flow patterns produced by directional aeration in closed-loop bioreactors created a heterogeneous environment that can result in dissolved oxygen variations throughout the bioreactor. Distinct anaerobic zones on a macroenvironment scale were not observed, but it is clear that, when flow passed around curves, a secondary spiral flow was generated. This second current, along with the main recirculation flow, could create alternating anaerobic and aerobic conditions vertically and horizontally, which would allow SBNR to occur. Reliable SBNR performance in Orbal oxidation ditches may be a result, at least in part, of such a spatially varying environment.

  20. Co-speech hand movements during narrations: What is the impact of right vs. left hemisphere brain damage?

    PubMed

    Hogrefe, Katharina; Rein, Robert; Skomroch, Harald; Lausberg, Hedda

    2016-12-01

    Persons with brain damage show deviant patterns of co-speech hand movement behaviour in comparison to healthy speakers. It has been claimed by several authors that gesture and speech rely on a single production mechanism that depends on the same neurological substrate, while others claim that the two modalities are closely related but separate production channels. Thus, findings so far are contradictory, and there is a lack of studies that systematically analyse the full range of hand movements that accompany speech in the condition of brain damage. In the present study, we aimed to fill this gap by comparing hand movement behaviour in persons with unilateral damage to the left or right hemisphere and a matched control group of healthy persons. For hand movement coding, we applied Module I of NEUROGES, an objective and reliable analysis system that enables analysis of the full repertoire of hand movements independently of speech, which makes it specifically suited for the examination of persons with aphasia. The main results of our study show a decreased use of communicative conceptual gestures in persons with damage to the right hemisphere and an increased use of these gestures in persons with left brain damage and aphasia. These results not only suggest that the production of gesture and speech do not rely on the same neurological substrate but also underline the important role of right-hemisphere functioning in gesture production. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Aqueous dispersion polymerization: a new paradigm for in situ block copolymer self-assembly in concentrated solution.

    PubMed

    Sugihara, Shinji; Blanazs, Adam; Armes, Steven P; Ryan, Anthony J; Lewis, Andrew L

    2011-10-05

    Reversible addition-fragmentation chain transfer polymerization has been utilized to polymerize 2-hydroxypropyl methacrylate (HPMA) using a water-soluble macromolecular chain transfer agent based on poly(2-(methacryloyloxy)ethylphosphorylcholine) (PMPC). A detailed phase diagram has been elucidated for this aqueous dispersion polymerization formulation that reliably predicts the precise block compositions associated with well-defined particle morphologies (i.e., pure phases). Unlike the ad hoc approaches described in the literature, this strategy enables the facile, efficient, and reproducible preparation of diblock copolymer spheres, worms, or vesicles directly in concentrated aqueous solution. Chain extension of the highly hydrated zwitterionic PMPC block with HPMA in water at 70 °C produces a hydrophobic poly(2-hydroxypropyl methacrylate) (PHPMA) block, which drives in situ self-assembly to form well-defined diblock copolymer spheres, worms, or vesicles. The final particle morphology obtained at full monomer conversion is dictated by (i) the target degree of polymerization of the PHPMA block and (ii) the total solids concentration at which the HPMA polymerization is conducted. Moreover, if the targeted diblock copolymer composition corresponds to vesicle phase space at full monomer conversion, the in situ particle morphology evolves from spheres to worms to vesicles during the in situ polymerization of HPMA. In the case of PMPC(25)-PHPMA(400) particles, this systematic approach allows the direct, reproducible, and highly efficient preparation of either block copolymer vesicles at up to 25% solids or well-defined worms at 16-25% solids in aqueous solution.

  2. Big data analytics for the Future Circular Collider reliability and availability studies

    NASA Astrophysics Data System (ADS)

    Begy, Volodimir; Apollonio, Andrea; Gutleber, Johannes; Martin-Marquez, Manuel; Niemi, Arto; Penttinen, Jussi-Pekka; Rogova, Elena; Romero-Marin, Antonio; Sollander, Peter

    2017-10-01

    Responding to the European Strategy for Particle Physics update 2013, the Future Circular Collider study explores scenarios of circular frontier colliders for the post-LHC era. One branch of the study assesses industrial approaches to model and simulate the reliability and availability of the entire particle collider complex based on the continuous monitoring of CERN’s accelerator complex operation. The modelling is based on an in-depth study of the CERN injector chain and LHC, and is carried out as a cooperative effort with the HL-LHC project. The work so far has revealed that a major challenge is obtaining accelerator monitoring and operational data with sufficient quality, to automate the data quality annotation and calculation of reliability distribution functions for systems, subsystems and components where needed. A flexible data management and analytics environment that permits integrating the heterogeneous data sources, the domain-specific data quality management algorithms and the reliability modelling and simulation suite is a key enabler to complete this accelerator operation study. This paper describes the Big Data infrastructure and analytics ecosystem that has been put in operation at CERN, serving as the foundation on which reliability and availability analysis and simulations can be built. This contribution focuses on data infrastructure and data management aspects and presents case studies chosen for its validation.

  3. The reliability of Cavalier's principle of stereological method in determining volumes of enchondromas using the computerized tomography tools.

    PubMed

    Acar, Nihat; Karakasli, Ahmet; Karaarslan, Ahmet; Mas, Nermin Ng; Hapa, Onur

    2017-01-01

    Volumetric measurements of benign tumors enable surgeons to trace volume changes during follow-up periods. For a volumetric measurement technique to be applicable, it should be easy, rapid, and inexpensive and should carry high interobserver reliability. We aimed to assess the interobserver reliability of a volumetric measurement technique using the Cavalieri principle of stereological methods. The computerized tomography (CT) scans of 15 patients with a histopathologically confirmed diagnosis of enchondroma, with variant tumor sizes and localizations, were retrospectively reviewed for interobserver reliability evaluation of the volumetric stereological measurement with the Cavalieri principle, V = t × [(SU × d)/SL]² × ΣP. The volumes of the 15 tumors collected by the observers are demonstrated in Table 1. There was no statistically significant difference between the first and second observers (p = 0.000 and intraclass correlation coefficient = 0.970) or between the first and third observers (p = 0.000 and intraclass correlation coefficient = 0.981). No statistically significant difference was detected between the second and third observers (p = 0.000 and intraclass correlation coefficient = 0.976). The Cavalieri principle with the stereological technique using CT scans is an easy, rapid, and inexpensive technique for the volumetric evaluation of enchondromas with trustable interobserver reliability.
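The Cavalieri estimator in the formula above reduces to volume = slice spacing × real-world area per grid point × total point count. A minimal sketch with invented counts; the `scale` argument stands in for the (SU × d)/SL calibration term that converts grid spacing on the image to real-world units:

```python
# Cavalieri point-counting volume estimate:
# V = t * a_p * sum(P_i), where t is the slice spacing, a_p the
# real-world area represented by one grid point, and P_i the number
# of grid points hitting the lesion on slice i.

def cavalieri_volume(t_mm, d_mm, counts, scale=1.0):
    """Volume in mm^3. `scale` maps grid spacing to real-world units
    (the (SU * d)/SL term of the paper's formula; 1.0 = already calibrated)."""
    area_per_point = (scale * d_mm) ** 2
    return t_mm * area_per_point * sum(counts)

counts = [4, 9, 12, 10, 5]        # grid hits per CT slice (hypothetical)
vol = cavalieri_volume(t_mm=2.0, d_mm=3.0, counts=counts)
print(vol)  # 2.0 * 3.0**2 * 40 = 720.0 mm^3
```

Because each observer only counts grid hits, the method needs no segmentation software, which is why the abstract stresses that it is easy, rapid, and inexpensive.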

  4. Systems engineering principles for the design of biomedical signal processing systems.

    PubMed

    Faust, Oliver; Acharya U, Rajendra; Sputh, Bernhard H C; Min, Lim Choo

    2011-06-01

    Systems engineering aims to produce reliable systems which function according to specification. In this paper we follow a systems engineering approach to design a biomedical signal processing system. We discuss requirements capturing, specification definition, implementation, and testing of a classification system. These steps are executed as formally as possible. The requirements, which motivate the system design, are based on diabetes research. The main requirement for the classification system is to be a reliable component of a machine which controls diabetes. Reliability is very important, because uncontrolled diabetes may lead to hyperglycaemia (raised blood sugar) and over a period of time may cause serious damage to many of the body's systems, especially the nerves and blood vessels. In a second step, these requirements are refined into a formal CSP‖B model. The formal model expresses the system functionality in a clear and semantically strong way. Subsequently, the proven system model was translated into an implementation. This implementation was tested with use cases and failure cases. Formal modeling and automated model checking gave us deep insight into the system functionality. This insight enabled us to create a reliable and trustworthy implementation. With extensive tests we established trust in the reliability of the implementation. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  5. Response of the human vestibulo-ocular reflex system to constant angular acceleration. I. Theoretical study.

    PubMed

    Boumans, L J; Rodenburg, M; Maas, A J

    1983-01-01

    The response of the human vestibulo-ocular reflex system to a constant angular acceleration is calculated using a second-order model with an adaptation term. After first reaching a maximum, the peracceleratory response declines. When the stimulus duration is long, the decay is mainly governed by the adaptation time constant Ta, which enables this time constant to be reliably estimated. In the postacceleratory period of constant velocity there is a reversal in response. The magnitude and time course of the per- and postacceleratory responses are calculated for various values of the cupular time constant T1, the adaptation time constant Ta, and the stimulus duration, thus enabling their influence to be assessed.

  6. Neural-network-enhanced evolutionary algorithm applied to supported metal nanoparticles

    NASA Astrophysics Data System (ADS)

    Kolsbjerg, E. L.; Peterson, A. A.; Hammer, B.

    2018-05-01

    We show that approximate structural relaxation with a neural network enables orders of magnitude faster global optimization with an evolutionary algorithm in a density functional theory framework. The increased speed facilitates reliable identification of global minimum energy structures, as exemplified by our finding of a hollow Pt13 nanoparticle on an MgO support. We highlight the importance of knowing the correct structure when studying the catalytic reactivity of the different particle shapes. The computational speedup further enables screening of hundreds of different pathways in the search for optimum kinetic transitions between low-energy conformers and hence pushes the limits of the insight into thermal ensembles that can be obtained from theory.
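The surrogate-pre-screened evolutionary loop the abstract describes can be caricatured in one dimension: a cheap model ranks candidate structures so that only the most promising one receives the expensive evaluation. Everything below (the 1-D objective, the nearest-neighbour surrogate, the parameter values) is a hypothetical stand-in for the DFT and neural-network machinery of the paper:

```python
import random

# Surrogate-assisted evolutionary search, sketched in 1-D:
# propose several mutants per generation, let a cheap surrogate rank
# them, and spend the expensive evaluation only on the top-ranked one.

def expensive_energy(x):          # stand-in for a costly DFT relaxation
    return (x - 3.0) ** 2

def surrogate(x, archive):        # cheap nearest-neighbour guess from past evals
    return min(archive, key=lambda p: abs(p[0] - x))[1]

random.seed(0)
archive = [(x, expensive_energy(x)) for x in (-5.0, 0.0, 5.0)]
best = min(archive, key=lambda p: p[1])

for _ in range(200):
    # several cheap candidates, one expensive evaluation per generation
    mutants = [best[0] + random.gauss(0, 1) for _ in range(5)]
    pick = min(mutants, key=lambda m: surrogate(m, archive))
    e = expensive_energy(pick)
    archive.append((pick, e))
    if e < best[1]:
        best = (pick, e)

print(round(best[0], 1))  # converges near the minimum at x = 3
```

The speedup claimed in the abstract comes from exactly this asymmetry: five cheap surrogate queries replace four of the five expensive evaluations each generation, while the archive of true evaluations keeps the surrogate honest.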

  7. Lithography-induced limits to scaling of design quality

    NASA Astrophysics Data System (ADS)

    Kahng, Andrew B.

    2014-03-01

    Quality and value of an IC product are functions of power, performance, area, cost, and reliability. The forthcoming 2013 ITRS roadmap observes that while manufacturers continue to enable potential Moore's Law scaling of layout densities, the "realizable" scaling in competitive products has for some years been significantly less. In this paper, we consider aspects of the question, "To what extent should this scaling gap be blamed on lithography?" Non-ideal scaling of layout densities has been attributed to (i) layout restrictions associated with multi-patterning technologies (SADP, LELE, LELELE), as well as (ii) various ground rule and layout style choices that stem from misalignment, reliability, variability, device architecture, and electrical performance vs. power constraints. Certain impacts seem obvious, e.g., loss of 2D flexibility and new line-end placement constraints with SADP, or algorithmically intractable layout stitching and mask coloring formulations with LELELE. However, these impacts may well be outweighed by weaknesses in design methodology and tooling. Arguably, the industry has entered a new era in which many new factors - (i) standard-cell library architecture and layout guardbanding for automated place-and-route; (ii) performance model guardbanding and signoff analyses; (iii) physical design and manufacturing handoff algorithms spanning detailed placement and routing, stitching, and RET; and (iv) reliability guardbanding - all contribute, hand in hand with lithography, to a newly identified "design capability gap". How specific aspects of process and design enablements limit the scaling of design quality is a fundamental question whose answer must guide future R&D investment at the design-manufacturing interface.

  8. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE PAGES

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...

    2016-07-05

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validation of analytical techniques highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  9. Better Minds, Better Morals: A Procedural Guide to Better Judgment.

    PubMed

    Schaefer, G Owen; Savulescu, Julian

    2017-01-01

    Making more moral decisions - an uncontroversial goal, if ever there was one. But how to go about it? In this article, we offer a practical guide on ways to promote good judgment in our personal and professional lives. We will do this not by outlining what the good life consists in or which values we should accept. Rather, we offer a theory of procedural reliability: a set of dimensions of thought that are generally conducive to good moral reasoning. At the end of the day, we all have to decide for ourselves what is good and bad, right and wrong. The best way to ensure we make the right choices is to ensure the procedures we're employing are sound and reliable. We identify four broad categories of judgment to be targeted - cognitive, self-management, motivational and interpersonal. Specific factors within each category are further delineated, with a total of 14 factors to be discussed. For each, we will go through the reasons it generally leads to more morally reliable decision-making, how various thinkers have historically addressed the topic, and the insights of recent research that can offer new ways to promote good reasoning. The result is a wide-ranging survey that contains practical advice on how to make better choices. Finally, we relate this to the project of transhumanism and prudential decision-making. We argue that transhumans will employ better moral procedures like these. We also argue that the same virtues will enable us to take better control of our own lives, enhancing our responsibility and enabling us to lead better lives from the prudential perspective.

  10. Consideration of the use of origami-style solar panels for use on a terrestrial/orbital wireless power generation and transmission spacecraft

    NASA Astrophysics Data System (ADS)

    Holland, Alexander F.; Pearson, Jens; Lysford, Wilson; Straub, Jeremy

    2016-05-01

    This paper presents work on the development of Origami-style solar panels and their adaptation and efficacy for use in Earth orbit. It focuses on the enabling capability of this technology for the generation and transmission of power. The proposed approach provides increased collection (solar panel) and transmission (microwave radiation) surface area, as compared to other systems with similar mass and volume. An overview of the system is presented, including its pre-deployment configuration, the deployment process and its final configuration. Its utility for a wireless power transmission mission is then considered. An economic discussion is then presented to consider how the mass and volume efficiencies provided enable the system to approach target willingness-to-pay values that were presented and considered in prior work. A key consideration regarding the use of wireless power transfer in Earth orbit is the reliability of the technology. This has several different areas of consideration. It must reliably supply power to its customers (or they would have to have local generation capabilities sufficient for their needs, defeating the benefit of this system). It must also be shown to reliably supply power only to designated locations (and not inadvertently or otherwise beam power at other locations). The effect of the system design (including the Origami structure and deployment/rigidity mechanisms) is considered to assess whether the use of this technology may impair either of these key mission/safety-critical goals. This analysis is presented, along with a discussion of mitigation techniques for several prospective problems, before concluding with a discussion of future work.

  11. Development of a Kalman Filter in the Gauss-Helmert Model for Reliability Analysis in Orientation Determination with Smartphone Sensors

    PubMed Central

    Ettlinger, Andreas; Neuner, Hans; Burgess, Thomas

    2018-01-01

    The topic of indoor positioning and indoor navigation by using observations from smartphone sensors is very challenging as the determined trajectories can be subject to significant deviations compared to the route travelled in reality. Especially the calculation of the direction of movement is the critical part of pedestrian positioning approaches such as Pedestrian Dead Reckoning (“PDR”). Due to distinct systematic effects in filtered trajectories, it can be assumed that there are systematic deviations present in the observations from smartphone sensors. This article has two aims: one is to enable the estimation of partial redundancies for each observation as well as for observation groups. Partial redundancies are a measure for the reliability indicating how well systematic deviations can be detected in single observations used in PDR. The second aim is to analyze the behavior of partial redundancy by modifying the stochastic and functional model of the Kalman filter. The equations relating the observations to the orientation are condition equations, which do not exhibit the typical structure of the Gauss-Markov model (“GMM”), wherein the observations are linear and can be formulated as functions of the states. To calculate and analyze the partial redundancy of the observations from smartphone-sensors used in PDR, the system equation and the measurement equation of a Kalman filter as well as the redundancy matrix need to be derived in the Gauss-Helmert model (“GHM”). These derivations are introduced in this article and lead to a novel Kalman filter structure based on condition equations, enabling reliability assessment of each observation. PMID:29385076

  12. Development of a Kalman Filter in the Gauss-Helmert Model for Reliability Analysis in Orientation Determination with Smartphone Sensors.

    PubMed

    Ettlinger, Andreas; Neuner, Hans; Burgess, Thomas

    2018-01-31

    The topic of indoor positioning and indoor navigation by using observations from smartphone sensors is very challenging as the determined trajectories can be subject to significant deviations compared to the route travelled in reality. Especially the calculation of the direction of movement is the critical part of pedestrian positioning approaches such as Pedestrian Dead Reckoning ("PDR"). Due to distinct systematic effects in filtered trajectories, it can be assumed that there are systematic deviations present in the observations from smartphone sensors. This article has two aims: one is to enable the estimation of partial redundancies for each observation as well as for observation groups. Partial redundancies are a measure for the reliability indicating how well systematic deviations can be detected in single observations used in PDR. The second aim is to analyze the behavior of partial redundancy by modifying the stochastic and functional model of the Kalman filter. The equations relating the observations to the orientation are condition equations, which do not exhibit the typical structure of the Gauss-Markov model ("GMM"), wherein the observations are linear and can be formulated as functions of the states. To calculate and analyze the partial redundancy of the observations from smartphone-sensors used in PDR, the system equation and the measurement equation of a Kalman filter as well as the redundancy matrix need to be derived in the Gauss-Helmert model ("GHM"). These derivations are introduced in this article and lead to a novel Kalman filter structure based on condition equations, enabling reliability assessment of each observation.
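    The redundancy-matrix idea behind partial redundancy can be illustrated in the simpler least-squares (Gauss-Markov) setting. The sketch below is a deliberately simplified assumption (one unknown parameter, diagonal weights), not the paper's Gauss-Helmert Kalman filter derivation:

```python
# Partial redundancies r_i from the redundancy matrix R = I - A (A^T P A)^{-1} A^T P.
# For n observations of a single unknown mean with weights p_i, the diagonal
# entries reduce to r_i = 1 - p_i / sum(p).

def partial_redundancies(weights):
    """r_i for the one-parameter mean model; higher r_i means a systematic
    deviation in observation i is easier to detect."""
    total = sum(weights)
    return [1.0 - p / total for p in weights]

# Third observation is twice as precise (weight 2), so it is harder to check.
r = partial_redundancies([1.0, 1.0, 2.0])
print(r)          # [0.75, 0.75, 0.5]
print(sum(r))     # 2.0 = n - u (total redundancy: 3 observations, 1 unknown)
```

Each partial redundancy lies between 0 and 1, and they sum to the total redundancy n - u; the paper derives the analogous quantities for condition equations in the Gauss-Helmert model.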

  13. GPS/Optical/Inertial Integration for 3D Navigation Using Multi-Copter Platforms

    NASA Technical Reports Server (NTRS)

    Dill, Evan T.; Young, Steven D.; Uijt De Haag, Maarten

    2017-01-01

    In concert with the continued advancement of a UAS traffic management system (UTM), the proposed uses of autonomous unmanned aerial systems (UAS) have become more prevalent in both the public and private sectors. To facilitate this anticipated growth, a reliable three-dimensional (3D) positioning, navigation, and mapping (PNM) capability will be required to enable operation of these platforms in challenging environments where global navigation satellite systems (GNSS) may not be available continuously, especially when the platform's mission requires maneuvering through different and difficult environments like outdoor open-sky, outdoor under foliage, outdoor-urban, and indoor, and may include transitions between these environments. There may not be a single method to solve the PNM problem for all environments. The research presented in this paper is a subset of a broader research effort, described in [1]. The research is focused on combining data from dissimilar sensor technologies to create an integrated navigation and mapping method that can enable reliable operation in both an outdoor and structured indoor environment. The integrated navigation and mapping design utilizes a Global Positioning System (GPS) receiver, an Inertial Measurement Unit (IMU), a monocular digital camera, and three short- to medium-range laser scanners. This paper describes specifically the techniques necessary to effectively integrate the monocular camera data within the established mechanization. To evaluate the developed algorithms, a hexacopter was built, equipped with the discussed sensors, and both hand-carried and flown through representative environments. This paper highlights the effect that the monocular camera has on the aforementioned sensor integration scheme's reliability, accuracy and availability.

  14. Web-Enabled Mechanistic Case Diagramming: A Novel Tool for Assessing Students' Ability to Integrate Foundational and Clinical Sciences.

    PubMed

    Ferguson, Kristi J; Kreiter, Clarence D; Haugen, Thomas H; Dee, Fred R

    2018-02-20

    As medical schools move from discipline-based courses to more integrated approaches, identifying assessment tools that parallel this change is an important goal. The authors describe the use of test item statistics to assess the reliability and validity of web-enabled mechanistic case diagrams (MCDs) as a potential tool to assess students' ability to integrate basic science and clinical information. Students review a narrative clinical case and construct an MCD using items provided by the case author. Students identify the relationships among underlying risk factors, etiology, pathogenesis and pathophysiology, and the patients' signs and symptoms. They receive one point for each correctly-identified link. In 2014-15 and 2015-16, case diagrams were implemented in consecutive classes of 150 medical students. The alpha reliability coefficient for the overall score, constructed using each student's mean proportion correct across all cases, was 0.82. Discrimination indices for each of the case scores with the overall score ranged from 0.23 to 0.51. In a G study using those students with complete data (n = 251) on all 16 cases, 10% of the variance was true score variance, and systematic case variance was large. Using 16 cases generated a G coefficient (relative score reliability) equal to .72 and a Phi equal to .65. The next phase of the project will involve deploying MCDs in higher-stakes settings to determine whether similar results can be achieved. Further analyses will determine whether these assessments correlate with other measures of higher-order thinking skills.
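    The alpha reliability coefficient reported above is Cronbach's alpha; a minimal sketch of its computation follows, using invented scores rather than the study's data:

```python
# Cronbach's alpha: internal-consistency reliability across k items (here, MCD
# cases), computed from per-student item scores. Illustrative data only.
def cronbach_alpha(scores):
    """scores: one row per student, one column per case/item."""
    k = len(scores[0])                       # number of items
    def pvar(xs):                            # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = [pvar([row[i] for row in scores]) for i in range(k)]
    total_var = pvar([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Perfectly consistent items yield alpha = 1.0
print(cronbach_alpha([[1, 1], [0, 0], [1, 1], [0, 0]]))
```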

  15. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validation of analytical techniques highlighted. The majority of analytical standardization studies to-date has tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination ofmore » hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to -determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.« less

  16. The Evaluation Method of the Lightning Strike on Transmission Lines Aiming at Power Grid Reliability

    NASA Astrophysics Data System (ADS)

    Wen, Jianfeng; Wu, Jianwei; Huang, Liandong; Geng, Yinan; Yu, zhanqing

    2018-01-01

    Lightning protection of power systems focuses on reducing the flashover rate, distinguishing lines only by voltage level, without considering the functional differences between transmission lines or analyzing the effect on the reliability of the power grid. As a result, lightning protection designs tend to be more than adequate for general transmission lines yet insufficient for key lines. To solve this problem, an analysis method for lightning strikes on transmission lines with respect to power grid reliability is given. Full-wave process theory is used to analyze lightning back striking; a leader propagation model is used to describe the process of shielding failure of transmission lines. An index of power grid reliability is introduced, and the effect of transmission line faults on the reliability of the power system is discussed in detail.

  17. Transformational Teaching: Connecting the Full-Range Leadership Theory and Graduate Teaching Practice

    ERIC Educational Resources Information Center

    Kim, Won J.

    2012-01-01

    Reliable measurements for effective teaching are lacking. In contrast, some theories of leadership (particularly transformational leadership) have been tested and found to have efficacy in a variety of organizational settings. In this study, the full-range leadership theory, which includes transformational leadership, was applied to the…

  18. A Quantitative Socio-hydrological Characterization of Water Security in Large-Scale Irrigation Systems

    NASA Astrophysics Data System (ADS)

    Siddiqi, A.; Muhammad, A.; Wescoat, J. L., Jr.

    2017-12-01

    Large-scale, legacy canal systems, such as the irrigation infrastructure in the Indus Basin in Punjab, Pakistan, have been primarily conceived, constructed, and operated with a techno-centric approach. The emerging socio-hydrological approaches provide a new lens for studying such systems to potentially identify fresh insights for addressing contemporary challenges of water security. In this work, using the partial definition of water security as "the reliable availability of an acceptable quantity and quality of water", supply reliability is construed as a partial measure of water security in irrigation systems. A set of metrics are used to quantitatively study reliability of surface supply in the canal systems of Punjab, Pakistan using an extensive dataset of 10-daily surface water deliveries over a decade (2007-2016) and of high frequency (10-minute) flow measurements over one year. The reliability quantification is based on comparison of actual deliveries and entitlements, which are a combination of hydrological and social constructs. The socio-hydrological lens highlights critical issues of how flows are measured, monitored, perceived, and experienced from the perspective of operators (government officials) and users (farmers). The analysis reveals varying levels of reliability (and by extension security) of supply when data is examined across multiple temporal and spatial scales. The results shed new light on evolution of water security (as partially measured by supply reliability) for surface irrigation in the Punjab province of Pakistan and demonstrate that "information security" (defined as reliable availability of sufficiently detailed data) is vital for enabling water security. It is found that forecasting and management (that are social processes) lead to differences between entitlements and actual deliveries, and there is significant potential to positively affect supply reliability through interventions in the social realm.
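    A delivery-versus-entitlement comparison of the kind described above can be sketched as the fraction of periods in which actual deliveries fall within a tolerance of the entitlement. The function name, data, and 10% threshold below are illustrative assumptions, not the paper's exact metric:

```python
# Fraction of delivery periods within `tolerance` (relative) of the entitlement.
def supply_reliability(actual, entitled, tolerance=0.10):
    ok = sum(1 for a, e in zip(actual, entitled)
             if e > 0 and abs(a - e) / e <= tolerance)
    return ok / len(actual)

# Four 10-daily periods: deliveries vs. a constant entitlement of 100 units.
# 95 and 110 are within 10%, 80 is not, 100 matches exactly.
print(supply_reliability([95, 100, 80, 110], [100, 100, 100, 100]))
```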

  19. A Methodology for the Development of a Reliability Database for an Advanced Reactor Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of a reliability database (RDB) methodology to determine applicable reliability data for inclusion in the quantification of the PRA. The RDB method developed during this project seeks to satisfy the requirements of the Data Analysis element of the ASME/ANS Non-LWR PRA standard. The RDB methodology utilizes a relevancy test to examine reliability data and determine whether it is appropriate to include as part of the reliability database for the PRA. The relevancy test compares three component properties to establish the level of similarity to components examined as part of the PRA. These properties include the component function, the component failure modes, and the environment/boundary conditions of the component. The relevancy test is used to gauge the quality of data found in a variety of sources, such as advanced reactor-specific databases, non-advanced reactor nuclear databases, and non-nuclear databases. The RDB also establishes the integration of expert judgment or separate reliability analysis with past reliability data. This paper provides details on the RDB methodology, and includes an example application of the RDB methodology for determining the reliability of the intermediate heat exchanger of a sodium fast reactor. The example explores a variety of reliability data sources, and assesses their applicability for the PRA of interest through the use of the relevancy test.
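    A three-property relevancy test of the kind described above might be sketched as a simple scoring function. The field names, weighting, and scoring rule below are hypothetical illustrations, not the RDB methodology's actual formulation:

```python
# Score a candidate reliability-data source against the PRA target component
# on the three properties named above: function, failure modes, and
# environment/boundary conditions. All names and weights are assumptions.
def relevancy(candidate, target):
    score = 0.0
    if candidate["function"] == target["function"]:
        score += 1.0
    shared = set(candidate["failure_modes"]) & set(target["failure_modes"])
    score += len(shared) / len(set(target["failure_modes"]))
    if candidate["environment"] == target["environment"]:
        score += 1.0
    return score / 3.0                 # 1.0 = fully relevant, 0.0 = unrelated

# Hypothetical intermediate heat exchanger of a sodium fast reactor:
ihx = {"function": "heat exchange",
       "failure_modes": ["tube leak", "blockage"],
       "environment": "sodium"}
print(relevancy(ihx, ihx))             # identical component scores 1.0
```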

  20. High-Reliability Health Care: Getting There from Here

    PubMed Central

    Chassin, Mark R; Loeb, Jerod M

    2013-01-01

    Context Despite serious and widespread efforts to improve the quality of health care, many patients still suffer preventable harm every day. Hospitals find improvement difficult to sustain, and they suffer “project fatigue” because so many problems need attention. No hospitals or health systems have achieved consistent excellence throughout their institutions. High-reliability science is the study of organizations in industries like commercial aviation and nuclear power that operate under hazardous conditions while maintaining safety levels that are far better than those of health care. Adapting and applying the lessons of this science to health care offer the promise of enabling hospitals to reach levels of quality and safety that are comparable to those of the best high-reliability organizations. Methods We combined the Joint Commission's knowledge of health care organizations with knowledge from the published literature and from experts in high-reliability industries and leading safety scholars outside health care. We developed a conceptual and practical framework for assessing hospitals’ readiness for and progress toward high reliability. By iterative testing with hospital leaders, we refined the framework and, for each of its fourteen components, defined stages of maturity through which we believe hospitals must pass to reach high reliability. Findings We discovered that the ways that high-reliability organizations generate and maintain high levels of safety cannot be directly applied to today's hospitals. We defined a series of incremental changes that hospitals should undertake to progress toward high reliability. These changes involve the leadership's commitment to achieving zero patient harm, a fully functional culture of safety throughout the organization, and the widespread deployment of highly effective process improvement tools. 
Conclusions Hospitals can make substantial progress toward high reliability by undertaking several specific organizational change initiatives. Further research and practical experience will be necessary to determine the validity and effectiveness of this framework for high-reliability health care. PMID:24028696

  1. High-reliability health care: getting there from here.

    PubMed

    Chassin, Mark R; Loeb, Jerod M

    2013-09-01

    Despite serious and widespread efforts to improve the quality of health care, many patients still suffer preventable harm every day. Hospitals find improvement difficult to sustain, and they suffer "project fatigue" because so many problems need attention. No hospitals or health systems have achieved consistent excellence throughout their institutions. High-reliability science is the study of organizations in industries like commercial aviation and nuclear power that operate under hazardous conditions while maintaining safety levels that are far better than those of health care. Adapting and applying the lessons of this science to health care offer the promise of enabling hospitals to reach levels of quality and safety that are comparable to those of the best high-reliability organizations. We combined the Joint Commission's knowledge of health care organizations with knowledge from the published literature and from experts in high-reliability industries and leading safety scholars outside health care. We developed a conceptual and practical framework for assessing hospitals' readiness for and progress toward high reliability. By iterative testing with hospital leaders, we refined the framework and, for each of its fourteen components, defined stages of maturity through which we believe hospitals must pass to reach high reliability. We discovered that the ways that high-reliability organizations generate and maintain high levels of safety cannot be directly applied to today's hospitals. We defined a series of incremental changes that hospitals should undertake to progress toward high reliability. These changes involve the leadership's commitment to achieving zero patient harm, a fully functional culture of safety throughout the organization, and the widespread deployment of highly effective process improvement tools. Hospitals can make substantial progress toward high reliability by undertaking several specific organizational change initiatives. 
Further research and practical experience will be necessary to determine the validity and effectiveness of this framework for high-reliability health care. © 2013 The Authors. The Milbank Quarterly published by Wiley Periodicals Inc. on behalf of Milbank Memorial Fund.

  2. Evolution of NADPH Oxidase Inhibitors: Selectivity and Mechanisms for Target Engagement.

    PubMed

    Altenhöfer, Sebastian; Radermacher, Kim A; Kleikers, Pamela W M; Wingler, Kirstin; Schmidt, Harald H H W

    2015-08-10

    Oxidative stress, an excess of reactive oxygen species (ROS) production versus consumption, may be involved in the pathogenesis of different diseases. The only known enzymes solely dedicated to ROS generation are nicotinamide adenine dinucleotide phosphate (NADPH) oxidases with their catalytic subunits (NOX). After the clinical failure of most antioxidant trials, NOX inhibitors are the most promising therapeutic option for diseases associated with oxidative stress. Historical NADPH oxidase inhibitors, apocynin and diphenylene iodonium, are nonspecific and not isoform-selective. Novel NOX inhibitors stemming from rational drug discovery approaches, for example, GKT137831, ML171, and VAS2870, show improved specificity for NADPH oxidases and moderate NOX isoform selectivity. Along with NOX2 docking sequence (NOX2ds)-tat, a peptide-based inhibitor, the use of these novel small molecules in animal models has provided preliminary in vivo evidence for a pathophysiological role of specific NOX isoforms. Here, we discuss whether novel NOX inhibitors enable reliable validation of NOX isoforms' pathological roles and whether this knowledge supports translation into pharmacological applications. Modern NOX inhibitors have increased the evidence for pathophysiological roles of NADPH oxidases. However, in comparison to knockout mouse models, NOX inhibitors have limited isoform selectivity. Thus, their use does not enable clear statements on the involvement of individual NOX isoforms in a given disease. The development of isoform-selective NOX inhibitors and biologicals will enable reliable validation of specific NOX isoforms in disease models other than the mouse. Finally, GKT137831, the first NOX inhibitor in clinical development, is poised to provide proof of principle for the clinical potential of NOX inhibition.

  3. Assessing cognitive functioning in females with Rett syndrome by eye-tracking methodology.

    PubMed

    Ahonniska-Assa, Jaana; Polack, Orli; Saraf, Einat; Wine, Judy; Silberg, Tamar; Nissenkorn, Andreea; Ben-Zeev, Bruria

    2018-01-01

    While many individuals with severe developmental impairments learn to communicate with augmentative and alternative communication (AAC) devices, a significant number of individuals show major difficulties in the effective use of AAC. Recent technological innovations, i.e., eye-tracking technology (ETT), aim to improve the transparency of communication and may also enable a more valid cognitive assessment. To investigate whether ETT in forced-choice tasks can enable children with very severe motor and speech impairments to respond consistently, allowing a more reliable evaluation of their language comprehension. Participants were 17 girls with Rett syndrome (M = 6:06 years). Their ability to respond by eye gaze was first practiced with computer games using ETT. Afterwards, their receptive vocabulary was assessed using the Peabody Picture Vocabulary Test-4 (PPVT-4). Target words were orally presented and participants responded by focusing their eyes on the preferred picture. Remarkable differences between the participants in receptive vocabulary were demonstrated using ETT. The verbal comprehension abilities of 32% of the participants ranged from low-average to mild cognitive impairment, and the other 68% of the participants showed moderate to severe impairment. Young age at the time of assessment was positively correlated with higher receptive vocabulary. The use of ETT seems to make the communicational signals of children with severe motor and communication impairments more easily understood. Early practice of ETT may improve the quality of communication and enable more reliable conclusions in learning and assessment sessions. Copyright © 2017 European Paediatric Neurology Society. Published by Elsevier Ltd. All rights reserved.

  4. Eureka-DMA: an easy-to-operate graphical user interface for fast comprehensive investigation and analysis of DNA microarray data.

    PubMed

    Abelson, Sagi

    2014-02-24

    In the past decade, the field of molecular biology has become increasingly quantitative; rapid development of new technologies enables researchers to investigate fundamental questions that were once impossible to address, and to do so quickly and efficiently. Among these technologies, DNA microarray provides methodology for many applications such as gene discovery, disease diagnosis, drug development and toxicological research, and it has been used increasingly since it first emerged. Multiple tools have been developed to interpret the high-throughput data produced by microarrays. However, less consideration has often been given to the fact that an extensive and effective interpretation requires close interplay between the bioinformaticians who analyze the data and the biologists who generate it. To bridge this gap and to simplify the usability of such tools we developed Eureka-DMA - an easy-to-operate graphical user interface that allows bioinformaticians and bench-biologists alike to initiate analyses as well as to investigate the data produced by DNA microarrays. In this paper, we describe Eureka-DMA, a user-friendly software package that comprises a set of methods for the interpretation of gene expression arrays. Eureka-DMA includes methods for the identification of genes with differential expression between conditions; it searches for enriched pathways and gene ontology terms and combines them with other relevant features. It thus enables full understanding of the data for subsequent testing as well as for generating new hypotheses. Here we show two analyses, demonstrating examples of how Eureka-DMA can be used and its capability to produce relevant and reliable results. We have integrated several elementary expression analysis tools to provide a unified interface for their implementation.
Eureka-DMA's simple graphical user interface provides effective and efficient framework in which the investigator has the full set of tools for the visualization and interpretation of the data with the option of exporting the analysis results for later use in other platforms. Eureka-DMA is freely available for academic users and can be downloaded at http://blue-meduza.org/Eureka-DMA.
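    The per-gene differential-expression step such a tool automates can be sketched as a two-sample t-test per gene across replicate arrays. A minimal, hedged illustration follows; the synthetic data and the 0.05 cutoff are assumptions for demonstration only, not Eureka-DMA's actual statistics or defaults.

```python
import numpy as np
from scipy.stats import ttest_ind

# Synthetic expression matrix: 100 genes measured on 4 control and
# 4 treated arrays; the first 10 genes are artificially up-regulated.
rng = np.random.default_rng(0)
control = rng.normal(5.0, 0.5, size=(100, 4))  # 100 genes x 4 arrays
treated = control.copy()
treated[:10] += 2.0                            # spike 10 genes upward

# One t-test per gene (row), comparing treated vs. control replicates.
t_stat, p_val = ttest_ind(treated, control, axis=1)
diff_genes = np.flatnonzero(p_val < 0.05)      # candidate DE genes
```

    Real microarray analysis adds normalization and multiple-testing correction on top of this raw per-gene test.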

  5. Hydrogen Transport to Mars Enables the Sabatier/Electrolysis Process

    NASA Technical Reports Server (NTRS)

    Mueller, P. J.; Rapp, D.

    1997-01-01

    The Sabatier/Electrolysis (S/E) process is an attractive approach to in situ propellant production (ISPP), and a breadboard demonstration of this process at Lockheed Martin Astronautics, funded by JPL, performed very well, with high conversion efficiency and reliable diurnal operation. There is a net usage of hydrogen in the S/E process, and this has been the principal problem for this approach to ISPP.

  6. Acquisition Community Team Dynamics: The Tuckman Model vs. the DAU Model

    DTIC Science & Technology

    2007-04-30

    courses. These student teams are used to enable the generation of more complex products and to prepare the students for the ...requirement for stage discreteness was met, I developed a stage-separation test that, when applied to the data representing the experience of a... test the reliability, and validate an improved questionnaire instrument that: – Redefines "Storming" with new storming questions – Less focused

  7. Health Evaluation of Experimental Laboratory Mice.

    PubMed

    Burkholder, Tanya; Foltz, Charmaine; Karlsson, Eleanor; Linton, C Garry; Smith, Joanne M

    2012-06-01

    Good science and good animal care go hand in hand. A sick or distressed animal does not produce the reliable results that a healthy and unstressed animal produces. This unit describes the essentials of assessing mouse health, colony health surveillance, common conditions, and determination of appropriate endpoints. Understanding the health and well-being of the mice used in research enables the investigator to optimize research results and animal care.

  8. "My Son Is Reliable": Young Drivers' Parents' Optimism and Views on the Norms of Parental Involvement in Youth Driving

    ERIC Educational Resources Information Center

    Guttman, Nurit

    2013-01-01

    The high crash rates among teenage drivers are of great concern across nations. Parents' involvement is known to help increase their young drivers' driving safety. In particular, parents can place restrictions on their son's/daughter's driving (e.g., restrict night time driving), which can enable the young driver to gain driving experience in…

  9. Case-mix adjustment and enabled reporting of the health care experiences of adults with disabilities.

    PubMed

    Palsbo, Susan E; Diao, Guoqing; Palsbo, Gregory A; Tang, Liansheng; Rosenberger, William F; Mastal, Margaret F

    2010-09-01

    To develop activity limitation clusters for case-mix adjustment of health care ratings and as a population profiler, and to develop a cognitively accessible report of statistically reliable quality and access measures comparing the health care experiences of adults with and without disabilities, within and across health delivery organizations. Observational study. Three California Medicaid health care organizations. Adults (N = 1086) of working age enrolled for at least 1 year in Medicaid because of disability. Not applicable. Principal components analysis created 4 clusters of activity limitations that we used to characterize case mix. We identified and calculated 28 quality measures using responses from a proposed enabled version of the Consumer Assessment of Healthcare Providers and Systems (CAHPS) survey. We calculated scores for overall care as the weighted mean of the case-mix adjusted ratings. Disability caused a greater bias on health plan ratings and specialist ratings than did demographic factors. Proxy respondents rated care the same as self-respondents. Telephone and mail administration were equivalent for service reports, but telephone respondents tended to offer more positive global ratings. Plan-level reliability estimates for new composites on shared decision making and advice on healthy living are .79 and .87, respectively. Plan-level reliability estimates for a new composite measure on family planning did not discriminate between health plans because respondents rated all health plans poorly. Approximately 125 respondents per site are necessary to detect group differences. Self-reported activity limitations incorporating standard questions from the American Community Survey can be used to create a disability case-mix index and to construct profiles of a population's activity limitations. 
The enabled comparative report, which we call the Assessment of Health Plans and Providers by People with Activity Limitations, is more cognitively accessible than typical CAHPS report templates for state Medicaid plans. The CAHPS Medicaid reporting tools may provide misleading ratings of health plan and physician quality by people with disabilities because the mean ratings do not account for systematic biases associated with disability. More testing on larger populations would help to quantify the strength of various reporting biases.
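    The overall-care score described above, a weighted mean of case-mix adjusted ratings, can be illustrated by direct standardization against a pooled cluster mix. The cluster names, shares, and ratings below are invented for illustration; they are not the study's four principal-component activity-limitation clusters.

```python
# Direct standardization: a site's overall score is the mean of its
# per-cluster ratings, weighted by a common reference mix of clusters,
# so sites with different case mixes become comparable.
def adjusted_overall_score(site_cluster_means, reference_mix):
    """Weighted mean of per-cluster ratings under a pooled cluster mix."""
    return sum(reference_mix[c] * site_cluster_means[c] for c in reference_mix)

site = {"mobility": 7.0, "self_care": 8.0, "communication": 6.0}
pooled_mix = {"mobility": 0.5, "self_care": 0.3, "communication": 0.2}
score = adjusted_overall_score(site, pooled_mix)  # 0.5*7 + 0.3*8 + 0.2*6
```

    The same mechanics extend to any number of clusters, provided the reference shares sum to one.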

  10. High reliability level on single-mode 980nm-1060 nm diode lasers for telecommunication and industrial applications

    NASA Astrophysics Data System (ADS)

    Van de Casteele, J.; Bettiati, M.; Laruelle, F.; Cargemel, V.; Pagnod-Rossiaux, P.; Garabedian, P.; Raymond, L.; Laffitte, D.; Fromy, S.; Chambonnet, D.; Hirtz, J. P.

    2008-02-01

    We demonstrate very high reliability levels on 980-1060 nm high-power single-mode lasers through multi-cell tests. First, we show how our chip design and technology enable high reliability levels. Then, for the reliability demonstration, we aged 758 devices for 9500 hours across 6 cells at high current (0.8 A-1.2 A) and high submount temperature (65°C-105°C). Sudden catastrophic failure is the main degradation mechanism observed. A statistical failure-rate model gives an Arrhenius thermal activation energy of 0.51 eV and a power-law forward-current acceleration factor of 5.9. For high-power submarine applications (360 mW pump module output optical power), this model yields a failure rate as low as 9 FIT at 13°C, while ultra-high-power terrestrial modules (600 mW) lie below 220 FIT at 25°C. Wear-out phenomena are observed only at very high current levels, with no reliability impact below 1.1 A. For the 1060 nm chip, step-stress tests were performed and a set of devices was aged for more than 2000 hours under different stress conditions. First results are consistent with the 980 nm product, with an estimated MTTF of more than 100,000 hours. These reliability and performance features of 980-1060 nm laser diodes will make high-power single-mode emitters the best choice for a number of telecommunication and industrial applications in the next few years.
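    The failure-rate model quoted above combines an Arrhenius thermal term (Ea = 0.51 eV) with a power-law forward-current term (exponent 5.9). A minimal sketch of the resulting acceleration factor between stress and use conditions follows; the function name and the 1.0 A / 85 °C stress point and 0.5 A / 13 °C use point are illustrative assumptions, not values from the paper.

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(i_test, i_use, t_test_c, t_use_c, ea=0.51, n=5.9):
    """Acceleration factor between stress and use conditions, combining
    an Arrhenius thermal term (ea in eV) with a power-law
    forward-current term with exponent n."""
    t_test = t_test_c + 273.15  # degrees C -> K
    t_use = t_use_c + 273.15
    thermal = math.exp((ea / K_B) * (1.0 / t_use - 1.0 / t_test))
    current = (i_test / i_use) ** n
    return thermal * current

# Illustrative only: scale a stress test at 1.0 A / 85 C down to
# assumed submarine use conditions of 0.5 A / 13 C.
af = acceleration_factor(i_test=1.0, i_use=0.5, t_test_c=85.0, t_use_c=13.0)
```

    A failure rate measured under stress is divided by this factor to project the FIT at use conditions, which is how accelerated multi-cell aging data support the low FIT figures quoted in the abstract.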

  11. Evolution of Safety Analysis to Support New Exploration Missions

    NASA Technical Reports Server (NTRS)

    Thrasher, Chard W.

    2008-01-01

    NASA is currently developing the Ares I launch vehicle as a key component of the Constellation program which will provide safe and reliable transportation to the International Space Station, back to the moon, and later to Mars. The risks and costs of the Ares I must be significantly lowered, as compared to other manned launch vehicles, to enable the continuation of space exploration. It is essential that safety be significantly improved, and cost-effectively incorporated into the design process. This paper justifies early and effective safety analysis of complex space systems. Interactions and dependences between design, logistics, modeling, reliability, and safety engineers will be discussed to illustrate methods to lower cost, reduce design cycles and lessen the likelihood of catastrophic events.

  12. Sensor Systems for Prognostics and Health Management

    PubMed Central

    Cheng, Shunfeng; Azarian, Michael H.; Pecht, Michael G.

    2010-01-01

    Prognostics and health management (PHM) is an enabling discipline consisting of technologies and methods to assess the reliability of a product in its actual life cycle conditions to determine the advent of failure and mitigate system risk. Sensor systems are needed for PHM to monitor environmental, operational, and performance-related characteristics. The gathered data can be analyzed to assess product health and predict remaining life. In this paper, the considerations for sensor system selection for PHM applications, including the parameters to be measured, the performance needs, the electrical and physical attributes, reliability, and cost of the sensor system, are discussed. The state-of-the-art sensor systems for PHM and the emerging trends in technologies of sensor systems for PHM are presented. PMID:22219686

  13. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates

    USGS Publications Warehouse

    Hobbs, Michael T.; Brehme, Cheryl S.

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.

  14. Ferroelectric nanoparticle-embedded sponge structure triboelectric generators

    NASA Astrophysics Data System (ADS)

    Park, Daehoon; Shin, Sung-Ho; Yoon, Ick-Jae; Nah, Junghyo

    2018-05-01

    We report high-performance triboelectric nanogenerators (TENGs) employing ferroelectric nanoparticles (NPs) embedded in a sponge structure. The ferroelectric BaTiO3 NPs inside the sponge structure play an important role in increasing surface charge density by polarized spontaneous dipoles, enabling the packaging of TENGs even with a minimal separation gap. Since the friction surfaces are encapsulated in the packaged device structure, it suffers negligible performance degradation even at a high relative humidity of 80%. The TENGs also demonstrated excellent mechanical durability due to the elasticity and flexibility of the sponge structure. Consequently, the TENGs can reliably harvest energy even under harsh conditions. The approach introduced here is a simple, effective, and reliable way to fabricate compact and packaged TENGs for potential applications in wearable energy-harvesting devices.

  15. Sensor systems for prognostics and health management.

    PubMed

    Cheng, Shunfeng; Azarian, Michael H; Pecht, Michael G

    2010-01-01

    Prognostics and health management (PHM) is an enabling discipline consisting of technologies and methods to assess the reliability of a product in its actual life cycle conditions to determine the advent of failure and mitigate system risk. Sensor systems are needed for PHM to monitor environmental, operational, and performance-related characteristics. The gathered data can be analyzed to assess product health and predict remaining life. In this paper, the considerations for sensor system selection for PHM applications, including the parameters to be measured, the performance needs, the electrical and physical attributes, reliability, and cost of the sensor system, are discussed. The state-of-the-art sensor systems for PHM and the emerging trends in technologies of sensor systems for PHM are presented.

  16. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates.

    PubMed

    Hobbs, Michael T; Brehme, Cheryl S

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.

  17. Development of highly efficient laser bars emitting at around 1060 nm for medical applications

    NASA Astrophysics Data System (ADS)

    Pietrzak, Agnieszka; Zorn, Martin; Meusel, Jens; Huelsewede, Ralf; Sebastian, Juergen

    2018-02-01

    An overview is presented of recent progress in the development of high-power laser bars at wavelengths around 1060 nm. The development is focused on highly efficient and reliable laser performance under pulsed operation for medical applications. The epitaxial structure and lateral layout of the laser bars were tailored to meet the application requirements. Reliable operation at peak powers of 350 W and 500 W is demonstrated from laser bars with fill factor FF=75% and resonator lengths of 1.5 mm and 2.0 mm, respectively. Moreover, 60 W at a current of 65 A with a lifetime of <10,000 h is presented. The power scaling with fill factor enables a cost reduction ($/W) of up to 35%.

  18. Quantitative nondestructive evaluation: Requirements for tomorrow's reliability

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.

    1991-01-01

    Quantitative Nondestructive Evaluation (QNDE) is the technology of measurement, analysis, and prediction of the state of material/structural systems for safety, reliability, and mission assurance. QNDE has impact on everyday life from the cars we drive, the planes we fly, the buildings we work or live in, literally to the infrastructure of our world. Here, researchers highlight some of the new sciences and technologies that are part of a safer, cost effective tomorrow. Specific technologies that are discussed are thermal QNDE of aircraft structural integrity, ultrasonic QNDE for materials characterization, and technology spinoffs from aerospace to the medical sector. In each case, examples are given of how new requirements result in enabling measurement technologies, which in turn change the boundaries of design/practice.

  19. Ferroelectric nanoparticle-embedded sponge structure triboelectric generators.

    PubMed

    Park, Daehoon; Shin, Sung-Ho; Yoon, Ick-Jae; Nah, Junghyo

    2018-05-04

    We report high-performance triboelectric nanogenerators (TENGs) employing ferroelectric nanoparticles (NPs) embedded in a sponge structure. The ferroelectric BaTiO3 NPs inside the sponge structure play an important role in increasing surface charge density by polarized spontaneous dipoles, enabling the packaging of TENGs even with a minimal separation gap. Since the friction surfaces are encapsulated in the packaged device structure, it suffers negligible performance degradation even at a high relative humidity of 80%. The TENGs also demonstrated excellent mechanical durability due to the elasticity and flexibility of the sponge structure. Consequently, the TENGs can reliably harvest energy even under harsh conditions. The approach introduced here is a simple, effective, and reliable way to fabricate compact and packaged TENGs for potential applications in wearable energy-harvesting devices.

  20. Cold Climate Foundation Retrofit Experimental Hygrothermal Performance: Cloquet Residential Research Facility Laboratory Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, Louise F.; Harmon, Anna C.

    2015-04-01

    Thermal and moisture problems in existing basements create a unique challenge because the exterior face of the wall is not easily or inexpensively accessible. This approach addresses thermal and moisture management from the interior face of the wall without disturbing the exterior soil and landscaping, and it has the potential for improving durability, comfort, and indoor air quality. This project was funded jointly by the National Renewable Energy Laboratory (NREL) and Oak Ridge National Laboratory (ORNL). ORNL focused on developing a full basement wall system experimental database to enable others to validate hygrothermal simulation codes. NREL focused on testing the moisture durability of practical basement wall interior insulation retrofit solutions for cold climates. The project has produced a physically credible and reliable long-term hygrothermal performance database for retrofit foundation wall insulation systems in zone 6 and 7 climates that are fully compliant with the performance criteria in the 2009 Minnesota Energy Code. The experimental data were configured into a standard format that can be published online and that is compatible with standard commercially available spreadsheet and database software.

  1. Coupling Damage-Sensing Particles to the Digital Twin Concept

    NASA Technical Reports Server (NTRS)

    Hochhalter, Jacob; Leser, William P.; Newman, John A.; Gupta, Vipul K.; Yamakov, Vesselin; Cornell, Stephen R.; Willard, Scott A.; Heber, Gerd

    2014-01-01

    The research presented herein is a first step toward integrating two emerging structural health management paradigms: digital twin and sensory materials. Digital twin is an emerging life management and certification paradigm whereby models and simulations consist of as-built vehicle state, as-experienced loads and environments, and other vehicle-specific history to enable high-fidelity modeling of individual aerospace vehicles throughout their service lives. The digital twin concept spans many disciplines, and an extensive study of the full domain is beyond the scope of this work. Therefore, as it pertains to the digital twin, this research focused on one major concept: modeling specifically the as-manufactured geometry of a component and its microstructure (to the degree possible). The second aspect of this research was to develop the concept of sensory materials such that they can be employed within the digital twin framework. Sensory materials are shape-memory alloys that undergo an audible phase transformation while experiencing sufficient strain. Upon embedding sensory materials within a structural alloy, this audible transformation helps improve the reliability of crack detection, especially at the early stages of crack growth. By combining these two early-stage technologies, an automated approach to evidence-based inspection and maintenance of aerospace vehicles is sought.

  2. A Novel Telemetric Method for In-Situ Measurement of Intrauterine Pressure (IUP) in Pregnant and Parturient Rats

    NASA Technical Reports Server (NTRS)

    Baer, Lisa A.; LaFramboise, M. N.; Hills, E. M.; Daly, M. E.; Mills, N. A.; Wade, C. E.; Ronca, A. E.; Dalton, Bonnie (Technical Monitor)

    2001-01-01

    During labor and birth, considerable forces exerted on fetuses help instigate certain adaptive postpartum responses (viz., breathing and suckling). To make precise, reliable measures of the forces experienced by rat fetuses during parturition, we developed a novel method for measuring intrauterine pressure (IUP) in late pregnant rats. A small (1.25 x 4 cm) telemetric blood pressure sensor is fitted within a fluid-filled balloon, similar in size to a full-term rat fetus. The balloon is surgically implanted in the uterus on Gestational Day 19 of the rats' 22-day pregnancy. During birth, dams are able to deliver their pups and the balloon. IUP signals are recorded during labor (G22 or 23) and birth. Data derived from a group of implanted rats indicated that pressures on the balloon increased across the period of birth, reaching 18 mmHg during labor, 25 mmHg during pup births and 39 mmHg just prior to delivery of the balloon. These data are within the range reported for conventional IUP measurement techniques. Dams are simultaneously videotaped, enabling us to analyze behavioral expressions of labor contractions and to integrate in-situ and behavioral findings.

  3. A Waveguide-coupled Thermally-isolated Radiometric Source

    NASA Technical Reports Server (NTRS)

    Rostem, Karwan; Chuss, David T.; Lourie, Nathan P.; Voellmer, George M.; Wollack, Edward

    2013-01-01

    The design and validation of a dual polarization source for waveguide-coupled millimeter and sub-millimeter wave cryogenic sensors is presented. The thermal source is a waveguide mounted absorbing conical dielectric taper. The absorber is thermally isolated with a kinematic suspension that allows the guide to be heat sunk to the lowest bath temperature of the cryogenic system. This approach enables the thermal emission from the metallic waveguide walls to be subdominant to that from the source. The use of low thermal conductivity Kevlar threads for the kinematic mount effectively decouples the absorber from the sensor cold stage. Hence, the absorber can be heated to significantly higher temperatures than the sensor with negligible conductive loading. The kinematic suspension provides high mechanical repeatability and reliability with thermal cycling. A 33-50 GHz blackbody source demonstrates an emissivity of 0.999 over the full waveguide band where the dominant deviation from unity arises from the waveguide ohmic loss. The observed thermal time constant of the source is 40 s when the absorber temperature is 15 K. The specific heat of the lossy dielectric MF-117 is well approximated by Cv(T) = 0.12 T^2.06 mJ/g/K between 3.5 K and 15 K.
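    The quoted power-law fit for the specific heat can be evaluated directly over its stated range; a minimal sketch (the function name and the explicit range check are ours):

```python
def specific_heat_mf117(t_kelvin):
    """Specific heat of the lossy dielectric MF-117 in mJ/g/K, using the
    power-law fit Cv(T) = 0.12 * T**2.06 reported as valid between
    3.5 K and 15 K."""
    if not 3.5 <= t_kelvin <= 15.0:
        raise ValueError("fit is only reported for 3.5 K to 15 K")
    return 0.12 * t_kelvin ** 2.06

# Cv at the 15 K absorber operating point mentioned in the abstract
cv_15k = specific_heat_mf117(15.0)
```

    Guarding the fit's validity range is a deliberate choice: a power-law calibration extrapolated outside its measured interval can be badly wrong.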

  4. Making Temporal Search More Central in Spatial Data Infrastructures

    NASA Astrophysics Data System (ADS)

    Corti, P.; Lewis, B.

    2017-10-01

    A temporally enabled Spatial Data Infrastructure (SDI) is a framework of geospatial data, metadata, users, and tools intended to provide an efficient and flexible way to use spatial information which includes the historical dimension. One of the key software components of an SDI is the catalogue service which is needed to discover, query, and manage the metadata. A search engine is a software system capable of supporting fast and reliable search, which may use any means necessary to get users to the resources they need quickly and efficiently. These techniques may include features such as full text search, natural language processing, weighted results, temporal search based on enrichment, visualization of patterns in distributions of results in time and space using temporal and spatial faceting, and many others. In this paper we will focus on the temporal aspects of search which include temporal enrichment using a time miner - a software engine able to search for date components within a larger block of text, the storage of time ranges in the search engine, handling historical dates, and the use of temporal histograms in the user interface to display the temporal distribution of search results.
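    The "time miner" idea above (searching for date components within a larger block of text) can be sketched, in a much-reduced form, as a regular-expression scan for four-digit years; the real enrichment engine described here is far more sophisticated than this illustration.

```python
import re

# Toy time miner: find four-digit years (1000-2099) in free text and
# return the implied [start, end] range for temporal faceting.
YEAR_RE = re.compile(r"\b(1[0-9]{3}|20[0-9]{2})\b")

def mine_year_range(text):
    years = [int(y) for y in YEAR_RE.findall(text)]
    if not years:
        return None  # no temporal enrichment possible for this record
    return (min(years), max(years))

span = mine_year_range("Boundary surveys of 1847, revised in 1903.")  # (1847, 1903)
```

    Storing such ranges alongside each metadata record is what lets the search engine build temporal facets and histograms over the result set.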

  5. Multivendor nuclear medicine PACS provide fully digital clinical operation at the University of Miami/Jackson Memorial Hospital

    NASA Astrophysics Data System (ADS)

    Georgiou, Mike F.; Sfakianakis, George N.; Johnson, Gary; Douligeris, Christos; Scandar, Silvia; Eisler, E.; Binkley, B.

    1994-05-01

    In an effort to improve patient care while considering cost-effectiveness, we developed a Picture Archiving and Communication System (PACS), which combines imaging cameras, computers and other peripheral equipment from multiple nuclear medicine vendors. The PACS provides fully digital clinical operation, which includes acquisition and automatic organization of patient data, distribution of the data to all networked units inside the department and other remote locations, digital analysis and quantitation of images, digital diagnostic reading of image studies, and permanent data archival with the ability for fast retrieval. The PACS enabled us to significantly reduce the amount of film used, and we are currently implementing a film-less laboratory. Hard copies are produced on paper or transparent sheets for non-digitally connected parts of the hospital. The PACS provides fully digital operation that is faster, more reliable, better organized and managed, and overall more efficient than a conventional film-based operation. In this paper, the integration of the various PACS components from multiple vendors is reviewed, and the impact of PACS, with its advantages and limitations, on our clinical operation is analyzed.

  6. 7Li MRI of Li batteries reveals location of microstructural lithium.

    PubMed

    Chandrashekar, S; Trease, Nicole M; Chang, Hee Jung; Du, Lin-Shu; Grey, Clare P; Jerschow, Alexej

    2012-02-12

    There is an ever-increasing need for advanced batteries for portable electronics, to power electric vehicles and to facilitate the distribution and storage of energy derived from renewable energy sources. The increasing demands on batteries and other electrochemical devices have spurred research into the development of new electrode materials that could lead to better performance and lower cost (increased capacity, stability and cycle life, and safety). These developments have, in turn, given rise to a vigorous search for the development of robust and reliable diagnostic tools to monitor and analyse battery performance, where possible, in situ. Yet, a proven, convenient and non-invasive technology, with an ability to image in three dimensions the chemical changes that occur inside a full battery as it cycles, has yet to emerge. Here we demonstrate techniques based on magnetic resonance imaging, which enable a completely non-invasive visualization and characterization of the changes that occur on battery electrodes and in the electrolyte. The current application focuses on lithium-metal batteries and the observation of electrode microstructure build-up as a result of charging. The methods developed here will be highly valuable in the quest for enhanced battery performance and in the evaluation of other electrochemical devices.

  7. 7Li MRI of Li batteries reveals location of microstructural lithium

    NASA Astrophysics Data System (ADS)

    Chandrashekar, S.; Trease, Nicole M.; Chang, Hee Jung; Du, Lin-Shu; Grey, Clare P.; Jerschow, Alexej

    2012-04-01

    There is an ever-increasing need for advanced batteries for portable electronics, to power electric vehicles and to facilitate the distribution and storage of energy derived from renewable energy sources. The increasing demands on batteries and other electrochemical devices have spurred research into the development of new electrode materials that could lead to better performance and lower cost (increased capacity, stability and cycle life, and safety). These developments have, in turn, given rise to a vigorous search for the development of robust and reliable diagnostic tools to monitor and analyse battery performance, where possible, in situ. Yet, a proven, convenient and non-invasive technology, with an ability to image in three dimensions the chemical changes that occur inside a full battery as it cycles, has yet to emerge. Here we demonstrate techniques based on magnetic resonance imaging, which enable a completely non-invasive visualization and characterization of the changes that occur on battery electrodes and in the electrolyte. The current application focuses on lithium-metal batteries and the observation of electrode microstructure build-up as a result of charging. The methods developed here will be highly valuable in the quest for enhanced battery performance and in the evaluation of other electrochemical devices.

  8. Fiber-optic temperature profiling for thermal protection system heat shields

    NASA Astrophysics Data System (ADS)

    Black, Richard J.; Costa, Joannes M.; Zarnescu, Livia; Hackney, Drew A.; Moslehi, Behzad; Peters, Kara J.

    2016-11-01

    To achieve better designs for spacecraft heat shields for missions requiring atmospheric aero-capture or entry/reentry, reliable thermal protection system (TPS) sensors are needed. Such sensors will provide both risk reduction and heat-shield mass minimization, which will facilitate more missions and enable increased payloads and returns. This paper discusses TPS thermal measurements provided by a temperature monitoring system involving lightweight, electromagnetic interference-immune, high-temperature resistant fiber Bragg grating (FBG) sensors with a thermal mass near that of TPS materials together with fast FBG sensor interrogation. Such fiber-optic sensing technology is highly sensitive and accurate, as well as suitable for high-volume production. Multiple sensing FBGs can be fabricated as arrays on a single fiber for simplified design and reduced cost. Experimental results are provided to demonstrate the temperature monitoring system using multisensor FBG arrays embedded in a small-size super-light ablator (SLA) coupon which was thermally loaded to temperatures in the vicinity of the SLA charring temperature. In addition, a high-temperature FBG array was fabricated and tested for 1000°C operation, and the temperature dependence considered over the full range (cryogenic to high temperature) for which silica fiber FBGs have been subjected.
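    Converting an FBG's measured Bragg-wavelength shift into temperature is, to first order, a linear calibration. A hedged sketch follows; the ~10 pm/°C sensitivity is a typical textbook value for silica FBGs near 1550 nm, not a figure from this work, and each sensor in practice needs its own calibration.

```python
# First-order FBG thermometry: temperature change is proportional to
# the shift of the Bragg wavelength from its reference value.
def fbg_temperature(wl_nm, wl_ref_nm, t_ref_c=25.0, sens_pm_per_c=10.0):
    """Return temperature in deg C from a measured Bragg wavelength.

    wl_nm         measured Bragg wavelength (nm)
    wl_ref_nm     Bragg wavelength at the reference temperature (nm)
    sens_pm_per_c linear sensitivity (pm of shift per deg C), assumed
    """
    shift_pm = (wl_nm - wl_ref_nm) * 1000.0  # nm -> pm
    return t_ref_c + shift_pm / sens_pm_per_c

t = fbg_temperature(1550.250, 1550.000)  # 250 pm shift -> 50.0 deg C
```

    Over wide ranges, such as the cryogenic-to-1000°C span discussed above, the sensitivity itself varies with temperature, so a polynomial or lookup-table calibration replaces this single linear coefficient.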

  9. AN-CASE NET-CENTRIC modeling and simulation

    NASA Astrophysics Data System (ADS)

    Baskinger, Patricia J.; Chruscicki, Mary Carol; Turck, Kurt

    2009-05-01

    The objective of mission training exercises is to immerse the trainees in an environment that enables them to train like they would fight. The integration of modeling and simulation environments that can seamlessly leverage Live systems and Virtual or Constructive models (LVC) as they become available offers a flexible and cost-effective solution for extending the "war-gaming" environment to a realistic mission experience while evolving the development of the net-centric enterprise. From concept to full production, the impact of new capabilities on the infrastructure and concept of operations can be assessed in the context of the enterprise, while also exposing them to the warfighter. Training is extended to tomorrow's tools, processes, and Tactics, Techniques and Procedures (TTPs). This paper addresses the challenges of a net-centric modeling and simulation environment that is capable of representing a net-centric enterprise. An overview of the Air Force Research Laboratory's (AFRL) Airborne Networking Component Architecture Simulation Environment (AN-CASE) is provided, as well as a discussion of how it is being used to assess technologies for the purpose of experimenting with new infrastructure mechanisms that enhance the scalability and reliability of the distributed mission operations environment.

  10. Utilization of native oxygen in Eu(RE)-doped GaN for enabling device compatibility in optoelectronic applications

    DOE PAGES

    Mitchell, Brandon; Timmerman, D.; Poplawsky, Jonathan D.; ...

    2016-01-04

    The detrimental influence of oxygen on the performance and reliability of V/III nitride based devices is well known. However, the influence of oxygen on the nature of the incorporation of other co-dopants, such as rare-earth ions, has been largely overlooked in GaN. Here, we report the first comprehensive study of the critical role that oxygen has on Eu in GaN, as well as atomic-scale observation of diffusion and local concentration of both atoms in the crystal lattice. We find that oxygen plays an integral role in the location, stability, and local defect structure around the Eu ions that were doped into the GaN host. Although the availability of oxygen is essential for these properties, it renders the material incompatible with GaN-based devices. However, the utilization of the normally occurring oxygen in GaN is promoted through structural manipulation, reducing its concentration by 2 orders of magnitude, while maintaining both the material quality and the favorable optical properties of the Eu ions. Furthermore, these findings open the way for full integration of RE dopants for optoelectronic functionalities in the existing GaN platform.

  11. MATLAB-implemented estimation procedure for model-based assessment of hepatic insulin degradation from standard intravenous glucose tolerance test data.

    PubMed

    Di Nardo, Francesco; Mengoni, Michele; Morettini, Micaela

    2013-05-01

    The present study provides a novel MATLAB-based parameter estimation procedure for individual assessment of the hepatic insulin degradation (HID) process from standard frequently-sampled intravenous glucose tolerance test (FSIGTT) data. Direct access to the source code, offered by MATLAB, enabled us to design an optimization procedure based on the alternating use of Gauss-Newton's and Levenberg-Marquardt's algorithms, which assures the full convergence of the process and the containment of computational time. Reliability was tested by direct comparison with the application, in eighteen non-diabetic subjects, of the well-known kinetic analysis software package SAAM II, and by application to different data. Agreement between MATLAB and SAAM II was warranted by intraclass correlation coefficients ≥0.73; no significant differences between corresponding mean parameter estimates and predictions of HID rate; and consistent residual analysis. Moreover, the MATLAB optimization procedure resulted in a significant 51% reduction of CV% for the parameter worst estimated by SAAM II and in maintaining all model-parameter CV% <20%. In conclusion, our MATLAB-based procedure is suggested as a suitable tool for the individual assessment of the HID process. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
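    The alternating Gauss-Newton/Levenberg-Marquardt idea above can be sketched as a damped least-squares loop in which an accepted step relaxes the damping (Gauss-Newton behaviour) and a rejected step increases it (Levenberg-Marquardt behaviour). The Python sketch below is illustrative only; the function name, damping schedule, and numerical Jacobian are assumptions, not the paper's MATLAB code:

    ```python
    import numpy as np

    def fit_gn_lm(model, t, y, p0, n_iter=100, lam=1e-3):
        """Damped least-squares fit of model(t, p) to data y.
        Undamped steps behave like Gauss-Newton; heavily damped steps
        like Levenberg-Marquardt. Illustrative sketch only."""
        p = np.asarray(p0, dtype=float)
        cost = np.sum((y - model(t, p)) ** 2)
        eps = 1e-7
        for _ in range(n_iter):
            r = y - model(t, p)
            # forward-difference Jacobian of the model w.r.t. the parameters
            J = np.empty((t.size, p.size))
            for j in range(p.size):
                dp = np.zeros_like(p)
                dp[j] = eps
                J[:, j] = (model(t, p + dp) - model(t, p)) / eps
            step = np.linalg.solve(J.T @ J + lam * np.eye(p.size), J.T @ r)
            cost_try = np.sum((y - model(t, p + step)) ** 2)
            if cost_try < cost:   # good step: accept it and relax the damping
                p, cost, lam = p + step, cost_try, lam * 0.3
            else:                 # bad step: keep p, increase the damping
                lam *= 10.0
        return p
    ```

    For instance, fitting a = 2, b = 0.5 in y = a·exp(-b·t) from the start guess (1, 1) recovers both parameters even though a pure Gauss-Newton first step would overshoot.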

  12. Multifunctional Single-Phase Photocatalysts: Extended Near Infrared Photoactivity and Reliable Magnetic Recyclability

    NASA Astrophysics Data System (ADS)

    Li, Xiaoning; Zhu, Zhu; Li, Feng; Huang, Yan; Hu, Xiang; Huang, Haoliang; Peng, Ranran; Zhai, Xiaofang; Fu, Zhengping; Lu, Yalin

    2015-10-01

    A practical photocatalyst should integrate various functions, including extended solar conversion, feasible and economical recyclability, and the potential for operation above room temperature, in order to fulfill today's spreading application needs. In this report, a multifunctional single-phase photocatalyst that possesses high photoactivity extended into the near-infrared region, easy magnetic recyclability, and high-temperature stability was developed by doping Co into a new layer-structured Bi7Fe3Ti3O21 material. Light absorption and photocatalytic activity of the resulting Bi7Fe3-xCoxTi3O21 photocatalyst were extended to wavelengths as long as 800 nm. Its strong ferromagnetism above room temperature makes the nanopowders fully recyclable in viscous solutions, demonstrated experimentally with a simple magnet bar. Furthermore, the photoactivity and magnetic recyclability were heavily tested under high-temperature and high-viscosity conditions intended to simulate actual industrial environments. This work points toward the full availability of a new multifunctional photocatalyst that integrates much-enhanced ferromagnetic, ferroelectric, and optoelectronic properties, most importantly, into a single-phase structure.

  13. Hyperspectral image analysis for rapid and accurate discrimination of bacterial infections: A benchmark study.

    PubMed

    Arrigoni, Simone; Turra, Giovanni; Signoroni, Alberto

    2017-09-01

    With the rapid diffusion of Full Laboratory Automation systems, Clinical Microbiology is currently experiencing a new digital revolution. The ability to capture and process large amounts of visual data from microbiological specimen processing enables the definition of completely new objectives. These include the direct identification of pathogens growing on culturing plates, with expected improvements in rapid definition of the right treatment for patients affected by bacterial infections. In this framework, the synergies between light spectroscopy and image analysis, offered by hyperspectral imaging, are of prominent interest. This leads us to assess the feasibility of a reliable and rapid discrimination of pathogens through the classification of their spectral signatures extracted from hyperspectral image acquisitions of bacteria colonies growing on blood agar plates. We designed and implemented the whole data acquisition and processing pipeline and performed a comprehensive comparison among 40 combinations of different data preprocessing and classification techniques. High discrimination performance was achieved, thanks also to improved colony segmentation and spectral signature extraction. Experimental results reveal the high accuracy and suitability of the proposed approach, driving the selection of the most suitable and scalable classification pipelines and stimulating clinical validations. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. The BGAN extension programme

    NASA Astrophysics Data System (ADS)

    Rivera, Juan J.; Trachtman, Eyal; Richharia, Madhavendra

    2005-11-01

    Mobile satellite telecommunications systems have undergone an enormous evolution in recent decades, with the interest in having advanced telecommunications services available on demand, anywhere and at any time, leading to incredible advances. The demand for broadband data is therefore rapidly gathering pace, but current solutions are finding it increasingly difficult to combine large bandwidth with ubiquitous coverage, reliability and portability. The BGAN (Broadband Global Area Network) system, designed to operate with the Inmarsat-4 satellites, provides breakthrough services that meet all of these requirements. It will enable broadband connection on the move, delivering all the key tools of the modern office. Recognising the great impact that Inmarsat's BGAN system will have on the European satellite communications industry, and the benefits that it will bring to a wide range of European industries, in 2003 ESA initiated the "BGAN Extension" project. Its primary goals are to provide the full range of BGAN services to truly mobile platforms, operating in aeronautical, vehicular and maritime environments, and to introduce a multicast service capability. The project is supported by the ARTES Programme which establishes a collaboration agreement between ESA, Inmarsat and a group of key industrial and academic institutions which includes EMS, Logica, Nera and the University of Surrey (UK).

  15. Redox Indicator Mice Stably Expressing Genetically Encoded Neuronal roGFP: Versatile Tools to Decipher Subcellular Redox Dynamics in Neuropathophysiology.

    PubMed

    Wagener, Kerstin C; Kolbrink, Benedikt; Dietrich, Katharina; Kizina, Kathrin M; Terwitte, Lukas S; Kempkes, Belinda; Bao, Guobin; Müller, Michael

    2016-07-01

    Reactive oxygen species (ROS) and downstream redox alterations not only mediate physiological signaling but also neuropathology. For long, ROS/redox imaging was hampered by a lack of reliable probes. Genetically encoded redox sensors overcame this gap and revolutionized (sub)cellular redox imaging. Yet, the successful delivery of sensor-coding DNA, which demands transfection/transduction of cultured preparations or stereotaxic microinjections of each subject, remains challenging. By generating transgenic mice, we aimed to overcome limiting cultured preparations, circumvent surgical interventions, and to extend effectively redox imaging to complex and adult preparations. Our redox indicator mice widely express Thy1-driven roGFP1 (reduction-oxidation-sensitive green fluorescent protein 1) in neuronal cytosol or mitochondria. Negative phenotypic effects of roGFP1 were excluded and its proper targeting and functionality confirmed. Redox mapping by ratiometric wide-field imaging reveals most oxidizing conditions in CA3 neurons. Furthermore, mitochondria are more oxidized than cytosol. Cytosolic and mitochondrial roGFP1s reliably report cell endogenous redox dynamics upon metabolic challenge or stimulation. Fluorescence lifetime imaging yields stable, but marginal, response ranges. We therefore developed automated excitation ratiometric 2-photon imaging. It offers superior sensitivity, spatial resolution, and response dynamics. Redox indicator mice enable quantitative analyses of subcellular redox dynamics in a multitude of preparations and at all postnatal stages. This will uncover cell- and compartment-specific cerebral redox signals and their defined alterations during development, maturation, and aging. Cross-breeding with other disease models will reveal molecular details on compartmental redox homeostasis in neuropathology. 
Combined with ratiometric 2-photon imaging, this will foster our mechanistic understanding of cellular redox signals in their full complexity. Antioxid. Redox Signal. 25, 41-58.

  16. An Improved Approach for RSSI-Based only Calibration-Free Real-Time Indoor Localization on IEEE 802.11 and 802.15.4 Wireless Networks.

    PubMed

    Passafiume, Marco; Maddio, Stefano; Cidronali, Alessandro

    2017-03-29

    Assuming that a reliable and responsive spatial contextualization service is a must-have in IEEE 802.11 and 802.15.4 wireless networks, a suitable approach consists of implementing localization capabilities as an additional application layer on the communication protocol stack. In applicative scenarios where satellite-based positioning is denied, such as indoor environments, and excluding data-packet arrival-time measurements due to lack of time resolution, received signal strength indicator (RSSI) measurements, obtained according to the IEEE 802.11 and 802.15.4 data access technologies, are the only data sources suitable for indoor geo-referencing using COTS devices. In the existing literature, many RSSI-based localization systems have been introduced and experimentally validated; nevertheless, they require periodic calibration and significant information fusion from different sensors, which dramatically decreases overall system reliability and effective availability. This motivates the work presented in this paper, which introduces an approach to RSSI-based, calibration-free, real-time indoor localization. While switched-beam array-based hardware (compliant with IEEE 802.15.4 router functionality) has already been presented by the authors, the focus of this paper is the creation of an algorithmic layer, for use with the pre-existing hardware, capable of enabling full localization and data contextualization over a standard 802.15.4 wireless sensor network using only RSSI information, without the need for a lengthy offline calibration phase. System validation reports the localization results in a typical indoor site, where the system has shown high accuracy, achieving a sub-metrical overall mean error and almost 100% site coverage within 1 m localization error.

  17. Performance demonstration of hydrogen advanced loop heat pipe for 20-30K cryocooling of far infrared sensors

    NASA Astrophysics Data System (ADS)

    Hoang, Triem T.; O'Connell, Tamara A.; Ku, Jentung; Butler, C. D.; Swanson, Theodore D.

    2005-08-01

    The James Webb Space Telescope (JWST) program has identified the need for cryogenic cooling transport devices that (i) provide robust/reliable thermal management for Infrared (IR) sensors/detectors in the temperature range of 20-30K, (ii) minimize vibration effects of mechanical cryocoolers on the instruments, (iii) reduce spatial temperature gradients in cryogenic components, and (iv) afford long continuous service life of the telescope. Passive two-phase capillary cooling technologies such as heat pipes, Loop Heat Pipes (LHPs), and Capillary Pumped Loops (CPLs) have proven themselves capable of performing the necessary thermal control functions for room temperature applications. They have no mechanical moving parts to wear out or to introduce unwanted vibration to the instruments and, hence, are reliable and maintenance-free. However, utilizing these capillary devices for cryogenic cooling still remains a challenge because of difficulties involving system start-up and operation in a warm environment. An advanced concept of LHP using hydrogen as the working fluid was recently developed to demonstrate the cryocooling transport capabilities in the temperature range of 20-30K. A full-size demonstration test loop - appropriately called the H2-ALHP - was constructed and performance tested extensively in a thermal vacuum chamber. It was designed specifically to manage "heat parasitics" from a warm surrounding, enabling it to start up from an initially supercritical state and operate without requiring a rigid heat shield. Like room temperature LHPs, the H2-ALHP transport lines were made of small-diameter stainless steel tubing that is flexible enough to isolate the cryocooler-induced vibration from the IR instruments. In addition, focus of the H2-ALHP research and development effort was also placed on system weight saving for space-based applications.

  18. Evaluating unsupervised methods to size and classify suspended particles using digital in-line holography

    USGS Publications Warehouse

    Davies, Emlyn J.; Buscombe, Daniel D.; Graham, George W.; Nimmo-Smith, W. Alex M.

    2015-01-01

    Substantial information can be gained from digital in-line holography of marine particles, eliminating depth-of-field and focusing errors associated with standard lens-based imaging methods. However, for the technique to reach its full potential in oceanographic research, fully unsupervised (automated) methods are required for focusing, segmentation, sizing and classification of particles. These computational challenges are the subject of this paper, in which we draw upon data collected using a variety of holographic systems developed at Plymouth University, UK, from a significant range of particle types, sizes and shapes. A new method for noise reduction in reconstructed planes is found to be successful in aiding particle segmentation and sizing. The performance of an automated routine for deriving particle characteristics (and subsequent size distributions) is evaluated against equivalent size metrics obtained by a trained operative measuring grain axes on screen. The unsupervised method is found to be reliable, despite some errors resulting from over-segmentation of particles. A simple unsupervised particle classification system is developed, and is capable of successfully differentiating sand grains, bubbles and diatoms from within the surf-zone. Avoiding miscounting bubbles and biological particles as sand grains enables more accurate estimates of sand concentrations, and is especially important in deployments of particle monitoring instrumentation in aerated water. Perhaps the greatest potential for further development in the computational aspects of particle holography is in the area of unsupervised particle classification. The simple method proposed here provides a foundation upon which further development could lead to reliable identification of more complex particle populations, such as those containing phytoplankton, zooplankton, flocculated cohesive sediments and oil droplets.
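    The sizing step described above, deriving particle characteristics from a segmented reconstruction, reduces to an area-equivalent diameter per labelled region. A minimal sketch under the assumption of an integer-labelled segmentation mask (names and the diameter definition are illustrative, not the authors' routine):

    ```python
    import numpy as np

    def equivalent_diameters(labels, pixel_size_um):
        """Area-equivalent circular diameter (in micrometres) for each
        labelled particle in a segmented image; label 0 is background."""
        diam = []
        for lab in np.unique(labels):
            if lab == 0:
                continue
            area_px = np.count_nonzero(labels == lab)
            area_um2 = area_px * pixel_size_um ** 2
            # diameter of the circle with the same area as the region
            diam.append(2.0 * np.sqrt(area_um2 / np.pi))
        return np.array(diam)
    ```

    A size distribution then follows by histogramming the returned diameters over all reconstructed planes.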

  19. The "Leakage Current Sentinel": A novel plug-in socket device for online biomedical equipment electrical safety surveillance

    NASA Astrophysics Data System (ADS)

    Cappa, Paolo; Marinozzi, Franco; Sciuto, Salvatore Andrea

    2000-07-01

    The Leakage Current Sentinel (LCS) has been designed and implemented for the detection of hazardous situations caused by dangerous earth leakage current values in intensive care units and operating theaters. The device, designed and manufactured in full compliance with high-risk environment requirements, is able to monitor the earth leakage current online and detect ground wire faults. Operation utilizes a microammeter with an overall sensitivity of 2.5×10^4 V/A. In order to assure the reliability of the device in providing alarm signals, the simultaneous presence of absorbed power current is monitored by means of another ammeter with decreased sensitivity (3.0 V/A). The measured root mean square current values are compared with reference values in order to send signals to NAND and OR complementary metal-oxide-semiconductor gates to enable audible and visible alarms according to the possible hazardous cases examined in the article. The final LCS packaging was shaped as a wall socket adapter for common electromedical device power cord plugs, with particular attention to minimizing its dimensions and to providing analog voltage outputs for both measured leakage and power currents, in order to allow automatic data acquisition and computerized hazardous situation management. Finally, a personal computer based automatic measuring system has been configured to simultaneously monitor several LCSs installed in the same intensive care unit room and, as a consequence, to distinguish different hazardous scenarios and provide an adequate alert to the clinical personnel, whose final decision is still required. The test results confirm the effectiveness and reliability of the LCS in giving an alert in case of anomalous leakage current values, either in case of a ground fault or in case of a dangerous leakage current.
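    The decision logic described, comparing measured RMS leakage and power currents against reference values, can be sketched in software. All thresholds below, and the open-ground heuristic (near-zero leakage while the equipment draws power), are illustrative assumptions of this sketch, not the LCS's actual reference values or gate logic:

    ```python
    def lcs_status(leakage_rms, power_rms,
                   leak_limit=500e-6,   # assumed leakage alarm threshold [A]
                   power_floor=50e-3,   # assumed current showing the device is in use [A]
                   leak_floor=1e-6):    # leakage so low it suggests an open ground wire
        """Evaluate one pair of RMS readings and return the alarm flags."""
        powered = power_rms >= power_floor
        excessive_leak = leakage_rms >= leak_limit
        open_ground = powered and leakage_rms < leak_floor
        return {"powered": powered,
                "excessive_leakage": excessive_leak,
                "open_ground_suspected": open_ground,
                "alarm": excessive_leak or open_ground}
    ```

    A monitoring computer polling several such sensors could then distinguish the two hazardous scenarios (excessive leakage vs. suspected ground fault) before alerting clinical personnel.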

  20. A comparison of quantitative methods for clinical imaging with hyperpolarized (13)C-pyruvate.

    PubMed

    Daniels, Charlie J; McLean, Mary A; Schulte, Rolf F; Robb, Fraser J; Gill, Andrew B; McGlashan, Nicholas; Graves, Martin J; Schwaiger, Markus; Lomas, David J; Brindle, Kevin M; Gallagher, Ferdia A

    2016-04-01

    Dissolution dynamic nuclear polarization (DNP) enables the metabolism of hyperpolarized (13)C-labelled molecules, such as the conversion of [1-(13)C]pyruvate to [1-(13)C]lactate, to be dynamically and non-invasively imaged in tissue. Imaging of this exchange reaction in animal models has been shown to detect early treatment response and correlate with tumour grade. The first human DNP study has recently been completed, and, for widespread clinical translation, simple and reliable methods are necessary to accurately probe the reaction in patients. However, there is currently no consensus on the most appropriate method to quantify this exchange reaction. In this study, an in vitro system was used to compare several kinetic models, as well as simple model-free methods. Experiments were performed using a clinical hyperpolarizer, a human 3 T MR system, and spectroscopic imaging sequences. The quantitative methods were compared in vivo by using subcutaneous breast tumours in rats to examine the effect of pyruvate inflow. The two-way kinetic model was the most accurate method for characterizing the exchange reaction in vitro, and the incorporation of a Heaviside step inflow profile was best able to describe the in vivo data. The lactate time-to-peak and the lactate-to-pyruvate area under the curve ratio were simple model-free approaches that accurately represented the full reaction, with the time-to-peak method performing indistinguishably from the best kinetic model. Finally, extracting data from a single pixel was a robust and reliable surrogate of the whole region of interest. This work has identified appropriate quantitative methods for future work in the analysis of human hyperpolarized (13)C data. © 2016 The Authors. NMR in Biomedicine published by John Wiley & Sons Ltd.
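    The model-free metrics compared in the study, the lactate time-to-peak and the lactate-to-pyruvate area-under-the-curve ratio, are straightforward to compute from sampled signal curves. A minimal sketch (function names and the trapezoidal integration are assumptions of this illustration):

    ```python
    import numpy as np

    def trapz_area(t, y):
        """Trapezoidal area under a curve sampled at times t."""
        t, y = np.asarray(t, float), np.asarray(y, float)
        return 0.5 * np.sum((y[1:] + y[:-1]) * np.diff(t))

    def lactate_pyruvate_auc_ratio(t, lactate, pyruvate):
        """Ratio of the areas under the lactate and pyruvate signal curves."""
        return trapz_area(t, lactate) / trapz_area(t, pyruvate)

    def time_to_peak(t, signal):
        """Time at which the sampled signal reaches its maximum."""
        return t[int(np.argmax(signal))]
    ```

    Both metrics need only the two dynamic signal curves, which is what makes them attractive as simple surrogates for full kinetic modelling.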

  1. Assuring measurement quality in person-centred healthcare

    NASA Astrophysics Data System (ADS)

    Pendrill, L. R.

    2018-03-01

    Is it realistic to aspire to the same kind of quality-assurance of measurement in person-centred care, currently being implemented in healthcare globally, as is established in the physical sciences and engineering? Ensuring metrological comparability (‘traceability’) and reliably declaring measurement uncertainty when assessing patient ability or increased social capital are however challenging for subjective measurements often characterised by large dispersion. Drawing simple analogies between ‘instruments’ in the social sciences—questionnaires, ability tests, etc—and engineering instruments such as thermometers does not go far enough. A possible way forward, apparently equally applicable to both physical and social measurement, seems to be to model inferences in terms of performance metrics of a measurement system. Person-centred care needs person-centred measurement and a full picture of the measurement process when man acts as a measurement instrument is given in the present paper. This complements previous work by presenting the process, step by step, from the observed indication (e.g. probability of success, P success, of achieving a task), through restitution with Rasch measurement theory, to the measurand (e.g. task difficulty). Rasch invariant measure theory can yield quantities—‘latent’ (or ‘explanatory’) variables such as task challenge or person ability—with characteristics akin to those of physical quantities. Metrological references for comparability via traceability and reliable estimates of uncertainty and decision risks are then in reach even for perceptive measurements (and other qualitative properties). As a case study, the person-centred measurement of cognitive ability is examined, as part of the EU project EMPIR 15HLT04 NeuroMet, for Alzheimer’s, where better analysis of correlations with brain atrophy is enabled thanks to the Rasch metrological approach.
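    The step described above, from an observed indication (probability of success) through restitution to the measurand (task difficulty), can be written down directly for the dichotomous Rasch model, in which success probability depends only on the difference between person ability and task difficulty on a shared logit scale. A minimal sketch (function names are illustrative):

    ```python
    import math

    def rasch_p_success(ability, difficulty):
        """Rasch model: probability that a person of the given ability
        succeeds at a task of the given difficulty (logit scale)."""
        return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

    def restitute_difficulty(ability, p_success):
        """Restitution: recover task difficulty from an observed
        success probability for a person of known ability."""
        return ability - math.log(p_success / (1.0 - p_success))
    ```

    The invariance property, that the difference (ability - difficulty) alone fixes the probability, is what gives these latent variables their physical-quantity-like character.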

  2. An Improved Approach for RSSI-Based only Calibration-Free Real-Time Indoor Localization on IEEE 802.11 and 802.15.4 Wireless Networks

    PubMed Central

    Passafiume, Marco; Maddio, Stefano; Cidronali, Alessandro

    2017-01-01

    Assuming that a reliable and responsive spatial contextualization service is a must-have in IEEE 802.11 and 802.15.4 wireless networks, a suitable approach consists of implementing localization capabilities as an additional application layer on the communication protocol stack. In applicative scenarios where satellite-based positioning is denied, such as indoor environments, and excluding data-packet arrival-time measurements due to lack of time resolution, received signal strength indicator (RSSI) measurements, obtained according to the IEEE 802.11 and 802.15.4 data access technologies, are the only data sources suitable for indoor geo-referencing using COTS devices. In the existing literature, many RSSI-based localization systems have been introduced and experimentally validated; nevertheless, they require periodic calibration and significant information fusion from different sensors, which dramatically decreases overall system reliability and effective availability. This motivates the work presented in this paper, which introduces an approach to RSSI-based, calibration-free, real-time indoor localization. While switched-beam array-based hardware (compliant with IEEE 802.15.4 router functionality) has already been presented by the authors, the focus of this paper is the creation of an algorithmic layer, for use with the pre-existing hardware, capable of enabling full localization and data contextualization over a standard 802.15.4 wireless sensor network using only RSSI information, without the need for a lengthy offline calibration phase. System validation reports the localization results in a typical indoor site, where the system has shown high accuracy, achieving a sub-metrical overall mean error and almost 100% site coverage within 1 m localization error. PMID:28353676

  3. Development and validation of an abbreviated version of the Trust in Oncologist Scale-the Trust in Oncologist Scale-short form (TiOS-SF).

    PubMed

    Hillen, Marij A; Postma, Rosa-May; Verdam, Mathilde G E; Smets, Ellen M A

    2017-03-01

    The original 18-item, four-dimensional Trust in Oncologist Scale assesses cancer patients' trust in their oncologist. The current aim was to develop and validate a short-form version of the scale to enable more efficient assessment of cancer patients' trust. Existing validation data of the full-length Trust in Oncologist Scale were used to create a short form of the Trust in Oncologist Scale. The resulting short form was validated in a new sample of cancer patients (n = 92). Socio-demographics, medical characteristics, trust in the oncologist, satisfaction with communication, trust in healthcare, willingness to recommend the oncologist to others and to contact the oncologist in case of questions were assessed. Internal consistency, reliability, convergent and structural validity were tested. The five-item Trust in Oncologist Scale Short Form was created by selecting the statistically best performing item from each dimension of the original scale, to ensure content validity. Mean trust in the oncologist was high in the validation sample (response rate 86%, M = 4.30, SD = 0.98). Exploratory factor analyses supported one-dimensionality of the short form. Internal consistency was high, and temporal stability was moderate. Initial convergent validity was suggested by moderate correlations between trust scores and associated constructs. The Trust in Oncologist Scale Short Form appears to efficiently, reliably and validly measure cancer patients' trust in their oncologist. It may be used in research and as a quality indicator in clinical practice. More thorough validation of the scale is recommended to confirm this initial evidence of its validity.

  4. Redox Indicator Mice Stably Expressing Genetically Encoded Neuronal roGFP: Versatile Tools to Decipher Subcellular Redox Dynamics in Neuropathophysiology

    PubMed Central

    Wagener, Kerstin C.; Kolbrink, Benedikt; Dietrich, Katharina; Kizina, Kathrin M.; Terwitte, Lukas S.; Kempkes, Belinda; Bao, Guobin

    2016-01-01

    Abstract Aims: Reactive oxygen species (ROS) and downstream redox alterations not only mediate physiological signaling but also neuropathology. For long, ROS/redox imaging was hampered by a lack of reliable probes. Genetically encoded redox sensors overcame this gap and revolutionized (sub)cellular redox imaging. Yet, the successful delivery of sensor-coding DNA, which demands transfection/transduction of cultured preparations or stereotaxic microinjections of each subject, remains challenging. By generating transgenic mice, we aimed to overcome limiting cultured preparations, circumvent surgical interventions, and to extend effectively redox imaging to complex and adult preparations. Results: Our redox indicator mice widely express Thy1-driven roGFP1 (reduction–oxidation-sensitive green fluorescent protein 1) in neuronal cytosol or mitochondria. Negative phenotypic effects of roGFP1 were excluded and its proper targeting and functionality confirmed. Redox mapping by ratiometric wide-field imaging reveals most oxidizing conditions in CA3 neurons. Furthermore, mitochondria are more oxidized than cytosol. Cytosolic and mitochondrial roGFP1s reliably report cell endogenous redox dynamics upon metabolic challenge or stimulation. Fluorescence lifetime imaging yields stable, but marginal, response ranges. We therefore developed automated excitation ratiometric 2-photon imaging. It offers superior sensitivity, spatial resolution, and response dynamics. Innovation and Conclusion: Redox indicator mice enable quantitative analyses of subcellular redox dynamics in a multitude of preparations and at all postnatal stages. This will uncover cell- and compartment-specific cerebral redox signals and their defined alterations during development, maturation, and aging. Cross-breeding with other disease models will reveal molecular details on compartmental redox homeostasis in neuropathology. 
Combined with ratiometric 2-photon imaging, this will foster our mechanistic understanding of cellular redox signals in their full complexity. Antioxid. Redox Signal. 25, 41–58. PMID:27059697

  5. A Reliable Data Transmission Model for IEEE 802.15.4e Enabled Wireless Sensor Network under WiFi Interference.

    PubMed

    Sahoo, Prasan Kumar; Pattanaik, Sudhir Ranjan; Wu, Shih-Lin

    2017-06-07

    The IEEE 802.15.4e standard proposes Medium Access Control (MAC) to support collision-free wireless channel access mechanisms for industrial, commercial and healthcare applications. However, unnecessary wastage of energy and bandwidth consumption occur due to inefficient backoff management and collisions. In this paper, a new channel access mechanism is designed for the buffer constraint sensor devices to reduce the packet drop rate, energy consumption and collisions. In order to avoid collision due to the hidden terminal problem, a new frame structure is designed for the data transmission. A new superframe structure is proposed to mitigate the problems due to WiFi and ZigBee interference. A modified superframe structure with a new retransmission opportunity for failure devices is proposed to reduce the collisions and retransmission delay with high reliability. Performance evaluation and validation of our scheme indicate that the packet drop rate, throughput, reliability, energy consumption and average delay of the nodes can be improved significantly.

  6. Thermal Management and Reliability of Power Electronics and Electric Machines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Narumanchi, Sreekant

    2016-09-19

    Increasing the number of electric-drive vehicles (EDVs) on America's roads has been identified as a strategy with near-term potential for dramatically decreasing the nation's dependence on oil - by the U.S. Department of Energy, the federal cross-agency EV-Everywhere Challenge, and the automotive industry. Mass-market deployment will rely on meeting aggressive technical targets, including improved efficiency and reduced size, weight, and cost. Many of these advances will depend on optimization of thermal management. Effective thermal management is critical to improving the performance and ensuring the reliability of EDVs. Efficient heat removal makes higher power densities and lower operating temperatures possible, and in turn enables cost and size reductions. The National Renewable Energy Laboratory (NREL), along with DOE and industry partners, is working to develop cost-effective thermal management solutions to increase device and component power densities. In this presentation, the activities in recent years related to thermal management and reliability of automotive power electronics and electric machines are presented.

  7. Fault-tolerant bandwidth reservation strategies for data transfers in high-performance networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zuo, Liudong; Zhu, Michelle M.; Wu, Chase Q.

    2016-11-22

    Many next-generation e-science applications need fast and reliable transfer of large volumes of data with guaranteed performance, which is typically enabled by the bandwidth reservation service in high-performance networks. One prominent issue in such network environments with large footprints is that node and link failures are inevitable, hence potentially degrading the quality of data transfer. We consider two generic types of bandwidth reservation requests (BRRs) concerning data transfer reliability: (i) to achieve the highest data transfer reliability under a given data transfer deadline, and (ii) to achieve the earliest data transfer completion time while satisfying a given data transfer reliability requirement. We propose two periodic bandwidth reservation algorithms with rigorous optimality proofs to optimize the scheduling of individual BRRs within BRR batches. The efficacy of the proposed algorithms is illustrated through extensive simulations in comparison with scheduling algorithms widely adopted in production networks in terms of various performance metrics.
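    The two BRR types can be phrased as simple selection rules over candidate schedules, each summarized by a (reliability, completion-time) pair. The toy sketch below illustrates only the objective of each request type; the paper's actual algorithms schedule batches of BRRs over time-varying link bandwidths, which is far richer:

    ```python
    from typing import List, Optional, Tuple

    # (reliability, completion_time) of one candidate transfer schedule
    Option = Tuple[float, float]

    def max_reliability_by_deadline(options: List[Option],
                                    deadline: float) -> Optional[Option]:
        """BRR type (i): highest reliability among schedules
        that finish by the given deadline."""
        feasible = [o for o in options if o[1] <= deadline]
        return max(feasible, key=lambda o: o[0]) if feasible else None

    def earliest_completion_by_reliability(options: List[Option],
                                           min_rel: float) -> Optional[Option]:
        """BRR type (ii): earliest completion among schedules
        that meet the reliability floor."""
        feasible = [o for o in options if o[0] >= min_rel]
        return min(feasible, key=lambda o: o[1]) if feasible else None
    ```

    Returning None when no candidate is feasible mirrors a reservation request that must be rejected.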

  8. Design and Implementation of a MAC Protocol for Timely and Reliable Delivery of Command and Data in Dynamic Wireless Sensor Networks

    PubMed Central

    Oh, Hoon; Van Vinh, Phan

    2013-01-01

    This paper proposes and implements a new TDMA-based MAC protocol that provides timely and reliable delivery of data and commands for monitoring and control networks. In this kind of network, sensor nodes periodically sense data from the monitored environment and send the data to a sink. The sink determines whether the environment is safe by analyzing the acquired data. Sometimes a command or control message is sent from the sink to a particular node or a group of nodes to execute services or request further data of interest. The proposed MAC protocol enables bidirectional communication, controls the active and sleep modes of a sensor node to conserve energy, and addresses the problem of load unbalancing between the nodes near a sink and the other nodes. It can significantly improve communication reliability while extending network lifetime. These claims are supported by the experimental results. PMID:24084116

  9. Design and implementation of a MAC protocol for timely and reliable delivery of command and data in dynamic wireless sensor networks.

    PubMed

    Oh, Hoon; Van Vinh, Phan

    2013-09-30

    This paper proposes and implements a new TDMA-based MAC protocol that provides timely and reliable delivery of data and commands for monitoring and control networks. In this kind of network, sensor nodes periodically sense data from the monitored environment and send the data to a sink. The sink determines whether the environment is safe by analyzing the acquired data. Sometimes a command or control message is sent from the sink to a particular node or a group of nodes to execute services or request further data of interest. The proposed MAC protocol enables bidirectional communication, controls the active and sleep modes of a sensor node to conserve energy, and addresses the problem of load unbalancing between the nodes near a sink and the other nodes. It can significantly improve communication reliability while extending network lifetime. These claims are supported by the experimental results.
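    One way to see the load-unbalancing problem such a protocol must address: in a convergecast tree, a node near the sink relays every packet generated in its subtree. A hedged sketch follows; the tree and the one-slot-per-forwarded-packet allocation rule are illustrative assumptions, not the paper's schedule.

```python
# Illustrative TDMA slot allocation for a convergecast tree: a node near
# the sink forwards traffic for its whole subtree, so giving it one
# upstream slot per descendant-plus-self balances the per-cycle load.

def subtree_sizes(tree, root):
    """Number of nodes in each node's subtree (itself included)."""
    size = {}
    def walk(n):
        size[n] = 1 + sum(walk(c) for c in tree.get(n, []))
        return size[n]
    walk(root)
    return size

def assign_slots(tree, root):
    """One upstream slot per packet a node must forward in a cycle."""
    sizes = subtree_sizes(tree, root)
    return {n: s for n, s in sizes.items() if n != root}

tree = {"sink": ["a", "b"], "a": ["c", "d"], "b": []}
print(assign_slots(tree, "sink"))  # node "a" forwards for c, d and itself: 3 slots
```

    Leaf nodes get a single slot, while the relay node nearest the sink gets slots proportional to the traffic it must carry, which is the intuition behind balancing load between near-sink nodes and the rest.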

  10. Non-destructive Detection of Screw Dislocations and the Corresponding Defects Nucleated from Them During SiC Epitaxial Growth and Their Effect on Device Characteristics

    NASA Astrophysics Data System (ADS)

    Das, H.; Sunkari, S.; Naas, H.

    2018-06-01

    In high-volume manufacturing of SiC power devices such as Schottky barrier diodes and MOSFETs, especially given the demands of high-reliability applications such as the automotive market, the issue of reliability needs to be tackled from multiple angles. It becomes important to isolate and eliminate failure mechanisms at the source rather than rely solely on electrical tests. As we enter volume production on 150-mm substrates, a further layer of reliability and improved yield can be achieved if potential sources of defects are identified and removed. In this work, we present the non-destructive detection of a subset of screw dislocations in N+ doped substrates, trace the preferential nucleation of V-type epitaxial defects and stacking faults from these screw dislocations, and study their electrical effects on Schottky diodes. This enables the screening of highly defective substrates even before committing them to epitaxial growth.

  11. Synthetic incoherent feedforward circuits show adaptation to the amount of their genetic template

    PubMed Central

    Bleris, Leonidas; Xie, Zhen; Glass, David; Adadey, Asa; Sontag, Eduardo; Benenson, Yaakov

    2011-01-01

    Natural and synthetic biological networks must function reliably in the face of fluctuating stoichiometry of their molecular components. These fluctuations are caused in part by changes in relative expression efficiency and the DNA template amount of the network-coding genes. Gene product levels could potentially be decoupled from these changes via built-in adaptation mechanisms, thereby boosting network reliability. Here, we show that a mechanism based on an incoherent feedforward motif enables adaptive gene expression in mammalian cells. We modeled, synthesized, and tested transcriptional and post-transcriptional incoherent loops and found that in all cases the gene product adapts to changes in DNA template abundance. We also observed that the post-transcriptional form results in superior adaptation behavior, higher absolute expression levels, and lower intrinsic fluctuations. Our results support a previously hypothesized endogenous role in gene dosage compensation for such motifs and suggest that their incorporation in synthetic networks will improve their robustness and reliability. PMID:21811230
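    The adaptation mechanism has a one-line steady-state caricature: if the same template dosage D drives both the output gene and its repressor (the incoherent arm), D cancels in their ratio. The rate constants below are illustrative assumptions, not fitted values from the paper.

```python
# Toy steady-state model of incoherent feedforward adaptation to DNA
# template dosage D: production and repression both scale with D, so the
# steady-state output is dosage-independent.

def steady_output(D, k_out=10.0, k_rep=2.0):
    repressor = k_rep * D           # repressor level grows with template dosage
    production = k_out * D          # so does raw production of the output
    return production / repressor   # the ratio cancels D -> adaptation

for dosage in (1, 5, 25):
    print(dosage, steady_output(dosage))  # output stays at 5.0 for every dosage
```

    Real circuits adapt only approximately (basal expression, saturation, and noise break the perfect cancellation), which is why the paper distinguishes the adaptation quality of transcriptional versus post-transcriptional implementations.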

  12. A Reliable Data Transmission Model for IEEE 802.15.4e Enabled Wireless Sensor Network under WiFi Interference

    PubMed Central

    Sahoo, Prasan Kumar; Pattanaik, Sudhir Ranjan; Wu, Shih-Lin

    2017-01-01

    The IEEE 802.15.4e standard proposes Medium Access Control (MAC) mechanisms to support collision-free wireless channel access for industrial, commercial and healthcare applications. However, inefficient backoff management and collisions lead to unnecessary wastage of energy and bandwidth. In this paper, a new channel access mechanism is designed for buffer-constrained sensor devices to reduce the packet drop rate, energy consumption and collisions. In order to avoid collisions due to the hidden terminal problem, a new frame structure is designed for data transmission. A new superframe structure is proposed to mitigate the problems due to WiFi and ZigBee interference. A modified superframe structure with a new retransmission opportunity for failed devices is proposed to reduce collisions and retransmission delay with high reliability. Performance evaluation and validation of our scheme indicate that the packet drop rate, throughput, reliability, energy consumption and average delay of the nodes can be improved significantly. PMID:28590434

  13. Enhanced Communication Network Solution for Positive Train Control Implementation

    NASA Technical Reports Server (NTRS)

    Fatehi, M. T.; Simon, J.; Chang, W.; Chow, E. T.; Burleigh, S. C.

    2011-01-01

    The commuter and freight railroad industry is required to implement Positive Train Control (PTC) by 2015 (2012 for Metrolink), a challenging network communications problem. This paper discusses technologies developed by the National Aeronautics and Space Administration (NASA) to overcome comparable communication challenges encountered in deep space mission operations. PTC will be based on a new cellular wireless packet Internet Protocol (IP) network. However, ensuring reliability in such a network is difficult due to the "dead zones" and transient disruptions we commonly experience when we lose calls in commercial cellular networks. These disruptions make it difficult to meet PTC's stringent reliability (99.999%) and safety requirements, deployment deadlines, and budget. This paper proposes innovative solutions based on space-proven technologies that would help meet these challenges: (1) Delay Tolerant Networking (DTN) technology, designed for use in resource-constrained, embedded systems and currently in use on the International Space Station, enables reliable communication over networks in which timely data acknowledgments might not be possible due to transient link outages. (2) Policy-Based Management (PBM) provides dynamic management capabilities, allowing vital data to be exchanged selectively (with priority) by utilizing alternative communication resources. The resulting network may help railroads implement PTC faster, cheaper, and more reliably.
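    The DTN idea in item (1) can be caricatured as store-and-forward across outages: bundles are held rather than dropped in a "dead zone" and forwarded once connectivity returns. A minimal sketch follows, with illustrative class and method names; this is not the actual Bundle Protocol API.

```python
# Minimal store-and-forward sketch of the DTN concept: when the link is
# down, bundles are queued instead of being lost, and are delivered in
# order once the link is restored.

from collections import deque

class DtnNode:
    def __init__(self):
        self.link_up = True
        self.queue = deque()
        self.delivered = []

    def send(self, bundle):
        if self.link_up:
            self.delivered.append(bundle)   # immediate forwarding
        else:
            self.queue.append(bundle)       # store until the link returns

    def link_restored(self):
        self.link_up = True
        while self.queue:                   # flush stored bundles in order
            self.delivered.append(self.queue.popleft())

node = DtnNode()
node.send("position-1")
node.link_up = False                        # train enters a dead zone
node.send("position-2")
node.send("brake-cmd")
node.link_restored()
print(node.delivered)  # ['position-1', 'position-2', 'brake-cmd']
```

    The point of DTN is exactly this decoupling: end-to-end delivery does not require an end-to-end path to exist at any single moment, which is what makes 99.999% message reliability plausible over lossy cellular links.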

  14. Using the Reliability Theory for Assessing the Decision Confidence Probability for Comparative Life Cycle Assessments.

    PubMed

    Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis

    2016-03-01

    Comparative decision-making processes are widely used to identify which option (system, product, service, etc.) has the smaller environmental footprint and to provide recommendations that help stakeholders take future decisions. However, uncertainty complicates the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability that one option has a smaller environmental impact than another. Here we apply reliability theory to approximate the decision confidence probability. We compare the traditional Monte Carlo method with the First-Order Reliability Method (FORM). The Monte Carlo method needs high computational time to calculate the decision confidence probability. The FORM method enables us to approximate the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates the associated importance factors, which correspond to a sensitivity analysis in relation to the probability. The importance factors allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders while reducing the computational time.
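    A toy comparison of the two approaches, under the illustrative assumption that each option's impact score is normally distributed: for a linear limit state in Gaussian variables, FORM is exact, so it should agree closely with brute-force Monte Carlo. Means and standard deviations below are invented for the example.

```python
# Decision confidence probability P(impact_A < impact_B) two ways:
# FORM (via the reliability index beta) and plain Monte Carlo sampling.

import math, random

random.seed(0)
mu_a, sd_a = 100.0, 10.0   # impact score of option A (assumed normal)
mu_b, sd_b = 110.0, 12.0   # impact score of option B (assumed normal)

# FORM: limit state g = B - A; confidence P(g > 0) = Phi(beta), exact
# here because g is linear in Gaussian variables.
beta = (mu_b - mu_a) / math.hypot(sd_a, sd_b)
p_form = 0.5 * (1.0 + math.erf(beta / math.sqrt(2.0)))

# Monte Carlo: brute-force sampling of the same probability.
n = 100_000
hits = sum(random.gauss(mu_a, sd_a) < random.gauss(mu_b, sd_b) for _ in range(n))
p_mc = hits / n

print(f"FORM: {p_form:.3f}  MC: {p_mc:.3f}")  # both close to 0.74
```

    With nonlinear limit states FORM is only an approximation, but it needs a handful of gradient evaluations where Monte Carlo needs many thousands of samples, which is the computational saving the abstract describes.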

  15. Launch and Assembly Reliability Analysis for Human Space Exploration Missions

    NASA Technical Reports Server (NTRS)

    Cates, Grant; Gelito, Justin; Stromgren, Chel; Cirillo, William; Goodliff, Kandyce

    2012-01-01

    NASA's future human space exploration strategy includes single and multi-launch missions to various destinations including cis-lunar space, near Earth objects such as asteroids, and ultimately Mars. Each campaign is being defined by Design Reference Missions (DRMs). Many of these missions are complex, requiring multiple launches and assembly of vehicles in orbit. Certain missions also have constrained departure windows to the destination. These factors raise concerns regarding the reliability of launching and assembling all required elements in time to support planned departure. This paper describes an integrated methodology for analyzing launch and assembly reliability in any single DRM or set of DRMs starting with flight hardware manufacturing and ending with final departure to the destination. A discrete event simulation is built for each DRM that includes the pertinent risk factors including, but not limited to: manufacturing completion; ground transportation; ground processing; launch countdown; ascent; rendezvous and docking, assembly, and orbital operations leading up to trans-destination-injection. Each reliability factor can be selectively activated or deactivated so that the most critical risk factors can be identified. This enables NASA to prioritize mitigation actions so as to improve mission success.
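    The selective activation and deactivation of risk factors can be mimicked in a few lines: estimate the on-time probability with all factors active, then with each factor switched off, and rank factors by the difference. The phase names and slip probabilities below are invented for illustration, not DRM data.

```python
# Monte Carlo sketch of a launch-and-assembly reliability simulation with
# selectable risk factors: deactivating one factor at a time reveals how
# much that factor alone contributes to missing the departure window.

import random

PHASES = {                  # phase -> assumed probability of a schedule slip
    "manufacturing": 0.05,
    "ground_processing": 0.08,
    "launch_countdown": 0.10,
    "rendezvous_docking": 0.04,
}

def p_on_time(active, trials=50_000, seed=1):
    """Monte Carlo probability that no active risk factor slips."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        if all(rng.random() >= PHASES[ph] for ph in active):
            ok += 1
    return ok / trials

everything = list(PHASES)
baseline = p_on_time(everything)
for ph in everything:                       # deactivate one factor at a time
    without = p_on_time([p for p in everything if p != ph])
    print(f"{ph:20s} on-time probability rises by {without - baseline:.3f} when removed")
```

    Ranking factors this way is what lets mitigation effort be prioritized: here the highest-probability slip (launch countdown) shows the largest gain when removed.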

  16. Beyond reliability to profitability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bond, T.H.; Mitchell, J.S.

    1996-07-01

    Reliability concerns have controlled much of power generation design and operations. Emerging from a strictly regulated environment, profitability is becoming a much more important concept for today's power generation executives. This paper discusses the conceptual advance: view power plant maintenance as a profit center, go beyond reliability, and embrace profitability. Profit Centered Maintenance begins with the premise that financial considerations, namely profitability, drive most aspects of modern process and manufacturing operations. Profit Centered Maintenance is a continuous process of reliability and administrative improvement and optimization. For power generation executives with troublesome maintenance programs, Profit Centered Maintenance can be the blueprint to increased profitability. It requires a culture change to make decisions based on value, to reengineer the administration of maintenance, and to enable the people performing and administering maintenance to make the most of available maintenance information technology. The key steps are to optimize the physical function of maintenance and to resolve recurring maintenance problems so that the need for maintenance can be reduced. Profit Centered Maintenance is more than just an attitude; it is a path to profitability, be it through increased profits or increased market share.

  17. Reliability and availability evaluation of Wireless Sensor Networks for industrial applications.

    PubMed

    Silva, Ivanovitch; Guedes, Luiz Affonso; Portugal, Paulo; Vasques, Francisco

    2012-01-01

    Wireless Sensor Networks (WSN) currently represent the best candidate to be adopted as the communication solution for the last-mile connection in process control and monitoring applications in industrial environments. Most of these applications have stringent dependability (reliability and availability) requirements, as a system failure may result in economic losses, put people in danger or lead to environmental damage. Among the different types of faults that can lead to a system failure, permanent faults on network devices have a major impact. They can hamper communications over long periods of time and consequently disturb, or even disable, control algorithms. The lack of a structured approach for evaluating permanent faults prevents system designers from optimizing decisions that minimize these occurrences. In this work we propose a methodology based on the automatic generation of a fault tree to evaluate the reliability and availability of Wireless Sensor Networks when permanent faults occur on network devices. The proposal supports any topology, different levels of redundancy, network reconfigurations, criticality of devices and arbitrary failure conditions. The proposed methodology is particularly suitable for the design and validation of Wireless Sensor Networks when trying to optimize their reliability and availability requirements.
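    The generated fault tree is ultimately evaluated from basic-event probabilities through AND/OR gates. Below is a minimal sketch with an invented three-device topology; the structure and probabilities are illustrative and say nothing about the paper's automatic generation algorithm.

```python
# Minimal fault-tree evaluation: basic events are permanent device faults,
# OR-gates model single points of failure, an AND-gate models redundancy.

def or_gate(*probs):
    """Failure if any input fails (independent events)."""
    survive = 1.0
    for p in probs:
        survive *= (1.0 - p)
    return 1.0 - survive

def and_gate(*probs):
    """Failure only if all inputs fail (redundant devices)."""
    fail = 1.0
    for p in probs:
        fail *= p
    return fail

p_sink, p_router1, p_router2, p_sensor = 0.01, 0.05, 0.05, 0.02

# Network fails if the sink fails, OR both redundant routers fail,
# OR the sensor fails.
p_fail = or_gate(p_sink, and_gate(p_router1, p_router2), p_sensor)
print(f"unreliability: {p_fail:.4f}  reliability: {1 - p_fail:.4f}")
```

    Note how the redundant router pair contributes only 0.05 x 0.05 = 0.0025 to the top event, which is why adding redundancy at critical devices dominates the reliability improvement.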

  18. Reliability and Availability Evaluation of Wireless Sensor Networks for Industrial Applications

    PubMed Central

    Silva, Ivanovitch; Guedes, Luiz Affonso; Portugal, Paulo; Vasques, Francisco

    2012-01-01

    Wireless Sensor Networks (WSN) currently represent the best candidate to be adopted as the communication solution for the last-mile connection in process control and monitoring applications in industrial environments. Most of these applications have stringent dependability (reliability and availability) requirements, as a system failure may result in economic losses, put people in danger or lead to environmental damage. Among the different types of faults that can lead to a system failure, permanent faults on network devices have a major impact. They can hamper communications over long periods of time and consequently disturb, or even disable, control algorithms. The lack of a structured approach for evaluating permanent faults prevents system designers from optimizing decisions that minimize these occurrences. In this work we propose a methodology based on the automatic generation of a fault tree to evaluate the reliability and availability of Wireless Sensor Networks when permanent faults occur on network devices. The proposal supports any topology, different levels of redundancy, network reconfigurations, criticality of devices and arbitrary failure conditions. The proposed methodology is particularly suitable for the design and validation of Wireless Sensor Networks when trying to optimize their reliability and availability requirements. PMID:22368497

  19. Portuguese version of the PTSD Checklist-Military Version (PCL-M)-I: Confirmatory Factor Analysis and reliability.

    PubMed

    Carvalho, Teresa; Cunha, Marina; Pinto-Gouveia, José; Duarte, Joana

    2015-03-30

    The PTSD Checklist-Military Version (PCL-M) is a brief self-report instrument widely used to assess Post-traumatic Stress Disorder (PTSD) symptomatology in war Veterans, according to DSM-IV. This study sought to explore the factor structure and reliability of the Portuguese version of the PCL-M. A sample of 660 Portuguese Colonial War Veterans completed the PCL-M. Several Confirmatory Factor Analyses were conducted to test different structures for PCL-M PTSD symptoms. Although the respecified first-order four-factor model based on King et al.'s model showed the best fit to the data, the respecified first- and second-order models based on the DSM-IV symptom clusters also presented an acceptable fit. In addition, the PCL-M showed adequate reliability. The Portuguese version of the PCL-M is thus a valid and reliable measure to assess the severity of PTSD symptoms as described in DSM-IV. Its use with Portuguese Colonial War Veterans may ease screening of possible PTSD cases, promote more suitable treatment planning, and enable monitoring of therapeutic outcomes. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  20. Urbanisation, urbanicity, and health: a systematic review of the reliability and validity of urbanicity scales.

    PubMed

    Cyril, Sheila; Oldroyd, John C; Renzaho, Andre

    2013-05-28

    Despite a plethora of studies examining the effect of increased urbanisation on health, no single study has systematically examined the measurement properties of the scales used to measure urbanicity. It is critical to distinguish findings from studies that use surrogate measures of urbanicity (e.g. population density) from those that use measures rigorously tested for reliability and validity. The purpose of this study was to assess the measurement reliability and validity of the available urbanicity scales and to identify areas where more research is needed to facilitate the development of a standardised measure of urbanicity. Databases searched were MEDLINE with Full Text, CINAHL with Full Text, and PsycINFO (EBSCOhost), as well as Embase (Ovid), covering the period from January 1970 to April 2012. Studies included in this systematic review were those that focused on the development of an urbanicity scale with clearly defined items or the adoption of an existing scale, included at least one outcome measure related to health, were published in peer-reviewed journals, had full text available in English, and were tested for validity and reliability. Eleven studies, conducted in Sri Lanka, Austria, China, Nigeria, India and the Philippines, met our inclusion criteria. They ranged in size from 3327 to 33,404 participants. The number of scale items ranged from 7 to 12 items in 5 studies. One study measured urban-area socioeconomic disadvantage instead of urbanicity. The emerging evidence is that increased urbanisation is associated with deleterious health outcomes. It is possible that increased urbanisation is also associated with access to and utilisation of health services. However, urbanicity measures differed across studies, and the reliability and validity properties of the scales used were not well established. There is an urgent need for studies to standardise measures of urbanicity.
Longitudinal cohort studies to confirm the relationship between increased urbanisation and health outcomes are urgently needed.

  1. Urbanisation, urbanicity, and health: a systematic review of the reliability and validity of urbanicity scales

    PubMed Central

    2013-01-01

    Background Despite a plethora of studies examining the effect of increased urbanisation on health, no single study has systematically examined the measurement properties of the scales used to measure urbanicity. It is critical to distinguish findings from studies that use surrogate measures of urbanicity (e.g. population density) from those that use measures rigorously tested for reliability and validity. The purpose of this study was to assess the measurement reliability and validity of the available urbanicity scales and to identify areas where more research is needed to facilitate the development of a standardised measure of urbanicity. Methods Databases searched were MEDLINE with Full Text, CINAHL with Full Text, and PsycINFO (EBSCOhost), as well as Embase (Ovid), covering the period from January 1970 to April 2012. Studies included in this systematic review were those that focused on the development of an urbanicity scale with clearly defined items or the adoption of an existing scale, included at least one outcome measure related to health, were published in peer-reviewed journals, had full text available in English, and were tested for validity and reliability. Results Eleven studies, conducted in Sri Lanka, Austria, China, Nigeria, India and the Philippines, met our inclusion criteria. They ranged in size from 3327 to 33,404 participants. The number of scale items ranged from 7 to 12 items in 5 studies. One study measured urban-area socioeconomic disadvantage instead of urbanicity. The emerging evidence is that increased urbanisation is associated with deleterious health outcomes. It is possible that increased urbanisation is also associated with access to and utilisation of health services. However, urbanicity measures differed across studies, and the reliability and validity properties of the scales used were not well established. Conclusion There is an urgent need for studies to standardise measures of urbanicity. 
Longitudinal cohort studies to confirm the relationship between increased urbanisation and health outcomes are urgently needed. PMID:23714282

  2. Investigating the Reliability and Factor Structure of Kalichman's "Survey 2: Research Misconduct" Questionnaire: A Post Hoc Analysis Among Biomedical Doctoral Students in Scandinavia.

    PubMed

    Holm, Søren; Hofmann, Bjørn

    2017-10-01

    A precondition for reducing scientific misconduct is evidence about scientists' attitudes. We need reliable survey instruments, and this study investigates the reliability of Kalichman's "Survey 2: research misconduct" questionnaire. The study is a post hoc analysis of data from three surveys among biomedical doctoral students in Scandinavia (2010-2015). We perform reliability analysis, and exploratory and confirmatory factor analysis using a split-sample design as a partial validation. The results indicate that a reliable 13-item scale can be formed (Cronbach's α = .705), and factor analysis indicates that there are four reliable subscales each tapping a different construct: (a) general attitude to misconduct (α = .768), (b) attitude to personal misconduct (α = .784), (c) attitude to whistleblowing (α = .841), and (d) attitude to blameworthiness/punishment (α = .877). A full validation of the questionnaire requires further research. We, nevertheless, hope that the results will facilitate the increased use of the questionnaire in research.
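    The α values reported above are Cronbach's alpha coefficients. As a reminder of what the statistic computes, here is a minimal sketch on made-up Likert-scale responses (rows are respondents, columns are items); the data are invented for illustration.

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of
# the total score), computed with sample (n-1) variances.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

def cronbach_alpha(rows):
    k = len(rows[0])                              # number of items
    items = list(zip(*rows))                      # column-wise item scores
    item_var = sum(variance(col) for col in items)
    total_var = variance([sum(r) for r in rows])  # variance of sum scores
    return k / (k - 1) * (1 - item_var / total_var)

responses = [
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 4],
    [1, 2, 1, 2],
]
print(round(cronbach_alpha(responses), 3))  # 0.959: high internal consistency
```

    Values around .7 and above, as in the subscales reported here, are conventionally read as acceptable reliability, though alpha also depends on the number of items in the scale.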

  3. Towards Designing Cognitively-Enriched Project-Oriented Courses within a Blended Problem-Based Learning Context

    ERIC Educational Resources Information Center

    Tambouris, Efthimios; Zotou, Maria; Tarabanis, Konstantinos

    2014-01-01

    Traditional education seems to gradually and moderately make way for self-directed and student-centred learning strategies that will efficiently enable students to reach their full potentials and will sufficiently prepare them for their upcoming professional careers. Problem-Based Learning (PBL) is such a strategy, since it enables active…

  4. Pennsylvania Online: A Curriculum Guide for School Library Media Centers.

    ERIC Educational Resources Information Center

    Pennsylvania State Library, Harrisburg.

    This curriculum guide is intended for any librarian in Pennsylvania committed to teaching online searching and looking for guidelines to integrate the skill into the full academic curriculum. The publication will enable school librarians to assist students in developing the skills that will enable them to search and retrieve information from…

  5. Optimal maintenance of a multi-unit system under dependencies

    NASA Astrophysics Data System (ADS)

    Sung, Ho-Joon

    The availability, or reliability, of an engineering component greatly influences the operational cost and safety characteristics of a modern system over its life-cycle. Until recently, reliance on past empirical data has been the industry-standard practice for developing maintenance policies that provide the minimum level of system reliability. Because such empirically derived policies are vulnerable to unforeseen or fast-changing external factors, recent advancements in the study of maintenance, known as the optimal maintenance problem, have gained considerable interest as a legitimate area of research. An extensive body of applicable work is available, ranging from work concerned with identifying maintenance policies aimed at providing required system availability at minimum possible cost, to topics on imperfect maintenance of multi-unit systems under dependencies. Nonetheless, these existing mathematical approaches to solving for optimal maintenance policies must be treated with caution when considered for broader applications, as they are accompanied by specialized treatments to ease the mathematical derivation of unknown functions in both the objective function and the constraint of a given optimal maintenance problem. These unknown functions are defined as reliability measures in this thesis, and these measures (e.g., expected number of failures, system renewal cycle, expected system up time, etc.) do not often lend themselves to closed-form formulas. It is thus quite common to impose simplifying assumptions on the input probability distributions of components' lifetimes or repair policies. Simplifying the complex structure of a multi-unit system to a k-out-of-n system by neglecting any sources of dependencies is another commonly practiced technique intended to increase the mathematical tractability of a particular model. 
This dissertation presents a proposal for an alternative methodology to solve optimal maintenance problems by aiming to achieve the same end-goals as Reliability Centered Maintenance (RCM). RCM was first introduced to the aircraft industry in an attempt to bridge the gap between the empirically-driven and theory-driven approaches to establishing optimal maintenance policies. Under RCM, qualitative processes that enable the prioritizing of functions based on the criticality and influence would be combined with mathematical modeling to obtain the optimal maintenance policies. Where this thesis work deviates from RCM is its proposal to directly apply quantitative processes to model the reliability measures in optimal maintenance problem. First, Monte Carlo (MC) simulation, in conjunction with a pre-determined Design of Experiments (DOE) table, can be used as a numerical means of obtaining the corresponding discrete simulated outcomes of the reliability measures based on the combination of decision variables (e.g., periodic preventive maintenance interval, trigger age for opportunistic maintenance, etc.). These discrete simulation results can then be regressed as Response Surface Equations (RSEs) with respect to the decision variables. Such an approach to represent the reliability measures with continuous surrogate functions (i.e., the RSEs) not only enables the application of the numerical optimization technique to solve for optimal maintenance policies, but also obviates the need to make mathematical assumptions or impose over-simplifications on the structure of a multi-unit system for the sake of mathematical tractability. The applicability of the proposed methodology to a real-world optimal maintenance problem is showcased through its application to a Time Limited Dispatch (TLD) of Full Authority Digital Engine Control (FADEC) system. 
    In broader terms, this proof-of-concept exercise can be described as a constrained optimization problem whose objective is to identify the optimal system inspection interval that guarantees a certain level of availability for a multi-unit system. A variety of reputable numerical techniques were used to model the problem as accurately as possible, including algorithms for the MC simulation, an imperfect maintenance model from quasi-renewal processes, repair time simulation, and state transition rules. Variance Reduction Techniques (VRTs) were also used in an effort to enhance MC simulation efficiency. After accurate MC simulation results are obtained, the RSEs are generated based on goodness-of-fit measures to yield as parsimonious a model as possible for constructing the optimization problem. Under the assumption of a constant failure rate for lifetime distributions, the inspection interval from the proposed methodology was found to be consistent with the one from the common approach used in industry that leverages a Continuous Time Markov Chain (CTMC). While the latter does not consider maintenance cost settings, the proposed methodology enables an operator to consider different types of maintenance cost settings, e.g., inspection cost, system corrective maintenance cost, etc., resulting in more flexible maintenance policies. When the proposed methodology was applied to the same TLD of FADEC example, but under the more generalized assumption of a strictly Increasing Failure Rate (IFR) for the lifetime distribution, it was shown to successfully capture component wear-out, as well as the economic dependencies among the system components.
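    The surrogate-modeling pipeline (MC simulation at DOE points, regression into an RSE, then optimization over the surrogate) can be sketched on a toy inspection problem. Everything below - the cost model, the rates, and the DOE points - is an illustrative assumption, not the FADEC/TLD model from the dissertation.

```python
# MC + DOE + response-surface sketch: estimate a cost rate at a few
# inspection intervals, fit a quadratic RSE, and take its vertex as the
# candidate optimal interval.

import random

def mc_cost_rate(interval, trials=20_000, seed=7):
    """MC estimate of cost per unit time: each inspection costs 1; a hidden
    failure accrues 0.5 per unit of undetected downtime; lifetimes are
    exponential with mean 100. Same seed -> common random numbers."""
    rng = random.Random(seed)
    cost = 0.0
    for _ in range(trials):
        life = rng.expovariate(1 / 100.0)
        downtime = max(0.0, interval - life)  # failure hidden until inspection
        cost += 1.0 + 0.5 * downtime
    return cost / (trials * interval)

def fit_quadratic(xs, ys):
    """Least-squares y = a + b*x + c*x^2 via Cramer's rule on the 3x3
    normal equations."""
    S = lambda k: sum(x ** k for x in xs)
    R = lambda k: sum(y * x ** k for x, y in zip(xs, ys))
    A = [[len(xs), S(1), S(2)], [S(1), S(2), S(3)], [S(2), S(3), S(4)]]
    r = [R(0), R(1), R(2)]
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(A)
    coeffs = []
    for j in range(3):
        M = [row[:] for row in A]
        for i in range(3):
            M[i][j] = r[i]
        coeffs.append(det(M) / d)
    return coeffs  # [a, b, c]

doe = [10, 20, 30, 40]                        # DOE points for the interval
rates = [mc_cost_rate(t) for t in doe]
a, b, c = fit_quadratic(doe, rates)
t_opt = -b / (2 * c)                          # vertex of the response surface
print(f"candidate optimal inspection interval: {t_opt:.1f}")
```

    The continuous surrogate is what makes gradient-based or closed-form optimization possible even though each underlying evaluation is a noisy simulation, which is the central point of the proposed methodology.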

  6. Restoring Faith in the bulk-power system: an early assessment of mandatory reliability standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McAllister, Levi; Dawson, Kelly L.

    2010-03-15

    The driving force underlying creation of mandatory reliability standards was the prevention of widespread outages, such as those that occurred in 1965, 1977 and 2003. So far, no similar outage has occurred when an entity is in full compliance with the standards, and NERC and FERC have demonstrated that they will actively enforce compliance while aggressively pursuing entities alleged to be non-compliant. (author)

  7. Planck 2015 results. XXVI. The Second Planck Catalogue of Compact Sources

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Argüeso, F.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartolo, N.; Battaner, E.; Beichman, C.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bock, J. J.; Böhringer, H.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Carvalho, P.; Catalano, A.; Challinor, A.; Chamballu, A.; Chary, R.-R.; Chiang, H. C.; Christensen, P. R.; Clemens, M.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Désert, F.-X.; Dickinson, C.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Falgarone, E.; Fergusson, J.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Frejsel, A.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Hanson, D.; Harrison, D. L.; Helou, G.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Leahy, J. P.; Leonardi, R.; León-Tavares, J.; Lesgourgues, J.; Levrier, F.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. 
F.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Maris, M.; Marshall, D. J.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; McGehee, P.; Meinhold, P. R.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschênes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Negrello, M.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Perdereau, O.; Perotto, L.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Reach, W. T.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rowan-Robinson, M.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Sanghera, H. S.; Santos, D.; Savelainen, M.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Spencer, L. D.; Stolyarov, V.; Sudiwala, R.; Sunyaev, R.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tornikoski, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Türler, M.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vielva, P.; Villa, F.; Wade, L. A.; Walter, B.; Wandelt, B. D.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zonca, A.

    2016-09-01

    The Second Planck Catalogue of Compact Sources is a list of discrete objects detected in single-frequency maps from the full duration of the Planck mission and supersedes previous versions. It consists of compact sources, both Galactic and extragalactic, detected over the entire sky. Compact sources detected in the lower frequency channels are assigned to the PCCS2, while at higher frequencies they are assigned to one of two subcatalogues, the PCCS2 or PCCS2E, depending on their location on the sky. The first of these (PCCS2) covers most of the sky and allows the user to produce subsamples at higher reliabilities than the target 80% integral reliability of the catalogue. The second (PCCS2E) contains sources detected in sky regions where the diffuse emission makes it difficult to quantify the reliability of the detections. Both the PCCS2 and PCCS2E include polarization measurements, in the form of polarized flux densities, or upper limits, and orientation angles for all seven polarization-sensitive Planck channels. The improved data-processing of the full-mission maps and their reduced noise levels allow us to increase the number of objects in the catalogue, improving its completeness for the target 80% reliability as compared with the previous versions, the PCCS and the Early Release Compact Source Catalogue (ERCSC).

  8. The Japanese version of the questionnaire about the process of recovery: development and validity and reliability testing.

    PubMed

    Kanehara, Akiko; Kotake, Risa; Miyamoto, Yuki; Kumakura, Yousuke; Morita, Kentaro; Ishiura, Tomoko; Shimizu, Kimiko; Fujieda, Yumiko; Ando, Shuntaro; Kondo, Shinsuke; Kasai, Kiyoto

    2017-11-07

Personal recovery is increasingly recognised as an important outcome measure in mental health services. This study aimed to develop a Japanese version of the Questionnaire about the Process of Recovery (QPR-J) and test its validity and reliability. The study comprised two stages that employed cross-sectional and prospective cohort designs, respectively. We translated the questionnaire using a standard translation/back-translation method. Convergent validity was examined by calculating Pearson's correlation coefficients with scores on the Recovery Assessment Scale (RAS) and the Short-Form-8 Health Survey (SF-8). An exploratory factor analysis (EFA) was conducted to examine factorial validity. We used intraclass correlation and Cronbach's alpha to examine the test-retest and internal consistency reliability of the QPR-J's 22-item full scale, 17-item intrapersonal and 5-item interpersonal subscales. We conducted an EFA along with a confirmatory factor analysis (CFA). Data were obtained from 197 users of mental health services (mean age: 42.0 years; 61.9% female; 49.2% diagnosed with schizophrenia). The QPR-J showed adequate convergent validity, exhibiting significant, positive correlations with the RAS and SF-8 scores. The QPR-J's full scale and subscales showed excellent test-retest and internal consistency reliability, with the exception of acceptable but relatively low internal consistency reliability for the interpersonal subscale. Based on the results of the CFA and EFA, we adopted the factor structure of the original two-factor model, as supported by the present CFA. The QPR-J is an adequately valid and reliable measure of the process of recovery among Japanese users of mental health services.
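As a concrete illustration of the internal-consistency statistic this abstract relies on, Cronbach's alpha can be computed directly from raw item scores. The sketch below is generic and not tied to the QPR-J data; the function name and the toy scores are invented for demonstration.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    items: one list of scores per questionnaire item, all of equal length
    (one score per respondent).
    """
    k = len(items)            # number of items
    n = len(items[0])         # number of respondents

    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(sample_var(item) for item in items)
    # Total score per respondent across all items.
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum_item_vars / sample_var(totals))
```

Values near 1 indicate high internal consistency; a conventional rule of thumb treats roughly 0.7 as the lower bound of "acceptable", which matches the abstract's description of the interpersonal subscale.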

  9. Overview of DOE-NE Proliferation and Terrorism Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadasivan, Pratap

    2012-08-24

Research objectives are: (1) Develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the life of current reactors; (2) Develop improvements in the affordability of new reactors to enable nuclear energy; (3) Develop Sustainable Nuclear Fuel Cycles; and (4) Understand and minimize the risks of nuclear proliferation and terrorism. The PTRA program supports DOE-NE's goal of using risk information to inform R&D program planning. The FY12 PTRA program is focused on terrorism risk. The program includes a mix of innovative methods that support the general practice of risk assessments, and selected applications.

  10. Gas-fired duplex free-piston Stirling refrigerator

    NASA Astrophysics Data System (ADS)

    Urieli, L.

    1984-03-01

    The duplex free-piston Stirling refrigerator is a potentially high efficiency, high reliability device which is ideally suited to the home appliance field, in particular as a gas-fired refrigerator. It has significant advantages over other equivalent devices including freedom from halogenated hydrocarbons, extremely low temperatures available at a high efficiency, integrated water heating, and simple burner system control. The design and development of a portable working demonstration gas-fired duplex Stirling refrigeration unit is described. A unique combination of computer aided development and experimental development was used, enabling a continued interaction between the theoretical analysis and practical testing and evaluation. A universal test rig was developed in order to separately test and evaluate major subunits, enabling a smooth system integration phase.

  11. A Robust Damage-Reporting Strategy for Polymeric Materials Enabled by Aggregation-Induced Emission.

    PubMed

    Robb, Maxwell J; Li, Wenle; Gergely, Ryan C R; Matthews, Christopher C; White, Scott R; Sottos, Nancy R; Moore, Jeffrey S

    2016-09-28

    Microscopic damage inevitably leads to failure in polymers and composite materials, but it is difficult to detect without the aid of specialized equipment. The ability to enhance the detection of small-scale damage prior to catastrophic material failure is important for improving the safety and reliability of critical engineering components, while simultaneously reducing life cycle costs associated with regular maintenance and inspection. Here, we demonstrate a simple, robust, and sensitive fluorescence-based approach for autonomous detection of damage in polymeric materials and composites enabled by aggregation-induced emission (AIE). This simple, yet powerful system relies on a single active component, and the general mechanism delivers outstanding performance in a wide variety of materials with diverse chemical and mechanical properties.

  12. EHME: a new word database for research in Basque language.

    PubMed

    Acha, Joana; Laka, Itziar; Landa, Josu; Salaburu, Pello

    2014-11-14

This article presents EHME, the frequency dictionary of Basque structure, an online program that enables researchers in psycholinguistics to extract word and nonword stimuli, based on a broad range of statistics concerning the properties of Basque words. The database consists of 22.7 million tokens, and properties available include morphological structure frequency and word-similarity measures, apart from classical indexes: word frequency, orthographic structure, orthographic similarity, bigram and biphone frequency, and syllable-based measures. Measures are indexed at the lemma, morpheme and word level. We include reliability and validation analyses. The application is freely available, and enables the user to extract words based on concrete statistical criteria, as well as to obtain statistical characteristics from a list of words.
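To make one of the classical indexes concrete, token-weighted letter-bigram frequencies of the kind such databases report can be computed from a word-frequency list. This is an illustrative sketch only; the function name and the tiny three-word corpus are invented, not EHME data.

```python
from collections import Counter

def bigram_frequencies(word_freqs):
    """Sum token frequencies over each adjacent letter pair in the corpus."""
    counts = Counter()
    for word, freq in word_freqs.items():
        for i in range(len(word) - 1):
            counts[word[i:i + 2]] += freq  # weight each bigram by token count
    return counts

# Invented miniature corpus: word -> token frequency.
corpus = {"etxe": 40, "eta": 120, "mendi": 15}
freqs = bigram_frequencies(corpus)
```

The same pattern extends to biphone frequencies by iterating over a phonemic transcription instead of the orthographic string.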

  13. Concurrent extraction and reaction for the production of biodiesel from wet microalgae.

    PubMed

    Im, Hanjin; Lee, HanSol; Park, Min S; Yang, Ji-Won; Lee, Jae W

    2014-01-01

    This work addresses a reliable in situ transesterification process which integrates lipid extraction from wet microalgae, and its conversion to biodiesel, with a yield higher than 90 wt.%. This process enables single-step production of biodiesel from microalgae by mixing wet microalgal cells with solvent, methanol, and acid catalyst; and then heating them in one pot. The effects of reaction parameters such as reaction temperature, wet cell weight, reaction time, and catalyst volume on the conversion yield are investigated. This simultaneous extraction and transesterification of wet microalgae may enable a significant reduction in energy consumption by eliminating the drying process of algal cells and realize the economic production of biodiesel using wet microalgae. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. The relational underpinnings of quality internal auditing in medical clinics in Israel.

    PubMed

    Carmeli, Abraham; Zisu, Malka

    2009-03-01

Internal auditing is a key mechanism in enhancing organizational reliability. However, research on the ways quality internal auditing is enabled through learning, deterrence, motivation and process improvement is scant. In particular, the relational underpinnings of internal auditing have been understudied. This study attempts to address this need by examining how organizational trust, perceived organizational support and psychological safety enable internal auditing. Data were collected from employees in medical clinics of one of the largest healthcare organizations in Israel at two points in time, six months apart. Our results show that organizational trust and perceived organizational support are positively related to psychological safety (measured at time 1), which, in turn, is associated with internal auditing (measured at time 2).

  15. Advances in shutter drive technology to enhance man-portable infrared cameras

    NASA Astrophysics Data System (ADS)

    Durfee, David

    2012-06-01

    With an emphasis on highest reliability, infrared (IR) imagers have traditionally used simplest-possible shutters and field-proven technology. Most commonly, single-step rotary or linear magnetic actuators have been used with good success. However, several newer shutter drive technologies offer benefits in size and power reduction, enabling man-portable imagers that are more compact, lighter, and more durable. This paper will discuss improvements in shutter and shutter drive technology, which enable smaller and more power-efficient imagers. Topics will transition from single-step magnetic actuators to multi-stepping magnetic drives, latching vs. balanced systems for blade position shock-resistance, motor and geared motor drives, and associated stepper driver electronics. It will highlight performance tradeoffs pertinent to man-portable military systems.

  16. Addressing the social dimensions of citizen observatories: The Ground Truth 2.0 socio-technical approach for sustainable implementation of citizen observatories

    NASA Astrophysics Data System (ADS)

    Wehn, Uta; Joshi, Somya; Pfeiffer, Ellen; Anema, Kim; Gharesifard, Mohammad; Momani, Abeer

    2017-04-01

Owing to ICT-enabled citizen observatories, citizens can take on new roles in environmental monitoring, decision making and co-operative planning, and environmental stewardship. And yet implementing advanced citizen observatories for data collection, knowledge exchange and interactions to support policy objectives is neither always easy nor successful, given the required commitment, trust, and data reliability concerns. Many efforts are facing problems with the uptake and sustained engagement by citizens, limited scalability, unclear long-term sustainability and limited actual impact on governance processes. Similarly, to sustain the engagement of decision makers in citizen observatories, mechanisms are required from the start of the initiative in order to have them invest in and, hence, commit to and own the entire process. In order to implement sustainable citizen observatories, these social dimensions therefore need to be soundly managed. We provide empirical evidence of how the social dimensions of citizen observatories are being addressed in the Ground Truth 2.0 project, drawing on a range of relevant social science approaches. This project combines the social dimensions of citizen observatories with enabling technologies - via a socio-technical approach - so that their customisation and deployment is tailored to the envisaged societal and economic impacts of the observatories. The project consists of the demonstration and validation of six scaled-up citizen observatories in real operational conditions both in the EU and in Africa, with a specific focus on flora and fauna as well as water availability and water quality for land and natural resources management. The demonstration cases (4 EU and 2 African) cover the full 'spectrum' of citizen-sensed data usage and citizen engagement, and therefore allow testing and validation of the socio-technical concept for citizen observatories under a range of conditions.

  17. Rapid wide-scope screening of drugs of abuse, prescription drugs with potential for abuse and their metabolites in influent and effluent urban wastewater by ultrahigh pressure liquid chromatography-quadrupole-time-of-flight-mass spectrometry.

    PubMed

    Hernández, Félix; Bijlsma, Lubertus; Sancho, Juan V; Díaz, Ramon; Ibáñez, María

    2011-01-17

This work illustrates the potential of hybrid quadrupole-time-of-flight mass spectrometry (QTOF MS) coupled to ultrahigh pressure liquid chromatography (UHPLC) to investigate the presence of drugs of abuse in wastewater. After solid-phase extraction with Oasis MCX cartridges, seventy-six illicit drugs, prescription drugs with potential for abuse, and metabolites were investigated in the samples by TOF MS using an electrospray interface in positive ionization mode, with MS data acquired over an m/z range of 50-1000 Da. For 11 compounds, reference standards were available, and experimental data (e.g., retention time and fragmentation data) could be obtained, facilitating a more confident identification. The use of a QTOF instrument enabled the simultaneous application of two acquisition functions with different collision energies: a low energy (LE) function, where no or poor fragmentation took place, and a high energy (HE) function, where fragmentation in the collision cell was promoted. This approach, known as MS(E), enabled the simultaneous acquisition of full-spectrum accurate mass data of both protonated molecules and fragment ions in a single injection, providing relevant information that facilitates the rapid detection and reliable identification of these emerging contaminants in the sample matrices analyzed. In addition, isomeric compounds, like the opiates morphine and norcodeine, could be discriminated by their specific fragments observed in HE TOF MS spectra, without the need of reference standards. UHPLC-QTOF MS was proven to be a powerful and efficient technique for rapid wide-scope screening and identification of many relevant drugs in complex matrices, such as influent and effluent urban wastewater. Copyright © 2010 Elsevier B.V. All rights reserved.

  18. Nuclear Fabrication Consortium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levesque, Stephen

    2013-04-05

This report summarizes the activities undertaken by EWI while under contract from the Department of Energy (DOE) Office of Nuclear Energy (NE) for the management and operation of the Nuclear Fabrication Consortium (NFC). The NFC was established by EWI to independently develop, evaluate, and deploy fabrication approaches and data that support the re-establishment of the U.S. nuclear industry: ensuring that the supply chain will be competitive on a global stage, enabling more cost-effective and reliable nuclear power in a carbon-constrained environment. The NFC provided a forum for member original equipment manufacturers (OEMs), fabricators, manufacturers, and materials suppliers to effectively engage with each other and rebuild the capacity of this supply chain by: Identifying and removing impediments to the implementation of new construction and fabrication techniques and approaches for nuclear equipment, including system components and nuclear plants. Providing and facilitating detailed scientific-based studies on new approaches and technologies that will have positive impacts on the cost of building of nuclear plants. Analyzing and disseminating information about future nuclear fabrication technologies and how they could impact the North American and the International Nuclear Marketplace. Facilitating dialog and initiating alignment among fabricators, owners, trade associations, and government agencies. Supporting industry in helping to create a larger qualified nuclear supplier network. Acting as an unbiased technology resource to evaluate, develop, and demonstrate new manufacturing technologies. Creating welder and inspector training programs to help enable the necessary workforce for the upcoming construction work. Serving as a focal point for technology, policy, and politically interested parties to share ideas and concepts associated with fabrication across the nuclear industry.
The report presents the objectives and summaries of the Nuclear Fabrication Consortium projects. Full technical reports for each of the projects have been submitted as well.

  19. Development of a fast PCR protocol enabling rapid generation of AmpFℓSTR® Identifiler® profiles for genotyping of human DNA

    PubMed Central

    2012-01-01

    Background Traditional PCR methods for forensic STR genotyping require approximately 2.5 to 4 hours to complete, contributing a significant portion of the time required to process forensic DNA samples. The purpose of this study was to develop and validate a fast PCR protocol that enabled amplification of the 16 loci targeted by the AmpFℓSTR® Identifiler® primer set, allowing decreased cycling times. Methods Fast PCR conditions were achieved by substituting the traditional Taq polymerase for SpeedSTAR™ HS DNA polymerase which is designed for fast PCR, by upgrading to a thermal cycler with faster temperature ramping rates and by modifying cycling parameters (less time at each temperature) and adopting a two-step PCR approach. Results The total time required for the optimized protocol is 26 min. A total of 147 forensically relevant DNA samples were amplified using the fast PCR protocol for Identifiler. Heterozygote peak height ratios were not affected by fast PCR conditions, and full profiles were generated for single-source DNA amounts between 0.125 ng and 2.0 ng. Individual loci in profiles produced with the fast PCR protocol exhibited average n-4 stutter percentages ranging from 2.5 ± 0.9% (THO1) to 9.9 ± 2.7% (D2S1338). No increase in non-adenylation or other amplification artefacts was observed. Minor contributor alleles in two-person DNA mixtures were reliably discerned. Low level cross-reactivity (monomorphic peaks) was observed with some domestic animal DNA. Conclusions The fast PCR protocol presented offers a feasible alternative to current amplification methods and could aid in reducing the overall time in STR profile production or could be incorporated into a fast STR genotyping procedure for time-sensitive situations. PMID:22394458

  20. Development of a fast PCR protocol enabling rapid generation of AmpFℓSTR® Identifiler® profiles for genotyping of human DNA.

    PubMed

    Foster, Amanda; Laurin, Nancy

    2012-03-06

    Traditional PCR methods for forensic STR genotyping require approximately 2.5 to 4 hours to complete, contributing a significant portion of the time required to process forensic DNA samples. The purpose of this study was to develop and validate a fast PCR protocol that enabled amplification of the 16 loci targeted by the AmpFℓSTR® Identifiler® primer set, allowing decreased cycling times. Fast PCR conditions were achieved by substituting the traditional Taq polymerase for SpeedSTAR™ HS DNA polymerase which is designed for fast PCR, by upgrading to a thermal cycler with faster temperature ramping rates and by modifying cycling parameters (less time at each temperature) and adopting a two-step PCR approach. The total time required for the optimized protocol is 26 min. A total of 147 forensically relevant DNA samples were amplified using the fast PCR protocol for Identifiler. Heterozygote peak height ratios were not affected by fast PCR conditions, and full profiles were generated for single-source DNA amounts between 0.125 ng and 2.0 ng. Individual loci in profiles produced with the fast PCR protocol exhibited average n-4 stutter percentages ranging from 2.5 ± 0.9% (THO1) to 9.9 ± 2.7% (D2S1338). No increase in non-adenylation or other amplification artefacts was observed. Minor contributor alleles in two-person DNA mixtures were reliably discerned. Low level cross-reactivity (monomorphic peaks) was observed with some domestic animal DNA. The fast PCR protocol presented offers a feasible alternative to current amplification methods and could aid in reducing the overall time in STR profile production or could be incorporated into a fast STR genotyping procedure for time-sensitive situations.

  1. A rose by any other name: Certification seen as process rather than content

    NASA Technical Reports Server (NTRS)

    Wilson, John R.

    1994-01-01

    Green (1990) believes that the two main factors safeguarding flying from human error are both related to certification and regulation. First is the increasingly proceduralized nature of flying whereby as much as possible is reduced to a rule-based activity. Second is the emphasis placed upon training and competency checking of aircrew in simulators and in the air, both generally and for all particular types of aircraft flown. This leaves, believes Green, other human factors that are relatively unaddressed as yet and which can give rise to human reliability problems. These include: hardware factors and especially pilot/co-pilot relationships; and system factors including fatigue and cost/safety trade-offs. He also, importantly, identifies problems with the integration of the 'electronic crew member' following increased automation. Human reliability failures with artificial intelligence and automation, due to over-reliance on the system fail-safe mechanisms, or to operator under- confidence in the integrity or self-regulating capacity of the system, or to out-of-loop effects, are widely accepted as being due to deficiencies in plant design, planning, management and maintenance more than to 'operator error' - Reason's (1990) latent error or organization pathogens argument. Reliability failures in complex systems are well enough documented to give cause for concern and at least promote a debate on the merits of a full certification program. The purpose of this short paper is to seek out and explore what is valuable in certification, at the least to show that the benefits outweigh the disadvantages and at best to identify positive outcomes perhaps not obtainable in other ways. On both sides of the debate on certification there is general agreement on the need for a better human factors perspective and effort in complex aviation systems design. What is at issue is how this is to be promoted. 
It is incumbent upon opponents of certification to say how else such promotion might be enabled. This is an exploratory and philosophical review, not a focused and specific one, and it will draw upon much that is not firmly in the domain of complex aviation systems.

  2. Displaying contextual information reduces the costs of imperfect decision automation in rapid retasking of ISR assets.

    PubMed

    Rovira, Ericka; Cross, Austin; Leitch, Evan; Bonaceto, Craig

    2014-09-01

    The impact of a decision support tool designed to embed contextual mission factors was investigated. Contextual information may enable operators to infer the appropriateness of data underlying the automation's algorithm. Research has shown the costs of imperfect automation are more detrimental than perfectly reliable automation when operators are provided with decision support tools. Operators may trust and rely on the automation more appropriately if they understand the automation's algorithm. The need to develop decision support tools that are understandable to the operator provides the rationale for the current experiment. A total of 17 participants performed a simulated rapid retasking of intelligence, surveillance, and reconnaissance (ISR) assets task with manual, decision automation, or contextual decision automation differing in two levels of task demand: low or high. Automation reliability was set at 80%, resulting in participants experiencing a mixture of reliable and automation failure trials. Dependent variables included ISR coverage and response time of replanning routes. Reliable automation significantly improved ISR coverage when compared with manual performance. Although performance suffered under imperfect automation, contextual decision automation helped to reduce some of the decrements in performance. Contextual information helps overcome the costs of imperfect decision automation. Designers may mitigate some of the performance decrements experienced with imperfect automation by providing operators with interfaces that display contextual information, that is, the state of factors that affect the reliability of the automation's recommendation.

  3. Lifetime prediction and reliability estimation methodology for Stirling-type pulse tube refrigerators by gaseous contamination accelerated degradation testing

    NASA Astrophysics Data System (ADS)

    Wan, Fubin; Tan, Yuanyuan; Jiang, Zhenhua; Chen, Xun; Wu, Yinong; Zhao, Peng

    2017-12-01

Lifetime and reliability are the two performance parameters of premium importance for modern space Stirling-type pulse tube refrigerators (SPTRs), which are required to operate in excess of 10 years. Demonstration of these parameters provides a significant challenge. This paper proposes a lifetime prediction and reliability estimation method that utilizes accelerated degradation testing (ADT) for SPTRs related to gaseous contamination failure. The method was experimentally validated via three groups of gaseous contamination ADT. First, the performance degradation model based on the mechanism of contamination failure and the material outgassing characteristics of SPTRs was established. Next, a preliminary test was performed to determine whether the mechanism of contamination failure of the SPTRs during ADT is consistent with normal life testing. Subsequently, the experimental program of ADT was designed for SPTRs. Then, three groups of gaseous contamination ADT were performed at elevated ambient temperatures of 40 °C, 50 °C, and 60 °C, respectively, and the estimated lifetimes of the SPTRs under normal conditions were obtained through an acceleration model (the Arrhenius model). The results show good fitting of the degradation model with the experimental data. Finally, we obtained the reliability estimate of the SPTRs using the Weibull distribution. The proposed methodology enables the reliability of SPTRs designed for more than 10 years of operation to be estimated in less than one year of testing.
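The last two steps described above, extrapolating lifetime with the Arrhenius model and converting it into a reliability estimate with a Weibull distribution, can be sketched as follows. This is a generic illustration, not the paper's fitted model: the activation energy, the pseudo-lifetime observed at stress, and the Weibull shape parameter are invented placeholder values.

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_af(ea_ev, t_use_c, t_stress_c):
    """Acceleration factor between a stress temperature and the use temperature."""
    t_use_k = t_use_c + 273.15
    t_stress_k = t_stress_c + 273.15
    return math.exp((ea_ev / K_B) * (1.0 / t_use_k - 1.0 / t_stress_k))

def weibull_reliability(t, eta, beta):
    """Two-parameter Weibull reliability R(t) = exp(-(t/eta)^beta)."""
    return math.exp(-((t / eta) ** beta))

# Placeholder numbers for illustration only.
af = arrhenius_af(ea_ev=0.7, t_use_c=20.0, t_stress_c=60.0)
life_at_stress_h = 8000.0               # pseudo-lifetime observed during ADT, hours
life_at_use_h = life_at_stress_h * af   # extrapolated lifetime at use conditions
r_ten_years = weibull_reliability(10 * 8760.0, eta=life_at_use_h, beta=2.0)
```

In practice the scale parameter eta and shape parameter beta would be fitted to the degradation data from all three stress levels rather than assumed.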

  4. A new communication protocol family for a distributed spacecraft control system

    NASA Technical Reports Server (NTRS)

    Baldi, Andrea; Pace, Marco

    1994-01-01

In this paper we describe the concepts behind and architecture of a communication protocol family, which was designed to fulfill the communication requirements of ESOC's new distributed spacecraft control system SCOS 2. A distributed spacecraft control system needs a data delivery subsystem to be used for telemetry (TLM) distribution, telecommand (TLC) dispatch and inter-application communication, characterized by the following properties: reliability, so that any operational workstation is guaranteed to receive the data it needs to accomplish its role; efficiency, so that the telemetry distribution, even for missions with high telemetry rates, does not cause a degradation of the overall control system performance; scalability, so that the network is not the bottleneck both in terms of bandwidth and reconfiguration; flexibility, so that it can be efficiently used in many different situations. The new protocol family which satisfies the above requirements is built on top of widely used communication protocols (UDP and TCP), provides reliable point-to-point and broadcast communication (UDP+) and is implemented in C++. Reliability is achieved using a retransmission mechanism based on a sequence numbering scheme. Such a scheme is cost-effective compared to traditional protocols, because retransmission is only triggered by applications which explicitly need reliability. This flexibility enables applications with different profiles to take advantage of the available protocols, so that the best trade-off between speed and reliability can be achieved case by case.
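The sequence-numbering retransmission idea can be sketched as a toy model; this is not the SCOS 2 implementation (which is in C++ over real UDP sockets): all class and function names are invented here, and the lossy channel is simulated in memory.

```python
import random

class LossyChannel:
    """Drops packets at random; stands in for an unreliable datagram service."""
    def __init__(self, loss_rate, seed=0):
        self.loss_rate = loss_rate
        self.rng = random.Random(seed)

    def send(self, packet, deliver):
        if self.rng.random() >= self.loss_rate:
            deliver(packet)

class Receiver:
    """Accepts in-order packets and returns a cumulative acknowledgement."""
    def __init__(self):
        self.next_seq = 0
        self.delivered = []

    def on_packet(self, seq, payload):
        if seq == self.next_seq:      # in-order: deliver and advance
            self.delivered.append(payload)
            self.next_seq += 1
        return self.next_seq          # ACK = next sequence number expected

def send_reliably(messages, channel, receiver, max_rounds=200):
    """Retransmit unacknowledged packets until all are cumulatively acked."""
    base = 0
    while base < len(messages) and max_rounds > 0:
        max_rounds -= 1
        highest_ack = [base]
        def deliver(pkt):
            seq, payload = pkt
            highest_ack[0] = max(highest_ack[0], receiver.on_packet(seq, payload))
        # Resend everything not yet cumulatively acknowledged.
        for seq in range(base, len(messages)):
            channel.send((seq, messages[seq]), deliver)
        base = highest_ack[0]         # slide the window past acknowledged data
    return receiver.delivered

channel = LossyChannel(loss_rate=0.3, seed=1)
receiver = Receiver()
telemetry = ["TLM-0", "TLM-1", "TLM-2", "TLM-3", "TLM-4"]
delivered = send_reliably(telemetry, channel, receiver)
```

The point made in the abstract shows up here structurally: applications that do not need reliability would simply skip the retransmission loop and accept whatever the channel delivers, so only reliability-requiring traffic pays the retransmission cost.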

  5. The Healthcare Complaints Analysis Tool: development and reliability testing of a method for service monitoring and organisational learning

    PubMed Central

    Gillespie, Alex; Reader, Tom W

    2016-01-01

Background Letters of complaint written by patients and their advocates reporting poor healthcare experiences represent an under-used data source. The lack of a method for extracting reliable data from these heterogeneous letters hinders their use for monitoring and learning. To address this gap, we report on the development and reliability testing of the Healthcare Complaints Analysis Tool (HCAT). Methods HCAT was developed from a taxonomy of healthcare complaints reported in a previously published systematic review. It introduces the novel idea that complaints should be analysed in terms of severity. Recruiting three groups of educated lay participants (n=58, n=58, n=55), we refined the taxonomy through three iterations of discriminant content validity testing. We then supplemented this refined taxonomy with explicit coding procedures for seven problem categories (each with four levels of severity), stage of care and harm. These combined elements were further refined through iterative coding of a UK national sample of healthcare complaints (n=25, n=80, n=137, n=839). To assess reliability and accuracy for the resultant tool, 14 educated lay participants coded a referent sample of 125 healthcare complaints. Results The seven HCAT problem categories (quality, safety, environment, institutional processes, listening, communication, and respect and patient rights) were found to be conceptually distinct. On average, raters identified 1.94 problems (SD=0.26) per complaint letter. Coders exhibited substantial reliability in identifying problems at four levels of severity; moderate and substantial reliability in identifying stages of care (except for ‘discharge/transfer’ that was only fairly reliable) and substantial reliability in identifying overall harm. Conclusions HCAT is not only the first reliable tool for coding complaints, it is the first tool to measure the severity of complaints.
It facilitates service monitoring and organisational learning, and it enables future research examining whether healthcare complaints are a leading indicator of poor service outcomes. HCAT is freely available to download and use. PMID:26740496
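The reliability claims in this record rest on chance-corrected agreement statistics between coders. As a minimal illustration (not HCAT's published analysis; the severity codes below are invented), Cohen's kappa for two raters could be computed as:

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa: observed agreement corrected for the agreement
    expected by chance from each rater's marginal category frequencies."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n          # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2[k] for k in c1) / n ** 2          # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical severity codes (0-3) from two coders on ten complaint letters.
a = [0, 1, 2, 2, 3, 1, 0, 2, 1, 3]
b = [0, 1, 2, 1, 3, 1, 0, 2, 2, 3]
print(round(cohens_kappa(a, b), 2))  # → 0.73, i.e. "substantial" agreement
```

On the conventional Landis-Koch scale, values between 0.61 and 0.80 are read as "substantial" agreement, the band the abstract refers to.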

  6. Electrical power technology for robotic planetary rovers

    NASA Technical Reports Server (NTRS)

    Bankston, C. P.; Shirbacheh, M.; Bents, D. J.; Bozek, J. M.

    1993-01-01

    Power technologies which will enable a range of robotic rover vehicle missions by the end of the 1990s and beyond are discussed. The electrical power system is the most critical system for reliability and life, since all other on board functions (mobility, navigation, command and data, communications, and the scientific payload instruments) require electrical power. The following are discussed: power generation, energy storage, power management and distribution, and thermal management.

  7. On Predicting the Crystal Structure of Energetic Materials From Quantum Mechanics

    DTIC Science & Technology

    2008-12-01

A quantum-mechanically-based potential energy function that describes interactions of dimers of the explosive ...method is capable of producing force fields for interactions of the molecular crystalline explosive RDX, and appears to be suitable to enable reliable... Ridge, TN. Byrd, E.F.C., Scuseria, G.E., Chabalowski, C.F., 2004: “An ab initio study of solid nitromethane, HMX, RDX and CL20: Successes and

  8. Brillouin Scattering Spectrum Analysis Based on Auto-Regressive Spectral Estimation

    NASA Astrophysics Data System (ADS)

    Huang, Mengyun; Li, Wei; Liu, Zhangyun; Cheng, Linghao; Guan, Bai-Ou

    2018-06-01

Auto-regressive (AR) spectral estimation is proposed to analyze the Brillouin scattering spectrum in Brillouin optical time-domain reflectometry. It is shown that the AR-based method can reliably estimate the Brillouin frequency shift with an accuracy much better than fast Fourier transform (FFT) based methods, provided the data length is not too short. It enables about a 3-fold improvement over FFT at a moderate spatial resolution.
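The core idea, estimating a dominant spectral peak from an AR model rather than an FFT, can be sketched with a second-order Yule-Walker fit. This is a toy illustration with an assumed sampling rate and test tone, not the paper's BOTDR processing chain:

```python
import numpy as np

def ar2_frequency(x, fs):
    """Estimate the dominant frequency of a signal from an AR(2) model
    fitted via the Yule-Walker equations (minimal sketch; real Brillouin
    spectrum analysis would typically fit higher orders)."""
    x = x - x.mean()
    # Biased autocorrelation estimates r[0], r[1], r[2].
    r = np.array([np.dot(x[:len(x) - k], x[k:]) for k in range(3)]) / len(x)
    # Solve the 2x2 Yule-Walker system for AR coefficients a1, a2.
    R = np.array([[r[0], r[1]], [r[1], r[0]]])
    a1, a2 = np.linalg.solve(R, r[1:])
    # The pole angle of 1 - a1*z^-1 - a2*z^-2 gives the spectral peak.
    poles = np.roots([1.0, -a1, -a2])
    return abs(np.angle(poles[0])) * fs / (2 * np.pi)

fs = 1000.0                                   # assumed sampling rate (Hz)
t = np.arange(2048) / fs
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 123.0 * t) + 0.1 * rng.standard_normal(t.size)
print(ar2_frequency(x, fs))                   # within a few Hz of 123.0
```

Unlike an FFT, whose resolution is fixed at fs/N, the AR estimate is not bin-limited, which is the source of the accuracy advantage the abstract reports for short records.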

  9. Common Badging and Access Control System (CBACS)

    NASA Technical Reports Server (NTRS)

    Baldridge, Tim

    2005-01-01

The goals of the project are to: achieve high business value through a common badging and access control system that integrates with smart cards; provide physical (versus logical) deployment of smart cards initially; provide a common, consistent and reliable environment into which to release the smart card; create the opportunity to develop agency-wide consistent processes, practices and policies; enable enterprise data capture and management; and promote data validation prior to smart-card issuance.

  10. Enabling Technologies for Unified Life-Cycle Engineering of Structural Components

    DTIC Science & Technology

    1991-03-22

representations for entities in the ULCE system for unambiguous, reliable, and efficient retrieval, manipulation, and transfer of data. Develop a rapid analysis...approaches to these functions. It is reasonable to assume that program budgets for future systems will be more restrictive and that fixed-price contracting...enemy threats, economics, and politics. The requirements are voluminous and may stipulate firm fixed-price proposals with detailed schedules. At this

  11. Terahertz Characterization of DNA: Enabling a Novel Approach

    DTIC Science & Technology

    2015-11-01

    DNA in a more reliable and less procedurally complicated manner. The method involves the use of terahertz surface plasmon generated on the surface of...advantages are due to overlapping resonance when the plasmon frequency generated by a foil coincides with that of the biological material. The...interference of the impinging terahertz wave and surface plasmon produces spectral graphs, which can be analyzed to identify and characterize a DNA sample

  12. Thermally Stabilized Transmit/Receive Modules

    NASA Technical Reports Server (NTRS)

    Hoffman, James; DelCastillo, Linda; Miller, Jennifer; Birur, Gaj

    2011-01-01

RF-hybrid technologies enable smaller packaging and mass reduction in radar instruments, especially for subsystems with dense electronics, such as electronically steered arrays. We are designing thermally stabilized RF-hybrid T/R modules using new materials for improved thermal performance of electronics. We are combining advanced substrate and housing materials with a thermal-reservoir material, and developing new packaging techniques to significantly improve thermal-cycling reliability and performance stability over temperature.

  13. Look-ahead Dynamic Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-20

The look-ahead dynamic simulation software system incorporates high-performance parallel computing technologies, significantly reduces the solution time for each transient simulation case, and brings dynamic simulation analysis into on-line applications to enable more transparency for better reliability and asset utilization. It takes a snapshot of the current power grid status, performs the system dynamic simulation in parallel, and outputs the transient response of the power system in real time.

  14. Improved estimation of subject-level functional connectivity using full and partial correlation with empirical Bayes shrinkage.

    PubMed

    Mejia, Amanda F; Nebel, Mary Beth; Barber, Anita D; Choe, Ann S; Pekar, James J; Caffo, Brian S; Lindquist, Martin A

    2018-05-15

Reliability of subject-level resting-state functional connectivity (FC) is determined in part by the statistical techniques employed in its estimation. Methods that pool information across subjects to inform estimation of subject-level effects (e.g., Bayesian approaches) have been shown to enhance reliability of subject-level FC. However, fully Bayesian approaches are computationally demanding, while empirical Bayesian approaches typically rely on using repeated measures to estimate the variance components in the model. Here, we avoid the need for repeated measures by proposing a novel measurement error model for FC describing the different sources of variance and error, which we use to perform empirical Bayes shrinkage of subject-level FC towards the group average. In addition, since the traditional intra-class correlation coefficient (ICC) is inappropriate for biased estimates, we propose a new reliability measure denoted the mean squared error intra-class correlation coefficient (ICC_MSE) to properly assess the reliability of the resulting (biased) estimates. We apply the proposed techniques to test-retest resting-state fMRI data on 461 subjects from the Human Connectome Project to estimate connectivity between 100 regions identified through independent components analysis (ICA). We consider both correlation and partial correlation as the measure of FC and assess the benefit of shrinkage for each measure, as well as the effects of scan duration. We find that shrinkage estimates of subject-level FC exhibit substantially greater reliability than traditional estimates across various scan durations, even for the most reliable connections and regardless of connectivity measure. Additionally, we find partial correlation reliability to be highly sensitive to the choice of penalty term, and to be generally worse than that of full correlations except for certain connections and a narrow range of penalty values.
This suggests that the penalty needs to be chosen carefully when using partial correlations. Copyright © 2018. Published by Elsevier Inc.
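The essence of empirical Bayes shrinkage can be sketched in a few lines. This is a James-Stein-style toy, not the measurement-error model of the paper: the within-subject sampling variance is assumed known here, whereas the paper estimates all variance components from the data:

```python
import numpy as np

def eb_shrink(subject_fc, noise_var):
    """Shrink each subject's FC estimate toward the group mean with weight
    signal_var / (signal_var + noise_var), per connection (minimal sketch)."""
    group_mean = subject_fc.mean(axis=0)
    total_var = subject_fc.var(axis=0, ddof=1)
    # Between-subject (signal) variance = total minus sampling noise, floored at 0.
    signal_var = np.clip(total_var - noise_var, 0.0, None)
    shrink = signal_var / (signal_var + noise_var)   # weight on the subject's own estimate
    return group_mean + shrink * (subject_fc - group_mean)

rng = np.random.default_rng(1)
true_fc = rng.normal(0.3, 0.1, size=(40, 1))         # 40 subjects, one connection
noisy = true_fc + rng.normal(0, 0.2, size=(40, 1))   # noisy subject-level estimates
shrunk = eb_shrink(noisy, noise_var=0.04)
# Shrinkage should reduce mean squared error versus the raw estimates.
print(np.mean((shrunk - true_fc) ** 2) < np.mean((noisy - true_fc) ** 2))  # prints True
```

Because shrunk estimates are biased toward the group mean, the ordinary ICC is no longer an appropriate reliability measure, which is what motivates the ICC_MSE variant the abstract proposes.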

  15. CMOS-Technology-Enabled Flexible and Stretchable Electronics for Internet of Everything Applications.

    PubMed

    Hussain, Aftab M; Hussain, Muhammad M

    2016-06-01

    Flexible and stretchable electronics can dramatically enhance the application of electronics for the emerging Internet of Everything applications where people, processes, data and devices will be integrated and connected, to augment quality of life. Using naturally flexible and stretchable polymeric substrates in combination with emerging organic and molecular materials, nanowires, nanoribbons, nanotubes, and 2D atomic crystal structured materials, significant progress has been made in the general area of such electronics. However, high volume manufacturing, reliability and performance per cost remain elusive goals for wide commercialization of these electronics. On the other hand, highly sophisticated but extremely reliable, batch-fabrication-capable and mature complementary metal oxide semiconductor (CMOS)-based technology has facilitated tremendous growth of today's digital world using thin-film-based electronics; in particular, bulk monocrystalline silicon (100) which is used in most of the electronics existing today. However, one fundamental challenge is that state-of-the-art CMOS electronics are physically rigid and brittle. Therefore, in this work, how CMOS-technology-enabled flexible and stretchable electronics can be developed is discussed, with particular focus on bulk monocrystalline silicon (100). A comprehensive information base to realistically devise an integration strategy by rational design of materials, devices and processes for Internet of Everything electronics is offered. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Reliability and validity of a GPS-enabled iPhone "app" to measure physical activity.

    PubMed

    Benson, Amanda Clare; Bruce, Lyndell; Gordon, Brett Ashley

    2015-01-01

    This study assessed the validity and reliability of an iPhone "app" and two sport-specific global positioning system (GPS) units to monitor distance, intensity and contextual physical activity. Forty (23 female, 17 male) 18-55-year-olds completed two trials of six laps around a 400-m athletics track wearing GPSports Pro and WiSpi units (5 and 1 Hz) and an iPhone(TM) with a Motion X GPS(TM) "app" that used the inbuilt iPhone location services application programming interface to obtain its sampling rate (which is likely to be ≤1 Hz). Overall, the statistical agreement, assessed using t-tests and Bland-Altman plots, indicated an underestimation of the known track distance (2.400 km) and average speed by the Motion X GPS "app" and GPSports Pro while the GPSports WiSpi(TM) device overestimated these outcomes. There was a ≤3% variation between trials for distance and average speed when measured by any of the GPS devices. Thus, the smartphone "app" trialled could be considered as an accessible alternative to provide high-quality contextualised data to enable ubiquitous monitoring and modification of programmes to ensure appropriate intensity and type of physical activity is prescribed and more importantly adhered to.
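The agreement analysis described above (Bland-Altman bias and limits of agreement against a known track distance) can be sketched as follows; the lap distances below are invented, not the study's data:

```python
import numpy as np

def bland_altman(measured, reference):
    """Mean difference (bias) and 95% limits of agreement between a
    device's measurements and reference values (generic sketch)."""
    diff = np.asarray(measured, dtype=float) - np.asarray(reference, dtype=float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical distances (km) a GPS device recorded for six 2.400 km trials.
app_km = [2.31, 2.35, 2.33, 2.36, 2.32, 2.34]
bias, (lo, hi) = bland_altman(app_km, [2.400] * 6)
print(round(bias, 3))  # → -0.065, a negative bias: the device underestimates
```

A negative bias here corresponds to the underestimation the study reports for the "app" and the GPSports Pro unit; an overestimating device would show a positive bias.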

  17. Oscillation-Induced Signal Transmission and Gating in Neural Circuits

    PubMed Central

    Jahnke, Sven; Memmesheimer, Raoul-Martin; Timme, Marc

    2014-01-01

    Reliable signal transmission constitutes a key requirement for neural circuit function. The propagation of synchronous pulse packets through recurrent circuits is hypothesized to be one robust form of signal transmission and has been extensively studied in computational and theoretical works. Yet, although external or internally generated oscillations are ubiquitous across neural systems, their influence on such signal propagation is unclear. Here we systematically investigate the impact of oscillations on propagating synchrony. We find that for standard, additive couplings and a net excitatory effect of oscillations, robust propagation of synchrony is enabled in less prominent feed-forward structures than in systems without oscillations. In the presence of non-additive coupling (as mediated by fast dendritic spikes), even balanced oscillatory inputs may enable robust propagation. Here, emerging resonances create complex locking patterns between oscillations and spike synchrony. Interestingly, these resonances make the circuits capable of selecting specific pathways for signal transmission. Oscillations may thus promote reliable transmission and, in co-action with dendritic nonlinearities, provide a mechanism for information processing by selectively gating and routing of signals. Our results are of particular interest for the interpretation of sharp wave/ripple complexes in the hippocampus, where previously learned spike patterns are replayed in conjunction with global high-frequency oscillations. We suggest that the oscillations may serve to stabilize the replay. PMID:25503492

  18. A new quantitative approach to measure perceived work-related stress in Italian employees.

    PubMed

    Cevenini, Gabriele; Fratini, Ilaria; Gambassi, Roberto

    2012-09-01

    We propose a method for a reliable quantitative measure of subjectively perceived occupational stress applicable in any company to enhance occupational safety and psychosocial health, to enable precise prevention policies and intervention and to improve work quality and efficiency. A suitable questionnaire was telephonically administered to a stratified sample of the whole Italian population of employees. Combined multivariate statistical methods, including principal component, cluster and discriminant analyses, were used to identify risk factors and to design a causal model for understanding work-related stress. The model explained the causal links of stress through employee perception of imbalance between job demands and resources for responding appropriately, by supplying a reliable U-shaped nonlinear stress index, expressed in terms of values of human systolic arterial pressure. Low, intermediate and high values indicated demotivation (or inefficiency), well-being and distress, respectively. Costs for stress-dependent productivity shortcomings were estimated to about 3.7% of national income from employment. The method identified useful structured information able to supply a simple and precise interpretation of employees' well-being and stress risk. Results could be compared with estimated national benchmarks to enable targeted intervention strategies to protect the health and safety of workers, and to reduce unproductive costs for firms.

  19. MEMS for pico- to micro-satellites

    NASA Astrophysics Data System (ADS)

    Shea, H. R.

    2009-02-01

    MEMS sensors, actuators, and sub-systems can enable an important reduction in the size and mass of spacecrafts, first by replacing larger and heavier components, then by replacing entire subsystems, and finally by enabling the microfabrication of highly integrated picosats. Very small satellites (1 to 100 kg) stand to benefit the most from MEMS technologies. These small satellites are typically used for science or technology demonstration missions, with higher risk tolerance than multi-ton telecommunication satellites. While MEMS are playing a growing role on Earth in safety-critical applications, in the harsh and remote environment of space, reliability is still the crucial issue, and the absence of an accepted qualification methodology is holding back MEMS from wider use. An overview is given of the range of MEMS applications in space. An effective way to prove that MEMS can operate reliably in space is to use them in space: we illustrate how Cubesats (1 kg, 1 liter, cubic satellites in a standardized format to reduce launch costs) can serve as low-cost vectors for MEMS technology demonstration in space. The Cubesat SwissCube developed in Switzerland is used as one example of a rapid way to fly new microtechnologies, and also as an example of a spacecraft whose performance is only possible thanks to MEMS.

  20. NASA Glenn Research in Controls and Diagnostics for Intelligent Aerospace Propulsion Systems

    NASA Technical Reports Server (NTRS)

    2005-01-01

    With the increased emphasis on aircraft safety, enhanced performance and affordability, and the need to reduce the environmental impact of aircraft, there are many new challenges being faced by the designers of aircraft propulsion systems. Also the propulsion systems required to enable the NASA (National Aeronautics and Space Administration) Vision for Space Exploration in an affordable manner will need to have high reliability, safety and autonomous operation capability. The Controls and Dynamics Branch at NASA Glenn Research Center (GRC) in Cleveland, Ohio, is leading and participating in various projects in partnership with other organizations within GRC and across NASA, the U.S. aerospace industry, and academia to develop advanced controls and health management technologies that will help meet these challenges through the concept of Intelligent Propulsion Systems. The key enabling technologies for an Intelligent Propulsion System are the increased efficiencies of components through active control, advanced diagnostics and prognostics integrated with intelligent engine control to enhance operational reliability and component life, and distributed control with smart sensors and actuators in an adaptive fault tolerant architecture. This paper describes the current activities of the Controls and Dynamics Branch in the areas of active component control and propulsion system intelligent control, and presents some recent analytical and experimental results in these areas.

  1. Who's Got the Bridge? - Towards Safe, Robust Autonomous Operations at NASA Langley's Autonomy Incubator

    NASA Technical Reports Server (NTRS)

    Allen, B. Danette; Cross, Charles D.; Motter, Mark A.; Neilan, James H.; Qualls, Garry D.; Rothhaar, Paul M.; Tran, Loc; Trujillo, Anna C.; Crisp, Vicki K.

    2015-01-01

    NASA aeronautics research has made decades of contributions to aviation. Both aircraft and air traffic management (ATM) systems in use today contain NASA-developed and NASA sponsored technologies that improve safety and efficiency. Recent innovations in robotics and autonomy for automobiles and unmanned systems point to a future with increased personal mobility and access to transportation, including aviation. Automation and autonomous operations will transform the way we move people and goods. Achieving this mobility will require safe, robust, reliable operations for both the vehicle and the airspace and challenges to this inevitable future are being addressed now in government labs, universities, and industry. These challenges are the focus of NASA Langley Research Center's Autonomy Incubator whose R&D portfolio includes mission planning, trajectory and path planning, object detection and avoidance, object classification, sensor fusion, controls, machine learning, computer vision, human-machine teaming, geo-containment, open architecture design and development, as well as the test and evaluation environment that will be critical to prove system reliability and support certification. Safe autonomous operations will be enabled via onboard sensing and perception systems in both data-rich and data-deprived environments. Applied autonomy will enable safety, efficiency and unprecedented mobility as people and goods take to the skies tomorrow just as we do on the road today.

  2. Full-body gestures and movements recognition: user descriptive and unsupervised learning approaches in GDL classifier

    NASA Astrophysics Data System (ADS)

    Hachaj, Tomasz; Ogiela, Marek R.

    2014-09-01

    Gesture Description Language (GDL) is a classifier that enables syntactic description and real time recognition of full-body gestures and movements. Gestures are described in dedicated computer language named Gesture Description Language script (GDLs). In this paper we will introduce new GDLs formalisms that enable recognition of selected classes of movement trajectories. The second novelty is new unsupervised learning method with which it is possible to automatically generate GDLs descriptions. We have initially evaluated both proposed extensions of GDL and we have obtained very promising results. Both the novel methodology and evaluation results will be described in this paper.

  3. Full-color, large area, transmissive holograms enabled by multi-level diffractive optics.

    PubMed

    Mohammad, Nabil; Meem, Monjurul; Wan, Xiaowen; Menon, Rajesh

    2017-07-19

    We show that multi-level diffractive microstructures can enable broadband, on-axis transmissive holograms that can project complex full-color images, which are invariant to viewing angle. Compared to alternatives like metaholograms, diffractive holograms utilize much larger minimum features (>10 µm), much smaller aspect ratios (<0.2) and thereby, can be fabricated in a single lithography step over relatively large areas (>30 mm ×30 mm). We designed, fabricated and characterized holograms that encode various full-color images. Our devices demonstrate absolute transmission efficiencies of >86% across the visible spectrum from 405 nm to 633 nm (peak value of about 92%), and excellent color fidelity. Furthermore, these devices do not exhibit polarization dependence. Finally, we emphasize that our devices exhibit negligible absorption and are phase-only holograms with high diffraction efficiency.

  4. The Computing And Interdisciplinary Systems Office: Annual Review and Planning Meeting

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2003-01-01

    The goal of this research is to develop an advanced engineering analysis system that enables high-fidelity, multi-disciplinary, full propulsion system simulations to be performed early in the design process (a virtual test cell that integrates propulsion and information technologies). This will enable rapid, high-confidence, cost-effective design of revolutionary systems.

  5. Autonomy enables new science missions

    NASA Astrophysics Data System (ADS)

    Doyle, Richard J.; Gor, Victoria; Man, Guy K.; Stolorz, Paul E.; Chapman, Clark; Merline, William J.; Stern, Alan

    1997-01-01

    The challenge of space flight in NASA's future is to enable smaller, more frequent and intensive space exploration at much lower total cost without substantially decreasing mission reliability, capability, or the scientific return on investment. The most effective way to achieve this goal is to build intelligent capabilities into the spacecraft themselves. Our technological vision for meeting the challenge of returning quality science through limited communication bandwidth will actually put scientists in a more direct link with the spacecraft than they have enjoyed to date. Technologies such as pattern recognition and machine learning can place a part of the scientist's awareness onboard the spacecraft to prioritize downlink or to autonomously trigger time-critical follow-up observations-particularly important in flyby missions-without ground interaction. Onboard knowledge discovery methods can be used to include candidate discoveries in each downlink for scientists' scrutiny. Such capabilities will allow scientists to quickly reprioritize missions in a much more intimate and efficient manner than is possible today. Ultimately, new classes of exploration missions will be enabled.

  6. Combination of High-density Microelectrode Array and Patch Clamp Recordings to Enable Studies of Multisynaptic Integration.

    PubMed

    Jäckel, David; Bakkum, Douglas J; Russell, Thomas L; Müller, Jan; Radivojevic, Milos; Frey, Urs; Franke, Felix; Hierlemann, Andreas

    2017-04-20

We present a novel, all-electric approach to record and to precisely control the activity of tens of individual presynaptic neurons. The method allows for parallel mapping of the efficacy of multiple synapses and of the resulting dynamics of postsynaptic neurons in a cortical culture. For the measurements, we combine an extracellular high-density microelectrode array, featuring 11,000 electrodes for extracellular recording and stimulation, with intracellular patch-clamp recording. We are able to identify the contributions of individual presynaptic neurons - including inhibitory and excitatory synaptic inputs - to postsynaptic potentials, which enables us to study dendritic integration. Since the electrical stimuli can be controlled at microsecond resolution, our method enables us to evoke action potentials at tens of presynaptic cells in precisely orchestrated sequences with high reliability and minimum jitter. We demonstrate the potential of this method by evoking short- and long-term synaptic plasticity through manipulation of multiple synaptic inputs to a specific neuron.

  7. Next-generation fiber lasers enabled by high-performance components

    NASA Astrophysics Data System (ADS)

    Kliner, D. A. V.; Victor, B.; Rivera, C.; Fanning, G.; Balsley, D.; Farrow, R. L.; Kennedy, K.; Hampton, S.; Hawke, R.; Soukup, E.; Reynolds, M.; Hodges, A.; Emery, J.; Brown, A.; Almonte, K.; Nelson, M.; Foley, B.; Dawson, D.; Hemenway, D. M.; Urbanek, W.; DeVito, M.; Bao, L.; Koponen, J.; Gross, K.

    2018-02-01

    Next-generation industrial fiber lasers enable challenging applications that cannot be addressed with legacy fiber lasers. Key features of next-generation fiber lasers include robust back-reflection protection, high power stability, wide power tunability, high-speed modulation and waveform generation, and facile field serviceability. These capabilities are enabled by high-performance components, particularly pump diodes and optical fibers, and by advanced fiber laser designs. We summarize the performance and reliability of nLIGHT diodes, fibers, and next-generation industrial fiber lasers at power levels of 500 W - 8 kW. We show back-reflection studies with up to 1 kW of back-reflected power, power-stability measurements in cw and modulated operation exhibiting sub-1% stability over a 5 - 100% power range, and high-speed modulation (100 kHz) and waveform generation with a bandwidth 20x higher than standard fiber lasers. We show results from representative applications, including cutting and welding of highly reflective metals (Cu and Al) for production of Li-ion battery modules and processing of carbon fiber reinforced polymers.

  8. U.S. Department of Energy Office of Nuclear Technology Research and Development (NTRD) comprehensive summary of QA assessments for FY17

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trost, Alan L.

The U.S. Department of Energy (DOE) Office of Nuclear Energy (NE) has developed a research and development (R&D) roadmap for its research, development, and demonstration (RD&D) activities to ensure nuclear energy remains a compelling and viable energy option for the U.S. The roadmap defines NE RD&D activities and objectives that address the challenges of researching, developing and demonstrating options for the current U.S. commercial fuel cycle to enable the safe, secure, economic, and sustainable expansion of nuclear energy, while minimizing the proliferation and terrorism risks of expanding the use of nuclear power. The roadmap enables the development of technologies and other solutions that can improve the reliability, sustain the safety, and extend the life of current reactors. In addition, it will help to develop improvements in the affordability of new reactors to enable nuclear energy to help meet the Administration’s energy security and climate change goals.

  9. Validation of the Dementia Care Assessment Packet-Instrumental Activities of Daily Living

    PubMed Central

    Lee, Seok Bum; Park, Jeong Ran; Yoo, Jeong-Hwa; Park, Joon Hyuk; Lee, Jung Jae; Yoon, Jong Chul; Jhoo, Jin Hyeong; Lee, Dong Young; Woo, Jong Inn; Han, Ji Won; Huh, Yoonseok; Kim, Tae Hui

    2013-01-01

Objective We aimed to evaluate the psychometric properties of the IADL measure included in the Dementia Care Assessment Packet (DCAP-IADL) in dementia patients. Methods The study involved 112 dementia patients and 546 controls. The DCAP-IADL was scored in two ways: observed score (OS) and predicted score (PS). The reliability of the DCAP-IADL was evaluated by testing its internal consistency, inter-rater reliability and test-retest reliability. Discriminant validity was evaluated by comparing the mean OS and PS between dementia patients and controls by ANCOVA. Pearson or Spearman correlation analysis was performed with other instruments to assess concurrent validity. Receiver operating characteristic curve analysis was performed to examine diagnostic accuracy. Results Cronbach's α coefficients of the DCAP-IADL were above 0.7. The values in dementia patients were much higher (OS=0.917, PS=0.927), indicating excellent degrees of internal consistency. Inter-rater reliabilities and test-retest reliabilities were statistically significant (p<0.05). PS exhibited higher reliabilities than OS. The mean OS and PS of dementia patients were significantly higher than those of the non-demented group after controlling for age, sex and education level. The DCAP-IADL was significantly correlated with other IADL instruments and MMSE-KC (p<0.001). Areas under the curves of the DCAP-IADL were above 0.9. Conclusion The DCAP-IADL is a reliable and valid instrument for evaluating instrumental ability of daily living for the elderly, and may also be useful for screening dementia. Moreover, administering PS may enable the DCAP-IADL to overcome the differences in gender, culture and life style that hinder accurate evaluation of the elderly in previous IADL instruments. PMID:24302946
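The internal-consistency figures quoted above come from Cronbach's alpha, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch with invented rating data (not the DCAP-IADL items):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()      # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical ratings: rows are respondents, columns are scale items.
scores = [[3, 4, 3, 4],
          [2, 2, 3, 2],
          [4, 4, 4, 5],
          [1, 2, 1, 1],
          [3, 3, 4, 3]]
print(round(cronbach_alpha(scores), 2))  # → 0.95
```

Values above 0.9, like the OS=0.917 and PS=0.927 reported in the abstract, are conventionally read as excellent internal consistency.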

  10. Predicting pedestrian flow: a methodology and a proof of concept based on real-life data.

    PubMed

    Davidich, Maria; Köster, Gerta

    2013-01-01

    Building a reliable predictive model of pedestrian motion is very challenging: Ideally, such models should be based on observations made in both controlled experiments and in real-world environments. De facto, models are rarely based on real-world observations due to the lack of available data; instead, they are largely based on intuition and, at best, literature values and laboratory experiments. Such an approach is insufficient for reliable simulations of complex real-life scenarios: For instance, our analysis of pedestrian motion under natural conditions at a major German railway station reveals that the values for free-flow velocities and the flow-density relationship differ significantly from widely used literature values. It is thus necessary to calibrate and validate the model against relevant real-life data to make it capable of reproducing and predicting real-life scenarios. In this work we aim at constructing such realistic pedestrian stream simulation. Based on the analysis of real-life data, we present a methodology that identifies key parameters and interdependencies that enable us to properly calibrate the model. The success of the approach is demonstrated for a benchmark model, a cellular automaton. We show that the proposed approach significantly improves the reliability of the simulation and hence the potential prediction accuracy. The simulation is validated by comparing the local density evolution of the measured data to that of the simulated data. We find that for our model the most sensitive parameters are: the source-target distribution of the pedestrian trajectories, the schedule of pedestrian appearances in the scenario and the mean free-flow velocity. Our results emphasize the need for real-life data extraction and analysis to enable predictive simulations.
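The validation step described above, comparing the local density evolution of measured versus simulated data, reduces to an error metric over density time series. A generic sketch (the cell size and counts are assumptions, not the study's data):

```python
import numpy as np

def validation_rmse(measured_counts, simulated_counts, cell_area=4.0):
    """Root-mean-square error between measured and simulated local
    pedestrian densities (persons/m^2) in one observation cell."""
    m = np.asarray(measured_counts, dtype=float) / cell_area
    s = np.asarray(simulated_counts, dtype=float) / cell_area
    return float(np.sqrt(np.mean((m - s) ** 2)))

# Hypothetical pedestrian counts in a 4 m^2 cell over six time steps.
measured = [3, 5, 8, 9, 6, 4]
simulated = [2, 5, 7, 10, 6, 3]
print(round(validation_rmse(measured, simulated), 3))  # → 0.204
```

Calibration then amounts to tuning the sensitive parameters the study identifies (source-target distribution, appearance schedule, mean free-flow velocity) to drive this error down on held-out observation periods.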

  11. Robust Online Monitoring for Calibration Assessment of Transmitters and Instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramuhalli, Pradeep; Coble, Jamie B.; Shumaker, Brent

Robust online monitoring (OLM) technologies are expected to enable the extension or elimination of periodic sensor calibration intervals in operating and new reactors. These advances in OLM technologies will improve the safety and reliability of current and planned nuclear power systems through improved accuracy and increased reliability of sensors used to monitor key parameters. In this article, we discuss an overview of research being performed within the Nuclear Energy Enabling Technologies (NEET)/Advanced Sensors and Instrumentation (ASI) program, for the development of OLM algorithms to use sensor outputs and, in combination with other available information, 1) determine whether one or more sensors are out of calibration or failing and 2) replace a failing sensor with reliable, accurate sensor outputs. Algorithm development is focused on the following OLM functions: • Signal validation • Virtual sensing • Sensor response-time assessment These algorithms incorporate, at their base, a Gaussian Process-based uncertainty quantification (UQ) method. Various plant models (using kernel regression, GP, or hierarchical models) may be used to predict sensor responses under various plant conditions. These predicted responses can then be applied in fault detection (sensor output and response time) and in computing the correct value (virtual sensing) of a failing physical sensor. The methods being evaluated in this work can compute confidence levels along with the predicted sensor responses, and as a result, may have the potential for compensating for sensor drift in real-time (online recalibration). Evaluation was conducted using data from multiple sources (laboratory flow loops and plant data).
Ongoing research in this project is focused on further evaluation of the algorithms, optimization for accuracy and computational efficiency, and integration into a suite of tools for robust OLM that are applicable to monitoring sensor calibration state in nuclear power plants.
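The GP-based prediction-and-validation idea above can be sketched in a few lines of numpy. This is a minimal illustration only: the RBF kernel, its hyperparameters, the 3-sigma fault threshold, and the sine-wave "plant signal" are all assumptions for the sketch, not the project's actual models or data.

```python
import numpy as np

def gp_predict(X_train, y_train, X_test, length_scale=1.0, signal_var=1.0,
               noise_var=1e-2):
    """Gaussian-process regression with an RBF kernel: returns the posterior
    mean and standard deviation of the predicted sensor response at X_test."""
    def rbf(A, B):
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return signal_var * np.exp(-0.5 * d2 / length_scale**2)

    K = rbf(X_train, X_train) + noise_var * np.eye(len(X_train))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    Ks = rbf(X_train, X_test)
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = signal_var - np.sum(v**2, axis=0) + noise_var  # predictive variance
    return mean, np.sqrt(var)

def validate_signal(measured, predicted_mean, predicted_std, k=3.0):
    """Flag a reading as suspect when it falls outside k-sigma bounds."""
    return np.abs(measured - predicted_mean) > k * predicted_std

# Train on a healthy reference signal, then check a drifted reading
X = np.linspace(0, 10, 50)[:, None]
y = np.sin(X).ravel()
mean, std = gp_predict(X, y, np.array([[5.0]]))
drifted_reading = np.sin(5.0) + 0.5      # simulated calibration drift
suspect = validate_signal(drifted_reading, mean[0], std[0])
```

The same posterior mean doubles as the virtual-sensor value: when a physical sensor is flagged, the GP prediction (with its confidence band) can stand in for it.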

  12. Producing Cochrane systematic reviews-a qualitative study of current approaches and opportunities for innovation and improvement.

    PubMed

    Turner, Tari; Green, Sally; Tovey, David; McDonald, Steve; Soares-Weiser, Karla; Pestridge, Charlotte; Elliott, Julian

    2017-08-01

    Producing high-quality, relevant systematic reviews and keeping them up to date is challenging. Cochrane is a leading provider of systematic reviews in health. For Cochrane to continue to contribute to improvements in health, Cochrane Reviews must be rigorous, reliable and up to date. We aimed to explore existing models of Cochrane Review production and emerging opportunities to improve the efficiency and sustainability of these processes. To inform discussions about how best to achieve this, we conducted 26 interviews and an online survey with 106 respondents. Respondents highlighted the importance and challenge of creating reliable, timely systematic reviews. They described the challenges and opportunities presented by current production models, and they shared what they are doing to improve review production. They particularly highlighted significant challenges with the increasing complexity of review methods, the difficulty of keeping authors on board and on track, and the length of time required to complete the process. Strong themes emerged about the roles of authors and Review Groups, the central actors in the review production process. The results suggest that improvements to Cochrane's systematic review production models could come from improving clarity of roles and expectations, ensuring continuity and consistency of input, enabling active management of the review process, centralising some review production steps, breaking reviews into smaller "chunks", and improving approaches to building capacity of and sharing information between authors and Review Groups. Respondents noted the important role new technologies have to play in enabling these improvements. The findings of this study will inform the development of new Cochrane Review production models and may provide valuable data for other systematic review producers as they consider how best to produce rigorous, reliable, up-to-date reviews.

  13. Postnatal care: development of a psychometric multidimensional satisfaction questionnaire (the WOMBPNSQ) to assess women's views.

    PubMed

    Smith, Lindsay F P

    2011-10-01

    Postnatal care is a neglected area of pregnancy care, despite repeated calls to improve it. Changes would require assessment, which should include women's views; no suitable satisfaction questionnaire exists to enable this. The aim was to develop a multidimensional psychometric postnatal satisfaction self-completion instrument, in ten maternity services in south west England from 2006 to 2009. Sources for questions were literature review, fieldwork, and related published instruments. Principal components analysis with varimax rotation was used to develop the final WOMen's views of Birth Postnatal Satisfaction Questionnaire (WOMBPNSQ) version. Validity and internal reliability were assessed. Questionnaires were mailed 6-8 weeks postnatally (with one reminder). The WOMBPNSQ comprises 36 seven-point Likert questions (13 dimensions including general satisfaction). Of 300 women, 166 (55.3%) replied; of these 155 (95.1%) were white, 152 (93.8%) were married or cohabiting, 135 (81.3%) gave birth in a consultant unit, 129 (78.6%) had a vaginal delivery; and 100 (60.6%) were multiparous. The 12 specific dimensions were: support from professionals or partner, or social support; care from GP and health visitor; advice on contraception, feeding baby, the mother's health; continuity of care; duration of inpatient stay; home visiting; pain after birth. These dimensions showed internal reliability (Cronbach's alpha varying from 0.624 to 0.902). Various demographic and clinical characteristics were significantly associated with specific dimensions. WOMBPNSQ could be used to assess existing or planned changes to maternity services or as a screening instrument, which would then enable in-depth qualitative assessment of areas of dissatisfaction. Its convergent validity and test-retest reliability are still to be assessed, but the instrument is an improvement on existing postnatal satisfaction questionnaires.
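The internal-reliability statistic reported above, Cronbach's alpha, is straightforward to compute from a respondents-by-items score matrix. The sketch below uses the standard formula; the 7-point Likert responses are invented for illustration and do not come from the study.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a (respondents x items) matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]
    item_var = item_scores.var(axis=0, ddof=1).sum()
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical 7-point Likert responses for one 3-item dimension
scores = np.array([
    [6, 7, 6],
    [4, 4, 5],
    [7, 6, 7],
    [3, 4, 3],
    [5, 5, 6],
])
alpha = cronbach_alpha(scores)
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is how the 0.624-0.902 range above is interpreted.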

  14. Better Minds, Better Morals: A Procedural Guide to Better Judgment

    PubMed Central

    Schaefer, G. Owen; Savulescu, Julian

    2017-01-01

    Making more moral decisions is an uncontroversial goal, if ever there was one. But how to go about it? In this article, we offer a practical guide on ways to promote good judgment in our personal and professional lives. We will do this not by outlining what the good life consists in or which values we should accept. Rather, we offer a theory of procedural reliability: a set of dimensions of thought that are generally conducive to good moral reasoning. At the end of the day, we all have to decide for ourselves what is good and bad, right and wrong. The best way to ensure we make the right choices is to ensure the procedures we’re employing are sound and reliable. We identify four broad categories of judgment to be targeted: cognitive, self-management, motivational and interpersonal. Specific factors within each category are further delineated, with a total of 14 factors to be discussed. For each, we will go through the reasons it generally leads to more morally reliable decision-making, how various thinkers have historically addressed the topic, and the insights of recent research that can offer new ways to promote good reasoning. The result is a wide-ranging survey that contains practical advice on how to make better choices. Finally, we relate this to the project of transhumanism and prudential decision-making. We argue that transhumans will employ better moral procedures like these. We also argue that the same virtues will enable us to take better control of our own lives, enhancing our responsibility and enabling us to lead better lives from the prudential perspective. PMID:29098205

  15. Determining Criteria and Weights for Prioritizing Health Technologies Based on the Preferences of the General Population: A New Zealand Pilot Study.

    PubMed

    Sullivan, Trudy; Hansen, Paul

    2017-04-01

    The use of multicriteria decision analysis for health technology prioritization depends on specifying decision-making criteria and weighting them according to their relative importance. We report on a methodology for determining criteria and weights that was developed and piloted in New Zealand and enables extensive participation by members of the general population. Stimulated by a preliminary ranking exercise that involved prioritizing 14 diverse technologies, six focus groups discussed what matters to people when thinking about technologies that should be funded. These discussions informed the specification of criteria related to technologies' benefits for use in a discrete choice survey designed to generate weights for each individual participant as well as mean weights. A random sample of 3218 adults was invited to participate. To check test-retest reliability, a subsample completed the survey twice. Cluster analysis was performed to identify participants with similar patterns of weights. Six benefits-related criteria were distilled from the focus group discussions and included in the discrete choice survey, which was completed by 322 adults (10% response rate). Most participants (85%) found the survey easy to understand, and the survey exhibited test-retest reliability. The cluster analysis revealed that participant weights are related more to idiosyncratic personal preferences than to demographic and background characteristics. The methodology enables extensive participation by members of the general population, for whom it is both acceptable and reliable. Generating weights for each participant allows the heterogeneity of individual preferences, and the extent to which they are related to demographic and background characteristics, to be tested. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
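The clustering step described above, grouping participants by similar weight patterns, can be sketched with a plain k-means over per-participant weight vectors. Everything here is illustrative: the study does not specify its clustering algorithm, and the criterion weights below are invented.

```python
import numpy as np

def kmeans(weights, k=2, iters=50, seed=0):
    """Plain k-means over per-participant criterion-weight vectors,
    a stand-in for the cluster analysis described in the study."""
    rng = np.random.default_rng(seed)
    centers = weights[rng.choice(len(weights), k, replace=False)]
    for _ in range(iters):
        # Assign each participant to the nearest center, then re-center.
        labels = np.argmin(((weights[:, None] - centers[None]) ** 2).sum(-1),
                           axis=1)
        centers = np.array([weights[labels == j].mean(axis=0)
                            for j in range(k)])
    return labels, centers

# Hypothetical normalized weights over three benefit criteria
# (columns might be e.g. health gain, severity, equity of access)
w = np.array([
    [0.60, 0.30, 0.10],
    [0.55, 0.35, 0.10],
    [0.10, 0.30, 0.60],
    [0.15, 0.25, 0.60],
])
labels, centers = kmeans(w, k=2)
```

With individual-level weights available, one can then test whether cluster membership tracks demographics, which is how the study reaches its "idiosyncratic preferences" conclusion.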

  16. Computational toxicology using the OpenTox application programming interface and Bioclipse

    PubMed Central

    2011-01-01

    Background Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. Findings This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open standard that simplifies communication. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. Conclusions A novel computational toxicity assessment platform was generated from integration of two open science platforms related to toxicology: Bioclipse, that combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets by the use of the Open Standards from the OpenTox Application Programming Interface.
This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, taking advantage of the Bioclipse workbench handling the technical layers. PMID:22075173

  17. Probabilistic design of fibre concrete structures

    NASA Astrophysics Data System (ADS)

    Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.

    2017-09-01

    Advanced computer simulation has recently become a well-established methodology for evaluating the resistance of concrete engineering structures. Nonlinear finite element analysis makes it possible to realistically predict structural damage, peak load, failure, post-peak response, development of cracks in concrete, yielding of reinforcement, concrete crushing or shear failure. The nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, without or with prestressing, fibre concrete, (ultra) high performance concrete, lightweight concrete, etc. Advanced material models taking into account fibre concrete properties such as the shape of the tensile softening branch, high toughness and ductility are described in the paper. Since the variability of the fibre concrete material properties is rather high, probabilistic analysis seems to be the most appropriate format for structural design and for the evaluation of structural performance, reliability and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows evaluation of structural safety, characterized by failure probability or by reliability index, respectively. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the approach is based on randomization of the nonlinear finite element analysis of the structural model. Uncertainty of the material properties, or their randomness obtained from material tests, is accounted for in the random distributions. Furthermore, degradation of the reinforced concrete materials, such as carbonation of concrete and corrosion of reinforcement, can be accounted for in order to analyze life-cycle structural performance and to enable prediction of structural reliability and safety over time. The results can serve as a rational basis for the design of fibre concrete engineering structures based on advanced nonlinear computer analysis.
The presented methodology is illustrated by results from two probabilistic studies with different types of concrete structures related to practical applications and made from various materials (with the parameters obtained from real material tests).
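The relation between failure probability and reliability index mentioned above can be illustrated with a crude Monte Carlo sketch of a randomized limit state. This is a generic illustration, not the authors' tooling: the normal resistance/load distributions and their parameters are invented, and real analyses would randomize a nonlinear FE model instead of a closed-form limit state.

```python
import numpy as np
from statistics import NormalDist

def reliability_index(limit_state, sample, n=100_000, seed=1):
    """Monte Carlo estimate of the failure probability p_f = P(g(X) < 0)
    and the corresponding reliability index beta = -Phi^{-1}(p_f)."""
    rng = np.random.default_rng(seed)
    x = sample(rng, n)
    p_f = np.mean(limit_state(x) < 0.0)
    beta = -NormalDist().inv_cdf(p_f)
    return p_f, beta

# Illustrative limit state: resistance R minus load effect S, with
# hypothetical normal distributions R ~ N(30, 3) and S ~ N(20, 2).
def sample(rng, n):
    return np.stack([rng.normal(30.0, 3.0, n), rng.normal(20.0, 2.0, n)],
                    axis=1)

def g(x):
    return x[:, 0] - x[:, 1]   # failure when the load exceeds the resistance

p_f, beta = reliability_index(g, sample)
```

For this toy case the exact answer is available (R - S is normal with mean 10 and standard deviation sqrt(13), giving beta of about 2.77), which is a useful sanity check on the sampling.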

  18. Measuring Food Brand Awareness in Australian Children: Development and Validation of a New Instrument.

    PubMed

    Turner, Laura; Kelly, Bridget; Boyland, Emma; Bauman, Adrian E

    2015-01-01

    Children's exposure to food marketing is one environmental determinant of childhood obesity. Measuring the extent to which children are aware of food brands may be one way to estimate relative prior exposures to food marketing. This study aimed to develop and validate an Australian Brand Awareness Instrument (ABAI) to estimate children's food brand awareness. The ABAI incorporated 30 flashcards depicting food/drink logos and their corresponding products. An abbreviated version was also created using 12 flashcards (ABAI-a). The ABAI was presented to 60 primary-school-aged children (7-11 years) attending two Australian after-school centres. A week later, the full version was repeated on approximately half the sample (n=27) and the abbreviated version was presented to the remaining half (n=30). The test-retest reliability of the ABAI was analysed using intra-class correlation coefficients. The concordance of the ABAI-a and the full version was assessed using Bland-Altman plots. The 'nomological' validity of the full tool was investigated by comparing children's brand awareness with food marketing-related variables (e.g. television habits, intake of heavily promoted foods). Brand awareness increased with age (p<0.01) but was not significantly correlated with other variables. Bland-Altman analyses showed good agreement between the ABAI and ABAI-a. Reliability analyses revealed excellent agreement between the two administrations of the full ABAI. The ABAI was able to differentiate children's varying levels of brand awareness. It was shown to be a valid and reliable tool and may allow quantification of brand awareness as a proxy measure for children's prior food marketing exposure.
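The Bland-Altman agreement analysis used above reduces to a bias and a pair of limits of agreement over paired differences. The sketch below shows that computation; the paired scores are invented for illustration and are not the study's data.

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman agreement statistics for two paired measurements:
    mean bias and 95% limits of agreement (bias +/- 1.96 * SD of the
    paired differences)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired brand-recognition scores (e.g. scaled to a common
# range) from two administrations of an instrument
full_scores = np.array([24, 18, 27, 15, 21, 26, 19, 23])
short_scores = np.array([23, 19, 26, 14, 22, 25, 18, 24])
bias, (lo, hi) = bland_altman(full_scores, short_scores)
```

Good agreement is read off the plot when the bias is near zero and nearly all differences fall within the limits; conventionally the differences are also plotted against the pairwise means to check for trend.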

  19. Development of novel technologies to enhance performance and reliability of III-Nitride avalanche photodiodes

    NASA Astrophysics Data System (ADS)

    Suvarna, Puneet Harischandra

    Solar-blind ultraviolet avalanche photodiodes are an enabling technology for applications in the fields of astronomy, communication, missile warning systems, biological agent detection and particle physics research. Avalanche photodiodes (APDs) are capable of detecting low-intensity light with high quantum efficiency and signal-to-noise ratio without the need for external amplification. The properties of III-N materials (GaN and AlGaN) are promising for UV photodetectors that are highly efficient, radiation-hard and capable of visible-blind or solar-blind operation without the need for external filters. However, the realization of reliable and high performance III-N APDs and imaging arrays faces several technological challenges. The high price and lack of availability of bulk III-N substrates necessitate the growth of III-Ns on lattice-mismatched substrates, leading to a high density of dislocations in the material that can cause high leakage currents, noise and premature breakdown in APDs. The etched sidewalls of III-N APDs and high electric fields at contact edges are also detrimental to APD performance and reliability. In this work, novel technologies have been developed and implemented that address the issues of performance and reliability in III-Nitride based APDs. To address the issue of extended defects in the bulk of the material, a novel pulsed MOCVD process was developed for the growth of AlGaN. This process enables growth of high crystal quality AlxGa1-xN with excellent control over composition, doping and thickness. The process has also been adapted for the growth of high quality III-N materials on silicon substrate for devices such as high electron mobility transistors (HEMTs). A novel post-growth defect isolation technique is also discussed that can isolate the impact of conductive defects from devices.
A new sidewall passivation technique using atomic layer deposition (ALD) of dielectric materials was developed for III-N APDs that is effective in reducing the dark current and trap states at sidewalls by close to an order of magnitude, leading to improved APD performance. The development and implementation of an ion-implantation-based contact edge termination technique for III-N APDs, which helps prevent premature breakdown at the contact edges of the devices, has further led to improved reliability. Finally, novel improved III-N APD device designs are proposed using preliminary experiments and numerical simulations for future implementations.

  20. Full-field local displacement analysis of two-sided paperboard

    Treesearch

    J.M. Considine; D.W. Vahey

    2007-01-01

    This report describes a method to examine full-field displacements of both sides of paperboard during tensile testing. Analysis showed out-of-plane shear behavior near the failure zones. The method was reliably used to examine out-of-plane shear in double notch shear specimens...
