Sample records for reliability processing presentation

  1. MEMS Reliability Assurance Activities at JPL

    NASA Technical Reports Server (NTRS)

    Kayali, S.; Lawton, R.; Stark, B.

    2000-01-01

    An overview of Microelectromechanical Systems (MEMS) reliability assurance and qualification activities at JPL is presented, along with a discussion of the characterization of MEMS structures implemented in single-crystal silicon, polycrystalline silicon, CMOS, and LIGA processes. Additionally, common failure modes and mechanisms affecting MEMS structures, including radiation effects, are discussed. Common reliability and qualification practices contained in the MEMS Reliability Assurance Guideline are also presented.

  2. Markov and semi-Markov processes as a failure rate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabski, Franciszek

    2016-06-08

    In this paper the reliability function is defined by a stochastic failure rate process with non-negative, right-continuous trajectories. Equations for the conditional reliability functions of an object are derived under the assumption that the failure rate is a semi-Markov process with an at most countable state space, and an appropriate theorem is presented. The linear systems of equations for the corresponding Laplace transforms make it possible to find the reliability functions for the alternating, Poisson, and Furry-Yule failure rate processes.
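
    A minimal Monte Carlo sketch (not from the paper, which works analytically via Laplace transforms) of the quantity being studied: the conditional reliability R(t) = E[exp(-integral of lambda(s) ds over [0, t])] when the failure rate lambda(t) alternates between two levels. Exponential sojourn times reduce the semi-Markov process to its Markov special case; all rate and sojourn values are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def integrated_rate(t_end, rates=(0.01, 0.05), mean_sojourn=(40.0, 10.0)):
        """Integrate one trajectory of an alternating failure rate up to t_end."""
        t, state, integral = 0.0, 0, 0.0
        while t < t_end:
            stay = rng.exponential(mean_sojourn[state])  # exponential sojourns: Markov case
            dt = min(stay, t_end - t)
            integral += rates[state] * dt
            t += dt
            state = 1 - state  # alternate between the two rate levels
        return integral

    def reliability(t, n_paths=20000):
        # R(t) = E[exp(-Integral_0^t lambda(s) ds)], estimated by Monte Carlo
        return np.mean([np.exp(-integrated_rate(t)) for _ in range(n_paths)])

    for t in (10.0, 50.0, 100.0):
        print(f"R({t:g}) ~= {reliability(t):.4f}")
    ```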

  3. Enhancing healthcare process design with human factors engineering and reliability science, part 1: setting the context.

    PubMed

    Boston-Fleischhauer, Carol

    2008-01-01

    The design and implementation of efficient, effective, and safe processes are never-ending challenges in healthcare. Less than optimal performance levels and rising concerns about patient safety suggest that traditional process design methods are insufficient to meet design requirements. In this 2-part series, the author presents human factors engineering and reliability science as important knowledge for enhancing existing operational and clinical process design methods in healthcare. An examination of these theories, application approaches, and examples is presented.

  4. Use of a structured functional evaluation process for independent medical evaluations of claimants presenting with disabling mental illness: rationale and design for a multi-center reliability study.

    PubMed

    Bachmann, Monica; de Boer, Wout; Schandelmaier, Stefan; Leibold, Andrea; Marelli, Renato; Jeger, Joerg; Hoffmann-Richter, Ulrike; Mager, Ralph; Schaad, Heinz; Zumbrunn, Thomas; Vogel, Nicole; Bänziger, Oskar; Busse, Jason W; Fischer, Katrin; Kunz, Regina

    2016-07-29

    Work capacity evaluations by independent medical experts are widely used to inform insurers whether injured or ill workers are capable of engaging in competitive employment. In many countries, evaluation processes lack a clearly structured approach, standardized instruments, and an explicit focus on claimants' functional abilities. Evaluation of subjective complaints, such as mental illness, presents additional challenges in the determination of work capacity. We have therefore developed a process for the functional evaluation of claimants with mental disorders which complements the usual psychiatric evaluation. Here we report the design of a study to measure the reliability of our approach in determining work capacity among patients with mental illness applying for disability benefits. We will conduct a multi-center reliability study, in which 20 psychiatrists trained in our functional evaluation process will assess 30 claimants presenting with mental illness for eligibility to receive disability benefits [Reliability of Functional Evaluation in Psychiatry, RELY-study]. The functional evaluation process entails a five-step structured interview and a reporting instrument (Instrument of Functional Assessment in Psychiatry [IFAP]) to document the severity of work-related functional limitations. We will videotape all evaluations, which will be viewed by three psychiatrists who will independently rate claimants' functional limitations. Our primary outcome measure is the evaluation of the claimant's work capacity as a percentage (0 to 100%), and our secondary outcomes are the 12 mental functions and 13 functional capacities assessed by the IFAP instrument. Inter-rater reliability of the four psychiatric experts will be explored using multilevel models to estimate the intraclass correlation coefficient (ICC). Additional analyses include subgroups according to mental disorder, the typicality of claimants, and claimant-perceived fairness of the assessment process. We hypothesize that a structured functional approach will show moderate reliability (ICC ≥ 0.6) of psychiatric evaluation of work capacity. Enrollment of actual claimants with mental disorders referred for evaluation by disability/accident insurers will increase the external validity of our findings. If we find moderate levels of reliability, we will continue with a randomized trial to test the reliability of the structured functional approach versus evaluation-as-usual.
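
    For readers unfamiliar with the primary statistic, the sketch below computes a Shrout-Fleiss ICC(2,1) from a claimants-by-raters matrix with numpy; the study itself uses multilevel models, and the ratings here are hypothetical.

    ```python
    import numpy as np

    def icc_2_1(ratings):
        """Two-way random-effects, absolute-agreement ICC(2,1), Shrout & Fleiss."""
        ratings = np.asarray(ratings, dtype=float)
        n, k = ratings.shape
        row_mean = ratings.mean(axis=1)    # per-claimant means
        col_mean = ratings.mean(axis=0)    # per-rater means
        grand = ratings.mean()
        msr = k * np.sum((row_mean - grand) ** 2) / (n - 1)       # between claimants
        msc = n * np.sum((col_mean - grand) ** 2) / (k - 1)       # between raters
        resid = ratings - row_mean[:, None] - col_mean[None, :] + grand
        mse = np.sum(resid ** 2) / ((n - 1) * (k - 1))            # residual
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    # Hypothetical work-capacity ratings (%) by 4 raters of 5 claimants
    ratings = [[60, 55, 65, 60],
               [30, 40, 35, 30],
               [80, 75, 85, 90],
               [50, 45, 55, 50],
               [20, 25, 20, 30]]
    print(f"ICC(2,1) = {icc_2_1(ratings):.2f}")   # >= 0.6 would meet the study's threshold
    ```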

  5. [The assessment of family resources and need for help: Construct validity and reliability of the Systematic Exploration and Process Inventory for health professionals in early childhood intervention services (SEVG)].

    PubMed

    Scharmanski, Sara; Renner, Ilona

    2016-12-01

    Health professionals in early childhood intervention and prevention make an important contribution by helping burdened families with young children cope with everyday life and child-raising issues. A prerequisite for success is the health professionals' ability to tailor their services to the specific needs of families. The "Systematic Exploration and Process Inventory for health professionals in early childhood intervention services (SEVG)" can be used to identify each family's individual resources and needs, enabling a valid, reliable and objective assessment of the conditions and the process of the counseling service. The present paper presents the statistical analyses that were used to confirm the reliability of the inventory. Based on the results of the reliability analysis and principal component analysis (PCA), the SEVG seems to be a reliable and objective inventory for assessing families' need for support. It also allows for the calculation of average values for each scale. The development of valid and reliable assessments is essential to quality assurance and the professionalization of interventions in early childhood services.

  6. Reliability and the design process at Honeywell Avionics Division

    NASA Technical Reports Server (NTRS)

    Bezat, A.

    1981-01-01

    The division's philosophy of designed-in reliability is presented, along with a comparison of reliability programs for space, manned military aircraft, and commercial aircraft. Topics include: the reliability interface with design and production; the concept phase through final proposal; the design, development, test and evaluation phase; the production phase; and the commonality among space, military, and commercial avionics.

  7. Effect of Entropy Generation on Wear Mechanics and System Reliability

    NASA Astrophysics Data System (ADS)

    Gidwani, Akshay; James, Siddanth; Jagtap, Sagar; Karthikeyan, Ram; Vincent, S.

    2018-04-01

    Wear is an irreversible phenomenon. Processes such as mutual sliding and rolling between materials generate entropy, and this generation is monotonic with respect to time. The concept of entropy generation is quantified using the Degradation Entropy Generation theorem formulated by Michael D. Bryant. The sliding-wear model can be extrapolated to other settings to support machine prognostics as well as system and process reliability analysis, beyond purely mechanical processes. In other words, using the concepts of 'entropy generation' and wear, one can quantify the reliability of a system over time using a thermodynamic variable, which is the basis of this paper. Thus, in the present investigation, a unique attempt has been made to establish a correlation between entropy, wear, and reliability, which can be a useful technique in preventive maintenance.
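
    A back-of-the-envelope illustration of the Degradation Entropy Generation idea: frictional sliding generates entropy at rate mu*F*v/T, and wear is taken as proportional to accumulated entropy through a degradation coefficient B. All parameter values below are assumptions for illustration, not from the paper.

    ```python
    mu, F, v, T = 0.3, 100.0, 0.5, 300.0  # friction coeff., load (N), sliding speed (m/s), temp (K)
    B = 1.0e-12                           # assumed degradation coefficient, m^3 K/J
    w_limit = 1.0e-8                      # assumed wear volume at failure, m^3 (10 mm^3)

    entropy_rate = mu * F * v / T         # frictional entropy generation, J/(K s)
    wear_rate = B * entropy_rate          # DEG theorem: wear rate proportional to entropy rate
    time_to_limit = w_limit / wear_rate   # deterministic life estimate, s
    print(f"entropy generation rate: {entropy_rate:.3e} J/(K s)")
    print(f"time to reach wear limit: {time_to_limit:.3e} s")
    ```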

  8. System reliability and recovery.

    DOT National Transportation Integrated Search

    1971-06-01

    The paper exhibits a variety of reliability techniques applicable to future ATC data processing systems. Presently envisioned schemes for error detection, error interrupt and error analysis are considered, along with methods of retry, reconfiguration...

  9. Highly reliable oxide VCSELs for datacom applications

    NASA Astrophysics Data System (ADS)

    Aeby, Ian; Collins, Doug; Gibson, Brian; Helms, Christopher J.; Hou, Hong Q.; Lou, Wenlin; Bossert, David J.; Wang, Charlie X.

    2003-06-01

    In this paper we describe the processes and procedures that have been developed to ensure high reliability for Emcore's 850 nm oxide-confined GaAs VCSELs. Evidence from ongoing accelerated life testing and other reliability studies confirming that this process yields reliable products is discussed. We present the data and analysis techniques used to determine the activation energy and acceleration factors for the dominant wear-out failure mechanisms of our devices, as well as our estimated MTTF of greater than 2 million use hours. We conclude with a summary of internal verification and field return rate validation data.
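
    The activation-energy/acceleration-factor reasoning mentioned above typically follows the standard Arrhenius model; a small sketch with hypothetical numbers (the Ea, temperatures, and stress MTTF are not Emcore's published values):

    ```python
    import math

    K_B = 8.617e-5   # Boltzmann constant, eV/K

    def arrhenius_af(ea_ev, t_use_c, t_stress_c):
        """Acceleration factor between accelerated-stress and use temperatures."""
        t_use, t_stress = t_use_c + 273.15, t_stress_c + 273.15
        return math.exp((ea_ev / K_B) * (1.0 / t_use - 1.0 / t_stress))

    # Hypothetical values: Ea = 0.7 eV, 40 C use vs. 100 C accelerated life test
    af = arrhenius_af(0.7, 40.0, 100.0)
    mttf_stress_h = 5.0e4                 # assumed MTTF observed under stress, h
    print(f"AF = {af:.0f}, projected use-condition MTTF ~= {af * mttf_stress_h:.2e} h")
    ```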

  10. NPTool: Towards Scalability and Reliability of Business Process Management

    NASA Astrophysics Data System (ADS)

    Braghetto, Kelly Rosa; Ferreira, João Eduardo; Pu, Calton

    Currently, one important challenge in business process management is to provide scalability and reliability of business process executions at the same time. This difficulty becomes more accentuated when the execution control must handle countless complex business processes. This work presents NavigationPlanTool (NPTool), a tool to control the execution of business processes. NPTool is supported by the Navigation Plan Definition Language (NPDL), a language for business process specification that uses process algebra as its formal foundation. NPTool implements the NPDL language as an SQL extension. The main contribution of this paper is a description of NPTool showing how process algebra features combined with a relational database model can be used to provide scalable and reliable control of the execution of business processes. The next steps for NPTool include reuse of control-flow patterns and support for data flow management.

  11. Problematics of Reliability of Road Rollers

    NASA Astrophysics Data System (ADS)

    Stawowiak, Michał; Kuczaj, Mariusz

    2018-06-01

    This article addresses the reliability of road rollers used in a selected roadworks company. Information is presented on how the road rollers are serviced and how servicing affects their reliability. Attention is paid to the implementation of the maintenance plan with regard to each machine's operational time. The reliability of the road rollers was analyzed by determining and interpreting readiness coefficients.
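
    The readiness coefficient referred to above is conventionally the steady-state availability MTBF/(MTBF + MTTR); a one-liner with made-up service figures:

    ```python
    mtbf, mttr = 210.0, 14.0                  # assumed mean time between failures / to repair, h
    readiness = mtbf / (mtbf + mttr)          # steady-state availability
    print(f"readiness coefficient = {readiness:.3f}")   # ~0.94
    ```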

  12. Reliability of CGA/LGA/HDI Package Board/Assembly (Revision A)

    NASA Technical Reports Server (NTRS)

    Ghaffarian, Reza

    2013-01-01

    This follow-up report presents reliability test results from thermal cycling of five CGA assemblies evaluated under two extreme cycle profiles representative of high-reliability applications. The thermal cycles ranged from a low temperature of -55 C to maximum temperatures of either 100 C or 125 C, with a slow ramp-up rate (3 C/min) and dwell times of about 15 minutes at the two extremes. Optical photomicrographs that illustrate key inspection findings of up to 200 thermal cycles are presented. Other information presented includes an evaluation of the integrity of capacitors on the CGA substrate after thermal cycling, as well as a process evaluation for direct assembly of an LGA onto a PCB. The qualification guidelines, which are based on the test results for CGA/LGA/HDI packages and board assemblies, will facilitate NASA projects' use of very dense and newly available FPGA area array packages with known reliability and mitigated risks, allowing greater processing power in a smaller board footprint and lower system weight.

  13. How to Characterize the Reliability of Ceramic Capacitors with Base-Metal Electrodes (BMEs)

    NASA Technical Reports Server (NTRS)

    Liu, David (Donhang)

    2015-01-01

    The reliability of an MLCC device is the product of a time-dependent part and a time-independent part: 1) the time-dependent part is a statistical distribution; 2) the time-independent part is the reliability at t=0, the initial reliability. The initial reliability depends only on how a BME MLCC is designed and processed. Just as the minimum dielectric thickness ensured the long-term reliability of a PME MLCC, the initial reliability ensures the long-term reliability of a BME MLCC. This presentation shows new discoveries regarding commonalities and differences between PME and BME capacitor technologies.
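
    A sketch of the stated decomposition, assuming (as is common for MLCC wear-out modeling, though not stated in this abstract) a Weibull form for the time-dependent part; all parameter values are illustrative:

    ```python
    import math

    def mlcc_reliability(t, r0, eta, beta):
        """R(t) = R0 * exp(-(t/eta)^beta): time-independent initial reliability R0
        times a Weibull time-dependent survival term."""
        return r0 * math.exp(-((t / eta) ** beta))

    r0, eta, beta = 0.999, 2.0e5, 1.3   # assumed initial reliability, scale (h), shape
    for t in (1e3, 1e4, 1e5):
        print(f"R({t:.0e} h) = {mlcc_reliability(t, r0, eta, beta):.4f}")
    ```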

  14. An Integrated Approach to Establish Validity and Reliability of Reading Tests

    ERIC Educational Resources Information Center

    Razi, Salim

    2012-01-01

    This study presents the process of developing a reading test and establishing its reliability and validity through an integrative approach, since conventional reliability and validity measures only superficially reveal the difficulty of a reading test. In this respect, analysing the vocabulary frequency of the test is regarded as a more eligible way…

  15. Reliability and validity: Part II.

    PubMed

    Davis, Debora Winders

    2004-01-01

    Determining measurement reliability and validity involves complex processes. There is usually room for argument about most instruments. It is important that the researcher clearly describes the processes on which she based the decision to use a particular instrument, and presents the available evidence showing that the instrument is reliable and valid for the current purposes. In some cases, the researcher may need to conduct pilot studies to obtain evidence upon which to decide whether the instrument is valid for a new population or a different setting. In all cases, the researcher must present a clear and complete explanation for the choices she has made regarding reliability and validity. The consumer must then judge the degree to which the researcher has provided an adequate and theoretically sound rationale. Although I have tried to touch on most of the important concepts related to measurement reliability and validity, it is beyond the scope of this column to be exhaustive. There are textbooks devoted entirely to specific measurement issues if readers require more in-depth knowledge.

  16. Reliability and Probabilistic Risk Assessment - How They Play Together

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Stutts, Richard; Huang, Zhaofeng

    2015-01-01

    The objective of this presentation is to discuss the PRA process and the reliability engineering discipline, their differences and similarities, and how they are used as complementary analyses to support design and flight decisions.

  17. Unreliable evoked responses in autism

    PubMed Central

    Dinstein, Ilan; Heeger, David J.; Lorenzi, Lauren; Minshew, Nancy J.; Malach, Rafael; Behrmann, Marlene

    2012-01-01

    Autism has been described as a disorder of general neural processing, but the particular processing characteristics that might be abnormal in autism have mostly remained obscure. Here, we present evidence of one such characteristic: poor evoked response reliability. We compared cortical response amplitude and reliability (consistency across trials) in visual, auditory, and somatosensory cortices of high-functioning individuals with autism and controls. Mean response amplitudes were statistically indistinguishable across groups, yet trial-by-trial response reliability was significantly weaker in autism, yielding smaller signal-to-noise ratios in all sensory systems. Response reliability differences were evident only in evoked cortical responses and not in ongoing resting-state activity. These findings reveal that abnormally unreliable cortical responses, even to elementary non-social sensory stimuli, may represent a fundamental physiological alteration of neural processing in autism. The results motivate a critical expansion of autism research to determine whether (and how) basic neural processing properties such as reliability, plasticity, and adaptation/habituation are altered in autism. PMID:22998867

  18. Power processing for electric propulsion

    NASA Technical Reports Server (NTRS)

    Finke, R. C.; Herron, B. G.; Gant, G. D.

    1975-01-01

    The inclusion of electric thruster systems in spacecraft design is considered. The propulsion requirements of such spacecraft dictate a wide range of thruster power levels and operational lifetimes, which must be matched by lightweight, efficient, and reliable thruster power processing systems. Electron bombardment ion thruster requirements are presented, and the performance characteristics of present power processing systems are reviewed. Design philosophies and alternatives in areas such as inverter type, arc protection, and control methods are discussed, along with future performance potentials for meeting goals in the areas of power processor weight (10 kg/kW), efficiency (approaching 92 percent), reliability (0.96 for 15,000 hr), and thermal control capability (0.3 to 5 AU).

  19. Creating Highly Reliable Accountable Care Organizations.

    PubMed

    Vogus, Timothy J; Singer, Sara J

    2016-12-01

    Accountable Care Organizations' (ACOs) pursuit of the triple aim of higher quality, lower cost, and improved population health has met with mixed results. To improve the design and implementation of ACOs, we look to organizations that manage similarly complex, dynamic, and tightly coupled conditions while sustaining exceptional performance, known as high-reliability organizations. We describe the key processes through which organizations achieve reliability, the leadership and organizational practices that enable it, and the role that professionals can play when charged with enacting it. Specifically, we present concrete practices and processes from health care organizations pursuing high reliability and from early ACOs to illustrate how the triple aim may be met by cultivating mindful organizing, practicing reliability-enhancing leadership, and identifying and supporting reliability professionals. We conclude by proposing a set of research questions to advance the study of ACOs and high-reliability research.

  1. Limited evidence of individual differences in holistic processing in different versions of the part-whole paradigm.

    PubMed

    Sunday, Mackenzie A; Richler, Jennifer J; Gauthier, Isabel

    2017-07-01

    The part-whole paradigm was one of the first measures of holistic processing, and it has been used to address several topics in face recognition, including its development, other-race effects, and more recently, whether holistic processing is correlated with face recognition ability. However, the task was not designed to measure individual differences, and it has produced measurements with low reliability. We created a new holistic processing test designed to measure individual differences based on the part-whole paradigm, the Vanderbilt Part Whole Test (VPWT). Measurements in the part and whole conditions were reliable, but, surprisingly, there was no evidence for reliable individual differences in the part-whole index (how well a person can take advantage of a face part presented within a whole-face context compared to the part presented without a whole face) because the part and whole conditions were strongly correlated. The same result was obtained in a version of the original part-whole task that was modified to increase its reliability. Controlling for object recognition ability, we found that variance in the whole condition does not predict any additional variance in face recognition over what is already predicted by performance in the part condition.

  2. Real-time stereo matching using orthogonal reliability-based dynamic programming.

    PubMed

    Gong, Minglun; Yang, Yee-Hong

    2007-03-01

    A novel algorithm is presented in this paper for estimating reliable stereo matches in real time. Based on the dynamic programming-based technique we previously proposed, the new algorithm can generate semi-dense disparity maps using as few as two dynamic programming passes. The iterative best-path tracing process used in traditional dynamic programming is replaced by a local minimum searching process, making the algorithm suitable for parallel execution. Most computations are implemented on programmable graphics hardware, which improves the processing speed and makes real-time estimation possible. The experiments on the four new Middlebury stereo datasets show that, on an ATI Radeon X800 card, the presented algorithm can produce reliable matches for approximately 60-80% of pixels at a rate of approximately 10-20 frames per second. If needed, the algorithm can be configured to generate full-density disparity maps.
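
    The paper's GPU dynamic-programming algorithm is not reproduced here; as a hedged illustration of the underlying idea (keep only matches judged reliable, yielding a semi-dense map), the sketch below uses a plain winner-take-all matcher that discards pixels whose best cost fails to beat the runner-up by a margin. The margin, image sizes, and synthetic data are arbitrary assumptions.

    ```python
    import numpy as np

    def semidense_disparity(left, right, max_disp, margin=5.0):
        """Winner-take-all matching with a reliability test: keep a pixel's
        disparity only if its best cost beats the second best by `margin`."""
        h, w = left.shape
        cost = np.full((max_disp + 1, h, w), np.inf)
        for d in range(max_disp + 1):
            cost[d, :, d:] = np.abs(left[:, d:] - right[:, : w - d])
        ranked = np.sort(cost, axis=0)
        best, second = ranked[0], ranked[1]
        disp = np.argmin(cost, axis=0).astype(float)
        disp[second - best < margin] = np.nan       # ambiguous match: leave unmatched
        disp[~np.isfinite(second)] = np.nan         # fewer than two candidates
        return disp

    # Tiny synthetic pair: the left image is the right shifted by 3 pixels
    rng = np.random.default_rng(1)
    right = rng.uniform(0, 255, size=(20, 40))
    left = np.roll(right, 3, axis=1)
    d = semidense_disparity(left, right, max_disp=8)
    print(f"median reliable disparity: {np.nanmedian(d):.1f}")   # ~3.0
    ```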

  3. Inspection planning development: An evolutionary approach using reliability engineering as a tool

    NASA Technical Reports Server (NTRS)

    Graf, David A.; Huang, Zhaofeng

    1994-01-01

    This paper proposes an evolutionary approach to inspection planning which introduces various reliability engineering tools into the process and assesses system trade-offs among reliability, engineering requirements, manufacturing capability, and inspection cost to establish an optimal inspection plan. The examples presented in the paper illustrate some advantages and benefits of the new approach. Through the analysis, reliability and engineering impacts due to manufacturing process capability and inspection uncertainty are clearly understood; the most cost-effective and efficient inspection plan can be established and the associated risks are well controlled; some inspection reductions and relaxations are well justified; and design feedback and changes may be initiated from the analysis conclusions to further enhance reliability and reduce cost. The approach is particularly promising as global competition and customer quality improvement expectations are rapidly increasing.

  4. Scaled CMOS Reliability and Considerations for Spacecraft Systems: Bottom-Up and Top-Down Perspective

    NASA Technical Reports Server (NTRS)

    White, Mark

    2012-01-01

    New space missions will increasingly rely on more advanced technologies because of system requirements for higher performance, particularly in instruments and high-speed processing. Component-level reliability challenges with scaled CMOS in spacecraft systems are presented from a bottom-up perspective. Fundamental front-end and back-end processing reliability issues with more aggressively scaled parts are discussed. Effective thermal management from the system level to the component level (top-down) is a key element in the overall design of reliable systems. Thermal management in space systems must consider a wide range of issues, including the thermal loading of many different components and frequent temperature cycling of some systems. Both perspectives (top-down and bottom-up) play a large role in robust, reliable spacecraft system design.

  5. Speech-evoked activation in adult temporal cortex measured using functional near-infrared spectroscopy (fNIRS): Are the measurements reliable?

    PubMed

    Wiggins, Ian M; Anderson, Carly A; Kitterick, Pádraig T; Hartley, Douglas E H

    2016-09-01

    Functional near-infrared spectroscopy (fNIRS) is a silent, non-invasive neuroimaging technique that is potentially well suited to auditory research. However, the reliability of auditory-evoked activation measured using fNIRS is largely unknown. The present study investigated the test-retest reliability of speech-evoked fNIRS responses in normally-hearing adults. Seventeen participants underwent fNIRS imaging in two sessions separated by three months. In a block design, participants were presented with auditory speech, visual speech (silent speechreading), and audiovisual speech conditions. Optode arrays were placed bilaterally over the temporal lobes, targeting auditory brain regions. A range of established metrics was used to quantify the reproducibility of cortical activation patterns, as well as the amplitude and time course of the haemodynamic response within predefined regions of interest. The use of a signal processing algorithm designed to reduce the influence of systemic physiological signals was found to be crucial to achieving reliable detection of significant activation at the group level. For auditory speech (with or without visual cues), reliability was good to excellent at the group level, but highly variable among individuals. Temporal-lobe activation in response to visual speech was less reliable, especially in the right hemisphere. Consistent with previous reports, fNIRS reliability was improved by averaging across a small number of channels overlying a cortical region of interest. Overall, the present results confirm that fNIRS can measure speech-evoked auditory responses in adults that are highly reliable at the group level, and indicate that signal processing to reduce physiological noise may substantially improve the reliability of fNIRS measurements.

  6. Reliabilities of mental rotation tasks: limits to the assessment of individual differences.

    PubMed

    Hirschfeld, Gerrit; Thielsch, Meinald T; Zernikow, Boris

    2013-01-01

    Mental rotation tasks with objects and body parts as targets are widely used in cognitive neuropsychology. Even though these tasks are well established for studying between-group differences, their reliability on an individual level is largely unknown. We present a systematic study on the internal consistency and test-retest reliability of individual differences in mental rotation tasks, comparing different target types and orders of presentation. In total, n = 99 participants (n = 63 for the retest) completed the mental rotation tasks with hands, feet, faces, and cars as targets. Different target types were presented in either randomly mixed blocks or blocks of homogeneous targets. Across all target types, the consistency (split-half reliability) and stability (test-retest reliability) were good or acceptable both for intercepts and slopes. At the level of individual targets, only intercepts showed acceptable reliabilities. Blocked presentations resulted in significantly faster and numerically more consistent and stable responses. Mental rotation tasks, especially in blocked variants, can be used to reliably assess individual differences in global processing speed. However, the assessment of the theoretically important slope parameter for individual targets requires further adaptations to mental rotation tests.
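
    A minimal sketch of the consistency metric used here (odd-even split-half correlation stepped up with the Spearman-Brown formula), run on simulated response times rather than the study's data:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def split_half_reliability(trial_rts):
        """Odd-even split-half correlation of per-trial response times,
        stepped up with the Spearman-Brown formula."""
        odd = trial_rts[:, 0::2].mean(axis=1)
        even = trial_rts[:, 1::2].mean(axis=1)
        r = np.corrcoef(odd, even)[0, 1]
        return 2 * r / (1 + r)

    # Simulated data: 60 participants x 48 trials, each participant having a
    # stable speed component plus independent trial noise
    person_mean = rng.normal(900, 150, size=(60, 1))       # ms
    rts = person_mean + rng.normal(0, 250, size=(60, 48))  # ms
    print(f"split-half reliability ~= {split_half_reliability(rts):.2f}")
    ```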

  7. Reliability and paste process optimization of eutectic and lead-free for mixed packaging

    NASA Technical Reports Server (NTRS)

    Ramkumar, S. M.; Ganeshan, V.; Thenalur, K.; Ghaffarian, R.

    2002-01-01

    This paper reports the results of an experiment that utilized JPL's area array consortium test vehicle design, containing a myriad of mixed-technology components with an OSP finish. The details of the reliability study are presented in this paper.

  8. CNES reliability approach for the qualification of MEMS for space

    NASA Astrophysics Data System (ADS)

    Pressecq, Francis; Lafontan, Xavier; Perez, Guy; Fortea, Jean-Pierre

    2001-10-01

    This paper describes the reliability approach used at CNES to evaluate MEMS for space applications. After an introduction and a detailed state of the art on space requirements and the use of MEMS in space, different approaches for taking MEMS into account in the qualification phases are presented. CNES proposes improvements to these approaches in terms of failure mechanism identification. Our approach is based on a design and test phase deeply linked with a technology study. This workflow is illustrated with an example: the case of a variable capacitance device fabricated with the MUMPS process.

  9. Flexible organic TFT bio-signal amplifier using reliable chip component assembly process with conductive adhesive.

    PubMed

    Yoshimoto, Shusuke; Uemura, Takafumi; Akiyama, Mihoko; Ihara, Yoshihiro; Otake, Satoshi; Fujii, Tomoharu; Araki, Teppei; Sekitani, Tsuyoshi

    2017-07-01

    This paper presents a flexible organic thin-film transistor (OTFT) amplifier for bio-signal monitoring and describes the chip component assembly process. Using a conductive adhesive and a chip mounter, the chip components are mounted on a flexible film substrate carrying OTFT circuits. This study first investigates the reliability of the assembly technique for chip components on the flexible substrate. It then specifically examines heart pulse wave monitoring conducted using the proposed flexible amplifier circuit and a flexible piezoelectric film. We connected the amplifier to a Bluetooth device for a wearable device demonstration.

  10. Effect of individual shades on reliability and validity of observers in colour matching.

    PubMed

    Lagouvardos, P E; Diamanti, H; Polyzois, G

    2004-06-01

    The effect of individual shades in shade guides on the reliability and validity of measurements in a colour matching process is very important. Observers' agreement on shades and the sensitivity/specificity of shades can give us an estimate of a shade's effect on observer reliability and validity. In the present study, a group of 16 students matched 15 shades of a Kulzer guide and 10 human incisors to Kulzer and/or Vita shade tabs in 4 different tests. The results showed that shades I, B10, C40, A35 and A10 were those with the highest reliability and validity values. In conclusion, a) the matching process with shades of different materials was not accurate enough, b) some shades produce a more reliable and valid match than others, and c) teeth are matched with relative difficulty.
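
    For a single shade, the sensitivity/specificity logic reduces to a confusion-matrix computation; the counts below are hypothetical, not the study's:

    ```python
    tp, fn = 26, 6      # target tooth/tab was A35: A35 chosen vs. missed (hypothetical)
    tn, fp = 200, 8     # target was another shade: A35 correctly avoided vs. wrongly chosen
    sensitivity = tp / (tp + fn)   # how often the shade is found when it is the target
    specificity = tn / (tn + fp)   # how often it is avoided when it is not
    print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
    ```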

  11. Compound estimation procedures in reliability

    NASA Technical Reports Server (NTRS)

    Barnes, Ron

    1990-01-01

    At NASA, components and subsystems of components in the Space Shuttle and Space Station generally go through a number of redesign stages. While data on failures for various design stages are sometimes available, the classical procedures for evaluating reliability utilize only the failure data on the present design stage of the component or subsystem. Often, few or no failures have been recorded on the present design stage. Previously, Bayesian estimators for the reliability of a single component, conditioned on the failure data for the present design, were developed. These new estimators permit NASA to evaluate the reliability even when few or no failures have been recorded. Point estimates for the latter evaluation were not possible with the classical procedures. Since different design stages of a component (or subsystem) generally have a good deal in common, the development of new statistical procedures for evaluating reliability that consider the entire failure record across all design stages has great intuitive appeal. A typical subsystem consists of a number of different components, and each component has evolved through a number of redesign stages. The present investigations considered compound estimation procedures and related models. Such models permit the statistical consideration of all design stages of each component and thus incorporate all the available failure data to obtain estimates for the reliability of the present version of the component (or subsystem). A number of models were considered to estimate the reliability of a component conditioned on its total failure history from two design stages. It was determined that reliability estimators for the present design stage, conditioned on the complete failure history for two design stages, have lower risk than the corresponding estimators conditioned only on the most recent design failure data. Several models were explored, and preliminary models involving the bivariate Poisson distribution and the Consael process (a bivariate Poisson process) were developed. Possible shortcomings of the models are noted. An example is given to illustrate the procedures. These investigations are ongoing, with the aim of developing estimators that extend to components (and subsystems) with three or more design stages.
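
    A minimal conjugate-Bayes sketch of the pooling idea (not the report's bivariate Poisson/Consael model): the previous design stage sets a gamma prior on the failure rate, discounted by a weight to reflect the redesign, and the present stage's data update it. Note that a point estimate exists even with zero failures on the present design, which is exactly the situation classical procedures cannot handle. All numbers are assumed.

    ```python
    import math

    alpha0, beta0 = 0.5, 0.0      # vague gamma(shape, rate) prior
    f1, t1 = 4, 2000.0            # previous stage: failures, exposure hours
    f2, t2 = 0, 500.0             # present stage: no failures recorded yet
    w = 0.5                       # how much the old stage is trusted after redesign

    alpha = alpha0 + w * f1 + f2
    beta = beta0 + w * t1 + t2
    rate = alpha / beta           # posterior mean failure rate, per hour
    print(f"posterior mean rate: {rate:.2e}/h, R(100 h) ~= {math.exp(-rate * 100):.4f}")
    ```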

  12. Trial application of reliability technology to emergency diesel generators at the Trojan Nuclear Power Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, S.M.; Boccio, J.L.; Karimian, S.

    1986-01-01

    In this paper, a trial application of reliability technology to the emergency diesel generator system at the Trojan Nuclear Power Plant is presented. An approach for formulating a reliability program plan for this system is being developed. The trial application has shown that a reliability program process, using risk- and reliability-based techniques, can be interwoven into current plant operational activities to help in controlling, analyzing, and predicting faults that can challenge safety systems. With the cooperation of the utility, Portland General Electric Co., this reliability program can eventually be implemented at Trojan to track its effectiveness.

  13. Prognostics-based qualification of high-power white LEDs using Lévy process approach

    NASA Astrophysics Data System (ADS)

    Yung, Kam-Chuen; Sun, Bo; Jiang, Xiaopeng

    2017-01-01

    Due to their versatility in a variety of applications and the growing market demand, high-power white light-emitting diodes (LEDs) have attracted considerable attention. Reliability qualification testing is an essential part of the product development process to ensure the reliability of a new LED product before its release. However, the widely used IES-TM-21 method does not provide comprehensive reliability information. For more accurate and effective qualification, this paper presents a novel method based on prognostics techniques. Prognostics is an engineering technology predicting the future reliability or determining the remaining useful lifetime (RUL) of a product by assessing the extent of deviation or degradation from its expected normal operating conditions. A Lévy subordinator of a mixed Gamma and compound Poisson process is used to describe the actual degradation process of LEDs characterized by random sporadic small jumps of degradation degree, and the reliability function is derived for qualification with different distribution forms of jump sizes. The IES LM-80 test results reported by different LED vendors are used to develop and validate the qualification methodology. This study will be helpful for LED manufacturers to reduce the total test time and cost required to qualify the reliability of an LED product.
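
    A simulation sketch of the degradation model described (gamma-process component plus compound-Poisson jumps, i.e., a Lévy subordinator), estimating R(t) as the probability that degradation stays below a failure threshold; all parameters are illustrative, not fitted LM-80 values:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def degradation(t, n_paths, gamma_rate=0.002, gamma_scale=1.0,
                    jump_rate=1e-4, jump_mean=2.0):
        """X(t) = gamma-process drift + compound-Poisson jumps (a Levy subordinator)."""
        gamma_part = rng.gamma(shape=gamma_rate * t, scale=gamma_scale, size=n_paths)
        n_jumps = rng.poisson(jump_rate * t, size=n_paths)
        jump_part = np.array([rng.exponential(jump_mean, size=k).sum() for k in n_jumps])
        return gamma_part + jump_part

    threshold = 30.0                     # assumed degradation level at failure
    for t in (5000, 10000, 20000):       # operating hours
        x = degradation(t, 50000)
        print(f"R({t} h) ~= {np.mean(x < threshold):.3f}")
    ```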

  14. Women's Mental Health Questionnaire (W-MHQ), Construction, Reliability, Validity: Father Parenting Associations

    ERIC Educational Resources Information Center

    Perkins, Rose J. Merlino

    2018-01-01

    "Women's Mental Health Questionnaire" (W-MHQ) assesses females' adult mental health concerns, and examines their associations with specified father-daughter childhood relationships. Presented are W-MHQ item and scale development, and psychometric findings drawn from factor analyses, reliability assessments, and validation processes. For…

  15. The Shuttle processing contractors (SPC) reliability program at the Kennedy Space Center - The real world

    NASA Astrophysics Data System (ADS)

    McCrea, Terry

    The Shuttle Processing Contract (SPC) workforce consists of Lockheed Space Operations Co. as prime contractor, with Grumman, Thiokol Corporation, and Johnson Controls World Services as subcontractors. During the design phase, reliability engineering is instrumental in influencing the development of systems that meet the Shuttle fail-safe program requirements. Reliability engineers accomplish this objective by performing FMEA (failure modes and effects analysis) to identify potential single failure points. When technology, time, or resources do not permit a redesign to eliminate a single failure point, the single failure point information is formatted into a change request and presented to senior management of SPC and NASA for risk acceptance. In parallel with the FMEA, safety engineering conducts a hazard analysis to assure that potential hazards to personnel are assessed. The combined effort (FMEA and hazard analysis) is published as a system assurance analysis. Special ground rules and techniques are developed to perform and present the analysis. The reliability program at KSC is vigorously pursued, and has been extremely successful. The ground support equipment and facilities used to launch and land the Space Shuttle maintain an excellent reliability record.

  16. Design and validation of instruments to measure knowledge.

    PubMed

    Elliott, T E; Regal, R R; Elliott, B A; Renier, C M

    2001-01-01

    Measuring health care providers' learning after they have participated in educational interventions that use experimental designs requires valid, reliable, and practical instruments. A literature review was conducted. In addition, experience gained from designing and validating instruments for measuring the effect of an educational intervention informed this process. The eight main steps for designing, validating, and testing the reliability of instruments for measuring learning outcomes are presented. The key considerations and rationale for this process are discussed. Methods for critiquing and adapting existent instruments and creating new ones are offered. This study may help other investigators in developing valid, reliable, and practical instruments for measuring the outcomes of educational activities.

  17. A study of discrete control signal fault conditions in the shuttle DPS

    NASA Technical Reports Server (NTRS)

    Reddi, S. S.; Retter, C. T.

    1976-01-01

    An analysis of the effects of discrete failures on the data processing subsystem is presented. A functional description of each discrete, together with a list of software modules that use the discrete, is included. A qualitative description of the consequences that may ensue from discrete failures is given, followed by a probabilistic reliability analysis of the data processing subsystem. Based on the investigation conducted, recommendations were made to improve the reliability of the subsystem.

  18. Constructing the 'Best' Reliability Data for the Job - Developing Generic Reliability Data from Alternative Sources Early in a Product's Development Phase

    NASA Technical Reports Server (NTRS)

    Kleinhammer, Roger K.; Graber, Robert R.; DeMott, D. L.

    2016-01-01

    Reliability practitioners advocate getting reliability involved early in the product development process. However, when assigned to estimate or assess the (potential) reliability of a product or system early in the design and development phase, they are faced with a lack of reasonable models or methods for useful reliability estimation. Developing specific data is costly and time consuming. Instead, analysts rely on available data to assess reliability. Finding data relevant to the specific use and environment of any project is difficult, if not impossible. Instead, analysts attempt to develop the "best" or composite analog data to support the assessments. Industries, consortia, and vendors across many areas have spent decades collecting, analyzing, and tabulating fielded item and component reliability performance in terms of observed failures and operational use. This data resource provides a huge compendium of information for potential use, but it can also be compartmented by industry and difficult to find out about, access, or manipulate. One method incorporates processes for reviewing these existing data sources and identifying the available information based on similar equipment, then using that generic data to derive an analog composite. Dissimilarities in equipment descriptions, environment of intended use, quality, and even failure modes affect the "best" data incorporated in an analog composite. Once developed, this composite analog data provides a better representation of the reliability of the equipment or component. It can be used to support early risk or reliability trade studies, or analytical models to establish predicted reliability data points. It also establishes a baseline prior that may be updated based on test data or observed operational constraints and failures, i.e., using Bayesian techniques. This tutorial presents a descriptive compilation of historical data sources across numerous industries and disciplines, along with examples of contents and data characteristics. It then presents methods for combining failure information from different sources and the mathematical use of these data in early reliability estimation and analyses.
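
    One possible rendering of the composite-analog-plus-Bayesian-update workflow described above, with hypothetical source rates, similarity weights, and test results; the weighted geometric mean and gamma-prior encoding are one reasonable choice, not the tutorial's prescribed method:

    ```python
    import math

    sources = [          # (failure rate per 1e6 h, similarity weight) - hypothetical
        (2.0, 0.8),      # similar part, similar environment
        (5.0, 0.5),      # similar part, harsher environment
        (1.2, 0.3),      # older-generation part
    ]
    # Weighted geometric mean keeps the composite within the sources' range
    lam_analog = math.exp(sum(w * math.log(lam) for lam, w in sources) /
                          sum(w for _, w in sources))

    # Encode the composite as a gamma prior (n0 pseudo-failures), then update
    # with observed test data: 1 failure in 0.2e6 part-hours (assumed)
    n0 = 2.0
    alpha, beta = n0, n0 / lam_analog          # gamma(shape, rate), rate in 1e6-h units
    alpha_post, beta_post = alpha + 1, beta + 0.2
    print(f"composite analog rate: {lam_analog:.2f} per 1e6 h")
    print(f"posterior mean rate:   {alpha_post / beta_post:.2f} per 1e6 h")
    ```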

  19. Space Shuttle Software Development and Certification

    NASA Technical Reports Server (NTRS)

    Orr, James K.; Henderson, Johnnie A

    2000-01-01

    Man-rated software, "software which is in control of systems and environments upon which human life is critically dependent," must be highly reliable. The Space Shuttle Primary Avionics Software System is an excellent example of such a software system. Lessons learned from more than 20 years of effort have identified basic elements that must be present to achieve this high degree of reliability. The elements include rigorous application of appropriate software development processes, use of trusted tools to support those processes, quantitative process management, and defect elimination and prevention. This presentation highlights methods used within the Space Shuttle project and raises questions that must be addressed to provide similar success in a cost-effective manner on future long-term projects where key application development tools are COTS rather than internally developed custom application development tools.

  20. Reliability based design of the primary structure of oil tankers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casella, G.; Dogliani, M.; Guedes Soares, C.

    1996-12-31

    The present paper describes the reliability analysis carried out for two oil tankers having comparable dimensions but different designs. The scope of the analysis was to derive indications on the value of the reliability index obtained for existing, typical, and well designed oil tankers, as well as to apply the tentative rule-checking formulation developed within the CEC-funded SHIPREL project. The checking formula was adopted to redesign the midship section of one of the considered ships, upgrading her to meet the target failure probability considered in the rule development process. The resulting structure, in view of an upgrading of the steel grade in the central part of the deck, led to a suitable reliability level. The results of the analysis clearly showed that a large scatter presently exists in the design safety levels of ships, even when the Classification Societies' unified requirements are satisfied. A reliability-based approach for the calibration of the rules for the global strength of ships is therefore proposed, in order to assist designers and Classification Societies in the process of producing ships which are more optimized with respect to ensured safety levels. Based on the work reported in the paper, the feasibility and usefulness of a reliability-based approach in the development of ship longitudinal strength requirements has been demonstrated.

  1. Process characteristics and design methods for a 300 deg quad OP amp

    NASA Technical Reports Server (NTRS)

    Beasom, J. D.; Patterson, R. B., III

    1981-01-01

    The results of process characterization, circuit design, and reliability studies for the development of a quad OP amplifier intended for use up to 300 C are presented. A dielectrically isolated complementary vertical bipolar process was chosen to fabricate the amplifier in order to eliminate isolation leakage and the possibility of latch-up. Characterization of NPN and PNP junctions showed them to be suitable for use up to 300 C. Interconnect reliability was predicted to exceed a four-year mean time between failures. Parasitic MOS formation was eliminated by isolation of each device.

  2. Understanding the Reliability of Solder Joints Used in Advanced Structural and Electronics Applications: Part 2 - Reliability Performance.

    DOE PAGES

    Vianco, Paul T.

    2017-03-01

    Whether structural or electronic, all solder joints must provide the necessary level of reliability for the application. The Part 1 report examined the effects of filler metal properties and the soldering process on joint reliability. Filler metal solderability and mechanical properties, as well as the extents of base material dissolution and interface reaction that occur during the soldering process, were shown to affect reliability performance. The continuation of this discussion is presented in this Part 2 report, which highlights those factors that directly affect solder joint reliability. There is the growth of an intermetallic compound (IMC) reaction layer at the solder/base material interface by means of solid-state diffusion processes. In terms of mechanical response by the solder joint, fatigue remains the foremost concern for long-term performance. Thermal mechanical fatigue (TMF), a form of low-cycle fatigue (LCF), occurs when temperature cycling is combined with mismatched values of the coefficient of thermal expansion (CTE) between materials comprising the solder joint "system." Vibration environments give rise to high-cycle fatigue (HCF) degradation. Although accelerated aging studies provide valuable empirical data, too many variants of filler metals, base materials, joint geometries, and service environments are forcing design engineers to embrace computational modeling to predict the long-term reliability of solder joints.
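
    The low-cycle-fatigue reasoning above is often captured with a Coffin-Manson-type relation; below is a heavily simplified sketch in which the CTE mismatch sets the cyclic strain range on a solder joint. Every constant here is an assumption for illustration, not a vetted material property:

    ```python
    alpha_pcb, alpha_comp = 17e-6, 7e-6   # assumed CTEs of board (FR-4) and ceramic part, 1/K
    dT = 80.0                             # temperature swing per cycle, K
    L, h = 10e-3, 0.1e-3                  # distance to neutral point and joint height, m

    # Cyclic shear strain range imposed by the CTE mismatch (simplified kinematics)
    d_gamma = (alpha_pcb - alpha_comp) * dT * L / h
    C, k = 0.5, 2.0                       # assumed Coffin-Manson constants for the alloy
    n_cycles = C * d_gamma ** (-k)        # predicted thermal cycles to fatigue failure
    print(f"strain range = {d_gamma:.3f}, cycles to failure ~= {n_cycles:.0f}")
    ```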

  3. Bayesian methods in reliability

    NASA Astrophysics Data System (ADS)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.

  4. Design of high reliability organizations in health care.

    PubMed

    Carroll, J S; Rudolph, J W

    2006-12-01

    To improve safety performance, many healthcare organizations have sought to emulate high reliability organizations from industries such as nuclear power, chemical processing, and military operations. We outline high reliability design principles for healthcare organizations including both the formal structures and the informal practices that complement those structures. A stage model of organizational structures and practices, moving from local autonomy to formal controls to open inquiry to deep self-understanding, is used to illustrate typical challenges and design possibilities at each stage. We suggest how organizations can use the concepts and examples presented to increase their capacity to self-design for safety and reliability.

  5. Complementary Reliability-Based Decodings of Binary Linear Block Codes

    NASA Technical Reports Server (NTRS)

    Fossorier, Marc P. C.; Lin, Shu

    1997-01-01

    This correspondence presents a hybrid reliability-based decoding algorithm which combines the reprocessing method based on the most reliable basis and a generalized Chase-type algebraic decoder based on the least reliable positions. It is shown that reprocessing with a simple additional algebraic decoding effort achieves significant coding gain. For long codes, the order of reprocessing required to achieve asymptotic optimum error performance is reduced by approximately 1/3. This significantly reduces the computational complexity, especially for long codes. Also, a more efficient criterion for stopping the decoding process is derived based on the knowledge of the algebraic decoding solution.
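
    A compact sketch of Chase-type reprocessing over the least reliable positions, using a Hamming(7,4) syndrome decoder as the algebraic component; this is a simplified stand-in for illustration, not the correspondence's exact hybrid algorithm:

    ```python
    import itertools
    import numpy as np

    # Hamming(7,4) parity-check matrix: column j is the binary expansion of j+1,
    # so a single-error syndrome directly encodes the error position.
    H = np.array([[int(b) for b in f"{c:03b}"] for c in range(1, 8)]).T

    def hamming_decode(hard):
        """Single-error syndrome decoding for Hamming(7,4); always returns a codeword."""
        s = (H @ hard) % 2
        pos = int("".join(map(str, s)), 2)
        out = hard.copy()
        if pos:
            out[pos - 1] ^= 1
        return out

    def chase_decode(llr, p=2):
        """Flip every pattern over the p least reliable positions, algebraically
        decode each candidate, and keep the codeword that correlates best with
        the soft input."""
        hard = (llr < 0).astype(int)             # BPSK hard decisions: 0 -> +1, 1 -> -1
        weak = np.argsort(np.abs(llr))[:p]       # least reliable positions
        best_cw, best_corr = None, -np.inf
        for flips in itertools.product((0, 1), repeat=p):
            cand = hard.copy()
            cand[weak] ^= np.array(flips)
            cw = hamming_decode(cand)
            corr = np.sum(llr * (1 - 2 * cw))    # correlation metric
            if corr > best_corr:
                best_cw, best_corr = cw, corr
        return best_cw

    # Noisy soft values for the all-zero codeword (ideally all positive)
    llr = np.array([1.2, 0.8, -0.3, 2.1, 0.1, 1.5, 0.9])
    print(chase_decode(llr))   # recovers the all-zero codeword
    ```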

  6. [Employees in high-reliability organizations: systematic selection of personnel as a final criterion].

    PubMed

    Oubaid, V; Anheuser, P

    2014-05-01

    Employees represent an important safety factor in high-reliability organizations. The combination of clear organizational structures, a nonpunitive safety culture, and psychological personnel selection guarantees a high level of safety. The cockpit personnel selection process of a major German airline is presented in order to demonstrate its possible transferability to medicine and urology.

  7. Reliability issues in PACS

    NASA Astrophysics Data System (ADS)

    Taira, Ricky K.; Chan, Kelby K.; Stewart, Brent K.; Weinberg, Wolfram S.

    1991-07-01

    Reliability is an increasing concern when moving PACS from the experimental laboratory to the clinical environment. Any system downtime may seriously affect patient care. The authors report on several classes of errors encountered during the pre-clinical release of the PACS over the past several months and present the solutions implemented to handle them. The reliability issues discussed include: (1) environmental precautions; (2) database backups; (3) monitoring routines for critical resources and processes; (4) hardware redundancy (networks, archives); and (5) development of a PACS quality control program.

  8. Noninteractive macroscopic reliability model for whisker-reinforced ceramic composites

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Arnold, Steven M.

    1990-01-01

    Considerable research is underway in the field of material science focusing on incorporating silicon carbide whiskers into silicon nitride and alumina matrices. These composites show the requisite thermal stability and thermal shock resistance necessary for use as components in advanced gas turbines and heat exchangers. This paper presents a macroscopic noninteractive reliability model for whisker-reinforced ceramic composites. The theory is multiaxial and is applicable to composites that can be characterized as transversely isotropic. Enough processing data exists to suggest this idealization encompasses a significantly large class of fabricated components. A qualitative assessment of the model is made by presenting reliability surfaces in several different stress spaces and for different values of model parameters.

  9. Reliable transfer of data from ground to space

    NASA Technical Reports Server (NTRS)

    Brosi, Fred

    1993-01-01

    This paper describes the problems involved in the uplink of data from control centers on the ground to spacecraft, and explores the solutions to those problems: past, present, and future. The evolution of this process, from simple commanding to the transfer of large volumes of data and commands, is traced. The need for reliable end-to-end protocols for commanding and file transfer is demonstrated, and the shortcomings of both existing telecommand protocols and commercial products in meeting this need are discussed. Recent developments in commercial protocols that may be adaptable to this operations environment are surveyed, and current efforts to develop a suite of protocols for reliable transfer in this environment are presented.

  10. An adaptive management process for forest soil conservation.

    Treesearch

    Michael P. Curran; Douglas G. Maynard; Ronald L. Heninger; Thomas A. Terry; Steven W. Howes; Douglas M. Stone; Thomas Niemann; Richard E. Miller; Robert F. Powers

    2005-01-01

    Soil disturbance guidelines should be based on comparable disturbance categories adapted to specific local soil conditions, validated by monitoring and research. Guidelines, standards, and practices should be continually improved based on an adaptive management process, which is presented in this paper. Core components of this process include: reliable monitoring...

  11. Photovoltaic research needs industry perspective

    NASA Technical Reports Server (NTRS)

    Ravi, K. V.

    1982-01-01

    An industry perspective on photovoltaic research needs is presented. Objectives and features of industry needs are discussed for the materials, devices, processes, and reliability research categories.

  12. Development of an ultrasonic weld inspection system based on image processing and neural networks

    NASA Astrophysics Data System (ADS)

    Roca Barceló, Fernando; Jaén del Hierro, Pedro; Ribes Llario, Fran; Real Herráiz, Julia

    2018-04-01

    Several types of discontinuities and defects may be present in a weld, leading to a considerable reduction of its strength. Therefore, ensuring high welding quality and reliability has become a matter of key importance for many construction and industrial activities. Among the non-destructive weld testing and inspection techniques, time-of-flight diffraction (TOFD) stands out as a very safe (no ionising radiation), precise, reliable and versatile practice. However, this technique presents a relevant drawback, associated with the appearance of speckle noise, that should be addressed. In this regard, this paper presents a new, intelligent and automatic method for weld inspection and analysis based on TOFD, image processing and neural networks. The developed system is capable of detecting weld defects and imperfections with accuracy and classifying them into different categories.

  13. Reliability growth modeling analysis of the space shuttle main engines based upon the Weibull process

    NASA Technical Reports Server (NTRS)

    Wheeler, J. T.

    1990-01-01

    The Weibull process, identified as the inhomogeneous Poisson process with the Weibull intensity function, is used to model the reliability growth assessment of the space shuttle main engine test and flight failure data. Additional tables of percentage-point probabilities for several different values of the confidence coefficient have been generated for setting (1-alpha)100-percent two-sided confidence interval estimates on the mean time between failures. The tabled data pertain to two cases: (1) time-terminated testing, and (2) failure-terminated testing. The critical values of the three test statistics, namely Cramer-von Mises, Kolmogorov-Smirnov, and chi-square, were calculated and tabled for use in goodness-of-fit tests for the engine reliability data. Numerical results are presented for five different groupings of the engine data that reflect the actual response to the failures.
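
    The Weibull (power-law) intensity model above is the standard Crow-AMSAA reliability-growth setting, where for time-terminated data the maximum-likelihood estimates have a closed form. The failure times below are made up for illustration:

    ```python
    import math

    failure_times = [120.0, 480.0, 1100.0, 2300.0, 3700.0]   # cumulative test hours (made up)
    T = 4000.0                                               # total test time (time-terminated)
    n = len(failure_times)

    beta_hat = n / sum(math.log(T / t) for t in failure_times)    # shape MLE
    lam_hat = n / T ** beta_hat                                   # scale MLE
    mtbf_now = 1.0 / (lam_hat * beta_hat * T ** (beta_hat - 1))   # instantaneous MTBF at T

    print(f"beta = {beta_hat:.2f}  (< 1 indicates reliability growth)")
    print(f"instantaneous MTBF at T = {mtbf_now:.0f} h")
    ```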

  14. Design of high reliability organizations in health care

    PubMed Central

    Carroll, J S; Rudolph, J W

    2006-01-01

    To improve safety performance, many healthcare organizations have sought to emulate high reliability organizations from industries such as nuclear power, chemical processing, and military operations. We outline high reliability design principles for healthcare organizations including both the formal structures and the informal practices that complement those structures. A stage model of organizational structures and practices, moving from local autonomy to formal controls to open inquiry to deep self‐understanding, is used to illustrate typical challenges and design possibilities at each stage. We suggest how organizations can use the concepts and examples presented to increase their capacity to self‐design for safety and reliability. PMID:17142607

  15. Cliophysics: Socio-Political Reliability Theory, Polity Duration and African Political (In)stabilities

    PubMed Central

    Cherif, Alhaji; Barley, Kamal

    2010-01-01

    Quantification of historical sociological processes has recently gained attention among theoreticians in the effort to provide a solid theoretical understanding of the behaviors and regularities present in socio-political dynamics. Here we present a reliability theory of polity processes with emphasis on the individual political dynamics of African countries. We found that the structural properties of polity failure rates successfully capture the risk of political vulnerability and instability: countries with monotonically increasing, unimodal, U-shaped, and monotonically decreasing polity failure rates show high levels of state fragility indices. The quasi-U-shaped relationship between average polity duration and regime type corroborates historical precedents and explains the stability of autocracies and democracies. PMID:21206911

  16. Comparison of fMRI paradigms assessing visuospatial processing: Robustness and reproducibility

    PubMed Central

    Herholz, Peer; Zimmermann, Kristin M.; Westermann, Stefan; Frässle, Stefan; Jansen, Andreas

    2017-01-01

    The development of brain imaging techniques, in particular functional magnetic resonance imaging (fMRI), has made it possible to non-invasively study the hemispheric lateralization of cognitive brain functions in large cohorts. Comprehensive models of hemispheric lateralization are, however, still missing; such models should account not only for the hemispheric specialization of individual brain functions, but also for the interactions among different lateralized cognitive processes (e.g., language and visuospatial processing). This calls for robust and reliable paradigms to study hemispheric lateralization for various cognitive functions. While numerous reliable imaging paradigms have been developed for language, the most prominent left-lateralized brain function, the reliability of imaging paradigms investigating typically right-lateralized brain functions, such as visuospatial processing, has received comparatively little attention. In the present study, we aimed to establish an fMRI paradigm that robustly and reliably identifies right-hemispheric activation evoked by visuospatial processing in individual subjects. In a first study, we compared three frequently used paradigms for assessing visuospatial processing and evaluated their utility for robustly detecting right-lateralized brain activity at the single-subject level. In a second study, we then assessed the test-retest reliability of the so-called Landmark task, the paradigm that yielded the most robust results in study 1. At the single-voxel level, we found poor reliability of the brain activation underlying visuospatial attention, suggesting that poor signal-to-noise ratios can become a limiting factor for test-retest reliability. This is a common detriment of fMRI paradigms investigating visuospatial attention in general and therefore highlights the need for careful consideration of both the possibilities and the limitations of the respective fMRI paradigm, in particular when effects at the single-voxel level are of interest. Notably, however, when focusing on the reliability of measures of hemispheric lateralization (the main goal of study 2), we show that hemispheric dominance (quantified by the lateralization index, LI, with |LI| > 0.4) of the evoked activation could be robustly determined in more than 62% of our subjects and, if considering only two categories (i.e., left, right), in more than 93%. Furthermore, the reliability of the lateralization strength (LI) was “fair” to “good”. In conclusion, our results suggest that the degree of right-hemispheric dominance during visuospatial processing can be reliably determined using the Landmark task, both at the group and single-subject level, while at the same time stressing the need for future refinements of experimental paradigms and more sophisticated fMRI data acquisition techniques. PMID:29059201
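
    As a pointer to the arithmetic involved, the sketch below computes a basic lateralization index, LI = (L - R) / (L + R), from activation in homologous left and right regions; the study's actual LI computation (e.g., thresholding and bootstrapping choices) is not specified in this record, so the function and threshold here are illustrative assumptions.

      import numpy as np

      def lateralization_index(left_activation, right_activation, threshold=0.0):
          """LI in [-1, 1]: positive values indicate left dominance, negative right dominance."""
          L = np.clip(np.asarray(left_activation) - threshold, 0.0, None).sum()
          R = np.clip(np.asarray(right_activation) - threshold, 0.0, None).sum()
          return (L - R) / (L + R) if (L + R) > 0 else 0.0

      li = lateralization_index([0.2, 0.5, 0.1], [1.4, 0.9, 1.1])
      dominance = "left" if li > 0.4 else "right" if li < -0.4 else "bilateral"  # |LI| > 0.4 criterion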

  17. Overview of RICOR's reliability theoretical analysis, accelerated life demonstration test results and verification by field data

    NASA Astrophysics Data System (ADS)

    Vainshtein, Igor; Baruch, Shlomi; Regev, Itai; Segal, Victor; Filis, Avishai; Riabzev, Sergey

    2018-05-01

    The growing demand for EO applications that operate around the clock, 24/7, such as border surveillance systems, emphasizes the need for a highly reliable cryocooler with increased operational availability and optimized system-level Integrated Logistic Support (ILS). To meet this need, RICOR developed linear and rotary cryocoolers that successfully achieved this goal. Cryocooler MTTF was analyzed by theoretical reliability evaluation methods, demonstrated by normal and accelerated life tests at the cryocooler level, and finally verified by analysis of field data from cryocoolers operating at the system level. The following paper reviews theoretical reliability analysis methods together with reliability test results derived from standard and accelerated life demonstration tests performed at RICOR's advanced reliability laboratory. As a summary of the work process, reliability verification data are presented as feedback from fielded systems.

  18. Power processing for electric propulsion

    NASA Technical Reports Server (NTRS)

    Finke, R. C.; Herron, B. G.; Gant, G. D.

    1975-01-01

    The potential of achieving up to 30 per cent more spacecraft payload or 50 per cent more useful operating life by the use of electric propulsion in place of conventional cold gas or hydrazine systems in science, communications, and earth applications spacecraft is a compelling reason to consider the inclusion of electric thruster systems in new spacecraft design. The propulsion requirements of such spacecraft dictate a wide range of thruster power levels and operational lifetimes, which must be matched by lightweight, efficient, and reliable thruster power processing systems. This paper will present electron bombardment ion thruster requirements; review the performance characteristics of present power processing systems; discuss design philosophies and alternatives in areas such as inverter type, arc protection, and control methods; and project future performance potentials for meeting goals in the areas of power processor weight (10 kg/kW), efficiency (approaching 92 per cent), reliability (0.96 for 15,000 hr), and thermal control capability (0.3 to 5 AU).

  19. Body of Knowledge (BOK) for Leadless Quad Flat No-Lead/bottom Termination Components (QFN/BTC) Package Trends and Reliability

    NASA Technical Reports Server (NTRS)

    Ghaffarian, Reza

    2014-01-01

    Bottom terminated components and quad flat no-lead (BTC/QFN) packages have been extensively used by commercial industry for more than a decade. Cost and performance advantages and the closeness of the packages to the boards make them especially well suited for radio frequency (RF) applications. A number of high-reliability parts are now available in this style of package configuration. This report presents a summary of the literature surveyed and provides a body of knowledge (BOK) on the status of BTC/QFN and their advanced versions, multi-row QFN (MRQFN) packaging technologies. The report provides a comprehensive review of packaging trends and specifications on design, assembly, and reliability. Emphasis is placed on assembly reliability and the associated key design and process parameters, because these assemblies show lower life than standard leaded package assemblies under thermal cycling exposure. Inspection of the hidden solder joints for quality assurance is challenging, as it is for ball grid arrays (BGAs). Understanding the key BTC/QFN technology trends, applications, processing parameters, workmanship defects, and reliability behavior is important when judiciously selecting and narrowing the follow-on packages for evaluation and testing, as well as for low-risk insertion in high-reliability applications.

  20. Body of Knowledge (BOK) for Leadless Quad Flat No-Lead/Bottom Termination Components (QFN/BTC) Package Trends and Reliability

    NASA Technical Reports Server (NTRS)

    Ghaffarian, Reza

    2014-01-01

    Bottom terminated components and quad flat no-lead (BTC/QFN) packages have been extensively used by commercial industry for more than a decade. Cost and performance advantages and the closeness of the packages to the boards make them especially well suited for radio frequency (RF) applications. A number of high-reliability parts are now available in this style of package configuration. This report presents a summary of the literature surveyed and provides a body of knowledge (BOK) on the status of BTC/QFN and their advanced versions, multi-row QFN (MRQFN) packaging technologies. The report provides a comprehensive review of packaging trends and specifications on design, assembly, and reliability. Emphasis is placed on assembly reliability and the associated key design and process parameters, because these assemblies show lower life than standard leaded package assemblies under thermal cycling exposure. Inspection of the hidden solder joints for quality assurance is challenging, as it is for ball grid arrays (BGAs). Understanding the key BTC/QFN technology trends, applications, processing parameters, workmanship defects, and reliability behavior is important when judiciously selecting and narrowing the follow-on packages for evaluation and testing, as well as for low-risk insertion in high-reliability applications.

  1. Thick resist for MEMS processing

    NASA Astrophysics Data System (ADS)

    Brown, Joe; Hamel, Clifford

    2001-11-01

    The need for technical innovation is always present in today's economy. Microfabrication methods have evolved in support of the demand for smaller and faster integrated circuits, with price-performance improvements always in the scope of the manufacturing design engineer. The dispersion of processing technology now spans well beyond IC fabrication, with batch fabrication and wafer-scale processing lending advantages to MEMS applications from biotechnology to consumer electronics, from oil exploration to aerospace. Today there is clear demand for innovative processing techniques that enable technologies which only a few years ago appeared too costly or unreliable. In high-volume applications where yield and cost improvements are measured in fractions of a percent, it is imperative to have process technologies that produce consistent results. Only a few years ago, thick resist coatings were limited to thicknesses of less than 20 microns; factors such as uniformity, edge bead, and the need for multiple coatings made high-volume production impossible. New developments in photoresist formulation, combined with advanced coating equipment that closely controls process parameters, have enabled thick photoresist coatings of 70 microns with acceptable uniformity and edge bead in one pass. Packaging of microelectronic and micromechanical devices is often a significant cost factor and a reliability issue for high-volume, low-cost production. Technologies such as flip-chip assembly provide cost and reliability improvements over wire-bond techniques; their processing demands dimensional control and presents significant cost savings where it is compatible with mainstream technologies. Thick photoresist layers with good sidewall control would allow wafer-bumping technologies to penetrate the yield and production barriers where technology cost is the overriding issue. Single-pass processing is paramount to the manufacturability of packaging technology, and uniformity and edge-bead control define the success of process implementation. Today, advanced packaging solutions are created with thick photoresist coatings; the techniques and results are presented.

  2. An abstract specification language for Markov reliability models

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1985-01-01

    Markov models can be used to compute the reliability of virtually any fault-tolerant system. However, the process of delineating all of the states and transitions in a model of a complex system can be devastatingly tedious and error-prone. An approach to this problem is presented utilizing an abstract model definition language. This high-level language is described in a nonformal manner and illustrated by example.

  3. An abstract language for specifying Markov reliability models

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1986-01-01

    Markov models can be used to compute the reliability of virtually any fault-tolerant system. However, the process of delineating all of the states and transitions in a model of a complex system can be devastatingly tedious and error-prone. An approach to this problem is presented utilizing an abstract model definition language. This high-level language is described in a nonformal manner and illustrated by example.

  4. An algebraic equation solution process formulated in anticipation of banded linear equations.

    DOT National Transportation Integrated Search

    1971-01-01

    A general method for the solution of large, sparsely banded, positive-definite, coefficient matrices is presented. The goal in developing the method was to produce an efficient and reliable solution process and to provide the user-programmer with a p...

  5. Optimizing multiple reliable forward contracts for reservoir allocation using multitime scale streamflow forecasts

    NASA Astrophysics Data System (ADS)

    Lu, Mengqian; Lall, Upmanu; Robertson, Andrew W.; Cook, Edward

    2017-03-01

    Streamflow forecasts at multiple time scales provide a new opportunity for reservoir management to address competing objectives. Market instruments such as forward contracts with specified reliability are considered as tools that may help address the perceived risk associated with the use of such forecasts in lieu of traditional operation and allocation strategies. A water allocation process that enables multiple contracts for water supply and hydropower production with different durations, while maintaining a prescribed level of flood risk reduction, is presented. The allocation process is supported by an optimization model that considers multi-time-scale ensemble forecasts of monthly streamflow and flood volume over the upcoming season and year, together with the desired reliability and pricing of the proposed contracts for hydropower and water supply. It solves for the size of the contracts at each reliability level that can be allocated for each future period, while meeting a target end-of-period reservoir storage with a prescribed reliability. The contracts may be insurable, given that their reliability is verified through retrospective modeling. The process can allow reservoir operators to overcome their concerns as to the appropriate skill of probabilistic forecasts, while providing water users with short-term and long-term guarantees as to how much water or energy they may be allocated. An application of the optimization model to the Bhakra Dam, India, provides an illustration of the process, and the issues of forecast skill and contract performance are examined. A field engagement of the idea would be useful to develop a real-world perspective, and needs a suitable institutional environment.
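
    The core of a reliability-specified contract can be illustrated with a quantile rule: a contract offered at reliability r is sized so that it can be honored in at least a fraction r of ensemble members. The sketch below, with invented numbers and a deliberately simplified single-period setting, shows only that idea, not the paper's full multi-contract optimization model.

      import numpy as np

      def contract_size(ensemble_volumes, reliability, reserved=0.0):
          """Largest commitment met with probability >= reliability, after setting aside
          `reserved` volume (e.g., for flood control or end-of-period storage targets)."""
          available = np.maximum(np.asarray(ensemble_volumes, dtype=float) - reserved, 0.0)
          return np.quantile(available, 1.0 - reliability)

      inflows = np.random.default_rng(0).gamma(shape=5.0, scale=20.0, size=1000)  # synthetic ensemble
      for r in (0.99, 0.90, 0.50):
          print(f"reliability {r:.2f}: contract {contract_size(inflows, r, reserved=10.0):.1f} volume units")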

  6. Characterising the reproducibility and reliability of dietary patterns among Yup'ik Alaska Native people.

    PubMed

    Ryman, Tove K; Boyer, Bert B; Hopkins, Scarlett; Philip, Jacques; O'Brien, Diane; Thummel, Kenneth; Austin, Melissa A

    2015-02-28

    FFQ data can be used to characterise dietary patterns for diet-disease association studies. In the present study, we evaluated three previously defined dietary patterns--'subsistence foods', market-based 'processed foods' and 'fruits and vegetables'--among a sample of Yup'ik people from Southwest Alaska. We tested the reproducibility and reliability of the dietary patterns, as well as the associations of these patterns with dietary biomarkers and participant characteristics. We analysed data from adult study participants who completed at least one FFQ with the Center for Alaska Native Health Research 9/2009-5/2013. To test the reproducibility of the dietary patterns, we conducted a confirmatory factor analysis (CFA) of a hypothesised model using eighteen food items to measure the dietary patterns (n 272). To test the reliability of the dietary patterns, we used the CFA to measure composite reliability (n 272) and intra-class correlation coefficients for test-retest reliability (n 113). Finally, to test the associations, we used linear regression (n 637). All factor loadings, except one, in CFA indicated acceptable correlations between foods and dietary patterns (r>0·40), and model-fit criteria were >0·90. Composite and test-retest reliability of the dietary patterns were, respectively, 0·56 and 0·34 for 'subsistence foods', 0·73 and 0·66 for 'processed foods', and 0·72 and 0·54 for 'fruits and vegetables'. In the multi-predictor analysis, the dietary patterns were significantly associated with dietary biomarkers, community location, age, sex and self-reported lifestyle. This analysis confirmed the reproducibility and reliability of the dietary patterns in the present study population. These dietary patterns can be used for future research and development of dietary interventions in this underserved population.
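
    For readers unfamiliar with the test-retest statistic reported above, the sketch below computes a two-way random, single-measure intraclass correlation, ICC(2,1), from an n-subjects x k-occasions score matrix using the Shrout-Fleiss mean-square formulas; whether this exact ICC form matches the paper's choice is an assumption.

      import numpy as np

      def icc_2_1(Y):
          """ICC(2,1) for Y with shape (n subjects, k measurement occasions)."""
          Y = np.asarray(Y, dtype=float)
          n, k = Y.shape
          grand = Y.mean()
          MSR = k * ((Y.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # between-subjects mean square
          MSC = n * ((Y.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # between-occasions mean square
          resid = Y - Y.mean(axis=1, keepdims=True) - Y.mean(axis=0, keepdims=True) + grand
          MSE = (resid ** 2).sum() / ((n - 1) * (k - 1))
          return (MSR - MSE) / (MSR + (k - 1) * MSE + k * (MSC - MSE) / n)

      scores = np.array([[2.1, 2.3], [3.5, 3.2], [1.0, 1.4], [4.2, 4.0]])  # 4 subjects, 2 occasions
      print(icc_2_1(scores))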

  7. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    PubMed

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes must be weighed to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes; uses a quantitative method for the cost index and integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes; and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents the detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computing of building construction projects.
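
    A brief sketch of the information-entropy step described above: entropy weights reward indexes that discriminate between schemes, and a weighted sum then ranks the schemes. The index matrix and the benefit-type normalization are invented for illustration.

      import numpy as np

      def entropy_weights(X):
          """X: m schemes x n indexes, nonnegative and already normalized as benefit-type values."""
          P = X / X.sum(axis=0)                                   # proportion per scheme under each index
          P = np.where(P <= 0.0, 1e-12, P)                        # guard against log(0)
          E = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])   # entropy per index, in [0, 1]
          d = 1.0 - E                                             # diversification degree
          return d / d.sum()

      # rows: candidate construction schemes; columns: cost, progress, quality, safety scores
      X = np.array([[0.80, 0.70, 0.90, 0.60],
                    [0.60, 0.90, 0.70, 0.80],
                    [0.90, 0.60, 0.80, 0.70]])
      w = entropy_weights(X)
      print("weights:", w, "synthesis scores:", X @ w)   # highest score = preferred scheme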

  8. Reliable 6 PEP LTPS device for AMOLED's

    NASA Astrophysics Data System (ADS)

    Chou, Cheng-Wei; Wang, Pei-Yun; Hu, Chin-Wei; Chang, York; Chuang, Ching-Sang; Lin, Yusin

    2013-09-01

    This study presents a TFT structure that requires fewer photolithography (PEP) steps and offers higher cost competitiveness in AMOLED display markets. A novel LTPS-based six-mask TFT structure for bottom-emission AMOLED displays is demonstrated in this paper. High field-effect mobility (PMOS < 80 cm2/Vs) and high reliability (PBTS △Vth < 0.02 V @ 50 °C, VG = 15 V, 10 ks) were accomplished without high-temperature or rapid thermal annealing (RTA) activation processes. Furthermore, a 14-inch AMOLED TV was achieved on the proposed 6-PEP TFT backplane using a Gen. 3.5 mass-production factory.

  9. Calculating system reliability with SRFYDO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morzinski, Jerome; Anderson - Cook, Christine M; Klamann, Richard M

    2010-01-01

    SRFYDO is a process for estimating the reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.
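
    SRFYDO itself is not reproduced in this record, but the central Bayesian idea, combining component-level test posteriors into a series-system reliability with uncertainty, can be sketched as follows. The Jeffreys prior, the pass/fail data, and the plain product rule for a series system are all simplifying assumptions for illustration.

      import numpy as np

      rng = np.random.default_rng(1)

      # (successes, trials) of pass/fail tests for each component in a series system
      components = {"igniter": (48, 50), "valve": (29, 30), "controller": (97, 100)}

      system_draws = np.ones(100_000)
      for s, n in components.values():
          # Beta posterior under a Jeffreys Beta(0.5, 0.5) prior; draws of component reliability
          system_draws *= rng.beta(s + 0.5, n - s + 0.5, size=system_draws.size)

      lo, hi = np.quantile(system_draws, [0.05, 0.95])
      print(f"system reliability ~ {system_draws.mean():.3f} (90% credible interval {lo:.3f}-{hi:.3f})")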

  10. Individual differences and predictors of forgetting in old age: the role of processing speed and working memory.

    PubMed

    Zimprich, Daniel; Kurtz, Tanja

    2013-01-01

    The goal of the present study was to examine whether individual differences in two basic cognitive abilities, processing speed and working memory, are reliable predictors of individual differences in forgetting rates in old age. The sample comprised 364 participants aged between 65 and 80 years from the Zurich Longitudinal Study on Cognitive Aging. The impact of basic cognitive abilities on forgetting was analyzed by modeling working memory and processing speed as predictors of the amount of forgetting of 27 words that had been learned across five trials. Forgetting was measured over a 30-minute interval using parceling and a latent change model, in which the latent difference between recall performance after the five learning trials and a delayed recall was modeled. The results implied reliable individual differences in forgetting, which were strongly related to processing speed and working memory. Moreover, an age-related effect, significantly stronger for forgetting than for learning, emerged even after controlling for the effects of processing speed and working memory.

  11. Time frequency analysis of olfactory induced EEG-power change.

    PubMed

    Schriever, Valentin Alexander; Han, Pengfei; Weise, Stefanie; Hösel, Franziska; Pellegrino, Robert; Hummel, Thomas

    2017-01-01

    The objective of the present study was to investigate the usefulness of time-frequency analysis (TFA) of olfactory-induced EEG change with a low-cost, portable olfactometer in the clinical investigation of smell function. A total of 78 volunteers participated. The study was composed of three parts in which olfactory stimuli were presented using a custom-built olfactometer. Part I was designed to optimize the stimulus as well as the recording conditions. In Part II, EEG power changes after olfactory/trigeminal stimulation were compared between healthy participants and patients with olfactory impairment. In Part III, the test-retest reliability of the method was evaluated in healthy subjects. Part I indicated that the most effective paradigm for stimulus presentation was a cued stimulus with an interstimulus interval of 18-20 s and a stimulus duration of 1000 ms, with each stimulus quality presented 60 times in blocks of 20 stimuli each. In Part II we found that central processing of olfactory stimuli analyzed by TFA differed significantly between healthy controls and patients, even when controlling for age. It was possible to reliably distinguish patients with olfactory impairment from healthy individuals with a high degree of accuracy (healthy controls vs anosmic patients: sensitivity 75%; specificity 89%). In addition, Part III showed good test-retest reliability of TFA of chemosensory-induced EEG power changes. Central processing of olfactory stimuli analyzed by TFA thus reliably distinguishes patients with olfactory impairment from healthy individuals with a high degree of accuracy; importantly, this can be achieved with a simple olfactometer.
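
    To make the TFA step concrete, the sketch below computes a spectrogram of a single EEG channel and expresses post-stimulus power as a dB change relative to a pre-stimulus baseline; the sampling rate, window lengths, and baseline interval are assumptions, and random noise stands in for real data.

      import numpy as np
      from scipy.signal import spectrogram

      fs = 250.0                                                # assumed EEG sampling rate (Hz)
      t = np.arange(0.0, 30.0, 1.0 / fs)
      eeg = np.random.default_rng(2).standard_normal(t.size)    # stand-in for one channel

      f, tt, Sxx = spectrogram(eeg, fs=fs, nperseg=256, noverlap=192)

      baseline = Sxx[:, tt < 5.0].mean(axis=1, keepdims=True)   # mean power before stimulus onset
      power_change_db = 10.0 * np.log10(Sxx / baseline)         # time-frequency power change map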

  12. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
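
    As a flavor of the second analysis, software reliability growth is often summarized by fitting a nonhomogeneous Poisson process to cumulative failure counts. The sketch below fits the Goel-Okumoto mean-value function, a common choice which the handbook may or may not use, to invented test data.

      import numpy as np
      from scipy.optimize import curve_fit

      def goel_okumoto(t, a, b):
          """Expected cumulative failures by time t: a = total faults, b = detection rate."""
          return a * (1.0 - np.exp(-b * t))

      weeks = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])
      cum_failures = np.array([12.0, 20.0, 26.0, 30.0, 33.0, 35.0])   # hypothetical test history

      (a, b), _ = curve_fit(goel_okumoto, weeks, cum_failures, p0=(40.0, 0.05))
      intensity_now = a * b * np.exp(-b * weeks[-1])    # current failure rate (failures/week)
      print(f"estimated total faults {a:.1f}, current intensity {intensity_now:.3f}")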

  13. Report on phase 1 of the Microprocessor Seminar. [and associated large scale integration

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Proceedings of a seminar on microprocessors and associated large scale integrated (LSI) circuits are presented. The potential for commonality of device requirements, candidate processes and mechanisms for qualifying candidate LSI technologies for high reliability applications, and specifications for testing and testability were among the topics discussed. Various programs and tentative plans of the participating organizations in the development of high reliability LSI circuits are given.

  14. Reliability analysis and utilization of PEMs in space application

    NASA Astrophysics Data System (ADS)

    Jiang, Xiujie; Wang, Zhihua; Sun, Huixian; Chen, Xiaomin; Zhao, Tianlin; Yu, Guanghua; Zhou, Changyi

    2009-11-01

    More and more plastic encapsulated microcircuits (PEMs) are used in space missions to achieve high performance. Since PEMs are designed for use in terrestrial operating conditions, the successful use of PEMs in the harsh space environment is closely tied to reliability issues, which must be considered first. However, there is no ready-made methodology for PEMs in space applications. This paper discusses the reliability aspects of using PEMs in space. The analysis can be divided into five categories: radiation testing, radiation hardness, screening tests, reliability calculation, and reliability assessment. A case study is presented to illustrate the details of the process, in which a PEM part is used in Double-Star Project, a joint space program between the European Space Agency (ESA) and China. The influence of environmental constraints, including radiation, humidity, temperature, and mechanics, on the PEM part has been considered. Both Double-Star Project satellites are still running well in space now.

  15. DESIGN MANUAL - REMOVAL OF ARSENIC FROM DRINKING WATER SUPPLIES BY ION EXCHANGE

    EPA Science Inventory

    This design manual is an in-depth presentation of the steps required to design and operate a water treatment plant for removal of excess arsenic from drinking water using the anion exchange process. The treatment process is very reliable, simple and cost-effective. This design ...

  16. DESIGN MANUAL - REMOVAL OF ARSENIC FROM DRINKING WATER SUPPLIES BY ADSORPTIVE MEDIA

    EPA Science Inventory

    This design manual is an in-depth presentation of the steps required to design and operate a water treatment plant for removal of excess arsenic from drinking water using the adsorptive media process. The treatment process is very reliable, simple and cost-effective. The adsorpt...

  17. Reliability based fatigue design and maintenance procedures

    NASA Technical Reports Server (NTRS)

    Hanagud, S.

    1977-01-01

    A stochastic model has been developed to describe the probability of the fatigue process by assuming a varying hazard rate. This stochastic model can be used to obtain the probability of a crack of a certain length at a given location after a certain number of cycles or amount of time. Quantitative estimation with the developed model is also discussed. Application of the model to develop a procedure for reliability-based, cost-effective, fail-safe structural design is presented. This design procedure includes the reliability improvement due to inspection and repair. Methods of obtaining optimum inspection and maintenance schemes are treated.

  18. Reliability prediction of ontology-based service compositions using Petri net and time series models.

    PubMed

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets their quantitative quality requirements. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of the individual service components; fitting these series with an autoregressive moving-average (ARMA) model and predicting the future firing rates of the service components; mapping the OWL-S process onto an NMSPN model; and employing the predicted firing rates as the input of the NMSPN model and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between a static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy.
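
    The ARMA step of the framework can be sketched with statsmodels: fit an ARMA(1,1) model to a component's historical response times and forecast ahead. The synthetic series and the simple reciprocal mapping from predicted response time to an NMSPN firing rate are assumptions made for illustration.

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(3)
      response_ms = 120.0 + np.cumsum(0.5 * rng.standard_normal(200)) + 5.0 * rng.standard_normal(200)

      fit = ARIMA(response_ms, order=(1, 0, 1)).fit()    # ARMA(1,1): differencing order d = 0
      predicted_ms = fit.forecast(steps=10)              # future response times of the component

      firing_rates = 1000.0 / predicted_ms               # transitions per second, as model input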

  19. Applicability and Limitations of Reliability Allocation Methods

    NASA Technical Reports Server (NTRS)

    Cruz, Jose A.

    2016-01-01

    The reliability allocation process may be described as the process of assigning reliability requirements to individual components within a system to attain the specified system reliability. For large systems, the allocation is often performed at several stages of system design, beginning at the conceptual stage. As the design develops and more information about the components and the operating environment becomes available, different allocation methods can be considered. Reliability allocation methods are usually divided into two categories: weighting factors and optimal reliability allocation. When properly applied, these methods can produce reasonable approximations. Reliability allocation techniques have limitations and implied assumptions that need to be understood by system engineers; applying them without understanding those limitations and assumptions can produce unrealistic results. This report addresses weighting-factor and optimal reliability allocation techniques and identifies the applicability and limitations of each.
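
    A compact example of the weighting-factor category: for a series system, apportioning the system target as R_i = R_sys**w_i (with the weights summing to one) guarantees the component targets multiply back to the system requirement. The weights here are invented; in practice they come from predicted failure rates, complexity, or criticality.

      import numpy as np

      def allocate_series(R_sys, weights):
          """Allocate a series-system reliability target across components by weighting factors."""
          w = np.asarray(weights, dtype=float)
          w = w / w.sum()
          R_i = R_sys ** w
          assert np.isclose(np.prod(R_i), R_sys)   # series product recovers the system target
          return R_i

      # four components weighted by relative predicted failure rate
      print(allocate_series(0.95, [4.0, 3.0, 2.0, 1.0]))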

  20. RGCA: A Reliable GPU Cluster Architecture for Large-Scale Internet of Things Computing Based on Effective Performance-Energy Optimization

    PubMed Central

    Chen, Qingkui; Zhao, Deyu; Wang, Jingjuan

    2017-01-01

    This paper aims to develop a low-cost, high-performance and high-reliability computing system to process large-scale data using common data mining algorithms in the Internet of Things (IoT) computing environment. Considering the characteristics of IoT data processing, similar to mainstream high performance computing, we use a GPU (Graphics Processing Unit) cluster to achieve better IoT services. Firstly, we present an energy consumption calculation method (ECCM) based on WSNs. Then, using the CUDA (Compute Unified Device Architecture) Programming model, we propose a Two-level Parallel Optimization Model (TLPOM) which exploits reasonable resource planning and common compiler optimization techniques to obtain the best blocks and threads configuration considering the resource constraints of each node. The key to this part is dynamic coupling Thread-Level Parallelism (TLP) and Instruction-Level Parallelism (ILP) to improve the performance of the algorithms without additional energy consumption. Finally, combining the ECCM and the TLPOM, we use the Reliable GPU Cluster Architecture (RGCA) to obtain a high-reliability computing system considering the nodes’ diversity, algorithm characteristics, etc. The results show that the performance of the algorithms significantly increased by 34.1%, 33.96% and 24.07% for Fermi, Kepler and Maxwell on average with TLPOM and the RGCA ensures that our IoT computing system provides low-cost and high-reliability services. PMID:28777325

  1. RGCA: A Reliable GPU Cluster Architecture for Large-Scale Internet of Things Computing Based on Effective Performance-Energy Optimization.

    PubMed

    Fang, Yuling; Chen, Qingkui; Xiong, Neal N; Zhao, Deyu; Wang, Jingjuan

    2017-08-04

    This paper aims to develop a low-cost, high-performance and high-reliability computing system to process large-scale data using common data mining algorithms in the Internet of Things (IoT) computing environment. Considering the characteristics of IoT data processing, similar to mainstream high performance computing, we use a GPU (Graphics Processing Unit) cluster to achieve better IoT services. Firstly, we present an energy consumption calculation method (ECCM) based on WSNs. Then, using the CUDA (Compute Unified Device Architecture) Programming model, we propose a Two-level Parallel Optimization Model (TLPOM) which exploits reasonable resource planning and common compiler optimization techniques to obtain the best blocks and threads configuration considering the resource constraints of each node. The key to this part is dynamic coupling Thread-Level Parallelism (TLP) and Instruction-Level Parallelism (ILP) to improve the performance of the algorithms without additional energy consumption. Finally, combining the ECCM and the TLPOM, we use the Reliable GPU Cluster Architecture (RGCA) to obtain a high-reliability computing system considering the nodes' diversity, algorithm characteristics, etc. The results show that the performance of the algorithms significantly increased by 34.1%, 33.96% and 24.07% for Fermi, Kepler and Maxwell on average with TLPOM and the RGCA ensures that our IoT computing system provides low-cost and high-reliability services.

  2. A high voltage electrical power system for low Earth orbit applications

    NASA Technical Reports Server (NTRS)

    Lanier, J. R., Jr.; Bush, J. R., Jr.

    1984-01-01

    The results of testing a high voltage electrical power system (EPS) breadboard using high voltage power processing equipment developed at Marshall Space Flight Center and Ni-Cd batteries are discussed. These test results are used to extrapolate to an efficient, reliable, high capacity EPS for near term low Earth orbit, high power applications. EPS efficiencies, figures of merit, and battery reliability with a battery protection and reconditioning circuit are presented.

  3. Non-Technical Skills for Surgeons (NOTSS): Critical appraisal of its measurement properties.

    PubMed

    Jung, James J; Borkhoff, Cornelia M; Jüni, Peter; Grantcharov, Teodor P

    2018-02-17

    To critically appraise the development and measurement properties, including sensibility, reliability, and validity, of the Non-Technical Skills for Surgeons (NOTSS) system. Articles that described the development process of the NOTSS system were identified, and relevant primary studies presenting evidence of reliability and validity were identified through a comprehensive literature review. NOTSS was developed through robust item-generation and item-reduction strategies. It was shown to have good content validity, acceptability, and feasibility. Inter-rater reliability increased with greater expertise and number of assessors. Studies demonstrated evidence of cross-sectional construct validity, in that the tool was able to differentiate known groups of varied non-technical skill levels. Evidence of longitudinal construct validity also existed, demonstrating that NOTSS detected changes in non-technical skills before and after targeted training. In the populations and settings presented in our critical appraisal, NOTSS provided reliable and valid measurements of the intraoperative non-technical skills of surgeons. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. An efficient, reliable and inexpensive device for the rapid homogenization of multiple tissue samples by centrifugation.

    PubMed

    Ilyin, S E; Plata-Salamán, C R

    2000-02-15

    Homogenization of tissue samples is a common first step in the majority of current protocols for RNA, DNA, and protein isolation. This report describes a simple device for centrifugation-mediated homogenization of tissue samples. The method presented is applicable to RNA, DNA, and protein isolation, and we show examples where high quality total cell RNA, DNA, and protein were obtained from brain and other tissue samples. The advantages of the approach presented include: (1) a significant reduction in time investment relative to hand-driven or individual motorized-driven pestle homogenization; (2) easy construction of the device from inexpensive parts available in any laboratory; (3) high replicability in the processing; and (4) the capacity for the parallel processing of multiple tissue samples, thus allowing higher efficiency, reliability, and standardization.

  5. Proceedings of the 24th Project Integration Meeting

    NASA Technical Reports Server (NTRS)

    Tustin, D.

    1984-01-01

    Progress made by the Flat-Plate Solar Array Project is described. Reports on silicon sheet growth and characterization, silicon material, process development, high-efficiency cells, environmental isolation, engineering sciences, and reliability physics are presented along with copies of visual presentations made at the 24th Project Integration Meeting.

  6. Development of a nanosatellite de-orbiting system by reliability based design optimization

    NASA Astrophysics Data System (ADS)

    Nikbay, Melike; Acar, Pınar; Aslan, Alim Rüstem

    2015-12-01

    This paper presents design approaches to develop a reliable and efficient de-orbiting system for the 3USAT nanosatellite to provide a beneficial orbital decay process at the end of a mission. A de-orbiting system is initially designed by employing the aerodynamic drag augmentation principle where the structural constraints of the overall satellite system and the aerodynamic forces are taken into account. Next, an alternative de-orbiting system is designed with new considerations and further optimized using deterministic and reliability based design techniques. For the multi-objective design, the objectives are chosen to maximize the aerodynamic drag force through the maximization of the Kapton surface area while minimizing the de-orbiting system mass. The constraints are related in a deterministic manner to the required deployment force, the height of the solar panel hole and the deployment angle. The length and the number of layers of the deployable Kapton structure are used as optimization variables. In the second stage of this study, uncertainties related to both manufacturing and operating conditions of the deployable structure in space environment are considered. These uncertainties are then incorporated into the design process by using different probabilistic approaches such as Monte Carlo Simulation, the First-Order Reliability Method and the Second-Order Reliability Method. The reliability based design optimization seeks optimal solutions using the former design objectives and constraints with the inclusion of a reliability index. Finally, the de-orbiting system design alternatives generated by different approaches are investigated and the reliability based optimum design is found to yield the best solution since it significantly improves both system reliability and performance requirements.
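
    The Monte Carlo step named above can be sketched in a few lines: sample an assumed limit state (capacity minus demand of the deployment force), estimate the failure probability, and convert it to a reliability index. The distributions and numbers are invented; FORM/SORM replace the sampling with analytical approximations around the design point.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(4)
      N = 1_000_000

      capacity = rng.normal(10.0, 1.0, N)    # assumed spring deployment force capacity (N)
      demand = rng.normal(7.0, 1.5, N)       # assumed required force incl. hinge friction (N)

      pf = np.mean(capacity - demand < 0.0)  # Monte Carlo failure probability
      beta = -norm.ppf(pf)                   # equivalent reliability index
      print(f"P(failure) = {pf:.2e}, beta = {beta:.2f}")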

  7. Reliability Quantification of the Flexure: A Critical Stirling Convertor Component

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Zampino, Edward J.

    2004-01-01

    Uncertainties in the manufacturing, fabrication process, material behavior, loads, and boundary conditions results in the variation of the stresses and strains induced in the flexures and its fatigue life. Past experience and the test data at material coupon levels revealed a significant amount of scatter of the fatigue life. Owing to these facts, the design of the flexure, using conventional approaches based on safety factor or traditional reliability based on similar equipment considerations does not provide a direct measure of reliability. Additionally, it may not be feasible to run actual long term fatigue tests due to cost and time constraints. Therefore it is difficult to ascertain material fatigue strength limit. The objective of the paper is to present a methodology and quantified results of numerical simulation for the reliability of flexures used in the Stirling convertor for their structural performance. The proposed approach is based on application of finite element analysis method in combination with the random fatigue limit model, which includes uncertainties in material fatigue life. Additionally, sensitivity of fatigue life reliability to the design variables is quantified and its use to develop guidelines to improve design, manufacturing, quality control and inspection design process is described.

  8. Towards automatic Markov reliability modeling of computer architectures

    NASA Technical Reports Server (NTRS)

    Liceaga, C. A.; Siewiorek, D. P.

    1986-01-01

    The analysis and evaluation of reliability measures using time-varying Markov models is required for Processor-Memory-Switch (PMS) structures that have competing processes, such as standby redundancy and repair, or renewal processes, such as transient or intermittent faults. The task of generating these models is tedious and prone to human error due to the large number of states and transitions involved in any reasonable system. Model formulation is therefore a major analysis bottleneck, and model verification a major validation problem. The general unfamiliarity of computer architects with Markov modeling techniques further increases the need to automate the model formulation. This paper presents an overview of the Automated Reliability Modeling (ARM) program, under development at NASA Langley Research Center. ARM will accept as input a description of the PMS interconnection graph, the behavior of the PMS components, the fault-tolerant strategies, and the operational requirements. The output of ARM will be the reliability or availability Markov model, formulated for direct use by evaluation programs. The advantages of such an approach are (a) utility to a large class of users, not necessarily expert in reliability analysis, and (b) a lower probability of human error in the computation.
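
    Independently of ARM's specifics, the kind of model it formulates can be sketched directly: a small continuous-time Markov chain for a duplex system with repair, solved by the matrix exponential. The failure and repair rates are assumptions.

      import numpy as np
      from scipy.linalg import expm

      lam, mu = 1e-4, 1e-2    # assumed failure and repair rates (per hour)

      # states: 0 = both units up, 1 = one unit up (standby engaged), 2 = system failed (absorbing)
      Q = np.array([[-2 * lam,  2 * lam,         0.0],
                    [      mu,  -(mu + lam),     lam],
                    [     0.0,         0.0,      0.0]])

      p0 = np.array([1.0, 0.0, 0.0])
      t = 10_000.0                        # mission time (hours)
      p_t = p0 @ expm(Q * t)              # state probabilities at time t
      print(f"reliability R({t:.0f} h) = {1.0 - p_t[2]:.6f}")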

  9. GPS Data Filtration Method for Drive Cycle Analysis Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duran, A.; Earleywine, M.

    2013-02-01

    When employing GPS data acquisition systems to capture vehicle drive-cycle information, a number of errors often appear in the raw data samples, such as sudden signal loss, extraneous or outlying data points, speed drift, and signal white noise, all of which limit the quality of field data for use in downstream applications. Unaddressed, these errors significantly impact the reliability of source data and limit the effectiveness of traditional drive-cycle analysis approaches and vehicle simulation software. Without reliable speed and time information, the validity of derived drive-cycle metrics, such as acceleration, power, and distance, becomes questionable. This study explores some of the common sources of error present in raw onboard GPS data and presents a detailed filtering process designed to correct for these issues. Test data from both light-duty and medium/heavy-duty applications are examined to illustrate the effectiveness of the proposed filtration process across the range of vehicle vocations. Graphical comparisons of raw and filtered cycles are presented, and statistical analyses are performed to determine the effects of the proposed filtration process on raw data. Finally, an evaluation of the overall benefits of filtering raw GPS data is presented, along with potential areas for continued research.
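
    A minimal sketch of the kind of cleanup such a filtering process performs: flag speed samples implying physically impossible accelerations, replace them by interpolation, then median-filter residual noise. The acceleration bound and kernel size are assumptions, not the study's published parameters.

      import numpy as np
      from scipy.signal import medfilt

      def clean_speed_trace(time_s, speed_mps, max_accel=6.0):
          """Remove outliers and smooth a GPS speed trace (arrays of seconds and m/s)."""
          t = np.asarray(time_s, dtype=float)
          v = np.asarray(speed_mps, dtype=float).copy()

          accel = np.abs(np.diff(v) / np.diff(t))
          bad = np.concatenate([[False], accel > max_accel])    # points implying impossible jumps
          v[bad] = np.interp(t[bad], t[~bad], v[~bad])          # patch by linear interpolation

          return medfilt(v, kernel_size=5)                      # suppress residual white noise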

  10. Designing a Likert-Type Scale to Predict Environmentally Responsible Behavior in Undergraduate Students: A Multistep Process.

    ERIC Educational Resources Information Center

    Smith-Sebasto, N. J.; D'Costa, Ayres

    1995-01-01

    Describes an attempt to develop a reliable and valid instrument to assess the relationship between locus of control of reinforcement and environmentally responsible behavior. Presents a six-step psychometric process used to develop the Environmental Action Internal Control Index (EAICI) for undergraduate students. Contains 54 references. (JRH)

  11. Toward reliable and repeatable automated STEM-EDS metrology with high throughput

    NASA Astrophysics Data System (ADS)

    Zhong, Zhenxin; Donald, Jason; Dutrow, Gavin; Roller, Justin; Ugurlu, Ozan; Verheijen, Martin; Bidiuk, Oleksii

    2018-03-01

    New materials and designs in complex 3D architectures in logic and memory devices have raised the complexity of S/TEM metrology. In this paper, we report on a newly developed, automated, scanning transmission electron microscopy (STEM) based energy-dispersive X-ray spectroscopy (STEM-EDS) metrology method that addresses these challenges. Different methodologies for repeatable and efficient automated STEM-EDS metrology with high throughput are presented: we introduce the best-known auto-EDS acquisition and quantification methods for robust and reliable metrology, and show how electron exposure dose impacts EDS metrology reproducibility, either through poor signal-to-noise ratio (SNR) at low dose or through sample modification at high dose. Finally, we discuss the limitations of the STEM-EDS metrology technique and propose strategies to optimize the process in terms of both throughput and metrology reliability.

  12. An enhanced reliability-oriented workforce planning model for process industry using combined fuzzy goal programming and differential evolution approach

    NASA Astrophysics Data System (ADS)

    Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.

    2018-03-01

    This paper draws on the "human reliability" concept as a structure for gaining insight into maintenance workforce assessment in a process industry. Human reliability hinges on developing the reliability of humans to a threshold that guides the maintenance workforce to execute accurate decisions within the limits of resource and time allocations. The concept offers a worthwhile point of departure for three adjustments to the literature model, in terms of maintenance time, workforce performance, and return on workforce investment, which together frame the contribution of this work. The presented structure breaks new ground in maintenance workforce theory and practice from a number of perspectives. First, we implement fuzzy goal programming (FGP) and differential evolution (DE) techniques for the solution of an optimisation problem in the maintenance of a process plant for the first time. The results obtained in this work showed better solution quality from the DE algorithm compared with the genetic algorithm and the particle swarm optimisation algorithm, demonstrating the superiority of the proposed procedure. Second, the analytical discourse, framed on stochastic theory and focused on a specific application to a process plant in Nigeria, is a novelty. The work provides more insight into maintenance workforce planning during overhaul, rework, and overtime maintenance activities in manufacturing systems, and demonstrates the capacity to generate substantially helpful information for practice.
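
    A sketch of the differential evolution side of the approach, using SciPy's implementation on an invented workforce-cost objective; the objective's functional form, rates, and penalty are placeholders, not the paper's model.

      import numpy as np
      from scipy.optimize import differential_evolution

      def workforce_cost(x):
          """Toy objective: staffing vector x -> wages + time penalty + reliability shortfall."""
          maintenance_time = 400.0 / (1.0 + x.sum())        # more workers, shorter maintenance
          reliability = 1.0 - np.exp(-0.05 * x.sum())       # diminishing returns on workforce
          wages = 12.0 * x.sum()
          return wages + 2.0 * maintenance_time + 500.0 * max(0.0, 0.90 - reliability)

      bounds = [(0.0, 20.0)] * 3   # three trades: mechanical, electrical, instrumentation
      result = differential_evolution(workforce_cost, bounds, seed=5)
      print(result.x, result.fun)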

  13. All about Listening.

    ERIC Educational Resources Information Center

    Grunkemeyer, Florence B.

    1992-01-01

    Discusses the importance of effective listening and problems in the listening process. Presents a matrix evaluating 18 listening inventories on 8 criteria: cost effectiveness, educational use, business use, reliability, validity, adult audience, high school audience, and potential barriers. (JOW)

  14. Cross-language parafoveal semantic processing: Evidence from Korean-Chinese bilinguals.

    PubMed

    Wang, Aiping; Yeon, Junmo; Zhou, Wei; Shu, Hua; Yan, Ming

    2016-02-01

    In the present study, we aimed to test cross-language cognate and semantic preview effects. We tested how native Korean readers who learned Chinese as a second language make use of parafoveal information during the reading of Chinese sentences. There were three types of Korean preview words: cognate translations of the Chinese target words, semantically related noncognate words, and unrelated words. Together with a highly significant cognate preview effect, and more critically, we also observed reliable facilitation in the processing of the target word from the semantically related previews in all fixation measures. The results of the present study provide the first evidence for semantic processing of parafoveally presented Korean words and for cross-language parafoveal semantic processing.

  15. The neural processing of hierarchical structure in music and speech at different timescales

    PubMed Central

    Farbood, Morwaread M.; Heeger, David J.; Marcus, Gary; Hasson, Uri; Lerner, Yulia

    2015-01-01

    Music, like speech, is a complex auditory signal that contains structures at multiple timescales, and as such is a potentially powerful entry point into the question of how the brain integrates complex streams of information. Using an experimental design modeled after previous studies that used scrambled versions of a spoken story (Lerner et al., 2011) and a silent movie (Hasson et al., 2008), we investigate whether listeners perceive hierarchical structure in music beyond short (~6 s) time windows and whether there is cortical overlap between music and language processing at multiple timescales. Experienced pianists were presented with an extended musical excerpt scrambled at multiple timescales—by measure, phrase, and section—while measuring brain activity with functional magnetic resonance imaging (fMRI). The reliability of evoked activity, as quantified by inter-subject correlation of the fMRI responses, was measured. We found that response reliability depended systematically on musical structure coherence, revealing a topographically organized hierarchy of processing timescales. Early auditory areas (at the bottom of the hierarchy) responded reliably in all conditions. For brain areas at the top of the hierarchy, the original (unscrambled) excerpt evoked more reliable responses than any of the scrambled excerpts, indicating that these brain areas process long-timescale musical structures, on the order of minutes. The topography of processing timescales was analogous with that reported previously for speech, but the timescale gradients for music and speech overlapped with one another only partially, suggesting that temporally analogous structures—words/measures, sentences/musical phrases, paragraph/sections—are processed separately. PMID:26029037
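
    The reliability measure used in this study, inter-subject correlation, reduces to averaging pairwise Pearson correlations of subjects' response time courses. The sketch below shows that computation on synthetic data; real analyses operate voxel-wise or region-wise on fMRI time series.

      import numpy as np

      def inter_subject_correlation(ts):
          """ts: subjects x timepoints for one voxel/region; returns mean pairwise Pearson r."""
          r = np.corrcoef(ts)
          iu = np.triu_indices_from(r, k=1)
          return r[iu].mean()

      rng = np.random.default_rng(6)
      shared = rng.standard_normal(300)                         # stimulus-driven component
      subjects = shared + 0.8 * rng.standard_normal((10, 300))  # 10 subjects with private noise
      print(inter_subject_correlation(subjects))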

  16. The neural processing of hierarchical structure in music and speech at different timescales.

    PubMed

    Farbood, Morwaread M; Heeger, David J; Marcus, Gary; Hasson, Uri; Lerner, Yulia

    2015-01-01

    Music, like speech, is a complex auditory signal that contains structures at multiple timescales, and as such is a potentially powerful entry point into the question of how the brain integrates complex streams of information. Using an experimental design modeled after previous studies that used scrambled versions of a spoken story (Lerner et al., 2011) and a silent movie (Hasson et al., 2008), we investigate whether listeners perceive hierarchical structure in music beyond short (~6 s) time windows and whether there is cortical overlap between music and language processing at multiple timescales. Experienced pianists were presented with an extended musical excerpt scrambled at multiple timescales-by measure, phrase, and section-while measuring brain activity with functional magnetic resonance imaging (fMRI). The reliability of evoked activity, as quantified by inter-subject correlation of the fMRI responses, was measured. We found that response reliability depended systematically on musical structure coherence, revealing a topographically organized hierarchy of processing timescales. Early auditory areas (at the bottom of the hierarchy) responded reliably in all conditions. For brain areas at the top of the hierarchy, the original (unscrambled) excerpt evoked more reliable responses than any of the scrambled excerpts, indicating that these brain areas process long-timescale musical structures, on the order of minutes. The topography of processing timescales was analogous with that reported previously for speech, but the timescale gradients for music and speech overlapped with one another only partially, suggesting that temporally analogous structures-words/measures, sentences/musical phrases, paragraph/sections-are processed separately.

  17. Interrater reliability to assure valid content in peer review of CME-accredited presentations.

    PubMed

    Quigg, Mark; Lado, Fred A

    2009-01-01

    The Accreditation Council for Continuing Medical Education (ACCME) provides guidelines for continuing medical education (CME) materials to mitigate problems in the independence or validity of content in certified activities; however, the process of peer review of materials appears largely unstudied, and the reproducibility of peer-review audits for ACCME accreditation and designation of American Medical Association Category 1 Credit(TM) is unknown. Categories of presentation defects were constructed from discussions of the CME committee of the American Epilepsy Society: (1) insufficient citation, (2) poor formatting, (3) nonacknowledgment of non-FDA-approved use, (4) misapplied data, (5) 1-sided data, (6) self- or institutional promotion, (7) conflict of interest/commercial bias, (8) other, or (9) no defect. A PowerPoint lecture (n = 29 slides) suitable for presentation to general neurologists was purposefully created with the above defects. A multirater, multilevel kappa statistic was determined from the number and category of defects. Of 14 reviewers, 12 returned completed surveys (86%), identifying a mean +/- standard deviation of 1.6 +/- 1.1 defects/slide. The interrater kappa equaled 0.115 (poor reliability) for the number of defects per slide, and no individual category achieved kappa > 0.38. Interrater reliability in the rating of durable materials used in subspecialty CME was thus poor. Guidelines for appropriate CME content are too subjective to be applied reliably by raters knowledgeable in their specialty field but relatively untrained in the specifics of CME requirements. The process of peer review of CME materials would be aided by educating physicians on the validation of materials appropriate for CME.
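
    For reference, the multirater agreement statistic reported above is in the family of Fleiss' kappa, which can be computed as below from a subjects x categories count matrix; the exact multilevel variant used in the study may differ, and the example matrix is invented.

      import numpy as np

      def fleiss_kappa(counts):
          """counts[i, j] = raters assigning subject i to category j (equal raters per subject)."""
          counts = np.asarray(counts, dtype=float)
          n_sub = counts.shape[0]
          n_rat = counts.sum(axis=1)[0]
          p_j = counts.sum(axis=0) / (n_sub * n_rat)                        # category prevalence
          P_i = ((counts ** 2).sum(axis=1) - n_rat) / (n_rat * (n_rat - 1))  # per-subject agreement
          P_bar, P_e = P_i.mean(), (p_j ** 2).sum()
          return (P_bar - P_e) / (1.0 - P_e)

      # e.g., 5 slides x 3 defect categories, 12 raters each (hypothetical)
      counts = np.array([[8, 3, 1], [2, 9, 1], [4, 4, 4], [11, 1, 0], [3, 3, 6]])
      print(fleiss_kappa(counts))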

  18. Effect of Sensors on the Reliability and Control Performance of Power Circuits in the Web of Things (WoT)

    PubMed Central

    Bae, Sungwoo; Kim, Myungchin

    2016-01-01

    In order to realize a true WoT environment, a reliable power circuit is required to ensure interconnections among a range of WoT devices. This paper presents research on sensors and their effects on the reliability and response characteristics of power circuits in WoT devices. The presented research can be used in various power circuit applications, such as energy harvesting interfaces, photovoltaic systems, and battery management systems for WoT devices. As power circuits rely on feedback from voltage/current sensors, system performance is likely to be affected by the sensor failure rates, sensor dynamic characteristics, and their interface circuits. Through a quantitative reliability analysis, this study investigated how the operational availability of power circuits is affected by sensor failure rates. In the analysis process, this paper also includes the effects of various reconstruction and estimation techniques used in power processing circuits (e.g., energy harvesting circuits and photovoltaic systems). This paper also reports how the transient control performance of power circuits is affected by sensor interface circuits. Frequency-domain stability analysis and circuit simulation verified that the interface circuit dynamics may affect the transient response characteristics of power circuits. The verification results showed that the reliability and control performance of power circuits can be affected by the sensor types, the fault-tolerant approaches against sensor failures, and the response characteristics of the sensor interfaces. The analysis results were also verified by experiments using a power circuit prototype. PMID:27608020
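
    The paper's full reliability analysis is not reproduced here; the sketch below only illustrates the underlying idea that sensor failures degrade the operational availability of the overall circuit, using steady-state availability and a simple series model. All MTBF/MTTR figures are assumed, not taken from the paper.

```python
import numpy as np

def availability(mtbf_hours, mttr_hours):
    """Steady-state availability A = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def series_availability(availabilities):
    """A series system is up only if every element is up."""
    return np.prod(availabilities)

# Toy usage: converter electronics plus a voltage and a current sensor.
converter = availability(mtbf_hours=200_000, mttr_hours=24)
v_sensor = availability(mtbf_hours=80_000, mttr_hours=24)
i_sensor = availability(mtbf_hours=80_000, mttr_hours=24)
print(series_availability([converter, v_sensor, i_sensor]))
```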

  19. Progress in reliable single emitters and laser bars for efficient CW-operation in the near-infrared emission range

    NASA Astrophysics Data System (ADS)

    Zorn, Martin; Hülsewede, Ralf; Pietrzak, Agnieszka; Meusel, Jens; Sebastian, Jürgen

    2015-03-01

    Laser bars, laser arrays, and single emitters are highly desired light sources, e.g., for direct material processing, pump sources for solid-state and fiber lasers, and medical applications. These sources require high output powers with optimal efficiency together with good reliability, resulting in a long device lifetime. Desired wavelengths range from 760 nm for esthetic skin treatment, through 915 nm, 940 nm, and 976 nm, to 1030 nm for direct material processing and pumping applications. In this publication we present our latest developments for the different application-defined wavelengths in continuous-wave operation mode. At 760 nm, laser bars with a 30% filling factor and 1.5 mm resonator length show optical output powers of around 90-100 W using an optimized design. For longer wavelengths between 915 nm and 1030 nm, laser bars with 4 mm resonator length and 50% filling factor show reliable output powers above 200 W. The efficiency reached is above 60% and the slow-axis divergence (95% power content) is below 7°. Further developments of bars tailored for 940 nm emission wavelength reach output powers of 350 W. Reliable single emitters for effective fiber coupling, with emitter widths of 90 μm and 195 μm, are presented. They emit optical powers of 12 W and 24 W, respectively, at emission wavelengths of 915 nm, 940 nm, and 976 nm. Moreover, reliability tests of 90 μm single emitters at a power level of 12 W currently show a lifetime of over 3500 h.

  20. Multiple serial picture presentation with millisecond resolution using a three-way LC-shutter-tachistoscope

    PubMed Central

    Fischmeister, Florian Ph.S.; Leodolter, Ulrich; Windischberger, Christian; Kasess, Christian H.; Schöpf, Veronika; Moser, Ewald; Bauer, Herbert

    2010-01-01

    Throughout recent years there has been increasing interest in studying unconscious visual processes. Such conditions of unawareness are typically achieved either by a sufficient reduction of the stimulus presentation time or by visual masking. However, there are growing concerns about the reliability of the presentation devices used. As these devices show great variability in presentation parameters, the processing of visual stimuli becomes dependent on the display device; e.g., minimal changes in the physical stimulus properties may have an enormous impact on stimulus processing by the sensory system and on the actual experience of the stimulus. Here we present a custom-built three-way LC-shutter-tachistoscope which allows experimental setups with both precise and reliable stimulus delivery and millisecond resolution. This tachistoscope consists of three LCD projectors equipped with zoom lenses to enable stimulus presentation, via a built-in mirror system, onto a back-projection screen from an adjacent room. Two high-speed liquid crystal shutters are mounted serially in front of each projector to control the stimulus duration. To verify the intended properties empirically, different sequences of presentation times were run while changes in optical power were measured using a photoreceiver. The obtained results demonstrate that interfering variability in stimulus parameters and stimulus rendering is markedly reduced. Together with the possibility of collecting external signals and sending trigger signals to other devices, this tachistoscope represents a highly flexible and easy-to-set-up research tool, not only for the study of unconscious processing in the brain but for vision research in general. PMID:20122963

  1. Uncertainty Analysis for Peer Assessment: Oral Presentation Skills for Final Year Project

    ERIC Educational Resources Information Center

    Kim, Ho Sung

    2014-01-01

    Peer assessment plays an important role in engineering education for an active involvement in the assessment process, developing autonomy, enhancing reflection, and understanding of how to achieve the learning outcomes. Peer assessment uncertainty for oral presentation skills as part of the FYP assessment is studied. Validity and reliability for…

  2. Reliability Prediction of Ontology-Based Service Compositions Using Petri Net and Time Series Models

    PubMed Central

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether a process meets their quantitative quality requirements. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving-average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the input of the NMSPN model and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between a static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy. PMID:24688429
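
    A minimal sketch of the time-series step described above: fit an ARMA model to a response-time series and turn the forecast into a firing-rate estimate. It assumes the statsmodels library and synthetic data, and omits the NMSPN mapping entirely.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Toy history of per-invocation response times (seconds) for one
# service component; in the paper these come from execution logs.
rng = np.random.default_rng(1)
drift = 0.01 * rng.standard_normal(200).cumsum()   # slow load variation
noise = 0.05 * rng.standard_normal(200)
response_times = 0.8 + drift + noise

# Fit an ARMA(1, 1) model (an ARIMA model with d = 0) and forecast the
# next 10 response times; a firing rate for the Petri net transition is
# taken here as the reciprocal of the mean predicted response time.
model = ARIMA(response_times, order=(1, 0, 1)).fit()
predicted = model.forecast(steps=10)
firing_rate = 1.0 / predicted.mean()
print(f"predicted firing rate: {firing_rate:.3f} per second")
```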

  3. Reconciling Streamflow Uncertainty Estimation and River Bed Morphology Dynamics. Insights from a Probabilistic Assessment of Streamflow Uncertainties Using a Reliability Diagram

    NASA Astrophysics Data System (ADS)

    Morlot, T.; Mathevet, T.; Perret, C.; Favre Pugin, A. C.

    2014-12-01

    Streamflow uncertainty estimation has recently received considerable attention in the literature. A dynamic rating curve assessment method has been introduced (Morlot et al., 2014). This dynamic method computes a rating curve for each gauging and a continuous streamflow time series, while calculating streamflow uncertainties. Streamflow uncertainty takes into account many sources of uncertainty (water level, rating curve interpolation and extrapolation, gauging aging, etc.) and produces an estimated distribution of streamflow for each day. In order to characterise streamflow uncertainty, a probabilistic framework has been applied to a large sample of hydrometric stations of the Division Technique Générale (DTG) of Électricité de France (EDF) hydrometric network (>250 stations) in France. A reliability diagram (Wilks, 1995) has been constructed for some stations, based on the streamflow distribution estimated for a given day and compared to a real streamflow observation estimated via a gauging. To build a reliability diagram, we computed the probability of an observed streamflow (gauging) given the estimated streamflow distribution. The reliability diagram then allows one to check that the distribution of probabilities of non-exceedance of the gaugings follows a uniform law (i.e., quantiles should be equiprobable). Given the shape of the reliability diagram, the probabilistic calibration is characterised (underdispersion, overdispersion, bias) (Thyer et al., 2009). In this paper, we present case studies where reliability diagrams have different statistical properties for different periods. Compared with our knowledge of the river bed morphology dynamics of these hydrometric stations, we show how the reliability diagram gives us invaluable information on river bed movements, such as continuous digging or backfilling of the hydraulic control due to erosion or sedimentation processes. Hence, careful analysis of reliability diagrams allows us to reconcile statistics and long-term river bed morphology processes. This knowledge improves our real-time management of hydrometric stations, given a better characterisation of erosion/sedimentation processes and of the stability of each station's hydraulic control.
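
    A minimal sketch of the probabilistic check described above: compute the probability of non-exceedance of each gauging under the estimated streamflow distribution and verify that the resulting values are roughly uniform. Representing each day's distribution as an ensemble of samples, and the toy numbers, are assumptions for illustration.

```python
import numpy as np

def pit_values(ensembles, observations):
    """Probability integral transform: for each day, the fraction of
    the estimated streamflow distribution lying below the gauging.

    ensembles: (n_days, n_members) samples from the estimated
    streamflow distribution; observations: (n_days,) gaugings.
    """
    return (ensembles < observations[:, None]).mean(axis=1)

def reliability_counts(pit, n_bins=10):
    """Histogram of PIT values; a well-calibrated estimate yields
    roughly equal counts in every bin (a uniform law)."""
    counts, _ = np.histogram(pit, bins=n_bins, range=(0.0, 1.0))
    return counts

# Toy usage: perfectly calibrated ensembles give a flat histogram.
rng = np.random.default_rng(2)
ens = rng.normal(100, 10, size=(365, 50))
obs = rng.normal(100, 10, size=365)
print(reliability_counts(pit_values(ens, obs)))
```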

  4. Multi-mode reliability-based design of horizontal curves.

    PubMed

    Essa, Mohamed; Sayed, Tarek; Hussein, Mohamed

    2016-08-01

    Recently, reliability analysis has been advocated as an effective approach to account for uncertainty in the geometric design process and to evaluate the risk associated with a particular design. In this approach, a risk measure (e.g. probability of noncompliance) is calculated to represent the probability that a specific design would not meet standard requirements. The majority of previous applications of reliability analysis in geometric design focused on evaluating the probability of noncompliance for only one mode of noncompliance such as insufficient sight distance. However, in many design situations, more than one mode of noncompliance may be present (e.g. insufficient sight distance and vehicle skidding at horizontal curves). In these situations, utilizing a multi-mode reliability approach that considers more than one failure (noncompliance) mode is required. The main objective of this paper is to demonstrate the application of multi-mode (system) reliability analysis to the design of horizontal curves. The process is demonstrated by a case study of Sea-to-Sky Highway located between Vancouver and Whistler, in southern British Columbia, Canada. Two noncompliance modes were considered: insufficient sight distance and vehicle skidding. The results show the importance of accounting for several noncompliance modes in the reliability model. The system reliability concept could be used in future studies to calibrate the design of various design elements in order to achieve consistent safety levels based on all possible modes of noncompliance.
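
    The paper's reliability calculations are not reproduced here; the sketch below uses crude Monte Carlo with assumed (hypothetical) distributions only to show how a system-level probability of noncompliance combines the two modes named above.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Assumed random variables, for illustration only: available vs
# required stopping sight distance, and supplied vs demanded side
# friction on a horizontal curve.
ssd_available = rng.normal(160, 15, n)    # metres
ssd_required = rng.normal(140, 10, n)     # metres
friction_supply = rng.normal(0.30, 0.03, n)
friction_demand = rng.normal(0.24, 0.03, n)

mode_sight = ssd_available < ssd_required       # insufficient sight distance
mode_skid = friction_supply < friction_demand   # vehicle skidding

p_sight = mode_sight.mean()
p_skid = mode_skid.mean()
p_system = (mode_sight | mode_skid).mean()      # either mode fails the design
print(p_sight, p_skid, p_system)
```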

  5. Building a Library Web Server on a Budget.

    ERIC Educational Resources Information Center

    Orr, Giles

    1998-01-01

    Presents a method for libraries with limited budgets to create reliable Web servers with existing hardware and free software available via the Internet. Discusses staff, hardware and software requirements, and security; outlines the assembly process. (PEN)

  6. Probing Reliability of Transport Phenomena Based Heat Transfer and Fluid Flow Analysis in Autogeneous Fusion Welding Process

    NASA Astrophysics Data System (ADS)

    Bag, S.; de, A.

    2010-09-01

    The transport-phenomena-based heat transfer and fluid flow calculations in a weld pool require a number of input parameters. Arc efficiency, effective thermal conductivity, and viscosity in the weld pool are some of these parameters, values of which are rarely known and are difficult to assign a priori based on scientific principles alone. The present work reports a bi-directional three-dimensional (3-D) heat transfer and fluid flow model, which is integrated with a real-number-based genetic algorithm. The bi-directional feature of the integrated model allows the identification of the values of a required set of uncertain model input parameters and, next, the design of process parameters to achieve a target weld pool dimension. The computed values are validated against measured results in linear gas-tungsten-arc (GTA) weld samples. Furthermore, a novel methodology to estimate the overall reliability of the computed solutions is also presented.
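
    As an illustration of the bi-directional idea, identifying uncertain model inputs so that predicted weld pool dimensions match measurements, the sketch below uses SciPy's differential evolution (an evolutionary method standing in for the paper's real-number genetic algorithm) with a toy algebraic surrogate in place of the 3-D heat transfer and fluid flow model. All functional forms and numbers are assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

def forward_model(params):
    """Toy surrogate for the forward model: maps the uncertain inputs
    (arc efficiency, effective conductivity ratio) to weld pool width
    and depth in mm. A real study would run the 3-D CFD model here."""
    eta, k_ratio = params
    width = 8.0 * eta * (1.0 + 0.1 * k_ratio)
    depth = 3.0 * eta / np.sqrt(k_ratio)
    return np.array([width, depth])

measured = np.array([5.6, 1.9])   # measured pool width and depth (mm)

def misfit(params):
    """Squared mismatch between predicted and measured dimensions."""
    return np.sum((forward_model(params) - measured) ** 2)

# Evolutionary search over plausible bounds for the two parameters.
result = differential_evolution(misfit, bounds=[(0.5, 0.9), (1.0, 3.0)])
print(result.x)   # identified arc efficiency and conductivity ratio
```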

  7. Expert system verification and validation study. Delivery 3A and 3B: Trip summaries

    NASA Technical Reports Server (NTRS)

    French, Scott

    1991-01-01

    Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to the verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation'. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.

  8. Design and control strategy for a hybrid green energy system for mobile telecommunication sites

    NASA Astrophysics Data System (ADS)

    Okundamiya, Michael S.; Emagbetere, Joy O.; Ogujor, Emmanuel A.

    2014-07-01

    The rising energy costs and carbon footprint of operating mobile telecommunication sites in the emerging world have increased research interest in green technology. The intermittent nature of most green energy sources creates the problem of designing the optimum configuration for a given location. This study presents the design analysis and control strategy for cost-effective and reliable operation of a hybrid green energy system (HGES) for GSM base transceiver station (BTS) sites in isolated regions. The design constrains the generation and distribution of power to reliably satisfy the energy demand while ensuring safe operation of the system. The overall process control applies a genetic algorithm-based technique for optimal techno-economic sizing of the system's components. The process simulation utilized meteorological data for three locations (Abuja, Benin City, and Sokoto) with varying climatic conditions in Nigeria. Simulation results presented for green GSM BTS sites are discussed and compared with existing approaches.

  9. High-power VCSEL systems and applications

    NASA Astrophysics Data System (ADS)

    Moench, Holger; Conrads, Ralf; Deppe, Carsten; Derra, Guenther; Gronenborn, Stephan; Gu, Xi; Heusler, Gero; Kolb, Johanna; Miller, Michael; Pekarski, Pavel; Pollmann-Retsch, Jens; Pruijmboom, Armand; Weichmann, Ulrich

    2015-03-01

    Easy system design, compactness, and a uniform power distribution define the basic advantages of high-power VCSEL systems. Full addressability in space and time adds new dimensions for optimization and enables "digital photonic production". Many thermal processes benefit from the improved control, i.e., heat is applied exactly where and when it is needed. The compact VCSEL systems can be integrated into most manufacturing equipment, replacing batch processes that use large furnaces and reducing energy consumption. This paper will present how recent technological development of high-power VCSEL systems will extend the efficiency and flexibility of thermal processes and replace not only laser systems, lamps, and furnaces but also enable new ways of production. High-power VCSEL systems are made from many VCSEL chips, each comprising thousands of low-power VCSELs. Systems scalable in power from watts to multiple tens of kilowatts and with various form factors utilize a common modular building-block concept. Designs for reliable high-power VCSEL arrays and systems can be developed and tested at each building-block level and benefit from the low power density and excellent reliability of the VCSELs. Furthermore, advanced assembly concepts aim to reduce the number of individual processes and components and make the whole system even simpler and more reliable.

  10. Closed-form solution of decomposable stochastic models

    NASA Technical Reports Server (NTRS)

    Sjogren, Jon A.

    1990-01-01

    Markov and semi-Markov processes are increasingly being used in the modeling of complex reconfigurable systems (fault tolerant computers). The estimation of the reliability (or some measure of performance) of the system reduces to solving the process for its state probabilities. Such a model may exhibit numerous states and complicated transition distributions, contributing to an expensive and numerically delicate solution procedure. Thus, when a system exhibits a decomposition property, either structurally (autonomous subsystems), or behaviorally (component failure versus reconfiguration), it is desirable to exploit this decomposition in the reliability calculation. In interesting cases there can be failure states which arise from non-failure states of the subsystems. Equations are presented which allow the computation of failure probabilities of the total (combined) model without requiring a complete solution of the combined model. This material is presented within the context of closed-form functional representation of probabilities as utilized in the Symbolic Hierarchical Automated Reliability and Performance Evaluator (SHARPE) tool. The techniques adopted enable one to compute such probability functions for a much wider class of systems at a reduced computational cost. Several examples show how the method is used, especially in enhancing the versatility of the SHARPE tool.
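
    A minimal sketch of the kind of Markov reliability model discussed above: a three-state generator matrix solved for its transient state probabilities via the matrix exponential. The states, rates, and mission time are assumptions, and SHARPE's closed-form symbolic approach is not reproduced, only the underlying numerical computation it avoids.

```python
import numpy as np
from scipy.linalg import expm

# Toy three-state Markov reliability model of a duplex system:
# state 0 = both units up, state 1 = one unit failed (reconfigured),
# state 2 = system failed (absorbing).
lam = 1e-3   # per-unit failure rate (1/h)
mu = 1e-4    # additional rate of unsuccessful reconfiguration (1/h)
Q = np.array([[-2 * lam, 2 * lam, 0.0],
              [0.0, -(lam + mu), lam + mu],
              [0.0, 0.0, 0.0]])   # generator matrix; rows sum to zero

p0 = np.array([1.0, 0.0, 0.0])    # start with both units working
t = 1000.0                        # mission time (hours)
p_t = p0 @ expm(Q * t)            # transient state probabilities at t

print("state probabilities:", p_t)
print("reliability R(t):", 1.0 - p_t[2])
```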

  11. Counting pollen grains using readily available, free image processing and analysis software.

    PubMed

    Costa, Clayton M; Yang, Suann

    2009-10-01

    Although many methods exist for quantifying the number of pollen grains in a sample, there are few standard methods that are user-friendly, inexpensive, and reliable. The present contribution describes a new method of counting pollen using readily available, free image processing and analysis software. Pollen was collected from anthers of two species, Carduus acanthoides and C. nutans (Asteraceae), then illuminated on slides and digitally photographed through a stereomicroscope. These digital images were processed in ImageJ (NIH) to remove noise and sharpen individual pollen grains, then analysed to obtain a reliable total count of the number of grains present in the image. A macro was developed to analyse multiple images together. To assess the accuracy and consistency of pollen counting by ImageJ analysis, counts were compared with those made by the human eye. Image analysis produced pollen counts in 60 s or less per image, considerably faster than counting with the human eye (5-68 min). In addition, counts produced with the ImageJ procedure were similar to those obtained by eye. Because count parameters are adjustable, this image analysis protocol may be used for many other plant species. Thus, the method provides a quick, inexpensive, and reliable solution to counting pollen from digital images, not only reducing the chance of error but also substantially lowering labour requirements.
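
    The authors' protocol uses ImageJ; as a rough Python equivalent (an assumption, not the published macro), the sketch below reproduces the denoise, threshold, label, and count sequence with scikit-image on a synthetic image.

```python
import numpy as np
from skimage import filters, measure, morphology

def count_pollen(image, min_grain_px=30):
    """Count bright pollen grains in a grayscale image, mirroring the
    denoise -> threshold -> label steps described above.

    image: 2-D float array (grayscale photo of illuminated pollen).
    min_grain_px: smallest blob (in pixels) accepted as a grain.
    """
    smoothed = filters.gaussian(image, sigma=1.0)        # remove noise
    binary = smoothed > filters.threshold_otsu(smoothed) # segment grains
    cleaned = morphology.remove_small_objects(binary, min_grain_px)
    labels = measure.label(cleaned)                      # connected blobs
    return labels.max()                                  # number of grains

# Toy usage: synthetic image with three bright disks on a dark field.
img = np.zeros((200, 200))
yy, xx = np.mgrid[:200, :200]
for cy, cx in [(50, 50), (120, 80), (160, 150)]:
    img[(yy - cy) ** 2 + (xx - cx) ** 2 < 100] = 1.0
print(count_pollen(img))  # -> 3
```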

  12. Evaluation of engineering foods for Controlled Ecological Life Support Systems (CELSS)

    NASA Technical Reports Server (NTRS)

    Karel, M.

    1982-01-01

    The feasibility of developing acceptable and reliable engineered foods for use in Controlled Ecological Life Support Systems (CELSS) was evaluated. Food resupply and regeneration are calculated, flow charts of food processes in a multipurpose food pilot plant are presented, and equipment for a multipurpose food pilot plant and potential simplification of processes are discussed. Food-waste treatment and water usage in food processing and preparation are also considered.

  13. Practical Framework: Implementing OEE Method in Manufacturing Process Environment

    NASA Astrophysics Data System (ADS)

    Maideen, N. C.; Sahudin, S.; Mohd Yahya, N. H.; Norliawati, A. O.

    2016-02-01

    A manufacturing process environment requires reliable machinery in order to satisfy market demand. Ideally, a reliable machine is expected to operate and produce a quality product at its maximum designed capability. In practice, however, a machine is often unable to achieve the desired performance. Since performance affects the productivity of the system, a measurement technique should be applied. Overall Equipment Effectiveness (OEE) is a good method to measure the performance of a machine, and the reliable result it produces can then be used to propose a suitable corrective action. Many published papers discuss the purpose and benefits of OEE, covering the what and why factors; the how factor, in particular the implementation of OEE in a manufacturing process environment, has received little attention. This paper therefore presents a practical framework for implementing OEE, and a case study is discussed to explain each proposed step in detail. The proposed framework is beneficial to engineers, especially beginners, who want to start measuring machine performance and later improve it.
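
    For readers new to the method, OEE is conventionally the product of availability, performance, and quality; the sketch below computes it from hypothetical shift data (the framework's specific implementation steps are in the paper, not here).

```python
def oee(planned_min, downtime_min, ideal_cycle_min, total_count, good_count):
    """Overall Equipment Effectiveness from the three standard factors."""
    run_time = planned_min - downtime_min
    availability = run_time / planned_min              # uptime share
    performance = (ideal_cycle_min * total_count) / run_time
    quality = good_count / total_count                 # good-part share
    return availability * performance * quality

# Toy shift: 480 min planned, 47 min down, 1.0 min ideal cycle time,
# 400 parts produced of which 380 were good -> OEE of about 0.79.
print(round(oee(480, 47, 1.0, 400, 380), 3))
```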

  14. Refinement, Validation and Benchmarking of a Model for E-Government Service Quality

    NASA Astrophysics Data System (ADS)

    Magoutas, Babis; Mentzas, Gregoris

    This paper presents the refinement and validation of a model for the Quality of e-Government Services (QeGS). We built upon our previous work, in which a conceptualized model was identified, and focused on the confirmatory phase of the model development process in order to arrive at a valid and reliable QeGS model. The validated model, which was benchmarked with very positive results against similar models found in the literature, can be used for measuring QeGS in a reliable and valid manner. This will form the basis for a continuous quality improvement process, unleashing the full potential of e-government services for both citizens and public administrations.

  15. Hadoop distributed batch processing for Gaia: a success story

    NASA Astrophysics Data System (ADS)

    Riello, Marco

    2015-12-01

    The DPAC Cambridge Data Processing Centre (DPCI) is responsible for the photometric calibration of the Gaia data, including the low-resolution spectra. The large data volume produced by Gaia (~26 billion transits/year), the complexity of its data stream, and the self-calibrating approach pose unique challenges for the scalability, reliability, and robustness of both the software pipelines and the operations infrastructure. DPCI was the first in DPAC to realise the potential of Hadoop and Map/Reduce and to adopt them as the core technologies for its infrastructure. This has proven a winning choice, giving DPCI unmatched processing throughput and reliability within DPAC, to the point that other DPCs have started following in our footsteps. In this talk we will present the software infrastructure developed to build the distributed and scalable batch data processing system that is currently used in production at DPCI, along with the excellent performance results of the system.

  16. Reliability design and verification for launch-vehicle propulsion systems - Report of an AIAA Workshop, Washington, DC, May 16, 17, 1989

    NASA Astrophysics Data System (ADS)

    Launch vehicle propulsion system reliability considerations during the design and verification processes are discussed. The tools available for predicting and minimizing anomalies or failure modes are described and objectives for validating advanced launch system propulsion reliability are listed. Methods for ensuring vehicle/propulsion system interface reliability are examined and improvements in the propulsion system development process are suggested to improve reliability in launch operations. Also, possible approaches to streamline the specification and procurement process are given. It is suggested that government and industry should define reliability program requirements and manage production and operations activities in a manner that provides control over reliability drivers. Also, it is recommended that sufficient funds should be invested in design, development, test, and evaluation processes to ensure that reliability is not inappropriately subordinated to other management considerations.

  17. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for a majority of National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), the Critical Items List (CIL), and the Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and has resulted in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  18. Fiber laser platform for highest flexibility and reliability in industrial femtosecond micromachining: TruMicro Series 2000

    NASA Astrophysics Data System (ADS)

    Jansen, Florian; Kanal, Florian; Kahmann, Max; Tan, Chuong; Diekamp, Holger; Scelle, Raphael; Budnicki, Aleksander; Sutter, Dirk

    2018-02-01

    In this work we present an ultrafast laser system distinguished by its industry-ready reliability and its outstanding flexibility, which allows for real-time, process-inherent parameter adjustment. The robust system design and linear amplifier architecture make the all-fiber TruMicro Series 2000 ideally suited for passive coupling to hollow-core delivery fibers. In addition to details on the laser system itself, various application examples are shown, including the welding of different glasses and the ablation of silicon carbide and silicon.

  19. Navstar Global Positioning System (GPS) clock program: Present and future

    NASA Technical Reports Server (NTRS)

    Tennant, D. M.

    1981-01-01

    The status of the Global Positioning System (GPS) program is discussed and plans for ensuring the long-term continuation of the program are presented. The performance of GPS clocks is presented in terms of on-orbit data as portrayed by GPS master control station Kalman filter processing. The GPS clock reliability program is reviewed in depth and future plans for the overall clock program are presented.

  20. TruMicro Series 2000 sub-400 fs class industrial fiber lasers: adjustment of laser parameters to process requirements

    NASA Astrophysics Data System (ADS)

    Kanal, Florian; Kahmann, Max; Tan, Chuong; Diekamp, Holger; Jansen, Florian; Scelle, Raphael; Budnicki, Aleksander; Sutter, Dirk

    2017-02-01

    The matchless properties of ultrashort laser pulses, such as the enabling of cold processing and non-linear absorption, pave the way to numerous novel applications. Ultrafast lasers arrived in the last decade at a level of reliability suitable for the industrial environment [1]. Within the next years, many industrial manufacturing processes in several markets will be replaced by laser-based processes due to their well-known benefits: non-contact wear-free processing, higher process accuracy, an increase of processing speed, and often improved economic efficiency compared to conventional processes. Furthermore, new processes will arise with novel sources, addressing previously unsolved challenges. One technical requirement for these exciting new applications will be to optimize the large number of available parameters to the requirements of the application. In this work we present an ultrafast laser system distinguished by its capability to combine high flexibility and real-time, process-inherent adjustment of the parameters with industry-ready reliability. This industry-ready reliability is ensured by long experience in designing and building ultrashort-pulse lasers, in combination with rigorous optimization of the mechanical construction, the optical components, and the entire laser head for continuous performance. By introducing a new generation of mechanical design in the last few years, TRUMPF enabled its ultrashort-laser platforms to fulfill the very demanding requirements for passively coupling high-energy single-mode radiation into a hollow-core transport fiber. The laser architecture presented here is based on the all-fiber MOPA (master oscillator power amplifier) CPA (chirped pulse amplification) technology. The pulses are generated in a high-repetition-rate mode-locked fiber oscillator, which also enables flexible pulse bursts (groups of multiple pulses) with 20 ns intra-burst pulse separation. An external acousto-optic modulator (XAOM) enables linearization and multi-level quad-loop stabilization of the output power of the laser [2]. In addition to the well-established platform, the latest developments addressed single-pulse energies up to 50 μJ and made femtosecond pulse durations available for the TruMicro Series 2000. Beyond these stabilization aspects, this laser architecture, together with other optical modules and combined with smart laser control software, enables process-driven adjustment of the parameters (e.g., repetition rate, multi-pulse functionalities, pulse energy, pulse duration) by external signals, which will be presented in this work.

  1. The Japanese version of the questionnaire about the process of recovery: development and validity and reliability testing.

    PubMed

    Kanehara, Akiko; Kotake, Risa; Miyamoto, Yuki; Kumakura, Yousuke; Morita, Kentaro; Ishiura, Tomoko; Shimizu, Kimiko; Fujieda, Yumiko; Ando, Shuntaro; Kondo, Shinsuke; Kasai, Kiyoto

    2017-11-07

    Personal recovery is increasingly recognised as an important outcome measure in mental health services. This study aimed to develop a Japanese version of the Questionnaire about the Process of Recovery (QPR-J) and test its validity and reliability. The study comprised two stages that employed cross-sectional and prospective cohort designs, respectively. We translated the questionnaire using a standard translation/back-translation method. Convergent validity was examined by calculating Pearson's correlation coefficients with scores on the Recovery Assessment Scale (RAS) and the Short-Form-8 Health Survey (SF-8). An exploratory factor analysis (EFA) was conducted to examine factorial validity. We used the intraclass correlation coefficient and Cronbach's alpha to examine the test-retest and internal consistency reliability of the QPR-J's 22-item full scale, 17-item intrapersonal subscale, and 5-item interpersonal subscale. We conducted an EFA along with a confirmatory factor analysis (CFA). Data were obtained from 197 users of mental health services (mean age: 42.0 years; 61.9% female; 49.2% diagnosed with schizophrenia). The QPR-J showed adequate convergent validity, exhibiting significant, positive correlations with the RAS and SF-8 scores. The QPR-J's full scale and subscales showed excellent test-retest and internal consistency reliability, with the exception of acceptable but relatively low internal consistency reliability for the interpersonal subscale. Based on the results of the CFA and EFA, we adopted the factor structure extracted from the original two-factor model, based on the present CFA. The QPR-J is an adequately valid and reliable measure of the process of recovery among Japanese users of mental health services.
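
    As a minimal sketch of one of the reliability statistics used above, the snippet below computes Cronbach's alpha for a matrix of item scores; the toy data are hypothetical, and the ICC and factor-analysis steps are omitted.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha internal-consistency reliability.

    items: (n_respondents, n_items) array of item scores.
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of sum scores
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

# Toy usage: 6 respondents answering a 4-item subscale.
scores = np.array([[3, 4, 3, 4],
                   [2, 2, 3, 2],
                   [4, 4, 4, 5],
                   [1, 2, 1, 2],
                   [3, 3, 4, 3],
                   [5, 4, 5, 5]])
print(round(cronbach_alpha(scores), 3))
```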

  2. X-Ray Computed Tomography: The First Step in Mars Sample Return Processing

    NASA Technical Reports Server (NTRS)

    Welzenbach, L. C.; Fries, M. D.; Grady, M. M.; Greenwood, R. C.; McCubbin, F. M.; Zeigler, R. A.; Smith, C. L.; Steele, A.

    2017-01-01

    The Mars 2020 rover mission will collect and cache samples from the martian surface for possible retrieval and subsequent return to Earth. If the samples are returned, that mission would likely present an opportunity to analyze returned Mars samples within a geologic context on Mars. In addition, it may provide definitive information about the existence of past or present life on Mars. Mars sample return presents unique challenges for the collection, containment, transport, curation, and processing of samples [1]. Foremost in the processing of returned samples are the closely paired considerations of life detection and Planetary Protection. In order to achieve Mars Sample Return (MSR) science goals, reliable analyses will depend on overcoming some challenging signal/noise-related issues, where sparse martian organic compounds must be reliably analyzed against the contamination background. While reliable analyses will depend on initial clean acquisition and robust documentation of all aspects of developing and managing the cache [2], there needs to be a reliable sample handling and analysis procedure that accounts for a variety of materials which may or may not contain evidence of past or present martian life. A recent report [3] suggests that a defined set of measurements should be made to effectively inform both science and Planetary Protection, when applied in the context of the two competing null hypotheses: 1) that there is no detectable life in the samples; or 2) that there is martian life in the samples. The defined measurements would include a phased approach that would be accepted by the community to preserve the bulk of the material but provide unambiguous science data that can be used and interpreted by various disciplines. Foremost is the concern that the initial steps ensure the pristine nature of the samples. Preliminary, non-invasive techniques such as computed X-ray tomography (XCT) have been suggested as the first method to interrogate and characterize the cached samples without altering the materials [1,2]. A recent report [4] indicates that XCT may minimally alter samples for some techniques, and work is needed to quantify these effects, maximizing the science return from initial XCT analysis while minimizing its effects.

  3. Depressive Rumination, the Default-Mode Network, and the Dark Matter of Clinical Neuroscience.

    PubMed

    Hamilton, J Paul; Farmer, Madison; Fogelman, Phoebe; Gotlib, Ian H

    2015-08-15

    The intuitive association between self-focused rumination in major depressive disorder (MDD) and the self-referential operations performed by the brain's default-mode network (DMN) has prompted interest in examining the role of the DMN in MDD. In this article, we present meta-analytic findings showing reliably increased functional connectivity between the DMN and the subgenual prefrontal cortex (sgPFC), connectivity that often predicts levels of depressive rumination. We also present meta-analytic findings that, while there is reliably increased regional cerebral blood flow in sgPFC in MDD, no such abnormality has been reliably observed in nodes of the DMN. We then detail a model that integrates the body of research presented. In this model, we propose that increased functional connectivity between sgPFC and the DMN in MDD represents an integration of the self-referential processes supported by the DMN with the affectively laden behavioral-withdrawal processes associated with sgPFC, an integration that produces a functional neural ensemble well suited for depressive rumination and that, in MDD, abnormally taxes only sgPFC and not the DMN. This synthesis explains a broad array of existing data concerning the neural substrates of depressive rumination and provides an explicit account of functional abnormalities in sgPFC in MDD.

  4. Modification site localization scoring integrated into a search engine.

    PubMed

    Baker, Peter R; Trinidad, Jonathan C; Chalkley, Robert J

    2011-07-01

    Large proteomic data sets identifying hundreds or thousands of modified peptides are becoming increasingly common in the literature. Several methods for assessing the reliability of peptide identifications both at the individual peptide or data set level have become established. However, tools for measuring the confidence of modification site assignments are sparse and are not often employed. A few tools for estimating phosphorylation site assignment reliabilities have been developed, but these are not integral to a search engine, so require a particular search engine output for a second step of processing. They may also require use of a particular fragmentation method and are mostly only applicable for phosphorylation analysis, rather than post-translational modifications analysis in general. In this study, we present the performance of site assignment scoring that is directly integrated into the search engine Protein Prospector, which allows site assignment reliability to be automatically reported for all modifications present in an identified peptide. It clearly indicates when a site assignment is ambiguous (and if so, between which residues), and reports an assignment score that can be translated into a reliability measure for individual site assignments.

  5. Using multivariate generalizability theory to assess the effect of content stratification on the reliability of a performance assessment.

    PubMed

    Keller, Lisa A; Clauser, Brian E; Swanson, David B

    2010-12-01

    In recent years, demand for performance assessments has continued to grow. However, performance assessments are notorious for lower reliability, and in particular, low reliability resulting from task specificity. Since reliability analyses typically treat the performance tasks as randomly sampled from an infinite universe of tasks, these estimates of reliability may not be accurate. For tests built according to a table of specifications, tasks are randomly sampled from different strata (content domains, skill areas, etc.). If these strata remain fixed in the test construction process, ignoring this stratification in the reliability analysis results in an underestimate of "parallel forms" reliability, and an overestimate of the person-by-task component. This research explores the effect of representing and misrepresenting the stratification appropriately in estimation of reliability and the standard error of measurement. Both multivariate and univariate generalizability studies are reported. Results indicate that the proper specification of the analytic design is essential in yielding the proper information both about the generalizability of the assessment and the standard error of measurement. Further, illustrative D studies present the effect under a variety of situations and test designs. Additional benefits of multivariate generalizability theory in test design and evaluation are also discussed.
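
    A minimal univariate sketch of the machinery involved: variance components for a crossed person x task design estimated from mean squares, and the resulting generalizability coefficient for a D study. This deliberately ignores stratification, the very simplification the study warns about, and uses synthetic data; it is not the multivariate analysis reported above.

```python
import numpy as np

def g_study(scores):
    """Variance components for a crossed person x task G study,
    estimated from expected mean squares (random-effects ANOVA).

    scores: (n_persons, n_tasks) array with one score per cell.
    Returns (person, task, person-x-task/residual) components.
    """
    n_p, n_t = scores.shape
    grand = scores.mean()
    ms_p = n_t * ((scores.mean(axis=1) - grand) ** 2).sum() / (n_p - 1)
    ms_t = n_p * ((scores.mean(axis=0) - grand) ** 2).sum() / (n_t - 1)
    resid = (scores - scores.mean(axis=1, keepdims=True)
             - scores.mean(axis=0, keepdims=True) + grand)
    ms_pt = (resid ** 2).sum() / ((n_p - 1) * (n_t - 1))
    var_p = max((ms_p - ms_pt) / n_t, 0.0)
    var_t = max((ms_t - ms_pt) / n_p, 0.0)
    return var_p, var_t, ms_pt

def g_coefficient(var_p, var_pt, n_tasks):
    """Generalizability coefficient for a D study with n_tasks tasks."""
    return var_p / (var_p + var_pt / n_tasks)

# Toy data: 50 examinees, 6 tasks, person effect plus task-specific noise.
rng = np.random.default_rng(4)
data = rng.normal(70, 8, size=(50, 1)) + rng.normal(0, 6, size=(50, 6))
var_p, var_t, var_pt = g_study(data)
print(g_coefficient(var_p, var_pt, n_tasks=6))
```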

  6. High power diode lasers emitting from 639 nm to 690 nm

    NASA Astrophysics Data System (ADS)

    Bao, L.; Grimshaw, M.; DeVito, M.; Kanskar, M.; Dong, W.; Guan, X.; Zhang, S.; Patterson, J.; Dickerson, P.; Kennedy, K.; Li, S.; Haden, J.; Martinsen, R.

    2014-03-01

    There is increasing market demand for high-power reliable red lasers for display and cinema applications. Due to the fundamental material system limit in this wavelength range, red diode lasers have lower efficiency and are more temperature sensitive compared with 790-980 nm diode lasers. In terms of reliability, red lasers are also more sensitive to catastrophic optical mirror damage (COMD) due to the higher photon energy. Thus, developing high-power reliable red lasers is very challenging. This paper presents nLIGHT's released red products from 639 nm to 690 nm, with established high performance and long-term reliability. These single-emitter diode lasers can work as stand-alone units or be efficiently integrated into our compact, passively cooled Pearl™ fiber-coupled module architectures for higher output power and improved reliability. In order to further improve power and reliability, new chip optimizations have focused on improving the epitaxial design/growth, chip configuration/processing, and optical facet passivation. Initial optimization has demonstrated promising results for 639 nm diode lasers to be reliably rated at 1.5 W and for 690 nm diode lasers to be reliably rated at 4.0 W. Accelerated life-testing has started and further design optimizations are underway.

  7. Processing and Properties Of Refractory Zirconium Diboride Composites For Use In High Temperature Applications

    NASA Technical Reports Server (NTRS)

    Stackpoole, Margaret; Gusman, M.; Ellerby, D.; Johnson, S. M.; Arnold, Jim (Technical Monitor)

    2001-01-01

    The Thermal Protection Materials and Systems Branch at NASA Ames Research Center is involved in the development of a class of refractory oxidation-resistant diboride composites termed Ultra High Temperature Ceramics or UHTCs. These composites have good high temperature properties making them candidate materials for thermal protection system (TPS) applications. The current research focuses on improving processing methods to develop more reliable composites with enhanced thermal and mechanical properties. This presentation will concentrate on the processing of ZrB2/SiC composites. Some preliminary mechanical properties and oxidation data will also be presented.

  8. Avoiding Assessment Anarchy. Quality Test Administration Strategies: Communicate Expectations, Reduce Variation, Increase Quality, Improve Relationships, Reward Excellence, Recognize Success.

    ERIC Educational Resources Information Center

    Matter, M. Kevin

    This paper presents strategies that address the needs of the school district assessment office for standardized procedures to support reliable and efficient test processing and reporting and that meet the needs of school staff for test administration guidelines. The key to test administration and processing quality is a knowledgeable test…

  9. Improving quality of laser scanning data acquisition through calibrated amplitude and pulse deviation measurement

    NASA Astrophysics Data System (ADS)

    Pfennigbauer, Martin; Ullrich, Andreas

    2010-04-01

    The newest developments in laser scanner technologies put surveyors in a position to comply with the ever-increasing demand for high-speed, high-accuracy, and highly reliable data acquisition from terrestrial, mobile, and airborne platforms. Echo digitization in pulsed time-of-flight laser ranging has demonstrated its superior performance in the fields of bathymetry and airborne laser scanning for more than a decade, however at the cost of somewhat time-consuming offline post-processing. State-of-the-art online waveform processing, as implemented in RIEGL's V-Line, not only saves users post-processing time in obtaining true 3D point clouds; it also adds the assets of calibrated amplitude and reflectance measurement for data classification, and of pulse deviation determination for effective and reliable data validation. We present results from data acquisitions in different complex target situations.

  10. Project Longshot: A mission to Alpha Centauri

    NASA Technical Reports Server (NTRS)

    West, Curtis; Chamberlain, Sally; Pagan, Neftali; Stevens, Robert

    1989-01-01

    Project Longshot, an exercise in the Advanced Design Program for Space, had as its destination Alpha Centauri, the closest star system to our own solar system. Alpha Centauri, a trinary star system, is 4.34 light years from Earth. Although Project Longshot is not feasible with existing technologies, areas that require further investigation in order to make this feat possible are identified. Three areas where advances in technology are needed are propulsion, data processing for autonomous command and control functions, and reliability. Major challenges include propulsion, possibly by antimatter annihilation; navigation and navigation aids; reliable hardware and instruments; artificial intelligence to eliminate the need for command telemetry; laser communication; and a reliable, compact, and lightweight power system that converts energy efficiently. Project Longshot promises exciting advances in science and technology and new information concerning the universe.

  11. Total systems design analysis of high performance structures

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1993-01-01

    Designer-control parameters were identified at interdiscipline interfaces to optimize structural system performance and downstream development and operations with reliability and least life-cycle cost. Interface tasks and iterations are tracked through a matrix of performance disciplines integration versus manufacturing, verification, and operations interactions for a total system design analysis. Performance integration tasks include shapes, sizes, environments, and materials. Integrity integrating tasks are reliability and recurring structural costs. Significant interface designer-control parameters were noted as shapes, dimensions, probability range factors, and cost. The structural failure concept is presented, and first-order reliability and deterministic methods, their benefits, and their limitations are discussed. A deterministic reliability technique combining the benefits of both is proposed for static structures, which is also timely and economically verifiable. Though launch vehicle environments were primarily considered, the system design process is applicable to any surface system using its own unique field environments.

  12. Reliability-based optimization of an active vibration controller using evolutionary algorithms

    NASA Astrophysics Data System (ADS)

    Saraygord Afshari, Sajad; Pourtakdoust, Seid H.

    2017-04-01

    Many modern industrialized systems, such as aircraft, rotating turbines, and satellite booms, cannot perform their desired tasks accurately if their uninhibited structural vibrations are not controlled properly. Structural health monitoring and online reliability calculations are emerging means of handling system-imposed uncertainties. As stochastic forcing is unavoidable in most engineering systems, it often needs to be taken into account in the control design process. In this research, smart material technology is utilized for structural health monitoring and control in order to keep the system in a reliable performance range. In this regard, a reliability-based cost function is assigned both for controller gain optimization and for sensor placement. The proposed scheme is implemented and verified for a wing section. A comparison of the frequency responses is presented to show the potential applicability of the presented technique.

  13. A Meta-Analysis of Hemodynamic Studies on First and Second Language Processing: Which Suggested Differences Can We Trust and What Do They Mean?

    ERIC Educational Resources Information Center

    Indefrey, Peter

    2006-01-01

    This article presents the results of a meta-analysis of 30 hemodynamic experiments comparing first language (L1) and second language (L2) processing in a range of tasks. The results suggest that reliably stronger activation during L2 processing is found (a) only for task-specific subgroups of L2 speakers and (b) within some, but not all regions…

  14. Statistics-related and reliability-physics-related failure processes in electronics devices and products

    NASA Astrophysics Data System (ADS)

    Suhir, E.

    2014-05-01

    The well-known and widely used experimental reliability "passport" of a mass-manufactured electronic or photonic product, the bathtub curve, reflects the combined contribution of statistics-related and reliability-physics (physics-of-failure)-related processes. As time progresses, the first process results in a decreasing failure rate, while the second process, associated with material aging and degradation, leads to an increasing failure rate. An attempt has been made in this analysis to assess the level of the reliability-physics-related aging process from the available bathtub curve (diagram). It is assumed that the products of interest underwent burn-in testing and that the obtained bathtub curve therefore does not contain the infant-mortality portion. It has also been assumed that the two random processes in question are statistically independent, and that the failure rate of the physical process can be obtained by deducting the theoretically assessed statistical failure rate from the bathtub curve ordinates. In the numerical example carried out, the Rayleigh distribution was used for the statistical failure rate, for the sake of a relatively simple illustration. The developed methodology can be used in reliability-physics evaluations when there is a need to better understand the roles of the statistics-related and reliability-physics-related irreversible random processes. Future work should include investigations of how powerful and flexible methods and approaches of statistical mechanics can be effectively employed, in addition to reliability-physics techniques, to model the operational reliability of electronic and photonic products.
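
    A minimal numerical sketch of the deduction step described above, with an assumed bathtub ordinate function and a Rayleigh statistical failure rate h(t) = t / sigma^2; all numbers are illustrative, not taken from the paper.

```python
import numpy as np

# Toy bathtub-curve ordinates after burn-in: a gently rising total
# failure rate lambda_total(t) over operating time t (hours).
t = np.linspace(100.0, 10_000.0, 50)
lam_total = 2e-5 + 4e-13 * t ** 2      # assumed measured bathtub curve

# Rayleigh statistical failure (hazard) rate: h(t) = t / sigma**2.
sigma = 1.0e5
lam_stat = t / sigma ** 2

# Under the assumed independence of the two random processes, the
# physics-of-failure (aging/degradation) rate is the difference
# between the bathtub ordinates and the statistical rate.
lam_phys = lam_total - lam_stat
print(lam_phys[:5])
```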

  15. Toward a holistic environmental impact assessment of marble quarrying and processing: proposal of a novel easy-to-use IPAT-based method.

    PubMed

    Capitano, Cinzia; Peri, Giorgia; Rizzo, Gianfranco; Ferrante, Patrizia

    2017-03-01

    Marble is a natural dimension stone that is widely used in building due to its resistance and esthetic qualities. Unfortunately, some concerns have arisen regarding its production process, because quarrying and processing activities demand significant amounts of energy and greatly affect the environment. Further, performing an environmental analysis of a production process such as that of marble requires the consideration of many environmental aspects (e.g., noise, vibrations, dust and waste production, energy consumption). Unfortunately, current impact accounting tools do not seem to be capable of considering all of the major aspects of the (marble) production process that may affect the environment and thus cannot provide a comprehensive and concise assessment of all environmental aspects associated with it. Therefore, innovative, easy, and reliable methods for evaluating its environmental impact are necessary, and they must be accessible to non-technicians. The present study intends to provide a contribution in this sense by proposing a reliable and easy-to-use evaluation method to assess the significance of the environmental impacts associated with the marble production process. In addition, an application of the method to an actual marble-producing company is presented to demonstrate its practicability. Because of its relative ease of use, the method presented here can also be used as a "self-assessment" tool for pursuing a virtuous environmental policy, because it enables company owners to easily identify the segments of their production chain that most require environmental enhancement.

  16. Power processing systems for ion thrusters.

    NASA Technical Reports Server (NTRS)

    Herron, B. G.; Garth, D. R.; Finke, R. C.; Shumaker, H. A.

    1972-01-01

    The proposed use of ion thrusters to fulfill various communication satellite propulsion functions such as east-west and north-south stationkeeping, attitude control, station relocation and orbit raising, naturally leads to the requirement for lightweight, efficient and reliable thruster power processing systems. Collectively, the propulsion requirements dictate a wide range of thruster power levels and operational lifetimes, which must be matched by the power processing. This paper will discuss the status of such power processing systems, present system design alternatives and project expected near future power system performance.

  17. Process for the physical segregation of minerals

    DOEpatents

    Yingling, Jon C.; Ganguli, Rajive

    2004-01-06

    With highly heterogeneous groups or streams of minerals, physical segregation using online quality measurements is an economically important first stage of the mineral beneficiation process. Segregation enables high quality fractions of the stream to bypass processing, such as cleaning operations, thereby reducing the associated costs and avoiding the yield losses inherent in any downstream separation process. The present invention includes various methods for reliably segregating a mineral stream into at least one fraction meeting desired quality specifications while at the same time maximizing yield of that fraction.

  18. Reliability models applicable to space telescope solar array assembly system

    NASA Technical Reports Server (NTRS)

    Patil, S. A.

    1986-01-01

    A complex system may consist of a number of subsystems with several components in series, in parallel, or in a combination of both. In order to predict how well the system will perform, it is necessary to know the reliabilities of the subsystems and the reliability of the whole system. The objective of the present study is to develop mathematical models of reliability which are applicable to complex systems. The models are determined by assuming k failures out of n components in a subsystem. By taking k = 1 and k = n, these models reduce to the series and parallel models, respectively; hence, the models can be specialized to parallel, series, and combination systems. The models are developed by assuming the failure rates of the components to be functions of time and, as such, can be applied to processes with or without aging effects. The reliability models are further specialized to the Space Telescope Solar Array (STSA) System. The STSA consists of 20 identical solar panel assemblies (SPAs). The reliabilities of the SPAs are determined by the reliabilities of solar cell strings, interconnects, and diodes. Estimates of the reliability of the system for one to five years are calculated using the reliability estimates of solar cells and interconnects given in ESA documents. Aging effects in relation to breaks in interconnects are discussed.
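
    A minimal sketch of the k-failures-out-of-n model described above, under the reading that the subsystem fails once k of its n identical components have failed; k = 1 then recovers the series model and k = n the parallel model. The fixed-reliability simplification and the numeric inputs are assumptions, not the ESA figures or the time-dependent rates used in the study.

```python
from math import comb

def k_out_of_n_reliability(r, n, k):
    """Probability that fewer than k of n identical components fail,
    i.e. the subsystem survives until k components have failed.

    k = 1 reduces to a series system (r**n);
    k = n reduces to a parallel system (1 - (1 - r)**n).
    """
    q = 1.0 - r
    return sum(comb(n, i) * q**i * r**(n - i) for i in range(k))

r_spa = 0.999   # assumed one-year reliability of one panel assembly
print(k_out_of_n_reliability(r_spa, n=20, k=1))    # series of 20 SPAs
print(k_out_of_n_reliability(r_spa, n=20, k=3))    # tolerates 2 failures
```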

  19. Indicating spinal joint mobilisations or manipulations in patients with neck or low-back pain: protocol of an inter-examiner reliability study among manual therapists.

    PubMed

    van Trijffel, Emiel; Lindeboom, Robert; Bossuyt, Patrick Mm; Schmitt, Maarten A; Lucas, Cees; Koes, Bart W; Oostendorp, Rob Ab

    2014-01-01

    Manual spinal joint mobilisations and manipulations are widely used treatments in patients with neck and low-back pain. Inter-examiner reliability of passive intervertebral motion assessment of the cervical and lumbar spine, perceived as important for indicating these interventions, is poor within a univariable approach. The diagnostic process as a whole in daily practice in manual therapy has a multivariable character, however, in which the use and interpretation of passive intervertebral motion assessment depend on earlier results from the diagnostic process. To date, the inter-examiner reliability among manual therapists of a multivariable diagnostic decision-making process in patients with neck or low-back pain is unknown. This study will be conducted as a repeated-measures design in which 14 pairs of manual therapists independently examine a consecutive series of a planned total of 165 patients with neck or low-back pain presenting in primary care physiotherapy. Primary outcome measure is therapists' decision about whether or not manual spinal joint mobilisations or manipulations, or both, are indicated in each patient, alone or as part of a multimodal treatment. Therapists will largely be free to conduct the full diagnostic process based on their formulated examination objectives. For each pair of therapists, 2×2 tables will be constructed and reliability for the dichotomous decision will be expressed using Cohen's kappa. In addition, observed agreement, prevalence of positive decisions, prevalence index, bias index, and specific agreement in positive and negative decisions will be calculated. Univariable logistic regression analysis of concordant decisions will be performed to explore which demographic, professional, or clinical factors contributed to reliability. This study will provide an estimate of the inter-examiner reliability among manual therapists of indicating spinal joint mobilisations or manipulations in patients with neck or low-back pain based on a multivariable diagnostic reasoning and decision-making process, as opposed to reliability of individual tests. As such, it is proposed as an initial step toward the development of an alternative approach to current classification systems and prediction rules for identifying those patients with spinal disorders that may show a better response to manual therapy which can be incorporated in randomised clinical trials. Potential methodological limitations of this study are discussed.
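
    As a minimal sketch of the planned per-pair statistics, the snippet below computes Cohen's kappa together with observed agreement and the prevalence and bias indices from a 2x2 table of dichotomous decisions; the counts are hypothetical.

```python
def agreement_statistics(a, b, c, d):
    """Cohen's kappa plus prevalence and bias indices for one pair of
    examiners, from the 2x2 table of dichotomous decisions:

        a = both indicate treatment      b = only examiner 1 indicates
        c = only examiner 2 indicates    d = neither indicates
    """
    n = a + b + c + d
    po = (a + d) / n                                       # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2    # chance agreement
    kappa = (po - pe) / (1 - pe)
    prevalence_index = (a - d) / n
    bias_index = (b - c) / n
    return kappa, po, prevalence_index, bias_index

# Toy 2x2 table for one examiner pair (hypothetical counts).
print(agreement_statistics(a=60, b=15, c=10, d=80))
```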

  20. Indicating spinal joint mobilisations or manipulations in patients with neck or low-back pain: protocol of an inter-examiner reliability study among manual therapists

    PubMed Central

    2014-01-01

    Background Manual spinal joint mobilisations and manipulations are widely used treatments in patients with neck and low-back pain. Inter-examiner reliability of passive intervertebral motion assessment of the cervical and lumbar spine, perceived as important for indicating these interventions, is poor within a univariable approach. The diagnostic process as a whole in daily practice in manual therapy has a multivariable character, however, in which the use and interpretation of passive intervertebral motion assessment depend on earlier results from the diagnostic process. To date, the inter-examiner reliability among manual therapists of a multivariable diagnostic decision-making process in patients with neck or low-back pain is unknown. Methods This study will be conducted as a repeated-measures design in which 14 pairs of manual therapists independently examine a consecutive series of a planned total of 165 patients with neck or low-back pain presenting in primary care physiotherapy. Primary outcome measure is therapists’ decision about whether or not manual spinal joint mobilisations or manipulations, or both, are indicated in each patient, alone or as part of a multimodal treatment. Therapists will largely be free to conduct the full diagnostic process based on their formulated examination objectives. For each pair of therapists, 2×2 tables will be constructed and reliability for the dichotomous decision will be expressed using Cohen’s kappa. In addition, observed agreement, prevalence of positive decisions, prevalence index, bias index, and specific agreement in positive and negative decisions will be calculated. Univariable logistic regression analysis of concordant decisions will be performed to explore which demographic, professional, or clinical factors contributed to reliability. Discussion This study will provide an estimate of the inter-examiner reliability among manual therapists of indicating spinal joint mobilisations or manipulations in patients with neck or low-back pain based on a multivariable diagnostic reasoning and decision-making process, as opposed to reliability of individual tests. As such, it is proposed as an initial step toward the development of an alternative approach to current classification systems and prediction rules for identifying those patients with spinal disorders that may show a better response to manual therapy which can be incorporated in randomised clinical trials. Potential methodological limitations of this study are discussed. PMID:24982754

  1. Current medical staff governance and physician sensemaking: a formula for resistance to high reliability.

    PubMed

    Flitter, Marc A; Riesenmy, Kelly Rouse; van Stralen, Daved

    2012-01-01

    To offer a theoretical explanation for observed physician resistance to and rejection of high reliability patient safety initiatives. A grounded-theory qualitative approach, utilizing the organizational theory of sensemaking, provided the foundation for the inductive and deductive reasoning employed to analyze medical staff rejection of two successfully performing high reliability programs at separate hospitals. Physician behaviors resistant to patient-centric high reliability processes were traced to provider-centric physician sensemaking. Prospective research, free of the limitations of this retrospective investigation, is needed to evaluate the potential for overcoming physician resistance to innovation implementation, employing strategies based upon these findings and sensemaking theory in general. If hospitals are to emulate high reliability industries that successfully manage environments of extreme hazard, physicians must be fully integrated into the complex teams required to accomplish this goal. Reforming health care through high reliability organizing, with its attendant continuous focus on patient-centric processes, offers a distinct alternative to efforts directed primarily at reforming health care insurance. It is by changing how health care is provided that true cost efficiencies can be achieved. Technology and the insights of organizational science present the opportunity to replace the current emphasis on privileged information with collective tools capable of providing quality and safety in health care. The fictions that have sustained a provider-centric health care system have been challenged. The benefits of patient-centric care should be obtainable.

  2. Perceiving numbers does not cause automatic shifts of spatial attention.

    PubMed

    Fattorini, Enrico; Pinto, Mario; Rotondaro, Francesca; Doricchi, Fabrizio

    2015-12-01

    It is frequently assumed that the brain codes number magnitudes according to an inherent left-to-right spatial organization. In support of this hypothesis it has been reported that in humans, perceiving small numbers induces automatic shifts of attention toward the left side of space whereas perceiving large numbers automatically shifts attention to the right side of space (i.e., Attentional SNARC: Att-SNARC; Fischer, Castel, Dodd, & Pratt, 2003). Nonetheless, the Att-SNARC has often failed to replicate, and its reliability has never been tested. To ascertain whether the mere perception of numbers causes shifts of spatial attention or whether number-space interaction takes place at a different stage of cognitive processing, we re-assessed the consistency and reliability of the Att-SNARC and investigated its role in the production of SNARC effects in Parity Judgement (PJ) and Magnitude Comparison (MC) tasks. In a first study in 60 participants, we found no Att-SNARC, despite finding strong PJ- and MC-SNARC effects. No correlation was present between the Att-SNARC and the SNARC. Split-half tests showed no reliability of the Att-SNARC and high reliability of the PJ- and MC-SNARC. In a second study, we re-assessed the Att-SNARC and tested its direct influence on a MC-SNARC task with laterally presented targets. No Att-SNARC and no influence of the Att-SNARC on the MC-SNARC were found. Also in this case, the SNARC was reliable whereas the Att-SNARC task was not. Finally, in a third study we observed a significant Att-SNARC when participants were asked to recall the position occupied on a ruler by the numbers presented in each trial; however, the Att-SNARC task was not reliable. These results show that perceiving numbers does not cause automatic shifts of spatial attention and that, whenever present, these shifts do not modulate the SNARC. The same results suggest that numbers have no inherent mental left-to-right organization and that, whenever present, this organization can have both response-related and strategically driven memory-related origins. Nonetheless, response-related factors generate more reliable and stable spatial representations of numbers. Copyright © 2015 Elsevier Ltd. All rights reserved.
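
    Split-half reliability of the sort reported here is commonly computed by correlating per-participant effect estimates from two halves of the trials (e.g., odd vs. even) and stepping the correlation up with the Spearman-Brown formula. A minimal sketch on simulated (hypothetical) data:

    ```python
    import numpy as np

    def split_half_reliability(effect_odd, effect_even):
        # Correlate the per-participant effect (e.g., a SNARC slope)
        # computed from odd- and even-numbered trials, then apply the
        # Spearman-Brown prophecy formula for the full-length test.
        r = np.corrcoef(effect_odd, effect_even)[0, 1]
        return 2 * r / (1 + r)

    # Hypothetical data: one effect estimate per participant and half
    rng = np.random.default_rng(0)
    true_effect = rng.normal(20, 10, size=60)        # stable individual effect
    odd = true_effect + rng.normal(0, 8, size=60)    # measurement noise per half
    even = true_effect + rng.normal(0, 8, size=60)
    print(f"split-half reliability: {split_half_reliability(odd, even):.2f}")
    ```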

  3. Analysis the Transient Process of Wind Power Resources when there are Voltage Sags in Distribution Grid

    NASA Astrophysics Data System (ADS)

    Nhu Y, Do

    2018-03-01

    Vietnam has many advantages in wind power resources. Over time, both the capacity and the number of wind power projects in Vietnam have increased. Corresponding to the increase of wind power fed into the national grid, it is necessary to research and analyze the connection in order to ensure the safety and reliability of wind power integration. In the national distribution grid, voltage sags occur regularly and can strongly influence the operation of wind power; the most serious consequence is disconnection. The paper presents an analysis of the distribution grid's transient process when voltage sags occur. Based on the analysis, solutions are recommended to improve the reliability and effective operation of wind power resources.

  4. Reliability modelling and analysis of a multi-state element based on a dynamic Bayesian network

    NASA Astrophysics Data System (ADS)

    Li, Zhiqiang; Xu, Tingxue; Gu, Junyuan; Dong, Qi; Fu, Linyu

    2018-04-01

    This paper presents a quantitative reliability modelling and analysis method for multi-state elements based on a combination of the Markov process and a dynamic Bayesian network (DBN), taking perfect repair, imperfect repair and condition-based maintenance (CBM) into consideration. The Markov models of elements without repair and under CBM are established, and an absorbing set is introduced to determine the reliability of the repairable element. According to the state-transition relations between the states determined by the Markov process, a DBN model is built. In addition, its parameters for series and parallel systems, namely, conditional probability tables, can be calculated by referring to the conditional degradation probabilities. Finally, the power of a control unit in a failure model is used as an example. A dynamic fault tree (DFT) is translated into a Bayesian network model, and subsequently extended to a DBN. The results show the state probabilities of an element and of the system without repair, with perfect and imperfect repair, and under CBM with an absorbing set, plotted against differential-equation solutions for verification. Through forward inference, the reliability of the control unit is determined under different kinds of modes. Finally, weak nodes are identified in the control unit.
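
    As a minimal sketch of the Markov building block the method starts from: the state distribution of a multi-state element evolves under a transition-rate matrix, and when the failed state is absorbing the element's reliability is the probability of not yet having been absorbed. The three-state element and its rates below are hypothetical, and the repair and CBM transitions of the full method are omitted:

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Hypothetical 3-state element: 0 = good, 1 = degraded, 2 = failed.
    # Q[i, j] is the transition rate i -> j; each row sums to zero.
    Q = np.array([
        [-0.02,  0.015, 0.005],
        [ 0.00, -0.040, 0.040],
        [ 0.00,  0.000, 0.000],   # failed state is absorbing (no repair)
    ])

    p0 = np.array([1.0, 0.0, 0.0])    # element starts in the good state

    for t in (10, 50, 100):
        p = p0 @ expm(Q * t)          # state distribution at time t
        reliability = 1.0 - p[2]      # probability of not having been absorbed
        print(f"t={t:4}: P = {np.round(p, 4)}, R(t) = {reliability:.4f}")
    ```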

  5. SenseMyHeart: A cloud service and API for wearable heart monitors.

    PubMed

    Pinto Silva, P M; Silva Cunha, J P

    2015-01-01

    In the era of ubiquitous computing, the growing adoption of wearable systems and body sensor networks is paving the way for new research and software for cardiovascular intensity, energy expenditure, and stress and fatigue detection through cardiovascular monitoring. Several systems have received clinical certification and provide huge amounts of reliable heart-related data on a continuous basis. PhysioNet provides equally reliable open-source software tools for ECG processing and analysis that can be combined with these devices. However, this software remains difficult to use in a mobile environment and for researchers unfamiliar with Linux-based systems. In the present paper we present an approach that aims to tackle these limitations by developing a cloud service that provides an API for a PhysioNet-based pipeline for ECG processing and heart rate variability measurement. We describe the proposed solution, along with its advantages and tradeoffs. We also present client tools (Windows and Android) and several projects where the developed cloud service has been used successfully as a standard for heart rate and heart rate variability studies in different scenarios.
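
    For a flavor of the measurements such a pipeline returns (the actual service wraps PhysioNet's validated tools), here is a sketch of common time-domain HRV measures computed directly from RR intervals, run on a simulated recording:

    ```python
    import numpy as np

    def hrv_time_domain(rr_ms):
        # Common time-domain HRV measures from RR intervals in milliseconds.
        rr = np.asarray(rr_ms, dtype=float)
        diff = np.diff(rr)
        return {
            "mean HR (bpm)": 60000.0 / rr.mean(),
            "SDNN (ms)": rr.std(ddof=1),                     # overall variability
            "RMSSD (ms)": np.sqrt(np.mean(diff ** 2)),       # beat-to-beat variability
            "pNN50 (%)": 100.0 * np.mean(np.abs(diff) > 50), # large successive changes
        }

    # Hypothetical recording: ~75 bpm with mild drift and variability
    rng = np.random.default_rng(1)
    rr = 800 + np.cumsum(rng.normal(0, 5, size=300)) * 0.1 + rng.normal(0, 20, 300)
    print(hrv_time_domain(rr))
    ```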

  6. Life test failure of harmonic gears in a Two-axis Gimbal for the Mars Reconnaissance Orbiter Spacecraft

    NASA Technical Reports Server (NTRS)

    Johnson, Michael R.; Gehling, Russ; Head, Ray

    2006-01-01

    This paper will present a process for increasing the stiffness of harmonic gear assemblies and recommend a maximum stiffness point that, if exceeded, compromises the reliability of the gear components for long life applications.

  7. The probability estimation of the electronic lesson implementation taking into account software reliability

    NASA Astrophysics Data System (ADS)

    Gurov, V. V.

    2017-01-01

    Software tools for educational purposes, such as e-lessons and computer-based testing systems, have a number of distinctive features from the reliability point of view. Chief among them are the need to ensure a sufficiently high probability of faultless operation for a specified time, and the impossibility of rapid recovery by replacing a failed program with a similar running one during class. The article considers the peculiarities of evaluating the reliability of programs, in contrast to assessments of hardware reliability. The essential requirements for the reliability of software used to conduct practical and laboratory classes in the form of computer-based training programs are given. A mathematical tool based on Markov chains, which makes it possible to determine the degree of debugging of a training program for use in the educational process by applying a graph of the software modules' interaction, is presented.

  8. System reliability, performance and trust in adaptable automation.

    PubMed

    Chavaillaz, Alain; Wastell, David; Sauer, Jürgen

    2016-01-01

    The present study examined the effects of reduced system reliability on operator performance and automation management in an adaptable automation environment. 39 operators were randomly assigned to one of three experimental groups: low (60%), medium (80%), and high (100%) reliability of automation support. The support system provided five incremental levels of automation which operators could freely select according to their needs. After 3 h of training on a simulated process control task (AutoCAMS) in which the automation worked infallibly, operator performance and automation management were measured during a 2.5-h testing session. Trust and workload were also assessed through questionnaires. Results showed that although reduced system reliability resulted in lower levels of trust towards automation, there were no corresponding differences in the operators' reliance on automation. While operators showed overall a noteworthy ability to cope with automation failure, there were, however, decrements in diagnostic speed and prospective memory with lower reliability. Copyright © 2015. Published by Elsevier Ltd.

  9. The Automated Array Assembly Task of the Low-cost Silicon Solar Array Project, Phase 2

    NASA Technical Reports Server (NTRS)

    Coleman, M. G.; Grenon, L.; Pastirik, E. M.; Pryor, R. A.; Sparks, T. G.

    1978-01-01

    An advanced process sequence for manufacturing high efficiency solar cells and modules in a cost-effective manner is discussed. Emphasis is on process simplicity and minimizing consumed materials. The process sequence incorporates texture etching, plasma processes for damage removal and patterning, ion implantation, low pressure silicon nitride deposition, and plated metal. A reliable module design is presented. Specific process step developments are given. A detailed cost analysis was performed to indicate future areas of fruitful cost reduction effort. Recommendations for advanced investigations are included.

  10. The Reliability of Teacher Decision-Making in Recommending Accommodations for Large-Scale Tests. Technical Report # 08-01

    ERIC Educational Resources Information Center

    Tindal, Gerald; Lee, Daesik; Geller, Leanne Ketterlin

    2008-01-01

    In this paper we review different methods for teachers to recommend accommodations in large scale tests. Then we present data on the stability of their judgments on variables relevant to this decision-making process. The outcomes from the judgments support the need for a more explicit model. Four general categories are presented: student…

  11. A software tool to analyze clinical workflows from direct observations.

    PubMed

    Schweitzer, Marco; Lasierra, Nelia; Hoerbst, Alexander

    2015-01-01

    Observational data of clinical processes need to be managed in a convenient way, so that process information is reliable, valid and viable for further analysis. However, existing tools for allocating observations fail in systematic data collection of specific workflow recordings. We present a software tool which was developed to facilitate the analysis of clinical process observations. The tool was successfully used in the project OntoHealth, to build, store and analyze observations of diabetes routine consultations.

  12. Seeking high reliability in primary care: Leadership, tools, and organization.

    PubMed

    Weaver, Robert R

    2015-01-01

    Leaders in health care increasingly recognize that improving health care quality and safety requires developing an organizational culture that fosters high reliability and continuous process improvement. For various reasons, a reliability-seeking culture is lacking in most health care settings. Developing a reliability-seeking culture requires leaders' sustained commitment to reliability principles using key mechanisms to embed those principles widely in the organization. The aim of this study was to examine how key mechanisms used by a primary care practice (PCP) might foster a reliability-seeking, system-oriented organizational culture. A case study approach was used to investigate the PCP's reliability culture. The study examined four cultural artifacts used to embed reliability-seeking principles across the organization: leadership statements, decision support tools, and two organizational processes. To decipher their effects on reliability, the study relied on observations of work patterns and the tools' use, interactions during morning huddles and process improvement meetings, interviews with clinical and office staff, and a "collective mindfulness" questionnaire. The five reliability principles framed the data analysis. Leadership statements articulated principles that oriented the PCP toward a reliability-seeking culture of care. Reliability principles became embedded in the everyday discourse and actions through the use of "problem knowledge coupler" decision support tools and daily "huddles." Practitioners and staff were encouraged to report unexpected events or close calls that arose and which often initiated a formal "process change" used to adjust routines and prevent adverse events from recurring. Activities that foster reliable patient care became part of the taken-for-granted routine at the PCP. The analysis illustrates the role leadership, tools, and organizational processes play in developing and embedding a reliability-seeking culture across an organization. Progress toward a reliability-seeking, system-oriented approach to care remains ongoing, and movement in that direction requires deliberate and sustained effort by committed leaders in health care.

  13. Assessing the utility of frequency tagging for tracking memory-based reactivation of word representations.

    PubMed

    Lewis, Ashley Glen; Schriefers, Herbert; Bastiaansen, Marcel; Schoffelen, Jan-Mathijs

    2018-05-21

    Reinstatement of memory-related neural activity measured with high temporal precision potentially provides a useful index for real-time monitoring of the timing of activation of memory content during cognitive processing. The utility of such an index extends to any situation where one is interested in the (relative) timing of activation of different sources of information in memory, a paradigm case of which is tracking lexical activation during language processing. Essential for this approach is that memory reinstatement effects are robust, so that their absence (on average) definitively indicates that no lexical activation is present. We used electroencephalography to test the robustness of a reported subsequent-memory finding involving reinstatement of frequency-specific entrained oscillatory brain activity during subsequent recognition. Participants learned lists of words presented on a background flickering at either 6 or 15 Hz to entrain a steady-state brain response. Target words subsequently presented on a non-flickering background that were correctly identified as previously seen exhibited reinstatement effects at both entrainment frequencies. The reliability of these statistical inferences was, however, critically dependent on the approach used for multiple-comparisons correction. We conclude that the effects are not robust enough to be used as a reliable index of lexical activation during language processing.

  14. Reliable aluminum contact formation by electrostatic bonding

    NASA Astrophysics Data System (ADS)

    Kárpáti, T.; Pap, A. E.; Radnóczi, Gy; Beke, B.; Bársony, I.; Fürjes, P.

    2015-07-01

    The paper presents a detailed study of a reliable method developed for aluminum fusion wafer bonding assisted by the electrostatic force evolving during the anodic bonding process. The IC-compatible procedure described allows the parallel formation of electrical and mechanical contacts, facilitating reliable packaging of electromechanical systems with backside electrical contacts. This fusion bonding method supports the fabrication of complex microelectromechanical systems (MEMS) and micro-opto-electromechanical systems (MOEMS) structures with enhanced temperature stability, which is crucial in mechanical sensor applications such as pressure or force sensors. Due to the applied electrical potential of -1000 V, the Al metal layers are compressed by electrostatic force, and at the bonding temperature of 450 °C, intermetallic diffusion causes aluminum ions to migrate between the metal layers.

  15. A Bayesian modification to the Jelinski-Moranda software reliability growth model

    NASA Technical Reports Server (NTRS)

    Littlewood, B.; Sofer, A.

    1983-01-01

    The Jelinski-Moranda (JM) model for software reliability was examined. It is suggested that a major reason for the poor results given by this model is the poor performance of the maximum likelihood (ML) method of parameter estimation. A reparameterization and Bayesian analysis, involving a slight modelling change, are proposed. It is shown that this new Bayesian Jelinski-Moranda (BJM) model is mathematically quite tractable, and several metrics of interest to practitioners are obtained. The BJM and JM models are compared using several sets of real software failure data, and in all cases the BJM model gives superior reliability predictions. A change in the assumption underlying both models, to represent the debugging process more accurately, is discussed.
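
    For context, the JM model assumes N initial faults of equal size, so the hazard drops by a constant amount at each fix; the ML estimates whose poor performance motivates the Bayesian modification come from equations like those sketched below. The inter-failure times are hypothetical, and the ML equation has no finite root when the data show no reliability growth:

    ```python
    import numpy as np
    from scipy.optimize import brentq

    def jm_mle(times):
        # Jelinski-Moranda ML fit: with N initial faults each of size phi,
        # the hazard before failure i (0-indexed) is phi * (N - i), and
        # inter-failure times are independent exponentials.
        t = np.asarray(times, float)
        n = len(t)
        total = t.sum()
        weighted = float(np.sum(np.arange(n) * t))   # sum_i i * t_i

        def g(N):  # ML equation for N after eliminating phi
            return sum(1.0 / (N - i) for i in range(n)) - n * total / (N * total - weighted)

        # A finite root exists only when the data show reliability growth.
        N_hat = brentq(g, n - 1 + 1e-6, 1000.0 * n)
        phi_hat = n / (N_hat * total - weighted)
        return N_hat, phi_hat

    # Hypothetical inter-failure times (hours), lengthening as faults are fixed
    N, phi = jm_mle([7, 11, 8, 10, 15, 22, 20, 36, 34, 55])
    print(f"estimated initial faults N = {N:.1f}, per-fault rate phi = {phi:.4f}")
    ```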

  16. A Conceptual Design for a Reliable Optical Bus (ROBUS)

    NASA Technical Reports Server (NTRS)

    Miner, Paul S.; Malekpour, Mahyar; Torres, Wilfredo

    2002-01-01

    The Scalable Processor-Independent Design for Electromagnetic Resilience (SPIDER) is a new family of fault-tolerant architectures under development at NASA Langley Research Center (LaRC). The SPIDER is a general-purpose computational platform suitable for use in ultra-reliable embedded control applications. The design scales from a small configuration supporting a single aircraft function to a large distributed configuration capable of supporting several functions simultaneously. SPIDER consists of a collection of simplex processing elements communicating via a Reliable Optical Bus (ROBUS). The ROBUS is an ultra-reliable, time-division multiple access broadcast bus with strictly enforced write access (no babbling idiots) providing basic fault-tolerant services using formally verified fault-tolerance protocols including Interactive Consistency (Byzantine Agreement), Internal Clock Synchronization, and Distributed Diagnosis. The conceptual design of the ROBUS is presented in this paper including requirements, topology, protocols, and the block-level design. Verification activities, including the use of formal methods, are also discussed.

  17. [Evaluation of the reliability of freight elevator operators].

    PubMed

    Gosk, A; Borodulin-Nadzieja, L; Janocha, A; Salomon, E

    1991-01-01

    The study involved 58 workers employed at winding machines. Their reliability was estimated from the results of psychomotor precision tests, the condition of the vegetative nervous system, and the results of psychological tests. The tests were carried out in the laboratory and at the workplaces, with all distracting factors and the functional context of the work process present. We found that the reliability of the workers may be affected by a variety of factors. Among the winding machine operators, work monotony can lead to a "monotony syndrome". Among the signalists, the awareness of great responsibility can lead to unpredictable and inadequate reactions. In both groups, persons displaying lower-than-average precision were identified. All those persons demonstrated a reckless attitude, and the opinion of their superiors about them was poor. Those persons constitute a potential risk for the reliable operation of the discussed team.

  18. A new and reliable method for live imaging and quantification of reactive oxygen species in Botrytis cinerea: technological advancement.

    PubMed

    Marschall, Robert; Tudzynski, Paul

    2014-10-01

    Reactive oxygen species (ROS) are produced in conserved cellular processes either as by-products of the cellular respiration in mitochondria, or purposefully for defense mechanisms, signaling cascades or cell homeostasis. ROS have two diametrically opposed attributes due to their highly damaging potential for DNA, lipids and other molecules and due to their indispensability for signaling and developmental processes. In filamentous fungi, the role of ROS in growth and development has been studied in detail, but these analyses were often hampered by the lack of reliable and specific techniques to monitor different activities of ROS in living cells. Here, we present a new method for live cell imaging of ROS in filamentous fungi. We demonstrate that by use of a mixture of two fluorescent dyes it is possible to monitor H2O2 and superoxide specifically and simultaneously in distinct cellular structures during various hyphal differentiation processes. In addition, the method allows for reliable fluorometric quantification of ROS. We demonstrate that this can be used to characterize different mutants with respect to their ROS production/scavenging potential. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Nearest-neighbor guided evaluation of data reliability and its applications.

    PubMed

    Boongoen, Tossapon; Shen, Qiang

    2010-12-01

    The intuition of data reliability has recently been incorporated into the mainstream of research on ordered weighted averaging (OWA) operators. Instead of relying on human-guided variables, the aggregation behavior is determined in accordance with the underlying characteristics of the data being aggregated. However, data-oriented operators such as the dependent OWA (DOWA) utilize centralized data structures to generate reliable weights. Despite its simplicity, the approach taken by these operators entirely neglects any local data structure that represents a strong agreement or consensus. To address this issue, the cluster-based OWA (Clus-DOWA) operator has been proposed. It employs a cluster-based reliability measure that is effective in differentiating the accountability of different input arguments. Yet its actual application is constrained by the high computational requirement. This paper presents a more efficient nearest-neighbor-based reliability assessment for which an expensive clustering process is not required. The proposed measure can be perceived as a stress function, from which the OWA weights and associated decision-support explanations can be generated. To illustrate the potential of this measure, it is applied both to the problem of information aggregation for alias detection and to the problem of unsupervised feature selection (in which unreliable features are excluded from the actual learning process). Experimental results demonstrate that these techniques usually outperform their conventional state-of-the-art counterparts.
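
    A deliberately simplified sketch of the nearest-neighbor reliability intuition: weight each input argument by its closeness to its nearest neighbors, so outliers lacking local support contribute less. (A true OWA operator attaches weights to ordered positions, and the paper's stress-function construction differs in detail; this only illustrates the reliability-weighting idea.)

    ```python
    import numpy as np

    def nn_reliability_aggregate(values, k=2):
        # Each argument's weight grows with its closeness to its k nearest
        # neighbors, so unsupported outliers contribute less.
        x = np.asarray(values, float)
        d = np.abs(x[:, None] - x[None, :])      # pairwise distances
        np.fill_diagonal(d, np.inf)              # ignore self-distance
        nn_dist = np.sort(d, axis=1)[:, :k].mean(axis=1)
        reliability = 1.0 / (1.0 + nn_dist)      # close neighbors -> high reliability
        w = reliability / reliability.sum()
        return float(np.dot(w, x))

    scores = [0.82, 0.79, 0.85, 0.20, 0.81]      # one outlying assessment
    print(f"plain mean: {np.mean(scores):.3f}, "
          f"reliability-weighted: {nn_reliability_aggregate(scores):.3f}")
    ```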

  20. Research on Novel Algorithms for Smart Grid Reliability Assessment and Economic Dispatch

    NASA Astrophysics Data System (ADS)

    Luo, Wenjin

    In this dissertation, several methods for assessing electric power system reliability and economy are presented; more precisely, several algorithms for evaluating power system reliability and economy are studied, two novel algorithms are applied to this field, and their simulation results are compared with conventional results. As electrical power systems develop towards extra-high voltage, long transmission distances, large capacity and regional networking, new types of equipment and electricity-market mechanisms have gradually been established, and the consequences of power outages have become more and more serious. The electrical power system needs the highest possible reliability due to its complexity and security requirements. In this dissertation the Boolean logic Driven Markov Process (BDMP) method is studied and applied to evaluate power system reliability. This approach has several benefits: it allows complex dynamic models to be defined while remaining as readable as conventional methods. The method has been applied to evaluate the IEEE reliability test system, and the simulation results obtained are close to the IEEE experimental data, which means it could be used for future studies of system reliability. Besides reliability, modern power systems are expected to be more economical. This dissertation also presents a novel evolutionary algorithm, the quantum evolutionary membrane algorithm (QEPS), which combines the concepts and theory of quantum-inspired evolutionary algorithms and membrane computing, to solve the economic dispatch problem in a renewable power system with onshore and offshore wind farms. A case derived from real data is used for simulation tests, and a conventional evolutionary algorithm is also used to solve the same problem for comparison. The experimental results show that the proposed method quickly and accurately obtains the optimal solution, i.e., the minimum cost of electricity supplied by the wind farm system.

  1. High-efficiency high-reliability optical components for a large, high-average-power visible laser system

    NASA Astrophysics Data System (ADS)

    Taylor, John R.; Stolz, Christopher J.

    1993-08-01

    Laser system performance and reliability depend on the related performance and reliability of the optical components which define the cavity and transport subsystems. High average power and long transport lengths impose specific requirements on component performance. The complexity of the manufacturing process for optical components requires a high degree of process control and verification. Qualification has proven effective in ensuring confidence in the procurement process for these optical components. Issues related to component reliability have been studied and provide useful information to better understand the long-term performance and reliability of the laser system.

  2. High-efficiency high-reliability optical components for a large, high-average-power visible laser system

    NASA Astrophysics Data System (ADS)

    Taylor, J. R.; Stolz, C. J.

    1992-12-01

    Laser system performance and reliability depend on the related performance and reliability of the optical components which define the cavity and transport subsystems. High average power and long transport lengths impose specific requirements on component performance. The complexity of the manufacturing process for optical components requires a high degree of process control and verification. Qualification has proven effective in ensuring confidence in the procurement process for these optical components. Issues related to component reliability have been studied and provide useful information to better understand the long-term performance and reliability of the laser system.

  3. The multiple mini-interview for selecting medical residents: first experience in the Middle East region.

    PubMed

    Ahmed, Ashraf; Qayed, Khalil Ibrahim; Abdulrahman, Mahera; Tavares, Walter; Rosenfeld, Jack

    2014-08-01

    Numerous studies have shown that the multiple mini-interview (MMI) provides a standard, fair, and more reliable method for assessing applicants. This article presents the first MMI experience for the selection of medical residents in a Middle Eastern culture and an Arab country. In 2012, we started using the MMI in interviewing applicants to the residency program of the Dubai Health Authority. The interview process consisted of eight eight-minute structured interview scenarios. Applicants rotated through the stations, each with its own interviewer and scenario. They read the scenario and were requested to discuss the issues with the interviewers. Sociodemographic and station assessment data for each applicant were analyzed to determine whether the MMI was a reliable assessment of non-clinical attributes in the present setting of an Arab country. One hundred and eighty-seven candidates from 27 different countries were interviewed for the Dubai Residency Training Program using the MMI. They were graduates of 5 medical universities within the United Arab Emirates (UAE) and 60 different universities outside the UAE. With this applicant pool, an MMI with eight stations produced absolute and relative reliabilities of 0.80 and 0.81, respectively. The person × station interaction contributed 63% of the variance components, the person contributed 34%, and the station contributed 2%. The MMI has been used in numerous universities in English-speaking countries. The MMI evaluates non-clinical attributes, and this study provides further evidence for its reliability in a different country and culture. The MMI offers a fair and more reliable assessment of applicants to medical residency programs. The present data show that this assessment technique, applied in a non-western country and Arab culture, still produced reliable results.
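
    The reported reliabilities follow from the variance components via standard generalizability coefficients for a person × station design; the sketch below reproduces the figures from the percentages given in the abstract:

    ```python
    def g_coefficients(var_p, var_s, var_ps, n_stations):
        # Generalizability coefficients for a person x station design.
        # Relative error ignores the station main effect; absolute keeps it.
        rel = var_p / (var_p + var_ps / n_stations)
        abs_ = var_p / (var_p + (var_s + var_ps) / n_stations)
        return rel, abs_

    # Variance proportions from the abstract: person 34%, station 2%,
    # person x station interaction 63%
    rel, abs_ = g_coefficients(var_p=0.34, var_s=0.02, var_ps=0.63, n_stations=8)
    print(f"relative G = {rel:.3f}, absolute G = {abs_:.3f}")
    # ~0.81 and ~0.81, matching the reported relative 0.81 and absolute 0.8
    ```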

  4. Evaluation of hides and leather using ultrasonic technology

    USDA-ARS?s Scientific Manuscript database

    Hides are visually inspected and ranked for quality and sale price. Because visual inspection is not reliable for detecting defects when hair is present, hides cannot be effectively sorted at the earliest stage of processing. Furthermore, this subjective assessment is non-uniform among operators, ...

  5. Evaluation of coal feed systems being developed by the Energy Research and Development administration

    NASA Technical Reports Server (NTRS)

    Phen, R. L.; Luckow, W. K.; Mattson, L.; Otth, D.; Tsou, P.

    1977-01-01

    Development criteria and recommendations for coal feed system selections, including supporting data, are presented. Considered are the areas of coal feed costs, coal feed system reliability, and the interaction of the feed system with the conversion process.

  6. TCOPPE school environmental audit tool: assessing safety and walkability of school environments.

    PubMed

    Lee, Chanam; Kim, Hyung Jin; Dowdy, Diane M; Hoelscher, Deanna M; Ory, Marcia G

    2013-09-01

    Several environmental audit instruments have been developed for assessing streets, parks and trails, but none for schools. This paper introduces a school audit tool that includes 3 subcomponents: 1) street audit, 2) school site audit, and 3) map audit. It presents the conceptual basis and the development process of this instrument, and the methods and results of the reliability assessments. Reliability tests were conducted by 2 trained auditors on 12 study schools (high-low income and urban-suburban-rural settings). Kappa statistics (categorical, factual items) and ICC (Likert-scale, perceptual items) were used to assess a) interrater, b) test-retest, and c) peak vs. off-peak hour reliability tests. For the interrater reliability test, the average Kappa was 0.839 and the ICC was 0.602. For the test-retest reliability, the average Kappa was 0.903 and the ICC was 0.774. The peak-off peak reliability was 0.801. Rural schools showed the most consistent results in the peak-off peak and test-retest assessments. For interrater tests, urban schools showed the highest ICC, and rural schools showed the highest Kappa. Most items achieved moderate to high levels of reliabilities in all study schools. With proper training, this audit can be used to assess school environments reliably for research, outreach, and policy-support purposes.

  7. 76 FR 58716 - Interpretation of Transmission Planning Reliability Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-22

    ... directs NERC and Commission staff to initiate a process to identify any reliability issues, as discussed... Commission directs NERC and Commission staff to initiate a process to identify any reliability issues, as... established a process to select and certify an ERO,\\5\\ and subsequently certified NERC.\\6\\ On April 4, 2006...

  8. A critical assessment of in-flight particle state during plasma spraying of YSZ and its implications on coating properties and process reliability

    NASA Astrophysics Data System (ADS)

    Srinivasan, Vasudevan

    Air plasma spray is inherently complex due to the deviation from equilibrium conditions, its three-dimensional nature, the multitude of interrelated (controllable) parameters and (uncontrollable) variables involved, and stochastic variability at different stages. The resultant coatings are complex due to their layered, high-defect-density microstructure. Despite widespread use and commercial success for decades in the earthmoving, automotive, aerospace and power generation industries, plasma spray has not been completely understood, and prime reliance for critical applications such as thermal barrier coatings on gas turbines is yet to be accomplished. This dissertation is aimed at understanding the in-flight particle state of the plasma spray process towards designing coatings and achieving coating reliability with the aid of noncontact in-flight particle and spray stream sensors. Key issues such as the phenomenon of optimum particle injection and the definition of the spray stream using particle state are investigated. A few strategies to modify the microstructure and properties of Yttria Stabilized Zirconia coatings are examined systematically using the framework of process maps. An approach to designing a process window based on design-relevant coating properties is presented. Options to control the process for enhanced reproducibility and reliability are examined, and the resultant variability is evaluated systematically at the different stages in the process. The 3D variability due to differences in plasma characteristics has been critically examined by investigating splats collected from the entire spray footprint.

  9. Low cost MATLAB-based pulse oximeter for deployment in research and development applications.

    PubMed

    Shokouhian, M; Morling, R C S; Kale, I

    2013-01-01

    Problems such as motion artifacts and the effects of ambient light have forced developers to design different signal processing techniques and algorithms to increase the reliability and accuracy of the conventional pulse oximeter device. To evaluate the robustness of these techniques, they are applied either to recorded data or are implemented on chip to be applied to real-time data. Recorded data are the most common basis for evaluation; however, they are not as reliable as real-time measurements. On the other hand, hardware implementation can be both expensive and time consuming. This paper presents a low-cost MATLAB-based pulse oximeter that can be used for rapid evaluation of newly developed signal processing techniques and algorithms. The flexibility to apply different signal processing techniques and the provision of both processed and unprocessed data, along with low implementation cost, are the important features of this design, which make it ideal for research and development purposes, as well as commercial, hospital, and healthcare applications.
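
    For orientation, one baseline such a test bench might host is the classic ratio-of-ratios SpO2 estimate. The sketch below is in Python rather than MATLAB, uses an illustrative (not device-calibrated) linear calibration, and runs on a simulated PPG window:

    ```python
    import numpy as np

    def spo2_ratio_of_ratios(red, ir):
        # Classic ratio-of-ratios estimate: R = (AC/DC)_red / (AC/DC)_ir,
        # mapped to SpO2 with a linear calibration. The constants below
        # are illustrative only, not calibrated for any real device.
        def ac_dc(x):
            x = np.asarray(x, float)
            return x.max() - x.min(), x.mean()
        ac_r, dc_r = ac_dc(red)
        ac_ir, dc_ir = ac_dc(ir)
        R = (ac_r / dc_r) / (ac_ir / dc_ir)
        return 110.0 - 25.0 * R

    # Hypothetical one-second PPG window sampled at 100 Hz
    t = np.linspace(0, 1, 100)
    red = 1.0 + 0.02 * np.sin(2 * np.pi * 1.2 * t)
    ir = 1.0 + 0.04 * np.sin(2 * np.pi * 1.2 * t)
    print(f"SpO2 ~= {spo2_ratio_of_ratios(red, ir):.1f}%")
    ```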

  10. Reliability model generator

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C. (Inventor); McMann, Catherine M. (Inventor)

    1991-01-01

    An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
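
    A minimal sketch of the aggregation idea, assuming the architecture description reduces to a tree of series/parallel interconnections over component reliabilities (the generator described in the patent is more general):

    ```python
    from math import prod

    def aggregate(node):
        # Recursively aggregate low-level reliability models according to
        # an architecture description: leaves carry component reliabilities,
        # internal nodes declare how their children are interconnected.
        if isinstance(node, float):               # leaf: component reliability
            return node
        kind, children = node
        rs = [aggregate(c) for c in children]
        if kind == "series":                      # all children must work
            return prod(rs)
        if kind == "parallel":                    # at least one child must work
            return 1.0 - prod(1.0 - r for r in rs)
        raise ValueError(kind)

    # Hypothetical architecture: two redundant sensors feeding one processor
    system = ("series", [
        ("parallel", [0.95, 0.95]),   # redundant sensors
        0.999,                        # shared processor
    ])
    print(f"system reliability = {aggregate(system):.5f}")
    ```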

  11. Hybrid processing of laser scanning data

    NASA Astrophysics Data System (ADS)

    Badenko, Vladimir; Zotov, Dmitry; Fedotov, Alexander

    2018-03-01

    In this article, gaps in the processing of raw laser scanning data are analyzed, and results of bridging those gaps, based on the use of laser scanning data for historic building information modeling, are presented. The results of the development of a unified hybrid technology for the processing, storage, access and visualization of combined laser scanning and photography data about historical buildings are analyzed. The first application of the technology to the historical building of St. Petersburg Polytechnic University shows the reliability of the proposed approaches.

  12. PREDICTIVE UNCERTAINTY IN HYDROLOGIC AND WATER QUALITY MODELING: APPROACHES, APPLICATION TO ENVIRONMENTAL MANAGEMENT, AND FUTURE CHALLENGES (PRESENTATION)

    EPA Science Inventory

    Extant process-based hydrologic and water quality models are indispensable to water resources planning and environmental management. However, models are only approximations of real systems and often calibrated with incomplete and uncertain data. Reliable estimates, or perhaps f...

  13. System and process for pulsed multiple reaction monitoring

    DOEpatents

    Belov, Mikhail E

    2013-05-17

    A new pulsed multiple reaction monitoring process and system are disclosed that use a pulsed ion injection mode in conjunction with triple-quadrupole instruments. The pulsed injection mode approach reduces background ion noise at the detector, increases the amplitude of the ion signal, and provides a unity duty cycle, giving a significant sensitivity increase for reliable quantitation of proteins/peptides present at attomole levels in highly complex biological mixtures.

  14. Rocketdyne PSAM: In-house enhancement/application

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ohara, K.

    1991-01-01

    Development of the Probabilistic Design Analysis (PDA) process for rocket engines was initiated. This will give engineers a quantitative assessment of calculated reliability during the design process. The PDA will help choose better designs, make them more robust, and help decide on critical tests to demonstrate key reliability issues and improve confidence in the engine's capabilities. Rocketdyne's involvement with the Composite Loads Spectra (CLS) and Probabilistic Structural Analysis Methodology (PSAM) contracts started this effort, and these are key elements in the ongoing developments. Internal development efforts and hardware applications complement and extend the CLS and PSAM efforts. The completion of the CLS option work and the follow-on PSAM developments will also be integral parts of this methodology. A brief summary of these efforts is presented.

  15. Acceptance and Commitment Therapy for chronic pain: A diary study of treatment process in relation to reliable change in disability

    PubMed Central

    Vowles, Kevin E.; Fink, Brandi C.; Cohen, Lindsey L.

    2016-01-01

    In chronic pain treatment, a primary goal is reduced disability. It is often assumed that a central process by which disability reduction occurs is pain reduction. Conversely, approaches such as Acceptance and Commitment Therapy (ACT) posit that pain reduction is not necessary for reduced disability. Instead, disability reduction occurs when responses to pain are changed, such that unsuccessful struggles for pain control decrease and engagement in personally valued activities increases. Treatment outcome studies have supported ACT's effectiveness; however, less work has examined how within-treatment patterns of change relate to treatment success or failure (i.e., decreased or sustained disability). The present study, therefore, sought to examine this issue. Specifically, struggles for pain control and engagement in valued activities were recorded weekly in 21 patients who completed a four-week interdisciplinary ACT intervention for chronic pain. It was hypothesized that the presence or absence of reliable change in disability at a three-month follow-up would be predicted by within-treatment patterns of change in the weekly data. At follow-up, 47.6% of patients evidenced reliable disability reduction. The expected pattern of change occurred in 81.0% of patients: specifically, when pain control attempts decreased and engagement in valued activities increased, reliably reduced disability typically occurred, while the absence of this pattern was typically associated with a lack of reliable change. Further, changes in pain intensity, also assessed weekly, were unrelated to reliable change. Overall, these results provide additional support for the ACT model and further suggest some possible requirements for treatment success. PMID:27818931

  16. The Vanderbilt Holistic Face Processing Test: A short and reliable measure of holistic face processing

    PubMed Central

    Richler, Jennifer J.; Floyd, R. Jackie; Gauthier, Isabel

    2014-01-01

    Efforts to understand individual differences in high-level vision necessitate the development of measures that have sufficient reliability, which is generally not a concern in group studies. Holistic processing is central to research on face recognition and, more recently, to the study of individual differences in this area. However, recent work has shown that the most popular measure of holistic processing, the composite task, has low reliability. This is particularly problematic for the recent surge in interest in studying individual differences in face recognition. Here, we developed and validated a new measure of holistic face processing specifically for use in individual-differences studies. It avoids some of the pitfalls of the standard composite design and capitalizes on the idea that trial variability allows for better traction on reliability. Across four experiments, we refine this test and demonstrate its reliability. PMID:25228629

  17. Turboexpanders with dry gas seals and active magnetic bearings in hydrocarbon processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agahi, R.R.

    1999-07-01

    Since its first application in hydrocarbon processing in the early 1960s, turboexpander design has changed, evolved and improved tremendously. Today, hydrocarbon process designers use turboexpanders for almost all hydrocarbon liquid rejection and hydrocarbon dew point control for onshore and offshore installations. There are presently more than 3,000 turboexpanders operating in hydrocarbon gas processing plants worldwide. Due to the wide application of turboexpanders in hydrocarbon processing, the API-617 committee has assigned a task force to prepare an appendix to API-617 to cover design and manufacturing standards for turboexpanders. Dry gas seals (DGS) were cautiously introduced in the early 1980s for compressors used in hydrocarbon processing. It took almost a decade before dry gas seals found their application in turboexpanders. Dry gas seals were originally utilized to protect cryogenic hydrocarbon process gas from contamination by lubricating oil. Later on, dry gas seals were used to minimize hydrocarbon process gas leakage and also to provide an inert-gas-purged environment for both oil bearings and active magnetic bearings. The former eliminates the lubricating oil dilution problem, and the latter made certification of active magnetic bearings by international certifying agencies possible. Active magnetic bearings (AMB), similar to dry gas seals, were originally introduced into hydrocarbon process gas compressors in the mid 1980s. The hydrocarbon processing industry waited half a decade to adopt this innovative technology for turboexpanders in the hydrocarbon process. The first turboexpander with active magnetic bearings was installed on an offshore platform in 1991. High reliability, low capital investment, low operating costs and more compact design have accelerated demand in recent years for turboexpanders with active magnetic bearings. In this paper, the author describes the technology of turboexpanders with dry gas seals and active magnetic bearings. Several applications are presented, along with performance, reliability and availability data.

  18. 75 FR 35011 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-21

    ... Processes Manual Incorporating Proposed Revisions to the Reliability Standards Development Process. Filed..., June 21, 2010. Take notice that the Commission received the following electric reliability filings: Docket Numbers: RR10-12-000. Applicants: North American Electric Reliability Corp. Description: Petition...

  19. Reliability of system for precise cold forging

    NASA Astrophysics Data System (ADS)

    Krušič, Vid; Rodič, Tomaž

    2017-07-01

    The influence of scatter in the principal input parameters of the forging system on the dimensional accuracy of the product and on tool life for a closed-die forging process is presented in this paper. The scatter of the essential input parameters for the closed-die upsetting process was adjusted to the maximal values that still enabled reliable production of a dimensionally accurate product at optimal tool life. An operating window was created containing the maximal scatter of the principal input parameters for the closed-die upsetting process that still ensures the desired dimensional accuracy of the product and optimal tool life. Application of this adjustment of the process input parameters is shown for the example of producing the inner race of a homokinetic joint in mass production. High productivity in the manufacture of elements by cold massive extrusion is often achieved by multiple forming operations performed simultaneously on the same press. By redesigning the time sequence of forming operations in the multistage forming of a starter barrel, the course of the resultant force during the working stroke is optimized.

  20. Reliability of CGA/LGA/HDI Package Board/Assembly (Final Report)

    NASA Technical Reports Server (NTRS)

    Ghaffarian, Reza

    2014-01-01

    Package manufacturers are now offering commercial-off-the-shelf column grid array (COTS CGA) packaging technologies in high-reliability versions. Understanding the process and quality assurance (QA) indicators for reliability is important for low-risk insertion of these advanced electronics packages. The previous reports, released in January 2012 and January 2013, presented package test data, assembly information, and reliability evaluation by thermal cycling for CGA packages with 1752, 1517, 1509, and 1272 inputs/outputs (I/Os) and 1-mm pitch. They presented thermal cycling (-55 °C to either 100 °C or 125 °C) test results for up to 200 cycles. This report presents up to 500 thermal cycles with quality assurance and failure analysis evaluation, represented by optical photomicrographs, 2D real-time X-ray images, dye-and-pry photomicrographs, and optical/scanning electron microscopy (SEM) cross-sectional images. The report also presents the assembly challenges of reflowing, by either vapor phase or a rework station, CGA and land grid array (LGA) versions of three high-I/O packages in both ceramic and plastic configurations. A new test vehicle was designed with a high density interconnect (HDI) printed circuit board (PCB) with microvia-in-pad to accommodate both LGA packages and a large number of fine-pitch ball grid arrays (BGAs). The LGAs were either assembled onto the HDI PCB as LGAs or were solder-paste printed and reflowed first to form solder domes on the pads before assembly. Both plastic BGAs with 1156 I/Os and ceramic LGAs were assembled. X-ray inspection results, as well as failures after 200 thermal cycles, are also presented. Lessons learned on the assembly of ceramic LGAs are also presented.

  1. Improving reliability of a residency interview process.

    PubMed

    Peeters, Michael J; Serres, Michelle L; Gundrum, Todd E

    2013-10-14

    To improve the reliability and discrimination of a pharmacy resident interview evaluation form, and thereby improve the reliability of the interview process. In phase 1 of the study, the authors used a Many-Facet Rasch Measurement model to optimize an existing evaluation form for reliability and discrimination. In phase 2, interviewer pairs used the modified evaluation form within 4 separate interview stations. In phase 3, 8 interviewers individually evaluated each candidate in one-on-one interviews. In phase 1, the evaluation form had a reliability of 0.98 with a person separation of 6.56; the form reproducibly separated applicants into 6 distinct groups. Using that form in phases 2 and 3, our largest variation source was candidates, while content specificity was the next largest variation source. The phase 2 g-coefficient was 0.787, while the confirmatory phase 3 value was 0.922. Process reliability improved with more stations despite fewer interviewers per station; the impact of content specificity was greatly reduced with more interview stations. A more reliable, discriminating evaluation form was developed to evaluate candidates during resident interviews, and a process was designed that reduced the impact of content specificity.

  2. US Activities in Making Life Cycle Inventory Data More Available to Users

    EPA Science Inventory

    The demand for LCA studies continues to grow, although the lack of reliable, transparent Life Cycle Inventory (LCI) data is hampering the widespread application of LCA. This paper will present activities related to the development and accessibility of process LCI data in the U...

  3. Goal Structuring Notation in a Radiation Hardening Assurance Case for COTS-Based Spacecraft

    NASA Technical Reports Server (NTRS)

    Witulski, A.; Austin, R.; Evans, J.; Mahadevan, N.; Karsai, G.; Sierawski, B.; LaBel, K.; Reed, R.

    2016-01-01

    The attached presentation summarizes how mission assurance is supported by model-based representations of spacecraft systems that can define subsystem functionality and interfacing as well as reliability parameters, and it details a new paradigm for assurance: a model-centric rather than document-centric process.

  4. An Assessment of the Myers-Briggs Type Indicator

    ERIC Educational Resources Information Center

    Carlyn, Marcia

    1977-01-01

    The Myers-Briggs Type Indicator is a self-report inventory developed to measure variables in Carl Jung's personality typology. The four personality scales measured by the instrument and the scoring process are described, and an extensive review of the intercorrelation, reliability, and validity research is presented. (Author/MV)

  5. Interrater Reliability to Assure Valid Content in Peer Review of CME-Accredited Presentations

    ERIC Educational Resources Information Center

    Quigg, Mark; Lado, Fred A.

    2009-01-01

    Introduction: The Accreditation Council for Continuing Medical Education (ACCME) provides guidelines for continuing medical education (CME) materials to mitigate problems in the independence or validity of content in certified activities; however, the process of peer review of materials appears largely unstudied and the reproducibility of…

  6. Improving the Analysis of Anthocyanidins from Blueberries Using Response Surface Methodology

    USDA-ARS?s Scientific Manuscript database

    Background: Recent interest in the health promoting potential of anthocyanins points to the need for robust and reliable analytical methods. It is essential to know that the health promoting chemicals are present in juices and other products processed from whole fruit. Many different methods have be...

  7. Reliability modelling and analysis of a multi-state element based on a dynamic Bayesian network

    PubMed Central

    Xu, Tingxue; Gu, Junyuan; Dong, Qi; Fu, Linyu

    2018-01-01

    This paper presents a quantitative reliability modelling and analysis method for multi-state elements based on a combination of the Markov process and a dynamic Bayesian network (DBN), taking perfect repair, imperfect repair and condition-based maintenance (CBM) into consideration. The Markov models of elements without repair and under CBM are established, and an absorbing set is introduced to determine the reliability of the repairable element. According to the state-transition relations between the states determined by the Markov process, a DBN model is built. In addition, its parameters for series and parallel systems, namely, conditional probability tables, can be calculated by referring to the conditional degradation probabilities. The power of a control unit in a failure model is then used as an example: a dynamic fault tree (DFT) is translated into a Bayesian network model, and subsequently extended to a DBN. The results show the state probabilities of an element and of the system without repair, with perfect and imperfect repair, and under CBM; the probabilities obtained with the absorbing set are computed from the differential equations and verified. Through forward inference, the reliability value of the control unit is determined in different kinds of modes. Finally, weak nodes in the control unit are identified. PMID:29765629
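
    As a concrete illustration of the Markov step described above, the following sketch integrates the Chapman-Kolmogorov forward equations for a hypothetical three-state degrading element with an absorbing failure state; the transition rates and time horizon are assumptions for illustration, not values from the paper.

    ```python
    # Minimal sketch: a 3-state degrading element without repair.
    # States: 0 = good, 1 = degraded, 2 = failed (absorbing). Rates are assumed.
    import numpy as np
    from scipy.integrate import solve_ivp

    l01, l12, l02 = 0.5, 0.8, 0.1              # assumed transition rates (1/h)
    Q = np.array([[-(l01 + l02), l01,  l02],
                  [0.0,          -l12, l12],
                  [0.0,           0.0, 0.0]])  # row 2: absorbing failure state

    def forward_equations(t, p):
        # Chapman-Kolmogorov forward equations: dp/dt = p @ Q
        return p @ Q

    sol = solve_ivp(forward_equations, (0.0, 10.0), [1.0, 0.0, 0.0], dense_output=True)
    p5 = sol.sol(5.0)
    print("R(5 h) =", 1.0 - p5[2])             # reliability = 1 - P(absorbed)
    ```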

  8. Rigor or Reliability and Validity in Qualitative Research: Perspectives, Strategies, Reconceptualization, and Recommendations.

    PubMed

    Cypress, Brigitte S

    Issues are still raised, even now in the 21st century, by the persistent concern with achieving rigor in qualitative research. There is also a continuing debate about the analogous terms reliability and validity in naturalistic inquiries as opposed to quantitative investigations. This article presents the concept of rigor in qualitative research using a phenomenological study as an exemplar to further illustrate the process. Elaborating on epistemological and theoretical conceptualizations by Lincoln and Guba, strategies congruent with the qualitative perspective for ensuring validity to establish the credibility of the study are described. A synthesis of the historical development of validity criteria evident in the literature over the years is explored. Three recommendations are made: that the term rigor be used instead of trustworthiness, and that the concepts of reliability and validity be reconceptualized and renewed for qualitative research; that strategies for ensuring rigor be built into the qualitative research process itself rather than evaluated only after the inquiry; and that qualitative researchers and students alike be proactive and take responsibility for ensuring the rigor of a research study. The insights garnered here will move novice researchers and doctoral students to a better conceptual grasp of the complexity of reliability and validity and its ramifications for qualitative inquiry.

  9. Design of Oil-Lubricated Machine for Life and Reliability

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin V.

    2007-01-01

    In the post-World War II era, the major technology drivers for improving the life, reliability, and performance of rolling-element bearings and gears have been the jet engine and the helicopter. By the late 1950s, most of the materials used for bearings and gears in the aerospace industry had been introduced into use. By the early 1960s, the life of most steels was increased over that experienced in the early 1940s, primarily by the introduction of vacuum degassing and vacuum melting processes in the late 1950s. The development of elastohydrodynamic (EHD) theory showed that most rolling bearings and gears have a thin film separating the contacting bodies during motion and it is that film which affects their lives. Computer programs modeling bearing and gear dynamics that incorporate probabilistic life prediction methods and EHD theory enable optimization of rotating machinery based on life and reliability. With improved manufacturing and processing, the potential improvement in bearing and gear life can be as much as 80 times that attainable in the early 1950s. The work presented summarizes the use of laboratory fatigue data for bearings and gears coupled with probabilistic life prediction and EHD theories to predict the life and reliability of a commercial turboprop gearbox. The resulting predictions are compared with field data.

  10. Assessment of a condition-specific quality-of-life measure for patients with developmentally absent teeth: validity and reliability testing.

    PubMed

    Akram, A J; Ireland, A J; Postlethwaite, K C; Sandy, J R; Jerreat, A S

    2013-11-01

    This article describes the process of validity and reliability testing of a condition-specific quality-of-life measure for patients with hypodontia presenting for orthodontic treatment. The development of the instrument is described in a previous article. The study was conducted at the Royal Devon and Exeter NHS Foundation Trust and Musgrove Park Hospital, Taunton. The child perception questionnaire was used as a standard against which to test criterion validity. The Bland and Altman method was used to check agreement between the two questionnaires. Construct validity was tested using principal component analysis on the four sections of the questionnaire. Test-retest reliability was tested using the intraclass correlation coefficient and the Bland and Altman method. Cronbach's alpha was used to test internal consistency reliability. Overall, the questionnaire showed good reliability and good criterion and construct validity. This, together with previous evidence of good face and content validity, suggests that the instrument may prove useful in clinical practice and further research. This study has demonstrated that the newly developed condition-specific quality-of-life questionnaire is both valid and reliable for use in young patients with hypodontia. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.
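
    For readers unfamiliar with the internal-consistency statistic mentioned above, a minimal sketch of Cronbach's alpha follows; the respondents-by-items matrix is synthetic, not study data.

    ```python
    # Hedged sketch: Cronbach's alpha for internal-consistency reliability.
    # `scores` is a hypothetical respondents x items matrix, not study data.
    import numpy as np

    def cronbach_alpha(scores: np.ndarray) -> float:
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1).sum()   # sum of item variances
        total_var = scores.sum(axis=1).var(ddof=1)     # variance of total scores
        return (k / (k - 1)) * (1.0 - item_vars / total_var)

    rng = np.random.default_rng(0)
    base = rng.normal(size=(30, 1))                    # shared "true score"
    scores = base + 0.5 * rng.normal(size=(30, 8))     # 8 correlated items
    print(round(cronbach_alpha(scores), 2))
    ```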

  11. Accelerated stress testing of terrestrial solar cells

    NASA Technical Reports Server (NTRS)

    Prince, J. L.; Lathrop, J. W.

    1979-01-01

    A program to investigate the reliability characteristics of unencapsulated low-cost terrestrial solar cells using accelerated stress testing is described. Reliability (or parametric degradation) factors appropriate to the cell technologies and use conditions were studied and a series of accelerated stress tests was synthesized. An electrical measurement procedure and a data analysis and management system were derived, and stress test fixturing and material flow procedures were set up after consideration was given to the number of cells to be stress tested and measured and the nature of the information to be obtained from the process. Selected results and conclusions are presented.

  12. Voltage Sag due to Pollution Induced Flashover Across Ceramic Insulator Strings

    NASA Astrophysics Data System (ADS)

    Reddy B, Subba; Goswami, Arup Kumar

    2017-11-01

    Voltage sags or voltage dips are significant to industrial reliability. There is a need to characterize feeder-level power quality (PQ) and to compare PQ performance among utility companies. Contamination/pollution induced flashover is the ultimate consequence of creeping discharges across insulator strings, which induce voltage sag. These pose a severe threat to the safe and reliable operation of power systems. In the present work an attempt has been made to experimentally investigate the occurrence of voltage sags/dips during pollution induced flashovers. Results show a significant dip/sag in the voltage magnitude during the flashover process.

  13. Reliable method for determination of the velocity of a sinker in a high-pressure falling body type viscometer

    NASA Astrophysics Data System (ADS)

    Dindar, Cigdem; Kiran, Erdogan

    2002-10-01

    We present a new sensor configuration and data reduction process to improve the accuracy and reliability of determining the terminal velocity of a falling sinker in falling body type viscometers. This procedure is based on the use of multiple linear variable differential transformer sensors and precise mapping of the sensor signal and position along with the time of fall, which is then converted to distance versus fall time along the complete fall path. The method and its use in the determination of the high-pressure viscosity of n-pentane and carbon dioxide are described.
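
    To make the data-reduction step concrete: once the multi-sensor signals have been mapped to position versus fall time, the terminal velocity is simply the slope of the linear tail of that curve. The sketch below fits such a tail by least squares; the synthetic trajectory and its parameters are assumptions for illustration.

    ```python
    # Minimal sketch: estimate terminal velocity as the slope of the
    # steady-state (linear) tail of a position-vs-time record. Synthetic data.
    import numpy as np

    t = np.linspace(0.0, 2.0, 200)                  # fall time, s
    z = 0.05 * (t - (1 - np.exp(-3 * t)) / 3)       # accelerate, then fall linearly
    tail = t > 1.0                                  # keep only the steady-state tail
    v_terminal, _ = np.polyfit(t[tail], z[tail], 1)
    print(f"terminal velocity ~ {v_terminal * 1000:.2f} mm/s")
    ```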

  14. Operator adaptation to changes in system reliability under adaptable automation.

    PubMed

    Chavaillaz, Alain; Sauer, Juergen

    2017-09-01

    This experiment examined how operators coped with a change in system reliability between training and testing. Forty participants were trained for 3 h on a complex process control simulation modelling six levels of automation (LOA). In training, participants either experienced a high- (100%) or low-reliability system (50%). The impact of training experience on operator behaviour was examined during a 2.5 h testing session, in which participants either experienced a high- (100%) or low-reliability system (60%). The results showed that most operators did not often switch between LOA. Most chose an LOA that relieved them of most tasks but maintained their decision authority. Training experience did not have a strong impact on the outcome measures (e.g. performance, complacency). Low system reliability led to decreased performance and self-confidence. Furthermore, complacency was observed under high system reliability. Overall, the findings suggest benefits of adaptable automation because it accommodates different operator preferences for LOA. Practitioner Summary: The present research shows that operators can adapt to changes in system reliability between training and testing sessions. Furthermore, it provides evidence that each operator has his/her preferred automation level. Since this preference varies strongly between operators, adaptable automation seems to be suitable to accommodate these large differences.

  15. The development of data acquisition and processing application system for RF ion source

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodan; Wang, Xiaoying; Hu, Chundong; Jiang, Caichao; Xie, Yahong; Zhao, Yuanzhe

    2017-07-01

    As the key ion source component of nuclear fusion auxiliary heating devices, the radio frequency (RF) ion source is being developed and gradually applied to provide a source plasma with the advantages of ease of control and high reliability; it also readily achieves long-pulse steady-state operation. During the development and testing of the RF ion source, a large volume of raw experimental data is generated. Therefore, it is necessary to develop a stable and reliable computer data acquisition and processing application system to realize the functions of data acquisition, storage, access, and real-time monitoring. In this paper, the development of a data acquisition and processing application system for the RF ion source is presented. The hardware platform is based on the PXI system and the software is programmed in the LabVIEW development environment. The key technologies used in this software implementation include long-pulse data acquisition, multi-threaded processing, the transmission control communication protocol, and the Lempel-Ziv-Oberhumer data compression algorithm. The design has been tested and applied on the RF ion source, and the test results show that it works reliably and steadily. With its help, the stable plasma discharge data of the RF ion source are collected, stored, accessed, and monitored in real time, which demonstrates the design's practical significance for RF ion source experiments.

  16. Solid Waste Processing: An Essential Technology for the Early Phases of Mars Exploration and Colonization

    NASA Technical Reports Server (NTRS)

    Wignarajah, Kanapathipillai; Pisharody, Suresh; Fisher, John; Flynn, Michael; Kliss, Mark (Technical Monitor)

    1997-01-01

    Terraforming Mars is the long-term goal of Mars colonization. However, this is likely to be a very slow process; conservative estimates based on a synergic, technocentric approach suggest that it may take around 10,000 years before the planet's environment resembles that of Earth and humans can live in open systems. Hence, any early missions will require a closed life support system in which all wastes, both solid and liquid, are recycled, or else all consumables must be resupplied. The economics of both options are often a matter of speculation and conjecture, but some attempt is made here to evaluate the choice. If the choice is made to completely resupply consumables and eject the waste mass, a number of unknown issues arise. On the other hand, processing the wastes enables predictability and reliability of the ecosystem. Solid wastes, though smaller in volume and mass than liquid wastes, contain more than 90% of the essential elements required by humans and plants. Further, if left unprocessed they present a serious risk to human health. This paper presents the use of well-established technology for processing solid wastes, ensuring that the biogeochemical cycles of the ecosystem are maintained, that the reliability of the closed life support system is preserved, and that the processes necessary for a permanent human presence on Mars are established early.

  17. A Novel Hybrid Error Criterion-Based Active Control Method for on-Line Milling Vibration Suppression with Piezoelectric Actuators and Sensors

    PubMed Central

    Zhang, Xingwu; Wang, Chenxi; Gao, Robert X.; Yan, Ruqiang; Chen, Xuefeng; Wang, Shibin

    2016-01-01

    Milling vibration is one of the most serious factors affecting machining quality and precision. In this paper a novel hybrid error criterion-based frequency-domain LMS active control method is constructed and used for vibration suppression of milling processes by piezoelectric actuators and sensors, in which only one Fast Fourier Transform (FFT) is used and no Inverse Fast Fourier Transform (IFFT) is involved. The correction formulas are derived by a steepest descent procedure and the control parameters are analyzed and optimized. Then, a novel hybrid error criterion is constructed to improve the adaptability, reliability and anti-interference ability of the control algorithm. Finally, based on piezoelectric actuators and acceleration sensors, a simulation of a spindle and a milling process experiment are presented to verify the proposed method. In addition, a protection program is added to the control flow to enhance the reliability of the control method in applications. The simulation and experiment results indicate that the proposed method is an effective and reliable way to suppress vibration on-line, and that machining quality can be obviously improved. PMID:26751448
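
    The paper's frequency-domain, hybrid-error variant is more involved, but the core LMS update it builds on is simple. Below is a minimal time-domain LMS canceller as a stand-in illustration; the signals, filter length, and step size are assumptions, not the paper's configuration.

    ```python
    # Simplified stand-in: a plain time-domain LMS adaptive canceller.
    import numpy as np

    rng = np.random.default_rng(1)
    n, taps, mu = 4000, 16, 0.01
    x = rng.normal(size=n)                      # reference (e.g., spindle vibration)
    d = np.convolve(x, [0.6, -0.3, 0.1], mode="same") + 0.05 * rng.normal(size=n)

    w = np.zeros(taps)
    err = np.zeros(n)
    for i in range(taps, n):
        u = x[i - taps:i][::-1]                 # most recent samples first
        y = w @ u                               # filter output (counter-signal)
        err[i] = d[i] - y                       # residual vibration
        w += 2 * mu * err[i] * u                # LMS weight update (steepest descent)

    print("residual power:", np.mean(err[-500:] ** 2))
    ```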

  18. The quantity time relation in the ionizing radiations

    NASA Astrophysics Data System (ADS)

    Jordão, B. O.; Quaresma, D. S.; Peixoto, J. G. P.

    2018-03-01

    The field of metrology has taken a step forward with regard to uncertainty calculation. This mathematical tool is essential in laboratories to ensure that the values resulting from a measurement are reliable. For this to be possible, all equipment used in a measurement process must be reliable and, above all, traceable to the international metrology system. In this work we present: (i) the development and calibration of a microcontrolled timing device with a resolution of 1×10⁻⁴ s, in order to characterize the time quantity and make it reproducible; (ii) the calibration of the quartz clocks of computers used in dosimetry laboratories; and (iii) a more in-depth study of the influence of the time quantity on calibrations of instruments used in radiological protection, diagnostic radiology and radiotherapy, with measurements performed on the air-kerma quantity or its rate.

  19. A Diagnostic Marker to Discriminate Childhood Apraxia of Speech from Speech Delay: III. Theoretical Coherence of the Pause Marker with Speech Processing Deficits in Childhood Apraxia of Speech

    ERIC Educational Resources Information Center

    Shriberg, Lawrence D.; Strand, Edythe A.; Fourakis, Marios; Jakielski, Kathy J.; Hall, Sheryl D.; Karlsson, Heather B.; Mabie, Heather L.; McSweeny, Jane L.; Tilkens, Christie M.; Wilson, David L.

    2017-01-01

    Purpose: Previous articles in this supplement described rationale for and development of the pause marker (PM), a diagnostic marker of childhood apraxia of speech (CAS), and studies supporting its validity and reliability. The present article assesses the theoretical coherence of the PM with speech processing deficits in CAS. Method: PM and other…

  20. Eye Movement Analysis and Cognitive Assessment. The Use of Comparative Visual Search Tasks in a Non-immersive VR Application.

    PubMed

    Rosa, Pedro J; Gamito, Pedro; Oliveira, Jorge; Morais, Diogo; Pavlovic, Matthew; Smyth, Olivia; Maia, Inês; Gomes, Tiago

    2017-03-23

    An adequate behavioral response depends on attentional and mnesic processes. When these basic cognitive functions are impaired, non-immersive Virtual Reality Applications (VRAs) can be a reliable technique for assessing the level of impairment. However, most non-immersive VRAs use indirect measures to make inferences about visual attention and mnesic processes (e.g., time to task completion, error rate). We examined whether eye movement analysis through eye tracking (ET) can be a reliable method to probe more effectively where and how attention is deployed and how it is linked with visual working memory during comparative visual search tasks (CVSTs) in non-immersive VRAs. The eye movements of 50 healthy participants were continuously recorded while they performed CVSTs, selected from a set of cognitive tasks in the Systemic Lisbon Battery (SLB), a VRA designed to assess cognitive impairments, presented in random order. The total fixation duration, the number of visits to the areas of interest and to the interstimulus space, and the total execution time differed significantly as a function of Mini Mental State Examination (MMSE) scores. The present study demonstrates that CVSTs in the SLB, when combined with ET, can be a reliable and unobtrusive method for assessing cognitive abilities in healthy individuals, opening it to potential use in clinical samples.

  1. Knowledge-based personalized search engine for the Web-based Human Musculoskeletal System Resources (HMSR) in biomechanics.

    PubMed

    Dao, Tien Tuan; Hoang, Tuan Nha; Ta, Xuan Hien; Tho, Marie Christine Ho Ba

    2013-02-01

    Human musculoskeletal system resources of the human body are valuable for learning and medical purposes. Internet-based information from conventional search engines such as Google or Yahoo cannot respond to the need for useful, accurate, reliable and good-quality human musculoskeletal resources related to medical processes, pathological knowledge and practical expertise. In the present work, an advanced knowledge-based personalized search engine was developed. Our search engine was based on a client-server, multi-layer, multi-agent architecture and the principle of semantic web services to acquire dynamically accurate and reliable HMSR information through a semantic processing and visualization approach. A security-enhanced mechanism was applied to protect the medical information. A multi-agent crawler was implemented to develop a content-based database of HMSR information. A new semantic-based PageRank score with related mathematical formulas was also defined and implemented. As a result, semantic web service descriptions were presented in OWL, WSDL and OWL-S formats. Operational scenarios with related web-based interfaces for personal computers and mobile devices were presented and analyzed. A functional comparison between our knowledge-based search engine, a conventional search engine and a semantic search engine showed the originality and robustness of our knowledge-based personalized search engine. In fact, our knowledge-based personalized search engine allows different users, such as orthopedic patients and experts, healthcare system managers, or medical students, to access remotely useful, accurate, reliable and good-quality HMSR information for their learning and medical purposes. Copyright © 2012 Elsevier Inc. All rights reserved.
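
    The semantic-weighted score in the paper builds on the standard PageRank recursion; below is a minimal power-iteration sketch of plain PageRank on a toy link graph (the semantic weighting itself is not reproduced here).

    ```python
    # Hedged sketch: plain PageRank by power iteration on a toy graph.
    import numpy as np

    def pagerank(adj: np.ndarray, damping: float = 0.85, iters: int = 100) -> np.ndarray:
        n = adj.shape[0]
        out = adj.sum(axis=1, keepdims=True)
        out[out == 0] = 1                         # dangling pages: avoid div by zero
        M = adj / out                             # row-stochastic transition matrix
        r = np.full(n, 1.0 / n)
        for _ in range(iters):
            r = (1 - damping) / n + damping * (M.T @ r)
        return r

    adj = np.array([[0, 1, 1],
                    [1, 0, 0],
                    [0, 1, 0]], dtype=float)      # toy 3-page link graph
    print(pagerank(adj).round(3))
    ```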

  2. Optimization of Bioethanol Production Using Whole Plant of Water Hyacinth as Substrate in Simultaneous Saccharification and Fermentation Process

    PubMed Central

    Zhang, Qiuzhuo; Weng, Chen; Huang, Huiqin; Achal, Varenyam; Wang, Duanchao

    2016-01-01

    Water hyacinth was used as the substrate for bioethanol production in the present study. A combination of acid pretreatment and enzymatic hydrolysis was the most effective process for sugar production, yielding 402.93 mg of reducing sugar under optimal conditions. A regression model was built to optimize the fermentation factors according to the response surface method in the simultaneous saccharification and fermentation (SSF) process. The optimized conditions for ethanol production by the SSF process were fermentation at 38.87°C for 81.87 h with a 6.11 ml yeast inoculum, for which the model gave 1.291 g/L bioethanol. Meanwhile, 1.289 g/L ethanol was produced experimentally, demonstrating the reliability of the presented regression model. The optimization method discussed in the present study, leading to relatively high bioethanol production, could provide a promising route for Alien Invasive Species with high cellulose content. PMID:26779125
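
    As a sketch of how such a response-surface optimum can be located, the code below fits a quadratic model of yield in temperature, time, and inoculum volume by least squares and reads off the stationary point; all data are synthetic assumptions, and interaction terms are omitted for simplicity.

    ```python
    # Minimal sketch: quadratic response-surface fit and its stationary point.
    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.uniform([30, 48, 3], [45, 96, 9], size=(40, 3))     # T, t, v designs
    T, t, v = X.T
    y = 1.29 - 0.002*(T - 38.9)**2 - 0.0001*(t - 81.9)**2 - 0.01*(v - 6.1)**2 \
        + 0.005 * rng.normal(size=40)                           # assumed response

    A = np.column_stack([np.ones_like(T), T, t, v, T**2, t**2, v**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    opt = [-coef[i + 1] / (2 * coef[i + 4]) for i in range(3)]  # vertex per factor
    print("optimum (T, t, v) ~", np.round(opt, 2))
    ```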

  3. The demise of plastic encapsulated microcircuit myths

    NASA Astrophysics Data System (ADS)

    Hakim, E. B.; Agarwal, R. K.; Pecht, M.

    1994-10-01

    Production of microelectronic devices encapsulated in solid, molded plastic packages has rapidly increased since the early 1980's. Today, millions of plastic-encapsulated devices are produced daily. On the other hand, only a few million hermetic (cavity) packages are produced per year. Reasons for the increased use of plastic-encapsulated packages include cost, availability, size, weight, quality, and reliability. Markets taking advantage of this technology range from computers and telecommunications to automotive uses. Yet, several industries, the military in particular, will not accept such devices. One reason for this reluctance to use the best available commercial parts is a perceived risk of poor reliability, derived from antiquated military specifications, standards, and handbooks; other common justifications cite differing environments, inadequate screens, inadequate test data, and required government audits of suppliers' processes. This paper describes failure mechanisms associated with plastic encapsulation and their elimination. It provides data indicating the relative reliability of cavity and solid-encapsulated packaging, and presents possible approaches to assuring quality and reliability in procuring and applying this successful commercial technology.

  4. Weighted integration of short-term memory and sensory signals in the oculomotor system.

    PubMed

    Deravet, Nicolas; Blohm, Gunnar; de Xivry, Jean-Jacques Orban; Lefèvre, Philippe

    2018-05-01

    Oculomotor behaviors integrate sensory and prior information to overcome sensory-motor delays and noise. After much debate about this process, reliability-based integration has recently been proposed and several models of smooth pursuit now include recurrent Bayesian integration or Kalman filtering. However, there is a lack of behavioral evidence in humans supporting these theoretical predictions. Here, we independently manipulated the reliability of visual and prior information in a smooth pursuit task. Our results show that both smooth pursuit eye velocity and catch-up saccade amplitude were modulated by visual and prior information reliability. We interpret these findings as the continuous reliability-based integration of a short-term memory of target motion with visual information, which supports the modeling work. Furthermore, we suggest that the saccadic and pursuit systems share this short-term memory. We propose that this short-term memory of target motion is quickly built and continuously updated, and constitutes a general building block present in all sensorimotor systems.

  5. Concurrent validity and reliability of using ground reaction force and center of pressure parameters in the determination of leg movement initiation during single leg lift.

    PubMed

    Aldabe, Daniela; de Castro, Marcelo Peduzzi; Milosavljevic, Stephan; Bussey, Melanie Dawn

    2016-09-01

    Evaluations of postural adjustment during single leg lift require identification of the initiation of heel lift (T1). T1 measured by means of a motion analysis system is the most reliable approach. However, this method involves considerable workspace, expensive cameras, and substantial time for processing data and setting up the laboratory. The use of ground reaction force (GRF) and centre of pressure (COP) data is an alternative method, as its data processing and setup are less time consuming. Further, kinetic data are normally collected at sample frequencies of 1000 Hz or higher, whereas kinematic data are commonly captured at 50-200 Hz. This study describes the concurrent validity and reliability of GRF and COP measurements in determining T1, using a motion analysis system as the reference standard. Kinematic and kinetic data during single leg lift were collected from ten participants. GRF and COP data were collected using one and two force plates. Displacement of a single heel marker was captured by means of ten Vicon cameras. Kinetic and kinematic data were collected at a sample frequency of 1000 Hz. Data were analysed in two stages: identification of key events in the kinetic data, and assessment of the concurrent validity of T1 based on the chosen key events against T1 provided by the kinematic data. The key event presenting the least systematic bias, along with a narrow 95% CI and limits of agreement against the reference standard T1, was the Baseline COPy event. The Baseline COPy event was obtained using one force plate and presented excellent between-tester reliability. Copyright © 2016 Elsevier B.V. All rights reserved.
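
    A baseline-threshold rule like the "Baseline COPy" event can be sketched simply: flag T1 when the COP y-coordinate leaves a quiet-stance band of mean plus or minus k standard deviations. The signal and the choice k = 3 below are assumptions for illustration, not the paper's exact criterion.

    ```python
    # Hedged sketch: onset (T1) detection from a COP baseline band. Synthetic data.
    import numpy as np

    def detect_t1(cop_y, fs, baseline_s=0.5, k=3.0):
        n0 = int(baseline_s * fs)
        mu, sd = cop_y[:n0].mean(), cop_y[:n0].std(ddof=1)
        beyond = np.abs(cop_y - mu) > k * sd
        beyond[:n0] = False              # search only after the quiet-stance window
        return np.argmax(beyond) / fs if beyond.any() else None

    fs = 1000
    rng = np.random.default_rng(3)
    cop_y = np.concatenate([rng.normal(0.0, 0.2, 500),      # 0.5 s quiet stance
                            np.linspace(0.0, 15.0, 1000)])  # anticipatory COP shift
    print("T1 =", detect_t1(cop_y, fs), "s")
    ```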

  6. Improving Reliability of a Residency Interview Process

    PubMed Central

    Serres, Michelle L.; Gundrum, Todd E.

    2013-01-01

    Objective. To improve the reliability and discrimination of a pharmacy resident interview evaluation form, and thereby improve the reliability of the interview process. Methods. In phase 1 of the study, the authors used a Many-Facet Rasch Measurement model to optimize an existing evaluation form for reliability and discrimination. In phase 2, interviewer pairs used the modified evaluation form within 4 separate interview stations. In phase 3, 8 interviewers individually evaluated each candidate in one-on-one interviews. Results. In phase 1, the evaluation form had a reliability of 0.98 with a person separation of 6.56; reproducibly, the form separated applicants into 6 distinct groups. Using that form in phases 2 and 3, the largest variation source was candidates, while content specificity was the next largest variation source. The phase 2 g-coefficient was 0.787, while the confirmatory phase 3 g-coefficient was 0.922. Process reliability improved with more stations despite fewer interviewers per station; the impact of content specificity was greatly reduced with more interview stations. Conclusion. A more reliable, discriminating evaluation form was developed to evaluate candidates during resident interviews, and a process was designed that reduced the impact of content specificity. PMID:24159209

  7. Risk perception and information processing: the development and validation of a questionnaire to assess self-reported information processing.

    PubMed

    Smerecnik, Chris M R; Mesters, Ilse; Candel, Math J J M; De Vries, Hein; De Vries, Nanne K

    2012-01-01

    The role of information processing in understanding people's responses to risk information has recently received substantial attention. One limitation of this research concerns the unavailability of a validated questionnaire of information processing. This article presents two studies in which we describe the development and validation of the Information-Processing Questionnaire to meet that need. Study 1 describes the development and initial validation of the questionnaire. Participants were randomized to either a systematic processing or a heuristic processing condition, after which they completed a manipulation check and the initial 15-item questionnaire, and did so again two weeks later. The questionnaire was subjected to factor, reliability, and validity analyses at both measurement times for purposes of cross-validating the results. A two-factor solution was observed, representing a systematic processing and a heuristic processing subscale. The resulting scale showed good reliability and validity, with the systematic condition scoring significantly higher on the systematic subscale and the heuristic processing condition significantly higher on the heuristic subscale. Study 2 sought to further validate the questionnaire in a field study. Results of the second study corresponded with those of Study 1 and provided further evidence of the validity of the Information-Processing Questionnaire. The availability of this information-processing scale will be a valuable asset for future research and may provide researchers with new research opportunities. © 2011 Society for Risk Analysis.

  8. The Reliability Estimation for the Open Function of Cabin Door Affected by the Imprecise Judgment Corresponding to Distribution Hypothesis

    NASA Astrophysics Data System (ADS)

    Yu, Z. P.; Yue, Z. F.; Liu, W.

    2018-05-01

    With the development of artificial intelligence, more and more reliability experts have noticed the role of subjective information in the reliability design of complex systems. Therefore, based on a certain number of experimental data points and expert judgments, we have divided reliability estimation under a distribution hypothesis into a cognition process and a reliability calculation. As an illustration of this modification, we have taken information fusion based on intuitionistic fuzzy belief functions as the diagnosis model of the cognition process, and completed the reliability estimation for the opening function of a cabin door affected by imprecise judgment corresponding to the distribution hypothesis.

  9. Performance characterization of water recovery and water quality from chemical/organic waste products

    NASA Technical Reports Server (NTRS)

    Moses, W. M.; Rogers, T. D.; Chowdhury, H.; Cullingford, H. S.

    1989-01-01

    The water reclamation subsystems currently being evaluated for Space Station Freedom are briefly reviewed with emphasis on a waste water management system capable of processing wastes containing high concentrations of organic/inorganic materials. The process combines low temperature/pressure to vaporize water with high temperature catalytic oxidation to decompose volatile organics. The reclaimed water is of potable quality and has high potential for maintenance under sterile conditions. Results from preliminary experiments and modifications in process and equipment required to control reliability and repeatability of system operation are presented.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roger Lew; Ronald L. Boring; Thomas A. Ulrich

    Operators of critical processes, such as nuclear power production, must contend with highly complex systems, procedures, and regulations. Developing human-machine interfaces (HMIs) that better support operators is a high priority for ensuring the safe and reliable operation of critical processes. Human factors engineering (HFE) provides a rich and mature set of tools for evaluating the performance of HMIs, but the set of tools for developing and designing HMIs is still in its infancy. Here we propose that Microsoft Windows Presentation Foundation (WPF) is well suited for many roles in the research and development of HMIs for process control.

  11. RELIABILITY STUDY OF THE U.S. EPA'S METHODS 101A - DETERMINATION OF PARTICULATE AND GASEOUS MERCURY EMISSIONS

    EPA Science Inventory

    EPA Method 101A applies to the determination of particulate and gaseous mercury emissions from sewage sludge incinerators and other sources. Concern has been expressed that ammonia or hydrogen chloride (HCl), when present in the emissions, interferes in the analytical processes and p...

  12. Accelerated life testing effects on CMOS microcircuit characteristics

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Modifications and additions to the present process of making CMOS microcircuits which are designed to provide protective layers on the chip to guard against moisture and contaminants were investigated. High and low temperature Si3N4 protective layers were tested on the CMOS microcircuits and no conclusive improvements in device reliability characteristics were evidenced.

  13. Market study phase 2 follow-up activity. The Baylor Mark 3 Haploscope

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Efforts to accelerate commercialization of the haploscope, and to determine quickly and reliably the level of manufacturer interest in the product are presented. The nature of the decision making process within firms as it concerns project selection and new product evaluation is discussed. Implications for the NASA marketing program were assessed.

  14. Online Meta-data Collection and Monitoring Framework for the STAR Experiment at RHIC

    NASA Astrophysics Data System (ADS)

    Arkhipkin, D.; Lauret, J.; Betts, W.; Van Buren, G.

    2012-12-01

    The STAR Experiment further exploits scalable message-oriented model principles to achieve a high level of control over online data streams. In this paper we present an AMQP-powered Message Interface and Reliable Architecture framework (MIRA), which allows STAR to orchestrate the activities of Meta-data Collection, Monitoring, Online QA and several Run-Time and Data Acquisition system components in a very efficient manner. The very nature of the reliable message bus suggests parallel usage of multiple independent storage mechanisms for our meta-data. We describe our experience with a robust data-taking setup employing MySQL- and HyperTable-based archivers for meta-data processing. In addition, MIRA has an AJAX-enabled web GUI, which allows real-time visualisation of online process flow and detector subsystem states, and doubles as a sophisticated alarm system when combined with complex event processing engines like Esper, Borealis or Cayuga. The performance data and our planned path forward are based on our experience during the 2011-2012 running of STAR.
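
    As a rough illustration of the message-oriented pattern described above, the sketch below publishes a single meta-data sample to an AMQP topic exchange using the third-party pika client; the exchange name, routing key, and payload are hypothetical, not MIRA's actual schema.

    ```python
    # Hedged sketch: publish one meta-data sample over AMQP with pika.
    # Exchange/routing names are hypothetical, assuming a local broker.
    import json
    import pika

    connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
    channel = connection.channel()
    channel.exchange_declare(exchange="star.metadata", exchange_type="topic")

    sample = {"detector": "tpc", "quantity": "anode_voltage", "value": 1390.5}
    channel.basic_publish(
        exchange="star.metadata",
        routing_key="tpc.anode_voltage",      # subscribers filter by topic pattern
        body=json.dumps(sample),
    )
    connection.close()
    ```

    Because the broker decouples producers from consumers, several independent archivers (as with the MySQL- and HyperTable-based ones described above) can subscribe to the same stream without coordinating with each other.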

  15. An Investigation of the Reliability and Self-Regulatory Correlates of Conflict Adaptation.

    PubMed

    Feldman, Julia L; Freitas, Antonio L

    2016-07-01

    The study of the conflict-adaptation effect, in which encountering information-processing conflict attenuates the disruptive influence of information-processing conflicts encountered subsequently, is a burgeoning area of research. The present study investigated associations among performance measures on a Stroop-trajectory task (measuring Stroop interference and conflict adaptation), on a Wisconsin Card Sorting Task (WCST; measuring cognitive flexibility), and on self-reported measures of self-regulation (including impulsivity and tenacity). We found significant reliability of the conflict-adaptation effects across a two-week period, for response-time and accuracy. Variability in conflict adaptation was not associated significantly with any indicators of performance on the WCST or with most of the self-reported self-regulation measures. There was substantial covariance between Stroop interference for accuracy and conflict adaptation for accuracy. The lack of evidence of covariance across distinct aspects of cognitive control (conflict adaptation, WCST performance, self-reported self-control) may reflect the operation of relatively independent component processes.

  16. Optimum random and age replacement policies for customer-demand multi-state system reliability under imperfect maintenance

    NASA Astrophysics Data System (ADS)

    Chen, Yen-Luan; Chang, Chin-Chih; Sheu, Dwan-Fang

    2016-04-01

    This paper proposes the generalised random and age replacement policies for a multi-state system composed of multi-state elements. The degradation of the multi-state element is assumed to follow the non-homogeneous continuous time Markov process which is a continuous time and discrete state process. A recursive approach is presented to efficiently compute the time-dependent state probability distribution of the multi-state element. The state and performance distribution of the entire multi-state system is evaluated via the combination of the stochastic process and the Lz-transform method. The concept of customer-centred reliability measure is developed based on the system performance and the customer demand. We develop the random and age replacement policies for an aging multi-state system subject to imperfect maintenance in a failure (or unacceptable) state. For each policy, the optimum replacement schedule which minimises the mean cost rate is derived analytically and discussed numerically.
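
    The classical trade-off underlying such age replacement policies can be made concrete: a preventive replacement at age T costs c_p, a failure before T costs c_f > c_p, and the mean cost rate C(T) = [c_p R(T) + c_f F(T)] / integral_0^T R(t) dt is minimized over T. A minimal numerical sketch follows, assuming a Weibull lifetime; all parameters are illustrative, and the paper's multi-state, imperfect-maintenance setting is not reproduced.

    ```python
    # Minimal sketch: optimal age replacement for an assumed Weibull lifetime.
    import numpy as np

    c_p, c_f, shape, scale = 1.0, 5.0, 2.5, 100.0   # assumed costs and lifetime

    def cost_rate(T):
        t = np.linspace(0.0, T, 2000)
        R = np.exp(-(t / scale) ** shape)           # Weibull reliability R(t)
        cycle_cost = c_p * R[-1] + c_f * (1.0 - R[-1])
        cycle_length = np.sum(np.diff(t) * (R[:-1] + R[1:]) / 2.0)  # trapezoid of R
        return cycle_cost / cycle_length            # mean cost per unit time

    Ts = np.linspace(10.0, 300.0, 500)
    best = Ts[np.argmin([cost_rate(T) for T in Ts])]
    print(f"optimal replacement age ~ {best:.1f} h")
    ```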

  17. Using the virtual reality device Oculus Rift for neuropsychological assessment of visual processing capabilities

    PubMed Central

    Foerster, Rebecca M.; Poth, Christian H.; Behler, Christian; Botsch, Mario; Schneider, Werner X.

    2016-01-01

    Neuropsychological assessment of human visual processing capabilities strongly depends on visual testing conditions, including room lighting, stimuli, and viewing distance. This limits standardization, threatens reliability, and prevents the assessment of core visual functions such as visual processing speed. Increasingly available virtual reality devices make it possible to address these problems. One such device is the portable, light-weight, and easy-to-use Oculus Rift. It is head-mounted and covers the entire visual field, thereby shielding and standardizing the visual stimulation. A fundamental prerequisite for using Oculus Rift in neuropsychological assessment is sufficient test-retest reliability. Here, we compare the test-retest reliabilities of Bundesen's visual processing components (visual processing speed, threshold of conscious perception, capacity of visual working memory) as measured with Oculus Rift and with a standard CRT computer screen. Our results show that Oculus Rift allows the processing components to be measured as reliably as the standard CRT. This means that Oculus Rift is applicable for standardized and reliable assessment and diagnosis of elementary cognitive functions in laboratory and clinical settings. Oculus Rift thus provides the opportunity to compare visual processing components between individuals and institutions and to establish statistical norm distributions. PMID:27869220

  18. Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth

    2016-01-01

    If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.

  19. KNIME4NGS: a comprehensive toolbox for next generation sequencing analysis.

    PubMed

    Hastreiter, Maximilian; Jeske, Tim; Hoser, Jonathan; Kluge, Michael; Ahomaa, Kaarin; Friedl, Marie-Sophie; Kopetzky, Sebastian J; Quell, Jan-Dominik; Mewes, H Werner; Küffner, Robert

    2017-05-15

    Analysis of Next Generation Sequencing (NGS) data requires the processing of large datasets by chaining various tools with complex input and output formats. In order to automate data analysis, we propose to standardize NGS tasks into modular workflows. This simplifies reliable handling and processing of NGS data, and corresponding solutions become substantially more reproducible and easier to maintain. Here, we present a documented, Linux-based toolbox of 42 processing modules that are combined to construct workflows facilitating a variety of tasks such as DNAseq and RNAseq analysis. We also describe important technical extensions. The high throughput executor (HTE) helps to increase the reliability and to reduce manual interventions when processing complex datasets. We also provide a dedicated binary manager that assists users in obtaining the modules' executables and keeping them up to date. As the basis for this actively developed toolbox we use the workflow management software KNIME. See http://ibisngs.github.io/knime4ngs for nodes and user manual (GPLv3 license). robert.kueffner@helmholtz-muenchen.de. Supplementary data are available at Bioinformatics online.

  1. Advances in Thin Film Thermocouple Durability Under High Temperature and Pressure Testing Conditions

    NASA Technical Reports Server (NTRS)

    Martin, Lisa C.; Fralick, Gustave C.; Taylor, Keith F.

    1999-01-01

    Thin film thermocouples for measuring material surface temperature have been previously demonstrated on several material systems and in various hostile test environments. A well-developed thin film fabrication procedure utilizing shadow masking for patterning the sensor elements had produced thin films with sufficient durability for applications in high temperature and pressure environments that exist in air-breathing and hydrogen-fueled burner rig and engine test facilities. However, while shadow masking had been a reliable method for specimens with flat and gently curved surfaces, it had not been consistently reliable for use on test components with sharp contours. This work reports on the feasibility of utilizing photolithography processing for patterning thin film thermocouples. Because this patterning process required changes in the thin film deposition process from that developed for shadow masking, the effect of these changes on thin film adherence during burner rig testing was evaluated. In addition to the results of changing the patterning method, the effects on thin film adherence of other processes used in the thin film fabrication procedure are also presented.

  2. Anticipatory versus reactive spatial attentional bias to threat.

    PubMed

    Gladwin, Thomas E; Möbius, Martin; McLoughlin, Shane; Tyndall, Ian

    2018-05-10

    Dot-probe or visual probe tasks (VPTs) are used extensively to measure attentional biases. A novel variant termed the cued VPT (cVPT) was developed to focus on the anticipatory component of attentional bias. This study aimed to establish an anticipatory attentional bias to threat using the cVPT and compare its split-half reliability with a typical dot-probe task. A total of 120 students performed the cVPT task and dot-probe tasks. Essentially, the cVPT uses cues that predict the location of pictorial threatening stimuli, but on trials on which probe stimuli are presented the pictures do not appear. Hence, actual presentation of emotional stimuli did not affect responses. The reliability of the cVPT was higher at most cue-stimulus intervals and was .56 overall. A clear anticipatory attentional bias was found. In conclusion, the cVPT may be of methodological and theoretical interest. Using visually neutral predictive cues may remove sources of noise that negatively impact reliability. Predictive cues are able to bias response selection, suggesting a role of predicted outcomes in automatic processes. © 2018 The British Psychological Society.
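
    Split-half reliabilities like the .56 reported above are typically obtained by correlating bias scores from two trial halves and applying the Spearman-Brown correction; the sketch below illustrates this on synthetic data (all numbers are assumptions, not study data).

    ```python
    # Hedged sketch: split-half reliability with Spearman-Brown correction.
    import numpy as np

    rng = np.random.default_rng(4)
    true_bias = rng.normal(20, 10, size=120)          # per-participant bias (ms)
    odd = true_bias + rng.normal(0, 12, size=120)     # odd-trial estimate
    even = true_bias + rng.normal(0, 12, size=120)    # even-trial estimate

    r = np.corrcoef(odd, even)[0, 1]
    split_half = 2 * r / (1 + r)                      # Spearman-Brown correction
    print(f"r = {r:.2f}, corrected split-half = {split_half:.2f}")
    ```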

  3. A Z-number-based decision making procedure with ranking fuzzy numbers method

    NASA Astrophysics Data System (ADS)

    Mohamad, Daud; Shaharani, Saidatull Akma; Kamis, Nor Hanimah

    2014-12-01

    The theory of fuzzy sets has been in the limelight of various applications in decision making problems due to its usefulness in portraying human perception and subjectivity. Generally, the evaluation in the decision making process is represented in the form of linguistic terms and the calculation is performed using fuzzy numbers. In 2011, Zadeh extended this concept by presenting the idea of the Z-number, a 2-tuple of fuzzy numbers that describes the restriction and the reliability of an evaluation. The element of reliability in the evaluation is essential, as it affects the final result. Since this concept can still be considered new, available methods that incorporate reliability for solving decision making problems are still scarce. In this paper, a decision making procedure based on Z-numbers is proposed. Due to the limitation of their basic properties, Z-numbers are first transformed into fuzzy numbers for simpler calculations. A method of ranking fuzzy numbers is then used to prioritize the alternatives. A risk analysis problem is presented to illustrate the effectiveness of the proposed procedure.
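
    One widely cited way to perform the Z-number-to-fuzzy-number transformation mentioned above is, to our reading, the method of Kang et al. (2012): defuzzify the reliability part B into a weight alpha, then scale the support of the restriction part A by sqrt(alpha). A minimal sketch for triangular fuzzy numbers follows; the example values are assumptions, and the paper's own conversion may differ in detail.

    ```python
    # Hedged sketch of a Z-number to classical fuzzy number conversion
    # (after Kang et al., 2012), assuming triangular fuzzy numbers.
    import math

    def znumber_to_fuzzy(A, B):
        """A, B: triangular fuzzy numbers (a1, a2, a3); B lives on a [0, 1] scale."""
        alpha = sum(B) / 3.0                  # centroid defuzzification of B
        return tuple(math.sqrt(alpha) * a for a in A)

    A = (0.4, 0.5, 0.6)                       # restriction: "about 0.5"
    B = (0.7, 0.8, 0.9)                       # reliability: "very sure"
    print(znumber_to_fuzzy(A, B))             # weighted classical fuzzy number
    ```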

  4. A Protocol for Advanced Psychometric Assessment of Surveys

    PubMed Central

    Squires, Janet E.; Hayduk, Leslie; Hutchinson, Alison M.; Cranley, Lisa A.; Gierl, Mark; Cummings, Greta G.; Norton, Peter G.; Estabrooks, Carole A.

    2013-01-01

    Background and Purpose. In this paper, we present a protocol for advanced psychometric assessments of surveys based on the Standards for Educational and Psychological Testing. We use the Alberta Context Tool (ACT) as an exemplar survey to which this protocol can be applied. Methods. Data mapping, acceptability, reliability, and validity are addressed. Acceptability is assessed with missing data frequencies and the time required to complete the survey. Reliability is assessed with internal consistency coefficients and information functions. A unitary approach to validity consisting of accumulating evidence based on instrument content, response processes, internal structure, and relations to other variables is taken. We also address assessing performance of survey data when aggregated to higher levels (e.g., nursing unit). Discussion. In this paper we present a protocol for advanced psychometric assessment of survey data using the Alberta Context Tool (ACT) as an exemplar survey; application of the protocol to the ACT survey is underway. Psychometric assessment of any survey is essential to obtaining reliable and valid research findings. This protocol can be adapted for use with any nursing survey. PMID:23401759

  5. Multiscale Microstructures and Microstructural Effects on the Reliability of Microbumps in Three-Dimensional Integration

    PubMed Central

    Huang, Zhiheng; Xiong, Hua; Wu, Zhiyong; Conway, Paul; Altmann, Frank

    2013-01-01

    The dimensions of microbumps in three-dimensional integration reach microscopic scales and thus necessitate a study of the multiscale microstructures in microbumps. Here, we present simulated mesoscale and atomic-scale microstructures of microbumps using phase field and phase field crystal models. Coupled microstructure, mechanical stress, and electromigration modeling was performed to highlight the microstructural effects on the reliability of microbumps. The results suggest that the size and geometry of microbumps can influence both the mesoscale and atomic-scale microstructural formation during solidification. An external stress imposed on the microbump can cause ordered phase growth along the boundaries of the microbump. Mesoscale microstructures formed in the microbumps from solidification, solid state phase separation, and coarsening processes suggest that the microstructures in smaller microbumps are more heterogeneous. Due to the differences in microstructures, the von Mises stress distributions in microbumps of different sizes and geometries vary. In addition, a combined effect resulting from the connectivity of the phase morphology and the amount of interface present in the mesoscale microstructure can influence the electromigration reliability of microbumps. PMID:28788356

  6. Reliability and validity evidence of the Assessment of Language Use in Social Contexts for Adults (ALUSCA).

    PubMed

    Valente, Ana Rita S; Hall, Andreia; Alvelos, Helena; Leahy, Margaret; Jesus, Luis M T

    2018-04-12

    The appropriate use of language in context depends on the speaker's pragmatic language competencies. A coding system was used to develop a specific, adult-focused, self-administered questionnaire for adults who stutter and adults who do not stutter, The Assessment of Language Use in Social Contexts for Adults, with three categories: precursors, basic exchanges, and extended literal/non-literal discourse. This paper presents the content validity, item analysis, reliability coefficients and evidence of construct validity of the instrument. Content validity analysis was based on a two-stage process: first, 11 pragmatic questionnaires were assessed to identify items that probe each pragmatic competency and to create the first version of the instrument; second, items were assessed qualitatively by an expert panel composed of adults who stutter and controls, and quantitatively and qualitatively by an expert panel composed of clinicians. A pilot study was conducted with five adults who stutter and five controls to analyse items and calculate reliability. Construct validity evidence was obtained using the hypothesized relationships method and factor analysis with 28 adults who stutter and 28 controls. Concerning content validity, the questionnaires assessed up to 13 pragmatic competencies. Qualitative and quantitative analysis revealed ambiguities in item construction. Disagreements between experts were resolved through item modification. The pilot study showed that the instrument presented internal consistency and temporal stability. Significant differences between adults who stutter and controls, and their different response profiles, revealed the instrument's underlying construct. The instrument is reliable and presents evidence of construct validity.

  7. Modified personal interviews: resurrecting reliable personal interviews for admissions?

    PubMed

    Hanson, Mark D; Kulasegaram, Kulamakan Mahan; Woods, Nicole N; Fechtig, Lindsey; Anderson, Geoff

    2012-10-01

    Traditional admissions personal interviews provide flexible faculty-student interactions but are plagued by low inter-interview reliability. Axelson and Kreiter (2009) retrospectively showed that multiple independent sampling (MIS) may improve reliability of personal interviews; thus, the authors incorporated MIS into the admissions process for medical students applying to the University of Toronto's Leadership Education and Development Program (LEAD). They examined the reliability and resource demands of this modified personal interview (MPI) format. In 2010-2011, LEAD candidates submitted written applications, which were used to screen for participation in the MPI process. Selected candidates completed four brief (10-12 minutes) independent MPIs, each with a different interviewer. The authors blueprinted MPI questions to (i.e., aligned them with) leadership attributes, and interviewers assessed candidates' eligibility on a five-point Likert-type scale. The authors analyzed inter-interview reliability using generalizability theory. Sixteen candidates submitted applications; 10 proceeded to the MPI stage. Reliability of the written application components was 0.75. The MPI process had overall inter-interview reliability of 0.79. Correlation between the written application and MPI scores was 0.49. A decision study showed acceptable reliability of 0.74 with only three MPIs scored using one global rating. Furthermore, a traditional admissions interview format would take 66% more time than the MPI format. The MPI format, used during the LEAD admissions process, achieved high reliability with minimal faculty resources. The MPI format's reliability and effective resource use were possible through MIS and employment of expert interviewers. MPIs may be useful for other admissions tasks.

  8. Speech perception in autism spectrum disorder: An activation likelihood estimation meta-analysis.

    PubMed

    Tryfon, Ana; Foster, Nicholas E V; Sharda, Megha; Hyde, Krista L

    2018-02-15

    Autism spectrum disorder (ASD) is often characterized by atypical language profiles and auditory and speech processing. These can contribute to aberrant language and social communication skills in ASD. The study of the neural basis of speech perception in ASD can serve as a potential neurobiological marker of ASD early on, but mixed results across studies render it difficult to find a reliable neural characterization of speech processing in ASD. To this aim, the present study examined the functional neural basis of speech perception in ASD versus typical development (TD) using an activation likelihood estimation (ALE) meta-analysis of 18 qualifying studies. The present study included separate analyses for TD and ASD, which allowed us to examine patterns of within-group brain activation as well as both common and distinct patterns of brain activation across the ASD and TD groups. Overall, ASD and TD showed mostly common brain activation of speech processing in bilateral superior temporal gyrus (STG) and left inferior frontal gyrus (IFG). However, the results revealed trends for some distinct activation in the TD group showing additional activation in higher-order brain areas including left superior frontal gyrus (SFG), left medial frontal gyrus (MFG), and right IFG. These results provide a more reliable neural characterization of speech processing in ASD relative to previous single neuroimaging studies and motivate future work to investigate how these brain signatures relate to behavioral measures of speech processing in ASD. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Understanding the reliability of solder joints used in advanced structural and electronics applications: Part 1 - Filler metal properties and the soldering process

    DOE PAGES

    Vianco, Paul T.

    2017-02-01

    Soldering technology has made tremendous strides in the past half-century. Whether structural or electronic, all solder joints must provide a level of reliability that is required by the application. This Part 1 report examines the effects of filler metal properties and soldering process on joint reliability. Solder alloy composition must have the appropriate melting and mechanical properties that suit the product's assembly process(es) and use environment. The filler metal must also optimize solderability (wetting-and-spreading) to realize the proper joint geometry. Here, the soldering process also affects joint reliability. The choice of flux and thermal profile support the solderability performance of the molten filler metal to successfully fill the gap and complete the fillet.

  10. The role of water vapor in climate. A strategic research plan for the proposed GEWEX water vapor project (GVaP)

    NASA Technical Reports Server (NTRS)

    Starr, D. OC. (Editor); Melfi, S. Harvey (Editor)

    1991-01-01

    The proposed GEWEX Water Vapor Project (GVaP) addresses fundamental deficiencies in the present understanding of moist atmospheric processes and the role of water vapor in the global hydrologic cycle and climate. Inadequate knowledge of the distribution of atmospheric water vapor and its transport is a major impediment to progress in achieving a fuller understanding of various hydrologic processes and a capability for reliable assessment of potential climatic change on global and regional scales. GVaP will promote significant improvements in knowledge of atmospheric water vapor and moist processes as well as in present capabilities to model these processes on global and regional scales. GVaP complements a number of ongoing and planned programs focused on various aspects of the hydrologic cycle. The goal of GVaP is to improve understanding of the role of water vapor in meteorological, hydrological, and climatological processes through improved knowledge of water vapor and its variability on all scales. A detailed description of the GVaP is presented.

  11. Bridge reliability assessment based on the PDF of long-term monitored extreme strains

    NASA Astrophysics Data System (ADS)

    Jiao, Meiju; Sun, Limin

    2011-04-01

    Structural health monitoring (SHM) systems can provide valuable information for the evaluation of bridge performance. With the development and implementation of SHM technology in recent years, the mining and use of monitoring data have received increasing attention and interest in civil engineering. Based on the principles of probability and statistics, a reliability approach provides a rational basis for analyzing the randomness in loads and their effects on structures. This paper presents a novel approach that combines SHM data with reliability methods to evaluate the reliability of a cable-stayed bridge instrumented with an SHM system. In this study, the reliability of the steel girder of the cable-stayed bridge is expressed directly as a failure probability rather than as the commonly used reliability index. Under the assumption that the probability distribution of the resistance is independent of the responses of the structure, a formulation of the failure probability is deduced. Then, as a main factor in the formulation, the probability density function (PDF) of the strain at sensor locations is evaluated and verified from the monitoring data. The Donghai Bridge is taken as an example of the application of the proposed approach. In the case study, four years of monitoring data collected since the SHM system entered operation are processed, and the reliability assessment results are discussed. Finally, the sensitivity and accuracy of the novel approach are discussed in comparison with the first-order reliability method (FORM).
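
    Under the independence assumption stated in the abstract, the failure probability takes the familiar convolution form P_f = P(R <= S) = integral of F_R(s) f_S(s) ds, where f_S is the monitored extreme-strain PDF and F_R is the resistance CDF. A minimal numerical sketch follows, with illustrative stand-in distributions rather than the paper's monitored PDF.

        # Hedged sketch of the failure-probability formulation:
        #     P_f = P(R <= S) = integral of F_R(s) * f_S(s) ds
        # The lognormal resistance and Gaussian strain PDF are illustrative
        # stand-ins for the paper's monitoring-data-based PDF.
        import numpy as np
        from scipy import stats

        x = np.linspace(0.0, 1500.0, 20001)                # strain axis (microstrain)
        f_S = stats.norm(loc=600, scale=80).pdf(x)         # monitored extreme-strain PDF
        F_R = stats.lognorm(s=0.1, scale=1000).cdf(x)      # resistance CDF (assumed)

        p_f = np.trapz(F_R * f_S, x)
        print(f"failure probability ~ {p_f:.2e}")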

  12. Rationality versus reality: the challenges of evidence-based decision making for health policy makers

    PubMed Central

    2010-01-01

    Background Current healthcare systems have extended the evidence-based medicine (EBM) approach to health policy and delivery decisions, such as access-to-care, healthcare funding and health program continuance, through attempts to integrate valid and reliable evidence into the decision making process. These policy decisions have major impacts on society and have high personal and financial costs associated with those decisions. Decision models such as these function under a shared assumption of rational choice and utility maximization in the decision-making process. Discussion We contend that health policy decision makers are generally unable to attain the basic goals of evidence-based decision making (EBDM) and evidence-based policy making (EBPM) because humans make decisions with their naturally limited, faulty, and biased decision-making processes. A cognitive information processing framework is presented to support this argument, and subtle cognitive processing mechanisms are introduced to support the focal thesis: health policy makers' decisions are influenced by the subjective manner in which they individually process decision-relevant information rather than by the objective merits of the evidence alone. As such, subsequent health policy decisions do not necessarily achieve the goals of evidence-based policy making, such as maximizing health outcomes for society based on valid and reliable research evidence. Summary In this era of increasing adoption of evidence-based healthcare models, the rational-choice, utility-maximizing assumptions in EBDM and EBPM must be critically evaluated to ensure effective and high-quality health policy decisions. The cognitive information processing framework presented here will aid health policy decision makers by identifying how their decisions might be subtly influenced by non-rational factors. In this paper, we identify some of the biases and potential intervention points and provide some initial suggestions about how the EBDM/EBPM process can be improved. PMID:20504357

  13. Rationality versus reality: the challenges of evidence-based decision making for health policy makers.

    PubMed

    McCaughey, Deirdre; Bruning, Nealia S

    2010-05-26

    Current healthcare systems have extended the evidence-based medicine (EBM) approach to health policy and delivery decisions, such as access-to-care, healthcare funding and health program continuance, through attempts to integrate valid and reliable evidence into the decision making process. These policy decisions have major impacts on society and have high personal and financial costs associated with those decisions. Decision models such as these function under a shared assumption of rational choice and utility maximization in the decision-making process. We contend that health policy decision makers are generally unable to attain the basic goals of evidence-based decision making (EBDM) and evidence-based policy making (EBPM) because humans make decisions with their naturally limited, faulty, and biased decision-making processes. A cognitive information processing framework is presented to support this argument, and subtle cognitive processing mechanisms are introduced to support the focal thesis: health policy makers' decisions are influenced by the subjective manner in which they individually process decision-relevant information rather than by the objective merits of the evidence alone. As such, subsequent health policy decisions do not necessarily achieve the goals of evidence-based policy making, such as maximizing health outcomes for society based on valid and reliable research evidence. In this era of increasing adoption of evidence-based healthcare models, the rational-choice, utility-maximizing assumptions in EBDM and EBPM must be critically evaluated to ensure effective and high-quality health policy decisions. The cognitive information processing framework presented here will aid health policy decision makers by identifying how their decisions might be subtly influenced by non-rational factors. In this paper, we identify some of the biases and potential intervention points and provide some initial suggestions about how the EBDM/EBPM process can be improved.

  14. Reliability theory for repair service organization simulation and increase of innovative attraction of industrial enterprises

    NASA Astrophysics Data System (ADS)

    Dolzhenkova, E. V.; Iurieva, L. V.

    2018-05-01

    The study presents the authors' algorithm for simulating the organization of an industrial enterprise's repair service on the basis of reliability theory, together with the results of its application. Monitoring of the repair service organization is proposed to be performed on the basis of the enterprise's state indexes for the main resources (equipment, labour, finances, repair areas), which allows the reliability level to be evaluated quantitatively as a summary rating of these parameters and ensures an appropriate level of operational reliability of the serviced technical objects. Under conditions of tough competition, the following holds: the higher the efficiency of production and of the repair service itself, the higher the innovative attractiveness of an industrial enterprise. The results of the calculations show that, in order to prevent inefficient production losses and to reduce repair costs, it is advisable to apply reliability theory. The overall reliability rating calculated on the basis of the authors' algorithm has low values. Processing of the statistical data yields reliability characteristics for the different workshops and services of an industrial enterprise, which makes it possible to define the failure rates of the various units of equipment and to establish the reliability indexes necessary for subsequent mathematical simulation. The proposed simulation algorithm contributes to increasing the efficiency of the repair service organization and improving the innovative attractiveness of an industrial enterprise.

  15. Wafer level reliability for high-performance VLSI design

    NASA Technical Reports Server (NTRS)

    Root, Bryan J.; Seefeldt, James D.

    1987-01-01

    As very large scale integration architectures require higher package density, the reliability of these devices has approached a critical level. Previous processing techniques allowed a large window for varying reliability. However, as scaling and higher current densities push reliability to its limit, tighter control and instant feedback become critical. Several test structures developed to monitor reliability at the wafer level are described. For example, one test structure monitors metal integrity in seconds, as opposed to the weeks or months required for conventional testing. Another structure monitors mobile ion contamination at critical steps in the process. Thus reliability jeopardy can be assessed during fabrication, preventing defective devices from ever being placed in the field. Most importantly, reliability can be assessed on each wafer, as opposed to an occasional sample.

  16. [Immunocytochemical demonstration of astrocytes in brain sections combined with Nissl staining].

    PubMed

    Korzhevskiĭ, D E; Otellin, V A

    2004-01-01

    The aim of the present study was to develop an easy and reliable protocol for combined staining of preparations that would unite the advantages of immunocytochemical demonstration of astrocytes with the ability to evaluate the functional state of neurons provided by the Nissl technique. The presented protocol for processing paraffin sections retains high quality of tissue structure and provides selective demonstration of astrocytes, using monoclonal antibodies against glial fibrillary acidic protein, together with contrast Nissl staining of cells. The protocol can be used without modification for processing brain sections obtained from humans and other mammals, with the exception of mice and rabbits.

  17. Problems and Processes in Medical Encounters: The CASES method of dialogue analysis

    PubMed Central

    Laws, M. Barton; Taubin, Tatiana; Bezreh, Tanya; Lee, Yoojin; Beach, Mary Catherine; Wilson, Ira B.

    2013-01-01

    Objective To develop methods to reliably capture structural and dynamic temporal features of clinical interactions. Methods Observational study of 50 audio-recorded routine outpatient visits to HIV specialty clinics, using innovative analytic methods. The Comprehensive Analysis of the Structure of Encounters System (CASES) uses transcripts coded for speech acts, then imposes larger-scale structural elements: threads – the problems or issues addressed – and processes within threads – the basic tasks of clinical care, labeled Presentation, Information, Resolution (decision making), and Engagement (interpersonal exchange). Threads are also coded for the nature of their resolution. Results 61% of utterances are in presentation processes. Provider verbal dominance is greatest in information and resolution processes, which also contain a high proportion of provider directives. About half of threads result in no action or decision. Information flows predominantly from patient to provider in presentation processes, and from provider to patient in information processes. Engagement is rare. Conclusions In these data, resolution is provider-centered; more time for patient participation in resolution, or for interpersonal engagement, would have to come from presentation. Practice Implications Awareness of the use of time in clinical encounters, and of the interaction processes associated with various tasks, may help make clinical communication more efficient and effective. PMID:23391684

  18. Problems and processes in medical encounters: the cases method of dialogue analysis.

    PubMed

    Laws, M Barton; Taubin, Tatiana; Bezreh, Tanya; Lee, Yoojin; Beach, Mary Catherine; Wilson, Ira B

    2013-05-01

    To develop methods to reliably capture structural and dynamic temporal features of clinical interactions. Observational study of 50 audio-recorded routine outpatient visits to HIV specialty clinics, using innovative analytic methods. The comprehensive analysis of the structure of encounters system (CASES) uses transcripts coded for speech acts, then imposes larger-scale structural elements: threads--the problems or issues addressed--and processes within threads--the basic tasks of clinical care, labeled presentation, information, resolution (decision making), and engagement (interpersonal exchange). Threads are also coded for the nature of their resolution. 61% of utterances are in presentation processes. Provider verbal dominance is greatest in information and resolution processes, which also contain a high proportion of provider directives. About half of threads result in no action or decision. Information flows predominantly from patient to provider in presentation processes, and from provider to patient in information processes. Engagement is rare. In these data, resolution is provider-centered; more time for patient participation in resolution, or for interpersonal engagement, would have to come from presentation. Awareness of the use of time in clinical encounters, and of the interaction processes associated with various tasks, may help make clinical communication more efficient and effective. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  19. Space Shuttle Propulsion System Reliability

    NASA Technical Reports Server (NTRS)

    Welzyn, Ken; VanHooser, Katherine; Moore, Dennis; Wood, David

    2011-01-01

    This session includes the following presentations: (1) External Tank (ET) System Reliability and Lessons, (2) Space Shuttle Main Engine (SSME) Reliability Validated by a Million Seconds of Testing, (3) Reusable Solid Rocket Motor (RSRM) Reliability via Process Control, and (4) Solid Rocket Booster (SRB) Reliability via Acceptance and Testing.

  20. Reliability and validity of the revised Gibson Test of Cognitive Skills, a computer-based test battery for assessing cognition across the lifespan.

    PubMed

    Moore, Amy Lawson; Miller, Terissa M

    2018-01-01

    The purpose of the current study is to evaluate the validity and reliability of the revised Gibson Test of Cognitive Skills, a computer-based battery of tests measuring short-term memory, long-term memory, processing speed, logic and reasoning, visual processing, as well as auditory processing and word attack skills. This study included 2,737 participants aged 5-85 years. A series of studies was conducted to examine the validity and reliability using the test performance of the entire norming group and several subgroups. The evaluation of the technical properties of the test battery included content validation by subject matter experts, item analysis and coefficient alpha, test-retest reliability, split-half reliability, and analysis of concurrent validity with the Woodcock Johnson III Tests of Cognitive Abilities and Tests of Achievement. Results indicated strong sources of evidence of validity and reliability for the test, including internal consistency reliability coefficients ranging from 0.87 to 0.98, test-retest reliability coefficients ranging from 0.69 to 0.91, split-half reliability coefficients ranging from 0.87 to 0.91, and concurrent validity coefficients ranging from 0.53 to 0.93. The Gibson Test of Cognitive Skills-2 is a reliable and valid tool for assessing cognition in the general population across the lifespan.
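
    For readers unfamiliar with the reported statistics, the sketch below computes two of them, Cronbach's alpha (internal consistency) and a test-retest correlation, on simulated item scores; the data are random and purely illustrative, not the norming sample.

        # Hedged sketch of two of the reported reliability statistics on
        # simulated data: Cronbach's alpha and a test-retest Pearson r.
        import numpy as np

        rng = np.random.default_rng(0)
        # 200 examinees x 12 items: shared ability factor plus item noise
        items = rng.normal(size=(200, 1)) + 0.6 * rng.normal(size=(200, 12))

        def cronbach_alpha(scores):
            k = scores.shape[1]
            item_vars = scores.var(axis=0, ddof=1).sum()
            total_var = scores.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        test = items.sum(axis=1)
        retest = test + rng.normal(scale=test.std() * 0.4, size=test.size)

        print("alpha      :", round(cronbach_alpha(items), 2))
        print("test-retest:", round(np.corrcoef(test, retest)[0, 1], 2))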

  1. An application of characteristic function in order to predict reliability and lifetime of aeronautical hardware

    NASA Astrophysics Data System (ADS)

    Żurek, Józef; Kaleta, Ryszard; Zieja, Mariusz

    2016-06-01

    Forecasting the reliability and life of aeronautical hardware requires recognition of the many and various destructive processes that deteriorate its health/maintenance status. The aging of the technical components of an aircraft, as an armament system, is of outstanding significance to the reliability and safety of the whole system. The aging process is usually induced by many different factors, including mechanical, biological, climatic, and chemical ones. Aging is an irreversible process and considerably reduces the reliability and lifetime of aeronautical equipment. Application of the characteristic function of the aging process is suggested to predict the reliability and lifetime of aeronautical hardware. An increment in the values of diagnostic parameters is introduced; then, using the characteristic function and after some rearrangements, a partial differential equation is formulated. An analytical expression for the characteristic function of the aging process is a solution to this equation. With the inverse transformation applied, the density function of the aging of aeronautical hardware is found. Having found the density function, one can determine the equipment's reliability and lifetime. Data collected in service or delivered by life tests are used to attain this goal. The coefficients in this relationship are found using the likelihood function.

  2. An application of characteristic function in order to predict reliability and lifetime of aeronautical hardware

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Żurek, Józef; Kaleta, Ryszard; Zieja, Mariusz

    2016-06-08

    Forecasting the reliability and life of aeronautical hardware requires recognition of the many and various destructive processes that deteriorate its health/maintenance status. The aging of the technical components of an aircraft, as an armament system, is of outstanding significance to the reliability and safety of the whole system. The aging process is usually induced by many different factors, including mechanical, biological, climatic, and chemical ones. Aging is an irreversible process and considerably reduces the reliability and lifetime of aeronautical equipment. Application of the characteristic function of the aging process is suggested to predict the reliability and lifetime of aeronautical hardware. An increment in the values of diagnostic parameters is introduced; then, using the characteristic function and after some rearrangements, a partial differential equation is formulated. An analytical expression for the characteristic function of the aging process is a solution to this equation. With the inverse transformation applied, the density function of the aging of aeronautical hardware is found. Having found the density function, one can determine the equipment's reliability and lifetime. Data collected in service or delivered by life tests are used to attain this goal. The coefficients in this relationship are found using the likelihood function.

  3. Probabilistic structural mechanics research for parallel processing computers

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Martin, William R.

    1991-01-01

    Aerospace structures and spacecraft are a complex assemblage of structural components that are subjected to a variety of complex, cyclic, and transient loading conditions. Significant modeling uncertainties are present in these structures, in addition to the inherent randomness of material properties and loads. To properly account for these uncertainties in evaluating and assessing the reliability of these components and structures, probabilistic structural mechanics (PSM) procedures must be used. Much research has focused on basic theory development and the development of approximate analytic solution methods in random vibrations and structural reliability. Practical application of PSM methods has been hampered by their computationally intensive nature. Solution of PSM problems requires repeated analyses of structures that are often large and exhibit nonlinear and/or dynamic response behavior. These methods are all inherently parallel and ideally suited to implementation on parallel processing computers. New hardware architectures and innovative control software and solution methodologies are needed to make solution of large scale PSM problems practical.
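
    The "inherently parallel" character of these methods comes from the independence of Monte Carlo samples: each sampled realization of loads and material properties can be analyzed on a separate processor. A minimal sketch follows, with a toy limit-state function g = R - S standing in for a real structural analysis; all distributions are illustrative.

        # Hedged sketch of parallel Monte Carlo reliability estimation.
        # The limit-state g = strength - load is a toy stand-in for a
        # repeated finite-element analysis.
        import numpy as np
        from multiprocessing import Pool

        def failures_in_batch(seed, n=100_000):
            rng = np.random.default_rng(seed)
            strength = rng.lognormal(mean=np.log(300), sigma=0.08, size=n)
            load = rng.normal(loc=200, scale=30, size=n)
            return int(np.sum(strength - load <= 0.0))   # g <= 0 means failure

        if __name__ == "__main__":
            with Pool(4) as pool:                                # 4 parallel workers
                counts = pool.map(failures_in_batch, range(8))   # 8 independent batches
            print("P_f ~", sum(counts) / (8 * 100_000))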

  4. Flow measurements in sewers based on image analysis: automatic flow velocity algorithm.

    PubMed

    Jeanbourquin, D; Sage, D; Nguyen, L; Schaeli, B; Kayal, S; Barry, D A; Rossi, L

    2011-01-01

    Discharges of combined sewer overflows (CSOs) and stormwater are recognized as an important source of environmental contamination. However, the harsh sewer environment and the particular hydraulic conditions during rain events reduce the reliability of traditional flow measurement probes. An in situ system for sewer water flow monitoring based on video images was evaluated. Algorithms to determine water velocities were developed based on image-processing techniques. The image-based water velocity algorithm identifies surface features and measures their positions with respect to real-world coordinates. A web-based user interface and a three-tier system architecture enable remote configuration of the cameras and of the image-processing algorithms in order to calculate flow velocity automatically on-line. Results of investigations conducted in a CSO are presented. The system was found to measure water velocities reliably, thereby providing the means to understand particular hydraulic behaviors.
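
    The general idea of image-based surface velocimetry can be sketched as follows: locate a surface-feature patch in the next frame by cross-correlation, then convert the pixel displacement to a velocity using a pixel-to-metre scale and the frame interval. This is a generic PIV-style illustration, not the paper's actual algorithm.

        # Hedged sketch of image-based surface velocimetry: cross-correlate
        # consecutive frames, take the correlation peak offset as the pixel
        # displacement, and scale by metres-per-pixel over frame interval.
        import numpy as np
        from scipy.signal import correlate2d

        def patch_velocity(frame1, frame2, scale_m_per_px, dt_s):
            f1 = frame1 - frame1.mean()
            f2 = frame2 - frame2.mean()
            corr = correlate2d(f2, f1, mode="same")
            dy, dx = np.unravel_index(corr.argmax(), corr.shape)
            dy -= frame1.shape[0] // 2        # offset of the peak from zero shift
            dx -= frame1.shape[1] // 2
            return dx * scale_m_per_px / dt_s, dy * scale_m_per_px / dt_s

        # toy frames: a bright blob displaced by 3 pixels between frames
        a = np.zeros((32, 32)); a[10:14, 10:14] = 1.0
        b = np.roll(a, 3, axis=1)
        print(patch_velocity(a, b, scale_m_per_px=0.01, dt_s=0.1))   # ~0.3 m/s in x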

  5. Indoor and outdoor characterization of the HIRL prototype: An innovative highly integrated receiverless LCPV concept using multijunction cells

    NASA Astrophysics Data System (ADS)

    Weick, Clément; De Betelu, Romain; Tauzin, Aurélie; Baudrit, Mathieu

    2017-09-01

    Concentrator photovoltaic (CPV) modules are composed of many components and interfaces that require complex assembly processes, resulting in fabrication complexity and often a lack of reliability. The present work addresses these issues by proposing an innovative low concentration photovoltaic (LCPV) concept. In particular, the purpose here is to develop a module with a high level of integration by lowering the number of components and interfaces. The mirror used as the concentrator optic is multifunctional, combining thermal, structural, and optical functions. Moreover, the proposed design aims to demonstrate the applicability of reliable flat PV processes (such as lamination and cell interconnection) to the manufacturing of this LCPV module. The paper describes both indoor and outdoor characterization of a new prototype. Performance, assessed by I-V curve tracing, is discussed with respect to the distribution of losses within the optical chain.

  6. eLaunch Hypersonics: An Advanced Launch System

    NASA Technical Reports Server (NTRS)

    Starr, Stanley

    2010-01-01

    This presentation describes a new space launch system that NASA can and should develop. This approach can significantly reduce ground processing and launch costs, improve reliability, and broaden the scope of what we do in near-Earth orbit. The concept (not new) is to launch a re-usable air-breathing hypersonic vehicle from a ground-based electric track. This vehicle launches a final rocket stage at high altitude/velocity for the final leg to orbit. The proposal here differs from past studies in that launch occurs above Mach 1.5 (above the transonic pinch point), which further improves the efficiency of air-breathing, horizontal take-off launch systems. The approach described here significantly reduces cost per kilogram to orbit, increases the safety and reliability of the boost systems, and reduces ground costs due to horizontal processing. Finally, this approach provides significant technology transfer benefits for our national infrastructure.

  7. User's guide to the Reliability Estimation System Testbed (REST)

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
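
    The modularization idea described above, computing reliability for each component separately and then combining the modules into a total system reliability, can be sketched with simple series/parallel algebra. RML itself is far richer (message passing, failure-mode effects simulation), so this only illustrates the combining step.

        # Hedged sketch of combining per-component reliability modules into a
        # total system reliability, using basic series/parallel algebra.
        from functools import reduce

        def series(*rs):                       # all modules must survive
            return reduce(lambda a, b: a * b, rs)

        def parallel(*rs):                     # system survives if any module does
            return 1.0 - reduce(lambda a, b: a * b, [1.0 - r for r in rs])

        sensor, cpu_a, cpu_b, actuator = 0.99, 0.95, 0.95, 0.98
        system = series(sensor, parallel(cpu_a, cpu_b), actuator)   # redundant CPUs
        print(f"system reliability = {system:.4f}")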

  8. 76 FR 16277 - System Restoration Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-23

    ... system restoration process. The Commission also approves the NERC's proposal to retire four existing EOP... prepare personnel to enable effective coordination of the system restoration process. The Commission also..., through the Reliability Standard development process, a modification to EOP-005-1 that identifies time...

  9. Trends in high performance compressors for petrochemical and natural gas industry in China

    NASA Astrophysics Data System (ADS)

    Zhao, Yuanyang; Li, Liansheng

    2015-08-01

    Compressors are key equipment in petrochemical and natural gas industry systems, and their performance and reliability are very important to the process system. The application status of petrochemical and natural gas compressors in China is presented in this paper, along with the present status of compressor design and operating technologies in China. Turbo, reciprocating, and twin-screw compressors are discussed. Market demands for compressors of different structures in the process gas industries are analysed. The paper also introduces research and development on high performance compressors in China; recent research results on efficiency improvement methods, stability improvement, online monitoring, and fault diagnosis are presented in detail.

  10. Overall Key Performance Indicator to Optimizing Operation of High-Pressure Homogenizers for a Reliable Quantification of Intracellular Components in Pichia pastoris.

    PubMed

    Garcia-Ortega, Xavier; Reyes, Cecilia; Montesinos, José Luis; Valero, Francisco

    2015-01-01

    The most commonly used cell disruption procedures can lack reproducibility, which introduces significant errors into the quantification of intracellular components. In this work, an approach consisting of the definition of an overall key performance indicator (KPI) was implemented for a lab-scale high-pressure homogenizer (HPH) in order to determine the disruption settings that allow reliable quantification of a wide range of intracellular components. This innovative KPI was based on the combination of three independent reporting indicators: decrease of absorbance, release of total protein, and release of alkaline phosphatase activity. The yeast Pichia pastoris growing on methanol was selected as the model microorganism because it presents substantial thickening of the cell wall, requiring more severe methods and operating conditions than Escherichia coli and Saccharomyces cerevisiae. According to the reporting indicators, the cell disruption efficiency achieved using the HPH was about fourfold higher than that of other standard laboratory cell disruption methodologies, such as bead milling and cell permeabilization. The approach was also applied to a pilot-plant-scale HPH, validating the methodology in a scale-up of the disruption process. This innovative, non-complex approach to evaluating the efficacy of a disruption procedure or piece of equipment can easily be applied to optimize the most common disruption processes, in order to achieve not only reliable quantification but also recovery of intracellular components from cell factories of interest.
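
    The abstract does not give the rule for combining the three reporting indicators, so the sketch below simply normalizes each indicator against an assumed reference value and averages them with equal weights; both the reference values and the weights are assumptions, and the paper's actual combination rule may differ.

        # Hedged sketch of an overall KPI built from the three reporting
        # indicators named in the abstract. Reference values and equal
        # weighting are assumptions, not the paper's rule.
        def overall_kpi(abs_decrease, protein_release, alkp_activity,
                        refs=(0.90, 1.0, 1.0), weights=(1/3, 1/3, 1/3)):
            """Each indicator is expressed as a fraction of its reference value."""
            indicators = (abs_decrease, protein_release, alkp_activity)
            scores = [min(v / r, 1.0) for v, r in zip(indicators, refs)]
            return sum(w * s for w, s in zip(weights, scores))

        # e.g. one homogenizer pass at a given pressure (illustrative values)
        print(f"KPI = {overall_kpi(0.62, 0.71, 0.55):.2f}")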

  11. Overall Key Performance Indicator to Optimizing Operation of High-Pressure Homogenizers for a Reliable Quantification of Intracellular Components in Pichia pastoris

    PubMed Central

    Garcia-Ortega, Xavier; Reyes, Cecilia; Montesinos, José Luis; Valero, Francisco

    2015-01-01

    The most commonly used cell disruption procedures can lack reproducibility, which introduces significant errors into the quantification of intracellular components. In this work, an approach consisting of the definition of an overall key performance indicator (KPI) was implemented for a lab-scale high-pressure homogenizer (HPH) in order to determine the disruption settings that allow reliable quantification of a wide range of intracellular components. This innovative KPI was based on the combination of three independent reporting indicators: decrease of absorbance, release of total protein, and release of alkaline phosphatase activity. The yeast Pichia pastoris growing on methanol was selected as the model microorganism because it presents substantial thickening of the cell wall, requiring more severe methods and operating conditions than Escherichia coli and Saccharomyces cerevisiae. According to the reporting indicators, the cell disruption efficiency achieved using the HPH was about fourfold higher than that of other standard laboratory cell disruption methodologies, such as bead milling and cell permeabilization. The approach was also applied to a pilot-plant-scale HPH, validating the methodology in a scale-up of the disruption process. This innovative, non-complex approach to evaluating the efficacy of a disruption procedure or piece of equipment can easily be applied to optimize the most common disruption processes, in order to achieve not only reliable quantification but also recovery of intracellular components from cell factories of interest. PMID:26284241

  12. The evolution of automated launch processing

    NASA Technical Reports Server (NTRS)

    Tomayko, James E.

    1988-01-01

    The NASA Launch Processing System (LPS) described here has arrived at satisfactory solutions to the distributed-computing, user-interface, dissimilar-hardware-interface, and automation-related problems that emerge in the specific arena of spacecraft launch preparations. An aggressive effort was made to apply to the LPS the lessons learned in the 1960s, during the first attempts at automatic launch vehicle checkout. As the Space Shuttle system continues to evolve, the primary contributor to safety and reliability will be the LPS.

  13. Automatic non-destructive system for quality assurance of welded elements in the aircraft industry

    NASA Astrophysics Data System (ADS)

    Chady, Tomasz; Waszczuk, Paweł; Szydłowski, Michał; Szwagiel, Mariusz

    2018-04-01

    Flaws that may result from the welding process have to be detected in order to assure the high quality, and thus reliability, of elements used in the aircraft industry. Currently, the inspection stage is conducted manually by a qualified workforce, and there are no commercially available systems that could support or replace humans in the flaw detection process. In this paper, the authors present a novel non-destructive system developed for quality assurance of welded elements used in the aircraft industry.

  14. Development of a Scale to Measure Academic Capital in High-Risk College Students

    ERIC Educational Resources Information Center

    Winkler, Christa; Sriram, Rishi

    2015-01-01

    This study presents a psychometric instrument that measures academic capital in college students. Academic capital is a set of social processes that aid students in acquiring the knowledge and support necessary to access and navigate higher education. This study establishes the validity and reliability of the Academic Capital Scale. In addition to…

  15. Video Analysis of Mother-Child Interactions: Does the Role of Experience Affect the Accuracy and Reliability of Clinical Observations?

    ERIC Educational Resources Information Center

    Choo, Dawn; Dettman, Shani J.

    2016-01-01

    During the pre- and post-implant habilitation process, mothers of children using cochlear implants may be coached by clinicians to use appropriate communicative strategies during play according to the family's choice of communication approach. The present study compared observations made by experienced and inexperienced individuals in the analysis…

  16. Improving the Financial Aid Delivery Process and the Federal Family Education Loan Program: Program Recommendations.

    ERIC Educational Resources Information Center

    Coalition for Student Loan Reform, Washington, DC.

    This publication presents a set of eight recommended reforms and improvements for delivering financial aid to postsecondary students especially the Federal Family Education Loan Program (FFELP). The recommendations are: (1) make applying for student aid simpler for students; (2) assure the continued availability of a dependable, reliable source of…

  17. Mastering Overdetection and Underdetection in Learner-Answer Processing: Simple Techniques for Analysis and Diagnosis

    ERIC Educational Resources Information Center

    Blanchard, Alexia; Kraif, Olivier; Ponton, Claude

    2009-01-01

    This paper presents a "didactic triangulation" strategy to cope with the problem of reliability of NLP applications for computer-assisted language learning (CALL) systems. It is based on the implementation of basic but well mastered NLP techniques and puts the emphasis on an adapted gearing between computable linguistic clues and didactic features…

  18. Effects of Secondary Task Modality and Processing Code on Automation Trust and Utilization During Simulated Airline Luggage Screening

    NASA Technical Reports Server (NTRS)

    Phillips, Rachel; Madhavan, Poornima

    2010-01-01

    The purpose of this research was to examine the impact of environmental distractions on human trust in and utilization of automation during visual search. Participants performed a computer-simulated airline luggage screening task with the assistance of a 70% reliable automated decision aid (called DETECTOR), both with and without environmental distractions. The distraction was implemented as a secondary task in either a competing modality (visual) or a non-competing modality (auditory). The secondary task's processing code competed either with the luggage screening task (spatial code) or with the automation's textual directives (verbal code). We measured participants' system trust, perceived reliability of the system (when a target weapon was present and absent), compliance, reliance, and confidence when agreeing and disagreeing with the system, under both distracted and undistracted conditions. Results revealed that system trust was lower in the visual-spatial and auditory-verbal conditions than in the visual-verbal and auditory-spatial conditions. Perceived reliability of the system (when the target was present) was significantly higher when the secondary task was visual rather than auditory. Compliance with the aid increased in all conditions except the auditory-verbal condition, where it decreased. Similar to the pattern for trust, reliance on the automation was lower in the visual-spatial and auditory-verbal conditions than in the visual-verbal and auditory-spatial conditions. Confidence when agreeing with the system decreased with the addition of any kind of distraction; however, confidence when disagreeing increased with the addition of an auditory secondary task but decreased with the addition of a visual task. A model was developed to represent these findings and demonstrate the relationship between secondary task modality, processing code, and automation use. The results suggest that the nature of environmental distractions influences interaction with automation through significant effects on trust and system utilization. These findings have implications for both automation design and operator training.

  19. Functional description of signal processing in the Rogue GPS receiver

    NASA Technical Reports Server (NTRS)

    Thomas, J. B.

    1988-01-01

    Over the past year, two Rogue GPS prototype receivers have been assembled and successfully subjected to a variety of laboratory and field tests. A functional description is presented of signal processing in the Rogue receiver, tracing the signal from RF input to the output values of group delay, phase, and data bits. The receiver can track up to eight satellites, without time multiplexing among satellites or channels, simultaneously measuring both group delay and phase for each of three channels (L1-C/A, L1-P, L2-P). The Rogue signal processing described requires generation of the code for all three channels. Receiver functional design, which emphasized accuracy, reliability, flexibility, and dynamic capability, is summarized. A detailed functional description of signal processing is presented, including C/A-channel and P-channel processing, carrier-aided averaging of group delays, checks for cycle slips, acquisition, and distinctive features.
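
    The "carrier-aided averaging of group delays" mentioned above is commonly realized as a Hatch-style smoothing filter, in which the noisy group-delay (pseudorange) measurement is averaged using the much less noisy epoch-to-epoch change in carrier phase; the Rogue's actual implementation may differ in detail.

        # Hedged sketch of carrier-aided averaging of group delays in the
        # standard Hatch-filter form; not necessarily the Rogue's algorithm.
        def hatch_filter(group_delays, carrier_phases, window=100):
            smoothed = [group_delays[0]]
            for k in range(1, len(group_delays)):
                n = min(k + 1, window)
                # propagate the previous estimate with the smooth phase change
                predicted = smoothed[-1] + (carrier_phases[k] - carrier_phases[k - 1])
                smoothed.append(group_delays[k] / n + predicted * (n - 1) / n)
            return smoothed

        # usage: ranges in metres, carrier phase already scaled to metres
        rho = [20000000.0 + 3.0 * ((-1) ** k) for k in range(10)]   # noisy delays
        phi = [20000000.0 + 0.01 * k for k in range(10)]            # smooth phase
        print(hatch_filter(rho, phi)[-1])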

  20. Thermal sensors to control polymer forming. Challenge and solutions

    NASA Astrophysics Data System (ADS)

    Lemeunier, F.; Boyard, N.; Sarda, A.; Plot, C.; Lefèvre, N.; Petit, I.; Colomines, G.; Allanic, N.; Bailleul, J. L.

    2017-10-01

    Many thermal sensors have been used for years to better understand and control material forming processes, especially polymer processing. Due to technical constraints (high pressure, sealing, sensor dimensions, etc.), the thermal measurement is often performed in the tool or close to its surface; thus, it gives only partial and disturbed information. Reliable information about the heat flux exchanged between the tool and the material during the process would be very helpful for improving process control and for fostering the development of new materials. In this work, we present several sensors developed in our laboratories to study the molding steps in forming processes. Analysis of the thermal measurements obtained (temperature, heat flux) shows the sensitivity threshold that thermal sensors require in order to detect the rate of thermal reaction on-line. Based on these data, we present new, patented sensor designs.

  1. Replicable effects of primes on human behavior.

    PubMed

    Payne, B Keith; Brown-Iannuzzi, Jazmin L; Loersch, Chris

    2016-10-01

    [Correction Notice: An Erratum for this article was reported online in Journal of Experimental Psychology: General on Oct 31 2016 (see record 2016-52334-001). ] The effect of primes (i.e., incidental cues) on human behavior has become controversial. Early studies reported counterintuitive findings, suggesting that primes can shape a wide range of human behaviors. Recently, several studies failed to replicate some earlier priming results, raising doubts about the reliability of those effects. We present a within-subjects procedure for priming behavior, in which participants decide whether to bet or pass on each trial of a gambling game. We report 6 replications (N = 988) showing that primes consistently affected gambling decisions when the decision was uncertain. Decisions were influenced by primes presented visibly, with a warning to ignore the primes (Experiments 1 through 3) and with subliminally presented masked primes (Experiment 4). Using a process dissociation procedure, we found evidence that primes influenced responses through both automatic and controlled processes (Experiments 5 and 6). Results provide evidence that primes can reliably affect behavior, under at least some conditions, without intention. The findings suggest that the psychological question of whether behavior priming effects are real should be separated from methodological issues affecting how easily particular experimental designs will replicate. PsycINFO Database Record (c) 2016 APA, all rights reserved
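
    The process dissociation procedure mentioned above is classically computed from two trial types: on congruent trials, P(prime-consistent response) = C + A(1 - C), while on incongruent trials it is A(1 - C). The sketch below applies these textbook equations to illustrative proportions; the paper's adaptation to the gambling task may differ in detail.

        # Hedged sketch of the classic process dissociation computation,
        # separating controlled (C) and automatic (A) contributions.
        def process_dissociation(p_congruent, p_incongruent):
            c = p_congruent - p_incongruent          # controlled estimate
            a = p_incongruent / (1.0 - c)            # automatic estimate
            return c, a

        # illustrative response proportions, not the study's data
        c, a = process_dissociation(p_congruent=0.80, p_incongruent=0.30)
        print(f"controlled C = {c:.2f}, automatic A = {a:.2f}")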

  2. Methods for Calculating Frequency of Maintenance of Complex Information Security System Based on Dynamics of Its Reliability

    NASA Astrophysics Data System (ADS)

    Varlataya, S. K.; Evdokimov, V. E.; Urzov, A. Y.

    2017-11-01

    This article describes a process for calculating the reliability of a complex information security system (CISS), using the example of a technospheric security management model, as well as the ability to determine the frequency of its maintenance from the system reliability parameter, which makes it possible to assess man-made risks and to forecast natural and man-made emergencies. The relevance of this article is explained by the fact that CISS reliability is closely related to information security (IS) risks. Since reliability (or resiliency) is a probabilistic characteristic of the system showing the possibility of its failure (and, as a consequence, the emergence of threats to the protected information assets), it is seen as a component of the overall IS risk in the system. As is known, there is a certain acceptable level of IS risk assigned by experts for a particular information system; when reliability is a risk-forming factor, maintaining an acceptable risk level requires routine analysis of the condition of the CISS and its elements, together with their timely maintenance. The article presents a reliability parameter calculation for a CISS with a mixed type of element connection, and a formula for the dynamics of such a system's reliability is written. The chart of CISS reliability over time is an S-shaped curve that can be divided into three periods: an almost invariable high level of reliability, a uniform reduction in reliability, and an almost invariable low level of reliability. Given a minimum acceptable level of reliability, the graph (or formula) can be used to determine the period of time during which the system will meet requirements; ideally, this period should not be longer than the first period of the curve. Thus, the proposed method of calculating the CISS maintenance frequency helps to solve the voluminous and critical task of information asset risk management.
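
    A minimal sketch of the maintenance-frequency idea: represent the S-shaped reliability decline with a reversed logistic curve and find where it crosses the minimum acceptable level. The logistic form and all parameter values here are illustrative assumptions, not the article's formula.

        # Hedged sketch: S-shaped reliability decline modeled as a reversed
        # logistic curve; the maintenance interval is where R(t) crosses the
        # minimum acceptable level. All parameters are illustrative.
        import math

        def reliability(t, r_hi=0.98, r_lo=0.40, t_mid=18.0, steepness=0.5):
            """High plateau, uniform decline around t_mid, low plateau (months)."""
            return r_lo + (r_hi - r_lo) / (1.0 + math.exp(steepness * (t - t_mid)))

        def maintenance_interval(r_min, t_max=120.0, step=0.01):
            t = 0.0
            while reliability(t) >= r_min and t < t_max:
                t += step
            return t

        print(f"service before t = {maintenance_interval(0.95):.1f} months")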

  3. Present status and trends of image fusion

    NASA Astrophysics Data System (ADS)

    Xiang, Dachao; Fu, Sheng; Cai, Yiheng

    2009-10-01

    Image fusion extracts information from multiple images that is more accurate and reliable than information obtained from a single image. Since different images contain different aspects of information about the measured scene, comprehensive information can be obtained by integrating them. Image fusion is a main branch of the application of data fusion technology, and at present it is widely used in computer vision, remote sensing, robot vision, medical image processing, and the military field. This paper presents the contents and research methods of image fusion and its current status in China and abroad, and analyzes its development trends.

  4. Evaluation of the psychometric properties of the main meal quality index when applied in the UK population.

    PubMed

    Gorgulho, B M; Pot, G K; Marchioni, D M

    2017-05-01

    The aim of this study was to evaluate the validity and reliability of the Main Meal Quality Index when applied to the UK population. The indicator was developed to assess meal quality in different populations and is composed of 10 components: fruit; vegetables (excluding potatoes); ratio of animal protein to total protein; fiber; carbohydrate; total fat; saturated fat; processed meat; sugary beverages and desserts; and energy density, resulting in a score range of 0-100 points. The performance of the indicator was measured using strategies for assessing content validity, construct validity, discriminant validity, and reliability, including principal component analysis, linear regression models, and Cronbach's alpha. The indicator showed good reliability, and the Main Meal Quality Index was found to be valid for use as an instrument to evaluate, monitor, and compare the quality of meals consumed by adults in the United Kingdom.
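
    The abstract does not reproduce the index's scoring standards, so the sketch below only illustrates the general mechanics of a 10-component, 0-100 index: "adequacy" components earn points up to a target amount, "moderation" components lose points as intake rises toward a ceiling. All cut-offs are invented for illustration; the published index defines its own standards.

        # Hedged sketch of the general mechanics of a 10-component, 0-100
        # meal-quality index. Cut-offs are invented, not the published ones.
        def component_score(amount, target, moderation=False, max_pts=10.0):
            ratio = min(amount / target, 1.0)
            return max_pts * (1.0 - ratio if moderation else ratio)

        meal = {"fruit_g": 80, "vegetables_g": 90, "processed_meat_g": 20}
        score = (component_score(meal["fruit_g"], target=160)
                 + component_score(meal["vegetables_g"], target=120)
                 + component_score(meal["processed_meat_g"], target=50, moderation=True))
        print(f"partial score (3 of 10 components) = {score:.1f} / 30")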

  5. Validation of new psychosocial factors questionnaires: a Colombian national study.

    PubMed

    Villalobos, Gloria H; Vargas, Angélica M; Rondón, Martin A; Felknor, Sarah A

    2013-01-01

    The study of workers' health problems possibly associated with stressful conditions requires valid and reliable tools for monitoring risk factors. The present study validates two questionnaires for assessing psychosocial risk factors for stress-related illnesses in a sample of Colombian workers. The validation process was based on a representative sample survey of 2,360 Colombian employees aged 18-70 years. The worker response rate was 90%, and 46% of the respondents were women. Internal consistency was calculated, construct validity was tested with factor analysis, and concurrent validity was tested with Spearman correlations. The questionnaires demonstrated adequate reliability (0.88-0.95). Factor analysis confirmed the dimensions proposed in the measurement model. Concurrent validity resulted in significant correlations with stress and health symptoms. The "Work and Non-work Psychosocial Factors Questionnaires" were found to be valid and reliable for the assessment of workers' psychosocial factors, and they provide information for research and intervention. Copyright © 2012 Wiley Periodicals, Inc.

  6. Integrated Logistics Support approach: concept for the new big projects: E-ELT, SKA, CTA

    NASA Astrophysics Data System (ADS)

    Marchiori, G.; Rampini, F.; Formentin, F.

    2014-08-01

    Integrated Logistics Support (ILS) is a process that supports strategies and optimizes activities for correct project management and system engineering development. From the design and engineering of complex technical systems to erection on site, acceptance, and after-sales service, EIE GROUP covers all aspects of the ILS process, which includes: a costing process centered on life-cycle cost and Level of Repair Analyses; an engineering process that influences the design by means of reliability, modularization, etc.; a technical publishing process based on international specifications; and an ordering administration process for supply support. Through ILS, EIE GROUP plans and directs the identification and development of logistics support and system requirements for its products, with the goal of creating systems that last longer and require less support, thereby reducing costs and increasing return on investment. ILS therefore addresses these aspects of supportability not only during acquisition but also throughout the operational life cycle of the system. The impact of ILS is often measured in terms of metrics such as reliability, availability, maintainability and testability (RAMT) and system safety (RAMS). Examples of the criteria and approach adopted by EIE GROUP during the design, manufacturing, and testing of the ALMA European antennas and during the design phase of the E-ELT telescope and dome are presented.

  7. Automatic detection of blurred images in UAV image sets

    NASA Astrophysics Data System (ADS)

    Sieberth, Till; Wackrow, Rene; Chandler, Jim H.

    2016-12-01

    Unmanned aerial vehicles (UAVs) have become an interesting and active research topic in photogrammetry. Current research is based on images acquired by a UAV, which have high ground resolution and good spectral and radiometric resolution, owing to the low flight altitudes combined with a high-resolution camera. UAV image flights are also cost-effective and have become attractive for many applications, including change detection in small-scale areas. One of the main problems preventing full automation of UAV image processing is the degrading effect of blur caused by camera movement during image acquisition. This can be caused by the normal flight movement of the UAV as well as by strong winds, turbulence, or sudden operator inputs. Such blur disturbs the visual analysis and interpretation of the data, causes errors, and can degrade the accuracy of automatic photogrammetric processing algorithms. The detection and removal of these images is currently done manually, which is both time consuming and prone to error, particularly for large image sets. To increase the quality of data processing, an automated process is necessary, one that is both reliable and quick. This paper describes the development of an automatic filtering process based on quantifying the blur in an image. Images with known blur are processed digitally to determine a quantifiable measure of image blur. The algorithm is required to process UAV images quickly and reliably, relieving the operator from detecting blurred images manually. The newly developed method detects blur caused by linear camera displacement and is modeled on how humans detect blur: humans judge whether an image is blurred most effectively by comparing it with other images. The developed algorithm simulates this procedure by creating a comparison image through image processing; creating the comparable image internally makes the method independent of additional images. However, the calculated blur value, named SIEDS (saturation image edge difference standard-deviation), does not on its own provide an absolute number for judging whether an image is blurred. To achieve a reliable judgement of image sharpness, the SIEDS value has to be compared with other SIEDS values from the same dataset. The speed and reliability of the method were tested using a range of different UAV datasets, two of which are presented in this paper to demonstrate the effectiveness of the algorithm. The algorithm proves to be fast, and the returned values agree with visual assessment, making the algorithm applicable to UAV datasets. Additionally, a close-range dataset was processed to determine whether the method is also useful for close-range applications. The results show that the method is also reliable for close-range images, which significantly extends the field of application of the algorithm.
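
    The comparison idea described above can be sketched by creating the comparison image internally through re-blurring and then measuring how much the edge content changes: a sharp image loses many edges when re-blurred, while an already-blurred image changes little. This mirrors the spirit of SIEDS but is not the authors' exact computation.

        # Hedged sketch of relative blur detection by internal comparison:
        # re-blur the image, compare edge responses, and flag images whose
        # blur value is low relative to the rest of the set.
        import numpy as np
        from scipy import ndimage

        def blur_value(gray):
            reblurred = ndimage.gaussian_filter(gray, sigma=2.0)
            edges = ndimage.sobel(gray.astype(float))
            edges_reblurred = ndimage.sobel(reblurred.astype(float))
            return np.std(edges - edges_reblurred)   # high = sharp, low = blurred

        def flag_blurred(images, factor=0.5):
            # judge sharpness relative to the image set, as the paper does,
            # rather than against an absolute threshold
            values = np.array([blur_value(im) for im in images])
            return values < factor * np.median(values)

        rng = np.random.default_rng(1)
        sharp = rng.random((64, 64))
        blurred = ndimage.gaussian_filter(sharp, sigma=3.0)
        print(flag_blurred([sharp, sharp, blurred]))   # expected: [False False True]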

  8. Integrating Safety and Mission Assurance in Design

    NASA Technical Reports Server (NTRS)

    Cianciola, Chris; Crane, Kenneth

    2008-01-01

    This presentation describes how the Ares Projects are learning from the successes and failures of previous launch systems in order to maximize safety and reliability while maintaining fiscal responsibility. The Ares Projects are integrating Safety and Mission Assurance into design activities and embracing independent assessments by Quality experts in thorough reviews of designs and processes. Incorporating Lean thinking into the design process, Ares is also streamlining existing processes and future manufacturing flows, which will yield savings during production. Understanding the value of early involvement of Quality experts, the Ares Projects are leading launch vehicle development into the 21st century.

  9. Numerical simulation of complex part manufactured by selective laser melting process

    NASA Astrophysics Data System (ADS)

    Van Belle, Laurent

    2017-10-01

    The Selective Laser Melting (SLM) process, which belongs to the family of Additive Manufacturing (AM) technologies, enables parts to be built layer by layer from metallic powder and a CAD model. The physical phenomena that occur in the process raise the same issues as conventional welding: thermal gradients generate significant residual stresses and distortions in the parts, and large, complex parts accentuate these undesirable effects. It is therefore essential for manufacturers to develop a better understanding of the process and to ensure reliable production of parts with high added value. This paper focuses on the simulation of manufacturing a turbine by the SLM process in order to calculate residual stresses and distortions. Numerical results are presented.

  10. Recommendations for elaboration, transcultural adaptation and validation process of tests in Speech, Hearing and Language Pathology.

    PubMed

    Pernambuco, Leandro; Espelt, Albert; Magalhães, Hipólito Virgílio; Lima, Kenio Costa de

    2017-06-08

    To present a guide with recommendations for the translation, cross-cultural adaptation, elaboration, and validation of tests in Speech, Hearing and Language Pathology. The recommendations were based on international guidelines with a focus on the elaboration, translation, cross-cultural adaptation, and validation of tests. The recommendations were grouped into two charts: one with procedures for translation and transcultural adaptation, and the other for obtaining evidence of validity, reliability, and measures of accuracy of the tests. A guide with norms for organizing and systematizing the process of elaboration, translation, cross-cultural adaptation, and validation of tests in Speech, Hearing and Language Pathology was created.

  11. The Topological Weighted Centroid (TWC): A topological approach to the time-space structure of epidemic and pseudo-epidemic processes

    NASA Astrophysics Data System (ADS)

    Buscema, Massimo; Massini, Giulia; Sacco, Pier Luigi

    2018-02-01

    This paper offers the first systematic presentation of the topological approach to the analysis of epidemic and pseudo-epidemic spatial processes. We introduce the basic concepts and proofs, and test the approach on a diverse collection of case studies of historically documented epidemic and pseudo-epidemic processes. The approach is found to consistently provide reliable estimates of the structural features of epidemic processes, and to provide useful analytical insights and interpretations of fragmentary pseudo-epidemic processes. Although this analysis must be regarded as preliminary, we find that the approach's basic tenets are strongly corroborated by this first test and warrant future research in this vein.

  12. Testing for PV Reliability (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurtz, S.; Bansal, S.

    The DOE SUNSHOT workshop is seeking input from the community about PV reliability and how the DOE might address gaps in understanding. This presentation describes the types of testing that are needed for PV reliability and introduces a discussion to identify gaps in our understanding of PV reliability testing.

  13. A signal processing framework for simultaneous detection of multiple environmental contaminants

    NASA Astrophysics Data System (ADS)

    Chakraborty, Subhadeep; Manahan, Michael P.; Mench, Matthew M.

    2013-11-01

    The possibility of large-scale attacks using chemical warfare agents (CWAs) has exposed the critical need for fundamental research enabling the reliable, unambiguous, and early detection of trace CWAs and toxic industrial chemicals. This paper presents a unique approach to the identification and classification of multiple simultaneously present environmental contaminants, in which an electrochemical (EC) sensor is perturbed with an oscillating potential to extract statistically rich information from the current response. The dynamic response, being a function of the degree and mechanism of contamination, is then processed with a symbolic dynamic filter to extract representative patterns, which are classified using a trained neural network. The approach presented in this paper promises to extend the sensing power and sensitivity of EC sensors by augmenting and complementing the sensor technology with state-of-the-art embedded real-time signal processing capabilities.
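
    A minimal sketch of symbolic dynamic filtering on a sensor's current response: partition the signal range into symbols, estimate the symbol-transition probabilities, and flatten them into a feature vector for a downstream classifier. The partitioning scheme and any classifier are simplified illustrations, not the paper's exact pipeline.

        # Hedged sketch of symbolic dynamic filtering: equal-probability
        # partitioning into symbols, then a transition-probability feature.
        import numpy as np

        def sdf_features(signal, n_symbols=4):
            # symbol boundaries at the signal's quantiles
            edges = np.quantile(signal, np.linspace(0, 1, n_symbols + 1)[1:-1])
            symbols = np.digitize(signal, edges)
            counts = np.zeros((n_symbols, n_symbols))
            for a, b in zip(symbols[:-1], symbols[1:]):
                counts[a, b] += 1
            probs = counts / counts.sum(axis=1, keepdims=True).clip(min=1)
            return probs.ravel()                  # feature vector for the classifier

        t = np.linspace(0, 1, 2000)
        clean = np.sin(2 * np.pi * 5 * t)                         # baseline response
        contaminated = clean + 0.3 * np.sin(2 * np.pi * 11 * t)   # altered dynamics
        print(np.round(sdf_features(clean) - sdf_features(contaminated), 3))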

  14. Consideration of the use of origami-style solar panels for use on a terrestrial/orbital wireless power generation and transmission spacecraft

    NASA Astrophysics Data System (ADS)

    Holland, Alexander F.; Pearson, Jens; Lysford, Wilson; Straub, Jeremy

    2016-05-01

    This paper presents work on the development of Origami-style solar panels and their adaptation and efficacy for use in Earth orbit. It focuses on the enabling capability of this technology for the generation and transmission of power. The proposed approach provides increased collection (solar panel) and transmission (microwave radiation) surface area, as compared to other systems of similar mass and volume. An overview of the system is presented, including its pre-deployment configuration, the deployment process, and its final configuration. Its utility for wireless power transmission missions is then considered. An economic discussion is then presented to consider how the mass and volume efficiencies provided enable the system to approach the target willingness-to-pay values presented and considered in prior work. A key consideration regarding the use of wireless power transfer in Earth orbit is the reliability of the technology, which has several aspects. The system must reliably supply power to its customers (or they would have to have local generation capabilities sufficient for their needs, defeating the benefit of this system). It must also be shown to reliably supply power only to designated locations (and not inadvertently or otherwise beam power at other locations). The effect of the system design (including the Origami structure and deployment/rigidity mechanisms) is considered to assess whether the use of this technology may impair either of these key mission/safety-critical goals. This analysis is presented, and a discussion of mitigation techniques for several prospective problems is given, before concluding with a discussion of future work.

  15. Crystalline-silicon reliability lessons for thin-film modules

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1985-01-01

    The reliability of crystalline silicon modules has been brought to a high level with lifetimes approaching 20 years, and excellent industry credibility and user satisfaction. The transition from crystalline modules to thin film modules is comparable to the transition from discrete transistors to integrated circuits. New cell materials and monolithic structures will require new device processing techniques, but the package function and design will evolve to a lesser extent. Although there will be new encapsulants optimized to take advantage of the mechanical flexibility and low temperature processing features of thin films, the reliability and life degradation stresses and mechanisms will remain mostly unchanged. Key reliability technologies in common between crystalline and thin film modules include hot spot heating, galvanic and electrochemical corrosion, hail impact stresses, glass breakage, mechanical fatigue, photothermal degradation of encapsulants, operating temperature, moisture sorption, circuit design strategies, product safety issues, and the process required to achieve a reliable product from a laboratory prototype.

  16. Applying the High Reliability Health Care Maturity Model to Assess Hospital Performance: A VA Case Study.

    PubMed

    Sullivan, Jennifer L; Rivard, Peter E; Shin, Marlena H; Rosen, Amy K

    2016-09-01

    The lack of a tool for categorizing and differentiating hospitals according to their high reliability organization (HRO)-related characteristics has hindered progress toward implementing and sustaining evidence-based HRO practices. Hospitals would benefit both from an understanding of the organizational characteristics that support HRO practices and from knowledge about the steps necessary to achieve HRO status to reduce the risk of harm and improve outcomes. The High Reliability Health Care Maturity (HRHCM) model, a model for health care organizations' achievement of high reliability with zero patient harm, incorporates three major domains critical for promoting HROs: Leadership, Safety Culture, and Robust Process Improvement®. A study was conducted to examine the content validity of the HRHCM model and evaluate whether it can differentiate hospitals' maturity levels for each of the model's components. Staff perceptions of patient safety at six US Department of Veterans Affairs (VA) hospitals were examined to determine whether all 14 HRHCM components were present and to characterize each hospital's level of organizational maturity. Twelve of the 14 components from the HRHCM model were detected; two additional characteristics emerged that are present in the HRO literature but not represented in the model: teamwork culture and system-focused tools for learning and improvement. Each hospital's level of organizational maturity could be characterized for 9 of the 14 components. The findings suggest the HRHCM model has good content validity and that there is differentiation between hospitals on model components. Additional research is needed to understand how these components can be used to build the infrastructure necessary for reaching high reliability.

  17. Reliability Constrained Priority Load Shedding for Aerospace Power System Automation

    NASA Technical Reports Server (NTRS)

    Momoh, James A.; Zhu, Jizhong; Kaddah, Sahar S.; Dolce, James L. (Technical Monitor)

    2000-01-01

    The need for improved load shedding on board the space station is one of the goals of aerospace power system automation. To accelerate the optimum load-shedding functions, several constraints must be considered, including the congestion margin determined by weighted probability contingency, component/system reliability indices, and generation rescheduling. The impact of different faults and the indices for computing reliability were defined before optimization. The optimal load schedule is determined based on the priority, value, and location of loads. An optimization strategy capable of handling discrete decision making, such as Everett's method, is proposed. We extended Everett's method to handle the expected congestion margin and reliability index as constraints. To make it effective for the real-time load dispatch process, a rule-based scheme is embedded in the optimization method; it assists in selecting which feeder load to shed, along with the location, value, and priority of the load, and includes a cost-benefit analysis of the load profile. The scheme is tested on a benchmark NASA system consisting of generators, loads, and a network.
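
    For illustration, a minimal sketch of an Everett-style (generalized Lagrange multiplier) selection for priority load shedding; the load values, powers, and bisection scheme are invented for the example and are not the paper's actual implementation:

      import numpy as np

      def everett_keep_set(value, power, capacity, iters=60):
          """For a multiplier lam, keep load i iff value[i] > lam * power[i];
          bisect on lam until the kept loads fit the available capacity
          (Everett's generalized Lagrange multiplier selection)."""
          lo, hi = 0.0, (value / power).max() + 1.0  # hi keeps nothing -> feasible
          for _ in range(iters):
              lam = 0.5 * (lo + hi)
              if power[value > lam * power].sum() > capacity:
                  lo = lam        # too much load kept -> raise the price
              else:
                  hi = lam        # feasible -> try keeping more load
          return value > hi * power

      value = np.array([9.0, 7.0, 5.0, 3.0, 1.0])   # priority-weighted load values
      power = np.array([2.0, 3.0, 1.0, 2.0, 1.0])   # power drawn by each feeder load
      keep = everett_keep_set(value, power, capacity=5.0)
      print("feeders to shed:", np.where(~keep)[0])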

  18. Integration of tools for the Design and Assessment of High-Performance, Highly Reliable Computing Systems (DAHPHRS), phase 1

    NASA Technical Reports Server (NTRS)

    Scheper, C.; Baker, R.; Frank, G.; Yalamanchili, S.; Gray, G.

    1992-01-01

    Systems for Space Defense Initiative (SDI) space applications typically require both high performance and very high reliability. These requirements present the systems engineer evaluating such systems with the extremely difficult problem of conducting performance and reliability trade-offs over large design spaces. A controlled development process supported by appropriate automated tools must be used to assure that the system will meet design objectives. This report describes an investigation of methods, tools, and techniques necessary to support performance and reliability modeling for SDI systems development. Models of the JPL Hypercubes, the Encore Multimax, and the C.S. Draper Lab Fault-Tolerant Parallel Processor (FTPP) parallel-computing architectures using candidate SDI weapons-to-target assignment algorithms as workloads were built and analyzed as a means of identifying the necessary system models, how the models interact, and what experiments and analyses should be performed. As a result of this effort, weaknesses in the existing methods and tools were revealed and capabilities that will be required for both individual tools and an integrated toolset were identified.

  19. Hybrid time-variant reliability estimation for active control structures under aleatory and epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Xiong, Chuang; Wang, Xiaojun; Li, Yunlong; Xu, Menghui

    2018-04-01

    Considering that multi-source uncertainties, arising from inherent variability as well as the external environment, are unavoidable and severely affect controller performance, dynamic safety assessment with high confidence is of great significance for scientists and engineers. In view of this, uncertainty quantification analysis and time-variant reliability estimation for closed-loop control problems are conducted in this study under a mixture of random, interval, and convex uncertainties. By combining the state-space transformation and the natural set expansion, the boundary laws of controlled response histories are first confirmed with specific implementation of random items. For nonlinear cases, the collocation set methodology and the fourth-order Runge-Kutta algorithm are introduced as well. Inspired by the first-passage model in random process theory as well as by static probabilistic reliability ideas, a new definition of the hybrid time-variant reliability measurement is provided for vibration control systems, and the related solution details are expounded. Two engineering examples are presented to demonstrate the validity and applicability of the methodology developed.
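
    A minimal Monte Carlo sketch of the first-passage notion used above: reliability is estimated as the fraction of sampled closed-loop trajectories that never cross a safe displacement bound over the horizon. The single-degree-of-freedom plant, forward Euler integration (instead of the paper's fourth-order Runge-Kutta), and all numbers are illustrative assumptions:

      import numpy as np

      rng = np.random.default_rng(0)

      def time_variant_reliability(n_samples=2000, t_end=5.0, dt=0.02, bound=1.5):
          """Fraction of sampled trajectories whose displacement never
          exceeds the bound over [0, t_end] (first-passage formulation)."""
          survived = 0
          for _ in range(n_samples):
              k = rng.uniform(0.8, 1.2)          # interval-type stiffness uncertainty
              c = abs(rng.normal(0.3, 0.05))     # random-type damping uncertainty
              x, v = 1.0, rng.normal(0.0, 1.0)   # uncertain initial state
              ok = True
              for _ in range(int(t_end / dt)):
                  x, v = x + v * dt, v + (-k * x - c * v) * dt  # forward Euler step
                  if abs(x) > bound:
                      ok = False
                      break
              survived += ok
          return survived / n_samples

      print(f"estimated time-variant reliability: {time_variant_reliability():.3f}")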

  20. 76 FR 16263 - Revision to Electric Reliability Organization Definition of Bulk Electric System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-23

    ...'s Reliability Standards Development Process, to revise its definition of the term ``bulk electric... definition of ``bulk electric system'' through the NERC Standards Development Process to address the... undertake the process of revising the bulk electric system definition to address the Commission's concerns...

  1. The Effect of Epoxy Molding Compound Floor Life to Reliability Performance and mold ability for QFN Package

    NASA Astrophysics Data System (ADS)

    Peanpunga, Udom; Ugsornrat, Kessararat; Thorlor, Panakamol; Sumithpibul, Chalermsak

    2017-09-01

    This research studied the effect of epoxy molding compound (EMC) floor life on the reliability performance of integrated circuit (IC) packages. Molding is the process that protects the die of an IC package from mechanical damage and chemical reaction with the external environment by shaping EMC around it. In normal manufacturing, the EMC is stored frozen at 5 °C and then left at room temperature for an aging time, or floor life, before the molding process. The EMC floor life affects its properties and the reliability performance of the IC package. Therefore, this work varied the floor life of EMC before the molding process to analyze EMC properties such as spiral flow length, gelation time, and viscosity, and to check their effect on reliability performance. The EMC floor life was varied from 0 hours to 60 hours in steps of 12 hours, and wire sweep, incomplete EMC fill, and delamination inside the packages were observed for 3x3, 5x5, and 8x8 mm2 QFN packages. The evaluation clearly showed the effect of EMC floor life on IC packaging reliability: floor life from 0 hours to 48 hours raised no concerns for EMC properties, moldability, or reliability in the molding process of 3x3, 5x5, and 8x8 mm2 QFN packaging manufacturing.

  2. Information Management of a Structured Admissions Interview Process in a Medical College with an Apple II System

    PubMed Central

    O'Reilly, Robert; Fedorko, Steve; Nicholson, Nigel

    1983-01-01

    This paper describes a structured interview process for medical school admissions supported by an Apple II computer system which provides feedback to interviewers and the College admissions committee. Presented are the rationale for the system, the preliminary results of analysis of some of the interview data, and a brief description of the computer program and output. The present data show that the structured interview yields very high interrater reliability coefficients, is acceptable to the medical school faculty, and results in quantitative data useful in the admission process. The system continues in development at this time; a second year of data will shortly be available, and further refinements are being made to the computer program to enhance its utilization and exportability.

  3. Predicting Software Suitability Using a Bayesian Belief Network

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.

    2005-01-01

    The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
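
    As a toy illustration of the cause-effect structure described above, the sketch below computes the marginal probability of a suitable end product from the three driving factors by full enumeration; the priors and the conditional probability table are invented numbers, not the study's learned model:

      import itertools

      # Prior beliefs for the three driving factors (True = favorable).
      priors = {"skill": 0.7, "maturity": 0.6, "low_complexity": 0.5}

      def p_suitable(skill, maturity, low_complexity):
          """Illustrative CPT: each favorable parent raises the chance
          that the delivered software is suitable."""
          return 0.2 + 0.3 * skill + 0.25 * maturity + 0.2 * low_complexity

      # Marginal P(suitable) by enumerating all parent states.
      total = 0.0
      for states in itertools.product([0, 1], repeat=3):
          weight = 1.0
          for name, s in zip(priors, states):
              weight *= priors[name] if s else 1 - priors[name]
          total += weight * p_suitable(*states)
      print(f"P(product suitable) = {total:.3f}")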

  4. High voltage requirements and issues for the 1990's. [for spacecraft power supplies

    NASA Technical Reports Server (NTRS)

    Dunbar, W. G.; Faymon, K. A.

    1984-01-01

    The development of high-power, high-voltage space systems will require advances in power generation and processing. The systems must be reliable, adaptable, and durable for space mission success. The issues that must be resolved in order to produce a high-power system are the weight and volume reduction of components and modules and the creation of a reliable high-repetition pulse power processor. Capacitor energy density must be doubled relative to present capacity, and packaging volume must be reduced by a factor of 10 to 20. The packaging must also protect the system from interaction with the natural space environment and from the induced environment produced by interactions between spacecraft systems and that environment.

  5. Reliability Growth of Tactical Coolers at CMC Electronics Cincinnati: 1/5-Watt Cooler Test Report

    NASA Astrophysics Data System (ADS)

    Kuo, D. T.; Lody, T. D.

    2004-06-01

    CMC Electronics Cincinnati (CMC) is conducting a reliability growth program to extend the life of tactical Stirling-cycle cryocoolers. The continuous product improvement process consists of testing production coolers to failure, determining the root cause, incorporating improvements, and verifying them. The most recent life data for the 1/5-Watt Cooler (Model B512B) are presented with a discussion of leading root causes and potential improvements. The mean time to failure (MTTF) of the coolers was found to be 22,552 hours, with the root causes of failure attributed to the accumulation of methane and carbon dioxide in the cooler and wear of the piston.

  6. Integrating High-Reliability Principles to Transform Access and Throughput by Creating a Centralized Operations Center.

    PubMed

    Davenport, Paul B; Carter, Kimberly F; Echternach, Jeffrey M; Tuck, Christopher R

    2018-02-01

    High-reliability organizations (HROs) demonstrate unique and consistent characteristics, including operational sensitivity and control, situational awareness, hyperacute use of technology and data, and actionable process transformation. System complexity and reliance on information-based processes challenge healthcare organizations to replicate HRO processes. This article describes a healthcare organization's 3-year journey to achieve key HRO features to deliver high-quality, patient-centric care via an operations center powered by the principles of high-reliability data and software to impact patient throughput and flow.

  7. A general software reliability process simulation technique

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1991-01-01

    The structure and rationale of the generalized software reliability process, together with the design and implementation of a computer program that simulates this process are described. Given assumed parameters of a particular project, the users of this program are able to generate simulated status timelines of work products, numbers of injected anomalies, and the progress of testing, fault isolation, repair, validation, and retest. Such timelines are useful in comparison with actual timeline data, for validating the project input parameters, and for providing data for researchers in reliability prediction modeling.

  8. Improvement of the reliability of laser beam microwelding as interconnection technique

    NASA Astrophysics Data System (ADS)

    Glasmacher, Mathias; Pucher, Hans-Joerg; Geiger, Manfred

    1996-04-01

    The requirements of current trends in joining for modern electronics production can be met with the technique of laser beam microwelding, the topic of this paper, in which component leads are welded directly to the conducting tracks of the circuit board. The technique is not limited to electronics, since fine mechanical parts can be joined with the same equipment. Advantages such as high-temperature strength, reduced manufacturing time, and simplified material separation at the end of the life cycle are noted. Furthermore, the drawbacks of laser beam microwelding as a joining technique competing with soldering are discussed. The reasons for the unstable process behavior of different welding scenarios can be understood by taking the changes of some process parameters into account. Since process reliability can be improved by proper process design as well as by closed-loop control, results of finite element calculations of the temperature field and an experimental setup for determining the melting point are presented. Future work is outlined to broaden the applicability of this joining technique and to develop an on-line control for high-performance welding of locally restricted structures.

  9. Commercial Parts Technology Qualification Processes

    NASA Technical Reports Server (NTRS)

    Cooper, Mark S.

    2013-01-01

    Many high-reliability systems, including space systems, use selected commercial parts (including Plastic Encapsulated Microelectronics, or PEMs) for unique functionality, small size, low weight, high mechanical shock resistance, and other factors. Predominantly this usage is subjected to certain 100% tests (typically called screens) and certain destructive tests usually (but not always) performed on the flight lot (typically called qualification tests). Frequently used approaches include those documented in EEE-INST-002 and JPL DocID62212 (which are sometimes modified by the particular aerospace systems manufacturer). In this study, approaches from these documents and several space systems manufacturers are compared to approaches from a launch systems manufacturer (SpaceX), an implantable medical electronics manufacturer (Medtronic), and a high-reliability transport system process (automotive systems). In the conclusions section, these processes are outlined for all of these cases, presented in tabular form, and some simple comparisons are made. In this introduction section, the PEM technology qualification process is described as documented in EEE-INST-002 (written by the Goddard Space Flight Center, GSFC), along with the somewhat modified approach employed at the Jet Propulsion Laboratory (JPL). Approaches used at several major NASA contractors are also described.

  10. The Dutch Review Process for Evaluating the Quality of Psychological Tests: History, Procedure, and Results

    ERIC Educational Resources Information Center

    Evers, Arne; Sijtsma, Klaas; Lucassen, Wouter; Meijer, Rob R.

    2010-01-01

    This article describes the 2009 revision of the Dutch Rating System for Test Quality and presents the results of test ratings from almost 30 years. The rating system evaluates the quality of a test on seven criteria: theoretical basis, quality of the testing materials, comprehensiveness of the manual, norms, reliability, construct validity, and…

  11. [Design of blood-pressure parameter auto-acquisition circuit].

    PubMed

    Chen, Y P; Zhang, D L; Bai, H W; Zhang, D A

    2000-02-01

    This paper presents the design and realization of a blood-pressure parameter auto-acquisition circuit. The auto-acquisition of blood-pressure parameters, controlled by an 89C2051 single-chip microcomputer, is accomplished by collecting and processing the driving signal of the LCD. The circuit, which has been successfully applied in the home unit of a telemedicine system, is simple and reliable.

  12. Factorial Validity of a Questionnaire to Evaluate University Students' Initial Perception of Learning Evaluation

    ERIC Educational Resources Information Center

    Doménech-Betoret, Fernando; Fortea-Bagán, Miguel Angel

    2015-01-01

    Introduction: Education research has clearly verified that a student's perception of the system to evaluate the subject matter will play a fundamental role in his/her implication (deep approach vs. surface approach) in the teaching/learning process of the subject matter. The present work aims to examine the factorial validity and reliability of a…

  13. Autonomy in Action: Linking the Act of Looking to Memory Formation in Infancy via Dynamic Neural Fields

    ERIC Educational Resources Information Center

    Perone, Sammy; Spencer, John P.

    2013-01-01

    Looking is a fundamental exploratory behavior by which infants acquire knowledge about the world. In theories of infant habituation, however, looking as an exploratory behavior has been deemphasized relative to the reliable nature with which looking indexes active cognitive processing. We present a new theory that connects looking to the dynamics…

  14. ADAPSO Computer Services Industry Directory of Members, 1972-1973.

    ERIC Educational Resources Information Center

    Association of Data Processing Service Organizations, New York, NY.

    The 1972-73 directory of the Association of Data Processing Service Organizations was designed to provide a list of those members who subscribe to the Code of Ethical Standards and can be expected to provide reliable and efficient services to users in the community. The Code is presented, and then full member firms are listed for states in the…

  15. Intrinsic Motivation and Engagement as "Active Ingredients" in Garden-Based Education: Examining Models and Measures Derived from Self-Determination Theory

    ERIC Educational Resources Information Center

    Skinner, Ellen A.; Chi, Una

    2012-01-01

    Building on self-determination theory, this study presents a model of intrinsic motivation and engagement as "active ingredients" in garden-based education. The model was used to create reliable and valid measures of key constructs, and to guide the empirical exploration of motivational processes in garden-based learning. Teacher- and…

  16. Evaluation of English Achievement Test: A Comparison between High and Low Achievers amongst Selected Elementary School Students of Pakistan

    ERIC Educational Resources Information Center

    Haider, Zubair; Latif, Farah; Akhtar, Samina; Mushtaq, Maria

    2012-01-01

    Validity, reliability and item analysis are critical to the process of evaluating the quality of an educational measurement. The present study evaluates the quality of an assessment constructed to measure elementary school students' achievement in English. In this study, the survey model of descriptive research was used as the research method.…

  17. Exponential order statistic models of software reliability growth

    NASA Technical Reports Server (NTRS)

    Miller, D. R.

    1985-01-01

    Failure times of a software reliability growth process are modeled as order statistics of independent, non-identically distributed exponential random variables. The Jelinski-Moranda, Goel-Okumoto, Littlewood, Musa-Okumoto Logarithmic, and Power Law models are all special cases of Exponential Order Statistic Models, but there are many additional examples as well. Various characterizations, properties, and examples of this class of models are developed and presented.
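
    A minimal simulation of this model class, assuming geometrically decaying per-fault detection rates (the rates and fault count are illustrative, not drawn from the paper):

      import numpy as np

      rng = np.random.default_rng(1)

      def eos_failure_times(rates, n_runs=10000):
          """Failure times modeled as order statistics of independent
          exponential variables, one per latent fault, with possibly
          different rates (the Exponential Order Statistic class)."""
          samples = rng.exponential(1.0 / np.asarray(rates),
                                    size=(n_runs, len(rates)))
          return np.sort(samples, axis=1)     # i-th column = i-th failure time

      # Ten faults whose detection rates decay geometrically, so later
      # faults are increasingly hard to expose during testing.
      rates = 1.0 * 0.8 ** np.arange(10)
      times = eos_failure_times(rates)
      print("mean first / last failure time:",
            times[:, 0].mean().round(2), times[:, -1].mean().round(2))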

  18. Scrutinizing A Survey-Based Measure of Science and Mathematics Teacher Knowledge: Relationship to Observations of Teaching Practice

    ERIC Educational Resources Information Center

    Talbot, Robert M., III

    2017-01-01

    There is a clear need for valid and reliable instrumentation that measures teacher knowledge. However, the process of investigating and making a case for instrument validity is not a simple undertaking; rather, it is a complex endeavor. This paper presents the empirical case of one aspect of such an instrument validation effort. The particular…

  19. Approaching Terahertz Range with 3-color Broadband Coherent Raman Micro Spectroscopy

    NASA Astrophysics Data System (ADS)

    Ujj, Laszlo; Olson, Trevor; Amos, James

    The presentation reports recent progress on reliable signal recording and processing using 3-color broadband coherent Raman scattering (3C-BCRS). Signals are generated either from nanoparticle structures on surfaces or from bulk samples, in transmission and in epi-detected mode. Spectra are recorded with narrowband (532 nm) radiation and broadband radiation produced by a newly optimized optical parametric oscillator using the signal or idler beams. Vibrational and librational bands are measured over the 0.15-15 THz spectral range from solution and crystalline samples. A volume Bragg filter approach is introduced for recording 3C-BCRS spectra for the first time, and the technical limitations and advantages of narrowband filtering relative to the notch-filter technique are clarified. The signal is proportional to the spectral autocorrelation of the broadband radiation; therefore, the present scheme gives a better signal-to-noise ratio than traditional multiplex CRS methods. This makes the automation of model-independent signal processing more reliable for extracting vibrational information, which is crucial in coherent Raman microscopy. Financial support from the Hal Marcus College of Science and Engineering is greatly appreciated.

  20. Tungsten coating for improved wear resistance and reliability of microelectromechanical devices

    DOEpatents

    Fleming, James G.; Mani, Seethambal S.; Sniegowski, Jeffry J.; Blewer, Robert S.

    2001-01-01

    A process is disclosed whereby a 5-50-nanometer-thick conformal tungsten coating can be formed over exposed semiconductor surfaces (e.g. silicon, germanium or silicon carbide) within a microelectromechanical (MEM) device for improved wear resistance and reliability. The tungsten coating is formed after cleaning the semiconductor surfaces to remove any organic material and oxide film from the surface. A final in situ cleaning step is performed by heating a substrate containing the MEM device to a temperature in the range of 200-600 °C in the presence of gaseous nitrogen trifluoride (NF₃). The tungsten coating can then be formed by a chemical reaction between the semiconductor surfaces and tungsten hexafluoride (WF₆) at an elevated temperature, preferably about 450 °C. The tungsten deposition process is self-limiting and covers all exposed semiconductor surfaces including surfaces in close contact. The present invention can be applied to many different types of MEM devices including microrelays, micromirrors and microengines. Additionally, the tungsten wear-resistant coating of the present invention can be used to enhance the hardness, wear resistance, electrical conductivity, optical reflectivity and chemical inertness of one or more semiconductor surfaces within a MEM device.

  1. Reliability Analysis of a Glacier Lake Warning System Using a Bayesian Net

    NASA Astrophysics Data System (ADS)

    Sturny, Rouven A.; Bründl, Michael

    2013-04-01

    Besides structural mitigation measures such as avalanche defense structures, dams, and galleries, warning and alarm systems have become important measures for dealing with Alpine natural hazards. Integrating them into risk mitigation strategies and comparing their effectiveness with structural measures requires quantification of the reliability of these systems. However, little is known about how the reliability of warning systems can be quantified and which methods are suitable for comparing their contribution to risk reduction with that of structural mitigation measures. We present a reliability analysis of a warning system located in Grindelwald, Switzerland. The warning system was built to warn and protect residents and tourists from glacier outburst floods caused by a rapid drain of the glacier lake. We set up a Bayesian Net (BN) that allowed for a qualitative and quantitative reliability analysis. The Conditional Probability Tables (CPT) of the BN were determined from the manufacturer's reliability data for each component of the system, as well as by assigning weights to specific BN nodes accounting for information flows and decision-making processes of the local safety service. The presented results focus on the two alerting units 'visual acoustic signal' (VAS) and 'alerting of the intervention entities' (AIE). For the summer of 2009, the reliability was determined to be 94 % for the VAS and 83 % for the AIE. The probability of occurrence of a major event was calculated as 0.55 % per day, resulting in an overall reliability of 99.967 % for the VAS and 99.906 % for the AIE. We concluded that a failure of the VAS alerting unit would require a simultaneous failure of the four probes located in the lake and the gorge. Similarly, we deduced that the AIE would fail if there were a simultaneous connectivity loss of the mobile and fixed networks in Grindelwald, a loss of Internet access, or a failure of the regional operations centre. However, the probability of a common failure of these components was assumed to be low. Overall, it can be stated that, owing to numerous redundancies, the investigated warning system is highly reliable and its influence on risk reduction is very high. Comparable studies are needed in the future to put these results in context and to gain more experience of how the reliability of warning systems can be determined in practice.
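
    The overall figures quoted above follow from combining the daily event probability with each alerting unit's failure probability; a short worked check (the combination formula is the straightforward one implied by the reported numbers):

      # A warning system fails on a given day only if a major event
      # occurs AND the alerting unit fails:
      #   R_overall = 1 - p_event * (1 - R_unit)
      p_event = 0.0055                     # major event probability per day
      for name, r_unit in [("VAS", 0.94), ("AIE", 0.83)]:
          overall = 1 - p_event * (1 - r_unit)
          print(f"{name}: overall reliability = {overall:.6f}")
      # -> 0.999670 (99.967 %) for the VAS and 0.999065 (99.906 %) for the AIE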

  2. Evaluation methodologies for an advanced information processing system

    NASA Technical Reports Server (NTRS)

    Schabowsky, R. S., Jr.; Gai, E.; Walker, B. K.; Lala, J. H.; Motyka, P.

    1984-01-01

    The system concept and requirements for an Advanced Information Processing System (AIPS) are briefly described, but the emphasis of this paper is on the evaluation methodologies being developed and utilized in the AIPS program. The evaluation tasks include hardware reliability, maintainability and availability, software reliability, performance, and performability. Hardware RMA and software reliability are addressed with Markov modeling techniques. The performance analysis for AIPS is based on queueing theory. Performability is a measure of merit which combines system reliability and performance measures. The probability laws of the performance measures are obtained from the Markov reliability models. Scalar functions of this law such as the mean and variance provide measures of merit in the AIPS performability evaluations.
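
    As a minimal example of the Markov modeling mentioned above, the sketch below evaluates a two-state (operational/failed) reliability model via the matrix exponential of its generator; the failure rate is an illustrative number, not an AIPS figure:

      import numpy as np
      from scipy.linalg import expm

      lam = 1e-4                           # failures per hour (illustrative)
      Q = np.array([[-lam, lam],           # state 0: operational
                    [0.0,  0.0]])          # state 1: failed (absorbing)

      for t in (100.0, 1000.0, 10000.0):
          p = np.array([1.0, 0.0]) @ expm(Q * t)  # state distribution at time t
          print(f"R({t:>7.0f} h) = {p[0]:.4f}")    # probability still operational
      # Matches the closed form R(t) = exp(-lam * t) for this simple chain.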

  3. The Soft Rock Socketed Monopile with Creep Effects - A Reliability Approach based on Wavelet Neural Networks

    NASA Astrophysics Data System (ADS)

    Kozubal, Janusz; Tomanovic, Zvonko; Zivaljevic, Slobodan

    2016-09-01

    In the present study, a numerical model of a pile embedded in marl, described by a time-dependent constitutive model based on laboratory tests, is proposed. The solutions complement the state of knowledge of the monopile loaded by a horizontal force at its head, with respect to the random variability of that force's value as a function of time. The investigated reliability problem is defined by the union of failure events, each defined by excessive maximal horizontal displacement of the pile head in one of the load periods. Abaqus has been used for modeling the presented task, with a two-layered viscoplastic model for marl. The mechanical parameters for both parts of the model, plastic and rheological, were calibrated based on the creep laboratory test results. An important aspect of the problem is the reliability analysis of a monopile in a complex environment under random sequences of loads, which helps in understanding the role of viscosity in the behavior of constructions on rock foundations. Due to the lack of analytical solutions, the computations were done by the response surface method in conjunction with a wavelet neural network, a method recommended for processing time sequences and describing nonlinear phenomena.

  4. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses

    PubMed Central

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-01-01

    Due to the upcoming deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval presents significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. PMID:24462600

  5. Study on fast discrimination of varieties of yogurt using Vis/NIR-spectroscopy

    NASA Astrophysics Data System (ADS)

    He, Yong; Feng, Shuijuan; Deng, Xunfei; Li, Xiaoli

    2006-09-01

    A new approach for discriminating varieties of yogurt by means of Vis/NIR-spectroscopy is presented in this paper. Firstly, through principal component analysis (PCA) of the spectroscopy curves of 5 typical kinds of yogurt, clustering of the yogurt varieties was performed. The analysis showed that the cumulative reliability of PC1 and PC2 (the first two principal components) was more than 98.956%, and the cumulative reliability of PC1 through PC7 (the first seven principal components) was 99.97%. Secondly, a discrimination model based on an Artificial Neural Network (ANN-BP) was set up. The first seven principal components of the samples were used as ANN-BP inputs, and the yogurt variety was used as the output, forming a three-layer ANN-BP model. In this model, each variety of yogurt comprised 27 samples, for a total of 135, and the remaining 25 samples were used as the prediction set. The results showed that the recognition rate for the five yogurt varieties was 100%, indicating that the model is reliable and practicable. A new approach for the rapid and lossless discrimination of yogurt varieties was thus put forward.
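
    A compact sklearn sketch of the PC1..PC7-to-backpropagation-network pipeline described above, run on synthetic stand-in "spectra"; the data, network size, and train/test split are assumptions for illustration:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neural_network import MLPClassifier
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(0)

      # Synthetic stand-in for Vis/NIR spectra: 5 varieties x 32 samples,
      # each a 200-point curve with a variety-specific offset plus noise.
      X = np.vstack([v + 0.3 * rng.standard_normal((32, 200))
                     for v in np.linspace(0.0, 2.0, 5)[:, None] * np.ones(200)])
      y = np.repeat(np.arange(5), 32)

      # Seven principal components feeding a three-layer BP network.
      model = make_pipeline(
          PCA(n_components=7),
          MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0))
      model.fit(X[::2], y[::2])                  # every other sample for training
      print("prediction accuracy:", model.score(X[1::2], y[1::2]))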

  6. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    PubMed

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    Due to the upcoming deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval presents significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. Vision-based system for the control and measurement of wastewater flow rate in sewer systems.

    PubMed

    Nguyen, L S; Schaeli, B; Sage, D; Kayal, S; Jeanbourquin, D; Barry, D A; Rossi, L

    2009-01-01

    Combined sewer overflows and stormwater discharges represent an important source of contamination to the environment. However, the harsh environment inside sewers and particular hydraulic conditions during rain events reduce the reliability of traditional flow measurement probes. In the following, we present and evaluate an in situ system for the monitoring of water flow in sewers based on video images. This paper focuses on the measurement of the water level based on image-processing techniques. The developed image-based water level algorithms identify the wall/water interface from sewer images and measure its position with respect to real world coordinates. A web-based user interface and a 3-tier system architecture enable the remote configuration of the cameras and the image-processing algorithms. Images acquired and processed by our system were found to reliably measure water levels and thereby to provide crucial information leading to better understand particular hydraulic behaviors. In terms of robustness and accuracy, the water level algorithm provided equal or better results compared to traditional water level probes in three different in situ configurations.
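
    The row-finding core of such an image-based level measurement can be illustrated in a few lines of numpy; the synthetic frame, the intensity-step heuristic, and the linear calibration below are simplifying assumptions, not the deployed algorithm:

      import numpy as np

      def water_level_row(gray):
          """Locate the wall/water interface as the row where the mean
          image intensity changes most abruptly between adjacent rows."""
          profile = gray.mean(axis=1)           # mean intensity per row
          return int(np.argmax(np.abs(np.diff(profile)))) + 1

      def row_to_level(row, row_ref, metres_per_px):
          """Map an image row to a water level with a linear calibration
          (reference row and scale would come from surveyed markers)."""
          return (row_ref - row) * metres_per_px

      # Synthetic frame: bright sewer wall above, darker water below row 120.
      frame = np.full((240, 320), 200.0)
      frame[120:, :] = 40.0
      frame += np.random.default_rng(3).normal(0, 5, frame.shape)

      row = water_level_row(frame)
      print(f"interface at row {row}, "
            f"level = {row_to_level(row, 200, 0.005):.2f} m")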

  8. JCQ scale reliability and responsiveness to changes in manufacturing process.

    PubMed

    d'Errico, Angelo; Punnett, Laura; Gold, Judith E; Gore, Rebecca

    2008-02-01

    The job content questionnaire (JCQ) was administered to automobile manufacturing workers in two interviews, 5 years apart. Between the two interviews, the company introduced substantial changes in production technology in some production areas. The aims were: (1) to describe the impact of these changes on self-reported psychosocial exposures, and (2) to examine test-retest reliability of the JCQ scales, taking into account changes in job assignment and, for a subset of workers, physical ergonomic exposures as assessed through field observations. The study population included 790 subjects at the first and 519 at the second interview, of whom 387 were present in both. Differences in demand and control scores between interviews were analyzed by Wilcoxon matched-pairs signed-rank test. Test-retest reliability of these scales was evaluated by the intraclass correlation coefficient (ICC) and the Spearman's rho coefficient. The introduction of more automated technology produced an overall increase in job control but did not decrease psychological demand. The reliability of the control scale was low overall but increased to an acceptable level among workers who had not changed job. The demand scale had high reliability only among workers whose physical ergonomic exposures were similar on both survey occasions. These results show that 5-year test-retest reliability of self-reported psychosocial exposures is adequate among workers whose job assignment and ergonomic exposures have remained stable over time.

  9. Asymmetric programming: a highly reliable metadata allocation strategy for MLC NAND flash memory-based sensor systems.

    PubMed

    Huang, Min; Liu, Zhaoqing; Qiao, Liyan

    2014-10-10

    While the NAND flash memory is widely used as the storage medium in modern sensor systems, the aggressive shrinking of process geometry and an increase in the number of bits stored in each memory cell will inevitably degrade the reliability of NAND flash memory. In particular, it's critical to enhance metadata reliability, which occupies only a small portion of the storage space, but maintains the critical information of the file system and the address translations of the storage system. Metadata damage will cause the system to crash or a large amount of data to be lost. This paper presents Asymmetric Programming, a highly reliable metadata allocation strategy for MLC NAND flash memory storage systems. Our technique exploits for the first time the property of the multi-page architecture of MLC NAND flash memory to improve the reliability of metadata. The basic idea is to keep metadata in most significant bit (MSB) pages which are more reliable than least significant bit (LSB) pages. Thus, we can achieve relatively low bit error rates for metadata. Based on this idea, we propose two strategies to optimize address mapping and garbage collection. We have implemented Asymmetric Programming on a real hardware platform. The experimental results show that Asymmetric Programming can achieve a reduction in the number of page errors of up to 99.05% with the baseline error correction scheme.
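
    The core allocation idea can be sketched in a few lines: metadata writes are steered to the more reliable MSB pages, while user data consumes LSB pages first. The alternating page layout below is a hypothetical simplification; real MLC parts pair pages in device-specific patterns:

      def is_msb_page(page_no):
          """Assume pages alternate LSB/MSB within a block (even = LSB)."""
          return page_no % 2 == 1

      class AsymmetricAllocator:
          def __init__(self, pages_per_block=128):
              self.free = list(range(pages_per_block))

          def allocate(self, is_metadata):
              """Give metadata the next free MSB page when one exists;
              user data takes LSB pages first, preserving the MSB pool."""
              pool = [p for p in self.free if is_msb_page(p) == is_metadata]
              page = (pool or self.free)[0]
              self.free.remove(page)
              return page

      alloc = AsymmetricAllocator()
      print("metadata  ->", [alloc.allocate(True) for _ in range(3)])   # MSB pages
      print("user data ->", [alloc.allocate(False) for _ in range(3)])  # LSB pages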

  10. Translation, cultural adaption, and test-retest reliability of Chinese versions of the Edinburgh Handedness Inventory and Waterloo Footedness Questionnaire.

    PubMed

    Yang, Nan; Waddington, Gordon; Adams, Roger; Han, Jia

    2018-05-01

    Quantitative assessments of handedness and footedness are often required in studies of human cognition and behaviour, yet no reliable Chinese versions of commonly used handedness and footedness questionnaires are available. Accordingly, the objective of the present study was to translate the Edinburgh Handedness Inventory (EHI) and the Waterloo Footedness Questionnaire-Revised (WFQ-R) into Mandarin Chinese and to evaluate the reliability and validity of these translated versions in healthy Chinese people. In the first stage of the study, Chinese versions of the EHI and WFQ-R were produced from a process of translation, back translation and examination, with necessary cultural adaptations. The second stage involved determining the reliability and validity of the translated EHI and WFQ-R for the Chinese population. One hundred and ten Chinese participants were tested online, and the results showed that the Cronbach's alpha coefficient of internal consistency was 0.877 for the translated EHI and 0.855 for the translated WFQ-R. Another 170 Chinese participants were tested and re-tested after a 30-day interval. The intra-class correlation coefficients showed high reliability, 0.898 for the translated EHI and 0.869 for the translated WFQ-R. This preliminary validation study found the translated versions to be reliable and valid tools for assessing handedness and footedness in this population.
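
    The internal-consistency figures above are Cronbach's alpha; for reference, a minimal numpy implementation of the standard formula on invented toy scores:

      import numpy as np

      def cronbach_alpha(items):
          """Cronbach's alpha for an (n_respondents, n_items) matrix:
          k/(k-1) * (1 - sum of item variances / variance of total score)."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_var / total_var)

      scores = np.array([[3, 4, 3, 4],     # toy data: 6 respondents,
                         [2, 2, 3, 2],     # 4 questionnaire items
                         [4, 5, 4, 5],
                         [1, 2, 1, 1],
                         [3, 3, 4, 3],
                         [5, 4, 5, 5]])
      print(f"alpha = {cronbach_alpha(scores):.3f}")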

  11. Asymmetric Programming: A Highly Reliable Metadata Allocation Strategy for MLC NAND Flash Memory-Based Sensor Systems

    PubMed Central

    Huang, Min; Liu, Zhaoqing; Qiao, Liyan

    2014-01-01

    While the NAND flash memory is widely used as the storage medium in modern sensor systems, the aggressive shrinking of process geometry and an increase in the number of bits stored in each memory cell will inevitably degrade the reliability of NAND flash memory. In particular, it's critical to enhance metadata reliability, which occupies only a small portion of the storage space, but maintains the critical information of the file system and the address translations of the storage system. Metadata damage will cause the system to crash or a large amount of data to be lost. This paper presents Asymmetric Programming, a highly reliable metadata allocation strategy for MLC NAND flash memory storage systems. Our technique exploits for the first time the property of the multi-page architecture of MLC NAND flash memory to improve the reliability of metadata. The basic idea is to keep metadata in most significant bit (MSB) pages which are more reliable than least significant bit (LSB) pages. Thus, we can achieve relatively low bit error rates for metadata. Based on this idea, we propose two strategies to optimize address mapping and garbage collection. We have implemented Asymmetric Programming on a real hardware platform. The experimental results show that Asymmetric Programming can achieve a reduction in the number of page errors of up to 99.05% with the baseline error correction scheme. PMID:25310473

  12. Reliability of fully automated versus visually controlled pre- and post-processing of resting-state EEG.

    PubMed

    Hatz, F; Hardmeier, M; Bousleiman, H; Rüegg, S; Schindler, C; Fuhr, P

    2015-02-01

    To compare the reliability of a newly developed Matlab® toolbox for the fully automated pre- and post-processing of resting-state EEG (automated analysis, AA) with the reliability of analysis involving visually controlled pre- and post-processing (VA). 34 healthy volunteers (age: median 38.2 (20-49), 82% female) had three consecutive 256-channel resting-state EEG recordings at one-year intervals. Results of the frequency analysis of AA and VA were compared with Pearson correlation coefficients, and reliability over time was assessed with intraclass correlation coefficients (ICC). The mean correlation coefficient between AA and VA was 0.94±0.07; the mean ICC was 0.83±0.05 for AA and 0.84±0.07 for VA. AA and VA yield very similar results for spectral EEG analysis and are equally reliable. AA is less time-consuming, completely standardized, and independent of raters and their training. Automated processing of EEG facilitates workflow in quantitative EEG analysis. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
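
    The ICC used for such test-retest designs can be computed from a two-way ANOVA decomposition; below is a numpy sketch of ICC(3,1) (Shrout & Fleiss consistency form) on simulated yearly measurements. The abstract does not state which ICC variant was used, so the choice here is an assumption:

      import numpy as np

      def icc_3_1(x):
          """ICC(3,1) = (BMS - EMS) / (BMS + (k-1) * EMS) for an
          (n_subjects, k_sessions) matrix (two-way mixed, consistency)."""
          x = np.asarray(x, dtype=float)
          n, k = x.shape
          grand = x.mean()
          bms = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
          resid = (x - x.mean(axis=1, keepdims=True)
                     - x.mean(axis=0, keepdims=True) + grand)
          ems = (resid ** 2).sum() / ((n - 1) * (k - 1))
          return (bms - ems) / (bms + (k - 1) * ems)

      rng = np.random.default_rng(2)
      trait = rng.normal(10, 2, size=(30, 1))              # stable per-subject level
      sessions = trait + rng.normal(0, 0.8, size=(30, 3))  # three yearly recordings
      print(f"ICC(3,1) = {icc_3_1(sessions):.2f}")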

  13. The effect of the stability threshold on time to stabilization and its reliability following a single leg drop jump landing.

    PubMed

    Fransz, Duncan P; Huurnink, Arnold; de Boode, Vosse A; Kingma, Idsart; van Dieën, Jaap H

    2016-02-08

    We aimed to provide insight into how threshold selection affects time to stabilization (TTS) and its reliability, to support the selection of methods to determine TTS. Eighty-two elite youth soccer players performed six single leg drop jump landings. The TTS was calculated based on four processed signals: the raw ground reaction force (GRF) signal (RAW), a moving root mean square window (RMS), a sequential average (SA), or an unbounded third-order polynomial fit (TOP). For each trial and processing method a wide range of thresholds was applied. Per threshold, the reliability of the TTS was assessed through intra-class correlation coefficients (ICC) for the vertical (V), anteroposterior (AP) and mediolateral (ML) direction of force. Low thresholds resulted in a sharp increase of TTS values and in the percentage of trials in which TTS exceeded trial duration. The TTS and ICC were essentially similar for RAW and RMS in all directions; ICC's were mostly 'insufficient' (<0.4) to 'fair' (0.4-0.6) for the entire range of thresholds. The SA signals resulted in the most stable ICC values across thresholds, being 'substantial' (>0.8) for V, and 'moderate' (0.6-0.8) for AP and ML. The ICC's for TOP were 'substantial' for V, 'moderate' for AP, and 'fair' for ML. The present findings did not reveal an optimal threshold to assess TTS in elite youth soccer players following a single leg drop jump landing. Irrespective of threshold selection, the SA and TOP methods yielded sufficiently reliable TTS values, while for RAW and RMS the reliability was insufficient to differentiate between players. Copyright © 2016 Elsevier Ltd. All rights reserved.
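
    A minimal numpy sketch of the threshold-based TTS computation for the RAW, RMS, and SA signal-processing variants named above; the toy ground-reaction trace, window length, and threshold are illustrative:

      import numpy as np

      def tts(signal, fs, threshold):
          """Time to stabilization: first time after which the processed
          signal stays below the threshold for the rest of the trial."""
          above = np.where(np.abs(signal) > threshold)[0]
          return (above[-1] + 1) / fs if above.size else 0.0

      def moving_rms(signal, fs, window=0.25):
          """RMS over a sliding window (window length in seconds)."""
          w = int(window * fs)
          return np.sqrt(np.convolve(signal ** 2, np.ones(w) / w, mode="same"))

      def sequential_average(signal):
          """Cumulative mean of the signal up to each sample."""
          return np.cumsum(signal) / np.arange(1, signal.size + 1)

      fs = 1000
      t = np.arange(0, 3, 1 / fs)
      grf = 0.8 * np.exp(-2 * t) * np.sin(2 * np.pi * 8 * t)  # force about body weight

      for name, sig in [("RAW", grf),
                        ("RMS", moving_rms(grf, fs)),
                        ("SA",  sequential_average(grf))]:
          print(f"TTS ({name}) = {tts(sig, fs, threshold=0.05):.2f} s")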

  14. Translated Versions of Voice Handicap Index (VHI)-30 across Languages: A Systematic Review

    PubMed Central

    SEIFPANAHI, Sadegh; JALAIE, Shohreh; NIKOO, Mohammad Reza; SOBHANI-RAD, Davood

    2015-01-01

    Background: In this systematic review, the aim is to investigate different VHI-30 versions across languages regarding their validity, reliability, and translation process. Methods: Articles were extracted systematically from some of the prime databases, including Cochrane, Google Scholar, MEDLINE (via the PubMed gateway), ScienceDirect, and Web of Science, and from their reference lists, using the keyword Voice Handicap Index with limitations only on title and time of publication (from 1997 to 2014). The other limitations (e.g., excluding non-English papers and other versions of the VHI) were applied manually after studying the papers. To appraise the methodology of the papers, three authors used the 12-item diagnostic test checklist from the "Critical Appraisal Skills Programme" (CASP) site. After all screenings were applied, the papers that met the study eligibility criteria, namely reporting translation, validity, and reliability processes, were included in this review. Results: Twelve non-duplicate articles, covering different languages, remained. All of them reported the validity, reliability, and translation method, which are presented in detail in this review. Conclusion: The preferred method for translation in the gathered papers was Brislin's classic back-translation model (1970); although the procedure was rarely performed completely, it was more prominent than other translation procedures. High test-retest reliability, high internal consistency, and moderate construct validity across languages with regard to all 3 VHI-30 domains confirm the applicability of translated VHI-30 versions across languages. PMID:26056664

  15. Efficient Process Migration for Parallel Processing on Non-Dedicated Networks of Workstations

    NASA Technical Reports Server (NTRS)

    Chanchio, Kasidit; Sun, Xian-He

    1996-01-01

    This paper presents the design and preliminary implementation of MpPVM, a software system that supports process migration for PVM application programs in a non-dedicated heterogeneous computing environment. New concepts of migration point as well as migration point analysis and necessary data analysis are introduced. In MpPVM, process migrations occur only at previously inserted migration points. Migration point analysis determines appropriate locations to insert migration points, whereas necessary data analysis provides a minimum set of variables to be transferred at each migration point. A new methodology to perform reliable point-to-point data communications in a migration environment is also discussed. Finally, a preliminary implementation of MpPVM and its experimental results are presented, showing the correctness and promising performance of our process migration mechanism in a scalable non-dedicated heterogeneous computing environment. While MpPVM is developed on top of PVM, the process migration methodology introduced in this study is general and can be applied to any distributed software environment.

  16. Integration of Electrodeposited Ni-Fe in MEMS with Low-Temperature Deposition and Etch Processes

    PubMed Central

    Schiavone, Giuseppe; Murray, Jeremy; Perry, Richard; Mount, Andrew R.; Desmulliez, Marc P. Y.; Walton, Anthony J.

    2017-01-01

    This article presents a set of low-temperature deposition and etching processes for the integration of electrochemically deposited Ni-Fe alloys in complex magnetic microelectromechanical systems, as Ni-Fe is known to suffer from detrimental stress development when subjected to excessive thermal loads. A selective etch process is reported which enables the copper seed layer used for electrodeposition to be removed while preserving the integrity of Ni-Fe. In addition, a low temperature deposition and surface micromachining process is presented in which silicon dioxide and silicon nitride are used, respectively, as sacrificial material and structural dielectric. The sacrificial layer can be patterned and removed by wet buffered oxide etch or vapour HF etching. The reported methods limit the thermal budget and minimise the stress development in Ni-Fe. This combination of techniques represents an advance towards the reliable integration of Ni-Fe components in complex surface micromachined magnetic MEMS. PMID:28772683

  17. Chemical Sensing for Buried Landmines - Fundamental Processes Influencing Trace Chemical Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PHELAN, JAMES M.

    2002-05-01

    Mine detection dogs have a demonstrated capability to locate hidden objects by trace chemical detection. Because of this capability, demining activities frequently employ mine detection dogs to locate individual buried landmines or for area reduction. The conditions appropriate for use of mine detection dogs are only beginning to emerge through diligent research that combines dog selection/training, the environmental conditions that impact landmine signature chemical vapors, and vapor sensing performance capability and reliability. This report seeks to address the fundamental soil-chemical interactions, driven by local weather history, that influence the availability of chemical for trace chemical detection. The processes evaluated include: landmine chemical emissions to the soil, chemical distribution in soils, chemical degradation in soils, and weather and chemical transport in soils. Simulation modeling is presented as a method to evaluate the complex interdependencies among these various processes and to establish conditions appropriate for trace chemical detection. Results from chemical analyses on soil samples obtained adjacent to landmines are presented and demonstrate the ultra-trace nature of these residues. Lastly, initial measurements of the vapor sensing performance of mine detection dogs demonstrate the extreme sensitivity of dogs in sensing landmine signature chemicals; however, reliability at these ultra-trace vapor concentrations still needs to be determined. Through this compilation, additional work is suggested that will fill in data gaps to improve the utility of trace chemical detection.

  18. Detection of Unknown LEO Satellite Using Radar Measurements

    NASA Astrophysics Data System (ADS)

    Kamensky, S.; Samotokhin, A.; Khutorovsky, Z.; Alfriend, T.

    While processing radar information for satellite catalog maintenance, some measurements do not correlate with cataloged and tracked satellites. These non-correlated measurements are used in the detection (primary orbit determination) of new, not yet cataloged, satellites. A satellite is considered newly detected when it is missing from the catalog and the primary orbit determination based on the non-correlated measurements provides accuracy sufficient for reliable correlation of future measurements. We call this the detection condition. One non-correlated measurement in real conditions does not have sufficient accuracy and thus does not satisfy the detection condition. Two measurements separated by a revolution or more normally provide an orbit determination with accuracy sufficient for the selection of other measurements. However, it is not always possible to say with high probability (close to 1) that two measurements belong to one satellite. Three measurements from different revolutions that fit one orbit have a significantly higher chance of belonging to one satellite. Thus the suggested detection (primary orbit determination) algorithm looks for three non-correlated measurements in different revolutions for which an orbit inscribing them can be determined. The detection procedure based on the search for triplets is rather laborious, so only relatively high efficiency can justify its practical implementation. This work presents a detailed description of the suggested detection procedure based on the search for triplets of non-correlated radar measurements. Break-ups of tracked satellites provide the most difficult conditions for the operation of the detection algorithm and reveal its characteristics explicitly. The time efficiency of the algorithm and the reliability of the detected orbits are of maximum interest. Within this work we determine these characteristics using simulation of break-ups, with subsequent acquisition of measurements generated by the fragments. In particular, using simulation we can not only evaluate the characteristics of the algorithm but also adjust its parameters for certain conditions: the orbit of the fragmented satellite, the features of the break-up, the capabilities of detection radars, etc. We describe the algorithm that simulates radar measurements produced by the fragments of the parent satellite. This algorithm accounts for the basic factors affecting the time efficiency and reliability of detection. The catalog maintenance algorithm includes two major components, detection and tracking, which are two processes permanently interacting with each other, as is the case in the processing of real radar data. The simulation must take this into account, since one cannot obtain reliable characteristics of the detection procedure by simulating detection alone. Thus we simulated both processes in their interaction. The work presents the results of simulation for the simplest case of a break-up in a near-circular orbit with insignificant atmospheric drag. The simulations show rather high efficiency. We demonstrate as well that the time efficiency and the reliability of the determined orbits depend essentially on the density of the observed break-up fragments.

  19. Reliable Design Versus Trust

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests the FPGA's internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following questions are addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges of verifying a reliable design versus a trusted design?

  20. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 2: HARP tutorial

    NASA Technical Reports Server (NTRS)

    Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Bavuso, Salvatore J.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. The Hybrid Automated Reliability Predictor (HARP) tutorial provides insight into HARP modeling techniques and the interactive textual prompting input language via a step-by-step explanation and demonstration of HARP's fault occurrence/repair model and the fault/error handling models. Example applications are worked in their entirety and the HARP tabular output data are presented for each. Simple models are presented at first with each succeeding example demonstrating greater modeling power and complexity. This document is not intended to present the theoretical and mathematical basis for HARP.

  1. Brazilian version of the Protocole Montréal d'Evaluation de la Communication (Protocole MEC): normative and reliability data.

    PubMed

    Fonseca, Rochele Paz; Joanette, Yves; Côté, Hélène; Ska, Bernadette; Giroux, Francine; Fachel, Jandyra Maria Guimarães; Damasceno Ferreira, Gabriela; Parente, Maria Alice de Mattos Pimenta

    2008-11-01

    A lack of standardized instruments to evaluate communication disorders related to the right hemisphere had been identified. A new evaluation tool was developed: the Protocole Montréal d'Evaluation de la Communication (Protocole MEC), adapted to Brazilian Portuguese as the Bateria Montreal de Avaliação da Comunicação (Bateria MAC; Montreal Evaluation of Communication Battery). The purpose was to present normative data stratified by age and educational level, and to verify the reliability parameters of the MEC Battery. A total of 300 individuals, aged 19 to 75 years and with 2 to 35 years of formal education, participated in this study. They were divided equally into six normative groups, according to three age categories (young adults, intermediate age, and seniors) and two educational levels (low and high). Two procedures were used to check reliability: Cronbach's alpha and inter-rater reliability. Norms were established at the 10th percentile, with an alert point per task for each normative group. Cronbach's alpha was, in general, between .70 and .90, and the average rate of agreement between evaluators varied from .62 to .94. Standards for age and education were established, and the reliability of the instrument was verified. The psychometric legitimization of the MEC Battery will contribute to the diagnostic process for communication disorders.
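
    For readers unfamiliar with the two reliability checks used, the following sketch computes Cronbach's alpha from a subjects-by-tasks score matrix and a simple per-item agreement rate between two evaluators; the data are synthetic and purely illustrative:

    ```python
    # Sketch of the two reliability procedures named above: Cronbach's alpha
    # for internal consistency and a simple evaluator agreement rate.
    import numpy as np

    def cronbach_alpha(scores):
        """scores: (n_subjects, n_items) array of task scores."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1).sum()
        total_var = scores.sum(axis=1).var(ddof=1)
        return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

    def rater_agreement(r1, r2):
        """Proportion of identical scores given by two evaluators."""
        return np.mean(np.asarray(r1) == np.asarray(r2))

    rng = np.random.default_rng(0)
    data = rng.integers(0, 5, size=(30, 10))      # 30 subjects, 10 tasks
    rater2 = np.clip(data[:, 0] + rng.integers(-1, 2, size=30), 0, 4)
    # Random data gives alpha near zero; real batteries target .70-.90.
    print(round(cronbach_alpha(data), 2), rater_agreement(data[:, 0], rater2))
    ```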

  2. Compliance of LC50 and NOEC data with Benford's Law: an indication of reliability?

    PubMed

    de Vries, Pepijn; Murk, Albertinka J

    2013-12-01

    Reliability of research data is essential, especially when potentially far-reaching conclusions will be based on them. This also holds, among other fields, for ecotoxicological data used in risk assessment. Currently, several approaches are available to classify the reliability of ecotoxicological data. The process of classification, such as assigning a Klimisch score, is time-consuming and focuses on the application of standardised protocols and the documentation of the study; the presence of irregularities and the integrity of the performed work, however, are not addressed. The present study shows that Benford's Law, which describes the occurrence of first digits on a logarithmic scale, can be applied to ecotoxicity test data to identify irregularities. This approach is already successfully applied in accounting. Benford's Law can be used as a reliability indicator, in addition to existing reliability classifications. The law can be used to efficiently trace irregularities in large data sets of interpolated (no) effect concentrations such as LC50s (possibly the result of data manipulation), without having to evaluate the source of each individual record. Application of the law to systems in which large amounts of toxicity data are registered (e.g., the European Commission Regulation concerning the Registration, Evaluation, Authorisation and Restriction of Chemicals) can therefore be valuable.
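
    As a sketch of how such a first-digit check can work (not the study's exact procedure), the following compares observed leading-digit counts of a hypothetical LC50 set against the Benford proportions log10(1 + 1/d) using a chi-square statistic:

    ```python
    # First-digit Benford check for a set of (synthetic) LC50 values.
    import numpy as np
    from scipy.stats import chisquare

    def first_digits(values):
        v = np.abs(np.asarray(values, dtype=float))
        v = v[v > 0]
        # Scale each value into [1, 10) so the integer part is the first digit.
        return (v / 10.0 ** np.floor(np.log10(v))).astype(int)

    def benford_test(values):
        digits = first_digits(values)
        observed = np.bincount(digits, minlength=10)[1:10]
        expected = np.log10(1 + 1 / np.arange(1, 10)) * observed.sum()
        return chisquare(observed, expected)   # low p-value flags irregularity

    lc50 = np.random.default_rng(1).lognormal(mean=0.0, sigma=2.0, size=500)
    print(benford_test(lc50))
    ```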

  3. Situation awareness acquired from monitoring process plants - the Process Overview concept and measure.

    PubMed

    Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd

    2016-07-01

    We introduce Process Overview, a characterisation of the situation awareness (SA) derived from monitoring process plants, based on observational studies of process control work in the literature. The characterisation is applied to develop a query-based measure called the Process Overview Measure. The goal of the measure is to improve the coupling between situation and awareness according to process plant properties and operator cognitive work. A companion article presents the empirical evaluation of the Process Overview Measure in a realistic process control setting. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA based on data collected by process experts. Practitioner Summary: The Process Overview Measure is a query-based measure for assessing operator situation awareness from monitoring process plants in representative settings.

  4. Can Reliability of Multiple Component Measuring Instruments Depend on Response Option Presentation Mode?

    ERIC Educational Resources Information Center

    Menold, Natalja; Raykov, Tenko

    2016-01-01

    This article examines the possible dependency of composite reliability on presentation format of the elements of a multi-item measuring instrument. Using empirical data and a recent method for interval estimation of group differences in reliability, we demonstrate that the reliability of an instrument need not be the same when polarity of the…

  5. CRYOGENIC UPPER STAGE SYSTEM SAFETY

    NASA Technical Reports Server (NTRS)

    Smith, R. Kenneth; French, James V.; LaRue, Peter F.; Taylor, James L.; Pollard, Kathy (Technical Monitor)

    2005-01-01

    NASA's Exploration Initiative will require development of many new systems or systems of systems. One specific example is that safe, affordable, and reliable upper stage systems to place cargo and crew in stable low Earth orbit are urgently required. In this paper, we examine the failure history of previous upper stages with liquid oxygen (LOX)/liquid hydrogen (LH2) propulsion systems. Launch data from 1964 until mid-year 2005 are analyzed and presented. This data analysis covers upper stage systems from the Ariane, Centaur, H-IIA, Saturn, and Atlas, in addition to other vehicles. Upper stage propulsion system elements have the highest impact on reliability. This paper discusses failure occurrence in all operational phases (i.e., initial burn, coast, and restarts) and trends in failure rates over time. In an effort to understand the likelihood of future failures in flight, we present timelines of engine system failures relevant to initial flight histories. Some evidence suggests that propulsion system failures resulting from design problems occur shortly after initial development of the propulsion system, whereas failures due to manufacturing or assembly processing errors may occur during any phase of the system build process. This paper also explores the detectability of historical failures. Observations from this review are used to ascertain the potential for increased upper stage reliability given investments in integrated system health management. Based on a clear understanding of the failure and success history of previous efforts by multiple space hardware development groups, the paper investigates potential improvements that can be realized through application of system safety principles.

  6. Inventing the future of reliability: FERC's recent orders and the consolidation of reliability authority

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skees, J. Daniel

    2010-06-15

    The Energy Policy Act of 2005 established mandatory reliability standard enforcement under a system in which the Federal Energy Regulatory Commission and the Electric Reliability Organization would have their own spheres of responsibility and authority. Recent orders, however, reflect the Commission's frustration with the reliability standard drafting process and suggest that the Electric Reliability Organization's discretion is likely to receive less deference in the future.

  7. Meet OLAF, a Good Friend of the IAPS! The Open Library of Affective Foods: A Tool to Investigate the Emotional Impact of Food in Adolescents

    PubMed Central

    Miccoli, Laura; Delgado, Rafael; Rodríguez-Ruiz, Sonia; Guerra, Pedro; García-Mármol, Eduardo; Fernández-Santaella, M. Carmen

    2014-01-01

    In recent decades, food pictures have been repeatedly employed to investigate the emotional impact of food on healthy participants as well as individuals who suffer from eating disorders and obesity. However, despite their widespread use, food pictures are typically selected according to each researcher's personal criteria, which makes it difficult to reliably select food images and to compare results across different studies and laboratories. Therefore, to study affective reactions to food, it becomes pivotal to identify the emotional impact of specific food images based on wider samples of individuals. In the present paper we introduce the Open Library of Affective Foods (OLAF), a set of original food pictures created to allow reliable selection of food pictures based on the emotions they prompt, as indicated by affective ratings of valence, arousal, and dominance and by an additional food craving scale. OLAF images were designed to allow simultaneous use with affective images from the International Affective Picture System (IAPS), a well-known instrument for investigating emotional reactions in the laboratory. The ultimate goal of the OLAF is to contribute to understanding how food is emotionally processed in healthy individuals and in patients who suffer from eating and weight-related disorders. The present normative data, based on a large adolescent sample, indicate that when viewing affective non-food IAPS images, valence, arousal, and dominance ratings were in line with expected patterns from previous emotion research. Moreover, when viewing food pictures, affective and food craving ratings were consistent with research on food cue processing. As a whole, the data support the methodological and theoretical reliability of the OLAF ratings, providing researchers with a standardized tool to reliably investigate the emotional and motivational significance of food. The OLAF database is publicly available at zenodo.org. PMID:25490404

  9. Precision cast vs. wrought superalloys

    NASA Technical Reports Server (NTRS)

    Tien, J. K.; Borofka, J. C.; Casey, M. E.

    1986-01-01

    While cast polycrystalline superalloys recommend themselves by virtue of better 'buy-to-fly' ratios and higher strengthening gamma-prime volume fractions than those of wrought superalloys, the expansion of their use into such critical superalloy applications as gas turbine hot section components has been slowed by insufficient casting process opportunities for microstructural control. Attention is drawn here, however, to casting process developments that facilitate the production of defect-tolerant superalloy castings with improved fracture reliability. Integrally bladed turbine wheel and thin-walled turbine exhaust case near-net-shape castings have been produced by these means.

  10. An integrated algorithm for hypersonic fluid-thermal-structural numerical simulation

    NASA Astrophysics Data System (ADS)

    Li, Jia-Wei; Wang, Jiang-Feng

    2018-05-01

    In this paper, a fluid-thermal-structural integrated method based on the finite volume method is presented. A unified system of integral equations is developed as the governing equations for the physical processes of aero-heating and structural heat transfer. The whole physical field is discretized using an upwind finite volume method. To demonstrate its capability, a numerical simulation of Mach 6.47 flow over a stainless steel cylinder shows good agreement with measured values, and the method dynamically simulates the underlying physical processes. The integrated algorithm thus proves to be efficient and reliable.

  11. Laminar Soot Processes (Lsp) Experiment: Findings From Ground-Based Measurements

    NASA Technical Reports Server (NTRS)

    Kim, C. H.; El-Leathy, A. M.; Faeth, G. M.; Xu, F.

    2003-01-01

    Processes of soot formation and oxidation must be understood in order to achieve reliable computational combustion calculations for nonpremixed (diffusion) flames involving hydrocarbon fuels. Motivated by this observation, the present investigation extended earlier work on soot formation and oxidation in laminar jet ethylene/air and methane/oxygen premixed and acetylene-nitrogen/air diffusion flames at atmospheric pressure in this laboratory, emphasizing soot surface growth and early soot surface oxidation in laminar diffusion flames fueled with a variety of hydrocarbons at pressures in the range 0.1 - 1.0 atm.

  12. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  13. Allied health clinicians using translational research in action to develop a reliable stroke audit tool.

    PubMed

    Abery, Philip; Kuys, Suzanne; Lynch, Mary; Low Choy, Nancy

    2018-05-23

    Aim: To design and establish the reliability of a local stroke audit tool by engaging allied health clinicians within a privately funded hospital. Design: A two-stage study involving a modified Delphi process to inform stroke audit tool development, followed by inter-tester reliability testing. Participants: Allied health clinicians. Methods: A modified Delphi process was used to select stroke guideline recommendations for inclusion in the audit tool. Reliability study: one allied health representative from each discipline audited 10 clinical records with sequential admissions to acute and rehabilitation services. Recommendations were admitted to the audit tool when 70% agreement was reached, with 50% set as the reserve agreement. Inter-tester reliability was determined using intra-class correlation coefficients (ICCs) across the 10 clinical records. Results: Twenty-two participants (92% female, 50% physiotherapists, 17% occupational therapists) completed the modified Delphi process. Across six voting rounds, eight recommendations reached 70% agreement and two reached 50% agreement. Two recommendations (nutrition/hydration; goal setting) were added to ensure representation of all disciplines. Substantial consistency across raters was established for the audit tool applied in acute stroke (ICC .71; range .48 to .90) and rehabilitation (ICC .78; range .60 to .93) services. Conclusions: Allied health clinicians within a privately funded hospital generally agreed in an audit process to develop a reliable stroke audit tool. Clinicians agreed on stroke guideline recommendations to inform the tool, which demonstrated substantial consistency, supporting its future use for service development. This process, which engages local clinicians, could be adopted by other facilities to design reliable audit tools that identify local service gaps and inform changes to clinical practice.
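
    As an illustration of the inter-rater statistic reported above, the following sketch computes a two-way random-effects, absolute-agreement ICC(2,1) from a subjects-by-raters matrix; the data are synthetic, and the ICC variant used in the study may differ:

    ```python
    # Two-way random-effects, absolute-agreement ICC(2,1) from an
    # ANOVA decomposition of a subjects x raters score matrix.
    import numpy as np

    def icc_2_1(X):
        """X: (n_subjects, k_raters) matrix of audit scores."""
        X = np.asarray(X, dtype=float)
        n, k = X.shape
        grand = X.mean()
        row_m, col_m = X.mean(axis=1), X.mean(axis=0)
        msr = k * ((row_m - grand) ** 2).sum() / (n - 1)      # subjects
        msc = n * ((col_m - grand) ** 2).sum() / (k - 1)      # raters
        sse = ((X - row_m[:, None] - col_m[None, :] + grand) ** 2).sum()
        mse = sse / ((n - 1) * (k - 1))                       # residual
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    rng = np.random.default_rng(2)
    truth = rng.normal(size=(10, 1))                  # 10 clinical records
    scores = truth + 0.3 * rng.normal(size=(10, 4))   # 4 discipline raters
    print(round(icc_2_1(scores), 2))
    ```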

  14. Intelligent Work Process Engineering System

    NASA Technical Reports Server (NTRS)

    Williams, Kent E.

    2003-01-01

    Optimizing performance on work activities and processes requires metrics of performance for management to monitor and analyze in order to support further improvements in efficiency, effectiveness, safety, reliability, and cost. Information systems are therefore required to assist management in making timely, informed decisions regarding these work processes and activities. Currently, information systems regarding Space Shuttle maintenance and servicing do not exist to support such timely decisions. The work presented here details a system that incorporates various automated and intelligent processes and analysis tools to capture, organize, and analyze work-process-related data and to make the decisions necessary to meet KSC organizational goals. The advantages and disadvantages of design alternatives for the development of such a system are discussed, including technologies that would need to be designed, prototyped, and evaluated.

  15. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.
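
    For orientation, the TMR topology being verified is simple in principle: the protected logic is replicated three times and a majority voter masks any single faulty copy. A minimal bitwise sketch of the voting function (not the authors' verification tool):

    ```python
    def tmr_vote(a: int, b: int, c: int) -> int:
        """Bitwise majority of three redundant module outputs: any single
        corrupted copy is outvoted by the other two."""
        return (a & b) | (a & c) | (b & c)

    # One module output corrupted by an upset; the voter masks it.
    assert tmr_vote(0b1010, 0b1010, 0b0110) == 0b1010
    ```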

  16. The NASA Monographs on Shell Stability Design Recommendations: A Review and Suggested Improvements

    NASA Technical Reports Server (NTRS)

    Nemeth, Michael P.; Starnes, James H., Jr.

    1998-01-01

    A summary of the existing NASA design criteria monographs for the design of buckling-resistant thin-shell structures is presented. Subsequent improvements in the analysis for nonlinear shell response are reviewed, and current issues in shell stability analysis are discussed. Examples of nonlinear shell responses that are not included in the existing shell design monographs are presented, and an approach for including reliability-based analysis procedures in the shell design process is discussed. Suggestions for conducting future shell experiments are presented, and proposed improvements to the NASA shell design criteria monographs are discussed.

  18. Trusted Care In The Air Force Medical Service: Practical Recommendations For Transformation

    DTIC Science & Technology

    2016-02-16

    Extracted citation fragments: Karl E. Weick, Kathleen M. Sutcliffe, and David Obstfeld, "Organizing for High Reliability: Processes of Collective Mindfulness," in Research in Organizational Behavior, ed. R.S. Sutton and B.M. Staw; La Porte, "High Reliability Organizations: Unlikely, Demanding and At Risk," Journal of Contingencies and Crisis Management 4, no. 2 (June 1996): 60.

  19. Incorporating travel-time reliability into the congestion management process : a primer.

    DOT National Transportation Integrated Search

    2015-02-01

    This primer explains the value of incorporating travel-time reliability into the Congestion Management Process (CMP) and identifies the most current tools available to assist with this effort. It draws from applied research and best practices fro...

  20. Reliability and Probabilistic Risk Assessment - How They Play Together

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Stutts, Richard G.; Zhaofeng, Huang

    2015-01-01

    PRA methodology is one of the probabilistic analysis methods that NASA brought from the nuclear industry to assess the risk of loss of mission (LOM), loss of vehicle (LOV), and loss of crew (LOC) for launch vehicles. PRA is a system-scenario-based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability and statistical data to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: What can go wrong? How likely is it? What is the severity of the degradation? Since 1986, NASA, along with industry partners, has conducted a number of PRA studies to predict overall launch vehicle risks. Planning Research Corporation conducted the first of these studies in 1988. In 1995, Science Applications International Corporation (SAIC) conducted a comprehensive PRA study. In July 1996, NASA began a two-year study (October 1996 - September 1998) to develop a model that provided the overall Space Shuttle risk and estimates of risk changes due to proposed Space Shuttle upgrades. After the Columbia accident, NASA conducted a PRA on the Shuttle External Tank (ET) foam. This study was the most focused and extensive risk assessment that NASA has conducted in recent years. It used a dynamic, physics-based, integrated system analysis approach to understand the integrated system risk due to ET foam loss in flight. Most recently, a PRA for the Ares I launch vehicle has been performed in support of the Constellation program. Reliability, on the other hand, addresses the loss of functions. In a broader sense, reliability engineering is a discipline that involves the application of engineering principles to the design and processing of products, both hardware and software, for meeting product reliability requirements or goals. It is a very broad design-support discipline with important interfaces with many other engineering disciplines. Reliability as a figure of merit (i.e., the metric) is the probability that an item will perform its intended function(s) for a specified mission profile. In general, the reliability metric can be calculated through analyses using reliability demonstration and reliability prediction methodologies. Reliability analysis is critical for understanding component failure mechanisms and for identifying reliability-critical design and process drivers. The following sections discuss the PRA process and reliability engineering in detail and provide an application where reliability analysis and PRA were jointly used in a complementary manner to support a Space Shuttle flight risk assessment.
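
    For independent basic events, the fault-tree side of a PRA reduces to simple AND/OR probability arithmetic; a toy example with invented event names and probabilities:

    ```python
    # Tiny illustration of fault-tree arithmetic underlying a PRA:
    # top-event probability for independent basic events through AND/OR gates.
    def p_or(*ps):   # at least one event occurs (independent events)
        q = 1.0
        for p in ps:
            q *= (1.0 - p)
        return 1.0 - q

    def p_and(*ps):  # all events occur (independent events)
        q = 1.0
        for p in ps:
            q *= p
        return q

    p_valve = 1e-3        # hypothetical basic-event probabilities
    p_sensor = 5e-4
    p_backup = 2e-2
    # Loss of function if (valve fails AND backup fails) OR sensor fails.
    print(p_or(p_and(p_valve, p_backup), p_sensor))
    ```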

  1. Human reliability assessment: tools for law enforcement

    NASA Astrophysics Data System (ADS)

    Ryan, Thomas G.; Overlin, Trudy K.

    1997-01-01

    This paper suggests ways in which human reliability analysis (HRA) can assist the United States Justice System, and more specifically law enforcement, in enhancing the reliability of the process from evidence gathering through adjudication. HRA is an analytic process for identifying, describing, quantifying, and interpreting the state of human performance, and for developing and recommending enhancements based on the results of individual HRAs. It also draws on lessons learned from compilations of several HRAs. Given the high legal standards the Justice System is bound to, human errors that might appear trivial in other venues can make the difference between a successful and an unsuccessful prosecution. HRA has made a major contribution to the efficiency, favorable cost-benefit ratio, and overall success of many enterprises where humans interface with sophisticated technologies, such as the military, ground transportation, chemical and oil production, nuclear power generation, commercial aviation, and space flight. Each of these enterprises presents similar challenges to the humans responsible for executing actions and action sequences, especially where problem solving and decision making are concerned. Nowhere are humans confronted with problem solving and decision making to a greater degree than are the diverse individuals and teams responsible for arrest and adjudication in criminal proceedings. This paper concludes that because of the parallels between the aforementioned technologies and the adjudication process, especially crime scene evidence gathering, there is reason to believe that HRA technology, developed and enhanced in other applications, can be transferred to the Justice System with minimal cost and significant payoff.

  2. The Use of In-Service Inspection Data in the Performance Measurement of Non-Destructive Inspections (Mise en oeuvre de donn es resultant de visites d’inspection en service pour l’evaluation des performances des visites d’inspection non destructives)

    DTIC Science & Technology

    2005-03-01

    Extraction fragments from the report: significant numbers of in-service inspections are occurring, but at present there is no organized process whereby these data are collected; an event titled in part "Reliability under Field Conditions" was held in Brussels in May 1998; the processes under which these data could be collected must be defined. The remaining fragments are table-of-contents entries (impact on existing certification issues, risk assessment and POD, data collection process).

  3. Achieving High Reliability with People, Processes, and Technology.

    PubMed

    Saunders, Candice L; Brennan, John A

    2017-01-01

    High reliability as a corporate value in healthcare can be achieved by meeting the "Quadruple Aim" of improving population health, reducing per capita costs, enhancing the patient experience, and improving provider wellness. This drive starts with the board of trustees, CEO, and other senior leaders who ingrain high reliability throughout the organization. At WellStar Health System, the board developed an ambitious goal to become a top-decile health system in safety and quality metrics. To achieve this goal, WellStar has embarked on a journey toward high reliability and has committed to Lean management practices consistent with the Institute for Healthcare Improvement's definition of a high-reliability organization (HRO): one that is committed to the prevention of failure, early identification and mitigation of failure, and redesign of processes based on identifiable failures. In the end, a successful HRO can provide safe, effective, patient- and family-centered, timely, efficient, and equitable care through a convergence of people, processes, and technology.

  4. Integrating reliability and maintainability into a concurrent engineering environment

    NASA Astrophysics Data System (ADS)

    Phillips, Clifton B.; Peterson, Robert R.

    1993-02-01

    This paper describes the results of a reliability and maintainability study conducted at the University of California, San Diego and supported by private industry. Industry partners considered the study important and provided the university access to innovative tools under a cooperative agreement. The current capability of reliability and maintainability tools, and how they fit into the design process, is investigated. The evolution of design methodologies leading up to today's capability is reviewed for ways to enhance the design process while keeping cost under control. A method for measuring the consequences of reliability and maintainability policy for design configurations in an electronic environment is provided. The interaction of selected modern computer tool sets is described for reliability, maintainability, operations, and other elements of the engineering design process. These tools provide a robust system evaluation capability that brings life-cycle performance improvement information to engineers and their managers before systems are deployed, and allows them to monitor and track performance during operation.

  5. Supervisor/Peer Involvement in Evaluation Transfer of Training Process and Results Reliability: A Research in an Italian Public Body

    ERIC Educational Resources Information Center

    Capaldo, Guido; Depolo, Marco; Rippa, Pierluigi; Schiattone, Domenico

    2017-01-01

    Purpose: The aim of this paper is to present a study performed in conjunction with a branch of the Italian Public Italian Administration, the ISSP (Istituto Superiore di Studi Penitenziari--the Higher Institute of Penitentiary Studies). The study aimed to develop a Transfer of Training (ToT) evaluation methodology that would be both scientifically…

  6. Determining the Best-Fit FPGA for a Space Mission: An Analysis of Cost, SEU Sensitivity, and Reliability

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Ken

    2007-01-01

    This viewgraph presentation reviews the selection of the optimum Field Programmable Gate Arrays (FPGA) for space missions. Included in this review is a discussion on differentiating amongst various FPGAs, cost analysis of the various options, the investigation of radiation effects, an expansion of the evaluation criteria, and the application of the evaluation criteria to the selection process.

  7. Compact, Automated, Frequency-Agile Microspectrofluorimeter

    NASA Technical Reports Server (NTRS)

    Fernandez, Salvador M.; Guignon, Ernest F.

    1995-01-01

    Compact, reliable, rugged, automated cell-culture and frequency-agile microspectrofluorimetric apparatus developed to perform experiments involving photometric imaging observations of single live cells. In original application, apparatus operates mostly unattended aboard spacecraft; potential terrestrial applications include automated or semiautomated diagnosis of pathological tissues in clinical laboratories, biomedical instrumentation, monitoring of biological process streams, and portable instrumentation for testing biological conditions in various environments. Offers obvious advantages over present laboratory instrumentation.

  8. Future FAA Telecommunications Plan

    DTIC Science & Technology

    1991-08-01

    Extraction fragments from the report: increases in flight traffic and resultant data loads have caused delays in processing and strain the reliability and maintainability of the present system. Requirement fragments include analyzing and evaluating the effectiveness of the traffic management system and notifying specialists of results within 10 seconds of a request, and accepting Notices to Airmen (NOTAMs) and distributing them to the appropriate ACCC, TCCC, and systems engineer positions.

  9. Automated System of Diagnostic Monitoring at Bureya HPP Hydraulic Engineering Installations: a New Level of Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musyurka, A. V., E-mail: musyurkaav@burges.rushydro.ru

    This article presents the design, hardware, and software solutions developed and placed in service for the automated system of diagnostic monitoring (ASDM) of hydraulic engineering installations at the Bureya HPP, which assures a reliable process for monitoring these installations. Implementation of the project provides a timely solution to the problems addressed by the hydraulic engineering installation diagnostics section.

  10. Structural Reliability Analysis and Optimization: Use of Approximations

    NASA Technical Reports Server (NTRS)

    Grandhi, Ramana V.; Wang, Liping

    1999-01-01

    This report is intended for the demonstration of function approximation concepts and their applicability in reliability analysis and design. Particularly, approximations in the calculation of the safety index, failure probability, and structural optimization (modification of design variables) are developed. With this scope in mind, extensive details on probability theory are avoided. Definitions relevant to the stated objectives have been taken from standard textbooks. The idea of function approximations is to minimize the repetitive use of computationally intensive calculations by replacing them with simpler closed-form equations, which could be nonlinear. Typically, the approximations provide good accuracy around the points where they are constructed, and they need to be periodically updated to extend their utility. There are approximations in calculating the failure probability of a limit state function. The first one, which is most commonly discussed, is how the limit state is approximated at the design point. Most of the time this could be a first-order Taylor series expansion, also known as the First Order Reliability Method (FORM), or a second-order Taylor series expansion (paraboloid), also known as the Second Order Reliability Method (SORM). From the computational procedure point of view, this step comes after the design point identification; however, the order of approximation for the probability of failure calculation is discussed first, and it is denoted by either FORM or SORM. The other approximation of interest is how the design point, or the most probable failure point (MPP), is identified. For iteratively finding this point, again the limit state is approximated. The accuracy and efficiency of the approximations make the search process quite practical for analysis-intensive approaches such as the finite element methods; therefore, the crux of this research is to develop excellent approximations for MPP identification and also different approximations, including the higher-order reliability methods (HORM), for representing the failure surface. This report is divided into several parts to emphasize different segments of structural reliability analysis and design. Broadly, it consists of mathematical foundations, methods, and applications. Chapter 1 discusses the fundamental definitions of probability theory, which are mostly available in standard textbooks. Probability density function descriptions relevant to this work are addressed. In Chapter 2, the concept and utility of function approximation are discussed for general application in engineering analysis. Various forms of function representations and the latest developments in nonlinear adaptive approximations are presented with comparison studies. Research work accomplished in reliability analysis is presented in Chapter 3. First, the definitions of the safety index and the most probable point of failure are introduced. Efficient ways of computing the safety index with fewer iterations are emphasized. In Chapter 4, the probability of failure prediction is presented using first-order, second-order, and higher-order methods. System reliability methods are discussed in Chapter 5. Chapter 6 presents optimization techniques for the modification and redistribution of structural sizes for improving structural reliability. The report also contains several appendices on probability parameters.
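
    The safety-index computation described above can be illustrated with the standard HL-RF iteration for FORM: find the most probable point (MPP) in standard normal space and take beta as its distance from the origin, with Pf approximated as Phi(-beta). The limit state below is an arbitrary example, not one from the report:

    ```python
    # FORM via the HL-RF iteration on an example limit state g(u) in
    # standard normal space; failure corresponds to g(u) < 0.
    import numpy as np
    from scipy.stats import norm

    def g(u):
        return 5.0 - u[0] ** 2 - 2.0 * u[1]

    def grad_g(u):
        return np.array([-2.0 * u[0], -2.0])

    def form_beta(u0=np.zeros(2), tol=1e-8, it_max=100):
        u = u0.astype(float)
        for _ in range(it_max):
            gv, gr = g(u), grad_g(u)
            u_new = (gr @ u - gv) * gr / (gr @ gr)   # HL-RF update
            if np.linalg.norm(u_new - u) < tol:
                u = u_new
                break
            u = u_new
        beta = np.linalg.norm(u)                     # safety index
        return beta, norm.cdf(-beta), u              # beta, Pf, MPP

    beta, pf, mpp = form_beta()
    print(f"beta={beta:.3f}  Pf~{pf:.2e}  MPP={mpp}")
    ```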

  11. Reliability Analysis and Standardization of Spacecraft Command Generation Processes

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Grenander, Sven; Evensen, Ken

    2011-01-01

    In order to reduce commanding errors that are caused by humans, we create an approach and corresponding artifacts for standardizing the command generation process and conducting risk management during the design and assurance of such processes. The literature review conducted during the standardization process revealed that very few atomic-level human activities are associated with even a broad set of missions. Applicable human reliability metrics for performing these atomic-level tasks are available. The process for building a "Periodic Table" of Command and Control Functions as well as Probabilistic Risk Assessment (PRA) models is demonstrated. The PRA models are executed using data from human reliability data banks. The Periodic Table is related to the PRA models via Fault Links.

  12. Ejectors of power plants turbine units efficiency and reliability increasing

    NASA Astrophysics Data System (ADS)

    Aronson, K. E.; Ryabchikov, A. Yu.; Kuptsov, V. K.; Murmanskii, I. B.; Brodov, Yu. M.; Zhelonkin, N. V.; Khaet, S. I.

    2017-11-01

    The functioning of steam turbine condensation systems strongly influences the efficiency and reliability of a power plant. Operation of the condensation system is supported by the main ejectors, which maintain the vacuum level in the condenser. Developing methods to increase the efficiency and reliability of ejector operation is therefore a pressing problem in modern power engineering. The paper presents a statistical analysis of ejector failures revealed during repairs and the influence of such damage on steam turbine operating reliability. It is determined that 3% of steam turbine equipment failures are ejector failures, while about 7% of turbine failures are caused by various ejector malfunctions. Design solutions that have been developed and validated to improve ejector performance are presented. The intercoolers are designed in separate cases, so that the air-steam mixture cannot move from the high-pressure zones to the low-pressure zones, and the maintainability of the apparatus is increased. The use of U-type tubes compensates for the thermal expansion of the intercooler tubes and increases the heat-transfer area. The nozzle mounting design allows the distance between the nozzle and the mixing chamber (the nozzle exit position) to be changed to optimize operating performance. Experimental studies of more than 30 serial ejectors, as well as of the high-efficiency three-stage ejector EPO-3-80 designed by the authors, have been carried out under operating conditions. The measurement scheme of the designed ejector includes 21 indicators. Results of experimental tests with different nozzle exit positions of the EPO-3-80 ejector stream devices are presented, and the primary stream (water steam) pressure is optimized. The experimental data agree well with the calculation results.

  13. Column Grid Array Rework for High Reliability

    NASA Technical Reports Server (NTRS)

    Mehta, Atul C.; Bodie, Charles C.

    2008-01-01

    Due to requirements for reduced size and weight, use of grid array packages in space applications has become commonplace. To meet the requirements of high reliability and a high number of I/Os, ceramic column grid array (CCGA) packages were selected for major electronic components used in the next Mars Rover mission (specifically, high-density Field Programmable Gate Arrays). The probability of removal and replacement of these devices on the actual flight printed wiring board assemblies is deemed to be very high, because last-minute discoveries in final test will dictate changes in the firmware. The questions and challenges presented to the manufacturing organizations engaged in the production of high-reliability electronic assemblies are: Is the reliability of the PWBA adversely affected by rework (removal and replacement) of the CGA package? And how many times can the same board be reworked without destroying a pad or degrading the lifetime of the assembly? To answer these questions, the most complex printed wiring board assembly used by the project was chosen as the test vehicle; the PWB was modified to provide a daisy-chain pattern, and a number of bare PWBs were acquired to this modified design. Non-functional 624-pin CGA packages with internal daisy chains matching the pattern on the PWB were procured. The combination of the modified PWB and the daisy-chained packages enables continuity measurements of every soldered contact during subsequent testing and thermal cycling. Several test vehicle boards were assembled, reworked, and then thermal cycled to assess the reliability of the solder joints and board material, including pads and traces near the CGA. The details of the rework process and the results of thermal cycling are presented in this paper.

  14. Applicability of near-infrared spectroscopy in the monitoring of film coating and curing process of the prolonged release coated pellets.

    PubMed

    Korasa, Klemen; Hudovornik, Grega; Vrečer, Franc

    2016-10-10

    Although process analytical technology (PAT) guidance was introduced to the pharmaceutical industry only a decade ago, this innovative approach has already become an important part of efficient pharmaceutical development, manufacturing, and quality assurance. PAT tools are especially important in technologically complex operations that require strict control of critical process parameters and have a significant effect on final product quality. Manufacturing of prolonged-release film-coated pellets is definitely one such process. The aim of the present work was to study the applicability of at-line near-infrared spectroscopy (NIR) in the monitoring of the pellet film coating and curing steps. Film-coated pellets were manufactured by coating active-ingredient-containing pellets with a film coating based on polymethacrylate polymers (Eudragit® RS/RL). NIR proved to be a useful tool for monitoring the curing process, since it was able to determine the extent of curing and hence predict the drug release rate using a partial least squares (PLS) model. However, this approach also showed a number of limitations, such as low reliability and high susceptibility to pellet moisture content, and was thus not able to predict drug release from pellets with high moisture content. On the other hand, at-line NIR was capable of predicting the thickness of the Eudragit® RS/RL film coating over a wide range (up to 40 μm) with good accuracy, even in pellets with high moisture content. In summary, the present study demonstrates the high applicability of at-line NIR in the monitoring of prolonged-release pellet production. These findings may contribute to more efficient and reliable PAT solutions in the manufacturing of prolonged-release dosage forms.
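
    A minimal sketch of the PLS modeling step, using scikit-learn on synthetic stand-in spectra; the data, wavelength grid, and signal shape are invented and do not reflect the study's preprocessing:

    ```python
    # PLS regression from (synthetic) NIR spectra to a coating property.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(3)
    spectra = rng.normal(size=(60, 200))        # 60 samples, 200 wavelengths
    coating = rng.uniform(5, 40, size=60)       # film thickness, um
    # Embed a coating-dependent spectral feature into the noise.
    spectra += 0.05 * np.outer(coating, np.sin(np.linspace(0, 3, 200)))

    pls = PLSRegression(n_components=3)
    print(cross_val_score(pls, spectra, coating, cv=5, scoring="r2"))
    ```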

  15. Electrocardiograph-gated single photon emission computed tomography radionuclide angiography presents good interstudy reproducibility for the quantification of global systolic right ventricular function.

    PubMed

    Daou, Doumit; Coaguila, Carlos; Vilain, Didier

    2007-05-01

    Electrocardiograph-gated single photon emission computed tomography (SPECT) radionuclide angiography provides accurate measurement of right ventricular ejection fraction and end-diastolic and end-systolic volumes. In this study, we report the interstudy precision and reliability of SPECT radionuclide angiography for the measurement of global systolic right ventricular function using two three-dimensional volume processing methods (SPECT-QBS, SPECT-35%). These were compared with equilibrium planar radionuclide angiography. Ten patients with chronic coronary artery disease who underwent two SPECT and planar radionuclide angiography acquisitions were included. For right ventricular ejection fraction, end-diastolic volume, and end-systolic volume, the interstudy precision and reliability were better with SPECT-35% than with SPECT-QBS. The sample sizes needed to detect a change in right ventricular volumes or ejection fraction were lower with SPECT-35% than with SPECT-QBS. The interstudy precision and reliability of SPECT-35% and SPECT-QBS for the right ventricle were better than those of equilibrium planar radionuclide angiography, but poorer than those previously reported for the left ventricle with SPECT radionuclide angiography in the same population. SPECT-35% and SPECT-QBS present good interstudy precision and reliability for right ventricular function, with the results favouring the use of SPECT-35%. The results are better than those of equilibrium planar radionuclide angiography, but poorer than those previously reported for the left ventricle with SPECT radionuclide angiography; they need to be confirmed in a larger population.

  16. Cost-effective solutions to maintaining smart grid reliability

    NASA Astrophysics Data System (ADS)

    Qin, Qiu

    As aging power systems increasingly operate close to their capacity and thermal limits, maintaining sufficient reliability has been of great concern to government agencies, utility companies, and users. This dissertation focuses on improving the reliability of transmission and distribution systems. Based on wide-area measurements, multiple model algorithms are developed to diagnose transmission line three-phase short-to-ground faults in the presence of protection misoperations. The multiple model algorithms utilize the electric network dynamics to provide prompt and reliable diagnosis outcomes. Computational complexity of the diagnosis algorithm is reduced by using a two-step heuristic. The multiple model algorithm is incorporated into a hybrid simulation framework, which consists of both continuous state simulation and discrete event simulation, to study the operation of transmission systems. With hybrid simulation, a line switching strategy for enhancing tolerance to protection misoperations is studied based on the concept of a security index, which involves the faulted mode probability and stability coverage. Local measurements are used to track the generator state, and faulted mode probabilities are calculated in the multiple model algorithms. FACTS devices are considered as controllers for the transmission system. The placement of FACTS devices into power systems is investigated with a criterion of maintaining a prescribed level of control reconfigurability. Control reconfigurability measures the small-signal combined controllability and observability of a power system with an additional requirement on fault tolerance. For distribution systems, a hierarchical framework is presented, including a high-level recloser allocation scheme and a low-level recloser placement scheme. The impacts of recloser placement on reliability indices are analyzed. Evaluation of reliability indices in the placement process is carried out via discrete event simulation. The reliability requirements are described with probabilities and evaluated from the empirical distributions of the reliability indices.
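
    The multiple-model idea can be sketched as a Bayesian update of mode probabilities from the residuals of one filter per hypothesized mode; the models, noise level, and numbers below are illustrative only, not the dissertation's algorithm:

    ```python
    # Multiple-model fault diagnosis sketch: one model per hypothesized mode
    # (normal, fault A, fault B); mode probabilities updated via Bayes' rule.
    import numpy as np

    def update_mode_probs(priors, residuals, sigma=1.0):
        """priors: current mode probabilities; residuals: innovation of each
        mode's filter for the latest measurement. Gaussian likelihoods."""
        residuals = np.asarray(residuals, dtype=float)
        like = np.exp(-0.5 * (residuals / sigma) ** 2)
        post = np.asarray(priors) * like
        return post / post.sum()

    probs = np.array([0.98, 0.01, 0.01])      # normal, fault A, fault B
    # Fault A's model explains the new measurement best (smallest residual);
    # probability mass shifts toward fault A over repeated updates.
    probs = update_mode_probs(probs, residuals=[3.0, 0.2, 2.5])
    print(probs)
    ```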

  17. Silk-based blood stabilization for diagnostics.

    PubMed

    Kluge, Jonathan A; Li, Adrian B; Kahn, Brooke T; Michaud, Dominique S; Omenetto, Fiorenzo G; Kaplan, David L

    2016-05-24

    Advanced personalized medical diagnostics depend on the availability of high-quality biological samples. These are typically biofluids, such as blood, saliva, or urine; and their collection and storage is critical to obtain reliable results. Without proper temperature regulation, protein biomarkers in particular can degrade rapidly in blood samples, an effect that ultimately compromises the quality and reliability of laboratory tests. Here, we present the use of silk fibroin as a solid matrix to encapsulate blood analytes, protecting them from thermally induced damage that could be encountered during nonrefrigerated transportation or freeze-thaw cycles. Blood samples are recovered by simple dissolution of the silk matrix in water. This process is demonstrated to be compatible with a number of immunoassays and provides enhanced sample preservation in comparison with traditional air-drying paper approaches. Additional processing can remediate interactions with conformational structures of the silk protein to further enhance blood stabilization and recovery. This approach can provide expanded utility for remote collection of blood and other biospecimens empowering new modalities of temperature-independent remote diagnostics.

  18. Distortion of online reputation by excess reciprocity: quantification and estimation of unbiased reputation

    NASA Astrophysics Data System (ADS)

    Aste, Tomaso; Livan, Giacomo; Caccioli, Fabio

    The peer-to-peer (P2P) economy relies on establishing trust in distributed networked systems, where the reliability of a user is assessed through digital peer-review processes that aggregate ratings into reputation scores. Here we present evidence of a network effect that biases digital reputations, revealing that P2P networks display exceedingly high levels of reciprocity. In fact, these are so large that they are close to the highest levels structurally compatible with the networks' reputation landscape. This indicates that the crowdsourcing process underpinning digital reputation is significantly distorted by the attempt of users to mutually boost reputation, or to retaliate, through the exchange of ratings. We find that the least active users are predominantly responsible for this reciprocity-induced bias, and that this fact can be exploited to obtain more reliable reputation estimates. Our findings are robust across different P2P platforms, including both cases where ratings are used to vote on the content produced by users and cases where they are used to vote on user profiles.
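
    Reciprocity of a directed rating network, the quantity at the heart of this analysis, is simply the fraction of directed edges that are mutual; a toy computation with networkx (the users and edges are invented):

    ```python
    # Fraction of mutual (reciprocated) rating edges in a toy rating graph.
    import networkx as nx

    g = nx.DiGraph()
    g.add_edges_from([
        ("alice", "bob"), ("bob", "alice"),     # mutual pair (reciprocated)
        ("carol", "alice"),                     # one-way rating
        ("dave", "bob"), ("bob", "dave"),       # mutual pair
    ])

    # Overall reciprocity: reciprocated edges / total directed edges.
    print(nx.reciprocity(g))                    # 4 of 5 edges mutual -> 0.8
    ```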

  19. Computer calculation of device, circuit, equipment, and system reliability.

    NASA Technical Reports Server (NTRS)

    Crosby, D. R.

    1972-01-01

    A grouping into four classes is proposed for all reliability computations related to electronic equipment. Examples are presented of reliability computations in three of these four classes. Each of the three specific reliability tasks described was originally undertaken to satisfy an engineering need for reliability data. The form and interpretation of the printout of each computation are presented, and the justification for the cost of these computations is indicated. The skills of the personnel used to conduct the analysis, the interfaces between the personnel, and the timing of the projects are discussed.
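
    The simplest class of such equipment-level computations combines component reliabilities through series and parallel (redundant) structures; a hypothetical example:

    ```python
    # Series/parallel reliability combination for independent components.
    def series(*rs):      # all components must work
        out = 1.0
        for r in rs:
            out *= r
        return out

    def parallel(*rs):    # at least one component must work
        out = 1.0
        for r in rs:
            out *= (1.0 - r)
        return 1.0 - out

    r_cpu, r_psu, r_link = 0.999, 0.995, 0.99   # made-up component values
    # Redundant power supplies feeding one CPU over one link:
    print(series(r_cpu, parallel(r_psu, r_psu), r_link))
    ```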

  20. Measuring the Process and Quality of Informed Consent for Clinical Research: Development and Testing

    PubMed Central

    Cohn, Elizabeth Gross; Jia, Haomiao; Smith, Winifred Chapman; Erwin, Katherine; Larson, Elaine L.

    2013-01-01

    Purpose/Objectives: To develop and assess the reliability and validity of an observational instrument, the Process and Quality of Informed Consent (P-QIC). Design: A pilot study of the psychometrics of a tool designed to measure the quality and process of the informed consent encounter in clinical research. The study used professionally filmed, simulated consent encounters designed to vary in process and quality. Setting: A major urban teaching hospital in the northeastern region of the United States. Sample: 63 students enrolled in health-related programs participated in psychometric testing, 16 students participated in test-retest reliability, and 5 investigator-participant dyads were observed for the actual consent encounters. Methods: For reliability and validity testing, students watched and rated videotaped simulations of four consent encounters intentionally varied in process and content and rated them with the proposed instrument. Test-retest reliability was established by raters watching the videotaped simulations twice. Inter-rater reliability was demonstrated by two simultaneous but independent raters observing an actual consent encounter. Main Research Variables: The essential elements of information and communication for informed consent. Findings: The initial testing of the P-QIC demonstrated reliable and valid psychometric properties in both the simulated standardized consent encounters and actual consent encounters in the hospital setting. Conclusions: The P-QIC is an easy-to-use observational tool that provides a quick assessment of the areas of strength and areas that need improvement in a consent encounter. It can be used in the initial trainings of new investigators or consent administrators and in ongoing programs of improvement for informed consent. Implications for Nursing: The development of a validated observational instrument will allow investigators to assess the consent process more accurately and evaluate strategies designed to improve it. PMID:21708532

  1. The process group approach to reliable distributed computing

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1992-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, exploit sophisticated forms of cooperative computation, and achieve high reliability. Six years of research on ISIS are reviewed, describing the model, its implementation challenges, and the types of applications to which ISIS has been applied.

  2. The Requirement for Acquisition and Logistics Integration: An Examination of Reliability Management Within the Marine Corps Acquisition Process

    DTIC Science & Technology

    2002-12-01

    HMMWV family of vehicles, LVS family of vehicles, and the M198 Howitzer). The analysis is limited to an assessment of reliability management issues...AND LOGISTICS INTEGRATION: AN EXAMINATION OF RELIABILITY MANAGEMENT WITHIN THE MARINE CORPS ACQUISITION PROCESS by Marvin L. Norcross, Jr...Operations and Reports, 1215 Jefferson Davis Highway, Suite 1204, Arlington, VA 22202-4302, and to the Office of Management and Budget, Paperwork Reduction

  3. Implications of scaling on static RAM bit cell stability and reliability

    NASA Astrophysics Data System (ADS)

    Coones, Mary Ann; Herr, Norm; Bormann, Al; Erington, Kent; Soorholtz, Vince; Sweeney, John; Phillips, Michael

    1993-01-01

    In order to lower manufacturing costs and increase performance, static random access memory (SRAM) bit cells are scaled progressively toward submicron geometries. The reliability of an SRAM is highly dependent on the bit cell stability. Smaller memory cells with less capacitance and restoring current make the array more susceptible to failures from defectivity, alpha hits, and other instabilities and leakage mechanisms. Improving long-term reliability while migrating to higher density devices makes the task of building in and improving reliability increasingly difficult. Reliability requirements for high density SRAMs are very demanding, with failure rates of less than 100 failures per billion device hours (100 FITs) being a common criterion. Design techniques for increasing bit cell stability and manufacturability must be implemented in order to build in this level of reliability. Several types of analyses are performed to benchmark the performance of the SRAM device. Examples of the analysis techniques presented here include DC parametric measurements of test structures, functional bit mapping of the circuit used to characterize the entire distribution of bits, electrical microprobing of weak and/or failing bits, and system and accelerated soft error rate measurements. These tests allow process and design improvements to be evaluated prior to implementation on the final product. These results are used to provide comprehensive bit cell characterization which can then be compared to device models and adjusted accordingly to provide optimized cell stability versus cell size for a particular technology. The result is designed-in reliability, which can be accomplished during the early stages of product development.
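
    For orientation, the 100 FIT criterion quoted above converts to a mean time to failure as follows (a standard definition, not specific to this paper):

        $$ 100\ \mathrm{FIT} = \frac{100\ \text{failures}}{10^{9}\ \text{device-hours}} \;\Longrightarrow\; \mathrm{MTTF} = \frac{10^{9}\ \text{h}}{100} = 10^{7}\ \text{h} \approx 1{,}100\ \text{years per device}. $$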

  4. Subject-level reliability analysis of fast fMRI with application to epilepsy.

    PubMed

    Hao, Yongfu; Khoo, Hui Ming; von Ellenrieder, Nicolas; Gotman, Jean

    2017-07-01

    Recent studies have applied the new magnetic resonance encephalography (MREG) sequence to the study of interictal epileptic discharges (IEDs) in the electroencephalogram (EEG) of epileptic patients. However, there are no criteria to quantitatively evaluate different processing methods so as to make proper use of the new sequence. We evaluated different processing steps of this new sequence under the common generalized linear model (GLM) framework by assessing the reliability of results. A bootstrap sampling technique was first used to generate multiple replicated data sets; a GLM with different processing steps was then applied to obtain activation maps, and the reliability of these maps was assessed. We applied our analysis in an event-related GLM related to IEDs. A higher reliability was achieved by using a GLM with a head-motion confound regressor with 24 components rather than the usual 6, with an autoregressive model of order 5, and with a canonical hemodynamic response function (HRF) rather than variable-latency or patient-specific HRFs. Comparison of activation with the IED field also favored the canonical HRF, consistent with the reliability analysis. The reliability analysis helps to optimize the processing methods for this fast fMRI sequence, in a context in which we do not know the ground truth of activation areas. Magn Reson Med 78:370-382, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
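
    The bootstrap-then-compare logic described here can be illustrated compactly: resample trials, recompute a thresholded activation map per replicate, and score pairwise overlap. A minimal sketch on synthetic data (array shapes, the t threshold, and the Dice score are illustrative assumptions, not the authors' pipeline):

        import numpy as np

        rng = np.random.default_rng(0)
        n_trials, n_voxels = 40, 500
        data = rng.normal(0.0, 1.0, (n_trials, n_voxels))
        data[:, :50] += 0.8                 # 50 truly "active" voxels

        def activation_map(trials):
            # One-sample t statistic per voxel, thresholded to a binary map.
            t = trials.mean(0) / (trials.std(0, ddof=1) / np.sqrt(len(trials)))
            return t > 3.0

        def dice(a, b):
            return 2 * np.logical_and(a, b).sum() / max(a.sum() + b.sum(), 1)

        # Bootstrap replicated data sets and score reliability as map overlap.
        maps = []
        for _ in range(20):
            idx = rng.integers(0, n_trials, n_trials)   # resample trials
            maps.append(activation_map(data[idx]))
        scores = [dice(maps[i], maps[j])
                  for i in range(len(maps)) for j in range(i + 1, len(maps))]
        print(f"mean pairwise Dice = {np.mean(scores):.2f}")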

  5. Time Lapse Photography From Arctic Buoys

    NASA Astrophysics Data System (ADS)

    Valentic, T. A.; Matrai, P.; Woods, J. E.

    2013-12-01

    We have equipped a number of buoys with cameras that have been deployed throughout the Arctic. These systems need to be simple, reliable, and low power. The images are transmitted over an Iridium satellite link and assembled into long-running movies. We have captured a number of interesting events, observed the ice dynamics through the year, and recorded visits by local wildlife. Each of the systems has been deployed for periods of up to a year, with images every hour. The cameras have proved to be a great outreach tool and are routinely watched by a number of people on our websites. This talk will present the techniques used in developing these camera systems, the methods used for reliably transmitting the images, and the process for generating the movies.

  6. Empirical evaluation of the Process Overview Measure for assessing situation awareness in process plants.

    PubMed

    Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd

    2016-03-01

    The Process Overview Measure is a query-based measure developed to assess operator situation awareness (SA) from monitoring process plants. A companion paper describes how the measure has been developed according to process plant properties and operator cognitive work. The Process Overview Measure demonstrated practicality, sensitivity, validity and reliability in two full-scope simulator experiments investigating dramatically different operational concepts. Practicality was assessed based on qualitative feedback of participants and researchers. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated with other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA in full-scope simulator settings based on data collected on process experts. Thus, full-scope simulator studies can employ the Process Overview Measure to reveal the impact of new control room technology and operational concepts on monitoring process plants. Practitioner Summary: The Process Overview Measure is a query-based measure that demonstrated practicality, sensitivity, validity and reliability for assessing operator situation awareness (SA) from monitoring process plants in representative settings.

  7. An Evidential Reasoning-Based CREAM to Human Reliability Analysis in Maritime Accident Process.

    PubMed

    Wu, Bing; Yan, Xinping; Wang, Yang; Soares, C Guedes

    2017-10-01

    This article proposes a modified cognitive reliability and error analysis method (CREAM) for estimating the human error probability in the maritime accident process on the basis of an evidential reasoning approach. This modified CREAM is developed to precisely quantify the linguistic variables of the common performance conditions and to overcome the problem of ignoring the uncertainty caused by incomplete information in the existing CREAM models. Moreover, this article views maritime accident development from the sequential perspective, where a scenario- and barrier-based framework is proposed to describe the maritime accident process. This evidential reasoning-based CREAM approach together with the proposed accident development framework are applied to human reliability analysis of a ship capsizing accident. It will facilitate subjective human reliability analysis in different engineering systems where uncertainty exists in practice. © 2017 Society for Risk Analysis.

  8. The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1995-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems, and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  9. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  10. Assessment of bachelor's theses in a nursing degree with a rubrics system: Development and validation study.

    PubMed

    González-Chordá, Víctor M; Mena-Tudela, Desirée; Salas-Medina, Pablo; Cervera-Gasch, Agueda; Orts-Cortés, Isabel; Maciá-Soler, Loreto

    2016-02-01

    Writing a bachelor thesis (BT) is the last step to obtain a nursing degree. In order to perform an effective assessment of a nursing BT, certain reliable and valid tools are required. To develop and validate a 3-rubric system (drafting process, dissertation, and viva) to assess final year nursing students' BT. A multi-disciplinary study of content validity and psychometric properties. The study was carried out between December 2014 and July 2015. Nursing Degree at Universitat Jaume I. Spain. Eleven experts (9 nursing professors and 2 education professors from 6 different universities) took part in the development and content validity stages. Fifty-two theses presented during the 2014-2015 academic year were included by consecutive sampling of cases in order to study the psychometric properties. First, a group of experts was created to validate the content of the assessment system based on three rubrics (drafting process, dissertation, and viva). Subsequently, a reliability and validity study of the rubrics was carried out on the 52 theses presented during the 2014-2015 academic year. The BT drafting process rubric has 8 criteria (S-CVI=0.93; α=0.837; ICC=0.614), the dissertation rubric has 7 criteria (S-CVI=0.9; α=0.893; ICC=0.74), and the viva rubric has 4 criteria (S-CVI=0.86; α=0.816; ICC=0.895). A nursing BT assessment system based on three rubrics (drafting process, dissertation, and viva) has been validated. This system may be transferred to other nursing degrees or degrees from other academic areas. It is necessary to continue with the validation process taking into account factors that may affect the results obtained. Copyright © 2015 Elsevier Ltd. All rights reserved.
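
    The ICC values reported above are standard two-way reliability coefficients. A minimal sketch of an ICC(2,1) computation (Shrout-Fleiss two-way random effects, single measures) on hypothetical rating data; the scores below are illustrative, not study data:

        import numpy as np

        def icc_2_1(Y):
            """ICC(2,1): two-way random effects, absolute agreement, single rater.
            Y has shape (n_targets, k_raters)."""
            n, k = Y.shape
            grand = Y.mean()
            ss_rows = k * ((Y.mean(1) - grand) ** 2).sum()   # between targets
            ss_cols = n * ((Y.mean(0) - grand) ** 2).sum()   # between raters
            ss_err = ((Y - grand) ** 2).sum() - ss_rows - ss_cols
            ms_rows = ss_rows / (n - 1)
            ms_cols = ss_cols / (k - 1)
            ms_err = ss_err / ((n - 1) * (k - 1))
            return (ms_rows - ms_err) / (
                ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

        # Hypothetical scores: 6 theses rated by 3 examiners.
        scores = np.array([[7, 8, 7], [5, 5, 6], [9, 9, 8],
                           [4, 5, 4], [8, 7, 8], [6, 6, 7]], float)
        print(f"ICC(2,1) = {icc_2_1(scores):.2f}")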

  11. Developing a Science Process Skills Test for Secondary Students: Validity and Reliability Study

    ERIC Educational Resources Information Center

    Feyzioglu, Burak; Demirdag, Baris; Akyildiz, Murat; Altun, Eralp

    2012-01-01

    Science process skills are claimed to enable an individual to improve their own life visions and give a scientific view/literacy as a standard of their understanding about the nature of science. The main purpose of this study was to develop a test for measuring a valid, reliable and practical test for Science Process Skills (SPS) in secondary…

  12. Psychometric evaluation of the Swedish adaptation of the Inventory for Assessing the Process of Cultural Competence Among Healthcare Professionals--Revised (IAPCC-R).

    PubMed

    Olt, Helen; Jirwe, Maria; Gustavsson, Petter; Emami, Azita

    2010-01-01

    The purpose of this study was to describe the translation, adaptation, and psychometric evaluation process, in relation to validity and reliability, of the Swedish version of the instrument Inventory for Assessing the Process of Cultural Competence Among Healthcare Professionals-Revised (IAPCC-R). Validity tests were conducted on the response processes (N = 15), the content (N = 7), and the internal structure of the instrument (N = 334). Reliability (alpha = .65 for the total scale, varying between -.01 and .65 for the different subscales) was evaluated in terms of internal consistency. Results indicated weak validity and reliability, though it is difficult to conclude whether this is related to adaptation issues or the original construction. The testing of the response process identified problems in relation to respondents' conceptualization of cultural competence. The test of the content identified a weak correspondence between the items and the underlying model. In addition, a confirmatory factor analysis did not confirm the proposed structure of the instrument. This study concludes that this instrument is not valid and reliable for use with a Swedish population of practicing nurses or nursing students.

  13. Model reduction by trimming for a class of semi-Markov reliability models and the corresponding error bound

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Palumbo, Daniel L.

    1991-01-01

    Semi-Markov processes have proved to be an effective and convenient tool to construct models of systems that achieve reliability by redundancy and reconfiguration. These models are able to depict complex system architectures and to capture the dynamics of fault arrival and system recovery. A disadvantage of this approach is that the models can be extremely large, which poses both a modeling and a computational problem. Techniques are needed to reduce the model size. Because these systems are used in critical applications where failure can be expensive, there must be an analytically derived bound for the error produced by the model reduction technique. A model reduction technique called trimming is presented that can be applied to a popular class of systems. Automatic model generation programs were written to help the reliability analyst produce models of complex systems. This method, trimming, is easy to implement, and the error bound is easy to compute. Hence, the method lends itself to inclusion in an automatic model generator.
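
    To make the modeling context concrete, the sketch below solves a small Markov reliability model of a duplex system with imperfect reconfiguration coverage. The structure and rates are illustrative assumptions, not taken from the paper, and a full semi-Markov treatment would additionally allow non-exponential holding times:

        import numpy as np
        from scipy.linalg import expm

        lam, c = 1e-4, 0.99        # per-hour failure rate, reconfiguration coverage
        # States: 0 = both units up, 1 = one unit up, 2 = failed (absorbing).
        Q = np.array([
            [-2 * lam,  2 * lam * c,  2 * lam * (1 - c)],
            [0.0,      -lam,          lam],
            [0.0,       0.0,          0.0],
        ])
        p0 = np.array([1.0, 0.0, 0.0])

        t = 10.0                   # mission time in hours
        p_t = p0 @ expm(Q * t)     # state distribution at time t
        print(f"unreliability at t={t:g} h: {p_t[2]:.3e}")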

  14. Development of a Survey to Assess Local Health Department Organizational Processes and Infrastructure for Supporting Obesity Prevention.

    PubMed

    Xiao, Ting; Stamatakis, Katherine A; McVay, Allese B

    Local health departments (LHDs) have an important function in controlling the growing epidemic of obesity in the United States. Data are needed to gain insight into the existence of routine functions and structures of LHDs that support and sustain obesity prevention efforts. The purpose of this study was to develop and examine the reliability of measures to assess foundational LHD organizational processes and functions specific to obesity prevention. Survey measures were developed using a stratified, random sample of US LHDs to assess supportive organizational processes and infrastructure for obesity prevention representing different domains. Data were analyzed using weighted κ and intraclass correlation coefficient for assessing test-retest reliability. Most items and summary indices in the majority of survey domains had moderate/substantial or almost perfect reliability. The overall findings support this survey instrument to be a reliable measurement tool for a large number of processes and functions that comprise obesity prevention-related capacity in LHDs.

  15. Assessing and Adapting Scientific Results for Space Weather Research to Operations (R2O)

    NASA Astrophysics Data System (ADS)

    Thompson, B. J.; Friedl, L.; Halford, A. J.; Mays, M. L.; Pulkkinen, A. A.; Singer, H. J.; Stehr, J. W.

    2017-12-01

    Why doesn't a solid scientific paper necessarily result in a tangible improvement in space weather capability? A well-known challenge in space weather forecasting is investing effort to turn the results of basic scientific research into operational knowledge. This process is commonly known as "Research to Operations," abbreviated R2O. There are several aspects of this process: 1) How relevant is the scientific result to a particular space weather process? 2) If fully utilized, how much will that result improve the reliability of the forecast for the associated process? 3) How much effort will this transition require? Is it already in a relatively usable form, or will it require a great deal of adaptation? 4) How much burden will be placed on forecasters? Is it "plug-and-play" or will it require effort to operate? 5) How can robust space weather forecasting identify challenges for new research? This presentation will cover several approaches that have potential utility in assessing scientific results for use in space weather research. The demonstration of utility is the first step, relating to the establishment of metrics to ensure that there will be a clear benefit to the end user. The presentation will then move to means of determining cost vs. benefit, (where cost involves the full effort required to transition the science to forecasting, and benefit concerns the improvement of forecast reliability), and conclude with a discussion of the role of end users and forecasters in driving further innovation via "O2R."

  16. Emergent Aerospace Designs Using Negotiating Autonomous Agents

    NASA Technical Reports Server (NTRS)

    Deshmukh, Abhijit; Middelkoop, Timothy; Krothapalli, Anjaneyulu; Smith, Charles

    2000-01-01

    This paper presents a distributed design methodology where designs emerge as a result of the negotiations between different stakeholders in the process, such as cost, performance, reliability, etc. The proposed methodology uses autonomous agents to represent design decision makers. Each agent influences specific design parameters in order to maximize its utility. Since the design parameters depend on the aggregate demand of all the agents in the system, design agents need to negotiate with others in the market economy in order to reach an acceptable utility value. This paper addresses several interesting research issues related to distributed design architectures. First, we present a flexible framework which facilitates decomposition of the design problem. Second, we present an overview of a market mechanism for generating acceptable design configurations. Finally, we integrate learning mechanisms into the design process to reduce the computational overhead.

  17. Automated reliability assessment for spectroscopic redshift measurements

    NASA Astrophysics Data System (ADS)

    Jamal, S.; Le Brun, V.; Le Fèvre, O.; Vibert, D.; Schmitt, A.; Surace, C.; Copin, Y.; Garilli, B.; Moresco, M.; Pozzetti, L.

    2018-03-01

    Context. Future large-scale surveys, such as the ESA Euclid mission, will produce a large set of galaxy redshifts (≥106) that will require fully automated data-processing pipelines to analyze the data, extract crucial information and ensure that all requirements are met. A fundamental element in these pipelines is to associate to each galaxy redshift measurement a quality, or reliability, estimate. Aim. In this work, we introduce a new approach to automate the spectroscopic redshift reliability assessment based on machine learning (ML) and characteristics of the redshift probability density function. Methods: We propose to rephrase the spectroscopic redshift estimation into a Bayesian framework, in order to incorporate all sources of information and uncertainties related to the redshift estimation process and produce a redshift posterior probability density function (PDF). To automate the assessment of a reliability flag, we exploit key features in the redshift posterior PDF and machine learning algorithms. Results: As a working example, public data from the VIMOS VLT Deep Survey is exploited to present and test this new methodology. We first tried to reproduce the existing reliability flags using supervised classification in order to describe different types of redshift PDFs, but due to the subjective definition of these flags (classification accuracy 58%), we soon opted for a new homogeneous partitioning of the data into distinct clusters via unsupervised classification. After assessing the accuracy of the new clusters via resubstitution and test predictions (classification accuracy 98%), we projected unlabeled data from preliminary mock simulations for the Euclid space mission into this mapping to predict their redshift reliability labels. Conclusions: Through the development of a methodology in which a system can build its own experience to assess the quality of a parameter, we are able to set a preliminary basis of an automated reliability assessment for spectroscopic redshift measurements. This newly-defined method is very promising for next-generation large spectroscopic surveys from the ground and in space, such as Euclid and WFIRST. A table of the reclassified VVDS redshifts and reliability is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/611/A53
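
    The feature-then-cluster step described here can be sketched compactly: summarize each redshift posterior PDF by a few descriptors and partition them with an unsupervised algorithm. Everything below (the synthetic PDFs, the feature choice, and the number of clusters) is an assumption for illustration, not the authors' pipeline:

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        z = np.linspace(0.0, 2.0, 400)

        def make_pdf(width, second_peak):
            # Synthetic redshift posterior: one or two Gaussian modes, normalized.
            p = np.exp(-0.5 * ((z - 0.8) / width) ** 2)
            if second_peak:
                p += 0.7 * np.exp(-0.5 * ((z - 1.4) / width) ** 2)
            return p / np.trapz(p, z)

        pdfs = [make_pdf(rng.uniform(0.01, 0.2), rng.random() < 0.4)
                for _ in range(200)]

        def features(p):
            entropy = -np.trapz(p * np.log(p + 1e-12), z)    # spread of the PDF
            peak = p.max()                                    # strength of best mode
            sigma = np.sqrt(np.trapz((z - np.trapz(z * p, z)) ** 2 * p, z))
            return [entropy, peak, sigma]

        X = np.array([features(p) for p in pdfs])
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
        print(np.bincount(labels))   # cluster sizes ~ candidate reliability classes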

  18. A programmable point-of-care device for external CSF drainage and monitoring.

    PubMed

    Simkins, Jeffrey R; Subbian, Vignesh; Beyette, Fred R

    2014-01-01

    This paper presents a prototype of a programmable cerebrospinal fluid (CSF) external drainage system that can accurately measure the dispensed fluid volume. It is based on using a miniature spectrophotometer to collect color data to inform drain rate and pressure monitoring. The prototype was machined with 1 μm dimensional accuracy. The current device can reliably monitor the total accumulated fluid volume, the drain rate, the programmed pressure, and the pressure read from the sensor. Device requirements, fabrication processes, and preliminary results with an experimental set-up are also presented.

  19. The role of the Long Duration Exposure Facility in the development of space systems

    NASA Technical Reports Server (NTRS)

    Little, Sally A.

    1992-01-01

    The Long Duration Exposure Facility (LDEF) presents the international, aerospace community with an unprecedented opportunity to examine the synergistic, long term, space environmental effects on systems and materials. The analysis of the data within appropriate environmental contexts is essential to the overall process of advancing the understanding of space environmental effects needed for the continuing development of strategies to improve the reliability and durability of space systems and to effectively deal with the future challenges that new space initiatives will likely present.

  20. Rigorous Training of Dogs Leads to High Accuracy in Human Scent Matching-To-Sample Performance

    PubMed Central

    Marchal, Sophie; Bregeras, Olivier; Puaux, Didier; Gervais, Rémi; Ferry, Barbara

    2016-01-01

    Human scent identification is based on a matching-to-sample task in which trained dogs are required to compare a scent sample collected from an object found at a crime scene to that of a suspect. Based on dogs’ greater olfactory ability to detect and process odours, this method has been used in forensic investigations to identify the odour of a suspect at a crime scene. The excellent reliability and reproducibility of the method largely depend on rigor in dog training. The present study describes the various steps of training that lead to high sensitivity scores, with dogs matching samples with 90% efficiency when the complexity of the scents presented in the sample is similar to that presented in the lineups, and specificity reaching a ceiling, with no false alarms in human scent matching-to-sample tasks. This high level of accuracy ensures reliable results in judicial human scent identification tests. Also, our data should convince law enforcement authorities to use these results as official forensic evidence when dogs are trained appropriately. PMID:26863620

  1. Information Quality Challenges of Patient-Generated Data in Clinical Practice

    PubMed Central

    West, Peter; Van Kleek, Max; Giordano, Richard; Weal, Mark; Shadbolt, Nigel

    2017-01-01

    A characteristic trend of digital health has been the dramatic increase in patient-generated data being presented to clinicians, which follows from the increased ubiquity of self-tracking practices by individuals, driven, in turn, by the proliferation of self-tracking tools and technologies. Such tools not only make self-tracking easier but also potentially more reliable by automating data collection, curation, and storage. While self-tracking practices themselves have been studied extensively in human–computer interaction literature, little work has yet looked at whether these patient-generated data might be able to support clinical processes, such as providing evidence for diagnoses, treatment monitoring, or postprocedure recovery, and how we can define information quality with respect to self-tracked data. In this article, we present the results of a literature review of empirical studies of self-tracking tools, in which we identify how clinicians perceive quality of information from such tools. In the studies, clinicians perceive several characteristics of information quality relating to accuracy and reliability, completeness, context, patient motivation, and representation. We discuss the issues these present in admitting self-tracked data as evidence for clinical decisions. PMID:29209601

  2. The reliability analysis of a separated, dual fail operational redundant strapdown IMU. [inertial measurement unit

    NASA Technical Reports Server (NTRS)

    Motyka, P.

    1983-01-01

    A methodology for quantitatively analyzing the reliability of redundant avionics systems, in general, and the dual, separated Redundant Strapdown Inertial Measurement Unit (RSDIMU), in particular, is presented. The RSDIMU is described and a candidate failure detection and isolation system presented. A Markov reliability model is employed. The operational states of the system are defined and the single-step state transition diagrams discussed. Graphical results, showing the impact of major system parameters on the reliability of the RSDIMU system, are presented and discussed.

  3. Physics-based process modeling, reliability prediction, and design guidelines for flip-chip devices

    NASA Astrophysics Data System (ADS)

    Michaelides, Stylianos

    Flip Chip on Board (FCOB) and Chip-Scale Packages (CSPs) are relatively new technologies that are being increasingly used in the electronic packaging industry. Compared to the more widely used face-up wirebonding and TAB technologies, flip-chips and most CSPs provide the shortest possible leads, lower inductance, higher frequency, better noise control, higher density, greater input/output (I/O), smaller device footprint and lower profile. However, due to the short history and due to the introduction of several new electronic materials, designs, and processing conditions, very limited work has been done to understand the role of material, geometry, and processing parameters on the reliability of flip-chip devices. Also, with the ever-increasing complexity of semiconductor packages and with the continued reduction in time to market, it is too costly to wait until the later stages of design and testing to discover that the reliability is not satisfactory. The objective of the research is to develop integrated process-reliability models that will take into consideration the mechanics of assembly processes to be able to determine the reliability of face-down devices under thermal cycling and long-term temperature dwelling. The models incorporate the time and temperature-dependent constitutive behavior of various materials in the assembly to be able to predict failure modes such as die cracking and solder cracking. In addition, the models account for process-induced defects and macro-micro features of the assembly. Creep-fatigue and continuum-damage mechanics models for the solder interconnects and fracture-mechanics models for the die have been used to determine the reliability of the devices. The results predicted by the models have been successfully validated against experimental data. The validated models have been used to develop qualification and test procedures for implantable medical devices. In addition, the research has helped develop innovative face-down devices without the underfill, based on the thorough understanding of the failure modes. Also, practical design guidelines for material, geometry and process parameters for reliable flip-chip devices have been developed.
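
    Creep-fatigue life models of the kind referenced above are commonly written in a Coffin-Manson form; the expression below is the standard low-cycle-fatigue relation, shown for orientation rather than as the specific damage model developed in this work:

        $$ \frac{\Delta\varepsilon_p}{2} = \varepsilon_f'\,(2N_f)^{c} \quad\Longrightarrow\quad N_f = \frac{1}{2}\left(\frac{\Delta\varepsilon_p}{2\,\varepsilon_f'}\right)^{1/c}, $$

    where $\Delta\varepsilon_p$ is the cyclic plastic strain range in the solder joint, $\varepsilon_f'$ the fatigue ductility coefficient, and $c<0$ the fatigue ductility exponent.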

  4. Content validity and reliability of test of gross motor development in Chilean children

    PubMed Central

    Cano-Cappellacci, Marcelo; Leyton, Fernanda Aleitte; Carreño, Joshua Durán

    2016-01-01

    ABSTRACT OBJECTIVE To validate a Spanish version of the Test of Gross Motor Development (TGMD-2) for the Chilean population. METHODS Descriptive, transversal, non-experimental validity and reliability study. Four translators, three experts, and 92 Chilean children aged five to 10 years, students at a primary school in Santiago, Chile, participated. The Committee of Experts carried out translation, back-translation and revision processes to determine the translinguistic equivalence and content validity of the test, using the content validity index, in 2013. In addition, a pilot implementation was carried out to determine the reliability of the test in Spanish, using the intraclass correlation coefficient and the Bland-Altman method. We evaluated whether the results presented significant differences when replacing the bat with a racket, using a t-test. RESULTS We obtained a content validity index higher than 0.80 for language clarity and relevance of the TGMD-2 for children. There were significant differences in the object control subtest when comparing the results with bat and racket. The intraclass correlation coefficients for inter-rater, intra-rater and test-retest reliability were greater than 0.80 in all cases. CONCLUSIONS The TGMD-2 has appropriate content validity to be applied in the Chilean population. The reliability of this test is within the appropriate parameters and its use could be recommended in this population after the establishment of normative data, setting a further precedent for the validation in other Latin American countries. PMID:26815160

  5. Reliable estimation of orbit errors in spaceborne SAR interferometry. The network approach

    NASA Astrophysics Data System (ADS)

    Bähr, Hermann; Hanssen, Ramon F.

    2012-12-01

    An approach to improve orbital state vectors by orbit error estimates derived from residual phase patterns in synthetic aperture radar interferograms is presented. For individual interferograms, an error representation by two parameters is motivated: the baseline error in cross-range and the rate of change of the baseline error in range. For their estimation, two alternatives are proposed: a least squares approach that requires prior unwrapping and a less reliable gridsearch method handling the wrapped phase. In both cases, reliability is enhanced by mutual control of error estimates in an overdetermined network of linearly dependent interferometric combinations of images. Thus, systematic biases, e.g., due to unwrapping errors, can be detected and iteratively eliminated. Regularising the solution by a minimum-norm condition results in quasi-absolute orbit errors that refer to particular images. For the 31 images of a sample ENVISAT dataset, orbit corrections with a mutual consistency on the millimetre level have been inferred from 163 interferograms. The method itself qualifies by reliability and rigorous geometric modelling of the orbital error signal but does not consider interfering large scale deformation effects. However, a separation may be feasible in a combined processing with persistent scatterer approaches or by temporal filtering of the estimates.
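
    The network adjustment at the heart of this method reduces to a rank-deficient linear system: per-image error parameters enter each interferogram as a difference, and a minimum-norm least-squares solution pins down quasi-absolute corrections. A toy sketch (one error parameter per image instead of two, synthetic values):

        import numpy as np

        rng = np.random.default_rng(2)
        n_images = 6
        true_err = rng.normal(0.0, 1.0, n_images)
        true_err -= true_err.mean()          # only differences are observable

        # Interferometric pairs (i, j): each observes err[i] - err[j] plus noise.
        pairs = [(i, j) for i in range(n_images) for j in range(i + 1, n_images)]
        A = np.zeros((len(pairs), n_images))
        for row, (i, j) in enumerate(pairs):
            A[row, i], A[row, j] = 1.0, -1.0
        obs = A @ true_err + rng.normal(0.0, 0.05, len(pairs))

        # lstsq returns the minimum-norm solution of this rank-deficient system,
        # i.e. quasi-absolute per-image errors regularised to zero mean.
        est, *_ = np.linalg.lstsq(A, obs, rcond=None)
        print(np.round(est - true_err, 3))   # residuals at the noise level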

  6. The establishment of an achievement test for the determination of primary teachers’ knowledge level of earthquakes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aydin, Süleyman, E-mail: yupul@hotmail.com; Haşiloğlu, M. Akif, E-mail: mehmet.hasiloglu@hotmail.com; Kunduraci, Ayşe, E-mail: ayse-kndrc@hotmail.com

    In this study, the aim was to develop an academic achievement test to establish the students’ knowledge about earthquakes and the ways of protection from earthquakes. The method followed the steps that Webb (1994) created for developing an academic achievement test for a unit. In the development process, a multiple-choice test of 25 questions was prepared to measure the pre-service teachers’ knowledge levels about earthquakes and the ways of protection from earthquakes. The multiple-choice test was presented for review to six academics (one from the field of geography and five science educators) and two expert science teachers. The prepared test was applied to 93 pre-service teachers studying in the elementary education department in the 2014-2015 academic year. As a result of the validity and reliability study, the test was reduced to 20 items. From these applications, the Pearson product-moment split-half reliability coefficient was found to be 0.94; when adjusted with the Spearman-Brown formula, the reliability coefficient was 0.97.
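
    The step from 0.94 to 0.97 is the standard Spearman-Brown split-half correction:

        $$ r_{\text{full}} = \frac{2\,r_{\text{half}}}{1 + r_{\text{half}}} = \frac{2 \times 0.94}{1 + 0.94} \approx 0.97. $$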

  7. Producing Cochrane systematic reviews-a qualitative study of current approaches and opportunities for innovation and improvement.

    PubMed

    Turner, Tari; Green, Sally; Tovey, David; McDonald, Steve; Soares-Weiser, Karla; Pestridge, Charlotte; Elliott, Julian

    2017-08-01

    Producing high-quality, relevant systematic reviews and keeping them up to date is challenging. Cochrane is a leading provider of systematic reviews in health. For Cochrane to continue to contribute to improvements in health, Cochrane Reviews must be rigorous, reliable and up to date. We aimed to explore existing models of Cochrane Review production and emerging opportunities to improve the efficiency and sustainability of these processes. To inform discussions about how to best achieve this, we conducted 26 interviews and an online survey with 106 respondents. Respondents highlighted the importance and challenge of creating reliable, timely systematic reviews. They described the challenges and opportunities presented by current production models, and they shared what they are doing to improve review production. They particularly highlighted significant challenges with increasing complexity of review methods; difficulty keeping authors on board and on track; and the length of time required to complete the process. Strong themes emerged about the roles of authors and Review Groups, the central actors in the review production process. The results suggest that improvements to Cochrane's systematic review production models could come from improving clarity of roles and expectations, ensuring continuity and consistency of input, enabling active management of the review process, centralising some review production steps, breaking reviews into smaller "chunks", and improving approaches to building capacity of and sharing information between authors and Review Groups. Respondents noted the important role new technologies have to play in enabling these improvements. The findings of this study will inform the development of new Cochrane Review production models and may provide valuable data for other systematic review producers as they consider how best to produce rigorous, reliable, up-to-date reviews.

  8. Attention Is Required for Knowledge-Based Sequential Grouping: Insights from the Integration of Syllables into Words.

    PubMed

    Ding, Nai; Pan, Xunyi; Luo, Cheng; Su, Naifei; Zhang, Wen; Zhang, Jianfeng

    2018-01-31

    How the brain groups sequential sensory events into chunks is a fundamental question in cognitive neuroscience. This study investigates whether top-down attention or specific tasks are required for the brain to apply lexical knowledge to group syllables into words. Neural responses tracking the syllabic and word rhythms of a rhythmic speech sequence were concurrently monitored using electroencephalography (EEG). The participants performed different tasks, attending to either the rhythmic speech sequence or a distractor, which was another speech stream or a nonlinguistic auditory/visual stimulus. Attention to speech, but not a lexical-meaning-related task, was required for reliable neural tracking of words, even when the distractor was a nonlinguistic stimulus presented cross-modally. Neural tracking of syllables, however, was reliably observed in all tested conditions. These results strongly suggest that neural encoding of individual auditory events (i.e., syllables) is automatic, while knowledge-based construction of temporal chunks (i.e., words) crucially relies on top-down attention. SIGNIFICANCE STATEMENT Why we cannot understand speech when not paying attention is an old question in psychology and cognitive neuroscience. Speech processing is a complex process that involves multiple stages, e.g., hearing and analyzing the speech sound, recognizing words, and combining words into phrases and sentences. The current study investigates which speech-processing stage is blocked when we do not listen carefully. We show that the brain can reliably encode syllables, basic units of speech sounds, even when we do not pay attention. Nevertheless, when distracted, the brain cannot group syllables into multisyllabic words, which are basic units for speech meaning. Therefore, the process of converting speech sound into meaning crucially relies on attention. Copyright © 2018 the authors 0270-6474/18/381178-11$15.00/0.

  9. Investigation of low glass transition temperature on COTS PEM's reliability for space applications

    NASA Technical Reports Server (NTRS)

    Sandor, M.; Agarwal, S.; Peters, D.; Cooper, M. S.

    2003-01-01

    Plastic Encapsulated Microelectronics (PEM) reliability is affected by many factors. Glass transition temperature (Tg) is one such factor. In this presentation issues relating to PEM reliability and the effect of low glass transition temperature epoxy mold compounds are presented.

  10. Sintered tantalum carbide coatings on graphite substrates: Highly reliable protective coatings for bulk and epitaxial growth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakamura, Daisuke; Suzumura, Akitoshi; Shigetoh, Keisuke

    2015-02-23

    Highly reliable low-cost protective coatings have been sought after for use in crucibles and susceptors for bulk and epitaxial film growth processes involving wide bandgap materials. Here, we propose a production technique for ultra-thick (50–200 μm) tantalum carbide (TaC) protective coatings on graphite substrates, which consists of TaC slurry application and subsequent sintering processes, i.e., a wet ceramic process. Structural analysis of the sintered TaC layers indicated that they have a dense granular structure containing coarse grains with sizes of 10–50 μm. Furthermore, no cracks or pinholes penetrated through the layers, i.e., the TaC layers are highly reliable protective coatings. The analysis also indicated that no plastic deformation occurred during the production process, and that the non-textured crystalline orientation of the TaC layers is the origin of their high reliability and durability. The TaC-coated graphite crucibles were tested in an aluminum nitride (AlN) sublimation growth process, which involves extremely corrosive conditions, and demonstrated their practical reliability and durability in the AlN growth process. The application of the TaC-coated graphite materials to crucibles and susceptors for use in bulk AlN single crystal growth, bulk silicon carbide (SiC) single crystal growth, chemical vapor deposition of epitaxial SiC films, and metal-organic vapor phase epitaxy of group-III nitrides will lead to further improvements in crystal quality and reduced processing costs.

  11. Qualification and Reliability for MEMS and IC Packages

    NASA Technical Reports Server (NTRS)

    Ghaffarian, Reza

    2004-01-01

    Advanced IC electronic packages are moving toward miniaturization via two key approaches, front-end and back-end processes, each with its own challenges. Successful use of more of the back-end process at the front end, e.g., microelectromechanical systems (MEMS) wafer-level packaging (WLP), enables reduced size and cost. Use of direct flip-chip die is the most efficient approach if and when the issues of known good die and board/assembly are resolved. Wafer-level packaging solves the issue of known good die by enabling package test, but it has its own limitations, e.g., I/O count, additional cost, and reliability. From the back-end approach, system-in-a-package (SIAP/SIP) development is a response to an increasing demand for package and die integration of different functions into one unit to reduce size and cost and improve functionality. MEMS add another challenging dimension to electronic packaging since they include moving mechanical elements. Conventional qualification and reliability approaches need to be modified and expanded in most cases in order to detect new, unknown failures. This paper will review four standards, already released or being developed, that specifically address qualification and reliability of assembled packages. Exposure to thermal cycles, monotonic bend testing, mechanical shock, and drop are covered in these specifications. Finally, mechanical and thermal cycle qualification data generated for a MEMS accelerometer will be presented. The MEMS device was an element of an inertial measurement unit (IMU) qualified for the NASA Mars Exploration Rovers (MERs), Spirit and Opportunity, currently roving the Martian surface.

  12. An adaptive semantic matching paradigm for reliable and valid language mapping in individuals with aphasia.

    PubMed

    Wilson, Stephen M; Yen, Melodie; Eriksson, Dana K

    2018-04-17

    Research on neuroplasticity in recovery from aphasia depends on the ability to identify language areas of the brain in individuals with aphasia. However, tasks commonly used to engage language processing in people with aphasia, such as narrative comprehension and picture naming, are limited in terms of reliability (test-retest reproducibility) and validity (identification of language regions, and not other regions). On the other hand, paradigms such as semantic decision that are effective in identifying language regions in people without aphasia can be prohibitively challenging for people with aphasia. This paper describes a new semantic matching paradigm that uses an adaptive staircase procedure to present individuals with stimuli that are challenging yet within their competence, so that language processing can be fully engaged in people with and without language impairments. The feasibility, reliability and validity of the adaptive semantic matching paradigm were investigated in sixteen individuals with chronic post-stroke aphasia and fourteen neurologically normal participants, in comparison to narrative comprehension and picture naming paradigms. All participants succeeded in learning and performing the semantic paradigm. Test-retest reproducibility of the semantic paradigm in people with aphasia was good (Dice coefficient = 0.66), and was superior to the other two paradigms. The semantic paradigm revealed known features of typical language organization (lateralization; frontal and temporal regions) more consistently in neurologically normal individuals than the other two paradigms, constituting evidence for validity. In sum, the adaptive semantic matching paradigm is a feasible, reliable and valid method for mapping language regions in people with aphasia. © 2018 Wiley Periodicals, Inc.
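
    An adaptive staircase of the general kind described here adjusts trial difficulty from the participant's running performance. The sketch below implements a generic 2-down/1-up rule; the levels, step rule, and simulated observer are illustrative assumptions, not the published paradigm:

        import random

        random.seed(0)

        def simulated_response(level):
            # Hypothetical observer: higher difficulty level -> lower accuracy.
            p_correct = max(0.2, 1.0 - 0.08 * level)
            return random.random() < p_correct

        level, correct_streak = 1, 0
        history = []
        for trial in range(60):
            ok = simulated_response(level)
            history.append((level, ok))
            if ok:
                correct_streak += 1
                if correct_streak == 2:      # 2-down: harder after 2 correct
                    level, correct_streak = level + 1, 0
            else:                             # 1-up: easier after any error
                level, correct_streak = max(1, level - 1), 0

        # A 2-down/1-up staircase converges near the ~71%-correct level,
        # keeping the task challenging yet within the participant's competence.
        tail = [lv for lv, _ in history[-20:]]
        print(f"estimated competence level ~ {sum(tail) / len(tail):.1f}")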

  13. The R and M 2000 Process and Reliability and Maintainability Management: Attitudes of Senior Level Managers in Aeronautical Systems Division

    DTIC Science & Technology

    1988-09-01

    AFIT/GLM/LSM/88S-59 THE R&M 2000 PROCESS AND RELIABILITY AND MAINTAINABILITY...respondents provided verbal responses to this question. Although one-half of these responses spoke favorably of R&M 2000, there were...Attitudes, Reliability, Maintainability, R&M, R&M 2000, Aeronautical Systems Division, ASD...

  14. The Reliability Mandate: Optimizing the Use of Highly Reliable Parts, Materials, and Processes (PM&P) to Maximize System Component Reliability in the Life Cycle

    DTIC Science & Technology

    2002-06-01

    projects are converted into bricks and mortar, as Figure 5 illustrates. Making major changes in LCC after projects are turned over to production is...matter experts (SMEs) in the parts, materials, and processes functional area. Data gathering and analysis were conducted through structured interviews...The analysis synthesized feedback and searched for collective issues from the various SMEs on managing PM&P Program requirements, the

  15. 7 CFR 760.405 - Application process.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... death documentation is not available, the participant may provide reliable records, in conjunction with verifiable beginning and ending inventory records, as proof of death. Reliable records may include..., pictures, and other similar reliable documents as determined by FSA. (f) Certification of livestock deaths...

  16. Reliable probabilities through statistical post-processing of ensemble predictions

    NASA Astrophysics Data System (ADS)

    Van Schaeybroeck, Bert; Vannitsem, Stéphane

    2013-04-01

    We develop post-processing or calibration approaches based on linear regression that make ensemble forecasts more reliable. First, we enforce climatological reliability in the sense that the total variability of the prediction is equal to the variability of the observations. Second, we impose ensemble reliability such that the spread of the observations around the ensemble mean coincides with that of the ensemble members. In general the attractors of the model and of reality are inhomogeneous, so the ensemble spread displays a variability not taken into account in standard post-processing methods. We overcome this by weighting the ensemble by a variable error. The approaches are tested in the context of the Lorenz 96 model (Lorenz 1996). The forecasts become more reliable at short lead times, as reflected by a flatter rank histogram. Our best method turns out to be superior to well-established methods like EVMOS (Van Schaeybroeck and Vannitsem, 2011) and Nonhomogeneous Gaussian Regression (Gneiting et al., 2005). References [1] Gneiting, T., Raftery, A. E., Westveld, A., Goldman, T., 2005: Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation. Mon. Weather Rev. 133, 1098-1118. [2] Lorenz, E. N., 1996: Predictability - a problem partly solved. Proceedings, Seminar on Predictability, ECMWF, 1, 1-18. [3] Van Schaeybroeck, B., and Vannitsem, S., 2011: Post-processing through linear regression. Nonlin. Processes Geophys., 18, 147.
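
    A minimal sketch of the climatological-reliability idea: regress observations on the ensemble mean, then rescale so that the calibrated forecast variance matches the observed variance. The data are synthetic, and this is the generic variance-matching device rather than the exact estimator proposed in the abstract:

        import numpy as np

        rng = np.random.default_rng(3)
        n, members = 2000, 20
        truth = rng.normal(0.0, 1.0, n)
        # Biased, overdispersive synthetic ensemble around a weak signal.
        ens = 0.6 * truth[:, None] + 0.3 + rng.normal(0.0, 1.2, (n, members))
        mean = ens.mean(axis=1)

        # Step 1 -- ordinary least squares: obs ~ a + b * ensemble mean.
        b = ((mean - mean.mean()) * (truth - truth.mean())).mean() / mean.var()
        a = truth.mean() - b * mean.mean()
        ols = a + b * mean

        # Step 2 -- climatological reliability: rescale the regression forecast
        # so its total variability matches that of the observations (plain OLS
        # damps the forecast variance by the squared correlation).
        cal = truth.mean() + (ols - ols.mean()) * truth.std() / ols.std()

        for name, f in [("raw ens mean", mean), ("OLS", ols), ("var-matched", cal)]:
            rmse = np.sqrt(((f - truth) ** 2).mean())
            print(f"{name:12s} std {f.std():.2f} (obs {truth.std():.2f})  rmse {rmse:.2f}")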

  17. A novel, two-step top seeded infiltration and growth process for the fabrication of single grain, bulk (RE)BCO superconductors

    NASA Astrophysics Data System (ADS)

    Namburi, Devendra K.; Shi, Yunhua; Palmer, Kysen G.; Dennis, Anthony R.; Durrell, John H.; Cardwell, David A.

    2016-09-01

    A fundamental requirement of the fabrication of high-performing (RE)-Ba-Cu-O bulk superconductors is achieving a single grain microstructure that exhibits good flux pinning properties. The top seeded melt growth (TSMG) process is a well-established technique for the fabrication of single grain (RE)BCO bulk samples and is now applied routinely by a number of research groups around the world. The introduction of a buffer layer to the TSMG process has been demonstrated recently to improve significantly the general reliability of the process. However, a number of growth-related defects, such as porosity and the formation of micro-cracks, remain inherent to the TSMG process, and are proving difficult to eliminate by varying the melt process parameters. The seeded infiltration and growth (SIG) process has been shown to yield single grain samples that exhibit significantly improved microstructures compared to the TSMG technique. Unfortunately, however, SIG leads to other processing challenges, such as the reliability of fabrication, optimisation of RE2BaCuO5 (RE-211) inclusions (size and content) in the sample microstructure, practical oxygenation of as processed samples and, hence, optimisation of the superconducting properties of the bulk single grain. In the present paper, we report the development of a near-net shaping technique based on a novel two-step, buffer-aided top seeded infiltration and growth (BA-TSIG) process, which has been demonstrated to improve greatly the reliability of the single grain growth process and has been used to fabricate successfully bulk, single grain (RE)BCO superconductors with improved microstructures and superconducting properties. A trapped field of ~0.84 T and a zero field current density of 60 kA cm^-2 have been measured at 77 K in a bulk, YBCO single grain sample of diameter 25 mm processed by this two-step BA-TSIG technique. To the best of our knowledge, this value of trapped field is the highest value ever reported for a sample fabricated by an infiltration and growth process. In this study we report the successful fabrication of 14 YBCO samples, with diameters of up to 32 mm, by this novel technique with a success rate of greater than 92%.

  18. Multisensor Arrays for Greater Reliability and Accuracy

    NASA Technical Reports Server (NTRS)

    Immer, Christopher; Eckhoff, Anthony; Lane, John; Perotti, Jose; Randazzo, John; Blalock, Norman; Ree, Jeff

    2004-01-01

    Arrays of multiple, nominally identical sensors with sensor-output-processing electronic hardware and software are being developed in order to obtain accuracy, reliability, and lifetime greater than those of single sensors. The conceptual basis of this development lies in the statistical behavior of multiple sensors and a multisensor-array (MSA) algorithm that exploits that behavior. In addition, advances in microelectromechanical systems (MEMS) and integrated circuits are exploited. A typical sensor unit according to this concept includes multiple MEMS sensors and sensor-readout circuitry fabricated together on a single chip and packaged compactly with a microprocessor that performs several functions, including execution of the MSA algorithm. In the MSA algorithm, the readings from all the sensors in an array at a given instant of time are compared and the reliability of each sensor is quantified. This comparison of readings and quantification of reliabilities involves the calculation of the ratio between every sensor reading and every other sensor reading, plus calculation of the sum of all such ratios. Then one output reading for the given instant of time is computed as a weighted average of the readings of all the sensors. In this computation, the weight for each sensor is the aforementioned value used to quantify its reliability. In an optional variant of the MSA algorithm that can be implemented easily, a running sum of the reliability value for each sensor at previous time steps as well as at the present time step is used as the weight of the sensor in calculating the weighted average at the present time step. In this variant, the weight of a sensor that continually fails gradually decreases, so that eventually, its influence over the output reading becomes minimal: In effect, the sensor system "learns" which sensors to trust and which not to trust. The MSA algorithm incorporates a criterion for deciding whether there remain enough sensor readings that approximate each other sufficiently closely to constitute a majority for the purpose of quantifying reliability. This criterion is, simply, that if there do not exist at least three sensors having weights greater than a prescribed minimum acceptable value, then the array as a whole is deemed to have failed.
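
    The description above pins down the MSA algorithm only loosely (mutual ratios feed a per-sensor reliability weight, and weights accumulate over time), so the sketch below is one plausible reading rather than the flight implementation: each sensor is weighted by how closely its reading agrees with the others, and running sums of those weights let persistently deviant sensors fade out.

        import numpy as np

        def msa_step(readings, running, eps=1e-9):
            """One fusion step of a multisensor-array (MSA) rule -- one plausible
            interpretation of the scheme described above, not flight code.
            readings: current values from nominally identical sensors.
            running:  accumulated per-sensor reliability weights (updated in place)."""
            r = np.asarray(readings, float)
            # Mutual agreement: sum of |log ratio| of each sensor vs. the others.
            dev = np.abs(np.log((r[:, None] + eps) / (r[None, :] + eps))).sum(axis=1)
            weight = np.exp(-dev ** 2)     # instantaneous reliability per sensor
            running += weight              # the array "learns" whom to trust
            # Failure criterion: require at least 3 sensors above a minimum
            # weight (the threshold choice here is an assumption).
            if (running > 0.1 * running.max()).sum() < 3:
                raise RuntimeError("fewer than 3 trustworthy sensors: array failed")
            return float((running * r).sum() / running.sum())

        rng = np.random.default_rng(0)
        running = np.zeros(4)
        for t in range(50):
            truth = 20.0 + 0.1 * t
            readings = truth + rng.normal(0.0, 0.05, 4)
            readings[3] = 35.0             # sensor 3 has failed stuck-high
            fused = msa_step(readings, running)

        print(f"fused {fused:.2f} vs truth {truth:.2f}; "
              f"relative trust {np.round(running / running.max(), 2)}")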

  19. Coupled Thermo-Mechanical and Photo-Chemical Degradation Mechanisms that determine the Reliability and Operational Lifetimes for CPV Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dauskardt, Reinhold H.

    This project sought to identify and characterize the coupled intrinsic photo-chemo-mechanical degradation mechanisms that determine the reliability and operational lifetimes for CPV technologies. Over a three-year period, we have completed a highly successful program which has developed quantitative metrologies and detailed physics-based degradation models, providing new insight into the fundamental reliability physics necessary for improving materials, creating accelerated testing protocols, and producing more accurate lifetime predictions. The tasks for the program were separated into two focus areas. Focus Area 1, led by Reinhold Dauskardt and Warren Cai with a primary collaboration with David Miller of NREL, studied the degradation mechanisms present in encapsulant materials. Focus Area 2, led by Reinhold Dauskardt and Ryan Brock with a primary collaboration with James Ermer and Peter Hebert of Spectrolab, studied stress development and degradation within internal CPV device interfaces. Each focus area was productive, leading to several publications, including findings on the degradation of silicone encapsulant under terrestrial UV, a model for photodegradation of silicone encapsulant adhesion, quantification and process tuning of antireflective layers on CPV, and discovery of a thermal cycling degradation mechanism present in metal gridline structures.

  20. Multiple electron processes of He and Ne by proton impact

    NASA Astrophysics Data System (ADS)

    Terekhin, Pavel Nikolaevich; Montenegro, Pablo; Quinto, Michele; Monti, Juan; Fojon, Omar; Rivarola, Roberto

    2016-05-01

    A detailed investigation of multiple electron processes (single and multiple ionization, single capture, transfer-ionization) of He and Ne is presented for proton impact at intermediate and high collision energies. Exclusive absolute cross sections for these processes have been obtained by calculation of transition probabilities in the independent electron and independent event models as a function of impact parameter in the framework of the continuum distorted wave-eikonal initial state theory. A binomial analysis is employed to calculate exclusive probabilities. The comparison with available theoretical and experimental results shows that exclusive probabilities are needed for a reliable description of the experimental data. The developed approach can be used for obtaining the input database for modeling multiple electron processes of charged particles passing through matter.
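
    In the independent-electron picture invoked here, exclusive probabilities follow the standard binomial composition of the single-electron probability (shown for orientation; the paper's full CDW-EIS machinery supplies $p(b)$ itself):

        $$ P_q(b) = \binom{N}{q}\, p(b)^{q}\, \bigl(1 - p(b)\bigr)^{N-q}, $$

    where $p(b)$ is the single-electron transition probability at impact parameter $b$, $N$ is the number of equivalent target electrons, and $P_q(b)$ is the exclusive probability that exactly $q$ electrons undergo the transition; cross sections follow by integrating $2\pi b\,P_q(b)$ over $b$.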

  1. Currently available methodologies for the processing of intravascular ultrasound and optical coherence tomography images.

    PubMed

    Athanasiou, Lambros; Sakellarios, Antonis I; Bourantas, Christos V; Tsirka, Georgia; Siogkas, Panagiotis; Exarchos, Themis P; Naka, Katerina K; Michalis, Lampros K; Fotiadis, Dimitrios I

    2014-07-01

    Optical coherence tomography and intravascular ultrasound are the most widely used methodologies in clinical practice as they provide high resolution cross-sectional images that allow comprehensive visualization of the lumen and plaque morphology. Several methods have been developed in recent years to process the output of these imaging modalities, which allow fast, reliable and reproducible detection of the luminal borders and characterization of plaque composition. These methods have proven useful in the study of the atherosclerotic process as they have facilitated analysis of a vast amount of data. This review presents currently available intravascular ultrasound and optical coherence tomography processing methodologies for segmenting and characterizing the plaque area, highlighting their advantages and disadvantages, and discusses the future trends in intravascular imaging.

  2. Barista: A Framework for Concurrent Speech Processing by USC-SAIL

    PubMed Central

    Can, Doğan; Gibson, James; Vaz, Colin; Georgiou, Panayiotis G.; Narayanan, Shrikanth S.

    2016-01-01

    We present Barista, an open-source framework for concurrent speech processing based on the Kaldi speech recognition toolkit and the libcppa actor library. With Barista, we aim to provide an easy-to-use, extensible framework for constructing highly customizable concurrent (and/or distributed) networks for a variety of speech processing tasks. Each Barista network specifies a flow of data between simple actors, concurrent entities communicating by message passing, modeled after Kaldi tools. Leveraging the fast and reliable concurrency and distribution mechanisms provided by libcppa, Barista allows demanding speech processing tasks, such as real-time speech recognizers and complex training workflows, to be scheduled and executed on parallel (and/or distributed) hardware. Barista is released under the Apache License v2.0. PMID:27610047
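
    Barista itself is C++ (Kaldi plus libcppa), but the data-flow pattern it describes, simple actors connected by message passing, can be sketched in a few lines of Python using threads and queues. This is a toy illustration of the network topology idea, not Barista's API; the stage names and the poison-pill shutdown convention are assumptions.

        import queue
        import threading

        def actor(inbox, outbox, work):
            """A minimal actor: receive a message, process it, forward the result."""
            while True:
                msg = inbox.get()
                if msg is None:                  # poison pill shuts the pipeline down
                    if outbox is not None:
                        outbox.put(None)
                    return
                result = work(msg)
                if outbox is not None:
                    outbox.put(result)

        # A two-stage network: a stand-in "feature extractor" feeding a "decoder".
        q1, q2 = queue.Queue(), queue.Queue()
        stages = [
            threading.Thread(target=actor, args=(q1, q2, str.upper)),
            threading.Thread(target=actor, args=(q2, None, print)),
        ]
        for t in stages:
            t.start()
        for chunk in ["audio frame 1", "audio frame 2"]:
            q1.put(chunk)
        q1.put(None)
        for t in stages:
            t.join()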

  3. Enzymes in Fish and Seafood Processing

    PubMed Central

    Fernandes, Pedro

    2016-01-01

    Enzymes have been used for the production and processing of fish and seafood for several centuries in an empirical manner. In recent decades, a growing trend toward a rational and controlled application of enzymes for such goals has emerged. Underlying this pattern are, among others, the increasingly wider array of enzyme activities and enzyme sources, improved enzyme formulations, and enhanced requirements for cost-effective and environmentally friendly processes. The better use of enzyme action in fish- and seafood-related applications has had a significant impact on the fish industry. Thus, new products have surfaced, product quality has improved, more sustainable processes have been developed, and innovative and reliable analytical techniques have been implemented. Recent developments in these fields are presented and discussed, and prospective developments are suggested. PMID:27458583

  4. An improved probit method for assessment of domino effect to chemical process equipment caused by overpressure.

    PubMed

    Mingguang, Zhang; Juncheng, Jiang

    2008-10-30

    Overpressure is one important cause of domino effects in accidents involving chemical process equipment. Damage probability and the relative threshold value are two necessary parameters in QRA of this phenomenon. Some simple models had previously been proposed based on scarce data or oversimplified assumptions. Hence, more data about damage to chemical process equipment were gathered and analyzed, a quantitative relationship between damage probability and damage degree of equipment was built, and reliable probit models were developed for specific categories of chemical process equipment. Finally, the improvements of the present models were evidenced through comparison with other models in the literature, taking into account the consistency between models and data and the depth of quantitative treatment in QRA.
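
    Probit models of this kind map a dose variable (here, peak overpressure) onto a damage probability through the standard normal CDF. The sketch below shows the generic form; the coefficients are illustrative literature-style values for a notional atmospheric vessel, not the ones developed in this paper.

        from math import log
        from statistics import NormalDist

        def damage_probability(overpressure_pa, k1, k2):
            """Generic probit damage model: Y = k1 + k2*ln(P),  Pr = Phi(Y - 5)."""
            y = k1 + k2 * log(overpressure_pa)
            return NormalDist().cdf(y - 5.0)

        # Illustrative coefficients for a notional atmospheric vessel.
        print(damage_probability(30_000, k1=-18.96, k2=2.44))   # ~0.88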

  5. Instrumentation complex for Langley Research Center's National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Russell, C. H.; Bryant, C. S.

    1977-01-01

    The instrumentation discussed in the present paper was developed to ensure reliable operation for a 2.5-meter cryogenic high-Reynolds-number fan-driven transonic wind tunnel. It will incorporate four CPUs and associated analog and digital input/output equipment, necessary for acquiring research data, controlling the tunnel parameters, and monitoring the process conditions. Connected in a multipoint distributed network, the CPUs will support data base management and processing; research measurement data acquisition and display; process monitoring; and communication control. The design will allow essential processes to continue, in the case of major hardware failures, by switching input/output equipment to alternate CPUs and by eliminating nonessential functions. It will also permit software modularization by CPU activity and thereby reduce complexity and development time.

  6. Barista: A Framework for Concurrent Speech Processing by USC-SAIL.

    PubMed

    Can, Doğan; Gibson, James; Vaz, Colin; Georgiou, Panayiotis G; Narayanan, Shrikanth S

    2014-05-01

    We present Barista, an open-source framework for concurrent speech processing based on the Kaldi speech recognition toolkit and the libcppa actor library. With Barista, we aim to provide an easy-to-use, extensible framework for constructing highly customizable concurrent (and/or distributed) networks for a variety of speech processing tasks. Each Barista network specifies a flow of data between simple actors, concurrent entities communicating by message passing, modeled after Kaldi tools. Leveraging the fast and reliable concurrency and distribution mechanisms provided by libcppa, Barista allows demanding speech processing tasks, such as real-time speech recognizers and complex training workflows, to be scheduled and executed on parallel (and/or distributed) hardware. Barista is released under the Apache License v2.0.

  7. Nanowire growth process modeling and reliability models for nanodevices

    NASA Astrophysics Data System (ADS)

    Fathi Aghdam, Faranak

    Nowadays, nanotechnology is becoming an inescapable part of everyday life. The biggest barrier to its rapid growth is our inability to produce nanoscale materials in a reliable and cost-effective way. In fact, the current yield of nano-devices is very low (around 10%), which makes fabrication of nano-devices very expensive and uncertain. To overcome this challenge, the first and most important step is to investigate how to control nano-structure synthesis variations. The main directions of reliability research in nanotechnology can be classified either from a material perspective or from a device perspective. The first direction focuses on restructuring materials and/or optimizing process conditions at the nano-level (nanomaterials). The other direction is linked to nano-devices and includes the creation of nano-electronic and electro-mechanical systems at nano-level architectures by taking into account the reliability of future products. In this dissertation, we have investigated two topics covering both nano-materials and nano-devices. In the first research work, we studied the optimization of one of the most important nanowire growth processes using statistical methods. Research on nanowire growth with patterned arrays of catalyst has shown that the wire-to-wire spacing is an important factor affecting the quality of resulting nanowires. To improve the process yield and the length uniformity of fabricated nanowires, it is important to reduce the resource competition between nanowires during the growth process. We have proposed a physical-statistical nanowire-interaction model, considering the shadowing effect and shared substrate diffusion area, to determine the optimal pitch that would ensure the minimum competition between nanowires. A sigmoid function is used in the model, and the least squares estimation method is used to estimate the model parameters. The estimated model is then used to determine the optimal spatial arrangement of catalyst arrays. This work is an early attempt to use a physical-statistical modeling approach to study selective nanowire growth for the improvement of process yield. In the second research work, the reliability of nano-dielectrics is investigated. As electronic devices get smaller, reliability issues pose new challenges due to unknown underlying physics of failure (i.e., failure mechanisms and modes). This necessitates new reliability analysis approaches for nano-scale devices. One of the most important nano-devices is the transistor, which is subject to various failure mechanisms. Dielectric breakdown is known to be the most critical one and has become a major barrier to reliable circuit design at the nano-scale. Due to the need for aggressive downscaling of transistors, dielectric films are being made extremely thin, and this has led to the adoption of high-permittivity (high-k) dielectrics as an alternative to the widely used SiO2 in recent years. Since most time-dependent dielectric breakdown test data on bilayer stacks show significant deviations from a Weibull trend, we have proposed two new approaches to modeling the time to breakdown of bilayer high-k dielectrics. In the first approach, we used a marked space-time self-exciting point process to model the defect generation rate. A simulation algorithm is used to generate defects within the dielectric space, and an optimization algorithm is employed to minimize the Kullback-Leibler divergence between the empirical distribution obtained from the real data and the one based on the simulated data, in order to find the best parameter values and to predict the total time to failure. The novelty of the presented approach lies in using a conditional intensity for trap generation in the dielectric that is a function of the time, location and size of previous defects. In addition, in the second approach, a k-out-of-n system framework is proposed to estimate the total failure time after the generation of more than one soft breakdown.
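
    As a rough illustration of the sigmoid/least-squares step described above, the snippet below fits a sigmoid linking catalyst pitch to a normalized nanowire quality response and reads off a near-plateau pitch. The data points, the parameterization, and the 95%-of-plateau rule are all hypothetical stand-ins.

        import numpy as np
        from scipy.optimize import curve_fit

        def sigmoid(pitch, a, b, c):
            # a: plateau quality, b: steepness, c: pitch at half-plateau
            return a / (1.0 + np.exp(-b * (pitch - c)))

        # Hypothetical pitch (nm) vs. normalized nanowire quality data.
        pitch = np.array([100, 200, 300, 400, 500, 600, 700], dtype=float)
        quality = np.array([0.08, 0.15, 0.42, 0.70, 0.88, 0.95, 0.97])

        (a, b, c), _ = curve_fit(sigmoid, pitch, quality, p0=[1.0, 0.01, 350.0])

        # Smallest pitch whose predicted quality reaches 95% of the plateau:
        grid = np.linspace(100.0, 700.0, 601)
        optimal = grid[sigmoid(grid, a, b, c) >= 0.95 * a][0]
        print(f"estimated near-optimal pitch ~ {optimal:.0f} nm")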

  8. Synthesis of multifilament silicon carbide fibers by chemical vapor deposition

    NASA Technical Reports Server (NTRS)

    Revankar, Vithal; Hlavacek, Vladimir

    1991-01-01

    A process for the development of clean silicon carbide fiber with a small diameter and high reliability is presented. An experimental evaluation of operating conditions for SiC fibers with good mechanical properties, and the devising of an efficient technique to prevent the welding together of individual filaments, are discussed. The thermodynamics of different precursor systems were analyzed rigorously, and thermodynamically optimum conditions for a stoichiometric SiC deposit were obtained.

  9. Proceedings of the 26th Project Integration Meeting

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Progress made by the Flat-plate Solar Array (FSA) Project is described for the period July 1985 to April 1986. Included are reports on silicon sheet growth and characterization, silicon material, process development, high-efficiency cells, environmental isolation, engineering sciences, and reliability physics. Also included are technical and plenary presentations made at the 26th Project Integration Meeting (PIM) held on April 29 to 30 and May 1, 1986.

  10. Developing Source Selection Evaluation Criteria and Standards for Reliability and Maintainability.

    DTIC Science & Technology

    1985-09-01

    Early investment in R&M engineering must be carried into the source selection process. This thesis was presented to the Faculty of the School of Systems and Logistics of the Air Force Institute of Technology; the views expressed are those of the author(s) and do not necessarily reflect the views of the School.

  11. Rolling Bearing Steels - A Technical and Historical Perspective

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin V.

    2012-01-01

    Starting about 1920 it became easier to track the growth of bearing materials technology. Until 1955, with few exceptions, comparatively little progress was made in this area. AISI 52100 and some carburizing grades (AISI 4320, AISI 9310) were adequate for most applications. The catalyst for quantum advances in high-performance rolling-element bearing steels was the advent of the aircraft gas turbine engine. With improved bearing manufacturing and steel processing together with advanced lubrication technology, the potential improvements in bearing life can be as much as 80 times that attainable in the late 1950s or as much as 400 times that attainable in 1940. This paper summarizes the chemical, metallurgical and physical aspects of bearing steels and their effect on rolling bearing life and reliability. The single most important variable that has significantly increased bearing life and reliability is vacuum processing of bearing steel. Differences between through-hardened, case-carburized and corrosion-resistant steels are discussed. The interrelation of alloy elements and carbides and their effect on bearing life is presented. An equation relating bearing life, steel hardness and temperature is given. Life factors for various steels are suggested and discussed. A relation between compressive residual stress and bearing life is presented. The effects of retained austenite and grain size are discussed.

  12. The Geostationary Operational Satellite R Series SpaceWire Based Data System

    NASA Technical Reports Server (NTRS)

    Anderson, William; Birmingham, Michael; Krimchansky, Alexander; Lombardi, Matthew

    2016-01-01

    The Geostationary Operational Environmental Satellite R-Series Program (GOES-R, S, T, and U) mission is a joint program between the National Oceanic & Atmospheric Administration (NOAA) and the National Aeronautics & Space Administration (NASA) Goddard Space Flight Center (GSFC). SpaceWire was selected as the science data bus, as well as for command and telemetry, for the GOES instruments. The GOES-R, S, T, and U spacecraft have a mission data loss requirement for all data transfers between the instruments and spacecraft, requiring error detection and correction at the packet level. The GOES-R Reliable Data Delivery Protocol (GRDDP) [1] was developed in house to provide a means of reliably delivering data among various on-board sources and sinks. The GRDDP was presented to and accepted by the European Cooperation for Space Standardization (ECSS) and is part of the ECSS Protocol Identification Standard [2]. GOES-R development and integration are complete and the observatory is scheduled for launch in November 2016. Now that instrument-to-spacecraft integration is complete, the GOES-R Project reviewed lessons learned to determine how the GRDDP could be revised to improve the integration process. Based on knowledge gained during the instrument-to-spacecraft integration process, the following lessons are presented to help potential GRDDP users improve their system designs and implementations.
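
    GRDDP's actual packet formats and recovery rules are defined in the ECSS protocol standard cited above. Purely to illustrate what packet-level error detection and correction buys on a lossy link, here is a toy stop-and-wait sketch in Python; the frame layout, loss model, and retry policy are all invented for the example.

        import random
        import zlib

        def send_reliable(payload: bytes, seq: int, max_retries=5, p_loss=0.3):
            """Toy stop-and-wait delivery: CRC-protected frame, ack or retransmit."""
            frame = seq.to_bytes(2, "big") + payload
            frame += zlib.crc32(frame).to_bytes(4, "big")      # error detection
            for attempt in range(1, max_retries + 1):
                if random.random() > p_loss:                   # frame survived the link
                    body, crc = frame[:-4], frame[-4:]
                    assert int.from_bytes(crc, "big") == zlib.crc32(body)  # receiver check
                    return attempt                              # ack received
            raise TimeoutError(f"packet {seq} undelivered after {max_retries} tries")

        random.seed(3)
        print("delivered on attempt", send_reliable(b"instrument science data", seq=1))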

  13. Neural representation of form-contingent color filling-in in the early visual cortex.

    PubMed

    Hong, Sang Wook; Tong, Frank

    2017-11-01

    Perceptual filling-in exemplifies the constructive nature of visual processing. Color, a prominent surface property of visual objects, can appear to spread to neighboring areas that lack any color. We investigated cortical responses to a color filling-in illusion that effectively dissociates perceived color from the retinal input (van Lier, Vergeer, & Anstis, 2009). Observers adapted to a star-shaped stimulus with alternating red- and cyan-colored points to elicit a complementary afterimage. By presenting an achromatic outline that enclosed one of the two afterimage colors, perceptual filling-in of that color was induced in the unadapted central region. Visual cortical activity was monitored with fMRI, and analyzed using multivariate pattern analysis. Activity patterns in early visual areas (V1-V4) reliably distinguished between the two color-induced filled-in conditions, but only higher extrastriate visual areas showed the predicted correspondence with color perception. Activity patterns allowed for reliable generalization between filled-in colors and physical presentations of perceptually matched colors in areas V3 and V4, but not in earlier visual areas. These findings suggest that the perception of filled-in surface color likely requires more extensive processing by extrastriate visual areas, in order for the neural representation of surface color to become aligned with perceptually matched real colors.

  14. Internal Consistency Reliability of the Self-Report Antisocial Process Screening Device

    ERIC Educational Resources Information Center

    Poythress, Norman G.; Douglas, Kevin S.; Falkenbach, Diana; Cruise, Keith; Lee, Zina; Murrie, Daniel C.; Vitacco, Michael

    2006-01-01

    The self-report version of the Antisocial Process Screening Device (APSD) has become a popular measure for assessing psychopathic features in justice-involved adolescents. However, the internal consistency reliability of its component scales (Narcissism, Callous-Unemotional, and Impulsivity) has been questioned in several studies. This study…

  15. Silver plating ensures reliable diffusion bonding of dissimilar metals

    NASA Technical Reports Server (NTRS)

    1967-01-01

    Dissimilar metals are reliably joined by diffusion bonding when the surfaces are electroplated with silver. The process involves cleaning and etching, anodization, silver striking, and silver plating with a conventional plating bath. It minimizes the formation of detrimental intermetallic phases and provides greater tolerance of processing parameters.

  16. Sociotechnical attributes of safe and unsafe work systems.

    PubMed

    Kleiner, Brian M; Hettinger, Lawrence J; DeJoy, David M; Huang, Yuang-Hsiang; Love, Peter E D

    2015-01-01

    Theoretical and practical approaches to safety based on sociotechnical systems principles place heavy emphasis on the intersections between social-organisational and technical-work process factors. Within this perspective, work system design emphasises factors such as the joint optimisation of social and technical processes, a focus on reliable human-system performance and safety metrics as design and analysis criteria, the maintenance of a realistic and consistent set of safety objectives and policies, and regular access to the expertise and input of workers. We discuss three current approaches to the analysis and design of complex sociotechnical systems: human-systems integration, macroergonomics and safety climate. Each approach emphasises key sociotechnical systems themes, and each prescribes a more holistic perspective on work systems than do traditional theories and methods. We contrast these perspectives with historical precedents such as system safety and traditional human factors and ergonomics, and describe potential future directions for their application in research and practice. The identification of factors that can reliably distinguish between safe and unsafe work systems is an important concern for ergonomists and other safety professionals. This paper presents a variety of sociotechnical systems perspectives on intersections between social-organisational and technical-work process factors as they impact work system analysis, design and operation.

  17. A knowledge acquisition process to analyse operational problems in solid waste management facilities.

    PubMed

    Dokas, Ioannis M; Panagiotakopoulos, Demetrios C

    2006-08-01

    The available expertise on managing and operating solid waste management (SWM) facilities varies among countries and among types of facilities. Few experts are willing to record their experience, while few researchers systematically investigate the chains of events that could trigger operational failures in a facility; expertise acquisition and dissemination, in SWM, is neither popular nor easy, despite the great need for it. This paper presents a knowledge acquisition process aimed at capturing, codifying and expanding reliable expertise and propagating it to non-experts. The knowledge engineer (KE), the person performing the acquisition, must identify the events (or causes) that could trigger a failure, determine whether a specific event could trigger more than one failure, and establish how various events are related among themselves and how they are linked to specific operational problems. The proposed process, which utilizes logic diagrams (fault trees) widely used in system safety and reliability analyses, was used for the analysis of 24 common landfill operational problems. The acquired knowledge led to the development of a web-based expert system (Landfill Operation Management Advisor, http://loma.civil.duth.gr), which estimates the occurrence possibility of operational problems, provides advice and suggests solutions.

  18. Robotic tape library system level testing at NSA: Present and planned

    NASA Technical Reports Server (NTRS)

    Shields, Michael F.

    1994-01-01

    In the present era of declining Defense budgets, increased pressure has been placed on the DOD to utilize Commercial Off the Shelf (COTS) solutions to incrementally solve a wide variety of our computer processing requirements. With the rapid growth in processing power, significant expansion of high performance networking, and the increased complexity of applications data sets, the requirement for high performance, large capacity, reliable and secure, and most of all affordable robotic tape storage libraries has greatly increased. Additionally, the migration to a heterogeneous, distributed computing environment has further complicated the problem. With today's open system compute servers approaching yesterday's supercomputer capabilities, the need for affordable, reliable, and secure Mass Storage Systems (MSS) has taken on an ever increasing importance to our processing center's ability to satisfy operational mission requirements. To that end, NSA has established an in-house capability to acquire, test, and evaluate COTS products. Its goal is to qualify a set of COTS MSS libraries, thereby achieving a modicum of standardization for robotic tape libraries which can satisfy our low, medium, and high performance file and volume serving requirements. In addition, NSA has established relations with other Government Agencies to complement this in-house effort and to maximize our research, testing, and evaluation work. While the preponderance of the effort is focused at the high end of the storage ladder, considerable effort will be expended this year and next on server class and mid range storage systems.

  19. Psychometric assessment of the processes of change scale for sun protection.

    PubMed

    Sillice, Marie A; Babbin, Steven F; Redding, Colleen A; Rossi, Joseph S; Paiva, Andrea L; Velicer, Wayne F

    2018-01-01

    The fourteen-factor Processes of Change Scale for Sun Protection assesses behavioral and experiential strategies that underlie the process of sun protection acquisition and maintenance. Variations of this measure have been used effectively in several randomized sun protection trials, both for evaluation and as a basis for intervention. However, there are no published studies, to date, that evaluate the psychometric properties of the scale. The present study evaluated factorial invariance and scale reliability in a national sample (N = 1360) of adults involved in a Transtheoretical model tailored intervention for exercise and sun protection, at baseline. Invariance testing ranged from least to most restrictive: Configural Invariance (constrains only the factor structure and zero loadings); Pattern Identity Invariance (equal factor loadings across target groups); and Strong Factorial Invariance (equal factor loadings and measurement errors). Multi-sample structural equation modeling tested the invariance of the measurement model across seven subgroups: age, education, ethnicity, gender, race, skin tone, and Stage of Change for Sun Protection. Strong factorial invariance was found across all subgroups. Internal consistency coefficient alpha and factor rho reliability, respectively, were .83 and .80 for behavioral processes, .91 and .89 for experiential processes, and .93 and .91 for the global scale. These results provide strong empirical evidence that the scale is consistent, has internal validity and can be used in research interventions with population-based adult samples.
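
    For readers unfamiliar with the internal consistency coefficient reported here, Cronbach's alpha compares the sum of the item variances with the variance of the total score. A minimal computation, using toy data rather than the study's N = 1360 sample:

        import numpy as np

        def cronbach_alpha(scores):
            """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]
            item_var = scores.var(axis=0, ddof=1).sum()
            total_var = scores.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_var / total_var)

        # Toy data: six respondents answering four Likert-type items.
        toy = np.array([[4, 5, 4, 5],
                        [3, 3, 4, 3],
                        [5, 5, 5, 4],
                        [2, 3, 2, 2],
                        [4, 4, 5, 4],
                        [3, 2, 3, 3]])
        print(f"alpha = {cronbach_alpha(toy):.2f}")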

  20. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
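
    The kind of Markov reliability model these tools construct can be illustrated with a small example: a 2-out-of-3 redundant system whose states count the working units. The sketch below is illustrative only; the component failure rate and mission time are invented, and real tools like CARE III also model fault-handling behavior, which is omitted here.

        import numpy as np
        from scipy.linalg import expm

        def triplex_reliability(lam, t):
            """R(t) for a 2-out-of-3 system as a 3-state Markov chain.

            States: 3 working -> 2 working -> failed (absorbing).
            Q is the generator matrix; state probabilities obey dp/dt = p Q.
            """
            Q = np.array([[-3 * lam, 3 * lam, 0.0],
                          [0.0, -2 * lam, 2 * lam],
                          [0.0, 0.0, 0.0]])
            p0 = np.array([1.0, 0.0, 0.0])          # all units working at t = 0
            pt = p0 @ expm(Q * t)
            return 1.0 - pt[-1]                      # probability of not being failed

        print(triplex_reliability(lam=1e-4, t=1_000.0))   # per-hour rate, 1000 h mission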

  1. Evaluation of reliability modeling tools for advanced fault tolerant systems

    NASA Technical Reports Server (NTRS)

    Baker, Robert; Scheper, Charlotte

    1986-01-01

    The Computer Aided Reliability Estimation (CARE III) and Automated Reliability Interactive Estimation System (ARIES 82) reliability tools were evaluated for application to advanced fault-tolerant aerospace systems. To determine reliability modeling requirements, the evaluation focused on the Draper Laboratories' Advanced Information Processing System (AIPS) architecture as an example architecture for fault-tolerant aerospace systems. Advantages and limitations were identified for each reliability evaluation tool. The CARE III program was designed primarily for analyzing ultrareliable flight control systems. The ARIES 82 program's primary use was to support university research and teaching. Neither CARE III nor ARIES 82 was suited for determining the reliability of complex nodal networks of the type used to interconnect processing sites in the AIPS architecture. It was concluded that ARIES was not suitable for modeling advanced fault-tolerant systems. It was further concluded that, subject to some limitations (difficulty in modeling systems with unpowered spare modules, systems where equipment maintenance must be considered, systems where failure depends on the sequence in which faults occurred, and systems where multiple faults beyond double near-coincident faults must be considered), CARE III is best suited for evaluating the reliability of advanced fault-tolerant systems for air transport.

  2. Bayesian accrual prediction for interim review of clinical studies: open source R package and smartphone application.

    PubMed

    Jiang, Yu; Guarino, Peter; Ma, Shuangge; Simon, Steve; Mayo, Matthew S; Raghavan, Rama; Gajewski, Byron J

    2016-07-22

    Subject recruitment for medical research is challenging. Slow patient accrual leads to increased costs and delays in treatment advances. Researchers need reliable tools to manage and predict the accrual rate. The previously developed Bayesian method integrates researchers' experience on former trials and data from an ongoing study, providing a reliable prediction of accrual rate for clinical studies. In this paper, we present a user-friendly graphical user interface program developed in R. A closed-form solution for the total number of subjects that can be recruited within a fixed time is derived. We also present a built-in Android system using Java for web browsers and mobile devices. Using the accrual software, we re-evaluated the Veterans Affairs Cooperative Studies Program 558-ROBOTICS study. The application of the software in monitoring and management of recruitment is illustrated for different stages of the trial. This accrual software provides a convenient platform for estimation and prediction of the accrual process.
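
    The flavor of the underlying Bayesian accrual model can be conveyed with a simplified gamma-Poisson version: the investigator's prior expectation (n_target subjects in t_target time, down-weighted by P) is combined with observed enrollment to give a posterior accrual rate. This is a stripped-down sketch, not the package's actual code, and all numbers are illustrative.

        def accrual_prediction(n_enrolled, t_elapsed, t_total, n_target, t_target, P):
            """Posterior-mean total accrual under a gamma-Poisson accrual model.

            Prior Gamma(P*n_target, P*t_target) on the accrual rate; posterior
            Gamma(P*n_target + n_enrolled, P*t_target + t_elapsed).
            """
            a = P * n_target + n_enrolled
            b = P * t_target + t_elapsed
            return n_enrolled + (a / b) * (t_total - t_elapsed)

        # 120 subjects by month 12 of a 36-month study; prior: 300 in 36 months,
        # weighted at P = 0.5 (the prior counts as half a study's worth of data).
        print(accrual_prediction(120, 12, 36, 300, 36, P=0.5))   # -> 336.0 expected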

  3. Cross-cultural adaptation, validation and reliability of the brazilian version of the Richmond Compulsive Buying Scale.

    PubMed

    Leite, Priscilla; Rangé, Bernard; Kukar-Kiney, Monika; Ridgway, Nancy; Monroe, Kent; Ribas Junior, Rodolfo; Landeira Fernandez, J; Nardi, Antonio Egidio; Silva, Adriana

    2013-03-01

    To present the process of transcultural adaptation of the Richmond Compulsive Buying Scale to Brazilian Portuguese. For the semantic adaptation step, the scale was translated to Portuguese and then back-translated to English by two professional translators and one psychologist, without any communication between them. The scale was then applied to 20 participants from the general population for language adjustments. For the construct validation step, an exploratory factor analysis was performed, using the scree plot test, principal component analysis for factor extraction, and Varimax rotation. For convergent validity, the correlation matrix was analyzed through Pearson's coefficient. The scale showed easy applicability, satisfactory internal consistency (Cronbach's alpha=.87), and a high correlation with other rating scales for compulsive buying disorder, indicating that it is suitable to be used in the assessment and diagnosis of compulsive buying disorder, as it presents psychometric validity. The Brazilian Portuguese version of the Richmond Compulsive Buying Scale has good validity and reliability.

  4. Error Control Coding Techniques for Space and Satellite Communications

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    2000-01-01

    This paper presents a concatenated turbo coding system in which a Reed-Solomon outer code is concatenated with a binary turbo inner code. In the proposed system, the outer code decoder and the inner turbo code decoder interact to achieve both good bit error and frame error performances. The outer code decoder helps the inner turbo code decoder to terminate its decoding iterations while the inner turbo code decoder provides soft-output information to the outer code decoder to carry out reliability-based soft-decision decoding. In the case that the outer code decoding fails, the outer code decoder instructs the inner code decoder to continue its decoding iterations until the outer code decoding is successful or a preset maximum number of decoding iterations is reached. This interaction between outer and inner code decoders reduces decoding delay. Also presented in the paper are an effective criterion for stopping the iteration process of the inner code decoder and a new reliability-based decoding algorithm for nonbinary codes.

  5. An Interactive Concatenated Turbo Coding System

    NASA Technical Reports Server (NTRS)

    Liu, Ye; Tang, Heng; Lin, Shu; Fossorier, Marc

    1999-01-01

    This paper presents a concatenated turbo coding system in which a Reed-Solomon outer code is concatenated with a binary turbo inner code. In the proposed system, the outer code decoder and the inner turbo code decoder interact to achieve both good bit error and frame error performances. The outer code decoder helps the inner turbo code decoder to terminate its decoding iterations while the inner turbo code decoder provides soft-output information to the outer code decoder to carry out reliability-based soft-decision decoding. In the case that the outer code decoding fails, the outer code decoder instructs the inner code decoder to continue its decoding iterations until the outer code decoding is successful or a preset maximum number of decoding iterations is reached. This interaction between outer and inner code decoders reduces decoding delay. Also presented in the paper are an effective criterion for stopping the iteration process of the inner code decoder and a new reliability-based decoding algorithm for nonbinary codes.

  6. Measurement in Sensory Modulation: The Sensory Processing Scale Assessment

    PubMed Central

    Miller, Lucy J.; Sullivan, Jillian C.

    2014-01-01

    OBJECTIVE. Sensory modulation issues have a significant impact on participation in daily life. Moreover, understanding phenotypic variation in sensory modulation dysfunction is crucial for research related to defining homogeneous groups and for clinical work in guiding treatment planning. We thus evaluated the new Sensory Processing Scale (SPS) Assessment. METHOD. Research included item development, behavioral scoring system development, test administration, and item analyses to evaluate reliability and validity across sensory domains. RESULTS. Items with adequate reliability (internal reliability >.4) and discriminant validity (p < .01) were retained. Feedback from the expert panel also contributed to decisions about retaining items in the scale. CONCLUSION. The SPS Assessment appears to be a reliable and valid measure of sensory modulation (scale reliability >.90; discrimination between group effect sizes >1.00). This scale has the potential to aid in differential diagnosis of sensory modulation issues. PMID:25184464

  7. The Evaluation Method of the Lightning Strike on Transmission Lines Aiming at Power Grid Reliability

    NASA Astrophysics Data System (ADS)

    Wen, Jianfeng; Wu, Jianwei; Huang, Liandong; Geng, Yinan; Yu, zhanqing

    2018-01-01

    Lightning protection of power systems has traditionally focused on reducing the flashover rate, distinguishing lines only by voltage level, without considering the functional differences between transmission lines or analyzing the effect on power grid reliability. As a result, lightning protection designed for general transmission lines is excessive in some cases yet insufficient for key lines. To solve this problem, a method for analyzing lightning strikes on transmission lines with respect to power grid reliability is given. Full-wave process theory is used to analyze lightning back-striking, and the leader propagation model is used to describe the shielding failure of transmission lines. An index of power grid reliability is introduced, and the effect of transmission line faults on power system reliability is discussed in detail.

  8. Scheduling structural health monitoring activities for optimizing life-cycle costs and reliability of wind turbines

    NASA Astrophysics Data System (ADS)

    Hanish Nithin, Anu; Omenzetter, Piotr

    2017-04-01

    Optimization of the life-cycle costs and reliability of offshore wind turbines (OWTs) is an area of immense interest due to the widespread increase in wind power generation across the world. Most existing studies have used structural reliability and Bayesian pre-posterior analysis for optimization. This paper proposes an extension of previous approaches in a framework for probabilistic optimization of the total life-cycle costs and reliability of OWTs, combining elements of structural reliability/risk analysis (SRA) and Bayesian pre-posterior analysis with optimization through a genetic algorithm (GA). The SRA techniques are adopted to compute the probabilities of damage occurrence and failure associated with the deterioration model. The probabilities are used in the decision tree and are updated using Bayesian analysis. The output of this framework determines the optimal structural health monitoring and maintenance schedules to be implemented during the life span of OWTs, maintaining a trade-off between the life-cycle costs and the risk of structural failure. Numerical illustrations with a generic deterioration model for one monitoring exercise in the life cycle of a system are demonstrated. Two case scenarios, namely building an initially expensive but robust structure versus a cheaper but more quickly deteriorating one, and adopting an expensive monitoring system, are presented to aid the decision-making process.
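
    The core trade-off being optimized can be shown with a deliberately simplified one-variable version: choose the time of a single inspection to minimize expected life-cycle cost under a Weibull deterioration model. The paper's framework couples SRA, Bayesian updating and a GA; the grid search and every cost and reliability number below are illustrative stand-ins.

        import numpy as np

        def expected_cost(t_insp, horizon=20.0, c_insp=1.0, c_repair=5.0,
                          c_fail=100.0, eta=15.0, beta=2.5, p_detect=0.9):
            """Expected cost with a single inspection at t_insp (years)."""
            F = lambda t: 1.0 - np.exp(-(t / eta) ** beta)          # failure CDF
            p_early = F(t_insp)                                      # fails before inspection
            p_late = (F(horizon) - F(t_insp)) / (1.0 - F(t_insp))    # fails later, given survival
            cost_detected = c_repair + F(horizon - t_insp) * c_fail  # repair renews the component
            cost_missed = p_late * c_fail
            return (c_insp + p_early * c_fail
                    + (1.0 - p_early) * (p_detect * cost_detected
                                         + (1.0 - p_detect) * cost_missed))

        times = np.linspace(1.0, 19.0, 181)
        best = times[np.argmin([expected_cost(t) for t in times])]
        print(f"cost-optimal single inspection ~ year {best:.1f}")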

  9. The reliability of multistory buildings with the effect of non-uniform settlements of foundation

    NASA Astrophysics Data System (ADS)

    Al'Malul, Rafik; Gadzhuntsev, Michail

    2018-03-01

    The issue is the evaluation of the reliability of structures considering the influence of variations in support settlement, which change during the lifetime of a structure due to the consolidation process of the ground. Recently, specialists have given special emphasis to the need to develop methods for estimating the reliability and durability of structures. The problem the article considers is the determination of the reliability of multistory buildings with non-uniform, time-varying settlements caused by the consolidation process in soils. Failure of structures may occur before the settlement reaches its stabilized value, because of violations of the conditions of normal use.

  10. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Wilson, Larry W.

    1989-01-01

    The longterm goal of this research is to identify or create a model for use in analyzing the reliability of flight control software. The immediate tasks addressed are the creation of data useful to the study of software reliability and production of results pertinent to software reliability through the analysis of existing reliability models and data. The completed data creation portion of this research consists of a Generic Checkout System (GCS) design document created in cooperation with NASA and Research Triangle Institute (RTI) experimenters. This will lead to design and code reviews with the resulting product being one of the versions used in the Terminal Descent Experiment being conducted by the Systems Validations Methods Branch (SVMB) of NASA/Langley. An appended paper details an investigation of the Jelinski-Moranda and Geometric models for software reliability. The models were given data from a process that they have correctly simulated and asked to make predictions about the reliability of that process. It was found that either model will usually fail to make good predictions. These problems were attributed to randomness in the data and replication of data was recommended.
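
    The appended investigation can be reproduced in miniature: simulate inter-failure times from a Jelinski-Moranda process and fit the model back by maximum likelihood, where the hazard before the i-th failure is phi*(N - i + 1). The grid-search fit below is a sketch, not the study's code; noisy data often make the estimate of remaining faults unstable, which is consistent with the prediction problems the study reports.

        import numpy as np

        def jm_fit(times):
            """Grid-search MLE for the Jelinski-Moranda model.

            times: observed inter-failure times t_1..t_n; hazard before
            failure i is phi * (N - i + 1), N = initial number of faults.
            """
            t = np.asarray(times, dtype=float)
            n = len(t)
            i = np.arange(1, n + 1)

            def profile_loglik(N):
                phi = n / np.sum((N - i + 1) * t)    # MLE of phi given N
                return np.sum(np.log(phi * (N - i + 1))) - n

            grid = np.arange(n + 0.5, 20.0 * n, 0.5)
            N_hat = grid[np.argmax([profile_loglik(N) for N in grid])]
            return N_hat, n / np.sum((N_hat - i + 1) * t)

        rng = np.random.default_rng(1)
        N_true, phi_true = 30, 0.02
        inter = [rng.exponential(1.0 / (phi_true * (N_true - k))) for k in range(20)]
        N_hat, phi_hat = jm_fit(inter)
        print(f"estimated faults remaining ~ {N_hat - 20:.1f} (true: 10)")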

  11. Design process for applying the nonlocal thermal transport iSNB model to a Polar-Drive ICF simulation

    NASA Astrophysics Data System (ADS)

    Cao, Duc; Moses, Gregory; Delettrez, Jacques; Collins, Timothy

    2014-10-01

    A design process is presented for the nonlocal thermal transport iSNB (implicit Schurtz, Nicolai, and Busquet) model to provide reliable nonlocal thermal transport in polar-drive ICF simulations. Results from the iSNB model are known to be sensitive to changes in the SNB "mean free path" formula, and the latter's original form required modification to obtain realistic preheat levels. In the presented design process, SNB mean free paths are first modified until the model can match temperatures from Goncharov's thermal transport model in 1D temperature relaxation simulations. Afterwards the same mean free paths are tested in a 1D polar-drive surrogate simulation to match adiabats from Goncharov's model. After passing the two previous steps, the model can then be run in a full 2D polar-drive simulation. This research is supported by the University of Rochester Laboratory for Laser Energetics.

  12. PLACE: an open-source python package for laboratory automation, control, and experimentation.

    PubMed

    Johnson, Jami L; Tom Wörden, Henrik; van Wijk, Kasper

    2015-02-01

    In modern laboratories, software can drive the full experimental process from data acquisition to storage, processing, and analysis. The automation of laboratory data acquisition is an important consideration for every laboratory. When implementing a laboratory automation scheme, important parameters include its reliability, time to implement, adaptability, and compatibility with software used at other stages of experimentation. In this article, we present an open-source, flexible, and extensible Python package for Laboratory Automation, Control, and Experimentation (PLACE). The package uses modular organization and clear design principles; therefore, it can be easily customized or expanded to meet the needs of diverse laboratories. We discuss the organization of PLACE, data-handling considerations, and then present an example using PLACE for laser-ultrasound experiments. Finally, we demonstrate the seamless transition to post-processing and analysis with Python through the development of an analysis module for data produced by PLACE automation.

  13. Virtual Sensor Web Architecture

    NASA Astrophysics Data System (ADS)

    Bose, P.; Zimdars, A.; Hurlburt, N.; Doug, S.

    2006-12-01

    NASA envisions the development of smart sensor webs: intelligent and integrated observation networks that harness distributed sensing assets, their associated continuous and complex data sets, and predictive observation processing mechanisms for timely, collaborative hazard mitigation and enhanced science productivity and reliability. This paper presents the Virtual Sensor Web Infrastructure for Collaborative Science (VSICS) architecture for sustained coordination of (numerical and distributed) model-based processing, closed-loop resource allocation, and observation planning. VSICS's key ideas include i) rich descriptions of sensors as services based on semantic markup languages like OWL and SensorML; ii) service-oriented workflow composition and repair for simple and ensemble models, with event-driven workflow execution based on event-based and distributed workflow management mechanisms; and iii) development of autonomous model interaction management capabilities providing closed-loop control of collection resources driven by competing targeted observation needs. We present results from initial work on collaborative science processing involving distributed services (the COSEC framework) that is being extended to create VSICS.

  14. Pupillary dynamics reveal computational cost in sentence planning.

    PubMed

    Sevilla, Yamila; Maldonado, Mora; Shalóm, Diego E

    2014-01-01

    This study investigated the computational cost associated with grammatical planning in sentence production. We measured people's pupillary responses as they produced spoken descriptions of depicted events. We manipulated the syntactic structure of the target by training subjects to use different types of sentences following a colour cue. The results showed higher increase in pupil size for the production of passive and object dislocated sentences than for active canonical subject-verb-object sentences, indicating that more cognitive effort is associated with more complex noncanonical thematic order. We also manipulated the time at which the cue that triggered structure-building processes was presented. Differential increase in pupil diameter for more complex sentences was shown to rise earlier as the colour cue was presented earlier, suggesting that the observed pupillary changes are due to differential demands in relatively independent structure-building processes during grammatical planning. Task-evoked pupillary responses provide a reliable measure to study the cognitive processes involved in sentence production.

  15. Identifying resonance frequency deviations for high order nano-wire ring resonator filters based on a coupling strength variation

    NASA Astrophysics Data System (ADS)

    Park, Sahnggi; Kim, Kap-Joong; Kim, Duk-Jun; Kim, Gyungock

    2009-02-01

    Third-order ring resonators were designed, and their resonance frequency deviations were analyzed experimentally by processing them with E-beam lithography and ICP etching in a CMOS nano-fabrication laboratory. We developed a reliable method to identify, and experimentally reduce, the degree of deviation of each ring resonance frequency before completion of the fabrication process. The identified deviations can be minimized by the approach presented in this paper. It is expected that this method will provide a significant clue for making high-order multi-channel ring resonators.

  16. Determination of Ground Heat Exchangers Temperature Field in Geothermal Heat Pumps

    NASA Astrophysics Data System (ADS)

    Zhurmilova, I.; Shtym, A.

    2017-11-01

    For the heating and cooling supply of buildings and structures, geothermal heat pumps using low-potential ground energy are applied by means of ground heat exchangers. Heat transfer in a system of ground heat exchangers is a combined heat transfer phenomenon. The paper presents mathematical modeling of the heat exchange processes; the temperature fields are built that are necessary for determining the ground volume that ensures an adequate supply of low-potential energy, excludes freezing of the soil around the pipes of the ground heat exchangers, and guarantees reliable operation of geothermal heat pumps.
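
    The abstract does not state which model is used, but the classical starting point for such temperature fields is Kelvin's infinite line-source solution, in which the temperature change at radius r and time t around a borehole is q/(4*pi*k) * E1(r^2/(4*alpha*t)). A minimal sketch with illustrative ground properties:

        import numpy as np
        from scipy.special import exp1

        def line_source_delta_T(r, t, q=-30.0, k=2.0, alpha=1.0e-6):
            """Kelvin infinite line-source model for a borehole heat exchanger.

            r: radial distance (m); t: time (s)
            q: heat rate per unit borehole length (W/m, negative = extraction)
            k: ground conductivity (W/m/K); alpha: diffusivity (m^2/s)
            """
            return q / (4.0 * np.pi * k) * exp1(r ** 2 / (4.0 * alpha * t))

        # Ground temperature drop 0.5 m from the borehole after 90 days:
        print(line_source_delta_T(r=0.5, t=90 * 86_400.0))   # about -5 K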

  17. Hybrid receiver study

    NASA Technical Reports Server (NTRS)

    Stone, M. S.; Mcadam, P. L.; Saunders, O. W.

    1977-01-01

    The results are presented of a 4 month study to design a hybrid analog/digital receiver for outer planet mission probe communication links. The scope of this study includes functional design of the receiver; comparisons between analog and digital processing; hardware tradeoffs for key components including frequency generators, A/D converters, and digital processors; development and simulation of the processing algorithms for acquisition, tracking, and demodulation; and detailed design of the receiver in order to determine its size, weight, power, reliability, and radiation hardness. In addition, an evaluation was made of the receiver's capabilities to perform accurate measurement of signal strength and frequency for radio science missions.

  18. Precision manufacturing for clinical-quality regenerative medicines.

    PubMed

    Williams, David J; Thomas, Robert J; Hourd, Paul C; Chandra, Amit; Ratcliffe, Elizabeth; Liu, Yang; Rayment, Erin A; Archer, J Richard

    2012-08-28

    Innovations in engineering applied to healthcare make a significant difference to people's lives. Market growth is guaranteed by demographics. Regulation and requirements for good manufacturing practice, with extreme levels of repeatability and reliability, demand high-precision process and measurement solutions. Emerging technologies using living biological materials add complexity. This paper presents some results of work demonstrating the precision automated manufacture of living materials, particularly the expansion of populations of human stem cells for therapeutic use as regenerative medicines. The paper also describes quality engineering techniques for precision process design and improvement, and identifies the requirements for manufacturing technology and measurement systems evolution for such therapies.

  19. Real-time on-board orbit determination with DORIS

    NASA Technical Reports Server (NTRS)

    Berthias, J.-P.; Jayles, C.; Pradines, D.

    1993-01-01

    A spaceborne orbit determination system is being developed by the French Space Agency (CNES) for the SPOT 4 satellite. It processes DORIS measurements to produce an orbit with an accuracy of about 500 meters rms. In order to evaluate the reliability of the software, it was combined with the MERCATOR man/machine interface and used to process the TOPEX/Poseidon DORIS data in near real time during the validation phase of the instrument, at JPL and at CNES. This paper gives an overview of the orbit determination system and presents the results of the TOPEX/Poseidon experiment.

  20. Evaluation of the Telecommunications Protocol Processing Subsystem Using Reconfigurable Interoperable Gate Array

    NASA Technical Reports Server (NTRS)

    Pang, Jackson; Liddicoat, Albert; Ralston, Jesse; Pingree, Paula

    2006-01-01

    The current implementation of the Telecommunications Protocol Processing Subsystem Using Reconfigurable Interoperable Gate Arrays (TRIGA) is equipped with CFDP protocol and CCSDS Telemetry and Telecommand framing schemes to replace the CPU intensive software counterpart implementation for reliable deep space communication. We present the hardware/software co-design methodology used to accomplish high data rate throughput. The hardware CFDP protocol stack implementation is then compared against the two recent flight implementations. The results from our experiments show that TRIGA offers more than 3 orders of magnitude throughput improvement with less than one-tenth of the power consumption.

  1. Numerical Validation of Chemical Compositional Model for Wettability Alteration Processes

    NASA Astrophysics Data System (ADS)

    Bekbauov, Bakhbergen; Berdyshev, Abdumauvlen; Baishemirov, Zharasbek; Bau, Domenico

    2017-12-01

    Chemical compositional simulation of enhanced oil recovery and surfactant-enhanced aquifer remediation processes is a complex task that involves solving dozens of equations for all grid blocks representing a reservoir. In the present work, we perform a numerical validation of a newly developed mathematical formulation which satisfies the conservation laws of mass and energy and allows a sequential solution approach to be applied, solving the governing equations separately and implicitly. Through its application to a numerical experiment using a wettability alteration model, and through comparisons with an existing chemical compositional model's numerical results, the new model has proven to be practical, reliable and stable.

  2. Immunocytochemical detection of astrocytes in brain slices in combination with Nissl staining.

    PubMed

    Korzhevskii, D E; Otellin, V A

    2005-07-01

    The present study was performed to develop a simple and reliable method for the combined staining of specimens, allowing simultaneous immunocytochemical detection of astrocytes and assessment of the functional state of neurons by the Nissl method. The protocol suggested for processing paraffin sections preserves tissue structure at high quality and allows the selective identification of astrocytes with counterstaining of neurons by the Nissl method. The protocol can be used without modification for processing brain specimens from humans and various mammals, except mice and rabbits.

  3. Microstructural characterization of ultra thin copper interconnects

    NASA Astrophysics Data System (ADS)

    Yang, Hee-Dong

    The present study investigates the defects related to reliability issues, such as physical failures developed during processing and end use. In the first part of this study, kinetic analysis using the Johnson-Mehl-Avrami (JMA) model demonstrates that the self-annealing mechanism in electroplated Cu films depends on film properties such as thickness and the amount of crystal defects in the as-deposited state. In order to obtain evidence of such defects, a microstructural characterization of defects in ultra thin copper interconnects using transmission electron microscopy (TEM) is presented. Examination of the defects using TEM reveals that gas-filled voids form as a lens shape along the {110} habit planes of the copper matrix. In the second part of this study, the methodology and results of electro-thermal-fatigue (ETF) testing, designed for early detection of process defects, are presented. ETF testing combines high-density current electrical stressing and thermal cycling to accelerate the evolution of defects in Cu interconnects. In ETF testing, the evolution of defects provides the nucleation sites for voids, which open or close during thermal cycling. The accumulation of voids then creates a change in resistance when they reach a critical size. As a result of void evolution, the high current density and high Joule heating create a transient resistance increase. ETF testing reveals two failure modes, of which the mode-I failure is the important one for detecting defects. The number of cycles to failure in ETF testing decreases with higher current density, but the rate of thermal cycling has no effect. Results from this investigation suggest that impurities in the copper electrodeposition process must be carefully controlled to achieve reliable ultra thin copper interconnects.
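
    The JMA (Avrami) analysis mentioned in the first part fits the transformed fraction X(t) = 1 - exp(-k t^n); the exponent n is diagnostic of the nucleation and growth mode, and film-to-film differences in n and k are what reveal the thickness dependence of self-annealing. A sketch with hypothetical data, using the linearization ln(-ln(1-X)) = ln k + n ln t:

        import numpy as np

        # Hypothetical fraction-transformed data for an electroplated Cu film
        # self-annealing at room temperature (from, e.g., resistivity decay).
        t_hours = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])
        X = np.array([0.05, 0.15, 0.35, 0.65, 0.90, 0.99])

        # JMA model: X = 1 - exp(-k t^n)  =>  ln(-ln(1-X)) = ln k + n ln t
        n, ln_k = np.polyfit(np.log(t_hours), np.log(-np.log(1.0 - X)), 1)
        print(f"Avrami exponent n ~ {n:.2f}, rate constant k ~ {np.exp(ln_k):.4f}")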

  4. Advanced Data Acquisition Systems

    NASA Technical Reports Server (NTRS)

    Perotti, J.

    2003-01-01

    Current and future requirements of the aerospace sensors and transducers field make it necessary for the design and development of new data acquisition devices and instrumentation systems. New designs are sought to incorporate self-health, self-calibrating, self-repair capabilities, allowing greater measurement reliability and extended calibration cycles. With the addition of power management schemes, state-of-the-art data acquisition systems allow data to be processed and presented to the users with increased efficiency and accuracy. The design architecture presented in this paper displays an innovative approach to data acquisition systems. The design incorporates: electronic health self-check, device/system self-calibration, electronics and function self-repair, failure detection and prediction, and power management (reduced power consumption). These requirements are driven by the aerospace industry need to reduce operations and maintenance costs, to accelerate processing time and to provide reliable hardware with minimum costs. The project's design architecture incorporates some commercially available components identified during the market research investigation like: Field Programmable Gate Arrays (FPGA) Programmable Analog Integrated Circuits (PAC IC) and Field Programmable Analog Arrays (FPAA); Digital Signal Processing (DSP) electronic/system control and investigation of specific characteristics found in technologies like: Electronic Component Mean Time Between Failure (MTBF); and Radiation Hardened Component Availability. There are three main sections discussed in the design architecture presented in this document. They are the following: (a) Analog Signal Module Section, (b) Digital Signal/Control Module Section and (c) Power Management Module Section. These sections are discussed in detail in the following pages. This approach to data acquisition systems has resulted in the assignment of patent rights to Kennedy Space Center under U.S. patent # 6,462,684. Furthermore, NASA KSC commercialization office has issued licensing rights to Circuit Avenue Netrepreneurs, LLC , a minority-owned business founded in 1999 located in Camden, NJ.

  5. A Time-Variant Reliability Model for Copper Bending Pipe under Seawater-Active Corrosion Based on the Stochastic Degradation Process

    PubMed Central

    Li, Mengmeng; Feng, Qiang; Yang, Dezhen

    2018-01-01

    In the degradation process, the randomness and multiplicity of variables are difficult to describe by mathematical models. However, they are common in engineering and cannot be neglected, so it is necessary to study this issue in depth. In this paper, the copper bending pipe in seawater piping systems is taken as the analysis object, and the time-variant reliability is calculated by solving the interference of limit strength and maximum stress. We performed degradation and tensile experiments on the copper material and obtained the limit strength at each time point. In addition, degradation experiments on the copper bending pipe were performed, the thickness at each time point was obtained, and the response of maximum stress was then calculated by simulation. Further, with the help of a Monte Carlo method we propose, the time-variant reliability of the copper bending pipe was calculated based on the stochastic degradation process and interference theory. Compared with traditional methods and verified by maintenance records, the results show that the time-variant reliability model based on the stochastic degradation process proposed in this paper has better applicability in reliability analysis, and it can predict the replacement cycle of copper bending pipe under seawater-active corrosion more conveniently and accurately. PMID:29584695
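
    The stress-strength interference idea behind the model is easy to sketch as a Monte Carlo calculation: sample a degrading strength and a growing stress at each service time and count how often strength still exceeds stress. All distributions and degradation rates below are illustrative placeholders, not the paper's measured values.

        import numpy as np

        rng = np.random.default_rng(42)

        def time_variant_reliability(t_years, n_sim=200_000):
            """Monte Carlo stress-strength interference for a corroding pipe."""
            strength0 = rng.normal(330.0, 15.0, n_sim)         # limit strength, MPa
            wall0 = 2.5                                        # initial wall, mm
            loss = rng.normal(0.08, 0.02, n_sim)               # wall loss, mm/year
            wall = np.maximum(wall0 - loss * t_years, 0.3)
            stress = rng.normal(120.0, 10.0, n_sim) * (wall0 / wall)  # thinner -> higher
            strength = strength0 * (1.0 - 0.004 * t_years)     # material degradation
            return float(np.mean(strength > stress))

        for t in (0, 5, 10, 15, 20):
            print(f"R({t:2d} y) = {time_variant_reliability(t):.3f}")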

  6. A Portable Dynamic Laser Speckle System for Sensing Long-Term Changes Caused by Treatments in Painting Conservation.

    PubMed

    Pérez, Alberto J; González-Peña, Rolando J; Braga, Roberto; Perles, Ángel; Pérez-Marín, Eva; García-Diego, Fernando J

    2018-01-11

    Dynamic laser speckle (DLS) is used as a reliable sensor of activity for all types of materials. Traditional applications are based on high-rate captures (usually greater than 10 frames per second, fps). Even for drying processes in conservation treatments, where there is a high level of activity in the first moments after application and slower activity after some minutes or hours, the process is based on the acquisition of images at the same time rate in moments of high and low activity. In this work, we present an alternative approach to track the drying process of protective layers and other painting conservation processes that take a long time to reduce their levels of activity. We illuminate a temporary protector (cyclododecane) and a varnish using three lasers of different wavelengths, and monitor them at a low fps rate during long-term drying. The results are compared to the traditional method. This work also presents a monitoring method that uses portable equipment. The results demonstrate the feasibility of the portable device and show the improved sensitivity of dynamic laser speckle when sensing the long-term drying of cyclododecane and varnish in conservation.
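
    As a hedged illustration of how speckle activity can be quantified from a low-rate image sequence, the sketch below computes the generalized-differences activity index, one common DLS measure; the abstract does not specify which estimator the authors used, so the choice of metric and all parameters here are assumptions.

      import numpy as np

      def generalized_differences(frames):
          """Generalized differences (GD) activity map for a stack of
          speckle frames with shape (T, H, W): mean absolute difference
          over every pair of frames. Brighter pixels = more activity."""
          frames = np.asarray(frames, dtype=np.float64)
          T = frames.shape[0]
          gd = np.zeros(frames.shape[1:])
          for i in range(T):
              for j in range(i + 1, T):
                  gd += np.abs(frames[i] - frames[j])
          return gd / (T * (T - 1) / 2)

      # Example with synthetic data: 20 frames captured at a low rate.
      rng = np.random.default_rng(1)
      stack = rng.integers(0, 256, size=(20, 64, 64))
      activity = generalized_differences(stack)
      print("mean activity index:", activity.mean())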

  7. Structure reliability design and analysis of support ring for cylinder seal

    NASA Astrophysics Data System (ADS)

    Minmin, Zhao

    2017-09-01

    In this paper, the general reliability design process for the cross-sectional dimensions of the support ring used for cylinder sealing is introduced. Then, taking a support ring with a certain section shape as an example, every dimensional parameter of the section is determined from the viewpoint of reliability design. Finally, the static strength and reliability of the support ring are analyzed to verify the correctness of the reliability design result.

  8. The Use of Computer Simulation Methods to Reach Data for Economic Analysis of Automated Logistic Systems

    NASA Astrophysics Data System (ADS)

    Neradilová, Hana; Fedorko, Gabriel

    2016-12-01

    Automated logistic systems are becoming more widely used within enterprise logistics processes. Their main advantage is that they increase the efficiency and reliability of logistics processes. In terms of evaluating their effectiveness, it is necessary to take into account the economic aspect of the entire process. However, many users ignore and underestimate this area, which is a mistake. One of the reasons the economic aspect is overlooked is that obtaining information for such an analysis is not easy. The aim of this paper is to present the possibilities of computer simulation methods for obtaining data for full-scale economic analysis.
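
    A minimal sketch of the idea, under assumed numbers: a discrete-event loop simulates an automated conveyor serving arriving transport jobs and logs the operating-cost data (busy time, idle time) that a full economic analysis would consume. The arrival rate, handling time, and unit costs are hypothetical.

      import random

      random.seed(42)

      SHIFT_MIN = 480.0          # one 8-hour shift, in minutes
      SERVICE_MIN = 2.5          # assumed handling time per pallet
      COST_PER_BUSY_MIN = 0.40   # assumed energy + wear cost while moving
      COST_PER_IDLE_MIN = 0.05   # assumed standby cost

      t = busy = idle = 0.0
      served = 0
      arrival = random.expovariate(1 / 4.0)   # first pallet; mean 4 min between arrivals

      while arrival < SHIFT_MIN:
          if arrival > t:            # conveyor idles until the pallet arrives
              idle += arrival - t
              t = arrival
          t += SERVICE_MIN           # handle the pallet
          busy += SERVICE_MIN
          served += 1
          arrival += random.expovariate(1 / 4.0)   # next Poisson arrival

      idle += max(0.0, SHIFT_MIN - t)   # idle tail at the end of the shift

      cost = busy * COST_PER_BUSY_MIN + idle * COST_PER_IDLE_MIN
      print(f"pallets served: {served}, busy: {busy:.0f} min, idle: {idle:.0f} min")
      print(f"shift operating cost: {cost:.2f}, per pallet: {cost / served:.3f}")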

  9. Electron beam irradiation processing for industrial and medical applications

    NASA Astrophysics Data System (ADS)

    Ozer, Zehra Nur

    2017-09-01

    In recent years, electron beam processing has been widely used for medical and industrial applications. Electron beam accelerators are reliable and durable equipment that can produce ionizing radiation when needed for a particular commercial use. On the industrial scale, accelerators are used to generate electrons in the 0.1-100 MeV energy range. These accelerators are used mainly in the plastics, automotive, wire and electric cable, semiconductor, health care, aerospace and environmental industries, as well as in numerous research applications. This study presents the current applications of electron beam processing in medicine and industry. A planned design study for such a system in the 200-300 keV energy range is also introduced.

  10. Status of the ITER Cryodistribution

    NASA Astrophysics Data System (ADS)

    Chang, H.-S.; Vaghela, H.; Patel, P.; Rizzato, A.; Cursan, M.; Henry, D.; Forgeas, A.; Grillot, D.; Sarkar, B.; Muralidhara, S.; Das, J.; Shukla, V.; Adler, E.

    2017-12-01

    Since the conceptual design of the ITER Cryodistribution, many modifications have been applied, owing both to system optimization and to improved knowledge of the clients' requirements. Process optimizations in the Cryoplant resulted in component simplifications, whereas increased heat loads in some of the superconducting magnet systems required a more complicated process configuration; at the same time, standardization of the component arrangement made it possible to remove one cold box. Another cold box, planned for redundancy, has been removed due to a modification of the Tokamak in-Cryostat piping layout. In this paper we summarize the present design status and component configuration of the ITER Cryodistribution, with all implemented changes aimed at process optimization and simplification as well as operational reliability, stability and flexibility.

  11. Dose control in electron beam processing: Comparison of results from a graphite charge collector, routine dosimeters and the ISS alanine-based dosimeter

    NASA Astrophysics Data System (ADS)

    Fuochi, P. G.; Onori, S.; Casali, F.; Chirco, P.

    1993-10-01

    A 12 MeV linear accelerator is currently used for electron beam processing of power semiconductor devices for lifetime control and, on an experimental basis, for food irradiation, sludge treatment, etc. In order to control the irradiation process, a simple, quick and reliable method for direct evaluation of dose and fluence in a broad electron beam has been developed. This paper presents the results obtained using a "charge collector", which measures the charge absorbed in a graphite target exposed in air. Calibration of the system with a super-Fricke dosimeter and comparison of absorbed-dose results obtained with plastic dosimeters and alanine pellets are discussed.
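
    The physics behind a charge collector lends itself to a short worked example. Assuming the collected charge Q counts the incident electrons, the fluence is Q/(e·A) for collector area A, and the dose to graphite is approximately fluence times the mass collision stopping power. The numerical values below, including S/rho of roughly 1.8 MeV·cm²/g for ~12 MeV electrons in graphite, are indicative assumptions, not data from the paper.

      # Dose estimate from a graphite charge collector (illustrative values).
      E_CHARGE = 1.602e-19   # C, elementary charge
      MEV_TO_J = 1.602e-13   # J per MeV

      Q = 2.0e-6             # C, collected charge (assumed)
      A = 25.0               # cm^2, collector area (assumed)
      S_RHO = 1.8            # MeV cm^2/g, mass collision stopping power (approx.)

      fluence = Q / (E_CHARGE * A)                  # electrons / cm^2
      dose_gy = fluence * S_RHO * MEV_TO_J * 1e3    # Gy = J/kg; 1e3 converts /g -> /kg
      print(f"fluence: {fluence:.3e} e/cm^2, dose to graphite: {dose_gy:.2f} Gy")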

  12. Validation of the World Health Organization tool for situational analysis to assess emergency and essential surgical care at district hospitals in Ghana.

    PubMed

    Osen, Hayley; Chang, David; Choo, Shelly; Perry, Henry; Hesse, Afua; Abantanga, Francis; McCord, Colin; Chrouser, Kristin; Abdullah, Fizan

    2011-03-01

    The World Health Organization (WHO) Tool for Situational Analysis to Assess Emergency and Essential Surgical Care (hereafter called the WHO Tool) has been used in more than 25 countries and is the largest effort to assess surgical care in the world. However, it has not yet been independently validated. Test-retest reliability is one way to assess the degree to which a test instrument is free from random error. The aim of the present field study was to determine the test-retest reliability of the WHO Tool. The WHO Tool was mailed to 10 district hospitals in Ghana. Written instructions were provided along with a letter from the Ghana Health Services requesting the hospital administrator to complete the survey tool. After ensuring delivery and completion of the forms, the study team readministered the WHO Tool at the time of an on-site visit less than 1 month later. The results of the two tests were compared to calculate kappa statistics for each of the 152 questions in the WHO Tool; the kappa statistic measures the degree of agreement above what would be expected by chance alone. Ten hospitals were thus surveyed twice over a short interval (less than 1 month), and weighted and unweighted kappa statistics were calculated for the 152 questions. The median unweighted kappa for the entire survey was 0.43 (interquartile range 0-0.84). The infrastructure section (24 questions) had a median kappa of 0.81; the human resources section (13 questions) had a median kappa of 0.77; the surgical procedures section (67 questions) had a median kappa of 0.00; and the emergency surgical equipment section (48 questions) had a median kappa of 0.81. Survey questions related to infrastructure characteristics thus had high reliability, whereas questions related to process of care had poor reliability and may benefit from supplemental data gathered by direct observation. Limitations of the study include the small sample size: 10 district hospitals in a single country. The consistently high correlations observed for structure and setting suggest that the WHO Tool for Situational Analysis is reliable where it measures structure and setting, but it should be revised for measuring process of care.
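
    For readers unfamiliar with the statistic, per-question test-retest agreement can be computed as in the hedged sketch below; the answer vectors are fabricated for illustration, and scikit-learn's cohen_kappa_score supports both the unweighted and the linearly weighted variants reported above.

      from sklearn.metrics import cohen_kappa_score

      # Hypothetical answers to one ordinal question from the 10 hospitals,
      # collected at the first (mail) and second (on-site) administration.
      first  = [0, 1, 2, 2, 1, 0, 2, 1, 1, 2]
      second = [0, 1, 2, 1, 1, 0, 2, 2, 1, 2]

      print("unweighted kappa:", cohen_kappa_score(first, second))
      print("linear weighted kappa:", cohen_kappa_score(first, second, weights="linear"))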

  13. Investigation of improving MEMS-type VOA reliability

    NASA Astrophysics Data System (ADS)

    Hong, Seok K.; Lee, Yeong G.; Park, Moo Y.

    2003-12-01

    MEMS technologies have been applied in many areas, such as optical communications, gyroscopes, bio-medical components and so on. In the optical communication field, MEMS technologies are essential, especially in multi-dimensional optical switches and Variable Optical Attenuators (VOAs). This paper describes the process for the development of MEMS-type VOAs with good optical performance and improved reliability. Generally, MEMS VOAs are fabricated by a silicon micro-machining process, precise fibre alignment and a sophisticated packaging process. Because a VOA is composed of many structures with various materials, it is difficult to make the devices reliable. We developed MEMS-type VOAs with many failure modes considered (FMEA: Failure Mode Effect Analysis) in the initial design step, predicted critical failure factors, revised the design accordingly, and confirmed the reliability by preliminary testing. The predicted failure factors were moisture, the bonding strength of the wires between the MEMS chip and the TO-CAN, and instability of the supplied signals. Statistical quality control tools (ANOVA, t-test and so on) were used to control these potential failure factors and produce optimum manufacturing conditions. To sum up, we have successfully developed reliable MEMS-type VOAs with good optical performance by controlling potential failure factors and using statistical quality control tools. As a result, the developed VOAs passed international reliability standards (Telcordia GR-1221-CORE).

  15. Development of an instrument to understand the child protective services decision-making process, with a focus on placement decisions.

    PubMed

    Dettlaff, Alan J; Christopher Graham, J; Holzman, Jesse; Baumann, Donald J; Fluke, John D

    2015-11-01

    When children come to the attention of the child welfare system, they become involved in a decision-making process in which decisions are made that have a significant effect on their future and well-being. The decision to remove children from their families is particularly complex; yet surprisingly little is understood about this decision-making process. This paper presents the results of a study to develop an instrument to explore, at the caseworker level, the context of the removal decision, with the objective of understanding the influence of the individual and organizational factors on this decision, drawing from the Decision Making Ecology as the underlying rationale for obtaining the measures. The instrument was based on the development of decision-making scales used in prior decision-making studies and administered to child protection caseworkers in several states. Analyses included reliability analyses, principal components analyses, and inter-correlations among the resulting scales. For one scale regarding removal decisions, a principal components analysis resulted in the extraction of two components, jointly identified as caseworkers' decision-making orientation, described as (1) an internal reference to decision-making and (2) an external reference to decision-making. Reliability analyses demonstrated acceptable to high internal consistency for 9 of the 11 scales. Full details of the reliability analyses, principal components analyses, and inter-correlations among the seven scales are discussed, along with implications for practice and the utility of this instrument to support the understanding of decision-making in child welfare. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Monitoring outcomes with relational databases: does it improve quality of care?

    PubMed

    Clemmer, Terry P

    2004-12-01

    There are 3 key ingredients in improving quality of medical care: 1) using a scientific process of improvement, 2) executing the process at the lowest possible level in the organization, and 3) measuring the results of any change reliably. Relational databases, when used within these guidelines, are of great value in these efforts if they contain reliable information that is pertinent to the project and are used in a scientific process of quality improvement by a front-line team. Unfortunately, the data are frequently unreliable and/or not pertinent to the local process, and are used by persons at very high levels in the organization without a scientific process and without reliable measurement of the outcome. Under these circumstances the effectiveness of relational databases in improving care is marginal at best, frequently wasteful, and potentially harmful. This article explores examples of these concepts.

  17. Probabilistic Finite Element Analysis & Design Optimization for Structural Designs

    NASA Astrophysics Data System (ADS)

    Deivanayagam, Arumugam

    This study focuses on incorporating the probabilistic nature of material properties (Kevlar® 49) into the existing deterministic finite element analysis (FEA) of fabric-based engine containment systems through Monte Carlo simulations (MCS), and on implementing probabilistic analysis in engineering design through Reliability Based Design Optimization (RBDO). First, the emphasis is on experimental data analysis, focusing on probabilistic distribution models that characterize the randomness associated with the experimental data. The material properties of Kevlar® 49 are modeled using experimental data analysis and implemented along with an existing spiral modeling scheme (SMS) and a user-defined constitutive model (UMAT) for fabric-based engine containment simulations in LS-DYNA. MCS of the model are performed to observe the failure patterns and exit velocities, and the solutions are compared with NASA experimental tests and deterministic results. MCS with probabilistic material data give a better perspective on the results than a single deterministic simulation. The next part of the research is to implement the probabilistic material properties in engineering design. The main aim of structural design is to obtain optimal solutions. However, even though the structures produced by deterministic optimization are cost-effective, they become highly unreliable if the uncertainty that may be associated with the system (material properties, loading, etc.) is not represented in the solution process. A reliable and optimal solution can be obtained by performing reliability analysis along with deterministic optimization, which is RBDO. In the RBDO problem formulation, reliability constraints are considered in addition to structural performance constraints. This part of the research starts with an introduction to reliability analysis, such as first-order and second-order reliability analysis, followed by the simulation techniques that are used to obtain the probability of failure and reliability of structures. Next, a decoupled RBDO procedure is proposed with a new reliability analysis formulation with sensitivity analysis, which is performed to remove the highly reliable constraints in the RBDO, thereby reducing the computational time and function evaluations. Finally, the implementation of the reliability analysis concepts and RBDO in 2D truss and planar beam finite element problems is presented and discussed.
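
    The first-order reliability idea referenced above reduces, for a linear limit state with normal variables, to a closed form: with g = R - S, the reliability index is beta = (mu_R - mu_S)/sqrt(sigma_R^2 + sigma_S^2) and Pf = Phi(-beta). The sketch below checks that formula against a Monte Carlo estimate; all distribution parameters are made-up illustration values, not data from the study.

      import numpy as np
      from scipy.stats import norm

      mu_r, sd_r = 600.0, 50.0   # capacity (assumed normal)
      mu_s, sd_s = 400.0, 60.0   # demand (assumed normal)

      # First-order (here exact, since g = R - S is normal) reliability index.
      beta = (mu_r - mu_s) / np.hypot(sd_r, sd_s)
      pf_form = norm.cdf(-beta)

      # Monte Carlo cross-check.
      rng = np.random.default_rng(7)
      n = 1_000_000
      g = rng.normal(mu_r, sd_r, n) - rng.normal(mu_s, sd_s, n)
      pf_mcs = np.mean(g < 0)

      print(f"beta = {beta:.3f}, Pf(FORM) = {pf_form:.3e}, Pf(MCS) = {pf_mcs:.3e}")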

  18. Software Reliability Analysis of NASA Space Flight Software: A Practical Experience

    PubMed Central

    Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S.; Mcginnis, Issac

    2017-01-01

    In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during flight software development. We find that this software was developed in multiple releases, each release spanning all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, ranging from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the underlying causes. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Such releases also seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. They exhibit poor reliability growth and hence a high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to management to improve the software development process. As NASA has moved towards product line engineering for its flight software development, software for future space missions will be developed in a similar manner, and hence the analysis results for this mission can be considered a baseline for future flight software missions. PMID:29278255
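
    To make the SRGM fitting step concrete, the hedged sketch below fits the delayed S-shaped NHPP mean value function m(t) = a(1 - (1 + bt)e^(-bt)) to a synthetic cumulative-defect curve with scipy; the defect counts are fabricated, and the paper's actual model selection also considered Log-Logistic and other candidates.

      import numpy as np
      from scipy.optimize import curve_fit

      def s_shaped_mvf(t, a, b):
          """Delayed S-shaped NHPP mean value function."""
          return a * (1.0 - (1.0 + b * t) * np.exp(-b * t))

      # Synthetic data: cumulative defects observed at the end of each test week.
      weeks = np.arange(1, 21, dtype=float)
      cum_defects = np.array([1, 3, 7, 13, 20, 28, 36, 43, 49, 54,
                              58, 61, 63, 65, 66, 67, 68, 68, 69, 69], dtype=float)

      (a_hat, b_hat), _ = curve_fit(s_shaped_mvf, weeks, cum_defects, p0=(70.0, 0.2))
      print(f"estimated total defects a = {a_hat:.1f}, shape b = {b_hat:.3f}")
      print("predicted residual defects:", a_hat - cum_defects[-1])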

  19. Translation, cross-cultural adaptation to Brazilian-Portuguese and reliability analysis of the instrument Rapid Entire Body Assessment-REBA

    PubMed Central

    Lamarão, Andressa M.; Costa, Lucíola C. M.; Comper, Maria L. C.; Padula, Rosimeire S.

    2014-01-01

    Background: Observational instruments, such as the Rapid Entire Body Assessment (REBA), quickly assess biomechanical risks present in the workplace. However, in order to use these instruments, it is necessary to conduct a translation/cross-cultural adaptation of the instrument and test its measurement properties. Objectives: To perform the translation and cross-cultural adaptation into Brazilian-Portuguese and to test the reliability of the REBA instrument. Method: The procedures of translation and cross-cultural adaptation into Brazilian-Portuguese followed proposed guidelines involving translation, synthesis of translations, back translation, committee review and testing of the pre-final version. In addition, reliability and intra- and inter-rater percent agreement were obtained with the linear weighted kappa coefficient, together with 95% confidence intervals and 2×2 cross-tabulations. Results: The procedures for translation and adaptation were adequate, and the necessary adjustments were made to the instrument. The intra- and inter-rater reliability showed values of 0.104 to 0.504, ranging from very poor to moderate. The percent agreement values ranged from 5.66% to 69.81%. The percent agreement was closest to 100% for the item 'upper arm' (69.81%) for intra-rater 1 and for the items 'legs' and 'upper arm' (62.26%) for intra-rater 2. Conclusions: The processes of translation and cross-cultural adaptation of the REBA instrument were completed and the Brazilian version of the instrument was obtained. However, despite the rigor of the procedures used to translate and adapt the instrument, the reliability values are unacceptable according to standard guidelines, indicating that the reliability must be re-evaluated. Therefore, caution should be taken in interpreting the biomechanical risks measured by this instrument. PMID:25003273

  1. A Study of the Readiness of Hospitals for Implementation of High Reliability Organizations Model in Tehran University of Medical Sciences.

    PubMed

    Mousavi, Seyed Mohammad Hadi; Dargahi, Hossein; Mohammadi, Sara

    2016-10-01

    Creating a safe health care system requires the establishment of High Reliability Organizations (HROs), which reduce errors and increase the level of safety in hospitals. This model focuses on improving reliability through better process design, building a culture of accreditation, and leveraging human factors. The present study intends to determine the readiness of the hospitals of Tehran University of Medical Sciences for the establishment of the HRO model, from the viewpoint of the managers of these hospitals. This is a descriptive-analytical study carried out in 2013-2014. The research population consists of 105 senior and middle managers of 15 hospitals of Tehran University of Medical Sciences. The data collection tool was a 55-question researcher-made questionnaire that covered six elements of HROs in order to assess the level of readiness for establishing the HRO model from the managers' point of view. The validity of the questionnaire was established through the content validity method using 10 experts in the area of hospital accreditation, and its reliability was established through the test-retest method, with a correlation coefficient of 0.90. The response rate was 90 percent. A Likert scale was used for the questions, and data analysis was conducted with SPSS version 21. Descriptive statistics were presented via tables, distributions of the data, and means. Analytical methods, including the t-test, Mann-Whitney, Spearman, and Kruskal-Wallis tests, were used for inferential statistics. The study showed that, from the viewpoint of the senior and middle managers of the hospitals considered in this study, these hospitals are indeed ready for acceptance and establishment of the HRO model. A significant relationship was shown between the HRO model and its elements and the demographic details of the managers, such as their age, work experience, management experience, and level of management. Although the studied hospitals, as viewed by their managers, are capable of attaining the goals of HROs, there appear to be many challenges along the way. Therefore, it is suggested that a detailed audit of the hospitals' current status regarding the different characteristics of HROs be conducted, and that workshops be held for medical and non-medical employees and managers of hospitals as an influencing factor; a re-assessment process afterward can help move the hospitals from their current position towards an HRO culture.

  2. Reliability Prediction Analysis: Airborne System Results and Best Practices

    NASA Astrophysics Data System (ADS)

    Silva, Nuno; Lopes, Rui

    2013-09-01

    This article presents the results of several reliability prediction analyses for aerospace components, performed with both methodologies, 217F and 217Plus. Supporting and complementary activities are described, as well as the differences concerning the results and the applications of the two methodologies, summarized in a set of lessons learned that are very useful for RAMS and safety prediction practitioners. The effort required for these activities is also an important point of discussion, as are the end results and their interpretation/impact on the system design. The article concludes by positioning these activities and methodologies within an overall process for space and aeronautics equipment/component certification, and by highlighting their advantages. Some good practices are summarized and some reuse rules are laid down.

  3. NASA flight cell and battery issues

    NASA Technical Reports Server (NTRS)

    Schulze, N. R.

    1989-01-01

    The author presents the important battery and cell problems, encompassing both test failures and accidents, which were encountered during the past year. Practical issues facing programs, which have to be considered in the development of a battery program strategy, are addressed. The problems of one program, the GRO (Gamma Ray Observatory), during the past year are examined to illustrate the fundamental types of battery problems that occur. Problems encountered by other programs are briefly mentioned to complete the accounting. Two major categories of issues are defined: those which are quality and design related, i.e., problems having inherent manufacturing-process-related aspects with an impact on cell reliability; and those which are accident-triggered or man-induced, i.e., operational issues having an impact on battery and cell reliability.

  4. Effective Crack Detection in Railway Axles Using Vibration Signals and WPT Energy.

    PubMed

    Gómez, María Jesús; Corral, Eduardo; Castejón, Cristina; García-Prada, Juan Carlos

    2018-05-17

    Crack detection for railway axles is key to avoiding catastrophic accidents; currently, non-destructive testing is used for that purpose. The present work applies vibration signal analysis to diagnose cracks in real railway axles installed on a real Y21 bogie operating on a test rig. Vibration signals were obtained from two wheelsets with cracks at the middle section of the axle, with depths from 5.7 to 15 mm, at several conditions of load and speed. The vibration signals were processed by means of wavelet packet transform (WPT) energy. The energies obtained were used to train an artificial neural network, with reliable diagnosis results. The detection rate for 5.7 mm defects was 96.27%, and the reliability in detecting larger defects reached almost 100%, with a false alarm rate lower than 5.5%.
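
    The WPT-energy feature extraction can be sketched as below with PyWavelets; the wavelet family, decomposition level, and the synthetic signal are assumptions for illustration, and the resulting energy vector is what would be fed to the neural network classifier.

      import numpy as np
      import pywt

      def wpt_energy_features(signal, wavelet="db4", level=3):
          """Relative energy of each terminal node of a wavelet packet tree."""
          wp = pywt.WaveletPacket(data=signal, wavelet=wavelet,
                                  mode="symmetric", maxlevel=level)
          energies = np.array([np.sum(node.data ** 2)
                               for node in wp.get_level(level, order="natural")])
          return energies / energies.sum()  # normalize so the features sum to 1

      # Synthetic vibration signal: one rotation harmonic plus noise.
      fs, n = 5000, 4096
      t = np.arange(n) / fs
      x = np.sin(2 * np.pi * 30 * t) + 0.3 * np.random.default_rng(3).normal(size=n)

      features = wpt_energy_features(x)
      print("feature vector (2**3 = 8 bands):", np.round(features, 4))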

  5. Test-retest reliability of evoked BOLD signals from a cognitive-emotive fMRI test battery.

    PubMed

    Plichta, Michael M; Schwarz, Adam J; Grimm, Oliver; Morgen, Katrin; Mier, Daniela; Haddad, Leila; Gerdes, Antje B M; Sauer, Carina; Tost, Heike; Esslinger, Christine; Colman, Peter; Wilson, Frederick; Kirsch, Peter; Meyer-Lindenberg, Andreas

    2012-04-15

    Even more than in cognitive research applications, moving fMRI to the clinic and the drug development process requires the generation of stable and reliable signal changes. The performance characteristics of the fMRI paradigm constrain experimental power and may require different study designs (e.g., crossover vs. parallel groups), yet fMRI reliability characteristics can be strongly dependent on the nature of the fMRI task. The present study investigated both within-subject and group-level reliability of a combined three-task fMRI battery targeting three systems of wide applicability in clinical and cognitive neuroscience: an emotional (face matching), a motivational (monetary reward anticipation) and a cognitive (n-back working memory) task. A group of 25 young, healthy volunteers were scanned twice on a 3T MRI scanner with a mean test-retest interval of 14.6 days. FMRI reliability was quantified using the intraclass correlation coefficient (ICC) applied at three different levels ranging from a global to a localized and fine spatial scale: (1) reliability of group-level activation maps over the whole brain and within targeted regions of interest (ROIs); (2) within-subject reliability of ROI-mean amplitudes and (3) within-subject reliability of individual voxels in the target ROIs. Results showed robust evoked activation of all three tasks in their respective target regions (emotional task=amygdala; motivational task=ventral striatum; cognitive task=right dorsolateral prefrontal cortex and parietal cortices) with high effect sizes (ES) of ROI-mean summary values (ES=1.11-1.44 for the faces task, 0.96-1.43 for the reward task, 0.83-2.58 for the n-back task). Reliability of group level activation was excellent for all three tasks with ICCs of 0.89-0.98 at the whole brain level and 0.66-0.97 within target ROIs. Within-subject reliability of ROI-mean amplitudes across sessions was fair to good for the reward task (ICCs=0.56-0.62) and, dependent on the particular ROI, also fair-to-good for the n-back task (ICCs=0.44-0.57) but lower for the faces task (ICC=-0.02-0.16). In conclusion, all three tasks are well suited to between-subject designs, including imaging genetics. When specific recommendations are followed, the n-back and reward task are also suited for within-subject designs, including pharmaco-fMRI. The present study provides task-specific fMRI reliability performance measures that will inform the optimal use, powering and design of fMRI studies using comparable tasks. Copyright © 2012 Elsevier Inc. All rights reserved.
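
    Since the ICC carries the weight of the reliability claims above, a small self-contained implementation may help: the sketch below computes ICC(3,1), the two-way mixed, consistency, single-measurement form commonly used for test-retest data, from a subjects-by-sessions matrix. The data are fabricated, and the study's exact ICC variant is not specified in the abstract.

      import numpy as np

      def icc_3_1(x):
          """ICC(3,1) from an (n subjects) x (k sessions/raters) matrix,
          via the two-way ANOVA decomposition of Shrout & Fleiss (1979)."""
          x = np.asarray(x, dtype=float)
          n, k = x.shape
          grand = x.mean()
          ms_rows = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)   # subjects
          ss_err = np.sum((x - x.mean(axis=1, keepdims=True)
                             - x.mean(axis=0, keepdims=True) + grand) ** 2)
          ms_err = ss_err / ((n - 1) * (k - 1))
          return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

      # Fabricated ROI-mean amplitudes for 25 subjects at two sessions.
      rng = np.random.default_rng(5)
      true_signal = rng.normal(1.0, 0.5, size=(25, 1))
      data = true_signal + rng.normal(0.0, 0.3, size=(25, 2))
      print(f"ICC(3,1) = {icc_3_1(data):.3f}")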

  6. Feature extraction algorithm for space targets based on fractal theory

    NASA Astrophysics Data System (ADS)

    Tian, Balin; Yuan, Jianping; Yue, Xiaokui; Ning, Xin

    2007-11-01

    In order to offer a potential for extending the life of satellites and reducing launch and operating costs, satellite servicing, including conducting repairs, upgrading and refueling spacecraft on-orbit, will become much more frequent. Future space operations can be executed more economically and reliably using machine vision systems, which can meet the real-time and tracking reliability requirements for image tracking in space surveillance systems. Machine vision has been applied to the problem of relative pose estimation for spacecraft, and feature extraction is the basis of relative pose estimation. In this paper, a fractal-geometry-based edge extraction algorithm is presented, which can be used in machine vision systems for determining and tracking the relative pose of an observed satellite during proximity operations. The method computes the fractal-dimension distribution of the gray-level image, using the Differential Box-Counting (DBC) approach of fractal theory to restrain the noise. After this, the consecutive edge is detected using mathematical morphology. The validity of the proposed method is examined by processing and analyzing images of space targets. The edge extraction method not only extracts the outline of the target, but also keeps the inner details. Meanwhile, edge extraction is processed only in the moving area, which greatly reduces computation. Simulation results compare edge detection using the presented method with other detection methods. The results indicate that the presented algorithm is a valid method for solving the relative pose problem for spacecraft.
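
    The DBC step admits a compact implementation. The sketch below estimates the fractal dimension of a gray-level image by counting, for several box sizes, how many gray-level boxes each grid cell spans, then fitting log N(s) against log(1/s). The box sizes and test image are arbitrary choices, and the original paper builds a per-pixel dimension map rather than the single global estimate shown here.

      import numpy as np

      def dbc_dimension(img, box_sizes=(2, 4, 8, 16, 32)):
          """Differential box-counting (Sarkar & Chaudhuri) fractal dimension
          of a square gray-level image with values in [0, 255]."""
          m = min(img.shape)
          g = 256  # number of gray levels
          log_inv_s, log_n = [], []
          for s in box_sizes:
              h = max(1, s * g // m)  # box height in gray levels
              count = 0
              for i in range(0, m - s + 1, s):
                  for j in range(0, m - s + 1, s):
                      block = img[i:i + s, j:j + s]
                      count += int(block.max()) // h - int(block.min()) // h + 1
              log_inv_s.append(np.log(1.0 / s))
              log_n.append(np.log(count))
          slope, _ = np.polyfit(log_inv_s, log_n, 1)
          return slope

      rng = np.random.default_rng(9)
      patch = rng.integers(0, 256, size=(128, 128))  # noise: dimension near 3
      print(f"estimated fractal dimension: {dbc_dimension(patch):.2f}")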

  7. Process-based costing.

    PubMed

    Lee, Robert H; Bott, Marjorie J; Forbes, Sarah; Redford, Linda; Swagerty, Daniel L; Taunton, Roma Lee

    2003-01-01

    Understanding how quality improvement affects costs is important. Unfortunately, low-cost, reliable ways of measuring direct costs are scarce. This article builds on the principles of process improvement to develop a costing strategy that meets both criteria. Process-based costing has 4 steps: developing a flowchart, estimating resource use, valuing resources, and calculating direct costs. To illustrate the technique, this article uses it to cost the care planning process in 3 long-term care facilities. We conclude that process-based costing is easy to implement; generates reliable, valid data; and allows nursing managers to assess the costs of new or modified processes.
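
    As a toy illustration of the four steps, the sketch below walks a flowchart of care-planning activities, multiplies estimated resource use by unit values, and sums the direct costs; every activity name, time, and rate is hypothetical.

      # Process-based costing: flowchart steps -> resource use -> valuation -> cost.
      # All activities, times and rates below are hypothetical illustration values.
      care_planning_process = [
          # (activity, staff role, minutes per resident, hourly rate in $)
          ("chart review",         "RN",                20, 38.0),
          ("resident assessment",  "RN",                45, 38.0),
          ("team care conference", "interdisciplinary", 30, 52.0),  # blended rate
          ("care plan write-up",   "RN",                25, 38.0),
      ]

      total = 0.0
      for activity, role, minutes, rate in care_planning_process:
          cost = minutes / 60.0 * rate
          total += cost
          print(f"{activity:22s} ({role}): ${cost:6.2f}")
      print(f"direct cost per resident: ${total:.2f}")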

  8. A Model for Estimating the Reliability and Validity of Criterion-Referenced Measures.

    ERIC Educational Resources Information Center

    Edmonston, Leon P.; Randall, Robert S.

    A decision model designed to determine the reliability and validity of criterion referenced measures (CRMs) is presented. General procedures which pertain to the model are discussed as to: Measures of relationship, Reliability, Validity (content, criterion-oriented, and construct validation), and Item Analysis. The decision model is presented in…

  9. Gearbox Reliability Collaborative Projects | Wind | NREL

    Science.gov Websites

    Fragmentary website snippets: the collaborative, with its partners, documented and analyzed the design process and produced a new drivetrain design as part of DRC (Drivetrain Reliability Collaborative) activities; the new design represents the next phase in the DRC project and relates to the design loads required by wind turbine gearbox design and testing standards.

  10. The Reliability and Validity of a Performance Task for Evaluating Science Process Skills.

    ERIC Educational Resources Information Center

    Adams, Cheryll M.; Callahan, Carolyn M.

    1995-01-01

    The Diet Cola Test was designed as a process assessment of science aptitude in intermediate grade students. Investigations of the instrument's reliability and validity indicated that data did not support use of the instrument for identifying individual students' aptitude. However, results suggested the test's appropriateness for evaluating…

  11. The Impact of Process Capability on Service Reliability for Critical Infrastructure Providers

    ERIC Educational Resources Information Center

    Houston, Clemith J., Jr.

    2013-01-01

    This study investigated the relationship between organizational processes that have been identified as promoting resiliency and their impact on service reliability within the scope of critical infrastructure providers. The importance of critical infrastructure to the nation is evident from the body of research and is supported by instances where…

  12. 75 FR 35689 - System Personnel Training Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-23

    ... using realistic simulations [footnotes: Id. P 1331; Reliability Standard PER-002-0]. In Order No. ... development process to: (1) include formal training requirements for reliability coordinators similar to those ... simulation technology such as a simulator, virtual technology, or other technology in their emergency ...

  13. Limits on reliable information flows through stochastic populations.

    PubMed

    Boczkowski, Lucas; Natale, Emanuele; Feinerman, Ofer; Korman, Amos

    2018-06-06

    Biological systems can share and collectively process information to yield emergent effects, despite inherent noise in communication. While man-made systems often employ intricate structural solutions to overcome noise, the structure of many biological systems is more amorphous. It is not well understood how communication noise may affect the computational repertoire of such groups. To approach this question we consider the basic collective task of rumor spreading, in which information from a few knowledgeable sources must reliably flow into the rest of the population. We study the effect of communication noise on the ability of groups that lack stable structures to efficiently solve this task. We present an impossibility result which strongly restricts reliable rumor spreading in such groups. Namely, we prove that, in the presence of even moderate levels of noise that affect all facets of the communication, no scheme can significantly outperform the trivial one in which agents have to wait until directly interacting with the sources, a process which requires linear time in the population size. Our results imply that in order to achieve efficient rumor spreading a system must exhibit either some degree of structural stability or, alternatively, some facet of the communication which is immune to noise. We then corroborate this claim by providing new analyses of experimental data regarding recruitment in Cataglyphis niger desert ants. Finally, in light of our theoretical results, we discuss strategies to overcome noise in other biological systems.

  14. Dissociable Brain States Linked to Common and Creative Object Use

    PubMed Central

    Chrysikou, Evangelia G.; Thompson-Schill, Sharon L.

    2013-01-01

    Studies of conceptual processing have revealed that the prefrontal cortex is implicated in close-ended, deliberate memory retrieval, especially the left ventrolateral prefrontal regions. However, much of human thought—particularly that which is characterized as creative—requires more open-ended, spontaneous memory retrieval. To explore the neural systems that support conceptual processing under these two distinct circumstances, we obtained functional magnetic resonance images from 24 participants either while retrieving the common use of an everyday object (e.g., “blowing your nose,” in response to a picture of a tissue) or while generating a creative (i.e., uncommon but plausible) use for it (e.g., “protective padding in a package”). The patterns of activation during open- and closed-ended tasks were reliably different, with regard to the magnitude of anterior versus posterior activation. Specifically, the close-ended task (i.e., Common Use task) reliably activated regions of lateral prefrontal cortex, whereas the open-ended task (i.e., Uncommon Use task) reliably activated regions of occipito-temporal cortex. Furthermore, there was variability across subjects in the types of responses produced on the open-ended task that was associated with the magnitude of activation in the middle occipital gyrus on this task. The present experiment is the first to demonstrate a dynamic tradeoff between anterior frontal and posterior occipitotemporal regions brought about by the close- or open-ended task demands. PMID:20533561

  15. Reliability program requirements for aeronautical and space system contractors

    NASA Technical Reports Server (NTRS)

    1987-01-01

    General reliability program requirements are prescribed for NASA contracts involving the design, development, fabrication, test, and/or use of aeronautical and space systems, including critical ground support equipment. The requirements call for (1) thorough planning and effective management of the reliability effort; (2) definition of the major reliability tasks and their place as an integral part of the design and development process; (3) planning and evaluating the reliability of the system and its elements (including effects of software interfaces) through a program of analysis, review, and test; and (4) timely status indication by formal documentation and other reporting to facilitate control of the reliability program.

  16. Precision laser processing for micro electronics and fiber optic manufacturing

    NASA Astrophysics Data System (ADS)

    Webb, Andrew; Osborne, Mike; Foster-Turner, Gideon; Dinkel, Duane W.

    2008-02-01

    The application of laser-based materials processing for precision micro-scale manufacturing in the electronics and fiber optic industries is becoming increasingly widespread and accepted. This presentation will review the latest laser technologies available and discuss the issues to be considered in choosing the most appropriate laser and processing parameters. High repetition rate, short-pulse-duration lasers have improved rapidly in recent years in terms of both performance and reliability, enabling flexible, cost-effective processing of many material types including metal, silicon, plastic, ceramic and glass. Demonstrating the relevance of laser micromachining, application examples where laser processing is in production use will be presented. These include the miniaturization of surface mount capacitors by applying a laser technique for demetalization of tracks in the capacitor manufacturing process, and high-quality laser machining of fiber optics, including stripping, cleaving and lensing, resulting in optical-quality finishes without the need for traditional polishing. Applications include telecoms, biomedical and sensing. OpTek Systems was formed in 2000 and provides fully integrated systems and subcontract services for laser processes. The company is headquartered in the UK and is establishing a presence in North America through a laser processing facility in South Carolina and a sales office in the North East.

  17. Investigation of Cd1-xMgxTe as possible materials for X and gamma ray detectors

    NASA Astrophysics Data System (ADS)

    Mycielski, Andrzej; Kochanowska, Dominika M.; Witkowska-Baran, Marta; Wardak, Aneta; Szot, Michał; Domagała, Jarosław; Witkowski, Bartłomiej S.; Jakieła, Rafał; Kowalczyk, Leszek; Witkowska, Barbara

    2018-06-01

    In recent years, a series of investigations has been devoted to the possibility of using crystals based on CdTe with the addition of magnesium (Mg) for X and gamma radiation detectors. Since we have wide technological possibilities for preparing crystals and investigating their properties, we performed crystallizations of the crystals mentioned above and then investigated selected properties of the obtained materials. The crystallization processes were performed using the Low Pressure Bridgman (LPB) method. The elements used (Cd, Te, Mg) were of the highest purity available at present. In order to obtain reliable conclusions, the crystallization processes were carried out under identical technological conditions. The details of our technological method and the results of the investigation of the physical properties of the samples are presented below.

  18. The reliability and validity of qualitative scores for the Controlled Oral Word Association Test.

    PubMed

    Ross, Thomas P; Calhoun, Emily; Cox, Tara; Wenner, Carolyn; Kono, Whitney; Pleasant, Morgan

    2007-05-01

    The reliability and validity of two qualitative scoring systems for the Controlled Oral Word Association Test [Benton, A. L., Hamsher, de S. K., & Sivan, A. B. (1983). Multilingual Aphasia Examination (2nd ed.). Iowa City, IA: AJA Associates] were examined in 108 healthy young adults. The scoring systems developed by Troyer et al. [Troyer, A. K., Moscovitch, M., & Winocur, G. (1997). Clustering and switching as two components of verbal fluency: Evidence from younger and older healthy adults. Neuropsychology, 11, 138-146] and by Abwender et al. [Abwender, D. A., Swan, J. G., Bowerman, J. T., & Connolly, S. W. (2001a). Qualitative analysis of verbal fluency output: Review and comparison of several scoring methods. Assessment, 8, 323-336] each demonstrated excellent interrater reliability (all indices at or above r(icc)=.9). Consistent with previous research [e.g., Ross, T. P. (2003). The reliability of cluster and switch scores for the COWAT. Archives of Clinical Neuropsychology, 18, 153-164], test-retest reliability coefficients (N=53; M interval 44.6 days) for the qualitative scores were modest to poor (r(icc)=.6 to .4 range). Correlations among COWAT scores and measures of executive functioning, verbal learning, working memory, and vocabulary were examined. The idea that qualitative scores represent distinct executive functions such as cognitive flexibility or strategy utilization was not supported. We offer the interpretation that COWAT performance may require the ability to retrieve words in a non-routine manner while suppressing habitual responses and associated processing interference, presumably due to a spread of activation across semantic or lexical networks. This interpretation, though speculative at present, implies that clustering and switching on the COWAT may not be entirely deliberate, but rather an artifact of a passive (i.e., state-dependent) process. Ideas for future research, most notably experimental studies using cognitive methods (e.g., priming), are discussed.

  19. A prospective study assessing agreement and reliability of a geriatric evaluation.

    PubMed

    Locatelli, Isabella; Monod, Stéfanie; Cornuz, Jacques; Büla, Christophe J; Senn, Nicolas

    2017-07-19

    The present study takes place within a geriatric program aimed at improving the diagnosis and management of geriatric syndromes in primary care. Within this program it was of prime importance to be able to rely on a robust and reproducible geriatric consultation to use as a gold standard for evaluating a primary care brief assessment tool. The specific objective of the present study was thus to assess the agreement and reliability of a comprehensive geriatric consultation. The study was conducted at the outpatient clinic of the Service of Geriatric Medicine, University of Lausanne, Switzerland. All community-dwelling older persons aged 70 years and above were eligible. Patients were excluded if they did not have a primary care physician, if they were unable to speak French, or if they had already been assessed by a geriatrician within the last 12 months. A set of 9 geriatricians evaluated 20 patients, each patient being assessed twice within a 2-month interval. Geriatric consultations were based on a structured evaluation process leading to ratings of the following geriatric conditions: functional, cognitive, visual, and hearing impairment, mood disorders, risk of fall, osteoporosis, malnutrition, and urinary incontinence. Reliability and agreement estimates for each of these items were obtained using a three-way intraclass correlation and a three-way observed disagreement index. The latter allowed a decomposition of overall disagreement into disagreements due to each source of error variability (visit, rater and random). Agreement ranged between 0.62 and 0.85. For most domains, geriatrician-related error variability explained an important proportion of disagreement. Reliability ranged between 0 and 0.8. It was poor to moderate for visual impairment, malnutrition and risk of fall, and good to excellent for functional/cognitive/hearing impairment, osteoporosis, incontinence and mood disorders. Six of the nine items of the geriatric consultation described in this study (functional/cognitive/hearing impairment, osteoporosis, incontinence and mood disorders) present a good to excellent reliability and can safely be used as a reference (gold standard) to evaluate the diagnostic performance of a primary care brief assessment tool. More objective measures are needed to improve the reliability of the malnutrition, visual impairment, and risk of fall assessments before they can serve as a safe gold standard for a primary care tool.

  20. Ultrasonographic measurements of lower trapezius muscle thickness at rest and during isometric contraction: a reliability study.

    PubMed

    Talbott, Nancy R; Witt, Dexter W

    2014-07-01

    The purpose of this study was to determine the intra-rater reliability and inter-rater reliability of ultrasound imaging (USI) thickness measurements of the lower trapezius (LT) at rest and during active contractions when the transverse process and the lamina were used as reference sites for the measurement process. Twenty healthy individuals between the ages of 22 and 32 years volunteered. With the subject prone and the shoulder in 145° of abduction, images of the LT were taken bilaterally by one examiner as the subject: (1) rested; (2) actively held the test position; and (3) actively held the test position while holding a weight. Ten subjects returned and testing was repeated by the same examiner and by a second examiner. LT thickness measurements were recorded at the level of the transverse process and at the level of the lamina. Intra-class correlation coefficients (ICC) for within session intra-rater reliability (ICC3,3) ranged from 0.951 to 0.986 for both measurement sites while between session intra-rater reliability (ICC3,2) ranged from 0.935 to 0.962. Within session inter-rater reliability (ICC2,2) ranged from 0.934 to 0.973. USI can be used to reliably measure LT thickness at rest, during active contraction and during active contraction when holding a weight. The described protocol can be utilized during shoulder examinations to provide an additional assessment tool for monitoring changes in LT thickness.

  1. Addressing Unison and Uniqueness of Reliability and Safety for Better Integration

    NASA Technical Reports Server (NTRS)

    Huang, Zhaofeng; Safie, Fayssal

    2015-01-01

    For a long time, both in theory and in practice, safety and reliability have not been clearly differentiated, which leads to confusion, inefficiency, and sometimes counter-productive practices in executing each of these two disciplines. It is imperative to address the uniqueness and the unison of these two disciplines to help both become more effective and to promote a better integration of the two, with the overall objective of enhancing safety and reliability in our products. There are two purposes of this paper. First, it investigates the uniqueness and unison of each discipline and discusses the interrelationship between the two, for awareness and clarification. Second, after clearly understanding the unique roles and the interrelationship between the two in a product design and development life cycle, we offer suggestions to enhance the disciplines with distinguished and focused roles, to better integrate the two, and to improve the unique sets of skills and tools of the reliability and safety processes. From the uniqueness aspect, the paper identifies and discusses the respective uniqueness of reliability and safety in terms of their roles, accountability, nature of requirements, technical scopes, detailed technical approaches, and analysis boundaries. It is misleading to equate unreliable with unsafe, since a safety hazard may or may not be related to the component, sub-system, or system functions, which are primarily what reliability addresses. Similarly, failing to function may or may not lead to hazard events. Examples are given from aerospace, defense, and consumer products to illustrate the uniqueness of and differences between reliability and safety. From the unison aspect, the paper discusses the commonalities between reliability and safety, and how these two disciplines are linked, integrated, and supplemented with each other to accomplish the customer requirements and product goals. In addition to understanding the uniqueness in reliability and safety, a better understanding of unison and commonalities further helps in understanding the interaction between the two. This paper discusses the unison and uniqueness of reliability and safety, and presents suggestions for better integration of the two disciplines in terms of technical approaches, tools, techniques, and skills to enhance their role in supporting a product design and development life cycle. The paper also discusses eliminating redundant effort and minimizing the overlap of reliability and safety analyses for an efficient implementation of the two disciplines.

  2. Honing process optimization algorithms

    NASA Astrophysics Data System (ADS)

    Kadyrov, Ramil R.; Charikov, Pavel N.; Pryanichnikova, Valeria V.

    2018-03-01

    This article considers the relevance of honing processes for creating high-quality mechanical engineering products. The features of the honing process are revealed and such important concepts as the task for optimization of honing operations, the optimal structure of the honing working cycles, stepped and stepless honing cycles, simulation of processing and its purpose are emphasized. It is noted that the reliability of the mathematical model determines the quality parameters of the honing process control. An algorithm for continuous control of the honing process is proposed. The process model reliably describes the machining of a workpiece in a sufficiently wide area and can be used to operate the CNC machine CC743.

  3. Design Manual: Removal of Fluoride from Drinking Water ...

    EPA Pesticide Factsheets

    This document is an updated version of the Design Manual: Removal of Fluoride from Drinking Water Supplies by Activated Alumina (Rubel, 1984). The manual is an in-depth presentation of the steps required to design and operate a fluoride removal plant using activated alumina (AA), which is a reliable and cost-effective process for treating excess fluoride in drinking water supplies.

  4. Controlling Nonpoint Pollution in Virginia’s Urbanizing Areas: An Institutional Perspective.

    DTIC Science & Technology

    1986-01-20

    programs. As John Naisbitt has stated in his book Megatrends, "The most reliable way to anticipate the future is by understanding the present." Therefore ... Virginia's citizenry must be considered. [Naisbitt, John, Megatrends, Warner Books, New York, N.Y., 1984, p. xxiii.] ... available for the above biological and chemical processes to take place. Findings of the NURP (Nationwide Urban Runoff Program) studies show that

  5. Structure and Distribution of Components in the Working Layer Upon Reconditioning of Parts by Electric-Arc Metallization

    NASA Astrophysics Data System (ADS)

    Skoblo, T. S.; Vlasovets, V. M.; Moroz, V. V.

    2001-11-01

    Reliable data on the structure of the deposited layer are very important due to the considerable instability of the process of deposition of coatings by the method of electric-arc metallization and the strict requirements for reconditioned crankshafts. The present paper is devoted to the structure of coatings obtained from powder wire based on ferrochrome-aluminum with additional alloying elements introduced into the charge.

  6. The advantages of the high voltage solar array for electric propulsion

    NASA Technical Reports Server (NTRS)

    Sater, B. L.

    1973-01-01

    The high voltage solar array (HVSA) offers improvements in efficiency, weight, and reliability for the electric propulsion power system. Conventional power processing, and problems associated with ion thruster operation drawn from SERT 2 experience, are discussed, and the advantages of the HVSA concept for electric propulsion are presented. Tests operating the SERT 2 thruster system in conjunction with the HVSA are reported. Thruster operation was observed to be normal and in some respects improved.

  7. MYRaf: A new Approach with IRAF for Astronomical Photometric Reduction

    NASA Astrophysics Data System (ADS)

    Kilic, Y.; Shameoni Niaei, M.; Özeren, F. F.; Yesilyaprak, C.

    2016-12-01

    In this study, the design and recent development of the MYRaf software for astronomical photometric reduction are presented. MYRaf is an easy-to-use, reliable, and fast GUI tool for IRAF aperture photometry. It is an important step toward the automated software pipeline of robotic telescopes, and it builds on IRAF, PyRAF, matplotlib, ginga, alipy, and SExtractor, using the general-purpose, high-level programming language Python and the Qt framework.
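
    MYRaf itself drives IRAF's photometry tasks through PyRAF; as a rough, library-swapped illustration of the same aperture-photometry step, here is a minimal sketch using photutils instead. The file name, source positions, aperture radius, and zero point are placeholders:

    ```python
    # Minimal aperture-photometry sketch with photutils, standing in for
    # the IRAF/PyRAF tasks MYRaf actually wraps. File name, positions,
    # radius, and zero point are placeholders.
    import numpy as np
    from astropy.io import fits
    from photutils.aperture import CircularAperture, aperture_photometry

    data = fits.getdata("frame.fits")              # hypothetical FITS frame
    positions = [(120.3, 87.6), (301.0, 212.4)]    # source centroids (x, y)
    apertures = CircularAperture(positions, r=6.0) # 6-pixel radius (assumed)

    table = aperture_photometry(data, apertures)
    flux = np.asarray(table["aperture_sum"])
    mag = -2.5 * np.log10(flux) + 25.0             # arbitrary zero point
    print(table)
    print("instrumental magnitudes:", mag)
    ```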

  8. Treatment of the control mechanisms of light airplanes in the flutter clearance process

    NASA Technical Reports Server (NTRS)

    Breitbach, E. J.

    1979-01-01

    It has become more and more evident that many difficulties encountered in the course of aircraft flutter analyses can be traced to strong localized nonlinearities in the control mechanisms. To cope with these problems, more reliable mathematical models paying special attention to control system nonlinearities were established by means of modified ground vibration test procedures in combination with suitably adapted modal synthesis approaches. Three different concepts are presented.

  9. Intelligent Systems for Power Management and Distribution

    NASA Technical Reports Server (NTRS)

    Button, Robert M.

    2002-01-01

    The motivation behind an advanced technology program to develop intelligent power management and distribution (PMAD) systems is described. The program concentrates on developing digital control and distributed processing algorithms for PMAD components and systems to improve their size, weight, efficiency, and reliability. Specific areas of research in developing intelligent DC-DC converters and distributed switchgear are described. Results from recent development efforts are presented along with expected future benefits to the overall PMAD system performance.
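
    The "digital control" the program develops for PMAD components typically means a sampled voltage loop inside each converter. The sketch below shows a discrete PI regulator acting on a crude first-order buck-converter model; the plant time constant and the gains are invented for illustration, not drawn from the program:

    ```python
    # Toy discrete PI voltage loop for a buck converter. The first-order
    # plant model and the kp/ki gains are illustrative assumptions, not a
    # real PMAD design.

    v_ref, v_in = 5.0, 12.0
    kp, ki, dt = 0.05, 20.0, 1e-5   # PI gains and sampling period (assumed)
    tau = 2e-4                      # crude output-filter time constant

    v_out, integ = 0.0, 0.0
    for _ in range(2000):
        err = v_ref - v_out
        integ += err * dt
        duty = min(max(kp * err + ki * integ, 0.0), 1.0)  # clamp to 0..1
        # first-order approximation of the averaged converter dynamics
        v_out += (duty * v_in - v_out) * dt / tau

    print(f"settled output: {v_out:.3f} V (target {v_ref} V)")
    ```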

  10. Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics

    NASA Technical Reports Server (NTRS)

    Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    Numerical simulation has become an integral part of the engineering design process. Critical design decisions are routinely made based on simulation results and conclusions, so verification and validation of the reliability of numerical simulations is vitally important. We propose to develop theories and methodologies that automatically provide quantitative information about the reliability of a numerical simulation, by estimating the numerical approximation error, errors induced by the computational model, and the uncertainties contained in the mathematical models, so that the simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that control error and uncertainty during the simulation, so that its reliability can be improved.
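
    One standard ingredient of such numerical-error estimation is Richardson extrapolation over a grid-refinement study. The sketch below computes the observed order of accuracy and an extrapolated value from three grids with constant refinement ratio; the sample solution values are invented for illustration:

    ```python
    # Richardson-extrapolation sketch for a three-grid refinement study
    # with constant refinement ratio r. Sample values f1..f3 are invented
    # (f1 = finest grid).
    import math

    r = 2.0                                # grid refinement ratio (assumed)
    f1, f2, f3 = 0.9713, 0.9652, 0.9417    # fine, medium, coarse solutions

    p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)  # observed order
    f_exact = f1 + (f1 - f2) / (r**p - 1)              # extrapolated value
    rel_err = abs((f1 - f_exact) / f_exact)

    print(f"observed order p = {p:.2f}")
    print(f"extrapolated value = {f_exact:.5f}, est. rel. error = {rel_err:.2e}")
    ```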

  11. 75 FR 4375 - Transmission Loading Relief Reliability Standard and Curtailment Priorities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-27

    ... Site: http://www.ferc.gov. Documents created electronically using word processing software should be... ensure operation within acceptable reliability criteria. NERC Glossary of Terms Used in Reliability Standards at 19, available at http://www.nerc.com/files/Glossary_12Feb08.pdf (NERC Glossary). An...

  12. Specific detection of the cleavage activity of mycobacterial enzymes using a quantum dot based DNA nanosensor

    NASA Astrophysics Data System (ADS)

    Jepsen, Morten Leth; Harmsen, Charlotte; Godbole, Adwait Anand; Nagaraja, Valakunja; Knudsen, Birgitta R.; Ho, Yi-Ping

    2015-12-01

    We present a quantum dot based DNA nanosensor specifically targeting the cleavage step in the reaction cycle of the essential DNA-modifying enzyme, mycobacterial topoisomerase I. The design takes advantage of the unique photophysical properties of quantum dots to generate visible fluorescence recovery upon specific cleavage by mycobacterial topoisomerase I. This report demonstrates, for the first time, the possibility of quantifying the cleavage activity of the mycobacterial enzyme without pre-processing sample purification or post-processing signal amplification. The cleavage-induced signal response has also proven reliable in biological matrices, such as whole cell extracts prepared from Escherichia coli and human Caco-2 cells. It is expected that the assay may contribute to the clinical diagnostics of bacterial diseases, as well as the evaluation of treatment outcomes. Electronic supplementary information (ESI) available: characterization of the QD-based DNA nanosensor. See DOI: 10.1039/c5nr06326d

  13. Demonstrating the Safety and Reliability of a New System or Spacecraft: Incorporating Analyses and Reviews of the Design and Processing in Determining the Number of Tests to be Conducted

    NASA Technical Reports Server (NTRS)

    Vesely, William E.; Colon, Alfredo E.

    2010-01-01

    Design safety/reliability is associated with the probability that no failure-causing faults exist in a design. Confidence in the absence of failure-causing faults is increased by performing tests with no failures. Reliability-growth testing requirements are based on the initial assurance and the fault-detection probability. Using binomial tables generally yields more required tests than the reliability-growth requirements do; test requirements based on reliability-growth principles and factors should therefore be used.
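
    The binomial arithmetic behind that comparison is compact: with zero allowed failures, demonstrating reliability R at confidence C requires the smallest n with R^n <= 1 - C, i.e. n = ceil(ln(1 - C) / ln R). A small sketch with illustrative values of R and C:

    ```python
    # Classical zero-failure demonstration-test count:
    # smallest n with R**n <= 1 - C, i.e. n = ceil(ln(1-C) / ln(R)).
    import math

    def zero_failure_tests(R, C):
        return math.ceil(math.log(1.0 - C) / math.log(R))

    for R, C in [(0.90, 0.90), (0.95, 0.90), (0.99, 0.90)]:
        print(f"R={R:.2f}, C={C:.2f} -> n = {zero_failure_tests(R, C)}")
    # -> 22, 45, and 230 tests respectively
    ```

    These counts grow quickly with R, which is the abstract's point: reliability-growth approaches credit prior analyses and design/processing reviews and therefore call for fewer tests than the raw binomial tables.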

  14. Selenide isotope generator for the Galileo mission. Reliability program plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-10-01

    The reliability program plan for the Selenide Isotope Generator (SIG) program is presented. It delineates the specific tasks that will be accomplished by Teledyne Energy Systems and its suppliers during design, development, fabrication and test of deliverable Radioisotopic Thermoelectric Generators (RTG), Electrical Heated Thermoelectric Generators (ETG) and associated Ground Support Equipment (GSE). The Plan is formulated in general accordance with procedures specified in DOE Reliability Engineering Program Requirements Publication No. SNS-2, dated June 17, 1974. The Reliability Program Plan presented herein defines the total reliability effort without further reference to Government Specifications. The reliability tasks to be accomplished are delineated herein and become the basis for contract compliance to the extent specified in the SIG contract Statement of Work.

  15. Extended papers selected from ESSDERC 2015

    NASA Astrophysics Data System (ADS)

    Grasser, Tibor; Schmitz, Jurriaan; Lemme, Max C.

    2016-11-01

    This special issue of Solid State Electronics includes 28 papers carefully selected from the best presentations given at the 45th European Solid-State Device Research Conference (ESSDERC 2015), held September 14-18, 2015 in Graz, Austria. The papers cover a wide range of topics related to research on solid-state devices. These topics are also used to organize the conference submissions and presentations into 7 tracks: CMOS Processes, Devices and Integration; Opto-, Power- and Microwave Devices; Modeling & Simulation; Characterization, Reliability & Yield; Advanced & Emerging Memories; MEMS, Sensors & Display Technologies; Emerging Non-CMOS Devices & Technologies.

  16. National audit of continence care: laying the foundation.

    PubMed

    Mian, Sarah; Wagg, Adrian; Irwin, Penny; Lowe, Derek; Potter, Jonathan; Pearson, Michael

    2005-12-01

    National audit provides a basis for establishing performance against national standards, benchmarking against other service providers, and improving standards of care. For effective audit, clinical indicators are required that are valid, feasible to apply, and reliable. This study describes the methods used to develop and test clinical indicators of continence care, with regard to validity, feasibility, and reliability, in preparation for a national audit. A multidisciplinary working group developed clinical indicators that measured the structure, process, and outcome of care as well as case-mix variables. Literature searching, consensus workshops, and a Delphi process were used to develop the indicators, which were then tested in 15 secondary care sites, 15 primary care sites, and 15 long-term care settings. The development process produced indicators that received a high degree of consensus within the Delphi process and a high degree of acceptance from health care professionals. Testing demonstrated an internal reliability of 0.7 and an external reliability of 0.6, similar to the levels seen in other successful national audits. Data collection was feasible, although it required significant investment in staff time and training, which was identified as a limitation. The study has described a systematic method for developing clinical indicators for national audit; the indicators proved robust and reliable in primary and secondary care as well as long-term care settings.
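
    The reliability figures reported here are agreement statistics between data collectors. The abstract does not specify the exact statistic, so as a generic illustration the sketch below computes Cohen's kappa for two raters on invented binary ratings:

    ```python
    # Cohen's kappa for two raters on binary items - a generic stand-in
    # for the audit's internal/external reliability statistics. The
    # ratings are invented for illustration.

    rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]
    rater_b = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]

    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    pa1 = sum(rater_a) / n          # marginal "yes" rate, rater A
    pb1 = sum(rater_b) / n          # marginal "yes" rate, rater B
    p_exp = pa1 * pb1 + (1 - pa1) * (1 - pb1)   # chance agreement
    kappa = (p_obs - p_exp) / (1 - p_exp)
    print(f"observed agreement {p_obs:.2f}, kappa {kappa:.2f}")
    ```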

  17. Application of a distributed systems architecture for increased speed in image processing on an autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Wright, Adam A.; Momin, Orko; Shin, Young Ho; Shakya, Rahul; Nepal, Kumud; Ahlgren, David J.

    2010-01-01

    This paper presents the application of a distributed systems architecture to an autonomous ground vehicle, Q, that participates in both the autonomous and navigation challenges of the Intelligent Ground Vehicle Competition. In the autonomous challenge the vehicle is required to follow a course while avoiding obstacles and staying within the course boundaries, which are marked by white lines. For the navigation challenge, the vehicle is required to reach a set of target destinations, known as way points, with given GPS coordinates, avoiding obstacles that it encounters in the process. Previously, the vehicle used a single laptop to execute all processing activities, including image processing, sensor interfacing and data processing, path planning and navigation algorithms, and motor control, with National Instruments' (NI) LabVIEW serving as the programming language for the software implementation. As an upgrade to last year's design, an NI compact Reconfigurable Input/Output system (cRIO) was incorporated into the system architecture. The cRIO is NI's rapid-prototyping platform, equipped with a real-time processor, an FPGA, and modular input/output. Under the current system, the real-time processor handles the path planning and navigation algorithms, and the FPGA gathers and processes sensor data. This setup leaves the laptop free to run the image processing algorithm: a multi-step line extraction algorithm, previously presented by Nepal et al., that constitutes the largest processor load. The distributed approach results in a faster image processing algorithm, which was previously Q's bottleneck. Additionally, the path planning and navigation algorithms execute more reliably on the real-time processor due to its deterministic operation. The implementation of this architecture required exploration of various inter-system communication techniques; after testing various options, data transfer between the laptop and the real-time processor using UDP packets was established as the most reliable protocol. The system could be further improved by migrating more algorithms to the hardware-based FPGA to speed up the operation of the vehicle.
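
    The UDP transport the team settled on reduces, in code, to a plain datagram exchange. The sketch below shows both ends of such a link; the address, port, and payload format are invented placeholders, not the team's actual protocol:

    ```python
    # Minimal UDP datagram exchange, standing in for the laptop <-> cRIO
    # link described in the paper. Address, port, and message format are
    # placeholders.
    import socket

    CRIO_ADDR = ("192.168.1.10", 5005)   # hypothetical cRIO address/port

    # sender side (laptop): push a waypoint update
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    tx.sendto(b"WAYPOINT,42.3601,-71.0589", CRIO_ADDR)
    tx.close()

    # receiver side (cRIO real-time target): poll for packets
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("", 5005))
    rx.settimeout(1.0)                   # never block the control loop
    try:
        data, addr = rx.recvfrom(1024)
        print("received", data, "from", addr)
    except socket.timeout:
        pass                             # tolerate dropped datagrams
    rx.close()
    ```

    Datagrams can be dropped, which is why the receiver polls with a timeout rather than blocking; the deterministic real-time loop keeps running either way.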

  18. Atomic Processes and Diagnostics of Low Pressure Krypton Plasma

    NASA Astrophysics Data System (ADS)

    Srivastava, Rajesh; Goyal, Dipti; Gangwar, Reetesh; Stafford, Luc

    2015-03-01

    Optical emission spectroscopy, combined with a suitable collisional-radiative (CR) model, is used for plasma diagnostics. The importance of reliable cross sections for the various atomic processes has been shown for low-pressure argon plasma. In the present work, radially averaged Kr emission lines from the 2p_i --> 1s_j transitions were recorded as a function of pressure from 1 to 50 mTorr. We have developed a CR model using our fine-structure relativistic distorted-wave cross sections. The processes considered are electron-impact excitation, ionization, and their reverse processes. The required rate coefficients have been calculated from these cross sections assuming a Maxwellian energy distribution. The electron temperature obtained from the CR model is in good agreement with probe measurements. Work is supported by IAEA Vienna, DAE-BRNS Mumbai and CSIR, New Delhi.
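
    The rate-coefficient step described here is a Maxwellian average of the cross section, k(Te) = <sigma v>. The numerical sketch below carries out that quadrature with an invented cross-section curve; real CR models would use the computed fine-structure cross sections instead:

    ```python
    # Maxwellian-averaged rate coefficient k(Te) = <sigma * v> by numerical
    # quadrature. The cross-section shape sigma(E) is invented; only the
    # averaging formula is the point of the sketch.
    import numpy as np

    ME = 9.109e-31   # electron mass, kg
    QE = 1.602e-19   # J per eV

    def rate_coefficient(Te_eV, E_eV, sigma_m2):
        """k = integral of sigma(E) * v(E) * f(E) dE, f(E) Maxwellian."""
        E = E_eV * QE
        v = np.sqrt(2.0 * E / ME)
        f = (2.0 / np.sqrt(np.pi)) * np.sqrt(E) * (Te_eV * QE) ** -1.5 \
            * np.exp(-E_eV / Te_eV)          # Maxwellian energy pdf, 1/J
        return np.trapz(sigma_m2 * v * f, E)  # m^3/s

    E_eV = np.linspace(10.0, 200.0, 2000)     # above a 10 eV threshold
    sigma = 1e-21 * np.log(E_eV / 10.0) / (E_eV / 10.0)  # toy excitation shape

    for Te in (2.0, 5.0, 10.0):
        print(f"Te = {Te:4.1f} eV -> k = {rate_coefficient(Te, E_eV, sigma):.3e} m^3/s")
    ```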

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumitrescu, Eugene; Humble, Travis S.

    The accurate and reliable characterization of quantum dynamical processes underlies efforts to validate quantum technologies, where discrimination between competing models of observed behaviors informs efforts to fabricate and operate qubit devices. We present a protocol for quantum channel discrimination that leverages advances in direct characterization of quantum dynamics (DCQD) codes. We demonstrate that DCQD codes enable selective process tomography to improve discrimination between entangling and correlated quantum dynamics. Numerical simulations show that selective process tomography requires only a few measurement configurations to achieve a low false-alarm rate and that the DCQD encoding improves the resilience of the protocol to hidden sources of noise. Lastly, our results show that selective process tomography with DCQD codes is useful for efficiently distinguishing sources of correlated crosstalk from uncorrelated noise in current and future experimental platforms.
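
    The decision step in any such discrimination protocol is, at bottom, a likelihood-ratio test over measurement statistics. As a toy classical stand-in for the DCQD-based procedure (the actual protocol uses stabilizer-code measurement configurations), the sketch below discriminates two invented error models from syndrome counts:

    ```python
    # Toy likelihood-ratio discrimination between two candidate error
    # models from syndrome counts - a classical stand-in for the paper's
    # DCQD-based channel discrimination. All numbers are invented.
    import math

    # P(syndrome outcome) under each hypothesized channel (assumed)
    p_correlated   = {"00": 0.70, "01": 0.05, "10": 0.05, "11": 0.20}
    p_uncorrelated = {"00": 0.81, "01": 0.09, "10": 0.09, "11": 0.01}

    counts = {"00": 68, "01": 6, "10": 7, "11": 19}   # observed (invented)

    llr = sum(n * (math.log(p_correlated[s]) - math.log(p_uncorrelated[s]))
              for s, n in counts.items())
    verdict = "correlated" if llr > 0 else "uncorrelated"
    print(f"log-likelihood ratio = {llr:.1f} -> data favor the {verdict} model")
    ```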

  20. The Raid distributed database system

    NASA Technical Reports Server (NTRS)

    Bhargava, Bharat; Riedl, John

    1989-01-01

    Raid, a robust and adaptable distributed database system for transaction processing (TP), is described. Raid is a message-passing system, with server processes on each site to manage concurrent processing, consistent replicated copies during site failures, and atomic distributed commitment. A high-level layered communications package provides a clean location-independent interface between servers. The latest design of the package delivers messages via shared memory in a configuration with several servers linked into a single process. Raid provides the infrastructure to investigate various methods for supporting reliable distributed TP. Measurements on TP and server CPU time are presented, along with data from experiments on communications software, consistent replicated copy control during site failures, and concurrent distributed checkpointing. A software tool for evaluating the implementation of TP algorithms in an operating-system kernel is proposed.
