Sample records for reliable alternative method

  1. ASSESSING AND COMBINING RELIABILITY OF PROTEIN INTERACTION SOURCES

    PubMed Central

    LEACH, SONIA; GABOW, AARON; HUNTER, LAWRENCE; GOLDBERG, DEBRA S.

    2008-01-01

    Integrating diverse sources of interaction information to create protein networks requires strategies sensitive to differences in accuracy and coverage of each source. Previous integration approaches calculate reliabilities of protein interaction information sources based on congruity to a designated ‘gold standard.’ In this paper, we provide a comparison of the two most popular existing approaches and propose a novel alternative for assessing reliabilities which does not require a gold standard. We identify a new method for combining the resultant reliabilities and compare it against an existing method. Further, we propose an extrinsic approach to evaluation of reliability estimates, considering their influence on the downstream tasks of inferring protein function and learning regulatory networks from expression data. Results using this evaluation method show 1) our method for reliability estimation is an attractive alternative to those requiring a gold standard and 2) the new method for combining reliabilities is less sensitive to noise in reliability assignments than the similar existing technique. PMID:17990508

  2. Evaluating the Performance of the IEEE Standard 1366 Method for Identifying Major Event Days

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eto, Joseph H.; LaCommare, Kristina Hamachi; Sohn, Michael D.

    IEEE Standard 1366 offers a method for segmenting reliability performance data to isolate the effects of major events from the underlying year-to-year trends in reliability. Recent analysis by the IEEE Distribution Reliability Working Group (DRWG) has found that the reliability performance of some utilities differs from the expectations that helped guide the development of the Standard 1366 method. This paper proposes quantitative metrics to evaluate the performance of the Standard 1366 method in identifying major events and in reducing year-to-year variability in utility reliability. The metrics are applied to a large sample of utility-reported reliability data to assess performance of the method with alternative specifications that have been considered by the DRWG. We find that none of the alternatives performs uniformly 'better' than the current Standard 1366 method. That is, none of the modifications uniformly lowers the year-to-year variability in the System Average Interruption Duration Index without major events. Instead, for any given alternative, while it may lower the value of this metric for some utilities, it also increases it for other utilities (sometimes dramatically). Thus, we illustrate some of the trade-offs that must be considered in using the Standard 1366 method and highlight the usefulness of the metrics we have proposed in conducting these evaluations.
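    For context, the major-event screening in Standard 1366 is commonly described as the "2.5 beta" method: daily SAIDI values are log-transformed, and a day is flagged as a major event day when its SAIDI exceeds exp(alpha + 2.5*beta), where alpha and beta are the mean and standard deviation of the logs. The sketch below is an illustrative reading of that rule, not code from the paper; the function names are our own, and the standard actually computes alpha and beta from several prior years of data.

```python
import math
import statistics

def major_event_threshold(daily_saidi):
    """'2.5 beta'-style threshold: fit a lognormal to daily SAIDI and
    return T_MED = exp(alpha + 2.5 * beta)."""
    logs = [math.log(v) for v in daily_saidi if v > 0]  # zero days excluded
    alpha = statistics.mean(logs)   # mean of ln(SAIDI)
    beta = statistics.stdev(logs)   # sample std dev of ln(SAIDI)
    return math.exp(alpha + 2.5 * beta)

def major_event_days(daily_saidi):
    """Indices of days whose SAIDI exceeds the threshold."""
    t_med = major_event_threshold(daily_saidi)
    return [i for i, v in enumerate(daily_saidi) if v > t_med]
```

    Segmenting the flagged days out of the annual index is what isolates underlying year-to-year trends from major-event effects.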

  3. Investigating the technical adequacy of curriculum-based measurement in written expression for students who are deaf or hard of hearing.

    PubMed

    Cheng, Shu-Fen; Rose, Susan

    2009-01-01

    This study investigated the technical adequacy of curriculum-based measures of written expression (CBM-W) in terms of writing prompts and scoring methods for deaf and hard-of-hearing students. Twenty-two students at the secondary school level completed 3-min essays within two weeks, which were scored for nine existing and alternative curriculum-based measurement (CBM) scoring methods. The technical features of the nine scoring methods were examined for interrater reliability, alternate-form reliability, and criterion-related validity. The existing CBM scoring method--number of correct minus incorrect word sequences--yielded the highest reliability and validity coefficients. The findings from this study support the use of the CBM-W as a reliable and valid tool for assessing general writing proficiency with secondary students who are deaf or hard of hearing. The CBM alternative scoring methods that may serve as additional indicators of written expression include correct subject-verb agreements, correct clauses, and correct morphemes.

  4. Reliability of Radioisotope Stirling Convertor Linear Alternator

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin; Korovaichuk, Igor; Geng, Steven M.; Schreiber, Jeffrey G.

    2006-01-01

    Onboard radioisotope power systems being developed and planned for NASA's deep-space missions would require reliable design lifetimes of up to 14 years. Critical components and materials of Stirling convertors have been undergoing extensive testing and evaluation in support of reliable performance over the specified life span. Of significant importance to the successful development of the Stirling convertor is the design of a lightweight and highly efficient linear alternator. Alternator performance could vary due to small deviations in the permanent magnet properties, operating temperature, and component geometries. Durability prediction and reliability of the alternator may be affected by these deviations from nominal design conditions. Therefore, it is important to evaluate the effect of these uncertainties in predicting the reliability of the linear alternator performance. This paper presents a study in which a reliability-based methodology is used to assess alternator performance. The response surface characterizing the induced open-circuit voltage performance is constructed using 3-D finite element magnetic analysis. The fast probability integration method is used to determine the probability of the desired performance and its sensitivity to the alternator design parameters.

  5. Measuring reliable change in cognition using the Edinburgh Cognitive and Behavioural ALS Screen (ECAS).

    PubMed

    Crockford, Christopher; Newton, Judith; Lonergan, Katie; Madden, Caoifa; Mays, Iain; O'Sullivan, Meabhdh; Costello, Emmet; Pinto-Grau, Marta; Vajda, Alice; Heverin, Mark; Pender, Niall; Al-Chalabi, Ammar; Hardiman, Orla; Abrahams, Sharon

    2018-02-01

    Cognitive impairment affects approximately 50% of people with amyotrophic lateral sclerosis (ALS). Research has indicated that impairment may worsen with disease progression. The Edinburgh Cognitive and Behavioural ALS Screen (ECAS) was designed to measure neuropsychological functioning in ALS, with its alternate forms (ECAS-A, B, and C) allowing for serial assessment over time. The aim of the present study was to establish reliable change scores for the alternate forms of the ECAS, and to explore practice effects and test-retest reliability of the ECAS's alternate forms. Eighty healthy participants were recruited, with 57 completing two and 51 completing three assessments. Participants were administered alternate versions of the ECAS serially (A-B-C) at four-month intervals. Intra-class correlation analysis was employed to explore test-retest reliability, while analysis of variance was used to examine the presence of practice effects. Reliable change indices (RCI) and regression-based methods were utilized to establish change scores for the ECAS alternate forms. Test-retest reliability was excellent for ALS Specific, ALS Non-Specific, and ECAS Total scores of the combined ECAS A, B, and C (all > .90). No significant practice effects were observed over the three testing sessions. RCI and regression-based methods produced similar change scores. The alternate forms of the ECAS possess excellent test-retest reliability in a healthy control sample, with no significant practice effects. The use of conservative RCI scores is recommended. Therefore, a change of ≥8, ≥4, and ≥9 for ALS Specific, ALS Non-Specific, and ECAS Total score is required for reliable change.
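    Reliable change indices of the kind established here are commonly computed with the Jacobson-Truax formula. The sketch below is our own illustration of that general approach (not the study's exact procedure), with an optional mean practice-effect adjustment of the sort the abstract discusses; names are ours.

```python
import math

def reliable_change_index(x1, x2, sd_baseline, r_xx, practice_effect=0.0):
    """Jacobson-Truax reliable change index, optionally adjusted for a
    mean practice effect; |RCI| > 1.96 indicates reliable change at ~95%."""
    sem = sd_baseline * math.sqrt(1.0 - r_xx)  # standard error of measurement
    s_diff = math.sqrt(2.0) * sem              # SE of the difference score
    return (x2 - x1 - practice_effect) / s_diff
```

    The test-retest reliability r_xx feeds directly into the denominator, which is why the excellent (>.90) reliabilities reported above translate into usably small change thresholds.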

  6. A Comparison of Two Methods of Determining Interrater Reliability

    ERIC Educational Resources Information Center

    Fleming, Judith A.; Taylor, Janeen McCracken; Carran, Deborah

    2004-01-01

    This article offers an alternative methodology for practitioners and researchers to use in establishing interrater reliability for testing purposes. The majority of studies on interrater reliability use a traditional methodology whereby two raters are compared using a Pearson product-moment correlation. This traditional method of estimating…
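    The traditional approach the abstract refers to reduces to a single Pearson product-moment correlation between the two raters' score vectors; a minimal stdlib illustration (function name is ours):

```python
import math

def pearson_r(a, b):
    """Pearson product-moment correlation between two raters' scores."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    var_a = sum((x - ma) ** 2 for x in a)
    var_b = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(var_a * var_b)
```

    Note that a high r only establishes that the raters co-vary; it does not penalize a constant offset between them, which is one motivation for alternative indices.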

  7. Reliability of Summed Item Scores Using Structural Equation Modeling: An Alternative to Coefficient Alpha

    ERIC Educational Resources Information Center

    Green, Samuel B.; Yang, Yanyun

    2009-01-01

    A method is presented for estimating reliability using structural equation modeling (SEM) that allows for nonlinearity between factors and item scores. Assuming the focus is on consistency of summed item scores, this method for estimating reliability is preferred to those based on linear SEM models and to the most commonly reported estimate of…
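    The "most commonly reported estimate" the abstract alludes to is coefficient (Cronbach's) alpha for summed item scores. A stdlib sketch of that baseline quantity (our own illustration, assuming numerically scored items):

```python
import statistics

def cronbach_alpha(items):
    """Coefficient alpha from item-score columns: items[i] is the list of
    scores all respondents gave on item i. Uses sample variances."""
    k = len(items)
    totals = [sum(row) for row in zip(*items)]  # summed score per respondent
    item_var = sum(statistics.variance(col) for col in items)
    total_var = statistics.variance(totals)
    return k / (k - 1) * (1.0 - item_var / total_var)
```

    Alpha assumes an essentially linear, tau-equivalent measurement model, which is exactly the restriction the SEM-based estimator above is designed to relax.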

  8. Uncertainties in obtaining high reliability from stress-strength models

    NASA Technical Reports Server (NTRS)

    Neal, Donald M.; Matthews, William T.; Vangel, Mark G.

    1992-01-01

    There has been a recent interest in determining high statistical reliability in risk assessment of aircraft components. The potential consequences are identified of incorrectly assuming a particular statistical distribution for stress or strength data used in obtaining the high reliability values. The computation of the reliability is defined as the probability of the strength being greater than the stress over the range of stress values. This method is often referred to as the stress-strength model. A sensitivity analysis was performed involving a comparison of reliability results in order to evaluate the effects of assuming specific statistical distributions. Both known population distributions, and those that differed slightly from the known, were considered. Results showed substantial differences in reliability estimates even for almost nondetectable differences in the assumed distributions. These differences represent a potential problem in using the stress-strength model for high reliability computations, since in practice it is impossible to ever know the exact (population) distribution. An alternative reliability computation procedure is examined involving determination of a lower bound on the reliability values using extreme value distributions. This procedure reduces the possibility of obtaining nonconservative reliability estimates. Results indicated the method can provide conservative bounds when computing high reliability.
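    The stress-strength computation described above has a well-known closed form when stress and strength are independent normals, which also makes the paper's sensitivity concern easy to reproduce: tiny changes to the assumed distributions move the tail probability substantially. A hedged stdlib sketch (names and parameter values are ours, for illustration only):

```python
import math
import random

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def reliability_normal(mu_s, sd_s, mu_l, sd_l):
    """Exact P(strength > stress) for independent normal strength/stress."""
    beta = (mu_s - mu_l) / math.sqrt(sd_s ** 2 + sd_l ** 2)
    return phi(beta)

def reliability_mc(mu_s, sd_s, mu_l, sd_l, n=100_000, seed=1):
    """Monte Carlo estimate of the same probability, for cross-checking
    non-normal or empirical distribution assumptions."""
    rng = random.Random(seed)
    hits = sum(rng.gauss(mu_s, sd_s) > rng.gauss(mu_l, sd_l) for _ in range(n))
    return hits / n
```

    Swapping the sampling distributions in reliability_mc (e.g., for slightly heavier-tailed ones) is a quick way to see the distribution sensitivity the abstract reports.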

  9. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    NASA Astrophysics Data System (ADS)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

    The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it shows inaccuracy in calculating the failure probability with highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate the reliability accurately. However, its application to RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluations and Hessian calculations of the probabilistic constraints. In this article, a new improved stability transformation method is proposed to search for the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to the existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for RBDO of engineering structures.

  10. Trustworthiness and Authenticity: Alternate Ways To Judge Authentic Assessments.

    ERIC Educational Resources Information Center

    Hipps, Jerome A.

    New methods are needed to judge the quality of alternative student assessment, methods which complement the philosophy underlying authentic assessments. This paper examines assumptions underlying validity, reliability, and objectivity, and why they are not matched to authentic assessment, concentrating on the constructivist paradigm of E. Guba and…

  11. Alternative Methods for Calculating Intercoder Reliability in Content Analysis: Kappa, Weighted Kappa and Agreement Charts Procedures.

    ERIC Educational Resources Information Center

    Kang, Namjun

    If content analysis is to satisfy the requirement of objectivity, measures and procedures must be reliable. Reliability is usually measured by the proportion of agreement of all categories identically coded by different coders. For such data to be empirically meaningful, a high degree of inter-coder reliability must be demonstrated. Researchers in…
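    Cohen's kappa, the first of the procedures named in the title, corrects raw percent agreement for the agreement expected by chance. A minimal two-coder implementation (our own illustration; weighted kappa additionally grades the severity of disagreements):

```python
def cohens_kappa(coder1, coder2):
    """Cohen's kappa: percent agreement corrected for chance agreement."""
    cats = set(coder1) | set(coder2)
    n = len(coder1)
    p_o = sum(a == b for a, b in zip(coder1, coder2)) / n    # observed
    p_e = sum((coder1.count(c) / n) * (coder2.count(c) / n)  # expected by chance
              for c in cats)
    return (p_o - p_e) / (1.0 - p_e)
```

    Kappa of 1.0 means perfect agreement, 0.0 means chance-level agreement, which is why it is preferred over raw proportion agreement for demonstrating inter-coder reliability.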

  12. Assessing the Reliability of Curriculum-Based Measurement: An Application of Latent Growth Modeling

    ERIC Educational Resources Information Center

    Yeo, Seungsoo; Kim, Dong-Il; Branum-Martin, Lee; Wayman, Miya Miura; Espin, Christine A.

    2012-01-01

    The purpose of this study was to demonstrate the use of Latent Growth Modeling (LGM) as a method for estimating the reliability of Curriculum-Based Measurement (CBM) progress-monitoring data. The LGM approach permits the error associated with each measure to differ at each time point, thus providing an alternative method for examining the…

  13. Validation of alternative methods for toxicity testing.

    PubMed Central

    Bruner, L H; Carr, G J; Curren, R D; Chamberlain, M

    1998-01-01

    Before nonanimal toxicity tests may be officially accepted by regulatory agencies, it is generally agreed that the validity of the new methods must be demonstrated in an independent, scientifically sound validation program. Validation has been defined as the demonstration of the reliability and relevance of a test method for a particular purpose. This paper provides a brief review of the development of the theoretical aspects of the validation process and updates current thinking about objectively testing the performance of an alternative method in a validation study. Validation of alternative methods for eye irritation testing is a specific example illustrating important concepts. Although discussion focuses on the validation of alternative methods intended to replace current in vivo toxicity tests, the procedures can be used to assess the performance of alternative methods intended for other uses. PMID:9599695

  14. Identification of Reliable Components in Multivariate Curve Resolution-Alternating Least Squares (MCR-ALS): a Data-Driven Approach across Metabolic Processes.

    PubMed

    Motegi, Hiromi; Tsuboi, Yuuri; Saga, Ayako; Kagami, Tomoko; Inoue, Maki; Toki, Hideaki; Minowa, Osamu; Noda, Tetsuo; Kikuchi, Jun

    2015-11-04

    There is an increasing need to use multivariate statistical methods for understanding biological functions, identifying the mechanisms of diseases, and exploring biomarkers. In addition to classical analyses such as hierarchical cluster analysis, principal component analysis, and partial least squares discriminant analysis, various multivariate strategies, including independent component analysis, non-negative matrix factorization, and multivariate curve resolution, have recently been proposed. However, determining the number of components is problematic. Despite the proposal of several different methods, no satisfactory approach has yet been reported. To resolve this problem, we implemented a new idea: classifying a component as "reliable" or "unreliable" based on the reproducibility of its appearance, regardless of the number of components in the calculation. Using the clustering method for classification, we applied this idea to multivariate curve resolution-alternating least squares (MCR-ALS). Comparisons between conventional and modified methods applied to proton nuclear magnetic resonance (¹H-NMR) spectral datasets derived from known standard mixtures and biological mixtures (urine and feces of mice) revealed that more plausible results are obtained by the modified method. In particular, clusters containing little information were detected with reliability. This strategy, named "cluster-aided MCR-ALS," will facilitate the attainment of more reliable results in the metabolomics datasets.

  15. Predictive Validity of Measures of the Pathfinder Scaling Algorithm on Programming Performance: Alternative Assessment Strategy for Programming Education

    ERIC Educational Resources Information Center

    Lau, Wilfred W. F.; Yuen, Allan H. K.

    2009-01-01

    Recent years have seen a shift in focus from assessment of learning to assessment for learning and the emergence of alternative assessment methods. However, the reliability and validity of these methods as assessment tools are still questionable. In this article, we investigated the predictive validity of measures of the Pathfinder Scaling…

  16. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and has resulted in the introduction of similar and/or duplicated activities in the safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  17. Evidence-Based Indicators of Neuropsychological Change in the Individual Patient: Relevant Concepts and Methods

    PubMed Central

    Duff, Kevin

    2012-01-01

    Repeated assessments are a relatively common occurrence in clinical neuropsychology. The current paper will review some of the relevant concepts (e.g., reliability, practice effects, alternate forms) and methods (e.g., reliable change index, standardized regression-based change) that are used in repeated neuropsychological evaluations. The focus will be on the understanding and application of these concepts and methods in the evaluation of the individual patient through examples. Finally, some future directions for assessing change will be described. PMID:22382384

  18. Cluster-based upper body marker models for three-dimensional kinematic analysis: Comparison with an anatomical model and reliability analysis.

    PubMed

    Boser, Quinn A; Valevicius, Aïda M; Lavoie, Ewen B; Chapman, Craig S; Pilarski, Patrick M; Hebert, Jacqueline S; Vette, Albert H

    2018-04-27

    Quantifying angular joint kinematics of the upper body is a useful method for assessing upper limb function. Joint angles are commonly obtained via motion capture, tracking markers placed on anatomical landmarks. This method is associated with limitations including administrative burden, soft tissue artifacts, and intra- and inter-tester variability. An alternative method involves the tracking of rigid marker clusters affixed to body segments, calibrated relative to anatomical landmarks or known joint angles. The accuracy and reliability of applying this cluster method to the upper body has, however, not been comprehensively explored. Our objective was to compare three different upper body cluster models with an anatomical model, with respect to joint angles and reliability. Non-disabled participants performed two standardized functional upper limb tasks with anatomical and cluster markers applied concurrently. Joint angle curves obtained via the marker clusters with three different calibration methods were compared to those from an anatomical model, and between-session reliability was assessed for all models. The cluster models produced joint angle curves which were comparable to and highly correlated with those from the anatomical model, but exhibited notable offsets and differences in sensitivity for some degrees of freedom. Between-session reliability was comparable between all models, and good for most degrees of freedom. Overall, the cluster models produced reliable joint angles that, however, cannot be used interchangeably with anatomical model outputs to calculate kinematic metrics. Cluster models appear to be an adequate, and possibly advantageous alternative to anatomical models when the objective is to assess trends in movement behavior. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Development of a nanosatellite de-orbiting system by reliability based design optimization

    NASA Astrophysics Data System (ADS)

    Nikbay, Melike; Acar, Pınar; Aslan, Alim Rüstem

    2015-12-01

    This paper presents design approaches to develop a reliable and efficient de-orbiting system for the 3USAT nanosatellite to provide a beneficial orbital decay process at the end of a mission. A de-orbiting system is initially designed by employing the aerodynamic drag augmentation principle where the structural constraints of the overall satellite system and the aerodynamic forces are taken into account. Next, an alternative de-orbiting system is designed with new considerations and further optimized using deterministic and reliability based design techniques. For the multi-objective design, the objectives are chosen to maximize the aerodynamic drag force through the maximization of the Kapton surface area while minimizing the de-orbiting system mass. The constraints are related in a deterministic manner to the required deployment force, the height of the solar panel hole and the deployment angle. The length and the number of layers of the deployable Kapton structure are used as optimization variables. In the second stage of this study, uncertainties related to both manufacturing and operating conditions of the deployable structure in space environment are considered. These uncertainties are then incorporated into the design process by using different probabilistic approaches such as Monte Carlo Simulation, the First-Order Reliability Method and the Second-Order Reliability Method. The reliability based design optimization seeks optimal solutions using the former design objectives and constraints with the inclusion of a reliability index. Finally, the de-orbiting system design alternatives generated by different approaches are investigated and the reliability based optimum design is found to yield the best solution since it significantly improves both system reliability and performance requirements.

  20. Screening of groundwater remedial alternatives for brownfield sites: a comprehensive method integrated MCDA with numerical simulation.

    PubMed

    Li, Wei; Zhang, Min; Wang, Mingyu; Han, Zhantao; Liu, Jiankai; Chen, Zhezhou; Liu, Bo; Yan, Yan; Liu, Zhu

    2018-06-01

    Brownfield site pollution and remediation is an urgent environmental issue worldwide. The screening and assessment of remedial alternatives is especially complex because it involves multiple criteria spanning technique, economy, and policy. To help decision-makers select remedial alternatives efficiently, this paper improves the criteria framework developed by the U.S. EPA and proposes a comprehensive method that integrates multiple criteria decision analysis (MCDA) with numerical simulation. The criteria framework is modified and classified into three categories: qualitative, semi-quantitative, and quantitative criteria. The MCDA method AHP-PROMETHEE (analytical hierarchy process-preference ranking organization method for enrichment evaluation) is used to determine the priority ranking of the remedial alternatives, and solute transport simulation is conducted to assess remedial efficiency. A case study is presented to demonstrate the screening method at a brownfield site in Cangzhou, northern China. The results show that the systematic method provides a reliable way to quantify the priority of the remedial alternatives.

  1. Selection of remedial alternatives for mine sites: a multicriteria decision analysis approach.

    PubMed

    Betrie, Getnet D; Sadiq, Rehan; Morin, Kevin A; Tesfamariam, Solomon

    2013-04-15

    The selection of remedial alternatives for mine sites is a complex task because it involves multiple criteria, often with conflicting objectives. However, the existing framework used to select remedial alternatives lacks multicriteria decision analysis (MCDA) aids and does not consider uncertainty in the selection of alternatives. The objective of this paper is to improve the existing framework by introducing deterministic and probabilistic MCDA methods. The Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE) methods have been implemented in this study. The MCDA analysis involves preparing the inputs to the PROMETHEE methods (identifying the alternatives, defining the criteria, defining the criteria weights using the analytical hierarchy process (AHP), defining the probability distribution of the criteria weights, and conducting Monte Carlo simulation (MCS)); running the PROMETHEE methods on these inputs; and conducting a sensitivity analysis. A case study is presented to demonstrate the improved framework at a mine site. The results showed that the improved framework provides a reliable way of selecting remedial alternatives as well as quantifying the impact of different criteria on selecting alternatives. Copyright © 2013 Elsevier Ltd. All rights reserved.
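    PROMETHEE II, the outranking core of frameworks like this, ranks alternatives by net preference flows. The sketch below is deliberately simplified: it uses the "usual" (0/1) preference function, whereas real applications typically use richer preference functions and AHP-derived weights as the abstract describes; all names are ours.

```python
def promethee2(scores, weights, maximize):
    """PROMETHEE II net outranking flows.
    scores[i][j]: performance of alternative i on criterion j;
    weights: criterion weights summing to 1;
    maximize[j]: True if higher is better on criterion j.
    Uses the 'usual' preference function (1 if strictly better, else 0)."""
    n = len(scores)
    phi_net = [0.0] * n
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            # aggregated preference of a over b: sum of weights of the
            # criteria on which a is strictly better than b
            pi = sum(
                w for w, sa, sb, mx in zip(weights, scores[a], scores[b], maximize)
                if sa != sb and (sa > sb) == mx
            )
            phi_net[a] += pi / (n - 1)  # contributes to a's positive flow
            phi_net[b] -= pi / (n - 1)  # contributes to b's negative flow
    return phi_net
```

    Alternatives are then ranked by descending net flow; a probabilistic variant re-runs this ranking over Monte Carlo draws of the criteria weights.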

  2. Probabilistic modelling of overflow, surcharge and flooding in urban drainage using the first-order reliability method and parameterization of local rain series.

    PubMed

    Thorndahl, S; Willems, P

    2008-01-01

    Failure of urban drainage systems may occur due to surcharge or flooding at specific manholes in the system, or due to overflows from combined sewer systems to receiving waters. To quantify the probability or return period of failure, standard approaches make use of the simulation of design storms or long historical rainfall series in a hydrodynamic model of the urban drainage system. In this paper, an alternative probabilistic method is investigated: the first-order reliability method (FORM). To apply this method, a long rainfall time series was divided into rainstorms (rain events), and each rainstorm was conceptualized as a synthetic rainfall hyetograph with a Gaussian shape parameterized by rainstorm depth, duration, and peak intensity. Probability distributions were calibrated for these three parameters and used as the basis for the failure probability estimation, together with a hydrodynamic simulation model to determine the failure conditions for each set of parameters. The method takes into account the uncertainties involved in the rainstorm parameterization. Comparison is made between the failure probability results of the FORM method, the standard method using long-term simulations, and alternative methods based on random sampling (Monte Carlo direct sampling and importance sampling). It is concluded that, without crucially affecting modelling accuracy, FORM is very applicable as an alternative to traditional long-term simulations of urban drainage systems.
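    FORM is exact for the textbook special case of a linear limit state with independent normal variables: the reliability index is beta = mu_g / sigma_g and the failure probability is Phi(-beta). The drainage application above requires an iterative most-probable-point search on a nonlinear hydrodynamic model, but the linear case shows the core idea; an illustrative stdlib sketch, names ours:

```python
import math

def form_linear(a0, coeffs, means, sds):
    """FORM for a linear limit state g(X) = a0 + sum(a_i * X_i) with
    independent normal X_i; failure is the event g(X) < 0.
    Returns (reliability index beta, failure probability Pf)."""
    mu_g = a0 + sum(a * m for a, m in zip(coeffs, means))
    sd_g = math.sqrt(sum((a * s) ** 2 for a, s in zip(coeffs, sds)))
    beta = mu_g / sd_g
    pf = 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))  # Phi(-beta)
    return beta, pf
```

    For nonlinear g, FORM linearizes at the most probable failure point (e.g., via Hasofer-Lind-Rackwitz-Fiessler iteration), trading the cost of long-term simulation for a handful of model evaluations per iteration.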

  3. A human reliability based usability evaluation method for safety-critical software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, R. L.; Tran, T. Q.; Gertman, D. I.

    2006-07-01

    Boring and Gertman (2005) introduced a novel method that augments heuristic usability evaluation with the SPAR-H human reliability analysis method. By assigning probabilistic modifiers to individual heuristics, it is possible to arrive at the usability error probability (UEP). Although this UEP is not a literal probability of error, it nonetheless provides a quantitative basis for heuristic evaluation. This method allows one to seamlessly prioritize and identify usability issues (i.e., a higher UEP requires more immediate fixes). However, the original version of this method required the usability evaluator to assign priority weights to the final UEP, thus allowing the priority of a usability issue to differ among usability evaluators. The purpose of this paper is to explore an alternative approach to standardize the priority weighting of the UEP in an effort to improve the method's reliability.

  4. Reliability models: the influence of model specification in generation expansion planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stremel, J.P.

    1982-10-01

    This paper is a critical evaluation of reliability methods used for generation expansion planning. It is shown that the methods for treating uncertainty are critical for determining the relative reliability value of expansion alternatives. It is also shown that the specification of the reliability model will not favor all expansion options equally. Consequently, the model is biased. In addition, reliability models should be augmented with an economic value of reliability (such as the cost of emergency procedures or energy not served). Generation expansion evaluations which ignore the economic value of excess reliability can be shown to be inconsistent. The conclusions are that, in general, a reliability model simplifies generation expansion planning evaluations. However, for a thorough analysis, the expansion options should be reviewed for candidates which may be unduly rejected because of the bias of the reliability model. And this implies that for a consistent formulation in an optimization framework, the reliability model should be replaced with a full economic optimization which includes the costs of emergency procedures and interruptions in the objective function.

  5. Comparison of three methods for the detection of Trichinella spiralis infections in pigs by five European laboratories*

    PubMed Central

    Kohler, G.; Ruitenberg, E. J.

    1974-01-01

    Three methods employed in the diagnosis of trichinosis (trichinoscopy, digestion method, and immunofluorescence technique) were compared by laboratories in 5 countries of the European Economic Community. For this purpose, material from 32 pigs infected with 50, 150, 500, and 1500 T. spiralis larvae was examined. With none of the three methods was it possible to detect with sufficient reliability a T. spiralis infection in pigs infected with 50 larvae. The digestion method and the immunofluorescence technique yielded more reliable results when the infection dose was 150 larvae or more. With trichinoscopy, reliable results were obtained in pigs infected with 500 and 1500 larvae. With the digestion method and trichinoscopy, the onset of infections was detectable from 3 weeks post infection, the digestion method being more reliable; the immunofluorescence technique yielded positive results from approximately 4-6 weeks post infection. The immunofluorescence technique is applicable for epidemiological surveys. As a routine diagnostic procedure in the slaughterhouse, trichinoscopy and the digestion method are possible alternatives, the latter being more sensitive. PMID:4616776

  6. Modeling of unit operating considerations in generating-capacity reliability evaluation. Volume 1. Mathematical models, computing methods, and results. Final report. [GENESIS, OPCON and OPPLAN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, A.D.; Ayoub, A.K.; Singh, C.

    1982-07-01

    Existing methods for generating capacity reliability evaluation do not explicitly recognize a number of operating considerations which may have important effects on system reliability performance. Thus, current methods may yield estimates of system reliability which differ appreciably from actual observed reliability. Further, current methods offer no means of accurately studying or evaluating alternatives which may differ in one or more operating considerations. Operating considerations which are considered to be important in generating capacity reliability evaluation include: unit duty cycles as influenced by load cycle shape, reliability performance of other units, unit commitment policy, and operating reserve policy; unit start-up failures distinct from unit running failures; unit start-up times; and unit outage postponability and the management of postponable outages. A detailed Monte Carlo simulation computer model called GENESIS and two analytical models called OPCON and OPPLAN have been developed which are capable of incorporating the effects of many operating considerations including those noted above. These computer models have been used to study a variety of actual and synthetic systems and are available from EPRI. The new models are shown to produce system reliability indices which differ appreciably from index values computed using traditional models which do not recognize operating considerations.
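The two-state (up/down) unit model underlying traditional generating-capacity reliability indices can be sketched as a small Monte Carlo simulation. The unit data and load level below are invented for illustration; models such as GENESIS layer load cycles, unit commitment, and start-up behavior on top of this basic sampling scheme.

```python
import random

def simulate_lolp_eens(units, load, n_trials=20000, seed=1):
    """Monte Carlo estimate of loss-of-load probability (LOLP) and expected
    energy not served (EENS, MW per sampled hour) at a constant load level,
    using a two-state unit model.

    units: list of (capacity_mw, forced_outage_rate)
    """
    rng = random.Random(seed)
    loss_count = 0
    ens = 0.0
    for _ in range(n_trials):
        # Sample each unit's availability independently.
        available = sum(cap for cap, forr in units if rng.random() > forr)
        if available < load:
            loss_count += 1
            ens += load - available
    return loss_count / n_trials, ens / n_trials

# Invented four-unit system serving a 500 MW load.
lolp, eens = simulate_lolp_eens(
    [(200, 0.05), (200, 0.05), (150, 0.08), (100, 0.10)], load=500)
```

For this system the exact LOLP (by enumeration) is about 0.105, so the sampled estimate should land close to that; a real study would replace the constant load with an hourly load curve.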

  7. Training and Maintaining System-Wide Reliability in Outcome Management.

    PubMed

    Barwick, Melanie A; Urajnik, Diana J; Moore, Julia E

    2014-01-01

    The Child and Adolescent Functional Assessment Scale (CAFAS) is widely used for outcome management, for providing real-time client- and program-level data, and for monitoring evidence-based practices. Methods of reliability training and the assessment of rater drift are critical for service decision-making within organizations and systems of care. We assessed two approaches for CAFAS training: external technical assistance and internal technical assistance. To this end, we sampled 315 practitioners trained by the external technical assistance approach from 2,344 Ontario practitioners who had achieved reliability on the CAFAS. To assess the internal technical assistance approach as a reliable alternative training method, 140 practitioners trained internally were selected from the same pool of certified raters. Reliabilities were high for practitioners trained by both the external and internal technical assistance approaches (.909-.995 and .915-.997, respectively). One- and three-year estimates showed some drift on several scales. High and consistent reliabilities over time and training method have implications for CAFAS training of behavioral health care practitioners, and for the maintenance of CAFAS as a global outcome management tool in systems of care.

  8. Solid fat content measurement as an alternative to total polar compound analysis

    USDA-ARS?s Scientific Manuscript database

    Monitoring of oxidative degradation in frying oils is essential for determining an appropriate discard time. The most reliable method for monitoring the extent of oxidation in edible oils is the determination of total polar compounds (TPC). However, this method is time-consuming and not practical ...

  9. 75 FR 70752 - Reliability Monitoring, Enforcement and Compliance Issues; Announcement of Panelists for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-18

    ... the Regional Entities set priorities of what to audit, and are they doing a good job setting priorities? Do audits focus too much on documentation? Would alternative auditing methods also demonstrate...

  10. Validation of Alternative In Vitro Methods to Animal Testing: Concepts, Challenges, Processes and Tools.

    PubMed

    Griesinger, Claudius; Desprez, Bertrand; Coecke, Sandra; Casey, Warren; Zuang, Valérie

    This chapter explores the concepts, processes, tools and challenges relating to the validation of alternative methods for toxicity and safety testing. In general terms, validation is the process of assessing the appropriateness and usefulness of a tool for its intended purpose. Validation is routinely used in various contexts in science, technology, the manufacturing and services sectors. It serves to assess the fitness-for-purpose of devices, systems and software, up to entire methodologies. In the area of toxicity testing, validation plays an indispensable role: "alternative approaches" are increasingly replacing animal models as predictive tools and it needs to be demonstrated that these novel methods are fit for purpose. Alternative approaches include in vitro test methods, non-testing approaches such as predictive computer models, and entire testing and assessment strategies composed of method suites, data sources and decision-aiding tools. Data generated with alternative approaches are ultimately used for decision-making on public health and the protection of the environment. It is therefore essential that the underlying methods and methodologies are thoroughly characterised, assessed and transparently documented through validation studies involving impartial actors. Importantly, validation serves as a filter to ensure that only test methods able to produce data that help to address legislative requirements (e.g. the EU's REACH legislation) are accepted as official testing tools and, owing to the globalisation of markets, recognised at the international level (e.g. through inclusion in OECD test guidelines). Since validation creates a credible and transparent evidence base on test methods, it provides a quality stamp, supporting companies developing and marketing alternative methods and creating considerable business opportunities.
Validation of alternative methods is conducted through scientific studies assessing two key hypotheses, reliability and relevance of the test method for a given purpose. Relevance encapsulates the scientific basis of the test method, its capacity to predict adverse effects in the "target system" (i.e. human health or the environment) as well as its applicability for the intended purpose. In this chapter we focus on the validation of non-animal in vitro alternative testing methods and review the concepts, challenges, processes and tools fundamental to the validation of in vitro methods intended for hazard testing of chemicals. We explore major challenges and peculiarities of validation in this area. Based on the notion that validation per se is a scientific endeavour that needs to adhere to key scientific principles, namely objectivity and appropriate choice of methodology, we examine basic aspects of study design and management, and provide illustrations of statistical approaches to describe predictive performance of validated test methods as well as their reliability.

  11. Validity and Reliability of Visual Analog Scaling for Assessment of Hypernasality and Audible Nasal Emission in Children With Repaired Cleft Palate.

    PubMed

    Baylis, Adriane; Chapman, Kathy; Whitehill, Tara L; The Americleft Speech Group

    2015-11-01

    To investigate the validity and reliability of multiple listener judgments of hypernasality and audible nasal emission, in children with repaired cleft palate, using visual analog scaling (VAS) and equal-appearing interval (EAI) scaling. Prospective comparative study of multiple listener ratings of hypernasality and audible nasal emission. Multisite institutional. Five trained and experienced speech-language pathologist listeners from the Americleft Speech Project. Average VAS and EAI ratings of hypernasality and audible nasal emission/turbulence for 12 video-recorded speech samples from the Americleft Speech Project. Intrarater and interrater reliability were computed, along with linear and polynomial models of best fit. Intrarater and interrater reliability was acceptable for both rating methods; however, reliability was higher for VAS as compared to EAI ratings. When VAS ratings were plotted against EAI ratings, results revealed a stronger curvilinear relationship. The results of this study provide additional evidence that alternate rating methods such as VAS may offer improved validity and reliability over EAI ratings of speech. VAS should be considered a viable method for rating hypernasality and nasal emission in speech in children with repaired cleft palate.

  12. Regional Reliability of Quantitative Signal Targeting with Alternating Radiofrequency (STAR) Labeling of Arterial Regions (QUASAR)

    PubMed Central

    Tatewaki, Yasuko; Higano, Shuichi; Taki, Yasuyuki; Thyreau, Benjamin; Murata, Takaki; Mugikura, Shunji; Ito, Daisuke; Takase, Kei; Takahashi, Shoki

    2014-01-01

    BACKGROUND AND PURPOSE Quantitative signal targeting with alternating radiofrequency labeling of arterial regions (QUASAR) is a recent spin labeling technique that could improve the reliability of brain perfusion measurements. Although it is considered reliable for measuring gray matter as a whole, it has never been evaluated regionally. Here we assessed this regional reliability. METHODS Using a 3-Tesla Philips Achieva whole-body system, we scanned 10 healthy volunteers four times, in two sessions 2 weeks apart, to obtain QUASAR images. We computed perfusion images and ran a voxel-based analysis within all brain structures. We also calculated mean regional cerebral blood flow (rCBF) within regions of interest configured for each arterial territory distribution. RESULTS The mean CBF over whole gray matter was 37.74 with an intraclass correlation coefficient (ICC) of .70. In white matter, it was 13.94 with an ICC of .30. Voxel-wise ICC and coefficient-of-variation maps showed relatively lower reliability in watershed areas and white matter, especially in deeper white matter. The absolute mean rCBF values were consistent with those reported from PET, as was the relatively low variability across different feeding arteries. CONCLUSIONS Thus, QUASAR reliability for regional perfusion is high within gray matter, but uncertain within white matter. PMID:25370338
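Test-retest reliabilities like those above are typically computed as an intraclass correlation on a subjects-by-sessions matrix. A minimal sketch of one common form, the two-way random, absolute-agreement, single-measures ICC(2,1) of Shrout and Fleiss, is shown below; the abstract does not state which ICC variant was used, so treat this as illustrative.

```python
def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single measures.
    data: one row per subject, each row holding k repeated measurements."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    # Two-way ANOVA decomposition of the total sum of squares.
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ms_r = ss_rows / (n - 1)
    ms_c = ss_cols / (k - 1)
    ms_e = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)
```

Perfect agreement across sessions yields 1.0, while a constant between-session offset lowers ICC(2,1) because it penalizes absolute disagreement, not just inconsistency.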

  13. Techniques for control of long-term reliability of complex integrated circuits. I - Reliability assurance by test vehicle qualification.

    NASA Technical Reports Server (NTRS)

    Van Vonno, N. W.

    1972-01-01

    Development of an alternate approach to the conventional methods of reliability assurance for large-scale integrated circuits. The product treated is a large-scale T²L array designed for space applications. The concept used is that of qualification of product by evaluation of the basic processing used in fabricating the product, providing an insight into its potential reliability. Test vehicles are described which enable evaluation of device characteristics, surface condition, and various parameters of the two-level metallization system used. Evaluation of these test vehicles is performed on a lot qualification basis, with the lot consisting of one wafer. Assembled test vehicles are evaluated by high temperature stress at 300 C for short time durations. Stressing at these temperatures provides a rapid method of evaluation and permits a go/no go decision to be made on the wafer lot in a timely fashion.

  14. Reliability Evaluation Method with Weibull Distribution for Temporary Overvoltages of Substation Equipment

    NASA Astrophysics Data System (ADS)

    Okabe, Shigemitsu; Tsuboi, Toshihiro; Takami, Jun

    The power-frequency withstand voltage tests on electric power equipment are regulated in JEC by evaluating lifetime reliability with a Weibull distribution function. The evaluation method remains controversial with respect to how a plural number of faults is treated, and several alternative methods have been proposed on this subject. The present paper first discusses the physical meanings of the various evaluation methods and secondly examines their effects on the power-frequency withstand voltage tests. Further, an appropriate method is investigated for an oil-filled transformer and a gas insulated switchgear, taking note of the dielectric breakdown or partial discharge mechanism under various insulating material and structure conditions; the tentative conclusion is that the conventional method would be most pertinent under the present conditions.
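The Weibull lifetime model referred to above has survival function R(t) = exp[-(t/η)^β]. A minimal sketch, including the weakest-link scaling that arises when several independent fault sites act in series (all parameter values below are invented for illustration):

```python
import math

def weibull_reliability(t, beta, eta):
    """Probability of surviving to time t: R(t) = exp(-(t/eta)**beta)."""
    return math.exp(-((t / eta) ** beta))

def series_reliability(t, beta, eta, m):
    """m independent, identical weakest-link fault sites in series:
    R_sys = R(t)**m, equivalent to one Weibull with scale eta * m**(-1/beta)."""
    return weibull_reliability(t, beta, eta) ** m
```

The equivalence in `series_reliability` is why accounting for a plural number of faults effectively shortens the characteristic life: doubling the number of sites scales η by 2^(-1/β).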

  15. Examinations of electron temperature calculation methods in Thomson scattering diagnostics.

    PubMed

    Oh, Seungtae; Lee, Jong Ha; Wi, Hanmin

    2012-10-01

    Electron temperature from the Thomson scattering diagnostic is derived through indirect calculation based on a theoretical model. A chi-square (χ²) test is commonly used in the calculation, and the reliability of the calculation method depends strongly on the noise level of the input signals. In simulations, the noise sensitivity of the χ² test is examined, and a scale-factor test is proposed as an alternative method.
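The χ²-based fit can be illustrated generically: given per-channel signals, a forward model s(Te), and noise estimates, Te is taken as the minimizer of χ². The two-channel response model below is invented purely for illustration and is not the actual Thomson scattering spectral model.

```python
import math

def chi_square(observed, predicted, sigma):
    """Sum of squared, noise-normalized residuals."""
    return sum(((o - p) / s) ** 2 for o, p, s in zip(observed, predicted, sigma))

def fit_te(observed, sigma, model, te_grid):
    """Grid search for the Te (keV) minimizing chi-square."""
    best_te, best_chi2 = None, float("inf")
    for te in te_grid:
        c = chi_square(observed, model(te), sigma)
        if c < best_chi2:
            best_te, best_chi2 = te, c
    return best_te, best_chi2

def model(te):
    # Hypothetical two-channel spectral response (illustrative only).
    return [math.exp(-1.0 / te), math.exp(-2.0 / te)]

te_grid = [0.1 * k for k in range(1, 100)]
obs = model(2.0)  # noise-free synthetic signals at Te = 2.0 keV
best_te, best_chi2 = fit_te(obs, [0.01, 0.01], model, te_grid)
```

With noise-free synthetic data the grid search recovers the true Te; adding noise to `obs` is the natural way to reproduce the kind of noise-sensitivity study the abstract describes.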

  16. easyCBM Beginning Reading Measures: Grades K-1 Alternate Form Reliability and Criterion Validity with the SAT-10. Technical Report #1403

    ERIC Educational Resources Information Center

    Wray, Kraig; Lai, Cheng-Fei; Sáez, Leilani; Alonzo, Julie; Tindal, Gerald

    2013-01-01

    We report the results of an alternate form reliability and criterion validity study of kindergarten and grade 1 (N = 84-199) reading measures from the easyCBM© assessment system and Stanford Early School Achievement Test/Stanford Achievement Test, 10th edition (SESAT/SAT-10) across 5 time points. The alternate form reliabilities ranged from…

  17. Qualitative Analysis: The Current Status.

    ERIC Educational Resources Information Center

    Cole, G. Mattney, Jr.; Waggoner, William H.

    1983-01-01

    To assist in designing/implementing qualitative analysis courses, examines reliability/accuracy of several published separation schemes, notes methods where particular difficulties arise (focusing on Groups II/III), and presents alternative schemes for the separation of these groups. Only cation analyses are reviewed. Figures are presented in…

  18. Monitoring visitor satisfaction: a comparison of comment cards and more in-depth surveys

    Treesearch

    Alan R. Graefe; James D. Absher; Robert C. Burns

    2001-01-01

    This paper compares responses to comment cards and more detailed on-site surveys at selected Corps of Engineers lakes. The results shed light on the validity, reliability, and usefulness of these alternative methods of monitoring customer satisfaction.

  19. A Comparison of Three Methods for the Analysis of Skin Flap Viability: Reliability and Validity.

    PubMed

    Tim, Carla Roberta; Martignago, Cintia Cristina Santi; da Silva, Viviane Ribeiro; Dos Santos, Estefany Camila Bonfim; Vieira, Fabiana Nascimento; Parizotto, Nivaldo Antonio; Liebano, Richard Eloin

    2018-05-01

    Objective: Technological advances have provided new alternatives to the analysis of skin flap viability in animal models; however, the interrater validity and reliability of these techniques have yet to be analyzed. The present study aimed to evaluate the interrater validity and reliability of three different methods: weight of paper template (WPT), paper template area (PTA), and photographic analysis. Approach: Sixteen male Wistar rats had their cranially based dorsal skin flap elevated. On the seventh postoperative day, the viable tissue area and the necrotic area of the skin flap were recorded using the paper template method and photo image. The evaluation of the percentage of viable tissue was performed using the three methods, simultaneously and independently, by two raters. The analysis of interrater reliability and viability was performed using the intraclass correlation coefficient, and Bland-Altman plot analysis was used to visualize the presence or absence of systematic bias in the evaluations of data validity. Results: The results showed that interrater reliability for WPT, measurement of PTA, and photographic analysis was 0.995, 0.990, and 0.982, respectively. For data validity, a correlation >0.90 was observed for all comparisons made between the three methods. In addition, Bland-Altman plot analysis showed agreement between the comparisons of the methods, and the presence of systematic bias was not observed. Innovation: Digital methods are an excellent choice for assessing skin flap viability; moreover, they make data use and storage easier. Conclusion: Independently of the method used, the interrater reliability and validity proved to be excellent for the analysis of skin flap viability.
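At its core, the Bland-Altman analysis mentioned above reduces to the bias (mean inter-rater difference) and the 95% limits of agreement. A minimal sketch, with invented paired readings:

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Bias and 95% limits of agreement between paired measurements
    from two raters."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    spread = 1.96 * stdev(diffs)
    return bias, bias - spread, bias + spread

# Invented paired readings from two raters (percent viable tissue).
bias, low, high = bland_altman([62.0, 71.5, 48.0, 85.0],
                               [61.0, 73.0, 47.5, 84.0])
```

Systematic bias would show up as a bias far from zero relative to the limits of agreement; here the invented readings produce a small bias well inside the limits.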

  20. Reliability Sensitivity Analysis and Design Optimization of Composite Structures Based on Response Surface Methodology

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    2003-01-01

    This report discusses the development and application of two alternative strategies in the form of global and sequential local response surface (RS) techniques for the solution of reliability-based optimization (RBO) problems. The problem of a thin-walled composite circular cylinder under axial buckling instability is used as a demonstrative example. In this case, the global technique uses a single second-order RS model to estimate the axial buckling load over the entire feasible design space (FDS) whereas the local technique uses multiple first-order RS models with each applied to a small subregion of FDS. Alternative methods for the calculation of unknown coefficients in each RS model are explored prior to the solution of the optimization problem. The example RBO problem is formulated as a function of 23 uncorrelated random variables that include material properties, thickness and orientation angle of each ply, cylinder diameter and length, as well as the applied load. The mean values of the 8 ply thicknesses are treated as independent design variables. While the coefficients of variation of all random variables are held fixed, the standard deviations of ply thicknesses can vary during the optimization process as a result of changes in the design variables. The structural reliability analysis is based on the first-order reliability method with reliability index treated as the design constraint. In addition to the probabilistic sensitivity analysis of reliability index, the results of the RBO problem are presented for different combinations of cylinder length and diameter and laminate ply patterns. The two strategies are found to produce similar results in terms of accuracy with the sequential local RS technique having a considerably better computational efficiency.

  1. Choosing the optimal wind turbine variant using the ”ELECTRE” method

    NASA Astrophysics Data System (ADS)

    Ţişcă, I. A.; Anuşca, D.; Dumitrescu, C. D.

    2017-08-01

    This paper presents a method of choosing the “optimal” alternative, both under certainty and under uncertainty, based on relevant analysis criteria. Taking into account that a product can be assimilated to a system, and that the reliability of the system depends on the reliability of its components, the choice of product (the appropriate system decision) can be made using the “ELECTRE” method, depending on the level of reliability of each product. In the paper, the “ELECTRE” method is used in choosing the optimal version of a wind turbine required to equip a wind farm in western Romania. The problems to be solved are related to the current situation of wind turbines, which involves reliability problems. A set of criteria has been proposed to compare two or more products from a range of available products: operating conditions, environmental conditions during operation, and time requirements. Using the hierarchical ELECTRE method, the obtained concordance coefficients determine the optimal wind turbine variant and the order of preference among the variants, the values chosen as limits being arbitrary.
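The concordance coefficients at the heart of ELECTRE can be sketched as follows; the criteria scores and weights are invented, and a full ELECTRE application would add discordance indices and the (arbitrary) concordance/discordance thresholds mentioned above.

```python
def concordance(a, b, weights):
    """ELECTRE concordance index: weighted fraction of criteria on which
    alternative a scores at least as well as b (all criteria to maximize)."""
    return sum(w for x, y, w in zip(a, b, weights) if x >= y) / sum(weights)

# Two hypothetical turbine variants scored on three criteria
# (operating conditions, environmental conditions, time requirements).
turbine_a = [3, 5, 2]
turbine_b = [4, 2, 2]
weights = [1, 2, 1]
c_ab = concordance(turbine_a, turbine_b, weights)
c_ba = concordance(turbine_b, turbine_a, weights)
```

An alternative outranks another when its concordance index clears the chosen threshold (and the discordance index stays below its own limit); here `c_ab` > `c_ba`, so variant A would be preferred at most thresholds.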

  2. Evaluation of capillary zone electrophoresis for the determination of protein composition in therapeutic immunoglobulins and human albumins.

    PubMed

    Christians, Stefan; van Treel, Nadine Denise; Bieniara, Gabriele; Eulig-Wien, Annika; Hanschmann, Kay-Martin; Giess, Siegfried

    2016-07-01

    Capillary zone electrophoresis (CZE) provides an alternative means of separating native proteins on the basis of their inherent electrophoretic mobilities. The major advantage of CZE is the quantification by UV detection, circumventing the drawbacks of staining and densitometry in the case of gel electrophoresis methods. The data of this validation study showed that CZE is a reliable assay for the determination of protein composition in therapeutic preparations of human albumin and human polyclonal immunoglobulins. Data obtained by CZE are in line with "historical" data obtained by the compendial method, provided that peak integration is performed without time correction. The focus here was to establish a rapid and reliable test to substitute the current gel based zone electrophoresis techniques for the control of protein composition of human immunoglobulins or albumins in the European Pharmacopoeia. We believe that the more advanced and modern CZE method described here is a very good alternative to the procedures currently described in the relevant monographs. Copyright © 2016 International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.

  3. The reliability and internal consistency of one-shot and flicker change detection for measuring individual differences in visual working memory capacity.

    PubMed

    Pailian, Hrag; Halberda, Justin

    2015-04-01

    We investigated the psychometric properties of the one-shot change detection task for estimating visual working memory (VWM) storage capacity, and also introduced and tested an alternative flicker change detection task for estimating these limits. In three experiments, we found that the one-shot whole-display task returns estimates of VWM storage capacity (K) that are unreliable across set sizes, suggesting that the whole-display task is measuring different things at different set sizes. In two additional experiments, we found that the one-shot single-probe variant shows improvements in the reliability and consistency of K estimates. In another additional experiment, we found that a one-shot whole-display-with-click task (requiring target localization) also showed improvements in reliability and consistency. The latter results suggest that the one-shot task can return reliable and consistent estimates of VWM storage capacity (K), and they highlight the possibility that the requirement to localize the changed target is what engenders this enhancement. Through a final series of four experiments, we introduced and tested an alternative flicker change detection method that also requires the observer to localize the changing target and that generates, from response times, an estimate of VWM storage capacity (K). We found that estimates of K from the flicker task correlated with estimates from the traditional one-shot task and also had high reliability and consistency. We highlight the flicker method's ability to estimate executive functions as well as VWM storage capacity, and discuss the potential for measuring multiple abilities with the one-shot and flicker tasks.
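The K estimates discussed above come from standard correction-for-guessing formulas: Cowan's K for single-probe designs and Pashler's K for whole-display designs. A minimal sketch:

```python
def cowan_k(hit_rate, fa_rate, set_size):
    """Cowan's K for single-probe change detection: K = N * (H - FA)."""
    return set_size * (hit_rate - fa_rate)

def pashler_k(hit_rate, fa_rate, set_size):
    """Pashler's K for whole-display change detection:
    K = N * (H - FA) / (1 - FA)."""
    return set_size * (hit_rate - fa_rate) / (1.0 - fa_rate)
```

Because the two formulas make different guessing assumptions, the same hit and false-alarm rates yield different K values, which is one reason K estimates can disagree across task variants and set sizes.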

  4. Evaluation of a Method for Rapid Detection of Listeria monocytogenes in Dry-Cured Ham Based on Impedanciometry Combined with Chromogenic Agar.

    PubMed

    Labrador, Mirian; Rota, María C; Pérez, Consuelo; Herrera, Antonio; Bayarri, Susana

    2018-05-01

    The food industry is in need of rapid, reliable methodologies for the detection of Listeria monocytogenes in ready-to-eat products, as an alternative to the International Organization for Standardization (ISO) 11290-1 reference method. The aim of this study was to evaluate impedanciometry combined with chromogenic agar culture for the detection of L. monocytogenes in dry-cured ham. The experimental setup consisted of assaying four strains of L. monocytogenes and two strains of Listeria innocua in pure culture. The method was evaluated according to the ISO 16140:2003 standard through a comparative study with the ISO reference method with 119 samples of dry-cured ham. Significant determination coefficients (R² of up to 0.99) for all strains assayed in pure culture were obtained. The comparative study results had 100% accuracy, 100% specificity, and 100% sensitivity. Impedanciometry followed by chromogenic agar culture was capable of detecting 1 CFU/25 g of food. L. monocytogenes was not detected in the 65 commercial samples tested. The method evaluated herein represents a promising alternative for the food industry in its efforts to control L. monocytogenes. Overall analysis time is shorter, and the method permits a straightforward analysis of a large number of samples with reliable results.

  5. The composting option for human waste disposal in the backcountry

    Treesearch

    S. C. Fay; R. H. Walke

    1977-01-01

    The disposal of human waste by composting at backcountry recreation areas is a possible alternative to methods that are considered unsafe. The literature indicates that aerobic, thermophilic composting is a reliable disposal method that can be low in cost and in maintenance. A bark-sewage mixture can be composted to produce a pathogen-free substance that might be used...

  6. Preliminary Solar Sail Design and Fabrication Assessment: Spinning Sail Blade, Square Sail Sheet

    NASA Technical Reports Server (NTRS)

    Daniels, J. B.; Dowdle, D. M.; Hahn, D. W.; Hildreth, E. N.; Lagerquist, D. R.; Mahaonoul, E. J.; Munson, J. B.; Origer, T. F.

    1977-01-01

    Blade design aspects most affecting producibility and means of measurement and control of length, scallop, fullness and straightness requirements and tolerances were extensively considered. Alternate designs of the panel seams and edge reinforcing members are believed to offer advantages of seam integrity, producibility, reliability, cost and weight. Approaches to and requirements for highly specialized metalizing methods, processes and equipment were studied and identified. Alternate methods of sail blade fabrication and related special machinery, tooling, fixtures and trade offs were examined. A preferred and recommended approach is also described. Quality control plans, inspection procedures, flow charts and special test equipment associated with the preferred manufacturing method were analyzed and are discussed.

  7. Sarma-based key-group method for rock slope reliability analyses

    NASA Astrophysics Data System (ADS)

    Yarahmadi Bafghi, A. R.; Verdel, T.

    2005-08-01

    The methods used in conducting static stability analyses have remained pertinent to this day for reasons of both simplicity and speed of execution. The most well-known of these methods for purposes of stability analysis of fractured rock masses is the key-block method (KBM). This paper proposes an extension to the KBM, called the key-group method (KGM), which combines not only individual key-blocks but also groups of collapsable blocks into an iterative and progressive analysis of the stability of discontinuous rock slopes. To take intra-group forces into account, the Sarma method has been implemented within the KGM in order to generate a Sarma-based KGM, abbreviated SKGM. We will discuss herein the hypothesis behind this new method, details regarding its implementation, and validation through comparison with results obtained from the distinct element method. Furthermore, as an alternative to deterministic methods, reliability analyses or probabilistic analyses have been proposed to take account of the uncertainty in analytical parameters and models. The FOSM and ASM probabilistic methods could be implemented within the KGM and SKGM framework in order to take account of the uncertainty due to physical and mechanical data (density, cohesion and angle of friction). We will then show how such reliability analyses can be introduced into SKGM to give rise to the probabilistic SKGM (PSKGM) and how it can be used for rock slope reliability analyses.
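The FOSM approach mentioned above propagates input means and standard deviations through a limit-state function g(x) to a reliability index β = μ_g/σ_g. A sketch with central finite-difference gradients follows; the planar-slide margin function and every parameter value in it are invented for illustration, not taken from the paper.

```python
import math

def fosm_beta(g, means, sds, h=1e-6):
    """First-order second-moment reliability index for limit state g(x) >= 0,
    assuming uncorrelated inputs; gradients by central finite differences."""
    mu_g = g(means)
    var_g = 0.0
    for i, (m, s) in enumerate(zip(means, sds)):
        up = list(means); up[i] = m + h
        dn = list(means); dn[i] = m - h
        grad = (g(up) - g(dn)) / (2 * h)
        var_g += (grad * s) ** 2
    return mu_g / math.sqrt(var_g)

def slope_margin(x):
    # x = [cohesion_kPa, friction_angle_rad]; geometry held fixed (invented).
    c, phi = x
    area, weight, theta = 12.0, 300.0, math.radians(35)
    return c * area + weight * math.cos(theta) * math.tan(phi) \
        - weight * math.sin(theta)

beta = fosm_beta(slope_margin, [20.0, math.radians(30)],
                 [4.0, math.radians(3)])
```

For a linear margin g = R - S the same routine reproduces the textbook result β = (μR - μS)/√(σR² + σS²), which is a quick sanity check on the implementation.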

  8. JaCVAM-organized international validation study of the in vivo rodent alkaline comet assay for detection of genotoxic carcinogens: II. Summary of definitive validation study results.

    PubMed

    Uno, Yoshifumi; Kojima, Hajime; Omori, Takashi; Corvi, Raffaella; Honma, Masamistu; Schechtman, Leonard M; Tice, Raymond R; Beevers, Carol; De Boeck, Marlies; Burlinson, Brian; Hobbs, Cheryl A; Kitamoto, Sachiko; Kraynak, Andrew R; McNamee, James; Nakagawa, Yuzuki; Pant, Kamala; Plappert-Helbig, Ulla; Priestley, Catherine; Takasawa, Hironao; Wada, Kunio; Wirnitzer, Uta; Asano, Norihide; Escobar, Patricia A; Lovell, David; Morita, Takeshi; Nakajima, Madoka; Ohno, Yasuo; Hayashi, Makoto

    2015-07-01

    The in vivo rodent alkaline comet assay (comet assay) is used internationally to investigate the in vivo genotoxic potential of test chemicals. This assay, however, has not previously been formally validated. The Japanese Center for the Validation of Alternative Methods (JaCVAM), with the cooperation of the U.S. NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM)/the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), the European Centre for the Validation of Alternative Methods (ECVAM), and the Japanese Environmental Mutagen Society/Mammalian Mutagenesis Study Group (JEMS/MMS), organized an international validation study to evaluate the reliability and relevance of the assay for identifying genotoxic carcinogens, using liver and stomach as target organs. The ultimate goal of this exercise was to establish an Organisation for Economic Co-operation and Development (OECD) test guideline. The study protocol was optimized in the pre-validation studies, and then the definitive (4th phase) validation study was conducted in two steps. In the 1st step, assay reproducibility was confirmed among laboratories using four coded reference chemicals and the positive control ethyl methanesulfonate. In the 2nd step, the predictive capability was investigated using 40 coded chemicals with known genotoxic and carcinogenic activity (i.e., genotoxic carcinogens, genotoxic non-carcinogens, non-genotoxic carcinogens, and non-genotoxic non-carcinogens). Based on the results obtained, the in vivo comet assay is concluded to be highly capable of identifying genotoxic chemicals and therefore can serve as a reliable predictor of rodent carcinogenicity. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Reliability based design including future tests and multiagent approaches

    NASA Astrophysics Data System (ADS)

    Villanueva, Diane

    The initial stages of reliability-based design optimization involve the formulation of objective functions and constraints, and building a model to estimate the reliability of the design with quantified uncertainties. However, even experienced hands often overlook important objective functions and constraints that affect the design. In addition, uncertainty reduction measures, such as tests and redesign, are often not considered in reliability calculations during the initial stages. This research considers two areas that concern the design of engineering systems: 1) the trade-off of the effect of a test and post-test redesign on reliability and cost and 2) the search for multiple candidate designs as insurance against unforeseen faults in some designs. In this research, a methodology was developed to estimate the effect of a single future test and post-test redesign on reliability and cost. The methodology uses assumed distributions of computational and experimental errors with redesign rules to simulate alternative future test and redesign outcomes to form a probabilistic estimate of the reliability and cost for a given design. Further, it was explored how modeling a future test and redesign provides a company with an opportunity to balance development costs versus performance by simultaneously choosing the design and the post-test redesign rules during the initial design stage. The second area of this research considers the use of dynamic local surrogates, or surrogate-based agents, to locate multiple candidate designs. Surrogate-based global optimization algorithms often require search in multiple candidate regions of design space, expending most of the computation needed to define multiple alternate designs. Thus, focusing solely on locating the best design may be wasteful. We extended adaptive sampling surrogate techniques to locate multiple optima by building local surrogates in sub-regions of the design space to identify optima.
The efficiency of this method was studied, and the method was compared to other surrogate-based optimization methods that aim to locate the global optimum using two two-dimensional test functions, a six-dimensional test function, and a five-dimensional engineering example.
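    The idea of simulating alternative future test and redesign outcomes can be sketched with a small Monte Carlo loop. Everything below is illustrative, not the dissertation's actual model: the margin, error standard deviations, redesign threshold, gain, and cost are hypothetical values chosen only to show the mechanics.

```python
import random

def simulate_future_test(n_trials=100_000, true_margin=0.10,
                         comp_err_sd=0.05, test_err_sd=0.02,
                         redesign_threshold=0.0, redesign_gain=0.08,
                         redesign_cost=1.0):
    """Monte Carlo over possible future test outcomes.

    Each trial draws a computational error (design margin as built differs
    from the predicted margin) and an experimental error (the test measures
    the built margin imperfectly). If the measured margin falls below the
    threshold, an assumed redesign rule fires, adding margin at a cost.
    Returns (probability the final design fails, expected redesign cost).
    """
    failures = 0
    cost = 0.0
    for _ in range(n_trials):
        comp_err = random.gauss(0.0, comp_err_sd)      # computational error
        actual_margin = true_margin + comp_err         # margin as built
        measured = actual_margin + random.gauss(0.0, test_err_sd)
        if measured < redesign_threshold:              # redesign rule fires
            actual_margin += redesign_gain
            cost += redesign_cost
        if actual_margin < 0.0:                        # design fails
            failures += 1
    return failures / n_trials, cost / n_trials

random.seed(1)
p_fail, exp_cost = simulate_future_test()
print(p_fail, exp_cost)
```

Varying `redesign_threshold` and `redesign_gain` trades expected redesign cost against final reliability, which is the balance the abstract describes.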

  10. Secondary Ion Mass Spectrometry for Mg Tracer Diffusion: Issues and Solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuggle, Jay; Giordani, Andrew; Kulkarni, Nagraj S

    2014-01-01

    A Secondary Ion Mass Spectrometry (SIMS) method has been developed to measure stable Mg isotope tracer diffusion. This SIMS method was then used to calculate Mg self-diffusivities, and the data were verified against historical data measured using radio tracers. The SIMS method has been validated as a reliable alternative to the radio-tracer technique for the measurement of Mg self-diffusion coefficients and can be used as a routine method for determining diffusion coefficients.

  11. Reliability of the Inverse Water Volumetry Method to Measure the Volume of the Upper Limb.

    PubMed

    Beek, Martinus A; te Slaa, Alexander; van der Laan, Lijckle; Mulder, Paul G H; Rutten, Harm J T; Voogd, Adri C; Luiten, Ernest J T; Gobardhan, Paul D

    2015-06-01

    Lymphedema of the upper extremity is a common side effect of lymph node dissection or irradiation of the axilla. Several techniques are being applied in order to examine the presence and severity of lymphedema. Measurement of circumference of the upper extremity is most frequently performed. An alternative is the water-displacement method. The aim of this study was to determine the reliability and the reproducibility of the "Inverse Water Volumetry apparatus" (IWV-apparatus) for the measurement of arm volumes. The IWV-apparatus is based on the water-displacement method. Measurements were performed by three breast cancer nurse practitioners on ten healthy volunteers in three weekly sessions. The intra-class correlation coefficient, defined as the ratio of the subject component to the total variance, equaled 0.99. The reliability index was calculated as 0.14 kg. This indicates that only changes in a patient's arm volume measurement of more than 0.14 kg would represent a true change in arm volume, which is about 6% of the mean arm volume of 2.3 kg. The IWV-apparatus proved to be a reliable and reproducible method to measure arm volume.
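    The intra-class correlation coefficient defined here (subject variance component over total variance) can be sketched with a one-way random-effects estimator. The arm-volume values below are hypothetical, not data from the study:

```python
import numpy as np

def icc_oneway(data):
    """One-way random-effects ICC(1,1): ratio of the between-subject
    variance component to total variance, from an n_subjects x n_raters array."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand_mean = data.mean()
    subject_means = data.mean(axis=1)
    # Between- and within-subject mean squares (one-way ANOVA decomposition)
    ms_between = k * np.sum((subject_means - grand_mean) ** 2) / (n - 1)
    ms_within = np.sum((data - subject_means[:, None]) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical arm-volume measurements (kg) for 4 subjects by 3 raters
measurements = [
    [2.30, 2.31, 2.29],
    [2.10, 2.12, 2.11],
    [2.55, 2.54, 2.56],
    [1.95, 1.96, 1.94],
]
print(round(icc_oneway(measurements), 2))
```

An ICC near 1 means almost all observed variance comes from true differences between subjects rather than measurement error, which is what the reported 0.99 indicates.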

  12. Validity and reliability of bioelectrical impedance analysis and skinfold thickness in predicting body fat in military personnel.

    PubMed

    Aandstad, Anders; Holtberget, Kristian; Hageberg, Rune; Holme, Ingar; Anderssen, Sigmund A

    2014-02-01

    Previous studies show that body composition is related to injury risk and physical performance in soldiers. Thus, valid methods for measuring body composition in military personnel are needed. The frequently used body mass index method is not a valid measure of body composition in soldiers, but reliability and validity of alternative field methods are less investigated in military personnel. Thus, we carried out test and retest of skinfold (SKF), single frequency bioelectrical impedance analysis (SF-BIA), and multifrequency bioelectrical impedance analysis measurements in 65 male and female soldiers. Several validated equations were used to predict percent body fat from these methods. Dual-energy X-ray absorptiometry was also measured, and acted as the criterion method. Results showed that SF-BIA was the most reliable method in both genders. In women, SF-BIA was also the most valid method, whereas SKF or a combination of SKF and SF-BIA produced the highest validity in men. Reliability and validity varied substantially among the equations examined. The best methods and equations produced test-retest 95% limits of agreement below ±1% points, whereas the corresponding validity figures were ±3.5% points. Each investigator and practitioner must consider whether such measurement errors are acceptable for their specific use. Reprint & Copyright © 2014 Association of Military Surgeons of the U.S.

  13. Navigational Traffic Conflict Technique: A Proactive Approach to Quantitative Measurement of Collision Risks in Port Waters

    NASA Astrophysics Data System (ADS)

    Debnath, Ashim Kumar; Chin, Hoong Chor

    Navigational safety analysis relying on collision statistics is often hampered because of the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions this approach proactively measures collision risk in port waters. The proposed method is illustrated for quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks with those perceived by pilots. This method is an ethically appealing alternative to the collision-based analysis for fast, reliable and effective safety assessment, thus possessing great potential for managing collision risks in port waters.

  14. Spurious correlations and inference in landscape genetics

    Treesearch

    Samuel A. Cushman; Erin L. Landguth

    2010-01-01

    Reliable interpretation of landscape genetic analyses depends on statistical methods that have high power to identify the correct process driving gene flow while rejecting incorrect alternative hypotheses. Little is known about statistical power and inference in individual-based landscape genetics. Our objective was to evaluate the power of causal modelling with partial...

  15. Sensitivity of wildlife habitat models to uncertainties in GIS data

    NASA Technical Reports Server (NTRS)

    Stoms, David M.; Davis, Frank W.; Cogan, Christopher B.

    1992-01-01

    Decision makers need to know the reliability of output products from GIS analysis. For many GIS applications, it is not possible to compare these products to an independent measure of 'truth'. Sensitivity analysis offers an alternative means of estimating reliability. In this paper, we present a GIS-based statistical procedure for estimating the sensitivity of wildlife habitat models to uncertainties in input data and model assumptions. The approach is demonstrated in an analysis of habitat associations derived from a GIS database for the endangered California condor. Alternative data sets were generated to compare results over a reasonable range of assumptions about several sources of uncertainty. Sensitivity analysis indicated that condor habitat associations are relatively robust, and the results have increased our confidence in our initial findings. Uncertainties and methods described in the paper have general relevance for many GIS applications.

  16. Reliability assessment of slender concrete columns at the stability failure

    NASA Astrophysics Data System (ADS)

    Valašík, Adrián; Benko, Vladimír; Strauss, Alfred; Täubling, Benjamin

    2018-01-01

    The European Standard for designing concrete columns with non-linear methods shows deficiencies in terms of global reliability when concrete columns fail by loss of stability. Buckling is a brittle failure which occurs without warning, and the probability of its occurrence depends on the column's slenderness. Experiments with slender concrete columns were carried out in cooperation with STRABAG Bratislava LTD in the Central Laboratory of the Faculty of Civil Engineering SUT in Bratislava. The following article aims to compare the global reliability of slender concrete columns with slenderness of 90 and higher. The columns were designed according to methods offered by EN 1992-1-1 [1]. The mentioned experiments were used as the basis for deterministic nonlinear modelling of the columns and subsequently for the probabilistic evaluation of structural response variability. Final results may be utilized as thresholds for loading of produced structural elements, and they aim to present probabilistic design as less conservative than classic partial-safety-factor-based design and the alternative ECOV method.

  17. Requirements for diagnosis of malaria at different levels of the laboratory network in Africa.

    PubMed

    Long, Earl G

    2009-06-01

    The rapid increase of resistance to cheap, reliable antimalarials, the increasing cost of effective drugs, and the low specificity of clinical diagnosis have increased the need for more reliable diagnostic methods for malaria. The most commonly used and most reliable remains microscopic examination of stained blood smears, but this technique requires skilled personnel, precision instruments, and ideally a source of electricity. Microscopy has the advantage of enabling the examiner to identify the species, stage, and density of an infection. An alternative to microscopy is the rapid diagnostic test (RDT), which uses a labeled monoclonal antibody to detect circulating parasitic antigens. This test is most commonly used to detect Plasmodium falciparum infections and is available in a plastic cassette format. Both microscopy and RDTs should be available at all levels of laboratory service in endemic areas, but in peripheral laboratories with minimally trained staff, the RDT may be a more practical diagnostic method.

  18. Oxygen production using solid-state zirconia electrolyte technology

    NASA Technical Reports Server (NTRS)

    Suitor, Jerry W.; Clark, Douglas J.

    1991-01-01

    High purity oxygen is required for a number of scientific, medical, and industrial applications. Traditionally, these needs have been met by cryogenic distillation or pressure swing adsorption systems designed to separate oxygen from air. Oxygen separation from air via solid-state zirconia electrolyte technology offers an alternative to these methods. The technology has several advantages over the traditional methods, including reliability, compactness, quiet operation, high purity output, and low power consumption.

  19. Selection of reference standard during method development using the analytical hierarchy process.

    PubMed

    Sun, Wan-yang; Tong, Ling; Li, Dong-xiang; Huang, Jing-yi; Zhou, Shui-ping; Sun, Henry; Bi, Kai-shun

    2015-03-25

    Reference standard is critical for ensuring reliable and accurate method performance. One important issue is how to select the ideal one from the alternatives. Unlike the optimization of parameters, the criteria of the reference standard are always immeasurable. The aim of this paper is to recommend a quantitative approach for the selection of reference standard during method development based on the analytical hierarchy process (AHP) as a decision-making tool. Six alternative single reference standards were assessed in quantitative analysis of six phenolic acids from Salvia Miltiorrhiza and its preparations by using ultra-performance liquid chromatography. The AHP model simultaneously considered six criteria related to reference standard characteristics and method performance: feasibility to obtain, abundance in samples, chemical stability, accuracy, precision and robustness. The priority of each alternative was calculated using the standard AHP analysis method. The results showed that protocatechuic aldehyde is the ideal reference standard, and rosmarinic acid, with about 79.8% of that priority, is the second choice. The determination results successfully verified the evaluation ability of this model. The AHP allowed us to consider comprehensively the benefits and risks of the alternatives. It was an effective and practical tool for optimization of reference standards during method development. Copyright © 2015 Elsevier B.V. All rights reserved.
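    The core AHP calculation, deriving a priority vector from a pairwise-comparison matrix via its principal eigenvector and checking consistency, can be sketched as follows. The 3x3 comparison matrix and its criterion labels are invented for illustration; the paper's actual model has six criteria and six alternatives.

```python
import numpy as np

def ahp_priorities(pairwise):
    """Priority vector from an AHP pairwise-comparison matrix:
    the principal eigenvector, normalized to sum to 1."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    principal = eigvecs[:, np.argmax(eigvals.real)].real
    return principal / principal.sum()

def consistency_ratio(pairwise, random_index):
    """CR = CI / RI; judgment matrices with CR < 0.1 are usually accepted."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    lam_max = np.max(np.linalg.eigvals(A).real)
    ci = (lam_max - n) / (n - 1)          # consistency index
    return ci / random_index

# Hypothetical 3-criterion comparison (stability vs accuracy vs abundance);
# entry A[i][j] is how much criterion i is preferred over criterion j.
A = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
w = ahp_priorities(A)
print(w.round(3), round(consistency_ratio(A, random_index=0.58), 3))
```

The `random_index` value 0.58 is Saaty's tabulated RI for a 3x3 matrix; larger matrices use larger tabulated values.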

  20. EMG normalization method based on grade 3 of manual muscle testing: Within- and between-day reliability of normalization tasks and application to gait analysis.

    PubMed

    Tabard-Fougère, Anne; Rose-Dulcina, Kevin; Pittet, Vincent; Dayer, Romain; Vuillerme, Nicolas; Armand, Stéphane

    2018-02-01

    Electromyography (EMG) is an important parameter in Clinical Gait Analysis (CGA), and is generally interpreted with timing of activation. EMG amplitude comparisons between individuals, muscles or days need normalization. There is no consensus on existing methods. The gold standard, maximum voluntary isometric contraction (MVIC), is not adapted to pathological populations because patients are often unable to perform an MVIC. The normalization method inspired by the isometric grade 3 of manual muscle testing (isoMMT3), which is the ability of a muscle to maintain a position against gravity, could be an interesting alternative. The aim of this study was to evaluate the within- and between-day reliability of the isoMMT3 EMG normalization method during gait compared with the conventional MVIC method. Lower limb muscle EMG signals (gluteus medius, rectus femoris, tibialis anterior, semitendinosus) were recorded bilaterally in nine healthy participants (five males, aged 29.7±6.2 years, BMI 22.7±3.3 kg m-2), giving a total of 18 independent legs. Three repeated measurements of the isoMMT3 and MVIC exercises were performed with an EMG recording. EMG amplitude of the muscles during gait was normalized by these two methods. This protocol was repeated one week later. Within- and between-day reliability of normalization tasks were similar for isoMMT3 and MVIC methods. Within- and between-day reliability of gait EMG normalized by isoMMT3 was higher than with MVIC normalization. These results indicate that EMG normalization using isoMMT3 is a reliable method with no special equipment needed and will support CGA interpretation. The next step will be to evaluate this method in pathological populations. Copyright © 2017 Elsevier B.V. All rights reserved.
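    The normalization step itself is simple: the gait EMG envelope is expressed as a percentage of the amplitude recorded during the reference task (MVIC or isoMMT3). A minimal sketch, with invented envelope values in millivolts rather than any data from the study:

```python
import numpy as np

def normalize_emg(gait_envelope, reference_envelope):
    """Express a gait EMG envelope as a percentage of the mean
    amplitude recorded during a reference task (e.g. MVIC or isoMMT3)."""
    ref = np.mean(reference_envelope)
    return 100.0 * np.asarray(gait_envelope) / ref

gait = [0.05, 0.12, 0.30, 0.18, 0.07]       # hypothetical rectified EMG (mV)
reference = [0.55, 0.60, 0.58, 0.62, 0.57]  # hypothetical isoMMT3 trial (mV)
print(normalize_emg(gait, reference).round(1))
```

Because both methods divide by a reference amplitude, the reliability question the study addresses is how repeatable that reference amplitude is within and between days.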

  1. Speech Recognition Technology for Disabilities Education

    ERIC Educational Resources Information Center

    Tang, K. Wendy; Kamoua, Ridha; Sutan, Victor; Farooq, Omer; Eng, Gilbert; Chu, Wei Chern; Hou, Guofeng

    2005-01-01

    Speech recognition is an alternative to traditional methods of interacting with a computer, such as textual input through a keyboard. An effective system can replace or reduce the reliance on standard keyboard and mouse input. This can especially assist dyslexic students who have problems with character or word use and manipulation in a textual…

  2. Occasions and the Reliability of Classroom Observations: Alternative Conceptualizations and Methods of Analysis

    ERIC Educational Resources Information Center

    Meyer, J. Patrick; Cash, Anne H.; Mashburn, Andrew

    2011-01-01

    Student-teacher interactions are dynamic relationships that change and evolve over the course of a school year. Measuring classroom quality through observations that focus on these interactions presents challenges when observations are conducted throughout the school year. Variability in observed scores could reflect true changes in the quality of…

  3. An Assessment of Propensity Score Matching as a Nonexperimental Impact Estimator: Evidence from Mexico's PROGRESA Program

    ERIC Educational Resources Information Center

    Diaz, Juan Jose; Handa, Sudhanshu

    2006-01-01

    Not all policy questions can be addressed by social experiments. Nonexperimental evaluation methods provide an alternative to experimental designs but their results depend on untestable assumptions. This paper presents evidence on the reliability of propensity score matching (PSM), which estimates treatment effects under the assumption of…

  4. Inter-Method Reliability of School Effectiveness Measures: A Comparison of Value-Added and Regression Discontinuity Estimates

    ERIC Educational Resources Information Center

    Perry, Thomas

    2017-01-01

    Value-added (VA) measures are currently the predominant approach used to compare the effectiveness of schools. Recent educational effectiveness research, however, has developed alternative approaches including the regression discontinuity (RD) design, which also allows estimation of absolute school effects. Initial research suggests RD is a viable…

  5. Environmental damage schedules: community judgments of importance and assessments of losses

    Treesearch

    Ratana Chuenpagdee; Jack L. Knetsch; Thomas C. Brown

    2001-01-01

    Available methods of valuing environmental changes are often limited in their applicability to current issues such as damage assessment and implementing regulatory controls, or may otherwise not provide reliable readings of community preferences. An alternative is to base decisions on predetermined fixed schedules of sanctions, restrictions, damage awards, and other...

  6. Pilot scale high solids anaerobic digestion of steam autoclaved municipal solid waste (MSW) pulp

    USDA-ARS?s Scientific Manuscript database

    Steam autoclaving is an efficient method for the separation and recovery of nearly all organics from MSW, yet a reliable alternative outlet for the large volume of organics produced has not yet been successfully demonstrated. The material produced by the autoclave contains a high concentration of s...

  7. Antigen-antibody biorecognition events as discriminated by noise analysis of force spectroscopy curves.

    PubMed

    Bizzarri, Anna Rita; Cannistraro, Salvatore

    2014-08-22

    Atomic force spectroscopy is able to extract kinetic and thermodynamic parameters of biomolecular complexes provided that the registered unbinding force curves could be reliably attributed to the rupture of the specific complex interactions. To this aim, a commonly used strategy is based on the analysis of the stretching features of polymeric linkers which are suitably introduced in the biomolecule-substrate immobilization procedure. Alternatively, we present a method to select force curves corresponding to specific biorecognition events, which relies on a careful analysis of the force fluctuations of the biomolecule-functionalized cantilever tip during its approach to the partner molecules immobilized on a substrate. In the low frequency region, a characteristic 1/f^α noise with α equal to one (flickering noise) is found to replace white noise in the cantilever fluctuation power spectrum when, and only when, a specific biorecognition process between the partners occurs. The method, which has been validated on a well-characterized antigen-antibody complex, represents a fast, yet reliable alternative to the use of linkers which may involve additional surface chemistry and reproducibility concerns.
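    Discriminating 1/f^α (flicker) noise from white noise comes down to estimating the spectral exponent α from the low-frequency slope of the power spectrum. The sketch below is an illustrative estimator, not the paper's procedure; for contrast it uses synthetic white noise (α ≈ 0) and a Brownian signal (α ≈ 2), since true flicker noise (α ≈ 1) takes more effort to synthesize.

```python
import numpy as np

def spectral_exponent(signal, fs=1.0):
    """Estimate alpha in S(f) ~ 1/f^alpha by a log-log linear fit
    to the low-frequency quarter of the periodogram."""
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)[1:]            # drop f = 0
    psd = np.abs(np.fft.rfft(signal - np.mean(signal)))[1:] ** 2
    keep = freqs <= freqs[len(freqs) // 4]                 # low-frequency band
    slope, _ = np.polyfit(np.log(freqs[keep]), np.log(psd[keep]), 1)
    return -slope

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)                # alpha ~ 0
flicker = np.cumsum(rng.standard_normal(4096))   # Brownian, alpha ~ 2
print(round(spectral_exponent(white), 2), round(spectral_exponent(flicker), 2))
```

A threshold on the fitted α (near 0 for no binding, near 1 when specific recognition produces flicker noise) is the kind of decision rule the abstract's selection method implies.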

  8. A method for generating reliable atomistic models of amorphous polymers based on a random search of energy minima

    NASA Astrophysics Data System (ADS)

    Curcó, David; Casanovas, Jordi; Roca, Marc; Alemán, Carlos

    2005-07-01

    A method for generating atomistic models of dense amorphous polymers is presented. The method is organized in a two-step procedure. First, structures are generated using an algorithm that minimizes the torsional strain. After this, a relaxation algorithm is applied to minimize the non-bonding interactions. Two alternative relaxation methods, based on simple minimization and Concerted Rotation techniques, have been implemented. The performance of the method has been checked by simulating polyethylene, polypropylene, nylon 6, poly(L,D-lactic acid) and polyglycolic acid.

  9. JaCVAM-organized international validation study of the in vivo rodent alkaline comet assay for the detection of genotoxic carcinogens: I. Summary of pre-validation study results.

    PubMed

    Uno, Yoshifumi; Kojima, Hajime; Omori, Takashi; Corvi, Raffaella; Honma, Masamistu; Schechtman, Leonard M; Tice, Raymond R; Burlinson, Brian; Escobar, Patricia A; Kraynak, Andrew R; Nakagawa, Yuzuki; Nakajima, Madoka; Pant, Kamala; Asano, Norihide; Lovell, David; Morita, Takeshi; Ohno, Yasuo; Hayashi, Makoto

    2015-07-01

    The in vivo rodent alkaline comet assay (comet assay) is used internationally to investigate the in vivo genotoxic potential of test chemicals. This assay, however, has not previously been formally validated. The Japanese Center for the Validation of Alternative Methods (JaCVAM), with the cooperation of the U.S. NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM)/the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), the European Centre for the Validation of Alternative Methods (ECVAM), and the Japanese Environmental Mutagen Society/Mammalian Mutagenesis Study Group (JEMS/MMS), organized an international validation study to evaluate the reliability and relevance of the assay for identifying genotoxic carcinogens, using liver and stomach as target organs. The ultimate goal of this validation effort was to establish an Organisation for Economic Co-operation and Development (OECD) test guideline. The purpose of the pre-validation studies (i.e., Phase 1 through 3), conducted in four or five laboratories with extensive comet assay experience, was to optimize the protocol to be used during the definitive validation study. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Southern blotting.

    PubMed

    Brown, T

    2001-05-01

    Southern blotting is the transfer of DNA fragments from an electrophoresis gel to a membrane support (the properties and advantages of the different types of membrane, transfer buffer, and transfer method are discussed in detail), resulting in immobilization of the DNA fragments, so the membrane carries a semipermanent reproduction of the banding pattern of the gel. After immobilization, the DNA can be subjected to hybridization analysis, enabling bands with sequence similarity to a labeled probe to be identified. This appendix describes Southern blotting via upward capillary transfer of DNA from an agarose gel onto a nylon or nitrocellulose membrane, using a high-salt transfer buffer to promote binding of DNA to the membrane. With the high-salt buffer, the DNA becomes bound to the membrane during transfer but not permanently immobilized. Immobilization is achieved by UV irradiation (for nylon) or baking (for nitrocellulose). A Support Protocol describes how to calibrate a UV transilluminator for optimal UV irradiation of a nylon membrane. An alternate protocol details transfer using nylon membranes and an alkaline buffer, and is primarily used with positively charged nylon membranes. The advantage of this combination is that no post-transfer immobilization step is required, as the positively charged membrane binds DNA irreversibly under alkaline transfer conditions. The method can also be used with neutral nylon membranes but less DNA will be retained. A second alternate protocol describes a transfer method based on a different transfer-stack setup. The traditional method of upward capillary transfer of DNA from gel to membrane described in the first basic and alternate protocols has certain disadvantages, notably the fact that the gel can become crushed by the weighted filter papers and paper towels that are laid on top of it. This slows down the blotting process and may reduce the amount of DNA that can be transferred. 
The downward capillary method described in the second alternate protocol is therefore more rapid than the basic protocol and can result in more complete transfer. Although the ease and reliability of capillary transfer methods makes this far and away the most popular system for Southern blotting with agarose gels, it unfortunately does not work with polyacrylamide gels, whose smaller pore size impedes the transverse movement of the DNA molecules. The third alternate protocol describes an electroblotting procedure that is currently the most reliable method for transfer of DNA from a polyacrylamide gel. Dot and slot blotting are also described.

  11. Fly-by-Wireless Update

    NASA Technical Reports Server (NTRS)

    Studor, George

    2010-01-01

    The presentation reviews what is meant by the term 'fly-by-wireless', common problems and motivation, provides recent examples, and examines NASA's future and basis for collaboration. The vision is to minimize cables and connectors and increase functionality across the aerospace industry by providing reliable, lower cost, modular, and higher performance alternatives to wired data connectivity to benefit the entire vehicle/program life-cycle. Focus areas are system engineering and integration methods to reduce cables and connectors, vehicle provisions for modularity and accessibility, and a 'tool box' of alternatives to wired connectivity.

  12. Influences of donor/acceptor ratio on the optical and electrical properties of the D/A alternating model oligomers: A density functional theory study

    NASA Astrophysics Data System (ADS)

    Zheng, Hao; Zhao, Yang; Song, Ming-Xing; Wang, Jin; Chen, Li-Qiao; Sun, Lei; Bai, Fu-Quan

    2018-06-01

    We adopted an ingenious method that cuts out the DA alternating oligomers from the corresponding DA alternating copolymers. From analyzing the orbital compositions of the HOMOs and LUMOs as well as the reorganization energies, we found that the level of charge transfer increases with increasing D/A ratio, but ionization potentials and electron affinities show a contrary trend. Moreover, at the greater ratio, the trend in the nearness of two transitions results in broadening the absorption band in the visible range. That is why experimentally improving the ratio is beneficial for the copolymers used as the p-type materials in the BHJ solar cells. This method cannot treat the real copolymer system; however, it provides a strategy to avoid the limitations of the theory level and yields reliable results for studying the intrinsic properties of DA alternating copolymers, which can provide guidance to experimental works.

  13. Inter-rater reliability of a food store checklist to assess availability of healthier alternatives to the energy-dense snacks and beverages commonly consumed by children.

    PubMed

    Izumi, Betty T; Findholt, Nancy E; Pickus, Hayley A; Nguyen, Thuan; Cuneo, Monica K

    2014-06-01

    Food stores have gained attention as potential intervention targets for improving children's eating habits. There is a need for valid and reliable instruments to evaluate changes in food store snack and beverage availability secondary to intervention. The aim of this study was to develop a valid, reliable, and resource-efficient instrument to evaluate the healthfulness of food store environments faced by children. The SNACZ food store checklist was developed to assess availability of healthier alternatives to the energy-dense snacks and beverages commonly consumed by children. After pretesting, two trained observers independently assessed the availability of 48 snack and beverage items in 50 food stores located near elementary and middle schools in Portland, Oregon, over a 2-week period in summer 2012. Inter-rater reliability was calculated using the kappa statistic. Overall, the instrument had mostly high inter-rater reliability. Seventy-three percent of items assessed had almost perfect or substantial reliability. Two items had moderate reliability (0.41-0.60), and no items had a reliability score less than 0.41. Eleven items occurred too infrequently to generate a kappa score. The SNACZ food store checklist is a first step toward developing a valid and reliable tool to evaluate the healthfulness of food store environments faced by children. The tool can be used to compare availability of healthier snack and beverage alternatives across communities and measure change secondary to intervention. As a wider variety of healthier snack and beverage alternatives become available in food stores, the checklist should be updated.
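    The kappa statistic used for inter-rater reliability here corrects raw percent agreement for the agreement expected by chance. A minimal Cohen's kappa sketch for two observers; the yes/no availability codes below are invented, not checklist data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    scoring the same items with categorical codes."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: product of each rater's marginal proportions
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical availability codes for one item across 10 stores
a = ["yes", "yes", "no", "yes", "no", "yes", "no", "no", "yes", "yes"]
b = ["yes", "yes", "no", "no", "no", "yes", "no", "yes", "yes", "yes"]
print(round(cohens_kappa(a, b), 2))  # 0.58, "moderate" on common benchmarks
```

Values above 0.80 are conventionally read as "almost perfect" and 0.61-0.80 as "substantial", the bands the abstract refers to.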

  14. Assessment of four midcarpal radiologic determinations.

    PubMed

    Cho, Mickey S; Battista, Vincent; Dubin, Norman H; Pirela-Cruz, Miguel

    2006-03-01

    Several radiologic measurement methods have been described for determining static carpal alignment of the wrist. These include the scapholunate, radiolunate, and capitolunate angles. The triangulation method is an alternative radiologic measurement which we believe is easier to use and more reproducible and reliable than the above mentioned methods. The purpose of this study is to assess the intraobserver reproducibility and interobserver reliability of the triangulation method, scapholunate, radiolunate, and capitolunate angles. Twenty orthopaedic residents and staff at varying levels of training made four radiologic measurements including the scapholunate, radiolunate and capitolunate angles as well as the triangulation method on five different lateral, digitized radiographs of the wrist and forearm in neutral radioulnar deviation. Thirty days after the initial measurements, the participants repeated the four radiologic measurements using the same radiographs. The triangulation method had the best intra- and interobserver agreement of the four methods tested. This agreement was significantly better than the capitolunate and radiolunate angles. The scapholunate angle had the next best intraobserver reproducibility and interobserver reliability. The triangulation method has the best overall observer agreement when compared to the scapholunate, radiolunate, and capitolunate angles in determining static midcarpal alignment. No comment can be made on the validity of the measurements since there is no radiographic gold standard in determining static carpal alignment.

  15. Constructing the 'Best' Reliability Data for the Job - Developing Generic Reliability Data from Alternative Sources Early in a Product's Development Phase

    NASA Technical Reports Server (NTRS)

    Kleinhammer, Roger K.; Graber, Robert R.; DeMott, D. L.

    2016-01-01

    Reliability practitioners advocate getting reliability involved early in a product development process. However, when assigned to estimate or assess the (potential) reliability of a product or system early in the design and development phase, they are faced with lack of reasonable models or methods for useful reliability estimation. Developing specific data is costly and time consuming. Instead, analysts rely on available data to assess reliability. Finding data relevant to the specific use and environment for any project is difficult, if not impossible. Instead, analysts attempt to develop the "best" or composite analog data to support the assessments. Industries, consortia and vendors across many areas have spent decades collecting, analyzing and tabulating fielded item and component reliability performance in terms of observed failures and operational use. This data resource provides a huge compendium of information for potential use, but can also be compartmented by industry, and difficult to find out about, access, or manipulate. One method incorporates processes for reviewing these existing data sources and identifying the available information based on similar equipment, then using that generic data to derive an analog composite. Dissimilarities in equipment descriptions, environment of intended use, quality and even failure modes impact the "best" data incorporated in an analog composite. Once developed, this composite analog data provides a "better" representation of the reliability of the equipment or component. It can be used to support early risk or reliability trade studies, or analytical models to establish the predicted reliability data points. It also establishes a baseline prior that may be updated based on test data or observed operational constraints and failures, i.e., using Bayesian techniques.
This tutorial presents a descriptive compilation of historical data sources across numerous industries and disciplines, along with examples of contents and data characteristics. It then presents methods for combining failure information from different sources and mathematical use of this data in early reliability estimation and analyses.
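    The Bayesian update mentioned above has a standard conjugate form when reliability is expressed as a failure probability: a Beta prior built from the composite analog data is updated with observed test results. The prior parameters and test counts below are hypothetical, chosen only to show the mechanics.

```python
# Beta-Binomial update of a reliability prior built from generic/analog data.
# Prior Beta(a, b) encodes the composite analog failure probability;
# observed test results (failures out of trials) update it.

def update_reliability_prior(a, b, trials, failures):
    """Conjugate update: posterior is Beta(a + failures, b + trials - failures)."""
    return a + failures, b + (trials - failures)

def beta_mean(a, b):
    """Mean failure probability implied by a Beta(a, b) distribution."""
    return a / (a + b)

# Hypothetical analog prior: failure probability ~ Beta(2, 98), mean 0.02
a0, b0 = 2, 98
# Hypothetical test campaign: 50 trials, no failures observed
a1, b1 = update_reliability_prior(a0, b0, trials=50, failures=0)
print(round(beta_mean(a0, b0), 4), round(beta_mean(a1, b1), 4))
```

The prior's effective sample size (a + b) controls how strongly the analog data resists being moved by new test evidence, which is exactly the judgment call the tutorial's composite-data methods inform.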

  16. System Analysis by Mapping a Fault-tree into a Bayesian-network

    NASA Astrophysics Data System (ADS)

    Sheng, B.; Deng, C.; Wang, Y. H.; Tang, L. H.

    2018-05-01

    In view of the limitations of fault tree analysis in reliability assessment, the Bayesian Network (BN) has been studied as an alternative technique. After a brief introduction to the method for mapping a Fault Tree (FT) into an equivalent BN, equations used to calculate the structure importance degree, the probability importance degree and the critical importance degree are presented. Furthermore, the correctness of these equations is proved mathematically. Using an aircraft landing gear's FT as an example, an equivalent BN is developed and analysed. The results show that richer and more accurate information is obtained through the BN method than through the FT, which demonstrates that the BN is a superior technique for both reliability assessment and fault diagnosis.
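
    The quantities involved can be illustrated on a toy fault tree. The gate structure and event probabilities below are illustrative assumptions, not the landing-gear model; brute-force enumeration is used because it computes the same joint distribution an equivalent BN would encode, and the Birnbaum (probability) importance is obtained as a difference of conditional top-event probabilities.

```python
from itertools import product

# Toy fault tree: TOP = OR(AND(e1, e2), e3), evaluated by full state
# enumeration. Basic-event probabilities are illustrative assumptions.
p = {"e1": 0.01, "e2": 0.02, "e3": 0.005}

def top(state):
    # Gate logic of the toy tree.
    return (state["e1"] and state["e2"]) or state["e3"]

def top_probability(probs):
    """Sum the probability of every basic-event state in which TOP occurs."""
    total = 0.0
    for bits in product([0, 1], repeat=len(probs)):
        state = dict(zip(probs, bits))
        weight = 1.0
        for name, bit in state.items():
            weight *= probs[name] if bit else 1.0 - probs[name]
        if top(state):
            total += weight
    return total

def birnbaum_importance(probs, event):
    """P(TOP | event occurs) - P(TOP | event does not occur)."""
    hi = dict(probs, **{event: 1.0})
    lo = dict(probs, **{event: 0.0})
    return top_probability(hi) - top_probability(lo)

print(top_probability(p))
print(birnbaum_importance(p, "e3"))
```

For this tree, P(TOP) = 0.01·0.02 + 0.005 − 0.01·0.02·0.005, and e3 dominates the importance ranking because it feeds the OR gate directly.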

  17. Performance Analysis of Stirling Engine-Driven Vapor Compression Heat Pump System

    NASA Astrophysics Data System (ADS)

    Kagawa, Noboru

    Stirling engine-driven vapor compression systems have many unique advantages, including higher thermal efficiencies, preferable exhaust gas characteristics, multi-fuel usage, and low noise and vibration, which can play an important role in alleviating environmental and energy problems. This paper introduces a design method for such systems based on reliable mathematical models of the Stirling and Rankine cycles and accurate thermophysical property data for refrigerants. The model deals with a combination of a kinematic Stirling engine and a scroll compressor; some experimental coefficients are used to formulate the model. The obtained results show the performance behavior in detail, and the measured performance of the actual system coincides with the calculated results. Furthermore, the calculated results clarify the performance of alternative refrigerants for R-22.

  18. NASA Applications and Lessons Learned in Reliability Engineering

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Fuller, Raymond P.

    2011-01-01

    Since the Shuttle Challenger accident in 1986, communities across NASA have been developing and extensively using quantitative reliability and risk assessment methods in their decision making processes. This paper discusses several reliability engineering applications that NASA has used over the years to support the design, development, and operation of critical space flight hardware. Specifically, the paper discusses applications in areas such as risk management, inspection policies, component upgrades, reliability growth, integrated failure analysis, and physics-based probabilistic engineering analysis. In each of these areas, the paper provides a brief case study to demonstrate the value added and the criticality of reliability engineering in supporting NASA project and program decisions to fly safely. Examples of the case studies discussed are reliability-based life limit extension of Space Shuttle Main Engine (SSME) hardware, reliability-based inspection policies for the Auxiliary Power Unit (APU) turbine disc, probabilistic structural engineering analysis for reliability prediction of the SSME alternate turbo-pump development, the impact of External Tank (ET) foam reliability on Space Shuttle system risk, and reliability-based Space Shuttle upgrades for safety. Special attention is given in this paper to the physics-based probabilistic engineering analysis applications and their critical role in evaluating the reliability of NASA development hardware, including their potential use in a research and technology development environment.

  19. A novel iterative scheme and its application to differential equations.

    PubMed

    Khan, Yasir; Naeem, F; Šmarda, Zdeněk

    2014-01-01

    The purpose of this paper is to employ an alternative approach to reconstruct the standard variational iteration algorithm II proposed by He, including the Lagrange multiplier, and to give a simpler formulation of the Adomian decomposition and modified Adomian decomposition methods in terms of the newly proposed variational iteration method-II (VIM-II). Through careful investigation of the earlier variational iteration algorithm and the Adomian decomposition method, we find unnecessary calculations of the Lagrange multiplier in the former and repeated calculations in each iteration of the latter. Several examples are given to verify the reliability and efficiency of the method.
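
    The structure of a variational iteration can be shown on the textbook problem y' = y, y(0) = 1, for which the classical scheme uses Lagrange multiplier λ = −1. This is a standard example, not one from the paper; representing iterates as ascending-coefficient polynomial lists is an implementation choice.

```python
# Sketch: classical variational iteration (lambda = -1) for y' = y, y(0) = 1.
# Polynomials are lists of coefficients in ascending powers of t.

def deriv(p):
    return [i * c for i, c in enumerate(p)][1:]

def integrate(p):
    # Definite integral from 0 to t, returned as a polynomial in t.
    return [0.0] + [c / (i + 1) for i, c in enumerate(p)]

def sub(a, b):
    n = max(len(a), len(b))
    a = a + [0.0] * (n - len(a))
    b = b + [0.0] * (n - len(b))
    return [x - y for x, y in zip(a, b)]

def vim_step(y):
    # y_{n+1}(t) = y_n(t) - \int_0^t (y_n'(s) - y_n(s)) ds
    return sub(y, integrate(sub(deriv(y), y)))

y = [1.0]                 # y_0 = 1
for _ in range(3):
    y = vim_step(y)
print(y)                  # Taylor coefficients of exp(t): 1, 1, 1/2, 1/6
```

Each correction adds one more Taylor term of the exact solution exp(t), which is the behavior the iteration algorithms discussed in the record are designed to reproduce.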

  20. High performance liquid chromatography method for the determination of cinnamyl alcohol dehydrogenase activity in soybean roots.

    PubMed

    dos Santos, W D; Ferrarese, Maria de Lourdes Lucio; Ferrarese-Filho, O

    2006-01-01

    This study proposes a simple, quick and reliable method for determining the cinnamyl alcohol dehydrogenase (CAD; EC 1.1.1.195) activity in soybean (Glycine max L. Merr.) roots using reversed-phase high performance liquid chromatography (RP-HPLC). The method includes a single extraction of the tissue and conduction of the enzymatic reaction at 30 degrees C with cinnamaldehydes (coniferyl or sinapyl), substrates of CAD. Disappearance of the substrates in the reaction mixture is monitored at 340 nm (for coniferaldehyde) or 345 nm (for sinapaldehyde) by isocratic elution with methanol/acetic acid through a GLC-ODS (M) column. This HPLC technique furnishes a rapid and reliable measure of cinnamaldehyde substrates, and may be used as an alternative tool to analyze CAD activity in enzyme preparation without previous purification.

  1. Reliability-based optimization of maintenance scheduling of mechanical components under fatigue

    PubMed Central

    Beaurepaire, P.; Valdebenito, M.A.; Schuëller, G.I.; Jensen, H.A.

    2012-01-01

    This study presents the optimization of the maintenance scheduling of mechanical components under fatigue loading. The cracks of damaged structures may be detected during non-destructive inspection and subsequently repaired. Fatigue crack initiation and growth show inherent variability, as does the outcome of inspection activities. The problem is addressed within the framework of reliability-based optimization. The initiation and propagation of fatigue cracks are efficiently modeled using cohesive zone elements. The applicability of the method is demonstrated by a numerical example involving a plate with two holes subject to alternating stress. PMID:23564979

  2. Assessing the persistence, bioaccumulation potential and toxicity of brominated flame retardants: data availability and quality for 36 alternative brominated flame retardants.

    PubMed

    Stieger, Greta; Scheringer, Martin; Ng, Carla A; Hungerbühler, Konrad

    2014-12-01

    Polybrominated diphenylethers (PBDEs) and hexabromocyclododecane (HBCDD) are major brominated flame retardants (BFRs) that are now banned or under restrictions in many countries because of their persistence, bioaccumulation potential and toxicity (PBT properties). However, there is a wide range of alternative BFRs, such as decabromodiphenyl ethane and tribromophenol, that are increasingly used as replacements, but which may possess similar hazardous properties. This necessitates hazard and risk assessments of these compounds. For a set of 36 alternative BFRs, we searched 25 databases for chemical property data that are needed as input for a PBT assessment. These properties are degradation half-life, bioconcentration factor (BCF), octanol-water partition coefficient (Kow), and toxic effect concentrations in aquatic organisms. For 17 of the 36 substances, no data at all were found for these properties. Too few persistence data were available to even assess the quality of these data in a systematic way. The available data for Kow and toxicity show surprisingly high variability, which makes it difficult to identify the most reliable values. We propose methods for systematic evaluations of PBT-related chemical property data that should be performed before data are included in publicly available databases. Using these methods, we evaluated the data for Kow and toxicity in more detail and identified several inaccurate values. For most of the 36 alternative BFRs, the amount and the quality of the PBT-related property data need to be improved before reliable hazard and risk assessments of these substances can be performed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Estimating canopy cover from standard forest inventory measurements in western Oregon

    Treesearch

    Anne McIntosh; Andrew Gray; Steven. Garman

    2012-01-01

    Reliable measures of canopy cover are important in the management of public and private forests. However, direct sampling of canopy cover is both labor- and time-intensive. More efficient methods for estimating percent canopy cover could be empirically derived relationships between more readily measured stand attributes and canopy cover or, alternatively, the use of...

  4. RSE-40: An Alternate Scoring System for the Rosenberg Self-Esteem Scale (RSE).

    ERIC Educational Resources Information Center

    Wallace, Gaylen R.

    The Rosenberg Self-Esteem Inventory (RSE) is a 10-item scale purporting to measure self-esteem using self-acceptance and self-worth statements. This analysis covers concerns about the degree to which the RSE items represent a particular content universe, the RSE's applicability, factor analytic methods used, and the RSE's reliability and validity.…

  5. 40 CFR 146.10 - Plugging and abandoning Class I, II, III, IV, and V wells.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... cement in a manner which will not allow the movement of fluids either into or between underground sources... sources of drinking water. (2) Placement of the cement plugs shall be accomplished by one of the following... alternative method approved by the Director, which will reliably provide a comparable level of protection to...

  6. Making Meaningful Measurement in Survey Research: A Demonstration of the Utility of the Rasch Model. IR Applications. Volume 28

    ERIC Educational Resources Information Center

    Royal, Kenneth D.

    2010-01-01

    Quality measurement is essential in every form of research, including institutional research and assessment. This paper addresses the erroneous assumptions institutional researchers often make with regard to survey research and provides an alternative method to producing more valid and reliable measures. Rasch measurement models are discussed and…

  7. Length and Rate of Individual Participation in Various Activities on Recreation Sites and Areas

    Treesearch

    Gary L. Tyre; George A. James

    1971-01-01

    While statistically reliable methods exist for estimating recreation use on large areas, they often prove prohibitively expensive. Inexpensive alternatives involving the length and rate of individual participation in specific activities are presented, together with data and statistics on the recreational use of three large areas on the National Forests. This...

  8. Mission Reliability Estimation for Repairable Robot Teams

    NASA Technical Reports Server (NTRS)

    Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen

    2010-01-01

    A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. 
This suggests that the current design paradigm of building a minimal number of highly robust robots may not be the best way to design robots for extended missions.
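
    The redundancy-versus-repairability comparison can be sketched with a simple static reliability model. The series/parallel structure, the hot-standby treatment of spares, and all numbers below are illustrative assumptions, not the paper's mission model.

```python
# Sketch: redundancy (a spare robot) vs repairability (spare components).
# Each robot is n components in series with per-component reliability r;
# spares are treated as hot standbys. Numbers are illustrative only.

def series(r, n):
    # All n components must work.
    return r ** n

def parallel(rel, copies):
    # At least one of `copies` identical units must work.
    return 1.0 - (1.0 - rel) ** copies

r, n = 0.99, 10
single = series(r, n)               # one robot, no spares
spare_robot = parallel(single, 2)   # two full robots, mission needs one
spare_parts = parallel(r, 2) ** n   # one spare for each component
print(single, spare_robot, spare_parts)
```

Under these assumptions component-level sparing beats a whole spare robot, which is consistent with the record's conclusion that cheaper components plus spares can reduce mission cost while maintaining reliability.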

  9. A comparison of the reliability of the trochanteric prominence angle test and the alternative method in healthy subjects.

    PubMed

    Yoon, Tae-Lim; Park, Kyung-Mi; Choi, Sil-Ah; Lee, Ji-Hyun; Jeong, Hyo-Jung; Cynn, Heon-Seock

    2014-04-01

    A wide range of intra- and inter-rater reliabilities has been reported for the trochanteric prominence angle test (TPAT). We introduced the transcondylar angle test (TCAT) as an alternative to the TPAT, and a smartphone as a measurement tool, for femoral neck anteversion (FNA) measurement. We assessed the reliabilities of the TPAT and the TCAT, the reliability of a smartphone as a clinical measurement tool, and the correlation between the change in medial knee joint space (KJS) from the rest to the tested position and the difference between the TPAT and TCAT values. Two physical therapists independently determined the reliabilities of the TPAT with a digital inclinometer, the TCAT with a digital inclinometer, and the TCAT with a smartphone in 19 hips of 10 healthy subjects (5 male, 5 female; 22.2 ± 1.69 years). The medial KJS in the rest and tested positions was assessed using sonography. The intra-class correlation coefficients (ICC) for intra-rater reliability of the TPAT with a digital inclinometer (ICC = 0.92), and the TCAT with a digital inclinometer (ICC = 0.94) and a smartphone (ICC = 0.95), were substantial for both testers. The inter-rater reliability of the TPAT with a digital inclinometer was fair (ICC = 0.48), while that of the TCAT with a digital inclinometer (ICC = 0.89) and a smartphone (ICC = 0.85) was substantial. The correlation between the change in medial KJS and the difference between TPAT and TCAT values was low and statistically non-significant (r = 0.114; p = 0.325). The TCAT is thus more reliable than the TPAT in inter-rater testing, and a smartphone is clinically comparable to a digital inclinometer as a measuring tool. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. A Z-number-based decision making procedure with ranking fuzzy numbers method

    NASA Astrophysics Data System (ADS)

    Mohamad, Daud; Shaharani, Saidatull Akma; Kamis, Nor Hanimah

    2014-12-01

    The theory of fuzzy sets has been in the limelight of various applications in decision making problems due to its usefulness in portraying human perception and subjectivity. Generally, the evaluation in the decision making process is represented in the form of linguistic terms and the calculation is performed using fuzzy numbers. In 2011, Zadeh extended this concept by presenting the idea of the Z-number, a 2-tuple of fuzzy numbers that describes the restriction and the reliability of an evaluation. The element of reliability in the evaluation is essential, as it affects the final result. Since this concept is still relatively new, available methods that incorporate reliability for solving decision making problems are still scarce. In this paper, a decision making procedure based on Z-numbers is proposed. Due to the limitation of their basic properties, Z-numbers are first transformed into fuzzy numbers for simpler calculation. A method of ranking fuzzy numbers is then used to prioritize the alternatives. A risk analysis problem is presented to illustrate the effectiveness of the proposed procedure.
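
    One published route for the transformation step is the Kang et al. (2012) conversion, which weights the restriction A by the square root of the centroid of the reliability B. Whether this record uses exactly that conversion is not stated, so treat the sketch as an assumption; triangular membership functions and centroid ranking are likewise illustrative choices.

```python
import math

# Sketch: converting a Z-number Z = (A, B) to a classical fuzzy number
# (Kang et al. 2012 style), then ranking alternatives by centroid.
# Triangular fuzzy numbers are given as (a, b, c); all inputs illustrative.

def centroid(tfn):
    a, b, c = tfn
    return (a + b + c) / 3.0

def z_to_fuzzy(A, B):
    alpha = centroid(B)              # crisp reliability weight from B
    w = math.sqrt(alpha)             # spread the weight into A's support
    return tuple(w * x for x in A)

# Two alternatives: same restriction A, different reliability B.
z1 = ((0.4, 0.5, 0.6), (0.7, 0.8, 0.9))   # "around 0.5, quite sure"
z2 = ((0.4, 0.5, 0.6), (0.3, 0.4, 0.5))   # "around 0.5, less sure"

scores = {name: centroid(z_to_fuzzy(A, B))
          for name, (A, B) in {"alt1": z1, "alt2": z2}.items()}
print(scores)
```

With identical restrictions, the more reliable evaluation ranks higher, which is the effect the Z-number formulation is meant to capture.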

  11. Aqueous sodium chloride induced intergranular corrosion of Al-Li-Cu alloys

    NASA Technical Reports Server (NTRS)

    Pizzo, P. P.; Daeschner, D. L.

    1986-01-01

    Two methods have been explored to assess the susceptibility of Al-Li-Cu alloys to intergranular corrosion in aqueous sodium chloride solution. They are: (1) constant extension rate testing with and without alternate-immersion preexposure and (2) metallographic examination after exposure to a NaCl-H2O2 corrosive solution per Mil-H-6088F. Intergranular corrosion was found to occur in both powder and ingot metallurgy alloys of similar composition, using both methods. Underaging rendered the alloys most susceptible. The results correlate to stress-corrosion data generated in conventional time-to-failure and crack growth-rate tests. Alternate-immersion preexposure may be a reliable means to assess stress corrosion susceptibility of Al-Li-Cu alloys.

  12. Alternative Fuels Data Center: Minnesota School District Finds Cost

    Science.gov Websites

    Minnesota School District Finds Cost Savings, Cold-Weather Reliability with Propane Buses.

  13. Strategic planning decision making using fuzzy SWOT-TOPSIS with reliability factor

    NASA Astrophysics Data System (ADS)

    Mohamad, Daud; Afandi, Nur Syamimi; Kamis, Nor Hanimah

    2015-10-01

    Strategic planning is a process of decision making and action for long-term activities in an organization. The Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis has been commonly used to help organizations strategize their future direction by analyzing internal and external environments. However, SWOT analysis has limitations, as it cannot appropriately prioritize multiple alternative strategic decisions. Some efforts have been made to solve this problem by incorporating Multi Criteria Decision Making (MCDM) methods. Nevertheless, another important aspect of obtaining a decision has raised concern: the reliability of the information, since decision makers evaluate differently depending on their level of confidence in the evaluation. This study proposes a decision making procedure for strategic planning using the SWOT-TOPSIS method, incorporating the reliability factor of the evaluation based on Z-numbers. An example using a local authority on the east coast of Malaysia illustrates how the strategic options are ranked and how factors in each SWOT category are prioritized.
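
    The TOPSIS ranking step at the core of such a procedure can be sketched as follows, here without the fuzzy and Z-number layers the record adds on top. The decision matrix (rows = candidate strategies, columns = benefit criteria) and the weights are illustrative assumptions.

```python
import math

# Sketch: crisp TOPSIS ranking of alternatives against benefit criteria.

def topsis(matrix, weights):
    ncols = len(weights)
    # Vector-normalize each column, then apply the criterion weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    v = [[weights[j] * row[j] / norms[j] for j in range(ncols)] for row in matrix]
    ideal = [max(col) for col in zip(*v)]    # best value per benefit criterion
    anti = [min(col) for col in zip(*v)]     # worst value per criterion
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)
        d_neg = math.dist(row, anti)
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient
    return scores

strategies = [[7, 8, 6], [9, 5, 7], [6, 9, 8]]   # illustrative scores
weights = [0.5, 0.3, 0.2]
print(topsis(strategies, weights))
```

Alternatives are ranked by closeness coefficient; the fuzzy SWOT-TOPSIS variant replaces the crisp matrix entries with (reliability-weighted) fuzzy evaluations before this step.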

  14. Getting It Right Matters: Climate Spectra and Their Estimation

    NASA Astrophysics Data System (ADS)

    Privalsky, Victor; Yushkov, Vladislav

    2018-06-01

    In many recent publications, climate spectra estimated with different methods from observed, GCM-simulated, and reconstructed time series contain many peaks at time scales from a few years to many decades and even centuries. However, respective spectral estimates obtained with the autoregressive (AR) and multitapering (MTM) methods showed that spectra of climate time series are smooth and contain no evidence of periodic or quasi-periodic behavior. Four order selection criteria for the autoregressive models were studied and proven sufficiently reliable for 25 time series of climate observations at individual locations or spatially averaged at local-to-global scales. As time series of climate observations are short, an alternative reliable nonparametric approach is Thomson's MTM. These results agree with both the earlier climate spectral analyses and the Markovian stochastic model of climate.

  15. Robust Coefficients Alpha and Omega and Confidence Intervals with Outlying Observations and Missing Data Methods and Software

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; Yuan, Ke-Hai

    2016-01-01

    Cronbach's coefficient alpha is a widely used reliability measure in social, behavioral, and education sciences. It is reported in nearly every study that involves measuring a construct through multiple items. With non-tau-equivalent items, McDonald's omega has been used as a popular alternative to alpha in the literature. Traditional estimation…

  17. Towards early software reliability prediction for computer forensic tools (case study).

    PubMed

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component-based system. It is used, for instance, to analyze the reliability of the state machines of real-time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete-time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
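
    The link from components to a discrete-time Markov chain can be illustrated with a toy chain whose absorbing states are overall success and failure. The states and transition probabilities below are invented for illustration, not taken from the case study.

```python
# Sketch: reliability of a component-based tool as a discrete-time Markov
# chain with absorbing "success" and "failure" states. Transition
# probabilities are illustrative assumptions.

P = {
    "parse":   {"analyze": 0.98, "failure": 0.02},
    "analyze": {"report": 0.95, "parse": 0.02, "failure": 0.03},
    "report":  {"success": 0.99, "failure": 0.01},
    "success": {"success": 1.0},
    "failure": {"failure": 1.0},
}

def absorption_probability(P, start, target, steps=10_000):
    """Push probability mass through the chain until it is absorbed."""
    dist = {s: 0.0 for s in P}
    dist[start] = 1.0
    for _ in range(steps):
        nxt = {s: 0.0 for s in P}
        for s, mass in dist.items():
            for t, pr in P[s].items():
                nxt[t] += mass * pr
        dist = nxt
    return dist[target]

reliability = absorption_probability(P, "parse", "success")
print(reliability)
```

The probability of ending in "success" is the tool-level reliability; changing a component's transition probabilities shows how component reliability propagates to the whole tool, which is the comparison the proposed method enables.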

  18. Alternate Forms Reliability of the Behavioral Relaxation Scale: Preliminary Results

    ERIC Educational Resources Information Center

    Lundervold, Duane A.; Dunlap, Angel L.

    2006-01-01

    Alternate forms reliability of the Behavioral Relaxation Scale (BRS; Poppen,1998), a direct observation measure of relaxed behavior, was examined. A single BRS score, based on long duration observation (5-minute), has been found to be a valid measure of relaxation and is correlated with self-report and some physiological measures. Recently,…

  19. Following Phaedrus: Alternate Choices in Surmounting the Reliability/Validity Dilemma

    ERIC Educational Resources Information Center

    Slomp, David H.; Fuite, Jim

    2004-01-01

    Specialists in the field of large-scale, high-stakes writing assessment have, over the last forty years, alternately discussed the issue of maximizing either reliability or validity in test design. Factors complicating the debate, such as Messick's (1989) expanded definition of validity and the ethical implications of testing, are explored. An…

  20. A Correction Equation for Jump Height Measured Using the Just Jump System.

    PubMed

    McMahon, John J; Jones, Paul A; Comfort, Paul

    2016-05-01

    To determine the concurrent validity and reliability of the popular Just Jump system (JJS) for determining jump height and, if necessary, provide a correction equation for future reference. Eighteen male college athletes performed 3 bilateral countermovement jumps (CMJs) on 2 JJSs (alternative method) that were placed on top of a force platform (criterion method). Two JJSs were used to establish consistency between systems. Jump height was calculated from flight time obtained from the JJS and force platform. Intraclass correlation coefficients (ICCs) demonstrated excellent within-session reliability of the CMJ height measurement derived from both the JJS (ICC = .96, P < .001) and the force platform (ICC = .96, P < .001). Dependent t tests revealed that the JJS yielded a significantly greater CMJ jump height (0.46 ± 0.09 m vs 0.33 ± 0.08 m) than the force platform (P < .001, Cohen d = 1.39, power = 1.00). There was, however, an excellent relationship between CMJ heights derived from the JJS and force platform (r = .998, P < .001, power = 1.00), with a coefficient of determination (R2) of .995. Therefore, the following correction equation was produced: Criterion jump height = (0.8747 × alternative jump height) - 0.0666. The JJS provides a reliable but overestimated measure of jump height. It is suggested, therefore, that practitioners who use the JJS as part of future work apply the correction equation presented in this study to resultant jump-height values.
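
    The abstract's correction equation, combined with the standard flight-time formula h = g·t²/8, can be applied as follows; the example flight time is illustrative.

```python
# Sketch: correcting a flight-time-derived jump height with the study's
# equation. The flight-time formula h = g*t^2/8 is standard; the
# correction coefficients are taken from the abstract.

G = 9.81  # m/s^2

def jump_height_from_flight_time(t):
    return G * t ** 2 / 8.0

def corrected_jump_height(alternative_height):
    # Criterion jump height = (0.8747 x alternative jump height) - 0.0666
    return 0.8747 * alternative_height - 0.0666

t_flight = 0.61                                # seconds, illustrative
h_jjs = jump_height_from_flight_time(t_flight) # what the JJS would report
h_corrected = corrected_jump_height(h_jjs)     # force-platform equivalent
print(round(h_jjs, 3), round(h_corrected, 3))
```

With this flight time the uncorrected height (~0.46 m) and corrected height (~0.33 m) mirror the group means reported in the abstract, showing the size of the JJS overestimate.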

  1. Equivalency testing of TTC Tergitol 7 agar (ISO 9308-1:2000) with five culture media for the detection of E. coli in water samples in Greece.

    PubMed

    Mavridou, A; Smeti, E; Mandilara, G; Boufa, P; Vagiona-Arvanitidou, M; Vantarakis, A; Vassilandonopoulou, G; Pappa, O; Roussia, V; Tzouanopoulos, A; Livadara, M; Aisopou, I; Maraka, V; Nikolaou, E

    2010-01-01

    In this study, ten laboratories in Greece compared the performance of the reference method, TTC Tergitol 7 agar (with the additional test of beta-glucuronidase production), with five alternative methods to detect E. coli in water, in line with European Water Directive recommendations. The samples were prepared by spiking drinking water with sewage effluent following a standard protocol; chlorinated and non-chlorinated samples were used. The statistical analysis was based on the mean relative difference of confirmed counts and was performed in line with ISO 17994. The results showed that three of the alternative methods (Chromocult Coliform agar, Membrane Lauryl Sulfate agar and Tryptone Bile X-glucuronidase (TBX) medium) were not different from TTC Tergitol 7 agar (TTC Tergitol 7 agar vs Chromocult Coliform agar, 294 samples, mean RD% 5.55; vs MLSA, 302 samples, mean RD% 1; vs TBX, 297 samples, mean RD% -2.78). The other two alternative methods (Membrane Faecal coliform medium and Colilert-18/Quanti-Tray) gave significantly higher counts than TTC Tergitol 7 agar (vs MFc, 303 samples, mean RD% 8.81; vs Colilert-18/Quanti-Tray, 76 samples, mean RD% 18.91). In other words, the alternative methods performed as reliably as, or better than, the reference method. This study will help laboratories in Greece overcome culture and counting problems deriving from the EU reference method for E. coli counts in water samples.

  2. Meta-analysis of Odds Ratios: Current Good Practices

    PubMed Central

    Chang, Bei-Hung; Hoaglin, David C.

    2016-01-01

    Background: Many systematic reviews of randomized clinical trials lead to meta-analyses of odds ratios. The customary methods of estimating an overall odds ratio involve weighted averages of the individual trials’ estimates of the logarithm of the odds ratio. That approach, however, has several shortcomings, arising from assumptions and approximations, that render the results unreliable. Although the problems have been documented in the literature for many years, the conventional methods persist in software and applications. A well-developed alternative approach avoids the approximations by working directly with the numbers of subjects and events in the arms of the individual trials. Objective: We aim to raise awareness of methods that avoid the conventional approximations, can be applied with widely available software, and produce more-reliable results. Methods: We summarize the fixed-effect and random-effects approaches to meta-analysis; describe conventional, approximate methods and alternative methods; apply the methods in a meta-analysis of 19 randomized trials of endoscopic sclerotherapy in patients with cirrhosis and esophagogastric varices; and compare the results. We demonstrate the use of SAS, Stata, and R software for the analysis. Results: In the example, point estimates and confidence intervals for the overall log-odds-ratio differ between the conventional and alternative methods, in ways that can affect inferences. Programming is straightforward in the three software packages; an appendix gives the details. Conclusions: The modest additional programming required should not be an obstacle to adoption of the alternative methods. Because their results are unreliable, use of the conventional methods for meta-analysis of odds ratios should be discontinued. PMID:28169977
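
    The conventional inverse-variance, fixed-effect pooling of log odds ratios that the paper critiques can be sketched as follows. The trial counts are invented, and the 0.5 continuity correction shown is one common convention; the paper's point is precisely that these approximations can mislead.

```python
import math

# Sketch: conventional fixed-effect meta-analysis of odds ratios by
# inverse-variance weighting of per-trial log odds ratios.

trials = [  # (events_treatment, n_treatment, events_control, n_control)
    (12, 100, 20, 100),
    (5, 50, 9, 50),
    (0, 40, 4, 40),   # zero cell triggers the continuity correction
]

def log_or_and_var(a, n1, c, n2):
    b, d = n1 - a, n2 - c
    if 0 in (a, b, c, d):                    # 0.5 continuity correction
        a, b, c, d = a + 0.5, b + 0.5, c + 0.5, d + 0.5
    log_or = math.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d      # Woolf variance approximation
    return log_or, var

weights, weighted = [], []
for a, n1, c, n2 in trials:
    lo, v = log_or_and_var(a, n1, c, n2)
    weights.append(1 / v)
    weighted.append(lo / v)
pooled = sum(weighted) / sum(weights)
se = math.sqrt(1 / sum(weights))
ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
print(math.exp(pooled), ci)
```

The alternative methods the paper advocates avoid the log-OR normal approximation and the continuity correction by modeling the cell counts directly.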

  3. Electropyroelectric technique: A methodology free of fitting procedures for thermal effusivity determination in liquids.

    PubMed

    Ivanov, R; Marin, E; Villa, J; Gonzalez, E; Rodríguez, C I; Olvera, J E

    2015-06-01

    This paper describes an alternative methodology to determine the thermal effusivity of a liquid sample using the recently proposed electropyroelectric technique, without fitting the experimental data with a theoretical model and without having to know the pyroelectric sensor related parameters, as in most previous reported approaches. The method is not absolute, because a reference liquid with known thermal properties is needed. Experiments have been performed that demonstrate the high reliability and accuracy of the method with measurement uncertainties smaller than 3%.

  4. Reliability and equivalence of alternate forms for the Symbol Digit Modalities Test: implications for multiple sclerosis clinical trials.

    PubMed

    Benedict, Ralph H B; Smerbeck, Audrey; Parikh, Rajavi; Rodgers, Jonathan; Cadavid, Diego; Erlanger, David

    2012-09-01

    Cognitive impairment is common in multiple sclerosis (MS), but is seldom assessed in clinical trials investigating the effects of disease-modifying therapies. The Symbol Digit Modalities Test (SDMT) is a particularly promising tool due to its sensitivity and robust correlation with brain magnetic resonance imaging (MRI) and vocational disability. Unfortunately, there are no validated alternate SDMT forms, which are needed to mitigate practice effects. The aim of the study was to assess the reliability and equivalence of SDMT alternate forms. Twenty-five healthy participants completed each of five alternate versions of the SDMT - the standard form, two versions from the Rao Brief Repeatable Battery, and two forms specifically designed for this study. Order effects were controlled using a Latin-square research design. All five versions of the SDMT produced mean values within 3 raw score points of one another. Three forms were very consistent, and not different by conservative statistical tests. The SDMT test-retest reliability using these forms was good to excellent, with all r values exceeding 0.80. For the first time, we find good evidence that at least three alternate versions of the SDMT are of equivalent difficulty in healthy adults. The forms are reliable, and can be implemented in clinical trials emphasizing cognitive outcomes.

  5. We need more replication research - A case for test-retest reliability.

    PubMed

    Leppink, Jimmie; Pérez-Fuster, Patricia

    2017-06-01

    Following debates in psychology on the importance of replication research, we have also started to see pleas for a more prominent role for replication research in medical education. To enable replication research, it is of paramount importance to carefully study the reliability of the instruments we use. Cronbach's alpha has been the most widely used estimator of reliability in the field of medical education, notably as a quality label of test or questionnaire scores based on multiple items or of the reliability of assessment across exam stations. However, as this narrative review outlines, Cronbach's alpha and alternative reliability statistics may complement but not replace psychometric methods such as factor analysis. Moreover, multiple-item measurements should be preferred over single-item measurements, and when single-item measurements are used, coefficients such as Cronbach's alpha should not be interpreted as indicators of the reliability of a single item when that item is administered after fundamentally different activities, such as learning tasks that differ in content. Finally, if we want to follow up on recent pleas for more replication research, we have to start studying the test-retest reliability of the instruments we use.
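    The coefficient discussed above has a closed form: for k items, α = k/(k−1) · (1 − Σ item variances / variance of the total score). A minimal sketch in Python; the function name and the toy score matrix are illustrative, not from the review:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix (a sketch)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of each respondent's total
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# perfectly parallel items yield alpha = 1.0
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```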

  6. SkinEthic Laboratories, a company devoted to develop and produce in vitro alternative methods to animal use.

    PubMed

    de Brugerolle, Anne

    2007-01-01

    SkinEthic Laboratories is a France-based biotechnology company recognised as the world leader in tissue engineering. SkinEthic is devoted to developing and producing reliable and robust in vitro alternatives to animal use in the cosmetic, chemical and pharmaceutical industries. SkinEthic models provide relevant tools for efficacy and safety screening tests to support integrated decision-making during research and development phases. Some screening tests are referenced and validated as alternatives to animal use (Episkin); others are in the process of validation under ECVAM and OECD guidelines. SkinEthic laboratories provide a unique and joint experience of more than 20 years from Episkin SNC and SkinEthic SA. Their unique cell culture process allows in vitro reconstructed human tissues with well-characterized histology, functionality and ultrastructure features to be mass-produced. Our product line includes skin models: a reconstructed human epidermis with a collagen layer (Episkin), reconstructed human epidermis without or with melanocytes (with a tanning degree from phototype II to VI), and reconstructed human epithelia, i.e. cornea and other mucosae (oral, gingival, oesophageal and vaginal). Our philosophy is based on three main commitments: to support our customers by providing robust and reliable models; to ensure training and education in using validated protocols, allowing a large array of raw materials, active ingredients and finished products in solid, liquid, powder, cream or gel form to be screened; and to provide a dedicated service to our partners.

  7. Complementary and Alternative Medicine: Italian Validation of a Questionnaire on Nurses' Personal and Professional Use, Knowledge, and Attitudes.

    PubMed

    Belletti, Giada; Shorofi, Seyed Afshin; Arbon, Paul; Dal Molin, Alberto

    2017-08-01

    Patients are showing an increasing interest in the use of complementary and alternative medicine (CAM). Most nurses are open to the adoption of CAM into clinical nursing practice, but they may experience a lack of knowledge about the safe and effective use of these therapies. Several studies concerning nurses' knowledge and attitudes toward CAM have been published, but in only one did the authors (Shorofi and Arbon) use a validated questionnaire. In Italy, there are no validated questionnaires to investigate this aspect of nursing practice. The aim of this study was to test the psychometric properties of the Italian version of the Shorofi and Arbon questionnaire for use with Italian nurses. A forward-backward translation method was used to translate the questionnaire from English to Italian. Content validity, face validity and reliability were established. This study examined the potential usefulness of the Shorofi and Arbon questionnaire for the evaluation of the CAM knowledge of Italian-speaking nurses, and the questionnaire showed good content validity and good reliability.

  8. Modern methodology of designing target reliability into rotating mechanical components

    NASA Technical Reports Server (NTRS)

    Kececioglu, D. B.; Chester, L. B.

    1973-01-01

    Experimentally determined distributional cycles-to-failure versus maximum alternating nominal strength (S-N) diagrams, and distributional mean nominal strength versus maximum alternating nominal strength (Goodman) diagrams are presented. These distributional S-N and Goodman diagrams are for AISI 4340 steel specimens of Rockwell C 35/40 hardness: round, cylindrical specimens 0.735 in. in diameter and 6 in. long, with a circumferential groove of 0.145 in. radius for a theoretical stress concentration factor of 1.42 and of 0.034 in. radius for a factor of 2.34. The specimens are subjected to reversed bending and steady torque in three specially built complex-fatigue research machines. Based on these results, the effects of superimposing steady torque on reversed bending on the distributional S-N and Goodman diagrams and on service life are established, as well as the effect of various stress concentrations. In addition, a computer program for determining the three-parameter Weibull distribution representing the cycles-to-failure data and two methods for calculating the reliability of components subjected to cumulative fatigue loads are given.
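    The three-parameter Weibull step described above (shape, minimum life, scale) can be sketched with SciPy's maximum-likelihood fitter. All parameter values below are illustrative, not the paper's AISI 4340 results:

```python
import numpy as np
from scipy.stats import weibull_min

# synthetic cycles-to-failure data; shape 2.0, minimum life (loc) 1e4 cycles
# and scale 5e4 cycles are illustrative assumptions
cycles = weibull_min.rvs(2.0, loc=1.0e4, scale=5.0e4, size=500,
                         random_state=np.random.default_rng(42))

# maximum-likelihood fit of the three-parameter Weibull (shape, location, scale)
c, loc, scale = weibull_min.fit(cycles)

# reliability at 30,000 cycles: R(n) = 1 - F(n), via the survival function
R = weibull_min.sf(3.0e4, c, loc, scale)
```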

  9. Reliable estimation of orbit errors in spaceborne SAR interferometry. The network approach

    NASA Astrophysics Data System (ADS)

    Bähr, Hermann; Hanssen, Ramon F.

    2012-12-01

    An approach to improving orbital state vectors by orbit error estimates derived from residual phase patterns in synthetic aperture radar interferograms is presented. For individual interferograms, an error representation by two parameters is motivated: the baseline error in cross-range and the rate of change of the baseline error in range. For their estimation, two alternatives are proposed: a least-squares approach that requires prior unwrapping and a less reliable grid-search method handling the wrapped phase. In both cases, reliability is enhanced by mutual control of error estimates in an overdetermined network of linearly dependent interferometric combinations of images. Thus, systematic biases, e.g., due to unwrapping errors, can be detected and iteratively eliminated. Regularising the solution by a minimum-norm condition results in quasi-absolute orbit errors that refer to particular images. For the 31 images of a sample ENVISAT dataset, orbit corrections with a mutual consistency at the millimetre level have been inferred from 163 interferograms. The method is distinguished by its reliability and rigorous geometric modelling of the orbital error signal, but it does not account for interfering large-scale deformation effects. However, a separation may be feasible in a combined processing with persistent scatterer approaches or by temporal filtering of the estimates.

  10. Reliability of infarct volumetry: Its relevance and the improvement by a software-assisted approach.

    PubMed

    Friedländer, Felix; Bohmann, Ferdinand; Brunkhorst, Max; Chae, Ju-Hee; Devraj, Kavi; Köhler, Yvette; Kraft, Peter; Kuhn, Hannah; Lucaciu, Alexandra; Luger, Sebastian; Pfeilschifter, Waltraud; Sadler, Rebecca; Liesz, Arthur; Scholtyschik, Karolina; Stolz, Leonie; Vutukuri, Rajkumar; Brunkhorst, Robert

    2017-08-01

    Despite the efficacy of neuroprotective approaches in animal models of stroke, their translation from bench to bedside has so far failed. One reason is presumed to be the low quality of preclinical study design, leading to bias and low a priori power. In this study, we propose that the key read-out of experimental stroke studies, the volume of the ischemic damage as commonly measured by freehand planimetry of TTC-stained brain sections, is subject to an unrecognized low inter-rater and test-retest reliability, with strong implications for statistical power and bias. As an alternative approach, we suggest a simple, open-source, software-assisted method taking advantage of automatic thresholding techniques. The validity and the improvement in reliability of an automated method for tMCAO infarct volumetry are demonstrated. In addition, we show the probable consequences of increased reliability for precision, p-values, effect inflation, and power calculation, exemplified by a systematic analysis of experimental stroke studies published in the year 2015. Our study reveals an underappreciated quality problem in translational stroke research and suggests that software-assisted infarct volumetry might help to improve reproducibility and therefore the robustness of bench-to-bedside translation.

  11. Cronbach's α, Revelle's β, and McDonald's ω_h: Their Relations with Each Other and Two Alternative Conceptualizations of Reliability

    ERIC Educational Resources Information Center

    Zinbarg, Richard E.; Revelle, William; Yovel, Iftah; Li, Wen

    2005-01-01

    We make theoretical comparisons among five coefficients: Cronbach's α, Revelle's β, McDonald's ω_h, and two alternative conceptualizations of reliability. Though many end users and psychometricians alike may not distinguish among these five coefficients, we demonstrate formally their nonequivalence. Specifically, whereas…

  12. 40 CFR 75.42 - Reliability criteria.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    40 CFR Protection of Environment (2010-07-01), Continuous Emission Monitoring, Alternative Monitoring Systems, § 75.42 Reliability criteria: To demonstrate reliability equal to or better than the continuous emission monitoring system, the owner or operator shall...

  13. Jansen-MIDAS: A multi-level photomicrograph segmentation software based on isotropic undecimated wavelets.

    PubMed

    de Siqueira, Alexandre Fioravante; Cabrera, Flávio Camargo; Nakasuga, Wagner Massayuki; Pagamisse, Aylton; Job, Aldo Eloizo

    2018-01-01

    Image segmentation, the process of separating the elements within a picture, is frequently used for obtaining information from photomicrographs. Segmentation methods should be used with reservations, since incorrect results can be misleading when interpreting regions of interest (ROI), decreasing the success rate of downstream procedures. Multi-Level Starlet Segmentation (MLSS) and Multi-Level Starlet Optimal Segmentation (MLSOS) were developed as alternatives to general segmentation tools. These methods gave rise to Jansen-MIDAS, an open-source software package that scientists can use to obtain several segmentations of their photomicrographs. It is a reliable alternative for processing different types of photomicrographs: previous versions of Jansen-MIDAS were used to segment ROIs in photomicrographs of two different materials with an accuracy above 89%. © 2017 Wiley Periodicals, Inc.

  14. Modeling panel detection frequencies by queuing system theory: an application in gas chromatography olfactometry.

    PubMed

    Bult, Johannes H F; van Putten, Bram; Schifferstein, Hendrik N J; Roozen, Jacques P; Voragen, Alphons G J; Kroeze, Jan H A

    2004-10-01

    In continuous vigilance tasks, the number of coincident panel responses to stimuli provides an index of stimulus detectability. To determine whether this number is due to chance, panel noise levels have been approximated by the maximum coincidence level obtained in stimulus-free conditions. This study proposes an alternative method by which to assess noise levels, derived from queuing system theory (QST). Instead of critical coincidence levels, QST modeling estimates the duration of coinciding responses in the absence of stimuli. The proposed method has the advantage over previous approaches that it yields more reliable noise estimates and allows for statistical testing. The method was applied in an olfactory detection experiment using 16 panelists in stimulus-present and stimulus-free conditions. We propose that QST may be used as an alternative to signal detection theory for analyzing data from continuous vigilance tasks.

  15. Value engineering on the designed operator work tools for brick and rings wells production

    NASA Astrophysics Data System (ADS)

    Ayu Bidiawati J., R.; Muchtiar, Yesmizarti; Wariza, Ragil Okta

    2017-06-01

    Operator work tools for making bricks and well rings were designed and built, and value engineering was applied to identify and develop the functions of these tools so as to balance cost, reliability and appearance. This study focused on the value of the functional components of the tools and attempted to increase the difference between the costs incurred and the values generated. The purpose of this study was to determine alternative tool designs and the performance of each alternative. The technique was developed using the FAST method, which consists of five stages: information, creative, analysis, development and presentation. The analysis concluded that the designed tools have higher value and a better function description. There were four alternative draft improvements for the operator work tools. The best alternative was determined by rank using an evaluation matrix; the best performance was obtained by alternative II, scoring 98.92 with a value of 0.77.

  16. Analyzing the Reliability of the easyCBM Reading Comprehension Measures: Grade 5. Technical Report #1204

    ERIC Educational Resources Information Center

    Park, Bitnara Jasmine; Irvin, P. Shawn; Lai, Cheng-Fei; Alonzo, Julie; Tindal, Gerald

    2012-01-01

    In this technical report, we present the results of a reliability study of the fifth-grade multiple choice reading comprehension measures available on the easyCBM learning system conducted in the spring of 2011. Analyses include split-half reliability, alternate form reliability, person and item reliability as derived from Rasch analysis,…

  17. Analyzing the Reliability of the easyCBM Reading Comprehension Measures: Grade 2. Technical Report #1201

    ERIC Educational Resources Information Center

    Lai, Cheng-Fei; Irvin, P. Shawn; Alonzo, Julie; Park, Bitnara Jasmine; Tindal, Gerald

    2012-01-01

    In this technical report, we present the results of a reliability study of the second-grade multiple choice reading comprehension measures available on the easyCBM learning system conducted in the spring of 2011. Analyses include split-half reliability, alternate form reliability, person and item reliability as derived from Rasch analysis,…

  18. Analyzing the Reliability of the easyCBM Reading Comprehension Measures: Grade 4. Technical Report #1203

    ERIC Educational Resources Information Center

    Park, Bitnara Jasmine; Irvin, P. Shawn; Alonzo, Julie; Lai, Cheng-Fei; Tindal, Gerald

    2012-01-01

    In this technical report, we present the results of a reliability study of the fourth-grade multiple choice reading comprehension measures available on the easyCBM learning system conducted in the spring of 2011. Analyses include split-half reliability, alternate form reliability, person and item reliability as derived from Rasch analysis,…

  19. Analyzing the Reliability of the easyCBM Reading Comprehension Measures: Grade 6. Technical Report #1205

    ERIC Educational Resources Information Center

    Irvin, P. Shawn; Alonzo, Julie; Park, Bitnara Jasmine; Lai, Cheng-Fei; Tindal, Gerald

    2012-01-01

    In this technical report, we present the results of a reliability study of the sixth-grade multiple choice reading comprehension measures available on the easyCBM learning system conducted in the spring of 2011. Analyses include split-half reliability, alternate form reliability, person and item reliability as derived from Rasch analysis,…

  20. Analyzing the Reliability of the easyCBM Reading Comprehension Measures: Grade 7. Technical Report #1206

    ERIC Educational Resources Information Center

    Irvin, P. Shawn; Alonzo, Julie; Lai, Cheng-Fei; Park, Bitnara Jasmine; Tindal, Gerald

    2012-01-01

    In this technical report, we present the results of a reliability study of the seventh-grade multiple choice reading comprehension measures available on the easyCBM learning system conducted in the spring of 2011. Analyses include split-half reliability, alternate form reliability, person and item reliability as derived from Rasch analysis,…

  1. Analyzing the Reliability of the easyCBM Reading Comprehension Measures: Grade 3. Technical Report #1202

    ERIC Educational Resources Information Center

    Lai, Cheng-Fei; Irvin, P. Shawn; Park, Bitnara Jasmine; Alonzo, Julie; Tindal, Gerald

    2012-01-01

    In this technical report, we present the results of a reliability study of the third-grade multiple choice reading comprehension measures available on the easyCBM learning system conducted in the spring of 2011. Analyses include split-half reliability, alternate form reliability, person and item reliability as derived from Rasch analysis,…

  2. A multiplex primer design algorithm for target amplification of continuous genomic regions.

    PubMed

    Ozturk, Ahmet Rasit; Can, Tolga

    2017-06-19

    Targeted Next Generation Sequencing (NGS) assays are cost-efficient and reliable alternatives to Sanger sequencing. For sequencing very large sets of genes, the target enrichment approach is suitable. However, for smaller genomic regions, the target amplification method is more efficient than both target enrichment and Sanger sequencing. The major difficulty of the target amplification method is the preparation of amplicons in terms of required time, equipment, and labor. Multiplex PCR (MPCR) is a good solution to these problems. We propose a novel method to design MPCR primers for a continuous genomic region, following the best practices of clinically reliable PCR design processes. In an experimental setup with 48 different combinations of factors, we have shown that multiple parameters might affect finding the first feasible solution. Increasing the length of the initial primer candidate selection sequence gives better results, whereas waiting for a longer time to find the first feasible solution does not have a significant impact. We generated MPCR primer designs for the whole HBB gene, the MEFV coding regions, and human exons between 2000 and 2100 bp long. Our benchmarking experiments show that the proposed MPCR approach is able to produce reliable NGS assay primers for a given sequence in a reasonable amount of time.

  3. The Effect of Achievement Test Selection on Identification of Learning Disabilities within a Patterns of Strengths and Weaknesses Framework

    PubMed Central

    Miciak, Jeremy; Taylor, Pat; Denton, Carolyn A.; Fletcher, Jack M.

    2014-01-01

    Purpose Few empirical investigations have evaluated learning disabilities (LD) identification methods based on a pattern of cognitive strengths and weaknesses (PSW). This study investigated the reliability of LD classification decisions of the concordance/discordance method (C/DM) across different psychoeducational assessment batteries. Methods C/DM criteria were applied to assessment data from 177 second grade students based on two psychoeducational assessment batteries. The achievement tests were different, but were highly correlated and measured the same latent construct. Resulting LD identifications were then evaluated for agreement across batteries on LD status and the academic domain of eligibility. Results The two batteries identified a similar number of participants as having LD (80 and 74). However, indices of agreement for classification decisions were low (kappa = .29), especially for percent positive agreement (62%). The two batteries demonstrated agreement on the academic domain of eligibility for only 25 participants. Conclusions Cognitive discrepancy frameworks for LD identification are inherently unstable because of imperfect reliability and validity at the observed level. Methods premised on identifying a PSW profile may never achieve high reliability because of these underlying psychometric factors. An alternative is to directly assess academic skills to identify students in need of intervention. PMID:25243467
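    The agreement indices reported above (kappa = .29, percent positive agreement = 62%) follow standard formulas; a minimal sketch for two binary identification vectors, with function names and toy data that are illustrative, not the study's:

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa: chance-corrected agreement between two binary decision vectors."""
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)                                             # observed agreement
    pe = np.mean(a) * np.mean(b) + np.mean(1 - a) * np.mean(1 - b)   # chance agreement
    return (po - pe) / (1.0 - pe)

def positive_agreement(a, b):
    """Proportion of positive identifications on which both raters/batteries agree."""
    a, b = np.asarray(a), np.asarray(b)
    return 2.0 * np.sum((a == 1) & (b == 1)) / (np.sum(a == 1) + np.sum(b == 1))
```

Identical vectors give kappa = 1, while agreement no better than chance gives kappa near 0, which is why a kappa of .29 signals weak classification stability despite similar identification counts.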

  4. Identification of FVIII gene mutations in patients with hemophilia A using new combinatorial sequencing by hybridization

    PubMed Central

    Chetta, M.; Drmanac, A.; Santacroce, R.; Grandone, E.; Surrey, S.; Fortina, P.; Margaglione, M.

    2008-01-01

    BACKGROUND: Standard methods of mutation detection are time-consuming in Hemophilia A (HA), rendering their application impractical in some analyses, such as prenatal diagnosis. OBJECTIVES: To evaluate the feasibility of combinatorial sequencing-by-hybridization (cSBH) as an alternative and reliable tool for mutation detection in the FVIII gene. PATIENTS/METHODS: We have applied a new method of cSBH that uses two different colors for detection of multiple point mutations in the FVIII gene. The 26 exons encompassing the HA gene were analyzed in 7 newly diagnosed Italian patients and in 19 previously characterized individuals with FVIII deficiency. RESULTS: Data show that, when solution-phase TAMRA- and QUASAR-labeled 5-mer oligonucleotide sets mixed with unlabeled target PCR templates are co-hybridized in the presence of DNA ligase to universal 6-mer oligonucleotide probe-based arrays, a number of mutations can be successfully detected. The technique was also reliable in identifying a mutant FVIII allele in an obligate heterozygote. A novel missense mutation (Leu1843Thr) in exon 16 and three novel neutral polymorphisms are presented with an updated protocol for 2-color cSBH. CONCLUSIONS: cSBH is a reliable tool for mutation detection in the FVIII gene and may represent a complementary method for the genetic screening of HA patients. PMID:20300295

  5. Validity and reliability of wii fit balance board for the assessment of balance of healthy young adults and the elderly.

    PubMed

    Chang, Wen-Dien; Chang, Wan-Yi; Lee, Chia-Lun; Feng, Chi-Yen

    2013-10-01

    [Purpose] Balance is an integral part of human ability. The smart balance master system (SBM) is a balance test instrument with good reliability and validity, but it is expensive. Therefore, we modified a Wii Fit balance board, which is a convenient balance assessment tool, and analyzed its reliability and validity. [Subjects and Methods] We recruited 20 healthy young adults and 20 elderly people, and administered 3 balance tests. The correlation coefficient and intraclass correlation of both instruments were analyzed. [Results] There were no statistically significant differences in the 3 tests between the Wii Fit balance board and the SBM. The Wii Fit balance board had a good intraclass correlation (0.86-0.99) for the elderly people and positive correlations (r = 0.58-0.86) with the SBM. [Conclusions] The Wii Fit balance board is a balance assessment tool with good reliability and high validity for elderly people, and we recommend it as an alternative tool for assessing balance ability.
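    The intraclass correlation reported above can be computed from two-way ANOVA mean squares. A sketch of ICC(2,1) (two-way random effects, absolute agreement, single measures), with toy data standing in for the balance scores:

```python
import numpy as np

def icc_2_1(x):
    """ICC(2,1): two-way random effects, absolute agreement, single measures."""
    x = np.asarray(x, dtype=float)   # shape (n_subjects, k_raters_or_trials)
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * np.sum((x.mean(axis=1) - grand) ** 2)   # between subjects
    ss_cols = n * np.sum((x.mean(axis=0) - grand) ** 2)   # between raters/devices
    ss_err = np.sum((x - grand) ** 2) - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# perfect agreement across two measurement devices yields ICC = 1.0
icc = icc_2_1([[1, 1], [2, 2], [3, 3]])
```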

  6. Consistency of clinical biomechanical measures between three different institutions: implications for multi-center biomechanical and epidemiological research.

    PubMed

    Myer, Gregory D; Wordeman, Samuel C; Sugimoto, Dai; Bates, Nathaniel A; Roewer, Benjamin D; Medina McKeon, Jennifer M; DiCesare, Christopher A; Di Stasi, Stephanie L; Barber Foss, Kim D; Thomas, Staci M; Hewett, Timothy E

    2014-05-01

    Multi-center collaborations provide a powerful alternative to overcome the inherent limitations of single-center investigations. Specifically, multi-center projects can support large-scale prospective, longitudinal studies that investigate relatively uncommon outcomes, such as anterior cruciate ligament injury. This project was conceived to assess within- and between-center reliability of an affordable, clinical nomogram utilizing two-dimensional video methods to screen for risk of knee injury. The authors hypothesized that the two-dimensional screening methods would provide good-to-excellent reliability within and between institutions for assessment of frontal and sagittal plane biomechanics. Nineteen female high school athletes participated. Two-dimensional video kinematics of the lower extremity during a drop vertical jump task were collected on all 19 study participants at each of the three facilities. Within-center and between-center reliability were assessed with intra- and inter-class correlation coefficients. Within-center reliability of the clinical nomogram variables was consistently excellent, but between-center reliability was fair-to-good. The within-center intra-class correlation coefficient for all nomogram variables combined was 0.98, while the combined between-center inter-class correlation coefficient was 0.63. Injury risk screening protocols were reliable within and repeatable between centers. These results demonstrate the feasibility of multi-site biomechanical studies and establish a framework for further dissemination of injury risk screening algorithms. Specifically, multi-center studies may allow for further validation and optimization of two-dimensional video screening tools. Level of evidence: 2b.

  7. Choosing a reliability inspection plan for interval censored data

    DOE PAGES

    Lu, Lu; Anderson-Cook, Christine Michaela

    2017-04-19

    Reliability test plans are important for producing precise and accurate assessment of reliability characteristics. This paper explores different strategies for choosing between possible inspection plans for interval censored data given a fixed testing timeframe and budget. A new general cost structure is proposed for guiding precise quantification of total cost in an inspection test plan. Multiple summaries of reliability are considered and compared as the criteria for choosing the best plans using an easily adapted method. Different cost structures and representative true underlying reliability curves demonstrate how to assess different strategies given the logistical constraints and nature of the problem. Results show several general patterns exist across a wide variety of scenarios. Given the fixed total cost, plans that inspect more units with less frequency based on equally spaced time points are favored due to the ease of implementation and consistent good performance across a large number of case study scenarios. Plans with inspection times chosen based on equally spaced probabilities offer improved reliability estimates for the shape of the distribution, mean lifetime, and failure time for a small fraction of the population only for applications with high infant mortality rates. The paper uses a Monte Carlo simulation based approach in addition to the common evaluation based on the asymptotic variance and offers comparison and recommendation for different applications with different objectives. Additionally, the paper outlines a variety of different reliability metrics to use as criteria for optimization, presents a general method for evaluating different alternatives, and provides case study results for different common scenarios.
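    The contrast between equally spaced inspection times and times chosen at equally spaced failure probabilities can be sketched for an assumed Weibull lifetime model; a shape parameter below 1 mimics the high-infant-mortality case, and all parameter values are illustrative:

```python
import numpy as np
from scipy.stats import weibull_min

def inspection_times(n, shape, scale, horizon):
    """Two candidate inspection schedules over a fixed test horizon (a sketch)."""
    t_equal = np.linspace(0.0, horizon, n + 1)[1:]        # equally spaced times
    p_max = weibull_min.cdf(horizon, shape, scale=scale)  # P(failure by horizon)
    probs = np.linspace(0.0, p_max, n + 1)[1:]            # equally spaced probabilities
    t_prob = weibull_min.ppf(probs, shape, scale=scale)   # invert the CDF
    return t_equal, t_prob

# shape < 1: decreasing hazard, i.e. high infant mortality
t_equal, t_prob = inspection_times(4, shape=0.5, scale=1000.0, horizon=1000.0)
# t_prob front-loads inspections into the early period where most failures occur
```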

  8. Reliability of capturing foot parameters using digital scanning and the neutral suspension casting technique

    PubMed Central

    2011-01-01

    Background A clinical study was conducted to determine the intra- and inter-rater reliability of digital scanning and the neutral suspension casting technique for measuring six foot parameters. The neutral suspension casting technique is a commonly utilised method for obtaining a negative impression of the foot prior to orthotic fabrication. Digital scanning offers an alternative to the traditional plaster of Paris techniques. Methods Twenty-one healthy participants volunteered to take part in the study. Six casts and six digital scans were obtained from each participant by two raters of differing clinical experience. The foot parameters chosen for investigation were cast length (mm), forefoot width (mm), rearfoot width (mm), medial arch height (mm), lateral arch height (mm) and forefoot to rearfoot alignment (degrees). Intraclass correlation coefficients (ICC) with 95% confidence intervals (CI) were calculated to determine the intra- and inter-rater reliability. Measurement error was assessed through the calculation of the standard error of measurement (SEM) and the smallest real difference (SRD). Results ICC values for all foot parameters using digital scanning ranged from 0.81 to 0.99 for both intra- and inter-rater reliability. For the neutral suspension casting technique, inter-rater reliability values ranged from 0.57 to 0.99, with intra-rater reliability values ranging from 0.36 to 0.99 for rater 1 and from 0.49 to 0.99 for rater 2. Conclusions The findings of this study indicate that digital scanning is a reliable technique, irrespective of clinical experience, with reduced measurement variability in all foot parameters investigated when compared to neutral suspension casting. PMID:21375757
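    The SEM and SRD used above follow directly from the ICC and the between-subject standard deviation: SEM = SD·√(1 − ICC) and SRD = 1.96·√2·SEM. A minimal sketch; the numeric inputs are illustrative, not the study's data:

```python
import numpy as np

def sem_srd(sd, icc):
    """Standard error of measurement and smallest real difference (95% level)."""
    sem = sd * np.sqrt(1.0 - icc)     # SEM = SD * sqrt(1 - ICC)
    srd = 1.96 * np.sqrt(2.0) * sem   # SRD = 1.96 * sqrt(2) * SEM
    return sem, srd

# hypothetical cast-length measurements in mm: SD = 10.0, ICC = 0.91
sem, srd = sem_srd(sd=10.0, icc=0.91)
```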

  9. Choosing a reliability inspection plan for interval censored data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Lu; Anderson-Cook, Christine Michaela

    Reliability test plans are important for producing precise and accurate assessment of reliability characteristics. This paper explores different strategies for choosing between possible inspection plans for interval censored data given a fixed testing timeframe and budget. A new general cost structure is proposed for guiding precise quantification of total cost in an inspection test plan. Multiple summaries of reliability are considered and compared as the criteria for choosing the best plans using an easily adapted method. Different cost structures and representative true underlying reliability curves demonstrate how to assess different strategies given the logistical constraints and nature of the problem. Results show several general patterns exist across a wide variety of scenarios. Given the fixed total cost, plans that inspect more units with less frequency based on equally spaced time points are favored due to the ease of implementation and consistent good performance across a large number of case study scenarios. Plans with inspection times chosen based on equally spaced probabilities offer improved reliability estimates for the shape of the distribution, mean lifetime, and failure time for a small fraction of the population only for applications with high infant mortality rates. The paper uses a Monte Carlo simulation based approach in addition to the common evaluation based on the asymptotic variance and offers comparison and recommendation for different applications with different objectives. Additionally, the paper outlines a variety of different reliability metrics to use as criteria for optimization, presents a general method for evaluating different alternatives, and provides case study results for different common scenarios.

  10. An advanced probabilistic structural analysis method for implicit performance functions

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
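    The mean-based second-moment step that the AMV method builds on can be sketched in a few lines: linearize the performance function g(X) at the input means and propagate the input variances through the gradient, giving a safety index β = μ_g/σ_g. The limit-state function and distribution parameters below are hypothetical, not from the paper:

```python
import numpy as np

def mean_value_moments(g, means, stds, h=1e-6):
    """First-order, mean-based estimate of the mean and standard deviation
    of g(X) for independent inputs (the second-moment step; a sketch)."""
    means = np.asarray(means, dtype=float)
    stds = np.asarray(stds, dtype=float)
    g0 = g(means)
    grad = np.empty_like(means)
    for i in range(means.size):   # forward finite differences at the mean point
        x = means.copy()
        x[i] += h
        grad[i] = (g(x) - g0) / h
    return g0, np.sqrt(np.sum((grad * stds) ** 2))

# hypothetical limit state: g = strength - load, failure when g < 0
g = lambda x: x[0] - x[1]
mu_g, sig_g = mean_value_moments(g, means=[50.0, 30.0], stds=[3.0, 4.0])
beta = mu_g / sig_g   # second-moment reliability (safety) index
```

For this linear g the result is exact; the AMV refinement in the paper improves on this linearization for nonlinear, implicitly defined response functions.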

  11. Comparison of sampling methodologies for nutrient monitoring in streams: uncertainties, costs and implications for mitigation

    NASA Astrophysics Data System (ADS)

    Audet, J.; Martinsen, L.; Hasler, B.; de Jonge, H.; Karydi, E.; Ovesen, N. B.; Kronvang, B.

    2014-07-01

    Eutrophication of aquatic ecosystems caused by excess concentrations of nitrogen and phosphorus may have harmful consequences for biodiversity and poses a health risk to humans via water supplies. Reduction of nitrogen and phosphorus losses to aquatic ecosystems involves implementation of costly measures, and reliable monitoring methods are therefore essential to select appropriate mitigation strategies and to evaluate their effects. Here, we compare the performances and costs of three methodologies for the monitoring of nutrients in rivers: grab sampling, time-proportional sampling and passive sampling using flow-proportional samplers. Assuming time-proportional sampling to be the best estimate of the "true" nutrient load, our results showed that the risk of obtaining wrong total nutrient load estimates by passive samplers is high despite costs similar to those of time-proportional sampling. Our conclusion is that for passive samplers to provide a reliable monitoring alternative, further development is needed. Grab sampling was the cheapest of the three methods and was more precise and accurate than passive sampling. We conclude that although monitoring employing time-proportional sampling is costly, its reliability precludes unnecessarily high implementation expenses.

  12. Comparison of sampling methodologies for nutrient monitoring in streams: uncertainties, costs and implications for mitigation

    NASA Astrophysics Data System (ADS)

    Audet, J.; Martinsen, L.; Hasler, B.; de Jonge, H.; Karydi, E.; Ovesen, N. B.; Kronvang, B.

    2014-11-01

    Eutrophication of aquatic ecosystems caused by excess concentrations of nitrogen and phosphorus may have harmful consequences for biodiversity and poses a health risk to humans via water supplies. Reduction of nitrogen and phosphorus losses to aquatic ecosystems involves implementation of costly measures, and reliable monitoring methods are therefore essential to select appropriate mitigation strategies and to evaluate their effects. Here, we compare the performances and costs of three methodologies for the monitoring of nutrients in rivers: grab sampling; time-proportional sampling; and passive sampling using flow-proportional samplers. Assuming hourly time-proportional sampling to be the best estimate of the "true" nutrient load, our results showed that the risk of obtaining wrong total nutrient load estimates by passive samplers is high despite costs similar to those of time-proportional sampling. Our conclusion is that for passive samplers to provide a reliable monitoring alternative, further development is needed. Grab sampling was the cheapest of the three methods and was more precise and accurate than passive sampling. We conclude that although monitoring employing time-proportional sampling is costly, its reliability precludes unnecessarily high implementation expenses.

  13. Reliability of Pressure Ulcer Rates: How Precisely Can We Differentiate Among Hospital Units, and Does the Standard Signal‐Noise Reliability Measure Reflect This Precision?

    PubMed Central

    Cramer, Emily

    2016-01-01

    Abstract Hospital performance reports often include rankings of unit pressure ulcer rates. Differentiating among units on the basis of quality requires reliable measurement. Our objectives were to describe and apply methods for assessing reliability of hospital‐acquired pressure ulcer rates and evaluate a standard signal‐noise reliability measure as an indicator of precision of differentiation among units. Quarterly pressure ulcer data from 8,199 critical care, step‐down, medical, surgical, and medical‐surgical nursing units from 1,299 US hospitals were analyzed. Using beta‐binomial models, we estimated between‐unit variability (signal) and within‐unit variability (noise) in annual unit pressure ulcer rates. Signal‐noise reliability was computed as the ratio of between‐unit variability to the total of between‐ and within‐unit variability. To assess precision of differentiation among units based on ranked pressure ulcer rates, we simulated data to estimate the probabilities of a unit's observed pressure ulcer rate rank in a given sample falling within five and ten percentiles of its true rank, and the probabilities of units with ulcer rates in the highest quartile and highest decile being identified as such. We assessed the signal‐noise measure as an indicator of differentiation precision by computing its correlations with these probabilities. Pressure ulcer rates based on a single year of quarterly or weekly prevalence surveys were too susceptible to noise to allow for precise differentiation among units, and signal‐noise reliability was a poor indicator of precision of differentiation. To ensure precise differentiation on the basis of true differences, alternative methods of assessing reliability should be applied to measures purported to differentiate among providers or units based on quality. © 2016 The Authors. Research in Nursing & Health published by Wiley Periodicals, Inc. PMID:27223598
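The signal-noise measure described above has a simple closed form: the ratio of between-unit variance to total variance. A minimal sketch, with the beta-binomial estimation step replaced by directly supplied variance components and illustrative numbers:

```python
def signal_noise_reliability(between_var, within_var):
    """Signal-noise reliability: between-unit variance over total variance."""
    return between_var / (between_var + within_var)

# Illustrative variance components for annual unit pressure ulcer rates.
r = signal_noise_reliability(between_var=0.0004, within_var=0.0012)  # 0.25
```

Small annual samples inflate the within-unit (noise) component, which is how rankings based on one year of surveys become unreliable even when true between-unit differences exist.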

  14. Power processing for electric propulsion

    NASA Technical Reports Server (NTRS)

    Finke, R. C.; Herron, B. G.; Gant, G. D.

    1975-01-01

    The inclusion of electric thruster systems in spacecraft design is considered. The propulsion requirements of such spacecraft dictate a wide range of thruster power levels and operational lifetimes, which must be matched by lightweight, efficient, and reliable thruster power processing systems. Electron bombardment ion thruster requirements are presented, and the performance characteristics of present power processing systems are reviewed. Design philosophies and alternatives in areas such as inverter type, arc protection, and control methods are discussed along with future performance potentials for meeting goals in the areas of power processor weight (10 kg/kW), efficiency (approaching 92 percent), reliability (0.96 for 15,000 hr), and thermal control capability (0.3 to 5 AU).

  15. Earth Observatory Satellite system definition study. Report no. 4: Management approach recommendations

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A management approach for the Earth Observatory Satellite (EOS) which will meet the challenge of a constrained cost environment is presented. Areas of consideration are contracting techniques, test philosophy, reliability and quality assurance requirements, commonality options, and documentation and control requirements. The various functional areas which were examined for cost reduction possibilities are identified. The recommended management approach is developed to show the primary and alternative methods.

  16. On the use of temperature for online condition monitoring of geared systems - A review

    NASA Astrophysics Data System (ADS)

    Touret, T.; Changenet, C.; Ville, F.; Lalmi, M.; Becquerelle, S.

    2018-02-01

    Gear unit condition monitoring is a key factor for mechanical system reliability management. When failing, gears and bearings may generate excessive vibration, debris and heat. Vibratory, acoustic or debris analyses are proven approaches to condition monitoring. An alternative to those methods is to use temperature as a condition indicator to detect gearbox failure. This review focuses on condition monitoring studies which use this thermal approach. Studies are distinguished according to the failure type and the measurement method, i.e., whether a contact temperature sensor (e.g. thermocouple) or a non-contact one (e.g. thermography) is used. Capabilities and limitations of this approach are discussed. It is shown that the use of temperature for condition monitoring has clear potential as an alternative to vibratory or acoustic health monitoring.

  17. The inter-rater reliability and prognostic value of coma scales in Nepali children with acute encephalitis syndrome.

    PubMed

    Ray, Stephen; Rayamajhi, Ajit; Bonnett, Laura J; Solomon, Tom; Kneen, Rachel; Griffiths, Michael J

    2018-02-01

    Background: Acute encephalitis syndrome (AES) is a common cause of coma in Nepali children. The Glasgow coma scale (GCS) is used to assess the level of coma in these patients and predict outcome. Alternative coma scales may have better inter-rater reliability and prognostic value in encephalitis in Nepali children, but this has not been studied. The Adelaide coma scale (ACS), the Blantyre coma scale (BCS) and the Alert, Verbal, Pain, Unresponsive (AVPU) scale are alternatives to the GCS. Methods: Children aged 1-14 years who presented to Kanti Children's Hospital, Kathmandu with AES between September 2010 and November 2011 were recruited. All four coma scales (GCS, ACS, BCS and AVPU) were applied on admission, 48 h later and on discharge. Inter-rater reliability (unweighted kappa) was measured for each. Correlation and agreement between total coma score and outcome (Liverpool outcome score) was measured by Spearman's rank and Bland-Altman plot. The prognostic value of coma scales alone and in combination with physiological variables was investigated in a subgroup (n = 22). A multivariable logistic regression model was fitted by backward stepwise selection. Results: Fifty children were recruited. Inter-rater reliability for the various scales was fair to moderate. However, the scales poorly predicted clinical outcome. Combining the scales with physiological parameters such as systolic blood pressure improved outcome prediction. Conclusion: This is the first study to compare four coma scales in Nepali children with AES. The scales exhibited fair to moderate inter-rater reliability. However, the study is inadequately powered to answer the question on the relationship between coma scales and outcome. Further larger studies are required.
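Unweighted kappa, the inter-rater reliability statistic used above, compares observed agreement between two raters against agreement expected by chance. A minimal sketch with made-up ratings (the category labels are illustrative, not study data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two raters scoring the same subjects."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Illustrative AVPU-style ratings from two observers.
kappa = cohens_kappa(["A", "A", "V", "P"], ["A", "V", "V", "P"])
```

Values around 0.2-0.4 are conventionally read as "fair" and 0.4-0.6 as "moderate", matching the ranges reported in the abstract.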

  18. Adaptation of the ToxRTool to Assess the Reliability of Toxicology Studies Conducted with Genetically Modified Crops and Implications for Future Safety Testing.

    PubMed

    Koch, Michael S; DeSesso, John M; Williams, Amy Lavin; Michalek, Suzanne; Hammond, Bruce

    2016-01-01

    To determine the reliability of food safety studies carried out in rodents with genetically modified (GM) crops, a Food Safety Study Reliability Tool (FSSRTool) was adapted from the European Centre for the Validation of Alternative Methods' (ECVAM) ToxRTool. Reliability was defined as the inherent quality of the study with regard to use of standardized testing methodology, full documentation of experimental procedures and results, and the plausibility of the findings. Codex guidelines for GM crop safety evaluations indicate toxicology studies are not needed when comparability of the GM crop to its conventional counterpart has been demonstrated. This guidance notwithstanding, animal feeding studies have routinely been conducted with GM crops, but their conclusions on safety are not always consistent. To accurately evaluate potential risks from GM crops, risk assessors need clearly interpretable results from reliable studies. The development of the FSSRTool, which provides the user with a means of assessing the reliability of a toxicology study to inform risk assessment, is discussed. Its application to the body of literature on GM crop food safety studies demonstrates that reliable studies report no toxicologically relevant differences between rodents fed GM crops or their non-GM comparators.

  19. The concurrent validity and reliability of a low-cost, high-speed camera-based method for measuring the flight time of vertical jumps.

    PubMed

    Balsalobre-Fernández, Carlos; Tejero-González, Carlos M; del Campo-Vecino, Juan; Bavaresco, Nicolás

    2014-02-01

    Flight time is the most accurate and frequently used variable when assessing the height of vertical jumps. The purpose of this study was to analyze the validity and reliability of an alternative method (i.e., the HSC-Kinovea method) for measuring the flight time and height of vertical jumping using a low-cost high-speed Casio Exilim FH-25 camera (HSC). To this end, 25 subjects performed a total of 125 vertical jumps on an infrared (IR) platform while simultaneously being recorded with a HSC at 240 fps. Subsequently, 2 observers with no experience in video analysis analyzed the 125 videos independently using the open-license Kinovea 0.8.15 software. The flight times obtained were then converted into vertical jump heights, and the intraclass correlation coefficient (ICC), Bland-Altman plot, and Pearson correlation coefficient were calculated for those variables. The results showed a perfect correlation agreement (ICC = 1, p < 0.0001) between both observers' measurements of flight time and jump height and a highly reliable agreement (ICC = 0.997, p < 0.0001) between the observers' measurements of flight time and jump height using the HSC-Kinovea method and those obtained using the IR system, thus explaining 99.5% (p < 0.0001) of the differences (shared variance) obtained using the IR platform. As a result, besides requiring no previous experience in the use of this technology, the HSC-Kinovea method can be considered to provide similarly valid and reliable measurements of flight time and vertical jump height as more expensive equipment (i.e., IR). As such, coaches from many sports could use the HSC-Kinovea method to measure the flight time and height of their athlete's vertical jumps.
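The flight-time-to-height conversion mentioned above follows from projectile kinematics (h = g·t²/8, with t the flight time); the formula itself is standard physics, not spelled out in the abstract. A minimal sketch, assuming airborne frames are counted in Kinovea at the stated 240 fps:

```python
G = 9.81  # gravitational acceleration, m/s^2

def flight_time_from_frames(n_frames, fps=240):
    """Flight time in seconds from a count of airborne video frames."""
    return n_frames / fps

def jump_height_m(flight_time_s, g=G):
    """Vertical jump height from flight time: h = g * t^2 / 8."""
    return g * flight_time_s ** 2 / 8.0

# Illustrative: 120 airborne frames at 240 fps -> 0.5 s flight -> ~0.31 m jump.
h = jump_height_m(flight_time_from_frames(120))
```

At 240 fps a one-frame counting error changes the flight time by ~4 ms, which is part of why the method's agreement with the infrared platform is so close.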

  20. Implementation of the 3Rs (refinement, reduction, and replacement): validation and regulatory acceptance considerations for alternative toxicological test methods.

    PubMed

    Schechtman, Leonard M

    2002-01-01

    Toxicological testing in the current regulatory environment is steeped in a history of using animals to answer questions about the safety of products to which humans are exposed. That history forms the basis for the testing strategies that have evolved to satisfy the needs of the regulatory bodies that render decisions that affect, for the most part, virtually all phases of premarket product development and evaluation and, to a lesser extent, postmarketing surveillance. Only relatively recently have the levels of awareness of, and responsiveness to, animal welfare issues reached current proportions. That paradigm shift, although sluggish, has nevertheless been progressive. New and alternative toxicological methods for hazard evaluation and risk assessment have now been adopted and are being viewed as a means to address those issues in a manner that considers humane treatment of animals yet maintains scientific credibility and preserves the goal of ensuring human safety. To facilitate this transition, regulatory agencies and regulated industry must work together toward improved approaches. They will need assurance that the methods will be reliable and the results comparable with, or better than, those derived from the current classical methods. That confidence will be a function of the scientific validation and resultant acceptance of any given method. In the United States, to fulfill this need, the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) and its operational center, the National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM), have been constituted as prescribed in federal law. Under this mandate, ICCVAM has developed a process and established criteria for the scientific validation and regulatory acceptance of new and alternative methods. The role of ICCVAM in the validation and acceptance process and the criteria instituted toward that end are described. 
Also discussed are the participation of the US Food and Drug Administration (FDA) in the ICCVAM process and that agency's approach to the application and implementation of ICCVAM-recommended methods.

  1. Improved Accuracy of the Inherent Shrinkage Method for Fast and More Reliable Welding Distortion Calculations

    NASA Astrophysics Data System (ADS)

    Mendizabal, A.; González-Díaz, J. B.; San Sebastián, M.; Echeverría, A.

    2016-07-01

    This paper describes the implementation of a simple strategy adopted for the inherent shrinkage method (ISM) to predict welding-induced distortion. This strategy not only makes it possible for the ISM to reach accuracy levels similar to the detailed transient analysis method (considered the most reliable technique for calculating welding distortion) but also significantly reduces the time required for these types of calculations. This strategy is based on the sequential activation of welding blocks to account for welding direction and transient movement of the heat source. As a result, a significant improvement in distortion prediction is achieved. This is demonstrated by experimentally measuring and numerically analyzing distortions in two case studies: a vane segment subassembly of an aero-engine, represented with 3D-solid elements, and a car body component, represented with 3D-shell elements. The proposed strategy proves to be a good alternative for quickly estimating the correct behaviors of large welded components and may have important practical applications in the manufacturing industry.

  2. Technological advances in bovine mastitis diagnosis: an overview.

    PubMed

    Duarte, Carla M; Freitas, Paulo P; Bexiga, Ricardo

    2015-11-01

    Bovine mastitis is an economic burden for dairy farmers and preventive control measures are crucial for the sustainability of any dairy business. The identification of etiological agents is necessary in controlling the disease, reducing risk of chronic infections and targeting antimicrobial therapy. The suitability of a detection method for routine diagnosis depends on several factors, including specificity, sensitivity, cost, time in producing results, and suitability for large-scale sampling of milk. This article focuses on current methodologies for identification of mastitis pathogens and for detection of inflammation, as well as the advantages and disadvantages of different methods. Emerging technologies, such as transcriptome and proteome analyses and nano- and microfabrication of portable devices, offer promising, sensitive methods for advanced detection of mastitis pathogens and biomarkers of inflammation. The demand for alternative, fast, and reliable diagnostic procedures is rising as farms become bigger. Several examples of technological and scientific advances are summarized which have given rise to more sensitive, reliable and faster diagnostic results. © 2015 The Author(s).

  3. Life Cycle Assessment for desalination: a review on methodology feasibility and reliability.

    PubMed

    Zhou, Jin; Chang, Victor W-C; Fane, Anthony G

    2014-09-15

    As concerns of natural resource depletion and environmental degradation caused by desalination increase, research studies of the environmental sustainability of desalination are growing in importance. Life Cycle Assessment (LCA) is an ISO-standardized method and is widely applied to evaluate the environmental performance of desalination. This study reviews more than 30 desalination LCA studies since the 2000s and identifies two major issues in need of improvement. The first is feasibility, covering three elements that support the implementation of LCA for desalination: accounting methods, supporting databases, and life cycle impact assessment approaches. The second is reliability, addressing three essential aspects that drive uncertainty in results: the incompleteness of the system boundary, the unrepresentativeness of the database, and the omission of uncertainty analysis. This work can serve as a preliminary LCA reference for desalination specialists, but will also strengthen LCA as an effective method to evaluate the environmental footprint of desalination alternatives. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Comparative study between the hand-wrist method and cervical vertebral maturation method for evaluation skeletal maturity in cleft patients.

    PubMed

    Manosudprasit, Montian; Wangsrimongkol, Tasanee; Pisek, Poonsak; Chantaramungkorn, Melissa

    2013-09-01

    To test the measure of agreement between use of the Skeletal Maturation Index (SMI) method of Fishman using hand-wrist radiographs and the Cervical Vertebral Maturation Index (CVMI) method for assessing skeletal maturity of the cleft patients. Hand-wrist and lateral cephalometric radiographs of 60 cleft subjects (35 females and 25 males, age range: 7-16 years) were used. Skeletal age was assessed using an adjustment to the SMI method of Fishman to compare with the CVMI method of Hassel and Farman. Agreement between skeletal age assessed by both methods and the intra- and inter-examiner reliability of both methods were tested by weighted kappa analysis. There was good agreement between the two methods with a kappa value of 0.80 (95% CI = 0.66-0.88, p-value <0.001). Reliability of intra- and inter-examiner of both methods was very good with kappa value ranging from 0.91 to 0.99. The CVMI method can be used as an alternative to the SMI method in skeletal age assessment in cleft patients with the benefit of no need of an additional radiograph and avoiding extra-radiation exposure. Comparing the two methods, the present study found better agreement from peak of adolescence onwards.

  5. Estimating the Population Size of Female Sex Worker Population in Tehran, Iran: Application of Direct Capture-Recapture Method.

    PubMed

    Karami, Manoochehr; Khazaei, Salman; Poorolajal, Jalal; Soltanian, Alireza; Sajadipoor, Mansour

    2017-08-01

    There is no reliable estimate of the size of the female sex worker (FSW) population. This study aimed to estimate the number of FSWs in the south of Tehran, Iran in 2016 using the direct capture-recapture method. In the capture phase, the hangouts of FSWs were mapped as their meeting places. FSWs who agreed to participate in the study were tagged with a T-shirt. The recapture phase was implemented at the same places, tagging FSWs with a blue bracelet. The total estimated number of FSWs was 690 (95% CI 633, 747). About 89.43% of FSWs experienced sexual intercourse before age 20. The prevalence of human immunodeficiency virus infection among FSWs was 4.60%. The estimated population size of FSWs was much larger than expected. This issue must be the focus of special attention when planning prevention strategies. However, alternative estimation methods are required to estimate the number of FSWs reliably.
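Two-sample capture-recapture sizes like the one above are typically computed with a Lincoln-Petersen-type estimator; the bias-corrected Chapman form with a normal-approximation interval is one common choice. The counts below are illustrative, not the study's data, and the exact estimator used in the paper is an assumption:

```python
import math

def chapman_estimate(marked, captured, recaptured):
    """Bias-corrected Lincoln-Petersen (Chapman) population size estimate."""
    return (marked + 1) * (captured + 1) / (recaptured + 1) - 1

def chapman_ci(marked, captured, recaptured, z=1.96):
    """Normal-approximation 95% confidence interval for the Chapman estimator."""
    n_hat = chapman_estimate(marked, captured, recaptured)
    var = ((marked + 1) * (captured + 1) * (marked - recaptured) * (captured - recaptured)
           / ((recaptured + 1) ** 2 * (recaptured + 2)))
    half = z * math.sqrt(var)
    return n_hat - half, n_hat + half

# Illustrative counts: 200 tagged in capture phase, 150 encountered in recapture,
# 50 seen in both phases.
n_hat = chapman_estimate(200, 150, 50)
```

The method assumes a closed population and equal catchability across phases, both of which are strong assumptions for a hidden population such as FSWs.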

  6. Alternative Fuels Data Center: Propane Rolls on as Reliable Fleet Fuel

    Science.gov Websites

    March 6, 2015: Propane Rolls on as Reliable Fleet Fuel. A school district describes its alternative fuels program for buses as a way to save money and clean up the air and environment: "If we can save the district money and prevent pollution for our kids' sake in the process, I don't see a" (quote truncated in source).

  7. An alternative to the balance error scoring system: using a low-cost balance board to improve the validity/reliability of sports-related concussion balance testing.

    PubMed

    Chang, Jasper O; Levy, Susan S; Seay, Seth W; Goble, Daniel J

    2014-05-01

    Recent guidelines advocate sports medicine professionals to use balance tests to assess sensorimotor status in the management of concussions. The present study sought to determine whether a low-cost balance board could provide a valid, reliable, and objective means of performing this balance testing. Criterion validity testing relative to a gold standard and 7 day test-retest reliability. University biomechanics laboratory. Thirty healthy young adults. Balance ability was assessed on 2 days separated by 1 week using (1) a gold standard measure (ie, scientific grade force plate), (2) a low-cost Nintendo Wii Balance Board (WBB), and (3) the Balance Error Scoring System (BESS). Validity of the WBB center of pressure path length and BESS scores were determined relative to the force plate data. Test-retest reliability was established based on intraclass correlation coefficients. Composite scores for the WBB had excellent validity (r = 0.99) and test-retest reliability (R = 0.88). Both the validity (r = 0.10-0.52) and test-retest reliability (r = 0.61-0.78) were lower for the BESS. These findings demonstrate that a low-cost balance board can provide improved balance testing accuracy/reliability compared with the BESS. This approach provides a potentially more valid/reliable, yet affordable, means of assessing sports-related concussion compared with current methods.
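The center of pressure (COP) path length used above as the balance-board score is the cumulative distance traveled by the COP over the trial. A minimal sketch over an assumed stream of (x, y) samples; the sampling details are illustrative:

```python
import math

def cop_path_length(points):
    """Total path length of successive center-of-pressure samples [(x, y), ...]."""
    return sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(points, points[1:]))

# Illustrative: a COP trace around the unit square back to its start -> length 4.0.
length = cop_path_length([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)])
```

A longer path length indicates more postural sway, so lower values correspond to better balance, which is the sense in which the board score substitutes for BESS error counts.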

  8. Reliability of the Roussel Uclaf Causality Assessment Method for Assessing Causality in Drug-Induced Liver Injury*

    PubMed Central

    Rochon, James; Protiva, Petr; Seeff, Leonard B.; Fontana, Robert J.; Liangpunsakul, Suthat; Watkins, Paul B.; Davern, Timothy; McHutchison, John G.

    2013-01-01

    The Roussel Uclaf Causality Assessment Method (RUCAM) was developed to quantify the strength of association between a liver injury and the medication implicated as causing the injury. However, its reliability in a research setting has never been fully explored. The aim of this study was to determine test-retest and interrater reliabilities of RUCAM in retrospectively-identified cases of drug induced liver injury. The Drug-Induced Liver Injury Network is enrolling well-defined cases of hepatotoxicity caused by isoniazid, phenytoin, clavulanate/amoxicillin, or valproate occurring since 1994. Each case was adjudicated by three reviewers working independently; after an interval of at least 5 months, cases were readjudicated by the same reviewers. A total of 40 drug-induced liver injury cases were enrolled including individuals treated with isoniazid (nine), phenytoin (five), clavulanate/amoxicillin (15), and valproate (11). Mean ± standard deviation age at protocol-defined onset was 44.8 ± 19.5 years; patients were 68% female and 78% Caucasian. Cases were classified as hepatocellular (44%), mixed (28%), or cholestatic (28%). Test-retest differences ranged from −7 to +8 with complete agreement in only 26% of cases. On average, the maximum absolute difference among the three reviewers was 3.1 on the first adjudication and 2.7 on the second, although much of this variability could be attributed to differences between the enrolling investigator and the external reviewers. The test-retest reliability by the same assessors was 0.54 (upper 95% confidence limit = 0.77); the interrater reliability was 0.45 (upper 95% confidence limit = 0.58). Categorizing the RUCAM to a five-category scale improved these reliabilities but only marginally. Conclusion The mediocre reliability of the RUCAM is problematic for future studies of drug-induced liver injury. 
Alternative methods, including modifying the RUCAM, developing drug-specific instruments, or causality assessment based on expert opinion, may be more appropriate. PMID:18798340

  9. Alternative Test Methods for Electronic Parts

    NASA Technical Reports Server (NTRS)

    Plante, Jeannette

    2004-01-01

    It is common practice within NASA to test electronic parts at the manufacturing lot level to demonstrate, statistically, that parts from the lot tested will not fail in service using generic application conditions. The test methods and the generic application conditions used have been developed over the years through cooperation between NASA, DoD, and industry in order to establish a common set of standard practices. These common practices, found in MIL-STD-883, MIL-STD-750, military part specifications, EEE-INST-002, and other guidelines are preferred because they are considered to be effective and repeatable and their results are usually straightforward to interpret. These practices can sometimes be unavailable to some NASA projects due to special application conditions that must be addressed, such as schedule constraints, cost constraints, logistical constraints, or advances in the technology that make the historical standards an inappropriate choice for establishing part performance and reliability. Alternate methods have begun to emerge and to be used by NASA programs to test parts individually or as part of a system, especially when standard lot tests cannot be applied. Four alternate screening methods will be discussed in this paper: Highly accelerated life test (HALT), forward voltage drop tests for evaluating wire-bond integrity, burn-in options during or after highly accelerated stress test (HAST), and board-level qualification.

  10. Reliable Thermoelectric Module Design under Opposing Requirements from Structural and Thermoelectric Considerations

    NASA Astrophysics Data System (ADS)

    Karri, Naveen K.; Mo, Changki

    2018-06-01

    Structural reliability of thermoelectric generation (TEG) systems still remains an issue, especially for applications such as large-scale industrial or automobile exhaust heat recovery, in which TEG systems are subject to dynamic loads and thermal cycling. Traditional thermoelectric (TE) system design and optimization techniques, focused on performance alone, could result in designs that may fail during operation as the geometric requirements for optimal performance (especially the power) are often in conflict with the requirements for mechanical reliability. This study focused on reducing the thermomechanical stresses in a TEG system without compromising the optimized system performance. Finite element simulations were carried out to study the effect of TE element (leg) geometry such as leg length and cross-sectional shape under constrained material volume requirements. Results indicated that the element length has a major influence on the element stresses whereas regular cross-sectional shapes have minor influence. The impact of TE element stresses on the mechanical reliability is evaluated using brittle material failure theory based on Weibull analysis. An alternate couple configuration that relies on the industry practice of redundant element design is investigated. Results showed that the alternate configuration considerably reduced the TE element and metallization stresses, thereby enhancing the structural reliability, with little trade-off in the optimized performance. The proposed alternate configuration could serve as a potential design modification for improving the reliability of systems optimized for thermoelectric performance.
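Brittle-material failure risk under Weibull analysis, as invoked above, is commonly expressed through the two-parameter form P_f = 1 − exp(−(σ/σ₀)^m). This is the standard textbook expression, not necessarily the exact formulation in the paper, and the strength values below are illustrative:

```python
import math

def weibull_failure_probability(stress, sigma0, m):
    """Two-parameter Weibull failure probability for a brittle TE element.

    sigma0: characteristic strength; m: Weibull modulus (scatter in strength).
    """
    return 1.0 - math.exp(-((stress / sigma0) ** m))

# Illustrative: stress equal to the characteristic strength gives P_f = 1 - 1/e.
p_f = weibull_failure_probability(100.0, sigma0=100.0, m=10.0)
```

Because P_f rises steeply with stress for large m, even modest stress reductions from the alternate couple configuration can translate into a substantial reliability gain.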

  11. A New Green Method for the Quantitative Analysis of Enrofloxacin by Fourier-Transform Infrared Spectroscopy.

    PubMed

    Rebouças, Camila Tavares; Kogawa, Ana Carolina; Salgado, Hérida Regina Nunes

    2018-05-18

    Background: A green analytical chemistry method was developed for quantification of enrofloxacin in tablets. The drug, a second-generation fluoroquinolone, was first introduced in veterinary medicine for the treatment of various bacterial species. Objective: This study proposed to develop, validate, and apply a reliable, low-cost, fast, and simple IR spectroscopy method for quantitative routine determination of enrofloxacin in tablets. Methods: The method was completely validated according to the International Conference on Harmonisation guidelines, showing accuracy, precision, selectivity, robustness, and linearity. Results: It was linear over the concentration range of 1.0-3.0 mg with correlation coefficients >0.9999 and LOD and LOQ of 0.12 and 0.36 mg, respectively. Conclusions: Now that this IR method has met performance qualifications, it can be adopted and applied for the analysis of enrofloxacin tablets for production process control. The validated method can also be utilized to quantify enrofloxacin in tablets and thus is an environmentally friendly alternative for the routine analysis of enrofloxacin in quality control. Highlights: A new green method for the quantitative analysis of enrofloxacin by Fourier-Transform Infrared spectroscopy was validated. It is a fast, clean and low-cost alternative for the evaluation of enrofloxacin tablets.
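For an ICH-validated method like the one above, LOD and LOQ are conventionally taken as 3.3σ/S and 10σ/S (σ = residual standard deviation of the calibration, S = slope); the reported 0.12 and 0.36 mg are consistent with that 3:1 ratio, though the abstract does not state which ICH formula was used. A minimal sketch with illustrative calibration values:

```python
def lod(residual_sd, slope):
    """Limit of detection per the ICH Q2 convention: 3.3 * sigma / S."""
    return 3.3 * residual_sd / slope

def loq(residual_sd, slope):
    """Limit of quantitation per the ICH Q2 convention: 10 * sigma / S."""
    return 10.0 * residual_sd / slope

# Illustrative calibration: residual SD 0.036 (absorbance units), slope 1.0 per mg.
detection_limit = lod(0.036, 1.0)
quantitation_limit = loq(0.036, 1.0)
```

With these assumed inputs the sketch reproduces limits near the reported 0.12 and 0.36 mg, but the actual σ and S are not given in the abstract.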

  12. Monitoring uterine activity during labor: a comparison of 3 methods.

    PubMed

    Euliano, Tammy Y; Nguyen, Minh Tam; Darmanjian, Shalom; McGorray, Susan P; Euliano, Neil; Onkala, Allison; Gregg, Anthony R

    2013-01-01

    Tocodynamometry (Toco; strain gauge technology) provides contraction frequency and approximate duration of labor contractions but suffers frequent signal dropout, necessitating repositioning by a nurse, and may fail in obese patients. The alternative invasive intrauterine pressure catheter (IUPC) is more reliable and adds contraction pressure information but requires ruptured membranes and introduces small risks of infection and abruption. Electrohysterography (EHG) reports the electrical activity of the uterus through electrodes placed on the maternal abdomen. This study compared all 3 methods of contraction detection simultaneously in laboring women. Upon consent, laboring women were monitored simultaneously with Toco, EHG, and IUPC. Contraction curves were generated in real-time for the EHG, and all 3 curves were stored electronically. A contraction detection algorithm was used to compare frequency and timing between methods. Seventy-three subjects were enrolled in the study; 14 were excluded due to hardware failure of 1 or more of the devices (n = 12) or inadequate data collection duration (n = 2). In comparison with the gold-standard IUPC, EHG performed significantly better than Toco with regard to the Contractions Consistency Index (CCI). The mean CCI for EHG was 0.88 ± 0.17 compared with 0.69 ± 0.27 for Toco (P < .0001). In contrast to Toco, EHG was not significantly affected by obesity. Toco does not correlate well with the gold-standard IUPC and fails more frequently in obese patients. EHG provides a reliable noninvasive alternative, regardless of body habitus. Copyright © 2013 Mosby, Inc. All rights reserved.

  13. Monitoring uterine activity during labor: a comparison of three methods

    PubMed Central

    EULIANO, Tammy Y.; NGUYEN, Minh Tam; DARMANJIAN, Shalom; MCGORRAY, Susan P.; EULIANO, Neil; ONKALA, Allison; GREGG, Anthony R.

    2012-01-01

    Objective Tocodynamometry (Toco—strain gauge technology) provides contraction frequency and approximate duration of labor contractions, but suffers frequent signal dropout necessitating re-positioning by a nurse, and may fail in obese patients. The alternative invasive intrauterine pressure catheter (IUPC) is more reliable and adds contraction pressure information, but requires ruptured membranes and introduces small risks of infection and abruption. Electrohysterography (EHG) reports the electrical activity of the uterus through electrodes placed on the maternal abdomen. This study compared all three methods of contraction detection simultaneously in laboring women. Study Design Upon consent, laboring women were monitored simultaneously with Toco, EHG, and IUPC. Contraction curves were generated in real-time for the EHG and all three curves were stored electronically. A contraction detection algorithm was used to compare frequency and timing between methods. Seventy-three subjects were enrolled in the study; 14 were excluded due to hardware failure of one or more of the devices (12) or inadequate data collection duration (2). Results In comparison with the gold-standard IUPC, EHG performed significantly better than Toco with regard to Contractions Consistency Index (CCI). The mean CCI for EHG was 0.88 ± 0.17 compared to 0.69 ± 0.27 for Toco (p<.0001). In contrast to Toco, EHG was not significantly affected by obesity. Conclusion Toco does not correlate well with the gold-standard IUPC and fails more frequently in obese patients. EHG provides a reliable non-invasive alternative regardless of body habitus. PMID:23122926

  14. Rapid detection of Listeria monocytogenes in raw milk and soft cheese by a redox potential measurement based method combined with real-time PCR.

    PubMed

    Erdősi, Orsolya; Szakmár, Katalin; Reichart, Olivér; Szili, Zsuzsanna; László, Noémi; Székely Körmöczy, Péter; Laczay, Péter

    2014-09-01

    The incidence of outbreaks of foodborne listeriosis has indicated the need for reliable and rapid detection of the microbe in different foodstuffs. A method combining redox potential measurement and real-time polymerase chain reaction (PCR) was developed to detect Listeria monocytogenes in artificially contaminated raw milk and soft cheese. Food samples of 25 g or 25 ml were homogenised in 225 ml of Listeria Enrichment Broth (LEB) with Oxford supplement, and the redox potential measurement technique was applied. For Listeria species, the measurement time was at most 34 h. The absence of L. monocytogenes could reliably be proven by the redox potential measurement method, but Listeria innocua and Bacillus subtilis could not be differentiated from L. monocytogenes on the basis of the redox curves. The presence of L. monocytogenes therefore had to be confirmed by real-time PCR. The combination of these two methods detected < 10 cfu/g of L. monocytogenes in a cost- and time-effective manner. This method can potentially be used as an alternative to the standard nutrient method for the rapid detection of L. monocytogenes in food.

  15. A fast and reliable readout method for quantitative analysis of surface-enhanced Raman scattering nanoprobes on chip surface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Hyejin; Jeong, Sinyoung; Ko, Eunbyeol

    2015-05-15

    Surface-enhanced Raman scattering (SERS) techniques have been widely used for bioanalysis due to their high sensitivity and multiplex capacity. However, the point-scanning method using a micro-Raman system, which is the most common method in the literature, has the disadvantage of extremely long measurement times for on-chip immunoassays, which combine a large chip area of approximately 1-mm scale with a confocal beam spot of ca. 1-μm size. Alternative methods, such as a sampled spot scan with high confocality and a large-area scan with an enlarged field of view and low confocality, have been used to keep the measurement time practical. In this study, we analyzed the two methods with respect to signal-to-noise ratio and sampling-induced signal fluctuations to obtain insights into a fast and reliable readout strategy. On this basis, we proposed a methodology for fast and reliable quantitative measurement of the whole chip area. The proposed method adopted a raster scan covering the full 100 μm × 100 μm region as a proof-of-concept experiment while accumulating signals in the CCD detector into a single spectrum per frame. A single 10-s scan over the 100 μm × 100 μm area yielded much higher sensitivity than sampled spot-scanning measurements and showed none of the signal fluctuations attributed to sampled spot scans. This readout method can serve as one of the key technologies that will bring quantitative multiplexed detection and analysis into practice.

  16. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Reading Assessments: Grade 1. Technical Report #1216

    ERIC Educational Resources Information Center

    Anderson, Daniel; Park, Jasmine, Bitnara; Lai, Cheng-Fei; Alonzo, Julie; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in spring 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest. Due…

  17. A Bayesian-Based EDA Tool for Nano-circuits Reliability Calculations

    NASA Astrophysics Data System (ADS)

    Ibrahim, Walid; Beiu, Valeriu

    As the sizes of (nano-)devices are aggressively scaled deep into the nanometer range, the design and manufacturing of future (nano-)circuits will become extremely complex and inevitably will introduce more defects while their functioning will be adversely affected by transient faults. Therefore, accurately calculating the reliability of future designs will become a very important aspect for (nano-)circuit designers as they investigate several design alternatives to optimize the trade-offs between the conflicting metrics of area-power-energy-delay versus reliability. This paper introduces a novel generic technique for the accurate calculation of the reliability of future nano-circuits. Our aim is to provide both educational and research institutions (as well as the semiconductor industry at a later stage) with an accurate and easy to use tool for closely comparing the reliability of different design alternatives, and for being able to easily select the design that best fits a set of given (design) constraints. Moreover, the reliability model generated by the tool should empower designers with the unique opportunity of understanding the influence individual gates play on the design’s overall reliability, and identifying those (few) gates which impact the design’s reliability most significantly.
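
    As a point of reference for the gate-level reliability analysis described above: the paper's tool is Bayesian, but the simplest baseline it improves upon treats the circuit as a series system of independent gates, in which overall reliability is the product of the gate reliabilities and the least reliable gate dominates. A minimal sketch with hypothetical per-gate values:

```python
def circuit_reliability(gate_reliabilities):
    """First-order series-system estimate: the circuit works only if every
    gate works (independent gate failures, no fault masking)."""
    r = 1.0
    for g in gate_reliabilities:
        r *= g
    return r

def most_critical_gate(gate_reliabilities):
    """Under this naive model, the gate whose improvement raises overall
    reliability the most is simply the least reliable one."""
    return min(range(len(gate_reliabilities)), key=lambda i: gate_reliabilities[i])

gates = [0.999, 0.95, 0.999, 0.99]  # hypothetical per-gate reliabilities
print(circuit_reliability(gates))
print(most_critical_gate(gates))
```

    A Bayesian-network model refines this by capturing fault masking and signal correlations, which the product formula ignores.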

  18. High-throughput Titration of Luciferase-expressing Recombinant Viruses

    PubMed Central

    Garcia, Vanessa; Krishnan, Ramya; Davis, Colin; Batenchuk, Cory; Le Boeuf, Fabrice; Abdelbary, Hesham; Diallo, Jean-Simon

    2014-01-01

    Standard plaque assays to determine infectious viral titers can be time consuming, are not amenable to a high volume of samples, and cannot be done with viruses that do not form plaques. As an alternative to plaque assays, we have developed a high-throughput titration method that allows for the simultaneous titration of a high volume of samples in a single day. This approach involves infection of the samples with a Firefly luciferase tagged virus, transfer of the infected samples onto an appropriate permissive cell line, subsequent addition of luciferin, reading of plates in order to obtain luminescence readings, and finally the conversion from luminescence to viral titers. The assessment of cytotoxicity using a metabolic viability dye can be easily incorporated in the workflow in parallel and provide valuable information in the context of a drug screen. This technique provides a reliable, high-throughput method to determine viral titers as an alternative to a standard plaque assay. PMID:25285536
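
    The final step described above, converting luminescence readings to viral titers, is typically done against a standard curve of samples with known titers. The paper's actual calibration is not given here; the following is a generic log-log interpolation sketch with hypothetical calibration values:

```python
import math

def luminescence_to_titer(lum, std_lum, std_titer):
    """Interpolate an unknown sample's titer from a standard curve of
    luminescence readings at known titers, fit linearly in log-log space."""
    log_l = [math.log10(x) for x in std_lum]
    log_t = [math.log10(x) for x in std_titer]
    n = len(log_l)
    mean_l = sum(log_l) / n
    mean_t = sum(log_t) / n
    slope = sum((a - mean_l) * (b - mean_t) for a, b in zip(log_l, log_t)) / \
            sum((a - mean_l) ** 2 for a in log_l)
    intercept = mean_t - slope * mean_l
    return 10 ** (slope * math.log10(lum) + intercept)

# Hypothetical standard curve: luminescence counts vs. titer (pfu/ml)
std_lum = [1e3, 1e4, 1e5, 1e6]
std_titer = [1e4, 1e5, 1e6, 1e7]
print(f"{luminescence_to_titer(5e4, std_lum, std_titer):.3g}")
```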

  19. Aptamer-Based Analysis: A Promising Alternative for Food Safety Control

    PubMed Central

    Amaya-González, Sonia; de-los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J.; Lobo-Castañón, Maria Jesús

    2013-01-01

    Ensuring food safety is nowadays a top priority of authorities and professional players in the food supply chain. One of the key challenges to determine the safety of food and guarantee a high level of consumer protection is the availability of fast, sensitive and reliable analytical methods to identify specific hazards associated to food before they become a health problem. The limitations of existing methods have encouraged the development of new technologies, among them biosensors. Success in biosensor design depends largely on the development of novel receptors with enhanced affinity to the target, while being stable and economical. Aptamers fulfill these characteristics, and thus have surfaced as promising alternatives to natural receptors. This Review describes analytical strategies developed so far using aptamers for the control of pathogens, allergens, adulterants, toxins and other forbidden contaminants to ensure food safety. The main progresses to date are presented, highlighting potential prospects for the future. PMID:24287543

  20. Comparing methodologies for the allocation of overhead and capital costs to hospital services.

    PubMed

    Tan, Siok Swan; van Ineveld, Bastianus Martinus; Redekop, William Ken; Hakkaart-van Roijen, Leona

    2009-06-01

    Typically, little consideration is given to the allocation of indirect costs (overheads and capital) to hospital services, compared to the allocation of direct costs. Weighted service allocation is believed to provide the most accurate indirect cost estimation, but the method is time consuming. To determine whether hourly rate, inpatient day, and marginal mark-up allocation are reliable alternatives for weighted service allocation. The cost approaches were compared independently for appendectomy, hip replacement, cataract, and stroke in representative general hospitals in The Netherlands for 2005. Hourly rate allocation and inpatient day allocation produce estimates that are not significantly different from weighted service allocation. Hourly rate allocation may be a strong alternative to weighted service allocation for hospital services with a relatively short inpatient stay. The use of inpatient day allocation would likely most closely reflect the indirect cost estimates obtained by the weighted service method.
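
    The two simpler allocation bases found above to approximate weighted-service allocation share the same mechanics: distribute the indirect-cost pool across services in proportion to a volume driver (treatment hours for hourly-rate allocation, inpatient days for inpatient-day allocation). A minimal sketch with hypothetical figures, not the study's Dutch hospital data:

```python
def allocate_indirect(total_indirect, drivers):
    """Allocate an indirect-cost pool across services in proportion to a
    volume driver (e.g. treatment hours or inpatient days)."""
    total_driver = sum(drivers.values())
    return {svc: total_indirect * d / total_driver for svc, d in drivers.items()}

# Hypothetical data: the same overhead pool allocated two different ways
overhead = 1_000_000.0
hours = {"appendectomy": 400, "hip replacement": 900, "cataract": 200, "stroke": 1500}
days = {"appendectomy": 600, "hip replacement": 1400, "cataract": 100, "stroke": 2900}
by_hours = allocate_indirect(overhead, hours)
by_days = allocate_indirect(overhead, days)
```

    Comparing `by_hours` and `by_days` per service shows how the choice of driver shifts indirect cost between short-stay and long-stay services, which is the trade-off the study quantifies.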

  1. Circuit-Detour Design and Implementation - Enhancing the Southern California's Seismic Network Reliability through Redundant Network Paths

    NASA Astrophysics Data System (ADS)

    Watkins, M.; Busby, R.; Rico, H.; Johnson, M.; Hauksson, E.

    2003-12-01

    We provide enhanced network robustness by apportioning redundant data communications paths for seismic stations in the field. By providing more than one telemetry route, either physical or logical, network operators can improve the availability of seismic data during occasional network outages, and also during the loss of key gateway interfaces such as a router or central processor. This is especially important for seismic stations in sparsely populated regions, where the loss of a single site may result in a significant gap in the network's monitoring capability. A number of challenges arise in the application of a circuit-detour mechanism. One requirement is that it fit well within the existing framework of our real-time system processing. It is also necessary to craft a system that is not needlessly complex to maintain or implement, particularly during a crisis. The method that we use for circuit-detours does not require the reconfiguration of dataloggers or communications equipment in the field. Remote network configurations remain static; changes are only required at the central site. We have implemented standardized procedures to detour circuits on similar transport media, such as virtual circuits on the same leased line, as well as on physically different communications pathways, such as a microwave link backed up by a leased line. The lessons learned from these reliability improvements and optimization efforts could be applied to other real-time seismic networks. A fundamental tenet of most seismic networks is that they are reliable and have a high percentage of real-time data availability. A reasonable way to meet these expectations is to provide alternate means of delivering data to the central processing sites, with a simple method for utilizing these alternate paths.

  2. Land use mapping from CBERS-2 images with open source tools by applying different classification algorithms

    NASA Astrophysics Data System (ADS)

    Sanhouse-García, Antonio J.; Rangel-Peraza, Jesús Gabriel; Bustos-Terrones, Yaneth; García-Ferrer, Alfonso; Mesas-Carrascosa, Francisco J.

    2016-02-01

    Land cover classification is often based on large differences between classes but great homogeneity within each class. This cover information is obtained through field work or by means of processing satellite images. Field work involves high costs; therefore, digital image processing techniques have become an important alternative for this task. However, in some developing countries, and particularly in Casacoima municipality in Venezuela, geographic information systems are lacking due to outdated information and the high cost of software licenses. This research proposes a low-cost methodology to develop thematic mapping of local land use and cover types in areas with scarce resources. Thematic mapping was developed from CBERS-2 images and spatial information available on the network, using open source tools. Supervised classification was applied both per pixel and per region, using different classification algorithms and comparing them among themselves. Per-pixel classification was based on the Maxver (maximum likelihood) and Euclidean distance (minimum distance) algorithms, while per-region classification was based on the Bhattacharya algorithm. Satisfactory results were obtained from the per-region classification, with an overall reliability of 83.93% and a kappa index of 0.81. The Maxver algorithm showed a reliability of 73.36% and a kappa index of 0.69, while Euclidean distance obtained 67.17% and 0.61, respectively. The proposed methodology proved very useful for cartographic processing and updating, which in turn supports the development of management and land-use plans. Open source tools thus proved to be an economically viable alternative not only for forestry organizations but also for the general public, enabling projects in economically depressed and/or environmentally threatened areas.
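
    The accuracy figures reported above (overall reliability and kappa index) are computed from a confusion (error) matrix comparing mapped classes against reference data. A minimal sketch with a hypothetical 3-class matrix, not the study's data:

```python
def accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a square confusion matrix
    (rows = reference classes, columns = mapped classes)."""
    total = sum(sum(row) for row in confusion)
    observed = sum(confusion[i][i] for i in range(len(confusion))) / total
    # Chance agreement from the row and column marginals
    expected = sum(
        (sum(confusion[i]) / total) * (sum(row[i] for row in confusion) / total)
        for i in range(len(confusion))
    )
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa

# Hypothetical 3-class error matrix for a land-cover map
m = [[50, 3, 2],
     [4, 45, 6],
     [1, 5, 40]]
acc, kappa = accuracy_and_kappa(m)
print(f"overall accuracy = {acc:.2%}, kappa = {kappa:.2f}")
```

    Kappa discounts the agreement expected by chance, which is why it runs lower than overall accuracy, as in the figures quoted above.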

  3. Evaluation of the fast orthogonal search method for forecasting chloride levels in the Deltona groundwater supply (Florida, USA)

    NASA Astrophysics Data System (ADS)

    El-Jaat, Majda; Hulley, Michael; Tétreault, Michel

    2018-02-01

    Despite the broad impact and importance of saltwater intrusion in coastal aquifers, little research has been directed towards forecasting saltwater intrusion in areas where the source of saltwater is uncertain. Saline contamination in inland groundwater supplies is a concern for numerous communities in the southern US including the city of Deltona, Florida. Furthermore, conventional numerical tools for forecasting saltwater contamination are heavily dependent on reliable characterization of the physical characteristics of underlying aquifers, information that is often absent or challenging to obtain. To overcome these limitations, a reliable alternative data-driven model for forecasting salinity in a groundwater supply was developed for Deltona using the fast orthogonal search (FOS) method. FOS was applied on monthly water-demand data and corresponding chloride concentrations at water supply wells. Groundwater salinity measurements from Deltona water supply wells were applied to evaluate the forecasting capability and accuracy of the FOS model. Accurate and reliable groundwater salinity forecasting is necessary to support effective and sustainable coastal-water resource planning and management. The 27 available water supply wells for Deltona were randomly split into three test groups for the purposes of FOS model development and performance assessment. Based on four performance indices (RMSE, RSR, NSEC, and R), the FOS model proved to be a reliable and robust forecaster of groundwater salinity. FOS is relatively inexpensive to apply, is not based on rigorous physical characterization of the water supply aquifer, and yields reliable estimates of groundwater salinity in active water supply wells.
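
    The four performance indices named above (RMSE, RSR, NSEC, and R) can be computed directly from paired observed and forecast series. A sketch with hypothetical chloride values, not the Deltona records:

```python
import math

def performance_indices(observed, simulated):
    """RMSE, RSR (RMSE / std of observations), Nash-Sutcliffe efficiency,
    and Pearson correlation for paired observed/forecast series."""
    n = len(observed)
    mean_o = sum(observed) / n
    mean_s = sum(simulated) / n
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    sso = sum((o - mean_o) ** 2 for o in observed)
    sss = sum((s - mean_s) ** 2 for s in simulated)
    rmse = math.sqrt(sse / n)
    rsr = rmse / math.sqrt(sso / n)
    nse = 1 - sse / sso
    cov = sum((o - mean_o) * (s - mean_s) for o, s in zip(observed, simulated))
    r = cov / (math.sqrt(sso) * math.sqrt(sss))
    return rmse, rsr, nse, r

# Hypothetical monthly chloride observations (mg/L) vs. model forecasts
obs = [120.0, 131.0, 125.0, 140.0, 150.0, 149.0]
sim = [118.0, 133.0, 127.0, 138.0, 147.0, 151.0]
rmse, rsr, nse, r = performance_indices(obs, sim)
```

    Note that RSR and NSE are redundant by construction (NSE = 1 − RSR²), so they always agree on model ranking; R adds information about linear association only.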

  4. Coefficient Alpha: A Reliability Coefficient for the 21st Century?

    ERIC Educational Resources Information Center

    Yang, Yanyun; Green, Samuel B.

    2011-01-01

    Coefficient alpha is almost universally applied to assess reliability of scales in psychology. We argue that researchers should consider alternatives to coefficient alpha. Our preference is for structural equation modeling (SEM) estimates of reliability because they are informative and allow for an empirical evaluation of the assumptions…

  5. UP TO 100,000 RELIABLE STRONG GRAVITATIONAL LENSES IN FUTURE DARK ENERGY EXPERIMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Serjeant, S.

    2014-09-20

    The Euclid space telescope will observe ∼10^5 strong galaxy-galaxy gravitational lens events in its wide field imaging survey over around half the sky, but identifying the gravitational lenses from their observed morphologies requires solving the difficult problem of reliably separating the lensed sources from contaminant populations, such as tidal tails, as well as presenting challenges for spectroscopic follow-up redshift campaigns. Here I present alternative selection techniques for strong gravitational lenses in both Euclid and the Square Kilometre Array, exploiting the strong magnification bias present in the steep end of the Hα luminosity function and the H I mass function. Around 10^3 strong lensing events are detectable with this method in the Euclid wide survey. While only ∼1% of the total haul of Euclid lenses, this sample has ∼100% reliability, known source redshifts, high signal-to-noise, and a magnification-based selection independent of assumptions of lens morphology. With the proposed Square Kilometre Array dark energy survey, the numbers of reliable strong gravitational lenses with source redshifts can reach 10^5.

  6. Validity and Reliability of Wii Fit Balance Board for the Assessment of Balance of Healthy Young Adults and the Elderly

    PubMed Central

    Chang, Wen-Dien; Chang, Wan-Yi; Lee, Chia-Lun; Feng, Chi-Yen

    2013-01-01

    [Purpose] Balance is an integral part of human ability. The smart balance master system (SBM) is a balance test instrument with good reliability and validity, but it is expensive. Therefore, we modified a Wii Fit balance board, which is a convenient balance assessment tool, and analyzed its reliability and validity. [Subjects and Methods] We recruited 20 healthy young adults and 20 elderly people, and administered 3 balance tests. The correlation coefficient and intraclass correlation of both instruments were analyzed. [Results] There were no statistically significant differences in the 3 tests between the Wii Fit balance board and the SBM. The Wii Fit balance board had a good intraclass correlation (0.86–0.99) for the elderly people and positive correlations (r = 0.58–0.86) with the SBM. [Conclusions] The Wii Fit balance board is a balance assessment tool with good reliability and high validity for elderly people, and we recommend it as an alternative tool for assessing balance ability. PMID:24259769
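
    Test-retest agreement of the kind reported above (intraclass correlation 0.86–0.99) is computed from repeated measurements per subject. A sketch of the one-way random-effects ICC(1,1), one common variant, with hypothetical balance scores (not the study's data):

```python
def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for test-retest agreement.
    ratings: list of per-subject lists, each with k repeated measurements."""
    n = len(ratings)
    k = len(ratings[0])
    grand = sum(sum(r) for r in ratings) / (n * k)
    subject_means = [sum(r) / k for r in ratings]
    # Between-subject and within-subject mean squares from one-way ANOVA
    ms_between = k * sum((m - grand) ** 2 for m in subject_means) / (n - 1)
    ms_within = sum((x - m) ** 2
                    for r, m in zip(ratings, subject_means) for x in r) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical repeated balance scores (two trials per subject)
scores = [[2.1, 2.2], [3.5, 3.4], [1.8, 1.9], [4.0, 4.2], [2.9, 2.8]]
print(round(icc_oneway(scores), 2))
```

    Values approach 1.0 when between-subject variation dominates trial-to-trial noise, which is the pattern the Wii Fit board showed for the elderly group.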

  7. The reliability of Fishman method of skeletal maturation for age estimation in children of South Indian population.

    PubMed

    Mohammed, Rezwana Begum; Kalyan, V Siva; Tircouveluri, Saritha; Vegesna, Goutham Chakravarthy; Chirla, Anil; Varma, D Maruthi

    2014-07-01

    Determining the age of a person in the absence of documentary evidence of birth is essential for legal and medico-legal purposes. The Fishman method of skeletal maturation is widely used for this purpose; however, the reliability of this method across geographic populations is not well-established. In this study, we assessed various stages of carpal and metacarpal bone maturation and tested the reliability of the Fishman method of skeletal maturation for age estimation in a South Indian population. We also evaluated the correlation between the chronological age (CA) and the predicted age based on the Fishman method. Digital right hand-wrist radiographs of 330 individuals aged 9-20 years were obtained and the skeletal maturity stage for each subject was determined using the Fishman method. The skeletal maturation indicator scores were obtained and analyzed with reference to CA and sex. Data were analyzed using the SPSS software package (version 12, SPSS Inc., Chicago, IL, USA). The study subjects had a tendency toward late maturation, with the mean skeletal age (SA) estimated being significantly lower (P < 0.05) than the mean CA at various skeletal maturity stages. Nevertheless, significant correlation was observed in this study between SA and CA for males (r = 0.82) and females (r = 0.85). Interestingly, female subjects were observed to be advanced in SA compared with males. The Fishman method of skeletal maturation can be used as an alternative tool for the assessment of the mean age of an individual of unknown CA in South Indian children.

  8. Development, Reliability, and Equivalence of an Alternate Form for the CQ Duty Performance-Based Measure

    DTIC Science & Technology

    2017-10-01

    The CQDT can be reliably administered and distinguishes between known groups of healthy control soldiers and those with traumatic brain injury. As such, the CQDT shows promise in helping to inform... healthy controls and SM with mild TBI. If we succeed in developing an equivalent alternate form, the CQDT may be used to both identify executive

  9. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Reading Assessments: Grade 2. Technical Report #1217

    ERIC Educational Resources Information Center

    Anderson, Daniel; Lai, Cheg-Fei; Park, Bitnara Jasmine; Alonzo, Julie; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest. Due to…

  10. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Reading Assessments: Grade 5. Technical Report #1220

    ERIC Educational Resources Information Center

    Lai, Cheng-Fei; Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…

  11. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Passage Reading Fluency Assessments: Grade 4. Technical Report #1219

    ERIC Educational Resources Information Center

    Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Lai, Cheng-Fei; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…

  12. Earth Observing System (EOS) Advanced Microwave Sounding Unit-A (AMSU-A) Spares Program Plan

    NASA Technical Reports Server (NTRS)

    Chapman, Weldon

    1994-01-01

    This plan specifies the spare components to be provided for the EOS/AMSU-A instrument and the general spares philosophy for their procurement. It also addresses key components not recommended for spares, as well as the schedule and method for obtaining the spares. The selected spares list was generated based on component criticality, reliability, repairability, and availability. An alternative spares list is also proposed based on more stringent fiscal constraints.

  13. Active Control of Forebody Vortices on a schematic Aircraft Model

    DTIC Science & Technology

    2001-06-01

    A blowing coefficient (Cμ = 0.0013) was sufficient to reliably switch... The system comprised two miniature solenoid on/off pneumatic valves to control the flow to the nozzles, and the tubes that delivered the air to the... method and time-averaged rolling moment, pitching moment, and normal force. Nomenclature: T, duration a valve is open during the alternating blowing; b, wing... a reasonably high reduced frequency of the valves (0.16). Having established that the forebody vortices...

  14. IDENTIFYING COMPLEMENTARY AND ALTERNATIVE MEDICINE USAGE INFORMATION FROM INTERNET RESOURCES: A SYSTEMATIC REVIEW

    PubMed Central

    Sharma, V.; Holmes, J.H.; Sarkar, I.N.

    2016-01-01

    SUMMARY Objective Identify and highlight research issues and methods used in studying Complementary and Alternative Medicine (CAM) information needs, access, and exchange over the Internet. Methods A literature search was conducted using Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines in PubMed to identify articles that have studied Internet use in the CAM context. Additional searches were conducted at Nature.com and Google Scholar. Results The Internet provides a major medium for obtaining CAM information and can also serve as an avenue for conducting CAM-related surveys. Based on the literature analyzed in this review, there is significant interest in developing methodologies for identifying CAM treatments, including the analysis of search query data and social media platform discussions. Several studies have also underscored the challenges in developing approaches for assessing the reliability of CAM-related information on the Internet, which may not be supported by reliable sources. The overall findings of this review suggest that there are opportunities for developing approaches to make accurate information available and to restrict the spread and sale of potentially harmful CAM products and information. Conclusions Advances in Internet research have yet to be used in the context of understanding CAM prevalence and perspectives. Such approaches may provide valuable insights into current trends and needs in the context of CAM use and spread. PMID:27352304

  15. Processes and Procedures for Estimating Score Reliability and Precision

    ERIC Educational Resources Information Center

    Bardhoshi, Gerta; Erford, Bradley T.

    2017-01-01

    Precision is a key facet of test development, with score reliability determined primarily according to the types of error one wants to approximate and demonstrate. This article identifies and discusses several primary forms of reliability estimation: internal consistency (i.e., split-half, KR-20, a), test-retest, alternate forms, interscorer, and…

  16. Validity and Reliability of the 8-Item Work Limitations Questionnaire.

    PubMed

    Walker, Timothy J; Tullar, Jessica M; Diamond, Pamela M; Kohl, Harold W; Amick, Benjamin C

    2017-12-01

    Purpose To evaluate the factorial validity, scale reliability, test-retest reliability, convergent validity, and discriminant validity of the 8-item Work Limitations Questionnaire (WLQ) among employees from a public university system. Methods A secondary analysis using de-identified data from employees who completed an annual Health Assessment between 2009 and 2015 tested the research aims. Confirmatory factor analysis (CFA) (n = 10,165) tested the latent structure of the 8-item WLQ. Scale reliability was determined using a CFA-based approach, while test-retest reliability was determined using the intraclass correlation coefficient. Convergent/discriminant validity was tested by evaluating relations of the 8-item WLQ with health/performance variables for convergent validity (health-related work performance, number of chronic conditions, and general health) and with demographic variables for discriminant validity (gender and institution type). Results A 1-factor model with three correlated residuals demonstrated excellent model fit (CFI = 0.99, TLI = 0.99, RMSEA = 0.03, and SRMR = 0.01). Scale reliability was acceptable (0.69, 95% CI 0.68-0.70) and test-retest reliability was very good (ICC = 0.78). Low-to-moderate associations were observed between the 8-item WLQ and the health/performance variables, while weak associations were observed with the demographic variables. Conclusions The 8-item WLQ demonstrated sufficient reliability and validity among employees from a public university system. Results suggest the 8-item WLQ is a usable alternative for studies when the more comprehensive 25-item WLQ is not available.

  17. Herbal hepatotoxicity and WHO global introspection method.

    PubMed

    Teschke, Rolf; Eickhoff, Axel; Wolff, Albrecht; Frenzel, Christian; Schulze, Johannes

    2013-01-01

    Herbal hepatotoxicity is a rare but highly disputed disease because numerous confounding variables may complicate accurate causality assessment. Case evaluation is even more difficult when the WHO global introspection method (WHO method) is applied as the diagnostic algorithm. This method lacks liver specificity, hepatotoxicity validation, and quantitative items, basic qualifications required for a sound evaluation of hepatotoxicity cases. Consequently, there are no data available for reliability, sensitivity, specificity, or positive and negative predictive value. Its scope is also limited by the fact that it cannot discriminate between a positive and a negative causality attribution, thereby encouraging overdiagnosis and overreporting of cases. The WHO method ignores uncertainties regarding daily dose, temporal association, start, duration, and end of herbal use, time to onset of the adverse reaction, and course of liver values after herb discontinuation. Insufficiently considered or ignored are comedications, preexisting liver diseases, alternative explanations upon clinical assessment, and exclusion of infections by hepatitis A-C, cytomegalovirus (CMV), Epstein-Barr virus (EBV), herpes simplex virus (HSV), and varicella zoster virus (VZV). We clearly prefer as an alternative the scale of CIOMS (Council for International Organizations of Medical Sciences), which is structured, quantitative, liver specific, and validated for hepatotoxicity. In conclusion, causality of herbal hepatotoxicity is best assessed by the liver specific CIOMS scale validated for hepatotoxicity rather than the obsolete WHO method, which is liver unspecific and not validated for hepatotoxicity. CIOMS based assessments will ensure the correct diagnosis and exclude alternative diagnoses that may require other specific therapies.

  18. A Confirmatory Factor Analysis of the Structure of Statistics Anxiety Measure: An examination of four alternative models

    PubMed Central

    Vahedi, Shahram; Farrokhi, Farahman

    2011-01-01

    Objective The aim of this study is to explore the confirmatory factor analysis results of the Persian adaptation of Statistics Anxiety Measure (SAM), proposed by Earp. Method The validity and reliability assessments of the scale were performed on 298 college students chosen randomly from Tabriz University in Iran. Confirmatory factor analysis (CFA) was carried out to determine the factor structures of the Persian adaptation of SAM. Results As expected, the second order model provided a better fit to the data than the three alternative models. Conclusions Hence, SAM provides an equally valid measure for use among college students. The study both expands and adds support to the existing body of math anxiety literature. PMID:22952530

  19. Alternative accounting in maternal and infant global health.

    PubMed

    Adams, Vincanne; Craig, Sienna R; Samen, Arlene

    2015-03-18

    Efforts to augment accountability through the use of metrics, especially randomised controlled trials and other statistical methods, place an increased burden on small nongovernmental organisations (NGOs) doing global health. In this paper, we explore how one small NGO works to generate forms of accountability and evidence that may not conform to new metrics trends but nevertheless deserve attention and scrutiny for being effective, practical and reliable in the area of maternal and infant health. Through an analysis of one NGO and, in particular, its organisational and ethical principles for creating a network of safety for maternal and child health, we argue that alternative forms of (ac)counting like these might provide useful evidence of another kind of successful global health work.

  20. Scaling Impacts in Life Support Architecture and Technology Selection

    NASA Technical Reports Server (NTRS)

    Lange, Kevin

    2016-01-01

    For long-duration space missions outside of Earth orbit, reliability considerations will drive higher levels of redundancy and/or on-board spares for life support equipment. Component scaling will be a critical element in minimizing overall launch mass while maintaining an acceptable level of system reliability. Building on an earlier reliability study (AIAA 2012-3491), this paper considers the impact of alternative scaling approaches, including the design of technology assemblies and their individual components to maximum, nominal, survival, or other fractional requirements. The optimal level of life support system closure is evaluated for deep-space missions of varying duration using equivalent system mass (ESM) as the comparative basis. Reliability impacts are included in ESM by estimating the number of component spares required to meet a target system reliability. Common cause failures are included in the analysis. ISS and ISS-derived life support technologies are considered along with selected alternatives. This study focuses on minimizing launch mass, which may be enabling for deep-space missions.
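The spares-for-reliability trade described above can be illustrated with a minimal sketch. Assuming component failures arrive as a Poisson process (an illustrative assumption, not the paper's model), the smallest spare count meeting a target component reliability is the smallest k with Poisson CDF at k above the target:

```python
import math

def spares_needed(failure_rate, mission_hours, target):
    """Smallest spare count k such that P(Poisson(lambda*t) <= k) >= target.

    failure_rate: failures per hour; target: required reliability, < 1.
    """
    lam = failure_rate * mission_hours          # expected failures over the mission
    k = 0
    term = math.exp(-lam)                       # P(exactly 0 failures)
    cdf = term
    while cdf < target:                         # add Poisson terms until target met
        k += 1
        term *= lam / k                         # P(exactly k) from P(exactly k-1)
        cdf += term
    return k

# Hypothetical component: 1e-4 failures/hr over a 20,000-hour mission.
print(spares_needed(1e-4, 20000, 0.99))
```

Each spare added buys a shrinking reliability increment, which is why ESM comparisons of this kind penalize long missions with unreliable components so heavily.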

  1. 5 CFR 841.410 - Contents of petition for appeal.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... sufficient and reliable for factors 10 through 13 unless the appealing agency is able to demonstrate, through sufficient and reliable data relating to its employees or former employees, the use of alternative factors is...

  2. An Examination of Test-Retest, Alternate Form Reliability, and Generalizability Theory Study of the easyCBM Word and Passage Reading Fluency Assessments: Grade 3. Technical Report #1218

    ERIC Educational Resources Information Center

    Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Lai, Cheng-Fei; Tindal, Gerald

    2012-01-01

    This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…

  3. The development and validation of a two-tiered multiple-choice instrument to identify alternative conceptions in earth science

    NASA Astrophysics Data System (ADS)

    Mangione, Katherine Anna

    The purpose of this study was to determine the reliability and validity of a two-tiered, multiple-choice instrument designed to identify alternative conceptions in earth science. Additionally, this study sought to identify alternative conceptions in earth science held by preservice teachers, to investigate relationships between self-reported confidence scores and understanding of earth science concepts, and to describe relationships between content knowledge and alternative conceptions and planning instruction in the science classroom. Eighty-seven preservice teachers enrolled in the MAT program participated in this study. Sixty-eight participants were female, twelve were male, and seven chose not to answer. Forty-seven participants were in the elementary certification program, five were in the middle school certification program, and twenty-nine were pursuing secondary certification. Results indicate that the two-tiered, multiple-choice format can be a reliable and valid method for identifying alternative conceptions. Preservice teachers in all certification areas who participated in this study may possess common alternative conceptions previously identified in the literature. Alternative conceptions included: all rivers flow north to south, the shadow of the Earth covers the Moon causing lunar phases, the Sun is always directly overhead at noon, weather can be predicted by animal coverings, and seasons are caused by the Earth's proximity to the Sun. Statistical analyses indicated differences, though not all of them significant, among all subgroups according to gender and certification area. Generally, males outperformed females, and preservice teachers pursuing middle school certification had the highest scores on the questionnaire, followed by those obtaining secondary certification. Elementary preservice teachers scored the lowest. Additionally, self-reported scores of confidence in one's answers and understanding of the earth science concept in question were analyzed. 
There was a slight positive correlation between overall score and both confidence and understanding. Responses on the questionnaire were investigated with respect to pedagogical choices. Evidence suggests that content knowledge and having alternative conceptions or science fragments may impact a teacher's pedagogical choices. Through careful development of instruments like the ACES-Q II-R and other two-tiered, multiple-choice instruments, educators and researchers can not only identify possible alternative conceptions, but also raise awareness of alternative conceptions held by children and adults.

  4. 33 CFR 154.2180 - Alternative testing program-Generally.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Control Systems Alternative Analyzer and Pressure Sensor Reliability Testing § 154.2180 Alternative... and pressure sensor safety testing requirements provided by 33 CFR 154.2150(c) and 33 CFR 154.2250(c... 33 CFR 154.2181. (d) All pressure sensors/switches used in a VCS must be tested for safety system...

  5. Reliability of Pressure Ulcer Rates: How Precisely Can We Differentiate Among Hospital Units, and Does the Standard Signal-Noise Reliability Measure Reflect This Precision?

    PubMed

    Staggs, Vincent S; Cramer, Emily

    2016-08-01

    Hospital performance reports often include rankings of unit pressure ulcer rates. Differentiating among units on the basis of quality requires reliable measurement. Our objectives were to describe and apply methods for assessing reliability of hospital-acquired pressure ulcer rates and evaluate a standard signal-noise reliability measure as an indicator of precision of differentiation among units. Quarterly pressure ulcer data from 8,199 critical care, step-down, medical, surgical, and medical-surgical nursing units from 1,299 US hospitals were analyzed. Using beta-binomial models, we estimated between-unit variability (signal) and within-unit variability (noise) in annual unit pressure ulcer rates. Signal-noise reliability was computed as the ratio of between-unit variability to the total of between- and within-unit variability. To assess precision of differentiation among units based on ranked pressure ulcer rates, we simulated data to estimate the probabilities of a unit's observed pressure ulcer rate rank in a given sample falling within five and ten percentiles of its true rank, and the probabilities of units with ulcer rates in the highest quartile and highest decile being identified as such. We assessed the signal-noise measure as an indicator of differentiation precision by computing its correlations with these probabilities. Pressure ulcer rates based on a single year of quarterly or weekly prevalence surveys were too susceptible to noise to allow for precise differentiation among units, and signal-noise reliability was a poor indicator of precision of differentiation. To ensure precise differentiation on the basis of true differences, alternative methods of assessing reliability should be applied to measures purported to differentiate among providers or units based on quality. © 2016 The Authors. Research in Nursing & Health published by Wiley Periodicals, Inc.
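The signal-noise ratio the authors describe (between-unit variability over total variability) can be sketched with a simple method-of-moments estimate. The paper fits beta-binomial models, so this is only an illustration of the ratio itself, with made-up quarterly rates:

```python
def signal_noise_reliability(unit_rates):
    """unit_rates: dict of unit -> list of quarterly pressure ulcer rates.

    Method-of-moments stand-in for the paper's beta-binomial estimates.
    """
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    means = {u: sum(r) / len(r) for u, r in unit_rates.items()}
    # Noise: average sampling variance of each unit's annual mean rate.
    within = sum(var(r) / len(r) for r in unit_rates.values()) / len(unit_rates)
    # Signal: spread of unit means, with the sampling noise subtracted out.
    grand = sum(means.values()) / len(means)
    between_obs = sum((m - grand) ** 2 for m in means.values()) / (len(means) - 1)
    between = max(between_obs - within, 0.0)
    total = between + within
    return between / total if total > 0 else 0.0

# Two hypothetical units, four quarters each.
rates = {"A": [3.1, 2.9, 3.0, 3.2], "B": [5.8, 6.1, 6.0, 5.9]}
rel = signal_noise_reliability(rates)
```

With stable, well-separated unit means the ratio approaches 1; the paper's point is that real single-year rates are noisy enough that a high ratio can still coexist with imprecise rankings.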

  6. Adaptation strategies for water supply management in a drought prone Mediterranean river basin: Application of outranking method.

    PubMed

    Kumar, Vikas; Del Vasto-Terrientes, Luis; Valls, Aida; Schuhmacher, Marta

    2016-01-01

    Regional water allocation planning is a complex decision problem where a holistic approach to water supply management considering different criteria is valuable. However, multi-criteria decision making with diverse indicators measured on different scales and uncertainty levels is difficult to solve. The objective of this paper is to develop scenarios for future imbalances in water supply and demand for a water-stressed Mediterranean area of Northern Spain (Tarragona) and to test the applicability and suitability of the outranking method ELECTRE-III-H for evaluating sectoral water allocation policies. This study is focused on the use of alternative water supply scenarios to fulfil the water demand from three major sectors: domestic, industrial and agricultural. A detailed scenario plan for regional water demand and supply is discussed. For each future climate change scenario, the goal is to obtain a ranking of a set of possible actions with regard to different types of indicators (costs, water stress and environmental impact). The analytical method used is based on outranking models for decision aid with hierarchical structures of criteria, ranking alternatives using partial preorders based on pairwise preference relations. We compare several adaptation measures, including alternative water sources (reclaimed water and desalination), inter-basin water transfer, and sectoral demand management in industry, agriculture and domestic use, and test the sustainability of management actions for different climate change scenarios. Results show that the use of alternative water resources is the most reliable option, with medium reclaimed-water reuse in industry and agriculture and low-to-medium use of desalinated water in the domestic and industrial sectors emerging as the best alternative. 
The proposed method has several advantages such as the management of heterogeneous scales of measurement without requiring any artificial transformation and the management of uncertainty by means of comparisons at a qualitative level in terms of the decision maker preferences. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Priority Determination of Underwater Tourism Site Development in Gorontalo Province using Analytical Hierarchy Process (AHP)

    NASA Astrophysics Data System (ADS)

    Rohandi, M.; Tuloli, M. Y.; Jassin, R. T.

    2018-02-01

    This research aims to determine development priorities for underwater tourism in Gorontalo province using the Analytical Hierarchy Process (AHP), a decision support system (DSS) method for Multi-Attribute Decision Making (MADM). The method used 5 criteria and 28 alternatives to rank underwater tourism sites for development in Gorontalo province. The AHP calculation showed that the top development priority is Pulau Cinta, with a total AHP score of 0.489 (48.9%). This DSS produced reliable results quickly and at low cost, helping decision makers identify the best underwater tourism site to develop.
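AHP derives criterion weights from pairwise-comparison judgments; the priority vector can be approximated with the row geometric-mean method. The 3×3 matrix below is hypothetical and much smaller than the study's 5-criterion, 28-alternative model:

```python
import math

def ahp_priorities(pairwise):
    """Approximate AHP priority vector via the row geometric-mean method.

    pairwise[i][j]: how strongly criterion i is preferred over j (Saaty 1-9 scale),
    with pairwise[j][i] = 1 / pairwise[i][j] and 1s on the diagonal.
    """
    n = len(pairwise)
    gm = [math.prod(row) ** (1 / n) for row in pairwise]  # geometric mean per row
    total = sum(gm)
    return [g / total for g in gm]                        # normalize to sum to 1

# Hypothetical 3-criterion judgment matrix (not the study's actual data).
matrix = [
    [1,     3,   5],
    [1 / 3, 1,   2],
    [1 / 5, 1 / 2, 1],
]
weights = ahp_priorities(matrix)
```

The exact AHP procedure uses the principal eigenvector and a consistency ratio check; the geometric-mean method is a standard close approximation for small matrices.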

  8. Assessment of skeletal maturation based on cervical vertebrae in CBCT.

    PubMed

    Shim, Jocelyne J; Heo, Giseon; Lagravère, Manuel O

    2012-12-01

    Diagnosis of skeletal age in adolescents helps orthodontists select and time treatments. Currently this is done using lateral cephalometric radiographs. This study evaluates the application of the conventional method to cone-beam computed tomographic (CBCT) images to enable assessment of skeletal maturation in three dimensions. Ninety-eight lateral cephalometric radiographs and CBCT scans were collected from orthodontic patients between 11 and 17 years of age over an 18-month period. CBCT scans were examined in seven sagittal slices based on cervical vertebral maturation staging (CVMS). Collected CVMS values were compared with those from the corresponding lateral cephalometric radiographs. CVMS measured from CBCT and lateral cephalometric radiographs were the same on average. However, they were not consistent with each other, scoring an interclass correlation coefficient of 0.155 in the validity test. Interoperator reliability was weak (0.581). Adaptation of cervical vertebrae maturation staging to CBCT requires further clarification or modification to become consistent with lateral cephalometric examinations and to become a reliable method. Alternatively, a completely new method may be developed consisting of maturational indicators or landmarks unique to CBCT imaging. Copyright © 2012. Published by Elsevier Masson SAS.

  9. Detection Copy Number Variants from NGS with Sparse and Smooth Constraints.

    PubMed

    Zhang, Yue; Cheung, Yiu-Ming; Xu, Bo; Su, Weifeng

    2017-01-01

    It is known that copy number variations (CNVs) are associated with complex diseases and particular tumor types, thus reliable identification of CNVs is of great potential value. Recent advances in next generation sequencing (NGS) data analysis have helped manifest the richness of CNV information. However, the performances of these methods are not consistent. Reliably finding CNVs in NGS data in an efficient way remains a challenging topic, worthy of further investigation. Accordingly, we tackle the problem by formulating CNVs identification into a quadratic optimization problem involving two constraints. By imposing the constraints of sparsity and smoothness, the reconstructed read depth signal from NGS is anticipated to fit the CNVs patterns more accurately. An efficient numerical solution tailored from alternating direction minimization (ADM) framework is elaborated. We demonstrate the advantages of the proposed method, namely ADM-CNV, by comparing it with six popular CNV detection methods using synthetic, simulated, and empirical sequencing data. It is shown that the proposed approach can successfully reconstruct CNV patterns from raw data, and achieve superior or comparable performance in detection of the CNVs compared to the existing counterparts.
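ADM-style solvers for such constrained problems alternate cheap per-term updates; the sparsity (ℓ1) subproblem reduces to plain soft-thresholding of the read-depth deviations. A minimal sketch of that one step (not the authors' ADM-CNV implementation, whose smoothness term and update schedule are more involved):

```python
import math

def soft_threshold(x, lam):
    """Proximal operator of lam * ||x||_1: shrink each value toward zero by lam.

    Values with magnitude below lam become exactly 0, which is what makes the
    reconstructed CNV signal sparse (mostly copy-neutral).
    """
    return [math.copysign(max(abs(v) - lam, 0.0), v) for v in x]

# Hypothetical read-depth deviations (log-ratio vs. expected depth) per bin.
deviations = [3.0, -0.5, 1.2]
shrunk = soft_threshold(deviations, 1.0)
```

Small fluctuations (like the -0.5 bin) are zeroed out as noise, while large deviations survive (shrunken), which is the intended behavior of a sparsity constraint on CNV calls.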

  10. The reliability of Fishman method of skeletal maturation for age estimation in children of South Indian population

    PubMed Central

    Mohammed, Rezwana Begum; Kalyan, V. Siva; Tircouveluri, Saritha; Vegesna, Goutham Chakravarthy; Chirla, Anil; Varma, D. Maruthi

    2014-01-01

    Introduction: Determining the age of a person in the absence of documentary evidence of birth is essential for legal and medico-legal purposes. The Fishman method of skeletal maturation is widely used for this purpose; however, the reliability of this method across geographic populations is not well-established. Aims and Objectives: In this study, we assessed various stages of carpal and metacarpal bone maturation and tested the reliability of the Fishman method of skeletal maturation for estimating age in a South Indian population. We also evaluated the correlation between chronological age (CA) and the age predicted by the Fishman method of skeletal maturation. Materials and Methods: Digital right hand-wrist radiographs of 330 individuals aged 9-20 years were obtained and the skeletal maturity stage for each subject was determined using the Fishman method. The skeletal maturation indicator scores were obtained and analyzed with reference to CA and sex. Data were analyzed using the SPSS software package (version 12, SPSS Inc., Chicago, IL, USA). Results: The study subjects had a tendency toward late maturation, with the estimated mean skeletal age (SA) being significantly lower (P < 0.05) than the mean CA at various skeletal maturity stages. Nevertheless, a significant correlation was observed between SA and CA for males (r = 0.82) and females (r = 0.85). Interestingly, female subjects were observed to be advanced in SA compared with males. Conclusion: The Fishman method of skeletal maturation can be used as an alternative tool for the assessment of the mean age of an individual of unknown CA in South Indian children. PMID:25097402

  11. Reliability and validity of the Tilburg Frailty Indicator (TFI) among Chinese community-dwelling older people.

    PubMed

    Dong, Lijuan; Liu, Na; Tian, Xiaoyu; Qiao, Xiaoxia; Gobbens, Robbert J J; Kane, Robert L; Wang, Cuili

    2017-11-01

    To translate the Tilburg Frailty Indicator (TFI) into Chinese and assess its reliability and validity. A sample of 917 community-dwelling older people, aged ≥60 years, in a Chinese city was included between August 2015 and March 2016. Construct validity was assessed using alternative measures corresponding to the TFI items, including self-rated health status (SRH), unintentional weight loss, walking speed, timed-up-and-go tests (TUGT), making telephone calls, grip strength, exhaustion, Short Portable Mental Status Questionnaire (SPMSQ), Geriatric Depression Scale (GDS-15), emotional role, Adaptability Partnership Growth Affection and Resolve scale (APGAR) and Social Support Rating Scale (SSRS). Fried's phenotype and the frailty index were measured to evaluate criterion validity. Adverse health outcomes (ADL and IADL disability, healthcare utilization, GDS-15, SSRS) were used to assess predictive (concurrent) validity. The internal consistency reliability was good (Cronbach's α=0.71). The test-retest reliability was strong (r=0.88). Kappa coefficients showed agreement between the TFI items and corresponding alternative measures. Alternative measures correlated as expected with the three domains of the TFI, with the exception that alternative psychological measures had similar correlations with the psychological and physical domains of the TFI. The Chinese TFI had excellent criterion validity, with AUCs regarding the physical phenotype and frailty index of 0.87 and 0.86, respectively. The predictive (concurrent) validities for the adverse health outcomes and healthcare utilization were acceptable (AUCs: 0.65-0.83). The Chinese TFI has good validity and reliability as an integral instrument to measure frailty of older people living in the community in China. Copyright © 2017 Elsevier B.V. All rights reserved.
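The internal-consistency figure reported above (Cronbach's α = 0.71) follows the standard formula: α = k/(k−1) · (1 − Σ item variances / total-score variance). A minimal sketch with made-up item scores, not the study's data:

```python
def cronbach_alpha(items):
    """items: one list of item scores per respondent (same item order throughout)."""
    def var(xs):  # unbiased sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    k = len(items[0])                                   # number of items
    item_vars = [var([row[j] for row in items]) for j in range(k)]
    total_var = var([sum(row) for row in items])        # variance of total scores
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical 3-item scale answered by four respondents.
responses = [[2, 3, 3], [1, 1, 2], [3, 3, 4], [2, 2, 3]]
alpha = cronbach_alpha(responses)
```

Values near 0.7, like the TFI's, are conventionally read as acceptable for group-level comparisons.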

  12. Photogrammetric Point Clouds Generation in Urban Areas from Integrated Image Matching and Segmentation

    NASA Astrophysics Data System (ADS)

    Ye, L.; Wu, B.

    2017-09-01

    High-resolution imagery is an attractive option for surveying and mapping applications due to the advantages of high quality imaging, short revisit time, and lower cost. Automated reliable and dense image matching is essential for photogrammetric 3D data derivation. Such matching, in urban areas, however, is extremely difficult, owing to the complexity of urban textures and severe occlusion problems on the images caused by tall buildings. Aimed at exploiting high-resolution imagery for 3D urban modelling applications, this paper presents an integrated image matching and segmentation approach for reliable dense matching of high-resolution imagery in urban areas. The approach is based on the framework of our existing self-adaptive triangulation constrained image matching (SATM), but incorporates three novel aspects to tackle the image matching difficulties in urban areas: 1) occlusion filtering based on image segmentation, 2) segment-adaptive similarity correlation to reduce the similarity ambiguity, 3) improved dense matching propagation to provide more reliable matches in urban areas. Experimental analyses were conducted using aerial images of Vaihingen, Germany and high-resolution satellite images in Hong Kong. The photogrammetric point clouds were generated, from which digital surface models (DSMs) were derived. They were compared with the corresponding airborne laser scanning data and the DSMs generated from the Semi-Global matching (SGM) method. The experimental results show that the proposed approach is able to produce dense and reliable matches comparable to SGM in flat areas, while for densely built-up areas, the proposed method performs better than SGM. The proposed method offers an alternative solution for 3D surface reconstruction in urban areas.

  13. Quantifying frontal plane knee motion during single limb squats: reliability and validity of 2-dimensional measures.

    PubMed

    Gwynne, Craig R; Curran, Sarah A

    2014-12-01

    Clinical assessment of lower limb kinematics during dynamic tasks may identify individuals who demonstrate abnormal movement patterns that may contribute to the etiology or exacerbation of knee conditions such as patellofemoral joint (PFJt) pain. The purpose of this study was to determine the reliability, validity and associated measurement error of a clinically appropriate two-dimensional (2-D) procedure for quantifying frontal plane knee alignment during single limb squats. Nine female and nine male recreationally active subjects with no history of PFJt pain had frontal plane limb alignment assessed using three-dimensional (3-D) motion analysis and digital video cameras (2-D analysis) while performing single limb squats. The association between 2-D and 3-D measures was quantified using Pearson product-moment correlation coefficients. Intraclass correlation coefficients (ICCs) were determined for within- and between-session reliability of 2-D data and the standard error of measurement (SEM) was used to establish measurement error. Frontal plane limb alignment assessed with 2-D analysis demonstrated good correlation with 3-D methods (r = 0.64 to 0.78, p < 0.001). Within-session (0.86) and between-session ICCs (0.74) demonstrated good reliability for 2-D measures, and SEM scores ranged from 2° to 4°. 2-D measures have good consistency and may provide a valid measure of lower limb alignment when compared to existing 3-D methods. Assessment of lower limb kinematics using 2-D methods may be an accurate and clinically useful alternative to 3-D motion analysis when identifying individuals who demonstrate abnormal movement patterns associated with PFJt pain. Level of evidence: 2b.
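The reported SEM range of 2° to 4° follows directly from reliability and score spread via SEM = SD·√(1 − ICC). The standard deviation below is a hypothetical value chosen only to illustrate the relationship, not a figure from the study:

```python
def standard_error_of_measurement(sd, icc):
    """SEM = SD * sqrt(1 - reliability); same units as the measurement (degrees here)."""
    return sd * (1 - icc) ** 0.5

# Hypothetical between-subject SD of 7.5 degrees with the reported
# within-session ICC of 0.86.
sem = standard_error_of_measurement(7.5, 0.86)
```

Lower ICCs inflate the SEM for the same score spread, which is why the between-session value (ICC = 0.74) yields the wider end of the error range.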

  14. Tactile Acuity Charts: A Reliable Measure of Spatial Acuity

    PubMed Central

    Bruns, Patrick; Camargo, Carlos J.; Campanella, Humberto; Esteve, Jaume; Dinse, Hubert R.; Röder, Brigitte

    2014-01-01

    For assessing tactile spatial resolution it has recently been recommended to use tactile acuity charts which follow the design principles of the Snellen letter charts for visual acuity and involve active touch. However, it is currently unknown whether acuity thresholds obtained with this newly developed psychophysical procedure are in accordance with established measures of tactile acuity that involve passive contact with fixed duration and control of contact force. Here we directly compared tactile acuity thresholds obtained with the acuity charts to traditional two-point and grating orientation thresholds in a group of young healthy adults. For this purpose, two types of charts, using either Braille-like dot patterns or embossed Landolt rings with different orientations, were adapted from previous studies. Measurements with the two types of charts were equivalent, but generally more reliable with the dot pattern chart. A comparison with the two-point and grating orientation task data showed that the test-retest reliability of the acuity chart measurements after one week was superior to that of the passive methods. Individual thresholds obtained with the acuity charts agreed reasonably with the grating orientation threshold, but less so with the two-point threshold that yielded relatively distinct acuity estimates compared to the other methods. This potentially considerable amount of mismatch between different measures of tactile acuity suggests that tactile spatial resolution is a complex entity that should ideally be measured with different methods in parallel. The simple test procedure and high reliability of the acuity charts makes them a promising complement and alternative to the traditional two-point and grating orientation thresholds. PMID:24504346

  15. A weight-of-evidence approach to assess chemicals: case study on the assessment of persistence of 4,6-substituted phenolic benzotriazoles in the environment.

    PubMed

    Brandt, Marc; Becker, Eva; Jöhncke, Ulrich; Sättler, Daniel; Schulte, Christoph

    2016-01-01

    One important purpose of the European REACH Regulation (EC No. 1907/2006) is to promote the use of alternative methods for assessment of hazards of substances in order to avoid animal testing. Experience with environmental hazard assessment under REACH shows that efficient alternative methods are needed in order to assess chemicals when standard test data are missing. One such assessment method is the weight-of-evidence (WoE) approach. In this study, the WoE approach was used to assess the persistence of certain phenolic benzotriazoles, a group that also includes substances of very high concern (SVHC). For phenolic benzotriazoles, assessment of environmental persistence is challenging because standard information, i.e., simulation tests on biodegradation, is not available. Thus, the WoE approach was used: overall information resulting from many sources was considered, and the individual uncertainties of each source were analysed separately. In a second step, all information was aggregated, giving an overall picture of persistence to assess the degradability of the phenolic benzotriazoles under consideration even though the reliability of individual sources was incomplete. Overall, the evidence suggesting that phenolic benzotriazoles are very persistent in the environment is unambiguous. This was demonstrated by a WoE approach considering the prerequisites of REACH by combining several limited information sources. The combination enabled a clear overall assessment which can be reliably used for SVHC identification. Finally, it is recommended to include WoE approaches as an important tool in future environmental risk assessments.

  16. Fault Tree Analysis for an Inspection Robot in a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Ferguson, Thomas A.; Lu, Lixuan

    2017-09-01

    The life extension of current nuclear reactors has led to an increasing demand for inspection and maintenance of critical reactor components that are too expensive to replace. To reduce the exposure dosage to workers, robotics have become an attractive alternative as a preventative safety tool in nuclear power plants. It is crucial to understand the reliability of these robots in order to increase confidence in their results. This study applies Fault Tree (FT) analysis to a coolant outlet pipe snake-arm inspection robot in a nuclear power plant. Fault trees were constructed for a qualitative analysis to determine the reliability of the robot. Insight into the applicability of fault tree methods for inspection robotics in the nuclear industry is gained through this investigation.
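Fault trees combine basic-event probabilities through AND/OR gates up to the top event. A minimal sketch with a hypothetical two-level tree and made-up failure probabilities (not the study's actual tree), assuming independent basic events:

```python
def or_gate(probs):
    """Top event occurs if ANY input fails: 1 - product of survival probabilities."""
    p_none = 1.0
    for q in probs:
        p_none *= (1 - q)
    return 1 - p_none

def and_gate(probs):
    """Top event occurs only if ALL inputs fail (e.g. primary AND its backup)."""
    p_all = 1.0
    for q in probs:
        p_all *= q
    return p_all

# Hypothetical snake-arm robot failure: camera fails, OR both the primary
# actuator AND its redundant backup fail.
p_top = or_gate([0.01, and_gate([0.05, 0.05])])
```

The AND gate shows why redundancy helps: two 5% actuators contribute only 0.25% to the top event, leaving the single-string camera as the dominant cut set.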

  17. Microbial Diagnostic Microarrays for the Detection and Typing of Food- and Water-Borne (Bacterial) Pathogens

    PubMed Central

    Kostić, Tanja; Sessitsch, Angela

    2011-01-01

    Reliable and sensitive pathogen detection in clinical and environmental (including food and water) samples is of greatest importance for public health. Standard microbiological methods have several limitations and improved alternatives are needed. Most important requirements for reliable analysis include: (i) specificity; (ii) sensitivity; (iii) multiplexing potential; (iv) robustness; (v) speed; (vi) automation potential; and (vii) low cost. Microarray technology can, through its very nature, fulfill many of these requirements directly and the remaining challenges have been tackled. In this review, we attempt to compare performance characteristics of the microbial diagnostic microarrays developed for the detection and typing of food and water pathogens, and discuss limitations, points still to be addressed and issues specific for the analysis of food, water and environmental samples. PMID:27605332

  18. Robust Derivation of Risk Reduction Strategies

    NASA Technical Reports Server (NTRS)

    Richardson, Julian; Port, Daniel; Feather, Martin

    2007-01-01

    Effective risk reduction strategies can be derived mechanically given sufficient characterization of the risks present in the system and the effectiveness of available risk reduction techniques. In this paper, we address an important question: can we reliably expect mechanically derived risk reduction strategies to be better than fixed or hand-selected risk reduction strategies, given that the quantitative assessment of risks and risk reduction techniques upon which mechanical derivation is based is difficult and likely to be inaccurate? We consider this question relative to two methods for deriving effective risk reduction strategies: the Strategic Method defined by Kazman, Port et al. [Port et al, 2005], and the Defect Detection and Prevention (DDP) tool [Feather & Cornford, 2003]. We performed a number of sensitivity experiments to evaluate how inaccurate knowledge of risk and risk reduction techniques affects the performance of the strategies computed by the Strategic Method compared to a variety of alternative strategies. The experimental results indicate that strategies computed by the Strategic Method were significantly more effective than the alternative risk reduction strategies, even when knowledge of risk and risk reduction techniques was very inaccurate. The robustness of the Strategic Method suggests that its use should be considered in a wide range of projects.

  19. Prioritization methodology for chemical replacement

    NASA Technical Reports Server (NTRS)

    Goldberg, Ben; Cruit, Wendy; Schutzenhofer, Scott

    1995-01-01

    This methodology serves to define a system for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semi-quantitative approach derived from quality function deployment techniques (QFD Matrix). QFD is a conceptual map that provides a method of transforming customer wants and needs into quantitative engineering terms. This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives.
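    A QFD-style prioritization matrix reduces to a weighted sum of ratings per candidate. The sketch below shows only that arithmetic; the criteria weights and candidate ratings are hypothetical, not drawn from the NASA study.

```python
def qfd_scores(weights, ratings):
    """Weighted QFD-style score for each candidate: sum over criteria of
    (criterion weight) x (candidate rating on that criterion)."""
    return {cand: sum(weights[c] * r[c] for c in weights)
            for cand, r in ratings.items()}

# Hypothetical criterion weights (environmental, cost, safety,
# reliability, schedule) and 1-9 ratings for two replacement candidates
weights = {"env": 5, "cost": 3, "safety": 5, "rel": 4, "sched": 2}
ratings = {
    "solvent_A": {"env": 9, "cost": 3, "safety": 7, "rel": 5, "sched": 6},
    "solvent_B": {"env": 4, "cost": 8, "safety": 6, "rel": 7, "sched": 8},
}
scores = qfd_scores(weights, ratings)
best = max(scores, key=scores.get)  # highest-priority candidate
```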

  20. Application of real-time PCR for total airborne bacterial assessment: Comparison with epifluorescence microscopy and culture-dependent methods

    NASA Astrophysics Data System (ADS)

    Rinsoz, Thomas; Duquenne, Philippe; Greff-Mirguet, Guylaine; Oppliger, Anne

    Traditional culture-dependent methods to quantify and identify airborne microorganisms are limited by factors such as short-duration sampling times and inability to count non-culturable or non-viable bacteria. Consequently, the quantitative assessment of bioaerosols is often underestimated. Use of the real-time quantitative polymerase chain reaction (Q-PCR) to quantify bacteria in environmental samples presents an alternative method, which should overcome this problem. The aim of this study was to evaluate the performance of a real-time Q-PCR assay as a simple and reliable way to quantify the airborne bacterial load within poultry houses and sewage treatment plants, in comparison with epifluorescence microscopy and culture-dependent methods. The estimates of bacterial load that we obtained from real-time PCR and epifluorescence methods are comparable; however, our analysis of sewage treatment plants indicates that these methods give values 270-290-fold greater than those obtained by the "impaction on nutrient agar" method. The culture-dependent method of air impaction on nutrient agar was also inadequate in poultry houses, as was the impinger-culture method, which gave a bacterial load estimate 32-fold lower than that obtained by Q-PCR. Real-time quantitative PCR thus proves to be a reliable, discerning, and simple method that could be used to estimate airborne bacterial load in a broad variety of other environments expected to carry high numbers of airborne bacteria.

  1. Real-time electron density measurements from Cotton-Mouton effect in JET machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brombin, M. (Electrical Engineering Department, Padova University, via Gradenigo 6-A, 35131 Padova); Boboc, A.

    Real-time density profile measurements are essential for advanced fusion tokamak operation and interferometry is a proven method for this task. Nevertheless, as a consequence of edge localized modes, pellet injections, fast density increases, or disruptions, the interferometer is subject to fringe jumps, which produce loss of the signal preventing reliable use of the measured density in a real-time feedback controller. An alternative method to measure the density is polarimetry based on the Cotton-Mouton effect, which is proportional to the line-integrated electron density. A new analysis approach has been implemented and tested to verify the reliability of the Cotton-Mouton measurements for a wide range of plasma parameters and to compare the density evaluated from polarimetry with that from interferometry. The density measurements based on polarimetry are going to be integrated in the real-time control system of JET since the difference with the interferometry is within one fringe for more than 90% of the cases.
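    For context, the Cotton-Mouton phase shift is often approximated as proportional to the wavelength cubed times the line integral of ne*Bperp^2, with a literature coefficient of roughly 2.45e-11 in SI units. The sketch below inverts that small-angle relation to recover a line-integrated density; the coefficient and example numbers are approximations for illustration, not the JET analysis itself.

```python
# Small-angle Cotton-Mouton approximation (SI units): phase shift (rad)
# ~ C_CM * wavelength^3 * Bperp^2 * integral(ne dl). The coefficient is
# an approximate literature value, used here only for illustration.
C_CM = 2.45e-11

def line_density_from_cm(phi_rad, wavelength_m, b_perp_t):
    """Invert the small-angle relation for the line-integrated electron
    density (m^-2), assuming Bperp is constant along the chord."""
    return phi_rad / (C_CM * wavelength_m**3 * b_perp_t**2)

# Round trip: phase produced by 1e20 m^-2 at a 195 um far-infrared
# wavelength and a 3 T perpendicular field (illustrative numbers)
phi = C_CM * (195e-6) ** 3 * 3.0 ** 2 * 1e20
recovered = line_density_from_cm(phi, 195e-6, 3.0)
```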

  2. A Bayesian Approach for Measurements of Stray Neutrons at Proton Therapy Facilities: Quantifying Neutron Dose Uncertainty.

    PubMed

    Dommert, M; Reginatto, M; Zboril, M; Fiedler, F; Helmbrecht, S; Enghardt, W; Lutz, B

    2017-11-28

    Bonner sphere measurements are typically analyzed using unfolding codes. It is well known that it is difficult to get reliable estimates of uncertainties for standard unfolding procedures. An alternative approach is to analyze the data using Bayesian parameter estimation. This method provides reliable estimates of the uncertainties of neutron spectra, leading to rigorous estimates of uncertainties of the dose. We extend previous Bayesian approaches and apply the method to stray neutrons in proton therapy environments by introducing a new parameterized model which describes the main features of the expected neutron spectra. The parameterization is based on information that is available from measurements and detailed Monte Carlo simulations. This approach was validated with the results of an experiment using Bonner spheres carried out at the experimental hall of the OncoRay proton therapy facility in Dresden. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
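    The paper's parameterized spectrum model is not given in the abstract, but the core of Bayesian parameter estimation, a posterior over model parameters given Poisson-distributed sphere counts, can be sketched for a single amplitude parameter on a grid. The sphere responses and counts below are invented for illustration.

```python
import math

def grid_posterior(counts, response, amps):
    """Posterior weights (flat prior) for one amplitude parameter 'a',
    with sphere counts modeled as Poisson with means a * response[i]."""
    logp = []
    for a in amps:
        ll = 0.0
        for n, r in zip(counts, response):
            mu = a * r
            ll += n * math.log(mu) - mu - math.lgamma(n + 1)
        logp.append(ll)
    m = max(logp)                      # stabilize the exponentials
    w = [math.exp(x - m) for x in logp]
    s = sum(w)
    return [x / s for x in w]

# Hypothetical sphere responses; counts generated noise-free by amplitude 2.0
response = [50.0, 120.0, 200.0, 90.0]
counts = [round(2.0 * r) for r in response]
amps = [0.5 + 0.01 * i for i in range(400)]   # grid from 0.5 to 4.49
post = grid_posterior(counts, response, amps)
a_mean = sum(a * p for a, p in zip(amps, post))  # posterior-mean amplitude
```

The posterior spread around `a_mean` is what yields the dose uncertainty in this kind of analysis.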

  3. Analysis of simplified heat transfer models for thermal property determination of nano-film by TDTR method

    NASA Astrophysics Data System (ADS)

    Wang, Xinwei; Chen, Zhe; Sun, Fangyuan; Zhang, Hang; Jiang, Yuyan; Tang, Dawei

    2018-03-01

    Heat transfer in nanostructures is of critical importance for a wide range of applications such as functional materials and thermal management of electronics. Time-domain thermoreflectance (TDTR) has proven to be a reliable measurement technique for thermal property determination of nanoscale structures. However, it is difficult to determine more than three thermal properties at the same time. Heat transfer model simplifications can reduce the number of fitting variables and provide an alternative way to determine thermal properties. In this paper, two simplified models are investigated and analyzed by the transfer matrix method and simulations. TDTR measurements are performed on Al-SiO2-Si samples with different SiO2 thicknesses. Both theoretical and experimental results show that the simplified tri-layer model (STM) is reliable and suitable for thin film samples over a wide range of thicknesses. Furthermore, the STM can also extract the intrinsic thermal conductivity and interfacial thermal resistance from serial samples of different thickness.
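    The transform-matrix (thermal quadrupole) method mentioned here represents each layer and interface as a 2x2 matrix relating temperature and heat flux on its two faces, multiplied through the stack. A minimal sketch, with hypothetical Al/SiO2 property values and the Si substrate omitted:

```python
import math, cmath

def layer_matrix(k, rho_c, d, omega):
    """Thermal quadrupole of a homogeneous layer (conductivity k,
    volumetric heat capacity rho_c, thickness d) at angular heating
    frequency omega; relates (T, heat flux) on its two faces."""
    q = cmath.sqrt(1j * omega * rho_c / k)   # complex thermal wavenumber
    return [[cmath.cosh(q * d), cmath.sinh(q * d) / (k * q)],
            [k * q * cmath.sinh(q * d), cmath.cosh(q * d)]]

def interface_matrix(resistance):
    """Interfacial thermal resistance as a quadrupole element."""
    return [[1.0, resistance], [0.0, 1.0]]

def mat_mul(a, b):
    """2x2 complex matrix product."""
    return [[a[0][0] * b[0][0] + a[0][1] * b[1][0],
             a[0][0] * b[0][1] + a[0][1] * b[1][1]],
            [a[1][0] * b[0][0] + a[1][1] * b[1][0],
             a[1][0] * b[0][1] + a[1][1] * b[1][1]]]

# Hypothetical Al film / interface / SiO2 film at 1 MHz modulation
omega_mod = 2 * math.pi * 1e6
stack = mat_mul(mat_mul(layer_matrix(237.0, 2.42e6, 100e-9, omega_mod),
                        interface_matrix(1e-8)),
                layer_matrix(1.4, 1.66e6, 300e-9, omega_mod))
```

Each layer matrix has unit determinant, a useful sanity check when assembling longer stacks.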

  4. Assessment of an ensemble seasonal streamflow forecasting system for Australia

    NASA Astrophysics Data System (ADS)

    Bennett, James C.; Wang, Quan J.; Robertson, David E.; Schepen, Andrew; Li, Ming; Michael, Kelvin

    2017-11-01

    Despite an increasing availability of skilful long-range streamflow forecasts, many water agencies still rely on simple resampled historical inflow sequences (stochastic scenarios) to plan operations over the coming year. We assess a recently developed forecasting system called forecast guided stochastic scenarios (FoGSS) as a skilful alternative to standard stochastic scenarios for the Australian continent. FoGSS uses climate forecasts from a coupled ocean-land-atmosphere prediction system, post-processed with the method of calibration, bridging and merging. Ensemble rainfall forecasts force a monthly rainfall-runoff model, while a staged hydrological error model quantifies and propagates hydrological forecast uncertainty through forecast lead times. FoGSS is able to generate ensemble streamflow forecasts in the form of monthly time series to a 12-month forecast horizon. FoGSS is tested on 63 Australian catchments that cover a wide range of climates, including 21 ephemeral rivers. In all perennial and many ephemeral catchments, FoGSS provides an effective alternative to resampled historical inflow sequences. FoGSS generally produces skilful forecasts at shorter lead times (< 4 months), and transitions to climatology-like forecasts at longer lead times. Forecasts are generally reliable and unbiased. However, FoGSS does not perform well in very dry catchments (catchments that experience zero flows more than half the time in some months), sometimes producing strongly negative forecast skill and poor reliability. We attempt to improve forecasts through the use of (i) ESP rainfall forcings, (ii) different rainfall-runoff models, and (iii) a Bayesian prior to encourage the error model to return climatology forecasts in months when the rainfall-runoff model performs poorly. Of these, the use of the prior offers the clearest benefit in very dry catchments, where it moderates strongly negative forecast skill and reduces bias in some instances.
However, the prior does not remedy poor reliability in very dry catchments. Overall, FoGSS is an attractive alternative to historical inflow sequences in all but the driest catchments. We discuss ways in which forecast reliability in very dry catchments could be improved in future work.
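    Ensemble forecast skill against a climatology-like reference (such as resampled historical inflows) is commonly scored with the CRPS; a sample-based sketch follows. The ensembles and observation are invented, and this is not the FoGSS verification code.

```python
def crps_ensemble(members, obs):
    """Sample-based CRPS: mean |x_i - y| minus half the mean pairwise
    spread between ensemble members (lower is better)."""
    m = len(members)
    term1 = sum(abs(x - obs) for x in members) / m
    term2 = sum(abs(a - b) for a in members for b in members) / (2 * m * m)
    return term1 - term2

def skill_score(crps_forecast, crps_reference):
    """Positive when the forecast beats the reference; 1.0 is perfect,
    negative values mean the reference would have been better."""
    return 1.0 - crps_forecast / crps_reference

# Invented monthly-flow example: 3-member forecast vs 4-member climatology
fcst = [180.0, 220.0, 260.0]
clim = [100.0, 200.0, 300.0, 400.0]
obs = 210.0
skill = skill_score(crps_ensemble(fcst, obs), crps_ensemble(clim, obs))
```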

  5. Identification of Staphylococcus saprophyticus isolated from patients with urinary tract infection using a simple set of biochemical tests correlating with 16S-23S interspace region molecular weight patterns.

    PubMed

    Ferreira, Adriano Martison; Bonesso, Mariana Fávero; Mondelli, Alessandro Lia; da Cunha, Maria de Lourdes Ribeiro de Souza

    2012-12-01

    The emergence of Staphylococcus spp. not only as human pathogens, but also as reservoirs of antibiotic resistance determinants, requires the development of methods for their rapid and reliable identification in medically important samples. The aim of this study was to compare three phenotypic methods for the identification of Staphylococcus spp. isolated from patients with urinary tract infection, using PCR of the 16S-23S interspace region generating molecular weight patterns (ITR-PCR) as reference. All 57 S. saprophyticus isolates studied were correctly identified using only the novobiocin disk. A rate of agreement of 98.0% was obtained for the simplified battery of biochemical tests in relation to ITR-PCR, whereas the Vitek I system and novobiocin disk showed 81.2% and 89.1% agreement, respectively. No other novobiocin-resistant non-S. saprophyticus strain was identified. Thus, the novobiocin disk is a feasible alternative for the identification of S. saprophyticus in urine samples in laboratories with limited resources. ITR-PCR and the simplified battery of biochemical tests were more reliable than the commercial systems currently available. This study confirms that automated systems are still unable to correctly differentiate CoNS species and that simple, reliable and inexpensive methods can be used for routine identification. Copyright © 2012 Elsevier B.V. All rights reserved.

  6. Splicing predictions reliably classify different types of alternative splicing

    PubMed Central

    Busch, Anke; Hertel, Klemens J.

    2015-01-01

    Alternative splicing is a key player in the creation of complex mammalian transcriptomes and its misregulation is associated with many human diseases. Multiple mRNA isoforms are generated from most human genes, a process mediated by the interplay of various RNA signature elements and trans-acting factors that guide spliceosomal assembly and intron removal. Here, we introduce a splicing predictor that evaluates hundreds of RNA features simultaneously to successfully differentiate between exons that are constitutively spliced, exons that undergo alternative 5′ or 3′ splice-site selection, and alternative cassette-type exons. Surprisingly, the splicing predictor did not feature strong discriminatory contributions from binding sites for known splicing regulators. Rather, the ability of an exon to be involved in one or multiple types of alternative splicing is dictated by its immediate sequence context, mainly driven by the identity of the exon's splice sites, the conservation around them, and its exon/intron architecture. Thus, the splicing behavior of human exons can be reliably predicted based on basic RNA sequence elements. PMID:25805853

  7. A radio-aware routing algorithm for reliable directed diffusion in lossy wireless sensor networks.

    PubMed

    Kim, Yong-Pyo; Jung, Euihyun; Park, Yong-Jin

    2009-01-01

    In Wireless Sensor Networks (WSNs), transmission errors occur frequently due to node failure, battery discharge, contention or interference by objects. Although Directed Diffusion has been considered a prominent data-centric routing algorithm, it has some weaknesses under unexpected network errors. In order to address these problems, we proposed a radio-aware routing algorithm to improve the reliability of Directed Diffusion in lossy WSNs. The proposed algorithm is aware of the network status based on the radio information from the MAC and PHY layers using a cross-layer design. The cross-layer design can be used to get detailed information about the current status of the wireless network, such as the link quality or transmission errors of communication links. The radio information indicating variant network conditions and link quality was used to determine an alternative route that provides reliable data transmission under lossy WSNs. According to the simulation results, the radio-aware reliable routing algorithm showed better performance in both grid and random topologies with various error rates. The proposed solution suggests the possibility of providing a reliable transmission method for QoS requests in lossy WSNs based on radio-awareness. Energy and mobility issues will be addressed in future work.

  8. An analysis code for the Rapid Engineering Estimation of Momentum and Energy Losses (REMEL)

    NASA Technical Reports Server (NTRS)

    Dechant, Lawrence J.

    1994-01-01

    Nonideal behavior has traditionally been modeled by defining efficiency (a comparison between actual and isentropic processes), with subsequent specification by empirical or heuristic methods. With the increasing complexity of aeropropulsion system designs, the reliability of these more traditional methods is uncertain. Computational fluid dynamics (CFD) and experimental methods can provide this information but are expensive in terms of human resources, cost, and time. This report discusses an alternative to empirical and CFD methods by applying classical analytical techniques and a simplified flow model to provide rapid engineering estimates of these losses based on steady, quasi-one-dimensional governing equations including viscous and heat transfer terms (estimated by Reynolds' analogy). A preliminary verification of REMEL has been compared with full Navier-Stokes (FNS) and CFD boundary layer computations for several high-speed inlet and forebody designs. REMEL results compare quite well with those of more complex methods, and solutions compare very well with simple degenerate and asymptotic results such as Fanno flow, isentropic variable-area flow, and a newly developed solution for combined variable-area duct flow with friction. These solution comparisons may offer an alternative to transitional and CFD-intensive methods for the rapid estimation of viscous and heat transfer losses in aeropropulsion systems.

  9. Comparison of Elastography, Serum Marker Scores, and Histology for the Assessment of Liver Fibrosis in Hepatitis B Virus (HBV)-Infected Patients in Burkina Faso

    PubMed Central

    Bonnard, Philippe; Sombié, Roger; Lescure, Francois-Xavier; Bougouma, Alain; Guiard-Schmid, Jean Baptiste; Poynard, Thierry; Calès, Paul; Housset, Chantal; Callard, Patrice; Pendeven, Catherine Le; Drabo, Joseph; Carrat, Fabrice; Pialoux, Gilles

    2010-01-01

    Liver fibrosis (LF) must be assessed before taking treatment decisions in hepatitis B. In Burkina Faso, liver biopsy (LB) remains the “gold standard” method for this purpose. Access to treatment might be simpler if reliable alternative techniques for LF evaluation were available. Hepatitis B virus (HBV)-infected patients who underwent LB were invited to have liver stiffness measurement (Fibroscan) and serum marker assays. Fifty-nine patients were enrolled. The performance of each technique for distinguishing F0F1 from F2F3F4 was compared. The areas under the receiver operating characteristic curves (AUROCs) were 0.61, 0.71, 0.79, 0.82, and 0.87 for the aspartate transaminase to platelet ratio index (APRI), Fib-4, Fibrotest, Fibrometre, and Fibroscan, respectively. Elastometric thresholds were identified for significant fibrosis and cirrhosis. Combined use of Fibroscan and a serum marker could avoid 80% of biopsies. This study shows that the results of alternative methods concord with those of histology in HBV-infected patients in Burkina Faso. These alternative techniques could help physicians to identify patients requiring treatment. PMID:20207872
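    The AUROC values reported here can be computed nonparametrically from the ranks of marker values in the two fibrosis groups (the Mann-Whitney statistic). A sketch with hypothetical stiffness values, not the study's data:

```python
def auroc(scores_pos, scores_neg):
    """Rank-based AUROC: probability that a randomly chosen positive case
    scores above a randomly chosen negative one, ties counting half
    (equivalent to the Mann-Whitney U divided by n_pos * n_neg)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical stiffness values (kPa): significant fibrosis (F2-F4)
# vs mild fibrosis (F0-F1)
significant = [9.8, 12.4, 7.5, 14.0]
mild = [4.1, 6.0, 7.5, 5.2]
auc = auroc(significant, mild)
```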

  10. Massive Photons: An Infrared Regularization Scheme for Lattice QCD+QED.

    PubMed

    Endres, Michael G; Shindler, Andrea; Tiburzi, Brian C; Walker-Loud, André

    2016-08-12

    Standard methods for including electromagnetic interactions in lattice quantum chromodynamics calculations result in power-law finite-volume corrections to physical quantities. Removing these by extrapolation requires costly computations at multiple volumes. We introduce a photon mass to alternatively regulate the infrared, and rely on effective field theory to remove its unphysical effects. Electromagnetic modifications to the hadron spectrum are reliably estimated with a precision and cost comparable to conventional approaches that utilize multiple larger volumes. A significant overall cost advantage emerges when accounting for ensemble generation. The proposed method may benefit lattice calculations involving multiple charged hadrons, as well as quantum many-body computations with long-range Coulomb interactions.

  11. Method to Eliminate Flux Linkage DC Component in Load Transformer for Static Transfer Switch

    PubMed Central

    2014-01-01

    Many industrial and commercial sensitive loads are subject to voltage sags and interruptions. The static transfer switch (STS) based on thyristors is applied to improve power quality and reliability. However, the transfer will result in severe inrush current in the load transformer because of the DC component in the magnetic flux generated in the transfer process. The inrush current, which is typically 2-30 p.u., can cause maloperation of relay protective devices and bring potential damage to the transformer. The way to eliminate the DC component is to transfer the related phases when the residual flux linkage of the load transformer and the prospective flux linkage of the alternate source are equal. This paper analyzes how the flux linkage of each winding in the load transformer changes in the transfer process. Based on the residual flux linkage when the preferred source is completely disconnected, the method to calculate the proper time point to close each phase of the alternate source is developed. Simulation and laboratory experiment results are presented to show the effectiveness of the transfer method. PMID:25133255

  12. Method to eliminate flux linkage DC component in load transformer for static transfer switch.

    PubMed

    He, Yu; Mao, Chengxiong; Lu, Jiming; Wang, Dan; Tian, Bing

    2014-01-01

    Many industrial and commercial sensitive loads are subject to voltage sags and interruptions. The static transfer switch (STS) based on thyristors is applied to improve power quality and reliability. However, the transfer will result in severe inrush current in the load transformer because of the DC component in the magnetic flux generated in the transfer process. The inrush current, which is typically 2-30 p.u., can cause maloperation of relay protective devices and bring potential damage to the transformer. The way to eliminate the DC component is to transfer the related phases when the residual flux linkage of the load transformer and the prospective flux linkage of the alternate source are equal. This paper analyzes how the flux linkage of each winding in the load transformer changes in the transfer process. Based on the residual flux linkage when the preferred source is completely disconnected, the method to calculate the proper time point to close each phase of the alternate source is developed. Simulation and laboratory experiment results are presented to show the effectiveness of the transfer method.
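    The timing rule described, close a phase when the alternate source's prospective flux linkage equals the transformer's residual flux linkage, can be sketched numerically. The waveform, residual value, and phase angle below are illustrative assumptions, not values from the paper.

```python
import math

def prospective_flux(t, v_m, omega, theta):
    """Steady-state flux linkage of a source v(t) = Vm*sin(omega*t + theta):
    the integral of v, i.e. -(Vm/omega)*cos(omega*t + theta)."""
    return -(v_m / omega) * math.cos(omega * t + theta)

def transfer_instant(residual, v_m, omega, theta, t0, dt=1e-6, t_max=0.1):
    """Earliest time >= t0 at which the alternate source's prospective flux
    linkage crosses the residual flux linkage (sign-change scan followed
    by linear interpolation)."""
    t = t0
    f_prev = prospective_flux(t, v_m, omega, theta) - residual
    while t < t0 + t_max:
        if f_prev == 0.0:
            return t
        t_next = t + dt
        f_next = prospective_flux(t_next, v_m, omega, theta) - residual
        if f_prev * f_next < 0.0:
            return t + dt * f_prev / (f_prev - f_next)
        t, f_prev = t_next, f_next
    return None

# Illustrative 50 Hz case: 1 p.u. source lagging 30 degrees; residual flux
# linkage left at 40% of the peak value Vm/omega after disconnection
omega_grid = 2 * math.pi * 50
residual = 0.4 / omega_grid
t_star = transfer_instant(residual, 1.0, omega_grid, math.radians(30), t0=0.0)
```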

  13. A comparative study between different alternatives to prepare gaseous standards for calibrating UV-Ion Mobility Spectrometers.

    PubMed

    Criado-García, Laura; Garrido-Delgado, Rocío; Arce, Lourdes; Valcárcel, Miguel

    2013-07-15

    A UV-Ion Mobility Spectrometer is a simple, rapid, inexpensive instrument widely used in environmental analysis among other fields. The advantageous features of its underlying technology can be of great help towards developing reliable, economical methods for determining gaseous compounds from gaseous, liquid and solid samples. Developing an effective method using UV-Ion Mobility Spectrometry (UV-IMS) to determine volatile analytes entails using appropriate gaseous standards for calibrating the spectrometer. In this work, two home-made sample introduction systems (SISs) and a commercial gas generator were used to obtain such gaseous standards. The first home-made SIS was a static head-space to measure compounds present in liquid samples and the other home-made system was an exponential dilution set-up to measure compounds present in gaseous samples. Gaseous compounds generated by each method were determined on-line by UV-IMS. Target analytes chosen for this comparative study were ethanol, acetone, benzene, toluene, ethylbenzene and xylene isomers. The different alternatives were acceptable in terms of sensitivity, precision and selectivity. Copyright © 2013 Elsevier B.V. All rights reserved.
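    An ideal exponential dilution set-up behaves as a well-mixed chamber flushed with clean gas, so the concentration decays as C(t) = C0*exp(-F*t/V). A sketch with hypothetical flow, volume, and concentration values:

```python
import math

def dilution_concentration(c0, flow_lpm, volume_l, t_min):
    """Ideal well-mixed exponential-dilution chamber flushed with clean
    gas at flow F (L/min): C(t) = C0 * exp(-F * t / V)."""
    return c0 * math.exp(-flow_lpm * t_min / volume_l)

def time_to_reach(c_target, c0, flow_lpm, volume_l):
    """Minutes for the chamber concentration to fall to c_target."""
    return (volume_l / flow_lpm) * math.log(c0 / c_target)

# Hypothetical run: 100 ppm analyte, 1 L chamber, 0.5 L/min clean gas
c_after_2min = dilution_concentration(100.0, 0.5, 1.0, 2.0)
```

Sweeping the elapsed time generates the graded series of standards used for a calibration curve.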

  14. Comparison of 13C Nuclear Magnetic Resonance and Fourier Transform Infrared spectroscopy for estimating humification and aromatization of soil organic matter

    NASA Astrophysics Data System (ADS)

    Rogers, K.; Cooper, W. T.; Hodgkins, S. B.; Verbeke, B. A.; Chanton, J.

    2017-12-01

    Solid state direct polarization 13C NMR spectroscopy (DP-NMR) is generally considered the most quantitatively reliable method for soil organic matter (SOM) characterization, including determination of the relative abundances of carbon functional groups. These functional abundances can then be used to calculate important soil parameters such as degree of humification and extent of aromaticity that reveal differences in reactivity or compositional changes along gradients (e.g. thaw chronosequence in permafrost). Unfortunately, the DP-NMR experiment is time-consuming, with a single sample often requiring over 24 hours of instrument time. Alternatively, solid state cross polarization 13C NMR (CP-NMR) can circumvent this problem, reducing analysis times to 4-6 hours but with some loss of quantitative reliability. Attenuated Total Reflectance Fourier Transform Infrared spectroscopy (ATR-FTIR) is a quick and relatively inexpensive method for characterizing solid materials, and has been suggested as an alternative to NMR for analysis of soil organic matter and determination of humification (HI) and aromatization (AI) indices. However, the quantitative reliability of ATR-FTIR for SOM analyses has never been verified, nor have any ATR-FTIR data been compared to similar measurements by NMR. In this work we focused on FTIR vibrational bands that correspond to the three functional groups used to calculate HI and AI values: carbohydrates (1030 cm-1), aromatics (1510, 1630 cm-1), and aliphatics (2850, 2920 cm-1). Data from ATR-FTIR measurements were compared to analogous quantitation by DP- and CP-NMR using peat samples from Sweden, Minnesota, and North Carolina. DP- and CP-NMR correlate very strongly, although the correlations are not always 1:1. Direct comparison of relative abundances of the three functional groups determined by NMR and ATR-FTIR yielded satisfactory results for carbohydrates (r2 = 0.78) and aliphatics (r2 = 0.58), but less satisfactory results for aromatics (r2 = 0.395). ATR-FTIR has to this point been used primarily for relative abundance analyses (e.g. calculating HI and AI values), but these results suggest FTIR can provide quantitative reliability that approaches that of NMR.
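    HI and AI values are band-intensity ratios of the functional groups listed above. Band assignments and baseline treatment vary between studies, so the conventions below (aromatic/carbohydrate for HI, aromatic/aliphatic for AI) and the example band areas are assumptions for illustration, not the paper's definitions or data.

```python
def humification_index(a1630, a1030):
    """Assumed convention: aromatic C=C band (~1630 cm-1) over the
    carbohydrate C-O band (~1030 cm-1); humification enriches aromatics
    relative to easily degraded carbohydrates."""
    return a1630 / a1030

def aromatization_index(a1510, a2920):
    """Assumed convention: aromatic ring band (~1510 cm-1) over the
    aliphatic C-H stretch (~2920 cm-1)."""
    return a1510 / a2920

# Hypothetical baseline-corrected band areas for two peat depths
shallow = {"a1030": 1.00, "a1510": 0.08, "a1630": 0.30, "a2920": 0.40}
deep = {"a1030": 0.70, "a1510": 0.14, "a1630": 0.42, "a2920": 0.38}
hi_shallow = humification_index(shallow["a1630"], shallow["a1030"])
hi_deep = humification_index(deep["a1630"], deep["a1030"])
```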

  15. Alternative Estimates of the Reliability of College Grade Point Averages. Professional File. Article 130, Spring 2013

    ERIC Educational Resources Information Center

    Saupe, Joe L.; Eimers, Mardy T.

    2013-01-01

    The purpose of this paper is to explore differences in the reliabilities of cumulative college grade point averages (GPAs), estimated for unweighted and weighted, one-semester, 1-year, 2-year, and 4-year GPAs. Using cumulative GPAs for a freshman class at a major university, we estimate internal consistency (coefficient alpha) reliabilities for…
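    Coefficient alpha from k component scores (here, semester GPAs) is alpha = k/(k-1) * (1 - sum of item variances / variance of the total). A sketch with invented GPAs for five students:

```python
def cronbach_alpha(items):
    """Coefficient (Cronbach's) alpha for k component score lists:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(totals))."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return (k / (k - 1)) * (1 - sum(var(it) for it in items) / var(totals))

# Invented semester GPAs for five students (one list per semester)
sem1 = [3.2, 2.8, 3.9, 2.1, 3.5]
sem2 = [3.0, 2.9, 3.7, 2.4, 3.6]
alpha = cronbach_alpha([sem1, sem2])
```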

  16. Statistical validity of using ratio variables in human kinetics research.

    PubMed

    Liu, Yuanlong; Schutz, Robert W

    2003-09-01

    The purposes of this study were to investigate the validity of the simple ratio and three alternative deflation models and examine how the variation of the numerator and denominator variables affects the reliability of a ratio variable. A simple ratio and three alternative deflation models were fitted to four empirical data sets, and common criteria were applied to determine the best model for deflation. Intraclass correlation was used to examine the component effect on the reliability of a ratio variable. The results indicate that the validity of a deflation model depends on the statistical characteristics of the particular component variables used, and an optimal deflation model for all ratio variables may not exist. Therefore, it is recommended that different models be fitted to each empirical data set to determine the best deflation model. It was found that the reliability of a simple ratio is affected by the coefficients of variation and the within- and between-trial correlations between the numerator and denominator variables. It is recommended that researchers compute the reliability of the derived ratio scores and not assume that strong reliabilities in the numerator and denominator measures automatically lead to high reliability in the ratio measures.
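    The intraclass correlation used here can be computed from a one-way ANOVA decomposition of a subjects-by-trials table: ICC(1,1) = (MSB - MSW) / (MSB + (k-1)*MSW). The ratio scores below are invented for illustration.

```python
def icc_1_1(trials):
    """One-way random-effects intraclass correlation ICC(1,1) from a
    subjects-by-trials table of scores."""
    n = len(trials)      # subjects
    k = len(trials[0])   # trials per subject
    grand = sum(sum(row) for row in trials) / (n * k)
    row_means = [sum(row) / k for row in trials]
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(trials, row_means) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Invented ratio scores (e.g. strength / body mass) for 4 subjects, 2 trials
data = [[1.10, 1.12], [0.95, 0.97], [1.30, 1.28], [0.80, 0.83]]
icc = icc_1_1(data)
```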

  17. Direct PCR Offers a Fast and Reliable Alternative to Conventional DNA Isolation Methods for Gut Microbiomes.

    PubMed

    Videvall, Elin; Strandh, Maria; Engelbrecht, Anel; Cloete, Schalk; Cornwallis, Charlie K

    2017-01-01

    The gut microbiome of animals is emerging as an important factor influencing ecological and evolutionary processes. A major bottleneck in obtaining microbiome data from large numbers of samples is the time-consuming laboratory procedures required, specifically the isolation of DNA and generation of amplicon libraries. Recently, direct PCR kits have been developed that circumvent conventional DNA extraction steps, thereby streamlining the laboratory process by reducing preparation time and costs. However, the reliability and efficacy of direct PCR for measuring host microbiomes have not yet been investigated other than in humans with 454 sequencing. Here, we conduct a comprehensive evaluation of the microbial communities obtained with direct PCR and the widely used Mo Bio PowerSoil DNA extraction kit in five distinct gut sample types (ileum, cecum, colon, feces, and cloaca) from 20 juvenile ostriches, using 16S rRNA Illumina MiSeq sequencing. We found that direct PCR was highly comparable over a range of measures to the DNA extraction method in cecal, colon, and fecal samples. However, the two methods significantly differed in samples with comparably low bacterial biomass: cloacal and especially ileal samples. We also sequenced 100 replicate sample pairs to evaluate repeatability during both extraction and PCR stages and found that both methods were highly consistent for cecal, colon, and fecal samples (rs > 0.7) but had low repeatability for cloacal (rs = 0.39) and ileal (rs = -0.24) samples. This study indicates that direct PCR provides a fast, cheap, and reliable alternative to conventional DNA extraction methods for retrieving 16S rRNA data, which can aid future gut microbiome studies. IMPORTANCE The microbial communities of animals can have large impacts on their hosts, and the number of studies using high-throughput sequencing to measure gut microbiomes is rapidly increasing.
However, the library preparation procedure in microbiome research is both costly and time-consuming, especially for large numbers of samples. We investigated a cheaper and faster direct PCR method designed to bypass the DNA isolation steps during 16S rRNA library preparation and compared it with a standard DNA extraction method. We used both techniques on five different gut sample types collected from 20 juvenile ostriches and sequenced samples with Illumina MiSeq. The methods were highly comparable and highly repeatable in three sample types with high microbial biomass (cecum, colon, and feces), but larger differences and low repeatability were found in the microbiomes obtained from the ileum and cloaca. These results will help microbiome researchers assess library preparation procedures and plan their studies accordingly.
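    The repeatability values (rs) quoted above are Spearman rank correlations between replicate pairs; a self-contained sketch with invented replicate abundances:

```python
def rankdata(xs):
    """1-based ranks, with tied values sharing their average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for idx in order[i:j + 1]:
            ranks[idx] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    """Spearman's r_s: Pearson correlation of the rank vectors."""
    rx, ry = rankdata(xs), rankdata(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Invented relative abundances of one genus in 6 replicate sample pairs
run1 = [0.10, 0.25, 0.03, 0.40, 0.18, 0.07]
run2 = [0.12, 0.22, 0.05, 0.35, 0.20, 0.06]
rs = spearman(run1, run2)
```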

  18. RAMI modeling of selected balance of plant systems for the proposed Accelerator Production of Tritium (APT) project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radder, J.A.; Cramer, D.S.

    1997-06-01

    In order to meet Department of Energy (DOE) Defense Program requirements for tritium in the 2005-2007 time frame, new production capability must be made available. The Accelerator Production of Tritium (APT) Plant is being considered as an alternative to nuclear reactor production of tritium, which has been the preferred method in the past. The proposed APT plant will use a high-power proton accelerator to generate thermal neutrons that will be captured in 3He to produce tritium (3H). It is expected that the APT Plant will be built and operated at the DOE's Savannah River Site (SRS) in Aiken, South Carolina. Discussion is focused on Reliability, Availability, Maintainability, and Inspectability (RAMI) modeling of recent conceptual designs for balance of plant (BOP) systems in the proposed APT Plant. In the conceptual design phase, system RAMI estimates are necessary to identify the best possible system alternative and to provide a valid picture of the cost effectiveness of the proposed system for comparison with other system alternatives. RAMI estimates in this phase must necessarily be based on generic data. The objective of the RAMI analyses at the conceptual design stage is to assist the designers in achieving an optimum design which balances the reliability and maintainability requirements among the subsystems and components.
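    Conceptual-design RAMI estimates from generic data typically start from steady-state availability, MTBF/(MTBF+MTTR), combined over series and redundant groups. The BOP configuration and MTBF/MTTR values below are generic placeholders, not APT design data.

```python
def availability(mtbf_h, mttr_h):
    """Steady-state availability of one repairable unit."""
    return mtbf_h / (mtbf_h + mttr_h)

def parallel_availability(units):
    """A redundant (1-of-n) group is down only when every unit is down."""
    unavail = 1.0
    for mtbf, mttr in units:
        unavail *= 1.0 - availability(mtbf, mttr)
    return 1.0 - unavail

# Placeholder values (hours): two redundant cooling pumps feeding a single
# heat exchanger; series elements multiply, redundant elements complement
pumps = [(2000.0, 24.0), (2000.0, 24.0)]
hx = (8760.0, 72.0)
train = parallel_availability(pumps) * availability(*hx)
```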

  19. Intracranial Pressure Monitoring: Invasive versus Non-Invasive Methods—A Review

    PubMed Central

    Raboel, P. H.; Bartek, J.; Andresen, M.; Bellander, B. M.; Romner, B.

    2012-01-01

    Monitoring of intracranial pressure (ICP) has been used for decades in the fields of neurosurgery and neurology. There are multiple techniques: invasive as well as noninvasive. This paper aims to provide an overview of the advantages and disadvantages of the most common and well-known methods as well as assess whether noninvasive techniques (transcranial Doppler, tympanic membrane displacement, optic nerve sheath diameter, CT scan/MRI and fundoscopy) can be used as reliable alternatives to the invasive techniques (ventriculostomy and microtransducers). Ventriculostomy is considered the gold standard in terms of accurate measurement of pressure, although microtransducers generally are just as accurate. Both invasive techniques are associated with a minor risk of complications such as hemorrhage and infection. Furthermore, zero drift is a problem with selected microtransducers. The non-invasive techniques are without the invasive methods' risk of complication, but fail to measure ICP accurately enough to be used as routine alternatives to invasive measurement. We conclude that invasive measurement is currently the only option for accurate measurement of ICP. PMID:22720148

  20. A case study of alternative site response explanatory variables in Parkfield, California

    USGS Publications Warehouse

    Thompson, E.M.; Baise, L.G.; Kayen, R.E.; Morgan, E.C.; Kaklamanos, J.

    2011-01-01

The combination of densely-spaced strong-motion stations in Parkfield, California, and spectral analysis of surface waves (SASW) profiles provides an ideal dataset for assessing the accuracy of different site response explanatory variables. We judge accuracy in terms of spatial coverage and correlation with observations. The performance of the alternative models is period-dependent, but generally we observe that: (1) where a profile is available, the square-root-of-impedance method outperforms VS30 (average S-wave velocity to 30 m depth), and (2) where a profile is unavailable, the topographic-slope method outperforms surficial geology. The fundamental site frequency is a valuable site response explanatory variable, though less valuable than VS30. However, given the expense and difficulty of obtaining reliable estimates of VS30 and the relative ease with which the fundamental site frequency can be computed, the fundamental site frequency may prove to be a valuable site response explanatory variable for many applications. © 2011 ASCE.
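VS30, the explanatory variable compared above, is the time-averaged S-wave velocity over the top 30 m: 30 m divided by the vertical travel time through the layered profile. A minimal sketch of the standard calculation (the layer values are invented for illustration):

```python
def vs30(thicknesses_m, velocities_ms):
    """Time-averaged shear-wave velocity to 30 m depth: 30 m divided by
    the vertical S-wave travel time through the layers, clipped at 30 m."""
    depth, travel_time = 0.0, 0.0
    for h, v in zip(thicknesses_m, velocities_ms):
        take = min(h, 30.0 - depth)   # use only the part above 30 m depth
        if take <= 0.0:
            break
        travel_time += take / v
        depth += take
    if depth < 30.0:
        raise ValueError("profile shallower than 30 m")
    return 30.0 / travel_time

# 10 m of 200 m/s soil over 25 m of 400 m/s material (only 20 m of it used)
print(round(vs30([10.0, 25.0], [200.0, 400.0]), 1))  # prints 300.0
```

Note that VS30 is a travel-time (harmonic-style) average, not an arithmetic mean of the layer velocities, which is why slow near-surface layers dominate it.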

  1. The design organization test: further demonstration of reliability and validity as a brief measure of visuospatial ability.

    PubMed

    Killgore, William D S; Gogel, Hannah

    2014-01-01

    Neuropsychological assessments are frequently time-consuming and fatiguing for patients. Brief screening evaluations may reduce test duration and allow more efficient use of time by permitting greater attention toward neuropsychological domains showing probable deficits. The Design Organization Test (DOT) was initially developed as a 2-min paper-and-pencil alternative for the Block Design (BD) subtest of the Wechsler scales. Although initially validated for clinical neurologic patients, we sought to further establish the reliability and validity of this test in a healthy, more diverse population. Two alternate versions of the DOT and the Wechsler Abbreviated Scale of Intelligence (WASI) were administered to 61 healthy adult participants. The DOT showed high alternate forms reliability (r = .90-.92), and the two versions yielded equivalent levels of performance. The DOT was highly correlated with BD (r = .76-.79) and was significantly correlated with all subscales of the WASI. The DOT proved useful when used in lieu of BD in the calculation of WASI IQ scores. Findings support the reliability and validity of the DOT as a measure of visuospatial ability and suggest its potential worth as an efficient estimate of intellectual functioning in situations where lengthier tests may be inappropriate or unfeasible.

  2. Measurement of impulsive choice in rats: Same and alternate form test-retest reliability and temporal tracking

    PubMed Central

    Peterson, Jennifer R.; Hill, Catherine C.; Kirkpatrick, Kimberly

    2016-01-01

Impulsive choice is typically measured by presenting smaller-sooner (SS) versus larger-later (LL) rewards, with biases towards the SS indicating impulsivity. The current study tested rats on different impulsive choice procedures with LL delay manipulations to assess same-form and alternate-form test-retest reliability. In the systematic-GE procedure (Green & Estle, 2003), the LL delay increased after several sessions of training; in the systematic-ER procedure (Evenden & Ryan, 1996), the delay increased within each session; and in the adjusting-M procedure (Mazur, 1987), the delay changed after each block of trials within a session based on each rat’s choices in the previous block. In addition to measuring choice behavior, we also assessed temporal tracking of the LL delays using the median times of responding during LL trials. The two systematic procedures yielded similar results in both choice and temporal tracking measures following extensive training, whereas the adjusting procedure resulted in relatively more impulsive choices and poorer temporal tracking. Overall, the three procedures produced acceptable same-form test-retest reliability over time, but the adjusting procedure did not show significant alternate-form test-retest reliability with the other two procedures. The results suggest that systematic procedures may supply better measurements of impulsive choice in rats. PMID:25490901

  3. Periodic component analysis as a spatial filter for SSVEP-based brain-computer interface.

    PubMed

    Kiran Kumar, G R; Reddy, M Ramasubba

    2018-06-08

Traditional spatial filters used for steady-state visual evoked potential (SSVEP) extraction, such as minimum energy combination (MEC), require the estimation of the background electroencephalogram (EEG) noise components. Even though this leads to improved performance in low signal-to-noise ratio (SNR) conditions, it makes such algorithms slow compared to standard detection methods like canonical correlation analysis (CCA) due to the additional computational cost. In this paper, periodic component analysis (πCA) is presented as an alternative spatial filtering approach that extracts the SSVEP component effectively without extensive modelling of the noise. πCA can separate out components corresponding to a given frequency of interest from the background EEG by capturing temporal information, and it does not generalize SSVEP based on rigid templates. Data from ten test subjects were used to evaluate the proposed method, and statistical tests were performed to validate the results. The experimental results show that πCA provides significantly better detection accuracy than standard CCA in low SNR conditions, and accuracy on par with MEC at a lower computational cost. Hence πCA is a reliable and efficient alternative detection algorithm for SSVEP-based brain-computer interfaces (BCI). Copyright © 2018. Published by Elsevier B.V.
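For context, the standard CCA detector that πCA is compared against correlates the multichannel EEG with sine/cosine reference signals at each candidate stimulation frequency and selects the frequency with the largest canonical correlation. A minimal sketch of that standard formulation (sampling rate, candidate frequencies, and the synthetic data are illustrative, not from the study):

```python
import numpy as np

def max_canonical_corr(X, Y):
    # First canonical correlation between the column spaces of X and Y.
    Qx, _ = np.linalg.qr(X - X.mean(axis=0))
    Qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def cca_ssvep_detect(eeg, fs, candidate_freqs, n_harmonics=2):
    """eeg: (samples, channels) array. Returns the candidate frequency
    whose sine/cosine reference set best correlates with the EEG."""
    t = np.arange(eeg.shape[0]) / fs
    scores = []
    for f in candidate_freqs:
        cols = []
        for k in range(1, n_harmonics + 1):
            cols += [np.sin(2 * np.pi * k * f * t),
                     np.cos(2 * np.pi * k * f * t)]
        scores.append(max_canonical_corr(eeg, np.column_stack(cols)))
    return candidate_freqs[int(np.argmax(scores))]

# Synthetic check: a 12 Hz SSVEP buried in noise on 4 channels.
rng = np.random.default_rng(0)
fs = 250
t = np.arange(2 * fs) / fs
eeg = np.sin(2 * np.pi * 12 * t)[:, None] + 0.5 * rng.standard_normal((t.size, 4))
print(cca_ssvep_detect(eeg, fs, [8, 10, 12, 15]))  # prints 12
```

Unlike MEC, this detector needs no explicit noise model, which is the speed advantage the abstract alludes to; πCA keeps that advantage while exploiting periodicity rather than fixed sinusoidal templates.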

  4. A Comparison of Alternate-Choice and True-False Item Forms Used in Classroom Examinations.

    ERIC Educational Resources Information Center

    Maihoff, N. A.; Mehrens, Wm. A.

    A comparison is presented of alternate-choice and true-false item forms used in an undergraduate natural science course. The alternate-choice item is a modified two-choice multiple-choice item in which the two responses are included within the question stem. This study (1) compared the difficulty level, discrimination level, reliability, and…

  5. The Development and Validation of a Two-Tiered Multiple-Choice Instrument to Identify Alternative Conceptions in Earth Science

    ERIC Educational Resources Information Center

    Mangione, Katherine Anna

    2010-01-01

    This study was to determine reliability and validity for a two-tiered, multiple- choice instrument designed to identify alternative conceptions in earth science. Additionally, this study sought to identify alternative conceptions in earth science held by preservice teachers, to investigate relationships between self-reported confidence scores and…

  6. Detection of proximal caries using quantitative light-induced fluorescence-digital and laser fluorescence: a comparative study.

    PubMed

    Yoon, Hyung-In; Yoo, Min-Jeong; Park, Eun-Jin

    2017-12-01

The purpose of this study was to evaluate the in vitro validity of quantitative light-induced fluorescence-digital (QLF-D) and laser fluorescence (DIAGNOdent) for assessing proximal caries in extracted premolars, using digital radiography as the reference method. A total of 102 extracted premolars with similar lengths and shapes were used. A single operator conducted all the examinations using three different detection methods (bitewing radiography, QLF-D, and DIAGNOdent). The bitewing x-ray scale, QLF-D fluorescence loss (ΔF), and DIAGNOdent peak readings were compared and statistically analyzed. Each method showed excellent reliability. The correlation coefficients of bitewing radiography with QLF-D and with DIAGNOdent were -0.644 and 0.448, respectively, while the value between QLF-D and DIAGNOdent was -0.382. The kappa statistics for bitewing radiography and QLF-D showed a higher diagnostic consensus than those for bitewing radiography and DIAGNOdent. QLF-D was moderately to highly accurate (AUC = 0.753 - 0.908), while DIAGNOdent was moderately to less accurate (AUC = 0.622 - 0.784). All detection methods showed statistically significant correlations, with the highest correlation between bitewing radiography and QLF-D. QLF-D was found to be a valid and reliable alternative diagnostic method to digital bitewing radiography for in vitro detection of proximal caries.

  7. Optimized approach for Ion Proton RNA sequencing reveals details of RNA splicing and editing features of the transcriptome.

    PubMed

    Brown, Roger B; Madrid, Nathaniel J; Suzuki, Hideaki; Ness, Scott A

    2017-01-01

    RNA-sequencing (RNA-seq) has become the standard method for unbiased analysis of gene expression but also provides access to more complex transcriptome features, including alternative RNA splicing, RNA editing, and even detection of fusion transcripts formed through chromosomal translocations. However, differences in library methods can adversely affect the ability to recover these different types of transcriptome data. For example, some methods have bias for one end of transcripts or rely on low-efficiency steps that limit the complexity of the resulting library, making detection of rare transcripts less likely. We tested several commonly used methods of RNA-seq library preparation and found vast differences in the detection of advanced transcriptome features, such as alternatively spliced isoforms and RNA editing sites. By comparing several different protocols available for the Ion Proton sequencer and by utilizing detailed bioinformatics analysis tools, we were able to develop an optimized random primer based RNA-seq technique that is reliable at uncovering rare transcript isoforms and RNA editing features, as well as fusion reads from oncogenic chromosome rearrangements. The combination of optimized libraries and rapid Ion Proton sequencing provides a powerful platform for the transcriptome analysis of research and clinical samples.

  8. 78 FR 63036 - Transmission Planning Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-23

    ... blend of specific quantitative and qualitative parameters for the permissible use of planned non... circumstances, Reliability Standard TPL-001-4 provides a blend of specific quantitative and qualitative... considerations, such as costs and alternatives, guards against a determination based solely on a quantitative...

  9. Evaluation of an alternative extraction procedure for enterotoxin determination in dairy products.

    PubMed

    Meyrand, A; Atrache, V; Bavai, C; Montet, M P; Vernozy-Rozand, C

    1999-06-01

A concentration protocol based on trichloroacetic acid (TCA) precipitation was evaluated and compared with the reference method using dialysis concentration. Different quantities of purified staphylococcal enterotoxins were added to pasteurized Camembert-type cheeses. Detection of enterotoxins in these cheeses was performed using an automated detection system. Raw goat milk Camembert-type cheeses involved in a staphylococcal food poisoning were also tested. Both enterotoxin extraction methods allowed detection of the lowest enterotoxin concentration level used in this study (0.5 ng g-1). Compared with the dialysis concentration method, TCA precipitation of staphylococcal enterotoxins was 'user-friendly' and less time-consuming. These results suggest that TCA precipitation is a rapid (1 h), simple and reliable method of extracting enterotoxin from food which gives excellent recovery from dairy products.

  10. The developmental stages of the middle phalanx of the third finger (MP3): a sole indicator in assessing the skeletal maturity?

    PubMed

    Madhu, S; Hegde, Amitha M; Munshi, A K

    2003-01-01

Assessment of skeletal maturity is an integral part of interceptive diagnosis and treatment planning. The present-day methods of skeletal maturity assessment, such as hand-wrist or cervical vertebrae radiographs, are expensive, require elaborate equipment, and involve high radiation exposure, especially for growing children. The present study was thus undertaken to provide a simple and practical method of skeletal maturity assessment using the developmental stages of the middle phalanx of the third finger (MP3) as seen on an IOPA film taken using a standard dental x-ray machine. The results of the study showed that this simple method was highly reliable and could be used as an alternative method to assess the skeletal maturity of growing children.

  11. Use of computed tomography as a non-invasive method for diagnosing cephenemyiosis in roe deer (Capreolus capreolus).

    PubMed

    Fidalgo, L E; López-Beceiro, A M; Vila-Pastor, M; Martínez-Carrasco, C; Barreiro-Vázquez, J D; Pérez, J M

    2015-03-01

    This study was conducted to assess the reliability of computed tomography (CT) for diagnosing bot fly infestations by Cephenemyia stimulator (Clark) (Diptera: Oestridae) in roe deer (Capreolus capreolus L.) (Artiodactyla: Cervidae). For this purpose, the heads of 30 animals were analysed, firstly by CT and then by necropsy, which was used as the reference standard method. The prevalence values obtained by both methods were identical; the prevalence of infestation was 40.0% overall, and was higher in males (45.5%) than in females (25.0%). These results highlight the usefulness of CT as an alternative or non-invasive method for diagnosing cephenemyiosis in live-captured roe deer and in hunting trophies or museum collections that cannot be destroyed or damaged. © 2014 The Royal Entomological Society.

  12. NS3 genomic sequencing and phylogenetic analysis as alternative to a commercially available assay to reliably determine hepatitis C virus subtypes 1a and 1b

    PubMed Central

    Neukam, Karin; Martínez, Alfredo P.; Culasso, Andrés C. A.; Ridruejo, Ezequiel; García, Gabriel

    2017-01-01

Objective To evaluate the use of hepatitis C virus (HCV) NS3 sequencing as an alternative to the commercially available Versant HCV 2.0 reverse hybridization line-probe assay (LiPA 2.0) to determine HCV genotype 1 (HCV-1) subtypes. Patients and methods A cohort of 104 patients infected by HCV-1 according to LiPA 2.0 was analyzed in a cross-sectional study conducted in patients seen from January 2012 to June 2016 at an outpatient clinic in Buenos Aires, Argentina. Results The samples were included within well-supported subtype clades: 64 with HCV-1b and 39 with HCV-1a infection. Twenty of the HCV-1a-infected patients were included in a supported sub-clade “1” and 19 individuals were among the basal sub-clade “2”. LiPA 2.0 failed to subtype HCV-1 in 20 (19.2%) individuals. Subtype classification determined by NS3 direct sequencing showed that 2/18 (11.1%) of the HCV-1a-infected patients as determined by LiPA 2.0 were in fact infected by HCV-1b. Of those HCV-1b-infected according to LiPA 2.0, 10/66 (15.2%) patients showed HCV-1a infection according to NS3 sequencing. Overall misclassification was 14.3% (κ-index for the concordance with NS3 sequencing = 0.635). One (1%) patient was erroneously genotyped as HCV-1 and was revealed to have an HCV genotype 4 infection. Conclusions Genomic sequencing of the HCV NS3 region represents an adequate alternative since it even distinguishes between HCV-1a clades related to resistance-associated substitutions to HCV protease inhibitors, provides reliable genetic information for genotyping/subgenotyping, and simultaneously allows determination of the presence of resistance-associated substitutions to currently recommended DAAs. PMID:28753662

  13. On the Reliability and Validity of a Numerical Reasoning Speed Dimension Derived from Response Times Collected in Computerized Testing

    ERIC Educational Resources Information Center

    Davison, Mark L.; Semmes, Robert; Huang, Lan; Close, Catherine N.

    2012-01-01

    Data from 181 college students were used to assess whether math reasoning item response times in computerized testing can provide valid and reliable measures of a speed dimension. The alternate forms reliability of the speed dimension was .85. A two-dimensional structural equation model suggests that the speed dimension is related to the accuracy…

  14. Evaluation of petrifilm series 2000 as a possible rapid method to count coliforms in foods.

    PubMed

    Priego, R; Medina, L M; Jordano, R

    2000-08-01

This research note is a preliminary comparison between the Petrifilm 2000 method and a widely used traditional enumeration method (on violet red bile agar); six batches of different foods (egg, frozen green beans, fresh sausage, a bakery product, raw minced meat, and raw milk) were studied. The reliability of the presumptive counts taken at 10, 12, and 14 h of incubation using this method was also verified by comparing the counts with the total confirmed counts at 24 h. In all the batches studied, results obtained with Petrifilm 2000 presented a close correlation to those obtained using violet red bile agar (r = 0.860) and greater sensitivity (93.33% of the samples displayed higher counts on Petrifilm 2000), showing that this method is a reliable and efficient alternative. The count taken at 10-h incubation is of clear interest as an early indicator of results in microbiological food control, since it accounted for 90% of the final count in all the batches analyzed. Counts taken at 12 and 14 h bore a greater similarity to those taken at 24 h. The Petrifilm 2000 method provides results in less than 12 h of incubation, making it a possible rapid method that adapts well to the hazard analysis critical control point (HACCP) system by enabling microbiological quality control of processing.

  15. Maximum Entropy Approach in Dynamic Contrast-Enhanced Magnetic Resonance Imaging.

    PubMed

    Farsani, Zahra Amini; Schmid, Volker J

    2017-01-01

In the estimation of physiological kinetic parameters from Dynamic Contrast-Enhanced Magnetic Resonance Imaging (DCE-MRI) data, the determination of the arterial input function (AIF) plays a key role. This paper proposes a Bayesian method to estimate the physiological parameters of DCE-MRI along with the AIF in situations where no measurement of the AIF is available. In the proposed algorithm, the maximum entropy method (MEM) is combined with the maximum a posteriori (MAP) approach. To this end, MEM is used to specify a prior probability distribution of the unknown AIF. The ability of this method to estimate the AIF is validated using the Kullback-Leibler divergence. Subsequently, the kinetic parameters can be estimated with MAP. The proposed algorithm is evaluated with a data set from a breast cancer MRI study. The application shows that the AIF can reliably be determined from the DCE-MRI data using MEM, and the kinetic parameters can be estimated subsequently. The maximum entropy method is a powerful tool for reconstructing images from many types of data and is useful for generating a probability distribution based on given information. The proposed method gives an alternative way to assess the input function from the existing data, allows a good fit of the data, and therefore yields a better estimation of the kinetic parameters. In the end, this allows for a more reliable use of DCE-MRI. Schattauer GmbH.

  16. Detection of ingested nitromethane and reliable creatinine assessment using multiple common analytical methods.

    PubMed

    Murphy, Christine M; Devlin, John J; Beuhler, Michael C; Cheifetz, Paul; Maynard, Susan; Schwartz, Michael D; Kacinko, Sherri

    2018-04-01

Nitromethane, found in fuels used for short distance racing, model cars, and model airplanes, produces a falsely elevated serum creatinine with standard creatinine analysis via the Jaffé method. Erroneous creatinine elevation often triggers extensive testing, leads to inaccurate diagnoses, and delayed or inappropriate medical interventions. Multiple reports in the literature identify "enzymatic assays" as an alternative method to detect the true value of creatinine, but this ambiguity does not help providers translate what type of enzymatic assay testing can be done in real time to determine if there is indeed false elevation. We report seven cases of ingested nitromethane where creatinine was determined via a Beckman Coulter® analyser using the Jaffé method, a Vitros® analyser, or i-STAT® point-of-care testing. Nitromethane was detected and semi-quantified using a common clinical toxic alcohol analysis method, and quantified by headspace-gas chromatography-mass spectrometry. When creatinine was determined using i-STAT® point-of-care testing or a Vitros® analyser, levels were within the normal range. Comparatively, all initial creatinine levels obtained via the Jaffé method were elevated. Nitromethane concentrations ranged from 42 to 310 μg/mL. These cases demonstrate reliable assessment of creatinine through other enzymatic methods using a Vitros® analyser or i-STAT®. Additionally, nitromethane is detectable and quantifiable using routine alcohols gas chromatography analysis and by headspace-gas chromatography-mass spectrometry.

  17. Reliability of a smartphone-based goniometer for knee joint goniometry.

    PubMed

    Ferriero, Giorgio; Vercelli, Stefano; Sartorio, Francesco; Muñoz Lasa, Susana; Ilieva, Elena; Brigatti, Elisa; Ruella, Carolina; Foti, Calogero

    2013-06-01

The aim of this study was to assess the reliability of a smartphone-based application developed for photographic-based goniometry, DrGoniometer (DrG), by comparing its measurement of the knee joint angle with that made by a universal goniometer (UG). Joint goniometry is a common mode of clinical assessment used in many disciplines, in particular in rehabilitation. One validated method is photographic-based goniometry, but the procedure is usually complex: the image has to be downloaded from the camera to a computer and then edited using dedicated software. This disadvantage may be overcome by the new generation of mobile phones (smartphones) that have computer-like functionality and an integrated digital camera. This validation study was carried out under two different controlled conditions: (i) with the participant to be measured held in a fixed position and (ii) with a battery of pictures to assess. In the first part, four raters performed repeated measurements with DrG and UG at different knee joint angles. Then, 10 other raters measured the knee at different flexion angles ranging 20-145° on a battery of 35 pictures taken in a clinical setting. The results showed that inter-rater and intra-rater correlations were always more than 0.958. Agreement with the UG showed 95% limits of agreement (LoA) widths of 18.2° (LoA = -7.5/+10.7°) and 14.1° (LoA = -6.6/+7.5°). In conclusion, DrG seems to be a reliable method for measuring knee joint angle. This mHealth application can be an alternative/additional method of goniometry, easier to use than other photographic-based goniometric assessments. Further studies are required to assess its reliability for the measurement of other joints.

  18. Seismic data restoration with a fast L1 norm trust region method

    NASA Astrophysics Data System (ADS)

    Cao, Jingjie; Wang, Yanfei

    2014-08-01

Seismic data restoration is a major strategy for providing a reliable wavefield when field data do not satisfy the Shannon sampling theorem. Recovery by sparsity-promoting inversion seeks sparse solutions of seismic data in a transformed domain; however, most methods for sparsity-promoting inversion are line-search methods, which are efficient but inclined to obtain local solutions. Using a trust region method, which can provide globally convergent solutions, is a good way to overcome this shortcoming. A trust region method for sparse inversion has been proposed previously, but its efficiency must be improved to be suitable for large-scale computation. In this paper, a new L1-norm trust region model is proposed for seismic data restoration, and a robust gradient projection method is used to solve the sub-problem. Numerical results on synthetic and field data demonstrate that the proposed trust region method achieves excellent computation speed and is a viable alternative for large-scale computation.

  19. Application of the Probabilistic Dynamic Synthesis Method to the Analysis of a Realistic Structure

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Ferri, Aldo A.

    1998-01-01

    The Probabilistic Dynamic Synthesis method is a new technique for obtaining the statistics of a desired response engineering quantity for a structure with non-deterministic parameters. The method uses measured data from modal testing of the structure as the input random variables, rather than more "primitive" quantities like geometry or material variation. This modal information is much more comprehensive and easily measured than the "primitive" information. The probabilistic analysis is carried out using either response surface reliability methods or Monte Carlo simulation. A previous work verified the feasibility of the PDS method on a simple seven degree-of-freedom spring-mass system. In this paper, extensive issues involved with applying the method to a realistic three-substructure system are examined, and free and forced response analyses are performed. The results from using the method are promising, especially when the lack of alternatives for obtaining quantitative output for probabilistic structures is considered.

  20. Application of the Probabilistic Dynamic Synthesis Method to Realistic Structures

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Ferri, Aldo A.

    1998-01-01

    The Probabilistic Dynamic Synthesis method is a technique for obtaining the statistics of a desired response engineering quantity for a structure with non-deterministic parameters. The method uses measured data from modal testing of the structure as the input random variables, rather than more "primitive" quantities like geometry or material variation. This modal information is much more comprehensive and easily measured than the "primitive" information. The probabilistic analysis is carried out using either response surface reliability methods or Monte Carlo simulation. In previous work, the feasibility of the PDS method applied to a simple seven degree-of-freedom spring-mass system was verified. In this paper, extensive issues involved with applying the method to a realistic three-substructure system are examined, and free and forced response analyses are performed. The results from using the method are promising, especially when the lack of alternatives for obtaining quantitative output for probabilistic structures is considered.
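As a toy illustration of the Monte Carlo leg of such a probabilistic analysis (a stand-in only, not the PDS method or its measured modal-parameter inputs), scatter in the stiffness and mass of a single degree-of-freedom oscillator can be propagated into statistics of its natural frequency:

```python
import math
import random

def natural_freq_hz(k, m):
    # Undamped natural frequency of a single-DOF oscillator.
    return math.sqrt(k / m) / (2.0 * math.pi)

def monte_carlo_freq_stats(k_mean, k_cv, m_mean, m_cv, n=20000, seed=0):
    """Sample normally distributed stiffness and mass scatter and
    propagate it into frequency statistics; returns (mean, std)."""
    rng = random.Random(seed)
    freqs = [natural_freq_hz(rng.gauss(k_mean, k_cv * k_mean),
                             rng.gauss(m_mean, m_cv * m_mean))
             for _ in range(n)]
    mean = sum(freqs) / n
    var = sum((f - mean) ** 2 for f in freqs) / (n - 1)
    return mean, math.sqrt(var)

# Hypothetical 5% stiffness and 3% mass coefficients of variation.
mean, std = monte_carlo_freq_stats(1.0e6, 0.05, 10.0, 0.03)
print(round(mean, 1), round(std, 2))
```

The PDS method's innovation is in what gets sampled: measured modal quantities rather than "primitive" stiffness/mass inputs like those above, but the propagation step is the same in spirit.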

  1. 1st NASA Electronic Parts Packaging (NEPP) Program Electronic Technology Workshop (ETW)

    NASA Technical Reports Server (NTRS)

    LaBel, Kenneth A.; Sampson, Michael J.

    2010-01-01

NEPP has supported all of NASA for more than 20 years; 7 NASA Centers and JPL actively participate. The NEPP Program focuses on the reliability aspects of electronic devices, with three prime technical areas: Parts (die), Packaging, and Radiation. Alternately, reliability may be viewed as: …

  2. Dutch population specific sex estimation formulae using the proximal femur.

    PubMed

    Colman, K L; Janssen, M C L; Stull, K E; van Rijn, R R; Oostra, R J; de Boer, H H; van der Merwe, A E

    2018-05-01

Sex estimation techniques are frequently applied in forensic anthropological analyses of unidentified human skeletal remains. While morphological sex estimation methods are robust to population differences, the classification accuracy of metric sex estimation methods is population-specific. No metric sex estimation method currently exists for the Dutch population. The purpose of this study is to create Dutch population-specific sex estimation formulae by means of osteometric analyses of the proximal femur. Since the Netherlands lacks a representative contemporary skeletal reference population, 2D plane reconstructions, derived from clinical computed tomography (CT) data, were used as an alternative source for a representative reference sample. The first part of this study assesses the intra- and inter-observer error, or reliability, of twelve measurements of the proximal femur. The technical error of measurement (TEM) and relative TEM (%TEM) were calculated using 26 dry adult femora. In addition, the agreement, or accuracy, between the dry bone and CT-based measurements was determined by percent agreement. Only reliable and accurate measurements were retained for the logistic regression sex estimation formulae; a training set (n=86) was used to create the models while an independent testing set (n=28) was used to validate them. Due to high levels of multicollinearity, only single-variable models were created. Cross-validated classification accuracies ranged from 86% to 92%. These high accuracies indicate that the developed formulae can contribute to the biological profile, and specifically to sex estimation, of unidentified human skeletal remains in the Netherlands. Furthermore, the results indicate that clinical CT data can be a valuable alternative source of data when representative skeletal collections are unavailable. Copyright © 2017 Elsevier B.V. All rights reserved.
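The TEM and %TEM statistics used above have standard closed forms: TEM = sqrt(Σdᵢ² / 2n) over paired trials, with %TEM expressed relative to the grand mean of all measurements. A small sketch with invented example values:

```python
import math

def tem(trial1, trial2):
    """Technical error of measurement for two repeated trials:
    TEM = sqrt(sum(d_i^2) / (2 * n)), d_i = per-specimen difference."""
    n = len(trial1)
    ss = sum((a - b) ** 2 for a, b in zip(trial1, trial2))
    return math.sqrt(ss / (2 * n))

def relative_tem(trial1, trial2):
    # %TEM: TEM as a percentage of the grand mean of all measurements.
    grand_mean = (sum(trial1) + sum(trial2)) / (2 * len(trial1))
    return 100.0 * tem(trial1, trial2) / grand_mean

# Invented example: a femoral measurement (mm) taken twice per bone.
t1 = [44.1, 46.3, 41.8, 49.0]
t2 = [44.3, 46.0, 42.0, 48.6]
print(round(tem(t1, t2), 3), round(relative_tem(t1, t2), 2))  # prints 0.203 0.45
```

Because TEM carries the measurement's own units, %TEM is the form usually screened against a reliability threshold when deciding which measurements to retain, as in the study above.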

  3. Cortical Thickness Estimations of FreeSurfer and the CAT12 Toolbox in Patients with Alzheimer's Disease and Healthy Controls.

    PubMed

    Seiger, Rene; Ganger, Sebastian; Kranz, Georg S; Hahn, Andreas; Lanzenberger, Rupert

    2018-05-15

Automated cortical thickness (CT) measurements are often used to assess gray matter changes in the healthy and diseased human brain. The FreeSurfer software is frequently applied for this type of analysis. The computational anatomy toolbox (CAT12) for SPM, which offers a fast and easy-to-use alternative approach, was recently made available. In this study, we compared region of interest (ROI)-wise CT estimations of the surface-based FreeSurfer 6 (FS6) software and the volume-based CAT12 toolbox for SPM using 44 elderly healthy female control subjects (HC). In addition, these 44 HCs from the cross-sectional analysis and 34 age- and sex-matched patients with Alzheimer's disease (AD) were used to assess the potential of detecting group differences for each method. Finally, a test-retest analysis was conducted using 19 HC subjects. All data were taken from the OASIS database and MRI scans were recorded at 1.5 Tesla. A strong correlation was observed between both methods in terms of ROI mean CT estimates (R² = .83). However, CAT12 delivered significantly higher CT estimations in 32 of the 34 ROIs, indicating a systematic difference between both approaches. Furthermore, both methods were able to reliably detect atrophic brain areas in AD subjects, with the highest decreases in temporal areas. Finally, FS6 as well as CAT12 showed excellent test-retest variability scores. Although CT estimations were systematically higher for CAT12, this study provides evidence that this new toolbox delivers accurate and robust CT estimates and can be considered a fast and reliable alternative to FreeSurfer. © 2018 The Authors. Journal of Neuroimaging published by Wiley Periodicals, Inc. on behalf of American Society of Neuroimaging.

  4. Development of Internet-Based Tasks for the Executive Function Performance Test.

    PubMed

    Rand, Debbie; Lee Ben-Haim, Keren; Malka, Rachel; Portnoy, Sigal

    The Executive Function Performance Test (EFPT) is a reliable and valid performance-based tool to assess executive functions (EFs). This study's objective was to develop and verify two Internet-based tasks for the EFPT. A cross-sectional study assessed the alternate-form reliability of the Internet-based bill-paying and telephone-use tasks in healthy adults and people with subacute stroke (Study 1). It also sought to establish the tasks' criterion validity for assessing EF deficits by correlating performance with that on the Trail Making Test in five groups: healthy young adults, healthy older adults, people with subacute stroke, people with chronic stroke, and young adults with attention deficit hyperactivity disorder (Study 2). The alternate-form reliability and initial construct validity for the Internet-based bill-paying task were verified. Criterion validity was established for both tasks. The Internet-based tasks are comparable to the original EFPT tasks and can be used for assessment of EF deficits. Copyright © 2018 by the American Occupational Therapy Association, Inc.

  5. Evaluation of Alternative Conceptual Models Using Interdisciplinary Information: An Application in Shallow Groundwater Recharge and Discharge

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Bajcsy, P.; Valocchi, A. J.; Kim, C.; Wang, J.

    2007-12-01

    Natural systems are complex, thus extensive data are needed for their characterization. However, data acquisition is expensive; consequently we develop models using sparse, uncertain information. When all uncertainties in the system are considered, the number of alternative conceptual models is large. Traditionally, the development of a conceptual model has relied on subjective professional judgment. Good judgment is based on experience in coordinating and understanding auxiliary information that is correlated with the model but difficult to quantify within the mathematical model. For example, groundwater recharge and discharge (R&D) processes are known to relate to multiple information sources such as soil type, river and lake location, irrigation patterns and land use. Although hydrologists have been trying to understand and model the interaction between each of these information sources and R&D processes, it is extremely difficult to quantify their correlations using a universal approach due to the complexity of the processes and their spatiotemporal distribution and uncertainty. There is currently no single method capable of estimating R&D rates and patterns for all practical applications. Chamberlin (1890) recommended use of "multiple working hypotheses" (alternative conceptual models) for rapid advancement in understanding of applied and theoretical problems. Therefore, cross analyzing R&D rates and patterns from various estimation methods and related field information will likely be superior to using only a single estimation method. We have developed the Pattern Recognition Utility (PRU) to help GIS users recognize spatial patterns from noisy 2D images. This GIS plug-in utility has been applied to help hydrogeologists establish alternative R&D conceptual models in a more efficient way than conventional methods. The PRU uses numerical methods and image processing algorithms to estimate and visualize shallow R&D patterns and rates.
It can provide a fast initial estimate prior to planning labor-intensive and time-consuming field R&D measurements. Furthermore, the Spatial Pattern 2 Learn (SP2L) tool was developed to cross analyze results from the PRU with ancillary field information, such as land coverage, soil type, topographic maps and previous estimates. The learning process of SP2L cross examines each initially recognized R&D pattern with the ancillary spatial dataset, and then calculates a quantifiable reliability index for each R&D map using a supervised machine learning technique called a decision tree. This Java-based software package is capable of generating alternative R&D maps if the user decides to apply certain conditions recognized by the learning process. The reliability indices from SP2L will improve the traditionally subjective approach to initiating conceptual models by providing objectively quantifiable conceptual bases for further probabilistic and uncertainty analyses. Both the PRU and SP2L have been designed to be user-friendly and universal utilities for pattern recognition and learning to improve model predictions from sparse measurements by computer-assisted integration of spatially dense geospatial image data and machine learning of model dependencies.

  6. 10 CFR 503.31 - Lack of alternate fuel supply for the first 10 years of useful life.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... useful life. 503.31 Section 503.31 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS NEW FACILITIES... useful life. (a) Eligibility. Section 212(a)(1)(A)(i) of the Act provides for a permanent exemption due to lack of an adequate and reliable supply of alternate fuel within the first 10 years of useful life...

  7. 10 CFR 503.31 - Lack of alternate fuel supply for the first 10 years of useful life.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... useful life. 503.31 Section 503.31 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS NEW FACILITIES... useful life. (a) Eligibility. Section 212(a)(1)(A)(i) of the Act provides for a permanent exemption due to lack of an adequate and reliable supply of alternate fuel within the first 10 years of useful life...

  8. 10 CFR 503.31 - Lack of alternate fuel supply for the first 10 years of useful life.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... useful life. 503.31 Section 503.31 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS NEW FACILITIES... useful life. (a) Eligibility. Section 212(a)(1)(A)(i) of the Act provides for a permanent exemption due to lack of an adequate and reliable supply of alternate fuel within the first 10 years of useful life...

  9. 10 CFR 503.31 - Lack of alternate fuel supply for the first 10 years of useful life.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... useful life. 503.31 Section 503.31 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS NEW FACILITIES... useful life. (a) Eligibility. Section 212(a)(1)(A)(i) of the Act provides for a permanent exemption due to lack of an adequate and reliable supply of alternate fuel within the first 10 years of useful life...

  10. 10 CFR 503.31 - Lack of alternate fuel supply for the first 10 years of useful life.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... useful life. 503.31 Section 503.31 Energy DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS NEW FACILITIES... useful life. (a) Eligibility. Section 212(a)(1)(A)(i) of the Act provides for a permanent exemption due to lack of an adequate and reliable supply of alternate fuel within the first 10 years of useful life...

  11. Payload operations control center network (POCCNET) systems definition phase study report

    NASA Technical Reports Server (NTRS)

    Desjardins, R.

    1978-01-01

    The results of the studies performed during the systems definition phase of POCCNET are presented. The concept of POCCNET as a system of standard POCCs is described and an analysis of system requirements is also included. Alternative systems concepts were evaluated as well as various methods for development of reliable reusable software. A number of POCC application areas, such as command management, on board computer support, and simulation were also studied. Other areas of investigation included the operation of POCCNET systems, the facility requirements and usage.

  12. Bioassays as one of the Green Chemistry tools for assessing environmental quality: A review.

    PubMed

    Wieczerzak, M; Namieśnik, J; Kudłak, B

    2016-09-01

    For centuries, mankind has contributed to irreversible environmental changes, but due to the modern science of recent decades, scientists are able to assess the scale of this impact. The introduction of laws and standards to ensure environmental cleanliness requires comprehensive environmental monitoring, which should also meet the requirements of Green Chemistry. The broad spectrum of Green Chemistry principle applications should also include all of the techniques and methods of pollutant analysis and environmental monitoring. The classical methods of chemical analyses do not always match the twelve principles of Green Chemistry, and they are often expensive and employ toxic and environmentally unfriendly solvents in large quantities. These solvents can generate hazardous and toxic waste while consuming large volumes of resources. Therefore, there is a need to develop reliable techniques that would not only meet the requirements of Green Analytical Chemistry, but they could also complement and sometimes provide an alternative to conventional classical analytical methods. These alternatives may be found in bioassays. Commercially available certified bioassays often come in the form of ready-to-use toxkits, and they are easy to use and relatively inexpensive in comparison with certain conventional analytical methods. The aim of this study is to provide evidence that bioassays can be a complementary alternative to classical methods of analysis and can fulfil Green Analytical Chemistry criteria. The test organisms discussed in this work include single-celled organisms, such as cell lines, fungi (yeast), and bacteria, and multicellular organisms, such as invertebrate and vertebrate animals and plants. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. TDRSS telecommunications system, PN code analysis

    NASA Technical Reports Server (NTRS)

    Dixon, R.; Gold, R.; Kaiser, F.

    1976-01-01

    The pseudo noise (PN) codes required to support the TDRSS telecommunications services are analyzed and the impact of alternate coding techniques on the user transponder equipment, the TDRSS equipment, and all factors that contribute to the acquisition and performance of these telecommunication services is assessed. Possible alternatives to the currently proposed hybrid FH/direct sequence acquisition procedures are considered and compared relative to acquisition time, implementation complexity, operational reliability, and cost. The hybrid FH/direct sequence technique is analyzed and rejected in favor of a recommended approach which minimizes acquisition time and user transponder complexity while maximizing probability of acquisition and overall link reliability.

  14. Smartphone and Universal Goniometer for Measurement of Elbow Joint Motions: A Comparative Study

    PubMed Central

    Behnoush, Behnam; Tavakoli, Nasim; Bazmi, Elham; Nateghi Fard, Fariborz; Pourgharib Shahi, Mohammad Hossein; Okazi, Arash; Mokhtari, Tahmineh

    2016-01-01

    Background The universal goniometer (UG) is commonly used as the standard method to evaluate range of motion (ROM) of joints. It has some limitations, such as requiring both of the examiner's hands, which can lead to hand instability and measurement error. Smartphone use, by contrast, has been increasing because of its ease of application. Objectives The study was designed to compare a smartphone inclinometer-based app and the UG in evaluation of elbow ROM. Materials and Methods The maximum ROM of the elbow in flexion and in pronation and supination of the forearm was examined in 60 healthy volunteers with the UG and a smartphone. Data were analyzed using SPSS (ver. 16) software, and appropriate statistical tests were applied, such as the paired t-test, ICC and Bland-Altman plots. Results The results of this study showed high reliability and validity of the smartphone relative to the UG, with ICC > 0.95. The highest reliability for both methods was in elbow supination and the lowest was in elbow flexion (0.84). Conclusions Smartphones, given their ease of access and use for both the physician and the patient, may be a good alternative to the UG. PMID:27625754
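
The Bland-Altman analysis mentioned in the Methods reduces to a mean bias and 95% limits of agreement computed from the paired differences. A minimal sketch, with hypothetical ROM readings rather than the study's data:

```python
import statistics

def bland_altman_limits(method_a, method_b):
    """Mean bias and 95% limits of agreement: bias ± 1.96 · SD of paired differences."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample standard deviation of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical elbow-flexion ROM readings (degrees): goniometer vs. smartphone app
ug_deg  = [140, 138, 145, 142, 139, 144]
app_deg = [139, 139, 144, 143, 138, 145]
bias, loa_low, loa_high = bland_altman_limits(ug_deg, app_deg)
```

If the bias is near zero and most differences fall within the limits of agreement, the two instruments can reasonably be treated as interchangeable for clinical use.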

  15. A Novel TRM Calculation Method by Probabilistic Concept

    NASA Astrophysics Data System (ADS)

    Audomvongseree, Kulyos; Yokoyama, Akihiko; Verma, Suresh Chand; Nakachi, Yoshiki

    In the new competitive environment, it has become possible for third parties to access transmission facilities. Consequently, to efficiently manage the utilization of the transmission network, a new definition of Available Transfer Capability (ATC) has been proposed. According to the North American Electric Reliability Council (NERC) definition, ATC depends on several parameters, i.e. Total Transfer Capability (TTC), Transmission Reliability Margin (TRM), and Capacity Benefit Margin (CBM). This paper focuses on the calculation of TRM, a security margin reserved against uncertainty in system conditions. A probabilistic TRM calculation method is proposed in this paper. Based on the modeling of load forecast error and error in transmission line limits, various cases of transmission transfer capability and their related probabilistic nature can be calculated. By applying the proposed risk-analysis concept, the appropriate required amount of TRM can be obtained. The objective of this research is to provide realistic information on the actual capability of the network, which may give system operators an alternative basis for making appropriate decisions in the competitive market. The advantages of the proposed method are illustrated by application to the IEEJ-WEST10 model system.
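
The general idea of a probabilistic TRM, reserving a margin that covers load-forecast and line-limit uncertainty at a chosen risk level, can be caricatured with a Monte Carlo sketch. Everything below (the error model, the numbers, the function name) is an illustrative assumption, not the authors' formulation:

```python
import random

def trm_by_risk(ttc, load_err_sd, limit_err_sd, risk=0.05, trials=20000, seed=7):
    """Sample uncertain operating conditions, find the transfer capability that is
    achieved in (1 - risk) of the trials, and reserve the shortfall from TTC as TRM."""
    rng = random.Random(seed)
    capability = sorted(
        ttc - abs(rng.gauss(0.0, load_err_sd)) - abs(rng.gauss(0.0, limit_err_sd))
        for _ in range(trials)
    )
    secure = capability[int(risk * trials)]  # capability met with probability 1 - risk
    return ttc - secure

margin = trm_by_risk(ttc=1000.0, load_err_sd=30.0, limit_err_sd=20.0)
```

Tightening the acceptable risk enlarges the reserved margin, which is exactly the trade-off that an explicit risk analysis lets the operator choose.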

  16. Using DNA chips for identification of tephritid pest species.

    PubMed

    Chen, Yen-Hou; Liu, Lu-Yan; Tsai, Wei-Huang; Haymer, David S; Lu, Kuang-Hui

    2014-08-01

    The ability correctly to identify species in a rapid and reliable manner is critical in many situations. For insects in particular, the primary tools for such identification rely on adult-stage morphological characters. For a number of reasons, however, there is a clear need for alternatives. This paper reports on the development of a new method employing DNA biochip technology for the identification of pest species within the family Tephritidae. The DNA biochip developed and tested here quickly and efficiently identifies and discriminates between several tephritid species, except for some that are members of a complex of closely related taxa and that may in fact not represent distinct biological species. The use of these chips offers a number of potential advantages over current methods. Results can be obtained in less than 5 h using material from any stage of the life cycle and with greater sensitivity than other methods currently available. This technology provides a novel tool for the rapid and reliable identification of several major pest species that may be intercepted in imported fruits or other commodities. The existing chips can also easily be expanded to incorporate additional markers and species as needed. © 2013 Society of Chemical Industry.

  17. Dynamics of subcellular compartmentalization of steroid receptors in living cells as a strategic screening method to determine the biological impact of suspected endocrine disruptors.

    PubMed

    Tyagi, Rakesh Kumar

    2003-04-01

    Although the screening methods being used to identify potential endocrine disruptors have generated a wealth of information, a search for alternative combinations of methods is still needed to overcome experimental artefacts. There are no generally accepted or validated screening methods for monitoring and studying the impact of environmental endocrine disruptors. Also, no single assay can accurately predict all the deleterious effects of endocrine disruptors. For this reason, various environmental protection agencies, mainly European and US, have urged that the battery of tests in current use be assessed for its adequacy in detecting the effects of endocrine disruptors. Some details about endocrine disruptors and screening programs can be found at http://www.epa.gov/scipoly/oscpendo/whatis.htm. Several studies in recent years have used fusion proteins between steroid receptors (estrogen, androgen, progesterone, etc.) and green fluorescent protein (GFP), which can serve as an alternative, potent screening method to study the intracellular dynamics of receptors in living cells. An approach employing nucleocytoplasmic trafficking of steroid receptors as a readout of the response to potential xenobiotic chemicals in living cells may prove promising in terms of being direct, fast, reliable, simple and inexpensive. Copyright 2003 Elsevier Science Ltd.

  18. Speech-driven environmental control systems--a qualitative analysis of users' perceptions.

    PubMed

    Judge, Simon; Robertson, Zoë; Hawley, Mark; Enderby, Pam

    2009-05-01

    To explore users' experiences and perceptions of speech-driven environmental control systems (SPECS) as part of a larger project aiming to develop a new SPECS. The motivation for this part of the project was to add to the evidence base for the use of SPECS and to determine the key design specifications for a new speech-driven system from a user's perspective. Semi-structured interviews were conducted with 12 users of SPECS from around the United Kingdom. These interviews were transcribed and analysed using a qualitative method based on framework analysis. Reliability is the main influence on the use of SPECS. All the participants gave examples of occasions when their speech-driven system was unreliable; in some instances, this unreliability was reported as not being a problem (e.g., for changing television channels); however, it was perceived as a problem for more safety critical functions (e.g., opening a door). Reliability was cited by participants as the reason for using a switch-operated system as back up. Benefits of speech-driven systems focused on speech operation enabling access when other methods were not possible; quicker operation and better aesthetic considerations. Overall, there was a perception of increased independence from the use of speech-driven environmental control. In general, speech was considered a useful method of operating environmental controls by the participants interviewed; however, their perceptions regarding reliability often influenced their decision to have backup or alternative systems for certain functions.

  19. Benchmarking DFT and semi-empirical methods for a reliable and cost-efficient computational screening of benzofulvene derivatives as donor materials for small-molecule organic solar cells.

    PubMed

    Tortorella, Sara; Talamo, Maurizio Mastropasqua; Cardone, Antonio; Pastore, Mariachiara; De Angelis, Filippo

    2016-02-24

    A systematic computational investigation on the optical properties of a group of novel benzofulvene derivatives (Martinelli 2014 Org. Lett. 16 3424-7), proposed as possible donor materials in small molecule organic photovoltaic (smOPV) devices, is presented. A benchmark evaluation against experimental results on the accuracy of different exchange and correlation functionals and semi-empirical methods in predicting both reliable ground state equilibrium geometries and electronic absorption spectra is carried out. The benchmark of the geometry optimization level indicated that the best agreement with x-ray data is achieved by using the B3LYP functional. Concerning the optical gap prediction, we found that, among the employed functionals, MPW1K provides the most accurate excitation energies over the entire set of benzofulvenes. Similarly reliable results were also obtained for range-separated hybrid functionals (CAM-B3LYP and wB97XD) and for global hybrid methods incorporating a large amount of non-local exchange (M06-2X and M06-HF). Density functional theory (DFT) hybrids with a moderate (about 20-30%) extent of Hartree-Fock exchange (HFexc) (PBE0, B3LYP and M06) were also found to deliver HOMO-LUMO energy gaps which compare well with the experimental absorption maxima, thus representing a valuable alternative for a prompt and predictive estimation of the optical gap. The possibility of using completely semi-empirical approaches (AM1/ZINDO) is also discussed.

  20. Regional reliability of quantitative signal targeting with alternating radiofrequency (STAR) labeling of arterial regions (QUASAR).

    PubMed

    Tatewaki, Yasuko; Higano, Shuichi; Taki, Yasuyuki; Thyreau, Benjamin; Murata, Takaki; Mugikura, Shunji; Ito, Daisuke; Takase, Kei; Takahashi, Shoki

    2014-01-01

    Quantitative signal targeting with alternating radiofrequency labeling of arterial regions (QUASAR) is a recent spin labeling technique that could improve the reliability of brain perfusion measurements. Although it is considered reliable for measuring gray matter as a whole, it has never been evaluated regionally. Here we assessed this regional reliability. Using a 3-Tesla Philips Achieva whole-body system, we scanned 10 healthy volunteers four times, in two sessions 2 weeks apart, to obtain QUASAR images. We computed perfusion images and ran a voxel-based analysis within all brain structures. We also calculated mean regional cerebral blood flow (rCBF) within regions of interest configured for each arterial territory distribution. The mean CBF over whole gray matter was 37.74 with an intraclass correlation coefficient (ICC) of .70. In white matter, it was 13.94 with an ICC of .30. Voxel-wise ICC and coefficient-of-variation maps showed relatively lower reliability in watershed areas and white matter, especially in deeper white matter. The absolute mean rCBF values were consistent with those reported from PET, as was the relatively low variability across different feeding arteries. Thus, QUASAR reliability for regional perfusion is high within gray matter, but uncertain within white matter. © 2014 The Authors. Journal of Neuroimaging published by the American Society of Neuroimaging.
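
The ICC values reported above can be obtained from a one-way random-effects ANOVA decomposition of repeated measurements. A minimal sketch with hypothetical test-retest values (not the study's data):

```python
import statistics

def icc_oneway(ratings):
    """One-way random-effects ICC(1,1): (MSB - MSW) / (MSB + (k-1)·MSW),
    for n subjects each measured k times (rows = subjects)."""
    n, k = len(ratings), len(ratings[0])
    subject_means = [statistics.mean(r) for r in ratings]
    grand = statistics.mean(subject_means)  # valid because every subject has k values
    msb = k * sum((m - grand) ** 2 for m in subject_means) / (n - 1)
    msw = sum((v - m) ** 2 for r, m in zip(ratings, subject_means) for v in r) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical rCBF values, two sessions per subject
scans = [[35.0, 36.0], [42.0, 41.0], [28.0, 29.0], [50.0, 49.0]]
icc = icc_oneway(scans)
```

High between-subject spread combined with small within-subject differences drives the ICC toward 1; the reverse pattern (as in deep white matter here) drives it toward 0.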

  1. Correcting Fallacies in Validity, Reliability, and Classification

    ERIC Educational Resources Information Center

    Sijtsma, Klaas

    2009-01-01

    This article reviews three topics from test theory that continue to raise discussion and controversy and capture test theorists' and constructors' interest. The first topic concerns the discussion of the methodology of investigating and establishing construct validity; the second topic concerns reliability and its misuse, alternative definitions…

  2. Evaluation of the reliability of maize reference assays for GMO quantification.

    PubMed

    Papazova, Nina; Zhang, David; Gruden, Kristina; Vojvoda, Jana; Yang, Litao; Buh Gasparic, Meti; Blejec, Andrej; Fouilloux, Stephane; De Loose, Marc; Taverniers, Isabel

    2010-03-01

    A reliable PCR reference assay for relative genetically modified organism (GMO) quantification must be specific for the target taxon and amplify uniformly across the commercialised varieties within the considered taxon. Different reference assays for maize (Zea mays L.) are used in official methods for GMO quantification. In this study, we evaluated the reliability of eight existing maize reference assays, four of which are used in combination with an event-specific polymerase chain reaction (PCR) assay validated and published by the Community Reference Laboratory (CRL). We analysed the nucleotide sequence variation in the target genomic regions in a broad range of transgenic and conventional varieties and lines: MON 810 varieties cultivated in Spain and conventional varieties of various geographical origins and breeding histories. In addition, the reliability of the assays was evaluated based on their PCR amplification performance. A single base pair substitution, corresponding to a single nucleotide polymorphism (SNP) reported in an earlier study, was observed in the forward primer of one of the studied alcohol dehydrogenase 1 (Adh1) (70) assays in a large number of varieties. The SNP presence is consistent with the poor PCR performance observed for this assay across the tested varieties. The obtained data show that the Adh1 (70) assay used in the official CRL NK603 assay is unreliable. Based on our results from both the nucleotide stability study and the PCR performance test, we can conclude that the Adh1 (136) reference assay (T25 and Bt11 assays) as well as the tested high mobility group protein gene assay, which also form part of CRL methods for quantification, are highly reliable. Despite the observed uniformity in the nucleotide sequence of the invertase gene assay, the PCR performance test reveals that this target sequence might occur in more than one copy.
Finally, although currently not forming a part of official quantification methods, zein and SSIIb assays are found to be highly reliable in terms of nucleotide stability and PCR performance and are proposed as good alternative targets for a reference assay for maize.

  3. Novel Method for Reliable Identification of Siccibacter and Franconibacter Strains: from “Pseudo-Cronobacter” to New Enterobacteriaceae Genera

    PubMed Central

    Vlach, Jiří; Junková, Petra; Karamonová, Ludmila; Blažková, Martina; Fukal, Ladislav

    2017-01-01

    ABSTRACT In the last decade, strains of the genera Franconibacter and Siccibacter have been misclassified as first Enterobacter and later Cronobacter. Because Cronobacter is a serious foodborne pathogen that affects premature neonates and elderly individuals, such misidentification may not only falsify epidemiological statistics but also lead to tests of powdered infant formula or other foods giving false results. Currently, the main ways of identifying Franconibacter and Siccibacter strains are by biochemical testing or by sequencing of the fusA gene as part of Cronobacter multilocus sequence typing (MLST), but in relation to these strains the former is generally highly difficult and unreliable while the latter remains expensive. To address this, we developed a fast, simple, and most importantly, reliable method for Franconibacter and Siccibacter identification based on intact-cell matrix-assisted laser desorption ionization–time of flight mass spectrometry (MALDI-TOF MS). Our method integrates the following steps: data preprocessing using mMass software; principal-component analysis (PCA) for the selection of mass spectrum fingerprints of Franconibacter and Siccibacter strains; optimization of the Biotyper database settings for the creation of main spectrum projections (MSPs). This methodology enabled us to create an in-house MALDI MS database that extends the current MALDI Biotyper database by including Franconibacter and Siccibacter strains. Finally, we verified our approach using seven previously unclassified strains, all of which were correctly identified, thereby validating our method. IMPORTANCE We show that the majority of methods currently used for the identification of Franconibacter and Siccibacter bacteria are not able to properly distinguish these strains from those of Cronobacter. While sequencing of the fusA gene as part of Cronobacter MLST remains the most reliable such method, it is highly expensive and time-consuming. 
Here, we demonstrate a cost-effective and reliable alternative that correctly distinguishes between Franconibacter, Siccibacter, and Cronobacter bacteria and identifies Franconibacter and Siccibacter at the species level. Using intact-cell MALDI-TOF MS, we extend the current MALDI Biotyper database with 11 Franconibacter and Siccibacter MSPs. In addition, the use of our approach is likely to lead to a more reliable identification scheme for Franconibacter and Siccibacter strains and, consequently, a more trustworthy epidemiological picture of their involvement in disease. PMID:28455327

  4. The contraception needs of the perimenopausal woman.

    PubMed

    Hardman, Sarah M R; Gebbie, Ailsa E

    2014-08-01

    Perimenopausal women have low fertility but must still be advised to use contraception until natural sterility is reached if they are sexually active. Patterns of contraceptive use vary in different countries worldwide. Long-acting reversible contraceptive methods offer reliable contraception that may be an alternative to sterilisation. Hormonal methods confer significant non-contraceptive benefits, and each individual woman should weigh up the benefits and risks of a particular method. No method of contraception is contraindicated by age alone, although combined hormonal contraception and injectable progestogens are not recommended for women over the age of 50 years. The intrauterine system has particular advantages as a low-dose method of effective hormonal contraception, which also offers control of menstrual dysfunction and endometrial protection in women requiring oestrogen replacement. Condoms are recommended for personal protection against sexually transmitted infections in new relationships. Standard hormone replacement therapy is not a method of contraception. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Hypothesis Testing Using Factor Score Regression

    PubMed Central

    Devlieger, Ines; Mayer, Axel; Rosseel, Yves

    2015-01-01

    In this article, an overview is given of four methods to perform factor score regression (FSR), namely regression FSR, Bartlett FSR, the bias avoiding method of Skrondal and Laake, and the bias correcting method of Croon. The bias correcting method is extended to include a reliable standard error. The four methods are compared with each other and with structural equation modeling (SEM) by using analytic calculations and two Monte Carlo simulation studies to examine their finite sample characteristics. Several performance criteria are used, such as the bias using the unstandardized and standardized parameterization, efficiency, mean square error, standard error bias, type I error rate, and power. The results show that the bias correcting method, with the newly developed standard error, is the only suitable alternative for SEM. While it has a higher standard error bias than SEM, it has a comparable bias, efficiency, mean square error, power, and type I error rate. PMID:29795886
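
All four FSR variants begin by computing factor scores from the measurement model. Below is a minimal numpy sketch of regression-method (Thomson) scores for a one-factor model; the loadings and simulated data are illustrative assumptions, and regressing naively on such scores exhibits the bias that Croon's correction is designed to remove:

```python
import numpy as np

def regression_factor_scores(y, loadings, uniquenesses, phi=1.0):
    """Regression (Thomson) factor scores for a one-factor model:
    f_hat = phi · lam' Sigma^{-1} (y - y_bar), with Sigma = phi·lam lam' + Psi."""
    lam = np.asarray(loadings, float)[:, None]
    sigma = phi * lam @ lam.T + np.diag(np.asarray(uniquenesses, float))
    weights = phi * np.linalg.solve(sigma, lam)  # Sigma^{-1} lam, solved stably
    y = np.asarray(y, float)
    return (y - y.mean(axis=0)) @ weights

# Simulated one-factor data: y_ij = lam_j · f_i + e_ij, unit total variance
rng = np.random.default_rng(0)
lam = np.array([0.8, 0.7, 0.6])
f = rng.standard_normal(500)
y = np.outer(f, lam) + rng.standard_normal((500, 3)) * np.sqrt(1 - lam**2)
scores = regression_factor_scores(y, lam, 1 - lam**2)
```

The scores correlate strongly with the generating factor but have shrunken variance, which is why downstream regression coefficients computed from them are biased without a correction such as Croon's.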

  6. Language evolution and human history: what a difference a date makes.

    PubMed

    Gray, Russell D; Atkinson, Quentin D; Greenhill, Simon J

    2011-04-12

    Historical inference is at its most powerful when independent lines of evidence can be integrated into a coherent account. Dating linguistic and cultural lineages can potentially play a vital role in the integration of evidence from linguistics, anthropology, archaeology and genetics. Unfortunately, although the comparative method in historical linguistics can provide a relative chronology, it cannot provide absolute date estimates and an alternative approach, called glottochronology, is fundamentally flawed. In this paper we outline how computational phylogenetic methods can reliably estimate language divergence dates and thus help resolve long-standing debates about human prehistory ranging from the origin of the Indo-European language family to the peopling of the Pacific.

  7. Language evolution and human history: what a difference a date makes

    PubMed Central

    Gray, Russell D.; Atkinson, Quentin D.; Greenhill, Simon J.

    2011-01-01

    Historical inference is at its most powerful when independent lines of evidence can be integrated into a coherent account. Dating linguistic and cultural lineages can potentially play a vital role in the integration of evidence from linguistics, anthropology, archaeology and genetics. Unfortunately, although the comparative method in historical linguistics can provide a relative chronology, it cannot provide absolute date estimates and an alternative approach, called glottochronology, is fundamentally flawed. In this paper we outline how computational phylogenetic methods can reliably estimate language divergence dates and thus help resolve long-standing debates about human prehistory ranging from the origin of the Indo-European language family to the peopling of the Pacific. PMID:21357231

  8. Practical implementation of spectral-intensity dispersion-canceled optical coherence tomography with artifact suppression

    NASA Astrophysics Data System (ADS)

    Shirai, Tomohiro; Friberg, Ari T.

    2018-04-01

    Dispersion-canceled optical coherence tomography (OCT) based on spectral intensity interferometry was devised as a classical counterpart of quantum OCT to enhance the basic performance of conventional OCT. In this paper, we demonstrate experimentally that an alternative method of realizing this kind of OCT by means of two optical fiber couplers and a single spectrometer is a more practical and reliable option than previously proposed methods. Furthermore, we develop a recipe for reducing multiple artifacts simultaneously on the basis of simple averaging and verify experimentally that it works: all the artifacts are mitigated effectively and only the true signals carrying structural information about the sample survive.

  9. Capture approximations beyond a statistical quantum mechanical method for atom-diatom reactions

    NASA Astrophysics Data System (ADS)

    Barrios, Lizandra; Rubayo-Soneira, Jesús; González-Lezana, Tomás

    2016-03-01

    Statistical techniques constitute useful approaches to investigate atom-diatom reactions mediated by insertion dynamics which involves complex-forming mechanisms. Different capture schemes based on energy considerations regarding the specific diatom rovibrational states are suggested to evaluate the corresponding probabilities of formation of such collision species between reactants and products in an attempt to test reliable alternatives for computationally demanding processes. These approximations are tested in combination with a statistical quantum mechanical method for the S + H2(v = 0 ,j = 1) → SH + H and Si + O2(v = 0 ,j = 1) → SiO + O reactions, where this dynamical mechanism plays a significant role, in order to probe their validity.

  10. Characterization of perovskite solar cells: Towards a reliable measurement protocol

    NASA Astrophysics Data System (ADS)

    Zimmermann, Eugen; Wong, Ka Kan; Müller, Michael; Hu, Hao; Ehrenreich, Philipp; Kohlstädt, Markus; Würfel, Uli; Mastroianni, Simone; Mathiazhagan, Gayathri; Hinsch, Andreas; Gujar, Tanaji P.; Thelakkat, Mukundan; Pfadler, Thomas; Schmidt-Mende, Lukas

    2016-09-01

    Lead halide perovskite solar cells have shown a tremendous rise in power conversion efficiency, with reported record efficiencies of over 20% making this material very promising as a low cost alternative to conventional inorganic solar cells. However, due to a "hysteretic" behaviour of varying severity during current density-voltage measurements, which depends strongly on scan rate, device and measurement history, preparation method, device architecture, etc., commonly used solar cell measurements do not give reliable or even reproducible results. With a view to commercialization, and to make results comparable across devices and laboratories, it is necessary to establish a measurement protocol which gives reproducible results. Therefore, we compare device characteristics derived from standard current density-voltage measurements with stabilized values obtained from an adaptive tracking of the maximum power point and the open circuit voltage, as well as characteristics extracted from time resolved current density-voltage measurements. Our results provide insight into the challenges of a correct determination of device performance and propose a measurement protocol for a reliable characterisation which is easy to implement and has been tested on varying perovskite solar cells fabricated in different laboratories.

  11. The Development of a Motor-Free Short-Form of the Wechsler Intelligence Scale for Children-Fifth Edition.

    PubMed

    Piovesana, Adina M; Harrison, Jessica L; Ducat, Jacob J

    2017-12-01

    This study aimed to develop a motor-free short-form of the Wechsler Intelligence Scale for Children-Fifth Edition (WISC-V) that allows clinicians to estimate the Full Scale Intelligence Quotients of youths with motor impairments. Using the reliabilities and intercorrelations of six WISC-V motor-free subtests, psychometric methodologies were applied to develop look-up tables for four Motor-free Short-form indices: Verbal Comprehension Short-form, Perceptual Reasoning Short-form, Working Memory Short-form, and a Motor-free Intelligence Quotient. Index-level discrepancy tables were developed using the same methods to allow clinicians to statistically compare visual, verbal, and working memory abilities. The short-form indices had excellent reliabilities (r = .92-.97), comparable to the original WISC-V. This motor-free short-form of the WISC-V is a reliable alternative for the assessment of intellectual functioning in youths with motor impairments. Clinicians are provided with user-friendly look-up tables, index-level discrepancy tables, and base rates, displayed similarly to those in the WISC-V manuals, to enable interpretation of assessment results.
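
    The psychometric machinery behind short-form indices of this kind rests on the reliability of an equally weighted composite of standardized subtests. A hedged sketch of the standard composite-reliability formula (Tellegen-Briggs style); the subtest numbers below are invented for illustration, not WISC-V values:

```python
import numpy as np

def composite_reliability(reliabilities, corr):
    """Reliability of an equally weighted sum of k standardized subtests.

    reliabilities : length-k array of subtest reliabilities r_ii
    corr          : k x k subtest intercorrelation matrix
    """
    r = np.asarray(reliabilities, dtype=float)
    R = np.asarray(corr, dtype=float)
    k = r.size
    off = R.sum() - np.trace(R)      # 2 * sum of intercorrelations r_ij, i < j
    # reliable (true-score) variance of the composite over its total variance
    return (r.sum() + off) / (k + off)

# Illustrative two-subtest short form: two reliable, moderately
# correlated subtests yield a composite more reliable than either alone.
rel = composite_reliability([0.90, 0.90], [[1.0, 0.5], [0.5, 1.0]])
```

With these invented inputs the composite reliability works out to about .93, illustrating how six subtests in the high .80s to low .90s can support index reliabilities of .92-.97.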

  12. A novel approach on accelerated ageing towards reliability optimization of high concentration photovoltaic cells

    NASA Astrophysics Data System (ADS)

    Tsanakas, John A.; Jaffre, Damien; Sicre, Mathieu; Elouamari, Rachid; Vossier, Alexis; de Salins, Jean-Edouard; Bechou, Laurent; Levrier, Bruno; Perona, Arnaud; Dollet, Alain

    2014-09-01

    This paper presents a preliminary study of a novel approach proposed for highly accelerated ageing and reliability optimization of high concentrating photovoltaic (HCPV) cells and assemblies. The approach aims to overcome several limitations of some current accelerated ageing tests (AAT) adopted to date, proposing the use of an alternative experimental set-up for performing faster and more realistic thermal cycles, under real sun, without the involvement of an environmental chamber. The study also includes specific characterization techniques, before and after each AAT sequence, which respectively provide the initial and final diagnosis of the condition of the tested sample. The acquired data from these diagnostic/characterization methods are then used as indices to determine both quantitatively and qualitatively the severity of degradation and, thus, the ageing level for each tested HCPV assembly or cell sample. The ultimate goal of such "initial diagnosis - AAT - final diagnosis" sequences is to provide the basis for future work on the reliability analysis of the main degradation mechanisms and confident prediction of failure propagation in HCPV cells, by means of acceleration factor (AF) and mean-time-to-failure (MTTF) estimations.

  13. Scientific rigour in qualitative research--examples from a study of women's health in family practice.

    PubMed

    Hamberg, K; Johansson, E; Lindgren, G; Westman, G

    1994-06-01

    The increase in qualitative research in family medicine raises a demand for critical discussions about design, methods and conclusions. This article shows how scientific claims for truthful findings and neutrality can be assessed. Established concepts such as validity, reliability, objectivity and generalization cannot be used in qualitative research. Alternative criteria for scientific rigour, initially introduced by Lincoln and Guba, are presented: credibility, dependability, confirmability and transferability. These criteria have been applied to a research project, a qualitative study with in-depth interviews with female patients suffering from chronic pain in the locomotor system. The interview data were analysed on the basis of grounded theory. The proposed indicators for scientific rigour were shown to be useful when applied to the research project. Several examples are given. Difficulties in the use of the alternative criteria are also discussed.

  14. Integrating alternative splicing detection into gene prediction.

    PubMed

    Foissac, Sylvain; Schiex, Thomas

    2005-02-10

    Alternative splicing (AS) is now considered a major actor in transcriptome/proteome diversity and cannot be neglected in the annotation process of a new genome. Despite considerable progress in terms of accuracy in computational gene prediction, the ability to reliably predict AS variants when there is local experimental evidence of them remains an open challenge for gene finders. We have used a new integrative approach that allows AS detection to be incorporated into ab initio gene prediction. This method relies on the analysis of genomically aligned transcript sequences (ESTs and/or cDNAs), and has been implemented in the dynamic programming algorithm of the graph-based gene finder EuGENE. Given a genomic sequence and a set of aligned transcripts, this new version identifies the set of transcripts carrying evidence of alternative splicing events, and provides, in addition to the classical optimal gene prediction, alternative optimal predictions (among those which are consistent with the AS events detected). This allows for multiple annotations of a single gene such that each predicted variant is supported by transcript evidence (though not necessarily with full-length coverage). This automatic combination of experimental data analysis and ab initio gene finding offers an ideal integration of alternatively spliced gene prediction inside a single annotation pipeline.

  15. Can reliable sage-grouse lek counts be obtained using aerial infrared technology

    USGS Publications Warehouse

    Gillette, Gifford L.; Coates, Peter S.; Petersen, Steven; Romero, John P.

    2013-01-01

    More effective methods for counting greater sage-grouse (Centrocercus urophasianus) are needed to better assess population trends through enumeration or location of new leks. We describe an aerial infrared technique for conducting sage-grouse lek counts and compare this method with conventional ground-based lek count methods. During the breeding period in 2010 and 2011, we surveyed leks from fixed-wing aircraft using cryogenically cooled mid-wave infrared cameras and surveyed the same leks on the same day from the ground following a standard lek count protocol. We did not detect significant differences in lek counts between surveying techniques. These findings suggest that using a cryogenically cooled mid-wave infrared camera from an aerial platform to conduct lek surveys is an effective alternative to conventional ground-based methods, but further research is needed. We discuss multiple advantages of aerial infrared surveys, including the ability to count leks in remote areas, capture greater spatial variation, and increase the number of leks counted per season. Aerial infrared lek counts may be a valuable wildlife management tool that frees time and resources for other conservation efforts. Opportunities exist for wildlife professionals to refine and apply aerial infrared techniques to wildlife monitoring programs because of the increasing reliability and affordability of this technology.

  16. Grasp specific and user friendly interface design for myoelectric hand prostheses.

    PubMed

    Mohammadi, Alireza; Lavranos, Jim; Howe, Rob; Choong, Peter; Oetomo, Denny

    2017-07-01

    This paper presents the design and characterisation of a hand prosthesis and its user interface, focusing on performing the most commonly used grasps in activities of daily living (ADLs). Since the operation of a multi-articulated powered hand prosthesis is difficult to learn and master, there is a significant rate of abandonment by amputees in preference for simpler devices. In doing so, amputees choose to live with fewer features in a prosthesis that more reliably performs the basic operations. In this paper, we look simultaneously at a hand prosthesis design method that aims for a small number of grasps, a low-complexity user interface, and an alternative to the current use of EMG for preshape selection through the use of a simple button, to enable amputees to reach and execute the intended hand movements intuitively, quickly and reliably. An experiment is reported at the end of the paper comparing the speed and accuracy with which able-bodied naive subjects are able to select the intended preshapes using a simplified EMG method and a simple button. The button was significantly superior in the speed of successful task completion and marginally superior in accuracy (success of first attempt).

  17. High Temperature Irradiation-Resistant Thermocouple Performance Improvements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joshua Daw; Joy Rempe; Darrell Knudson

    2009-04-01

    Traditional methods for measuring temperature in-pile degrade at temperatures above 1100 °C. To address this instrumentation need, the Idaho National Laboratory (INL) developed and evaluated the performance of a high temperature irradiation-resistant thermocouple (HTIR-TC) that contains alloys of molybdenum and niobium. Data from high temperature (up to 1500 °C), long duration (up to 4000 hours) tests and on-going irradiations at INL’s Advanced Test Reactor demonstrate the superiority of these sensors to commercially-available thermocouples. However, several options have been identified that could further enhance their reliability, reduce their production costs, and allow their use in a wider range of operating conditions. This paper presents results from on-going INL/University of Idaho (UI) efforts to improve HTIR-TC ductility, reliability, and resolution by investigating specially-formulated alloys of molybdenum and niobium and alternate diameter thermoelements (wires). In addition, on-going efforts to evaluate alternate fabrication approaches, such as drawn and loose assembly techniques, will be discussed. Efforts to reduce HTIR-TC fabrication costs, such as the use of less expensive extension cable, will also be presented. Finally, customized HTIR-TC designs developed for specific customer needs will be summarized to emphasize the varied conditions under which these sensors may be used.

  18. Who Shot Ya? How Emergency Departments Can Collect Reliable Police Shooting Data.

    PubMed

    Richardson, Joseph B; St Vil, Christopher; Cooper, Carnell

    2016-04-01

    This paper examines an alternative solution for collecting reliable police shooting data: collecting such data from hospital trauma units, specifically hospital-based violence intervention programs. These programs are situated in Level I trauma units in many major cities in the USA. While the intent of these programs is to reduce the risk factors associated with trauma recidivism among victims of violent injury, they also collect reliable data on the number of individuals treated for gunshot wounds. While most trauma units do a good job collecting data on mode of injury, many do not collect data on the circumstances surrounding the injury, particularly police-involved shootings. Research protocols on firearm-related injury conducted in emergency departments typically do not allow researchers to interview victims of violent injury who are under arrest, yet most victims of nonfatal police-involved shootings are under arrest at the time they are treated in the ED for their injury; such individuals fall under the exclusion criteria when recruiting potential participants for research on violence. The trauma staff, particularly ED physicians and nurses, are in a strategic position to collect this kind of data. Thus, this paper examines how trauma units can serve as an alternative source for the reliable collection of police shooting data.

  19. It's not that Difficult: An Interrater Reliability Study of the DSM–5 Section III Alternative Model for Personality Disorders

    DOE PAGES

    Garcia, Darren J.; Skadberg, Rebecca M.; Schmidt, Megan; ...

    2018-03-05

    The Diagnostic and Statistical Manual of Mental Disorders (5th ed. [DSM–5]; American Psychiatric Association, 2013) Section III Alternative Model for Personality Disorders (AMPD) represents a novel approach to the diagnosis of personality disorder (PD). In this model, PD diagnosis requires evaluation of level of impairment in personality functioning (Criterion A) and characterization by pathological traits (Criterion B). Questions about clinical utility, complexity, and difficulty in learning and using the AMPD have been expressed in recent scholarly literature. We examined the learnability, interrater reliability, and clinical utility of the AMPD using a vignette methodology and graduate student raters. Results showed that student clinicians can learn Criterion A of the AMPD to a high level of interrater reliability and agreement with expert ratings. Interrater reliability of the 25 trait facets of the AMPD varied but showed overall acceptable levels of agreement. Examination of severity indexes of PD impairment showed the level of personality functioning (LPF) added information beyond that of global assessment of functioning (GAF). Clinical utility ratings were generally strong. Lastly, the satisfactory interrater reliability of components of the AMPD indicates the model, including the LPF, is very learnable.
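
    Interrater reliability of ratings like these is commonly summarised with an intraclass correlation. A minimal sketch of the two-way random-effects, absolute-agreement, single-rater ICC (the Shrout-Fleiss ICC(2,1) convention); the ratings below are invented, not data from the study:

```python
import numpy as np

def icc_2_1(Y):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    Y is an (n targets) x (k raters) matrix of ratings."""
    Y = np.asarray(Y, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    msr = k * np.sum((Y.mean(axis=1) - grand) ** 2) / (n - 1)  # between-target
    msc = n * np.sum((Y.mean(axis=0) - grand) ** 2) / (k - 1)  # between-rater
    sse = np.sum((Y - grand) ** 2) - (n - 1) * msr - (k - 1) * msc
    mse = sse / ((n - 1) * (k - 1))                            # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Two hypothetical raters who agree up to a constant one-point offset:
# absolute agreement penalises the offset, so the ICC falls below 1.
ratings = np.array([[1, 2], [2, 3], [3, 4], [4, 5]])
icc = icc_2_1(ratings)
```

Because ICC(2,1) measures absolute agreement, a rater who is systematically harsher lowers the coefficient even when rank ordering is perfect, which is the appropriate standard when ratings must match expert benchmarks.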

  20. It's not that Difficult: An Interrater Reliability Study of the DSM–5 Section III Alternative Model for Personality Disorders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Darren J.; Skadberg, Rebecca M.; Schmidt, Megan

    The Diagnostic and Statistical Manual of Mental Disorders (5th ed. [DSM–5]; American Psychiatric Association, 2013) Section III Alternative Model for Personality Disorders (AMPD) represents a novel approach to the diagnosis of personality disorder (PD). In this model, PD diagnosis requires evaluation of level of impairment in personality functioning (Criterion A) and characterization by pathological traits (Criterion B). Questions about clinical utility, complexity, and difficulty in learning and using the AMPD have been expressed in recent scholarly literature. We examined the learnability, interrater reliability, and clinical utility of the AMPD using a vignette methodology and graduate student raters. Results showed that student clinicians can learn Criterion A of the AMPD to a high level of interrater reliability and agreement with expert ratings. Interrater reliability of the 25 trait facets of the AMPD varied but showed overall acceptable levels of agreement. Examination of severity indexes of PD impairment showed the level of personality functioning (LPF) added information beyond that of global assessment of functioning (GAF). Clinical utility ratings were generally strong. Lastly, the satisfactory interrater reliability of components of the AMPD indicates the model, including the LPF, is very learnable.

  1. 49 CFR Appendix A to Part 611 - Description of Measures Used for Project Evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) Livable Communities initiatives and local economic activities; (5) Consideration of alternative land use... to fixed guideway systems and extensions may not be limited to a single project). (b) The stability and reliability of the proposed capital financing plan, according to: (i) The stability, reliability...

  2. Low Voltage Electrolytic Capacitor Pulse Forming Inductive Network for Electric Weapons

    DTIC Science & Technology

    2006-06-01

    reliable high-current, high-energy pulses of many megawatts. Pulsed alternators potentially have the same maintenance issues as other motor-generator sets, so a solid... (Rotating Flywheel) Pulse Forming Network Compensated Pulsed Alternators, or Compulsators as they are called, are essentially large motor-generator

  3. High Frequency Alternator, Power Frequency Conversion (HFA-PFC) Technology for Lightweight Tactical Power Generation

    DTIC Science & Technology

    1995-09-22

    Modules 345-800 Amperes/400-3000 Volts - Current and Thermal Ratings of Module * Circuit Currents Element Data Model* Current Thermal Units... IGBT modules (Powerex) 56 Main components for rectifiers, Diode Bridge modules (Powerex) 65 Heat Sinks (Aavid Engineering) 85 Westinghouse... exciter circuit, are not reliable enough for military applications, and they were replaced by brushless alternators. The brushless AC alternator

  4. Nuclear electric propulsion operational reliability and crew safety study: NEP systems/modeling report

    NASA Technical Reports Server (NTRS)

    Karns, James

    1993-01-01

    The objective of this study was to establish the initial quantitative reliability bounds for nuclear electric propulsion systems in a manned Mars mission required to ensure crew safety and mission success. Finding the reliability bounds involves balancing top-down (mission driven) requirements and bottom-up (technology driven) capabilities. In seeking this balance we hope to accomplish the following: (1) provide design insights into the achievability of the baseline design in terms of reliability requirements, given the existing technology base; (2) suggest alternative design approaches which might enhance reliability and crew safety; and (3) indicate what technology areas require significant research and development to achieve the reliability objectives.

  5. Evaluation of MALDI-TOF MS (Matrix-Assisted Laser Desorption-Ionization Time-of-Flight Mass Spectrometry) for routine identification of anaerobic bacteria.

    PubMed

    Rodríguez-Sánchez, Belén; Alcalá, Luis; Marín, Mercedes; Ruiz, Adrián; Alonso, Elena; Bouza, Emilio

    2016-12-01

    Information regarding the use of MALDI-TOF MS as an alternative to conventional laboratory methods for the rapid and reliable identification of bacterial isolates is still limited. In this study, MALDI-TOF MS was evaluated on 295 anaerobic isolates previously identified by 16S rRNA gene sequencing and with biochemical tests (Rapid ID 32A system, BioMérieux). In total, 85.8% of the isolates were identified by MALDI-TOF MS at the species level vs 49.8% using the Rapid ID 32A system (p < 0.0001). None of the isolates was discordantly identified at the genus level using MALDI-TOF MS, and only 9 of them could not be identified with this method. Thus, our results show that MALDI-TOF MS is a robust and reliable tool for the identification of anaerobic isolates in the microbiology laboratory. Its implementation will reduce the turnaround time for a final identification and the number of isolates that require 16S rRNA sequencing. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. On the reliable measurement of specific absorption rates and intrinsic loss parameters in magnetic hyperthermia materials

    NASA Astrophysics Data System (ADS)

    Wildeboer, R. R.; Southern, P.; Pankhurst, Q. A.

    2014-12-01

    In the clinical application of magnetic hyperthermia, the heat generated by magnetic nanoparticles in an alternating magnetic field is used as a cancer treatment. The heating ability of the particles is quantified by the specific absorption rate (SAR), an extrinsic parameter based on the clinical response characteristic of power delivered per unit mass, and by the intrinsic loss parameter (ILP), an intrinsic parameter based on the heating capacity of the material. Even though both the SAR and ILP are widely used as comparative design parameters, they are almost always measured in non-adiabatic systems that make accurate measurements difficult. We present here the results of a systematic review of measurement methods for both SAR and ILP, leading to recommendations for a standardised, simple and reliable method for measurements using non-adiabatic systems. In a representative survey of 50 retrieved datasets taken from published papers, the derived SAR or ILP was found to be more than 5% overestimated in 24% of cases and more than 5% underestimated in 52% of cases.
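
    The two figures of merit reviewed above can be written down compactly: SAR from the initial slope of the heating curve, and ILP as SAR normalised by field frequency and squared amplitude. A hedged sketch with invented sample and field values (not data from the survey):

```python
import numpy as np

# Invented sample and field parameters, for illustration only
c_susp = 4186.0      # J kg^-1 K^-1, specific heat of the suspension (water-like)
m_susp = 1.0e-3      # kg of suspension
m_fe = 1.0e-6        # kg of magnetic material in the sample
f = 400e3            # Hz, alternating-field frequency
H = 10e3             # A m^-1, field amplitude

# Synthetic heating curve with a 0.02 K/s initial slope
t = np.linspace(0.0, 30.0, 31)
T = 20.0 + 0.02 * t

dTdt = np.polyfit(t[:10], T[:10], 1)[0]   # initial-slope fit, K/s
sar = c_susp * m_susp / m_fe * dTdt       # W per kg of magnetic material
ilp = sar / (f * H**2)                    # H m^2 kg^-1 (usually quoted in nH m^2/kg)
```

The choice of fitting window for the initial slope is exactly where the non-adiabatic bias discussed in the paper enters: too long a window and heat losses flatten the apparent slope, underestimating SAR.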

  7. Detection of microbial concentration in ice-cream using the impedance technique.

    PubMed

    Grossi, M; Lanzoni, M; Pompei, A; Lazzarini, R; Matteuzzi, D; Riccò, B

    2008-06-15

    The detection of microbial concentration, essential for safe and high quality food products, is traditionally made with the plate count technique, which is reliable but slow and not easily automated, as required for direct use in industrial machines. To this end, the method based on impedance measurements represents an attractive alternative, since it can produce results in about 10 h, instead of the 24-48 h needed by standard plate counts, and can easily be realized in automatic form. In this paper such a method has been experimentally studied in the case of ice-cream products. In particular, all main ice-cream compositions of real interest have been considered and no nutrient medium has been used to dilute the samples. A measurement set-up has been realized using benchtop instruments for impedance measurements on samples whose bacterial concentration was independently measured by means of standard plate counts. The obtained results clearly indicate that impedance measurement represents a feasible and reliable technique for detecting total microbial concentration in ice-cream, suitable for implementation as an embedded system in industrial machines.
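
    The impedance method works because the detection time (the moment the impedance signal starts to change measurably) falls roughly linearly with the logarithm of the initial microbial load. A minimal calibration sketch with synthetic numbers (not the paper's measurements):

```python
import numpy as np

# Synthetic calibration data: plate-count reference vs. impedance
# detection time (DT). Higher initial loads are detected sooner.
log_cfu = np.array([2.0, 3.0, 4.0, 5.0, 6.0])      # log10 CFU/mL (plate counts)
dt_hours = np.array([14.1, 12.0, 9.9, 8.1, 6.0])   # measured detection times, h

# Linear calibration: log10 CFU = a + b * DT, with b < 0
b, a = np.polyfit(dt_hours, log_cfu, 1)

def estimate_log_cfu(detection_time_h):
    """Estimate microbial load of a new sample from its detection time."""
    return a + b * detection_time_h
```

Once such a curve is established for each product composition, a single automated impedance reading replaces a day-long plate count, which is what makes the method attractive for embedding in an industrial machine.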

  8. Rapid molecular sexing of three-spined sticklebacks, Gasterosteus aculeatus L., based on large Y-chromosomal insertions.

    PubMed

    Bakker, Theo C M; Giger, Thomas; Frommen, Joachim G; Largiadèr, Carlo R

    2017-08-01

    There is a need for rapid and reliable molecular sexing of three-spined sticklebacks, Gasterosteus aculeatus, the supermodel species for evolutionary biology. A DNA region at the 5' end of the sex-linked microsatellite Gac4202 was sequenced for the X chromosome of six females and the Y chromosome of five males from three populations. The Y chromosome contained two large insertions, which did not recombine with the phenotype of sex in a cross of 322 individuals. Genetic variation (SNPs and indels) within the insertions was smaller than in flanking DNA sequences. Three molecular PCR-based sex tests were developed, covering the first, the second, or both insertions. In five European populations (from DE, CH, NL, GB) of three-spined sticklebacks, tests with both insertions combined showed two clearly separated bands on agarose minigels in males and one band in females. The tests with the separate insertions gave similar results. Thus, the new molecular sexing method is rapid and reliable and offers an improvement on, or alternative to, existing methods.

  9. Comparison of biotyping methods as alternative identification tools to molecular typing of pathogenic Cryptococcus species in sub-Saharan Africa

    PubMed Central

    Nyazika, Tinashe K.; Robertson, Valerie J.; Nherera, Brenda; Mapondera, Prichard T.; Meis, Jacques F.; Hagen, Ferry

    2015-01-01

    Cryptococcal meningitis is the leading fungal infection and AIDS-defining opportunistic illness in patients with late-stage HIV infection, particularly in South-East Asia and sub-Saharan Africa. Given the high mortality, clinical differences and the extensive ecological niche of the Cryptococcus neoformans and Cryptococcus gattii species complexes, there is a need for laboratories in sub-Saharan African countries to adopt new and alternative reliable diagnostic algorithms that rapidly identify and distinguish these species. We biotyped 74 and then amplified fragment length polymorphism (AFLP) genotyped 66 Cryptococcus isolates from a cohort of patients with HIV-associated cryptococcal meningitis. Cryptococcus gattii sensu lato was isolated at a prevalence of 16.7% (n = 11/66) and C. neoformans sensu stricto was responsible for 83.3% (n = 55/66) of the infections. L-Canavanine glycine bromothymol blue, yeast-carbon-base-D-proline-D-tryptophan and creatinine dextrose bromothymol blue thymine were able to distinguish pathogenic C. gattii sensu lato from C. neoformans sensu stricto species when compared with amplified fragment length polymorphism genotyping. This study demonstrates a high C. gattii sensu lato prevalence in Zimbabwe. In addition, biotyping methods can be used as alternative diagnostic tools to molecular typing in resource-limited areas for differentiating pathogenic Cryptococcus species. PMID:26661484

  10. Comparison of biotyping methods as alternative identification tools to molecular typing of pathogenic Cryptococcus species in sub-Saharan Africa.

    PubMed

    Nyazika, Tinashe K; Robertson, Valerie J; Nherera, Brenda; Mapondera, Prichard T; Meis, Jacques F; Hagen, Ferry

    2016-03-01

    Cryptococcal meningitis is the leading fungal infection and AIDS-defining opportunistic illness in patients with late-stage HIV infection, particularly in South-East Asia and sub-Saharan Africa. Given the high mortality, clinical differences and the extensive ecological niche of the Cryptococcus neoformans and Cryptococcus gattii species complexes, there is a need for laboratories in sub-Saharan African countries to adopt new and alternative reliable diagnostic algorithms that rapidly identify and distinguish these species. We biotyped 74 and then amplified fragment length polymorphism (AFLP) genotyped 66 Cryptococcus isolates from a cohort of patients with HIV-associated cryptococcal meningitis. C. gattii sensu lato was isolated at a prevalence of 16.7% (n = 11/66) and C. neoformans sensu stricto was responsible for 83.3% (n = 55/66) of the infections. L-Canavanine glycine bromothymol blue, yeast-carbon-base-D-proline-D-tryptophan and creatinine dextrose bromothymol blue thymine were able to distinguish pathogenic C. gattii sensu lato from C. neoformans sensu stricto species when compared with AFLP genotyping. This study demonstrates a high C. gattii sensu lato prevalence in Zimbabwe. In addition, biotyping methods can be used as alternative diagnostic tools to molecular typing in resource-limited areas for differentiating pathogenic Cryptococcus species. © 2015 Blackwell Verlag GmbH.

  11. Simple method to distinguish between primary and secondary C3 deficiencies.

    PubMed

    Pereira de Carvalho Florido, Marlene; Ferreira de Paula, Patrícia; Isaac, Lourdes

    2003-03-01

    Due to the increasing numbers of reported clinical cases of complement deficiency in medical centers, clinicians are now more aware of the role of the complement system in the protection against infections caused by microorganisms. Therefore, clinical laboratories are now prepared to perform a number of diagnostic tests of the complement system other than the standard 50% hemolytic complement assay. Deficiencies of alternative complement pathway proteins are related to severe and recurrent infections, and the application of easy, reliable, and low-cost methods for their detection and distinction is always welcome, notably in developing countries. When activation of the alternative complement pathway is evaluated in hemolytic agarose plates, some but not all human sera cross-react to form a late linear lysis. Since the formation of this linear lysis is dependent on C3 and factor B, it is possible to use late linear lysis to routinely screen for deficiencies of alternative human complement pathway proteins such as factor B. Furthermore, since linear lysis is observed between normal human serum and primary C3-deficient serum, but not between normal human serum and secondary C3-deficient serum caused by the lack of factor H or factor I, this assay may also be used to discriminate between primary and secondary C3 deficiencies.

  12. What approach to brain partial volume correction is best for PET/MRI?

    NASA Astrophysics Data System (ADS)

    Hutton, B. F.; Thomas, B. A.; Erlandsson, K.; Bousse, A.; Reilhac-Laborde, A.; Kazantsev, D.; Pedemonte, S.; Vunckx, K.; Arridge, S. R.; Ourselin, S.

    2013-02-01

    Many partial volume correction approaches make use of anatomical information, which is readily available in PET/MRI systems, but it is not clear which approach is best. Seven novel approaches to partial volume correction were evaluated, including several post-reconstruction methods and several reconstruction methods that incorporate anatomical information. These were compared with an MRI-independent approach (reblurred van Cittert) and with uncorrected data. Monte Carlo PET data were generated for activity distributions representing both 18F-FDG and amyloid tracer uptake. Post-reconstruction methods provided the best recovery with ideal segmentation but were particularly sensitive to mis-registration. Alternative approaches performed better in maintaining the contrast of lesions unseen in MRI, with good noise control, and were also relatively insensitive to mis-registration errors. The choice of method will depend on the specific application and on the reliability of the segmentation and registration algorithms.

  13. Robust PV Degradation Methodology and Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jordan, Dirk; Deline, Christopher A; Kurtz, Sarah

    The degradation rate plays an important role in predicting and assessing the long-term energy generation of PV systems. Many methods have been proposed for extracting the degradation rate from operational data of PV systems, but most of the published approaches are susceptible to bias due to inverter clipping, module soiling, temporary outages, seasonality, and sensor degradation. In this manuscript, we propose a methodology for determining PV degradation leveraging available modeled clear-sky irradiance data rather than site sensor data, and a robust year-over-year (YOY) rate calculation. We show the method to provide reliable degradation rate estimates even in the case of sensor drift, data shifts, and soiling. Compared with alternate methods, we demonstrate that the proposed method delivers the lowest uncertainty in degradation rate estimates for a fleet of 486 PV systems.
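
    The robust year-over-year calculation described in the abstract can be sketched as follows. This is a minimal illustration of the YOY idea only, not the authors' implementation: it assumes a monthly performance index (e.g. a clear-sky-normalized energy ratio) has already been computed, and the function name and data are hypothetical.

    ```python
    from statistics import median

    def yoy_degradation_rate(monthly_index):
        """Median year-over-year fractional change of a monthly performance index.

        monthly_index: performance-index values in time order, one per month.
        Returns the median annual fractional change (negative = degradation).
        """
        lag = 12  # compare each month with the same month one year later
        changes = [
            (monthly_index[i + lag] - monthly_index[i]) / monthly_index[i]
            for i in range(len(monthly_index) - lag)
        ]
        # The median makes the estimate robust to outliers such as
        # temporary outages, data shifts, or soiling events.
        return median(changes)

    # Hypothetical index declining by 0.5% per year over three years
    index = [(1 - 0.005) ** (m / 12) for m in range(36)]
    rate = yoy_degradation_rate(index)
    ```

    With clean synthetic data as above, the estimate recovers the built-in -0.5%/yr rate; the robustness argument only matters once outliers contaminate some of the year-over-year pairs.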

  14. Robust PV Degradation Methodology and Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jordan, Dirk C.; Deline, Chris; Kurtz, Sarah R.

    The degradation rate plays an important role in predicting and assessing the long-term energy generation of photovoltaic (PV) systems. Many methods have been proposed for extracting the degradation rate from operational data of PV systems, but most of the published approaches are susceptible to bias due to inverter clipping, module soiling, temporary outages, seasonality, and sensor degradation. In this paper, we propose a methodology for determining PV degradation leveraging available modeled clear-sky irradiance data rather than site sensor data, and a robust year-over-year rate calculation. We show the method to provide reliable degradation rate estimates even in the case of sensor drift, data shifts, and soiling. Compared with alternate methods, we demonstrate that the proposed method delivers the lowest uncertainty in degradation rate estimates for a fleet of 486 PV systems.

  15. Robust PV Degradation Methodology and Application

    DOE PAGES

    Jordan, Dirk C.; Deline, Chris; Kurtz, Sarah R.; ...

    2017-12-21

    The degradation rate plays an important role in predicting and assessing the long-term energy generation of photovoltaic (PV) systems. Many methods have been proposed for extracting the degradation rate from operational data of PV systems, but most of the published approaches are susceptible to bias due to inverter clipping, module soiling, temporary outages, seasonality, and sensor degradation. In this paper, we propose a methodology for determining PV degradation leveraging available modeled clear-sky irradiance data rather than site sensor data, and a robust year-over-year rate calculation. We show the method to provide reliable degradation rate estimates even in the case of sensor drift, data shifts, and soiling. Compared with alternate methods, we demonstrate that the proposed method delivers the lowest uncertainty in degradation rate estimates for a fleet of 486 PV systems.

  16. Technical note: a novel method for routine genotyping of horse coat color gene polymorphisms.

    PubMed

    Royo, L J; Fernández, I; Azor, P J; Alvarez, I; Pérez-Pardal, L; Goyache, F

    2008-06-01

    The aim of this note is to describe a reliable, fast, and cost-effective real-time PCR method for routine genotyping of mutations responsible for most coat color variation in horses. The melanocortin-1 receptor, Agouti-signaling peptide, and membrane-associated transporter protein alleles were simultaneously determined using 2 PCR protocols. The assay described here is an alternative method for routine genotyping of a defined number of polymorphisms. Allelic variants are detected in real time and no post-PCR manipulations are required, therefore limiting costs and possible carryover contamination. Data can be copied to a Microsoft Excel spreadsheet for semiautomatic determination of the genotype using a macro freely available at http://www.igijon.com/personales/fgoyache/software_i.htm (last accessed February 26, 2007). The performance of the method is demonstrated on 156 Spanish Purebred horses.

  17. [Utility of Fibroscan in the evaluation of liver fibrosis].

    PubMed

    Carrión, José A

    2009-01-01

    Chronic liver diseases produce a progressive accumulation of collagenous fiber in the liver parenchyma. For years, liver biopsy has been the gold standard to quantify liver fibrosis. Currently, non-invasive alternatives are available to quantify fibrosis. Transient elastography (TE) or Fibroscan quantifies liver rigidity, which is proportional to the grade of liver fibrosis. Studies are available that have evaluated the reliability and limitations of TE in healthy individuals, in patients with acute hepatitis, in distinct chronic liver diseases and in liver transplant recipients. TE is reliable for the diagnosis of liver cirrhosis (F4) and significant fibrosis (F2) but its values may vary according to the patient's characteristics and the etiology of the disease. TE can avoid liver biopsy in 90% of patients with cirrhosis and in up to 70% of those with significant fibrosis when combined with other non-invasive methods.

  18. Anaconda: AN automated pipeline for somatic COpy Number variation Detection and Annotation from tumor exome sequencing data.

    PubMed

    Gao, Jianing; Wan, Changlin; Zhang, Huan; Li, Ao; Zang, Qiguang; Ban, Rongjun; Ali, Asim; Yu, Zhenghua; Shi, Qinghua; Jiang, Xiaohua; Zhang, Yuanwei

    2017-10-03

    Copy number variations (CNVs) are the main genetic structural variations in the cancer genome. Detecting CNVs in exome regions is efficient and cost-effective for identifying cancer-associated genes. Many tools have been developed accordingly, yet they lack reliability because of high false-negative rates, intrinsically caused by exonic bias in the genome. To provide an alternative option, here we report Anaconda, a comprehensive pipeline that allows flexible integration of multiple CNV-calling methods and systematic annotation of CNVs in analyzing WES data. With a single command, Anaconda can generate CNV detection results from up to four CNV-calling tools. Combined with comprehensive annotation analysis of genes involved in shared CNV regions, Anaconda delivers a more reliable and useful report in support of CNV-associated cancer research. The Anaconda package and manual can be freely accessed at http://mcg.ustc.edu.cn/bsc/ANACONDA/.

  19. Cleaning of printed circuit assemblies with surface-mounted components

    NASA Astrophysics Data System (ADS)

    Arzigian, J. S.

    The need for ever-increasing miniaturization of airborne instrumentation through the use of surface mounted components closely placed on printed circuit boards highlights problems with traditional board cleaning methods. The reliability of assemblies which have been cleaned with vapor degreasing and spray cleaning can be seriously compromised by residual contaminants leading to solder joint failure, board corrosion, and even electrical failure of the mounted parts. In addition, recent government actions to eliminate fully halogenated chlorofluorocarbons (CFC) and chlorinated hydrocarbons from the industrial environment require the development of new cleaning materials and techniques. This paper discusses alternative cleaning materials and techniques and results that can be expected with them. Particular emphasis is placed on problems related to surface-mounted parts. These new techniques may lead to improved circuit reliability and, at the same time, be less expensive and less environmentally hazardous than the traditional systems.

  20. Power processing for electric propulsion

    NASA Technical Reports Server (NTRS)

    Finke, R. C.; Herron, B. G.; Gant, G. D.

    1975-01-01

    The potential of achieving up to 30 per cent more spacecraft payload or 50 per cent more useful operating life by the use of electric propulsion in place of conventional cold gas or hydrazine systems in science, communications, and earth applications spacecraft is a compelling reason to consider the inclusion of electric thruster systems in new spacecraft design. The propulsion requirements of such spacecraft dictate a wide range of thruster power levels and operational lifetimes, which must be matched by lightweight, efficient, and reliable thruster power processing systems. This paper will present electron bombardment ion thruster requirements; review the performance characteristics of present power processing systems; discuss design philosophies and alternatives in areas such as inverter type, arc protection, and control methods; and project future performance potentials for meeting goals in the areas of power processor weight (10 kg/kW), efficiency (approaching 92 per cent), reliability (0.96 for 15,000 hr), and thermal control capability (0.3 to 5 AU).

  1. Advances in Measuring Culturally Competent Care: A Confirmatory Factor Analysis of CAHPS-CC in a Safety-net Population

    PubMed Central

    Stern, RJ; Fernandez, A; Jacobs, EA; Neilands, TB; Weech-Maldonado, R; Quan, J; Carle, A; Seligman, HK

    2012-01-01

    Background Providing culturally competent care shows promise as a mechanism to reduce healthcare inequalities. Until the recent development of the CAHPS Cultural Competency Item Set (CAHPS-CC), no measures capturing patient-level experiences with culturally competent care have been suitable for broad-scale administration. Methods We performed confirmatory factor analysis and internal consistency reliability analysis of CAHPS-CC among patients with type 2 diabetes (n=600) receiving primary care in safety-net clinics. CAHPS-CC domains were also correlated with global physician ratings. Results A 7-factor model demonstrated satisfactory fit (χ2(231)=484.34, p<.0001) with significant factor loadings at p<.05. Three domains showed excellent reliability – Doctor Communication- Positive Behaviors (α=.82), Trust (α=.77), and Doctor Communication- Health Promotion (α=.72). Four domains showed inadequate reliability either among Spanish speakers or overall (overall reliabilities listed): Doctor Communication- Negative Behaviors (α=.54), Equitable Treatment (α=.69), Doctor Communication- Alternative Medicine (α=.52), and Shared Decision-Making (α=.51). CAHPS-CC domains were positively and significantly correlated with global physician rating. Conclusions Select CAHPS-CC domains are suitable for broad-scale administration among safety-net patients. Those domains may be used to target quality-improvement efforts focused on providing culturally competent care in safety-net settings. PMID:22895231
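
    The internal-consistency figures reported above are Cronbach's α values, which can be computed as sketched below. This is a minimal illustration with hypothetical item scores, not the CAHPS-CC analysis itself; the function name is illustrative.

    ```python
    def cronbach_alpha(items):
        """Cronbach's alpha for a set of survey items.

        items: list of equal-length lists, one per item (question),
        each holding the respondents' scores for that item.
        """
        def variance(xs):  # population variance
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / len(xs)

        k = len(items)
        item_var = sum(variance(col) for col in items)          # sum of item variances
        totals = [sum(scores) for scores in zip(*items)]        # per-respondent totals
        total_var = variance(totals)
        return k / (k - 1) * (1 - item_var / total_var)

    # Hypothetical 3-respondent, 2-item domain
    alpha = cronbach_alpha([[2, 4, 6], [1, 2, 3]])
    ```

    Values near 0.7 or above are conventionally read as adequate reliability, which is the threshold the abstract's "inadequate reliability" judgments implicitly use.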

  2. A critique of the EC's expert (draft) reports on the status of alternatives for cosmetics testing to meet the 2013 deadline.

    PubMed

    Taylor, Katy; Casalegno, Carlotta; Stengel, Wolfgang

    2011-01-01

    The 7th Amendment to the EU's Cosmetic Directive (now recast as Regulation 1223/2009) bans the testing of cosmetic ingredients and products on animals, effective 2009. An extension until 2013 was granted, for marketing purposes only, for three endpoints: repeated dose, toxicokinetics, and reproductive toxicity. If the European Commission determines that alternatives for these endpoints are not likely to be available, it can propose a further extension. To this end, the Commission has instructed experts to produce reports on the status of alternatives for the 2013 deadline. We criticized the draft reports on a number of issues. First, the experts fell into the "high fidelity fallacy trap," i.e. asserting that full replication of the in vivo response, as opposed to high predictivity, is required before an animal test can be considered useful for regulatory purposes. Second, the experts' reports were incomplete, omitting various methods and failing to provide data on the validity, reliability, and applicability of all the methods discussed, regardless of whether the methods were in vivo, in vitro, or in silico. In this paper we provide a summary of our criticisms and provide some of the missing data in an alternative proposal for replacement of animal tests by 2013. It is our belief that use of the Threshold of Toxicological Concern (TTC) will be a useful method to mitigate much animal testing. Alternative approaches for carcinogenicity and skin sensitization could be considered sufficient in the very near future, even though these tests are not listed under the 2013 extension. For repeated dose, toxicokinetics, and reproductive toxicity a combination of in vitro methods may be able to provide appropriate protection for consumers, especially when viewed in the context of the poor predictivity of the animal models they replace. 
We hope the revised report will incorporate these comments, since a more thorough and positive review is required if the elimination of animal testing for cosmetics in Europe and beyond is to be achieved.

  3. RELIABILITY AND VALIDITY OF A MODIFIED ISOMETRIC DYNAMOMETER IN THE ASSESSMENT OF MUSCULAR PERFORMANCE IN INDIVIDUALS WITH ANTERIOR CRUCIATE LIGAMENT RECONSTRUCTION

    PubMed Central

    de Vasconcelos, Rodrigo Antunes; Bevilaqua-Grossi, Débora; Shimano, Antonio Carlos; Paccola, Cleber Jansen; Salvini, Tânia Fátima; Prado, Christiane Lanatovits; Junior, Wilson A. Mello

    2015-01-01

    Objectives: The aim of this study was to evaluate the reliability and validity of a modified isometric dynamometer (MID) in measuring performance deficits of the knee extensor and flexor muscles in normal individuals and in those with ACL reconstructions. Methods: Sixty male subjects were invited to participate in the study and were divided into three groups of 20 subjects each: a control group (GC), a group with ACL reconstruction using a patellar tendon graft (GTP), and a group with ACL reconstruction using a hamstrings graft (GTF). All individuals performed isometric tests on the MID; the muscular strength deficits collected were subsequently compared with tests performed on the Biodex System 3 operating in isometric mode and in isokinetic mode at speeds of 60°/s and 180°/s. Intraclass correlation coefficients (ICC) were calculated to assess MID reliability; specificity, sensitivity, and Kappa consistency coefficients were calculated to assess the MID's validity in detecting muscular deficits; and intra- and intergroup comparisons across the four strength tests were made using ANOVA. Results: The modified isometric dynamometer (MID) showed excellent reliability and good validity in the assessment of the performance of the knee extensor and flexor muscle groups. In the comparison between groups, the GTP group showed significantly greater deficits than the GTF and GC groups. Conclusion: Isometric dynamometers connected to mechanotherapy equipment could be an alternative option for collecting data on performance deficits of the knee extensor and flexor muscle groups in subjects with ACL reconstruction. PMID:27004175

  4. The deployment of carbon monoxide wireless sensor network (CO-WSN) for ambient air monitoring.

    PubMed

    Chaiwatpongsakorn, Chaichana; Lu, Mingming; Keener, Tim C; Khang, Soon-Jai

    2014-06-16

    Wireless sensor networks are becoming increasingly important as an alternative solution for environmental monitoring because they can reduce cost and complexity. They can also improve reliability and data availability in places where traditional monitoring methods are difficult to site. In this study, a carbon monoxide wireless sensor network (CO-WSN) was developed to measure carbon monoxide concentrations at a major traffic intersection near the University of Cincinnati main campus. The system was deployed over two weeks during Fall 2010 and Summer 2011-2012; traffic data were also recorded around the clock using a manual traffic counter and a video camcorder to characterize vehicles at the intersection, particularly during the morning and evening peak hour periods. According to the field test results, the 1-h average CO concentrations ranged from 0.1 to 1.0 ppm, well below the National Ambient Air Quality Standard (NAAQS) of 35 ppm for a one-hour averaging period. During rush hour periods, the traffic volume at the intersection varied from 2,067 to 3,076 vehicles per hour, with 97% being passenger vehicles. Furthermore, the 1-h average traffic volume showed good correlation (R2 = 0.87) with the 1-h average CO-WSN concentrations for morning and evening peak periods, whereas CO-WSN results showed only moderate correlation (R2 = 0.42) with 24-h traffic volume owing to fluctuating meteorological conditions. It is concluded that the performance and reliability of wireless ambient air monitoring networks make them a viable alternative method for real-time air monitoring.
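
    The R2 values quoted for traffic volume versus CO concentration are coefficients of determination for a simple linear fit, which can be sketched as below. The data and function name are hypothetical; this is not the study's analysis code.

    ```python
    def r_squared(x, y):
        """Coefficient of determination for a simple linear fit of y on x
        (equivalently, the squared Pearson correlation)."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        syy = sum((b - my) ** 2 for b in y)
        return sxy ** 2 / (sxx * syy)

    # Hypothetical hourly traffic counts and 1-h average CO readings (ppm)
    traffic = [2100, 2500, 3000, 2300]
    co_ppm = [0.3, 0.5, 0.9, 0.4]
    r2 = r_squared(traffic, co_ppm)
    ```

    For a single predictor, the regression R2 and the squared Pearson correlation coincide, which is why either computation reproduces the kind of figures reported.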

  5. Selecting the optimum number of partial least squares components for the calibration of attenuated total reflectance-mid-infrared spectra of undesigned kerosene samples.

    PubMed

    Gómez-Carracedo, M P; Andrade, J M; Rutledge, D N; Faber, N M

    2007-03-07

    Selecting the correct dimensionality is critical for obtaining partial least squares (PLS) regression models with good predictive ability. Although calibration and validation sets are best established using experimental designs, industrial laboratories cannot afford such an approach. Typically, samples are collected in a (formally) undesigned way, spread over time, and their measurements are included in routine measurement processes. This makes it hard to evaluate PLS model dimensionality. In this paper, classical criteria (leave-one-out cross-validation and adjusted Wold's criterion) are compared to recently proposed alternatives (smoothed PLS-PoLiSh and a randomization test) to seek out the optimum dimensionality of PLS models. Kerosene (jet fuel) samples were measured by attenuated total reflectance-mid-IR spectrometry and their spectra were used to predict eight important properties determined using reference methods that are time-consuming and prone to analytical errors. The alternative methods were shown to give reliable dimensionality predictions when compared to external validation. By contrast, the simpler methods seemed to be largely affected by the largest changes in the modeling capabilities of the first components.
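
    One of the classical criteria mentioned, adjusted Wold's criterion, can be sketched as follows: keep adding components while the ratio of successive cross-validated PRESS values stays below a threshold, and stop once the improvement flattens. The threshold of 0.95, the function name, and the PRESS numbers below are illustrative assumptions, not the paper's settings.

    ```python
    def wold_dimensionality(press, threshold=0.95):
        """Adjusted Wold's criterion for PLS model dimensionality.

        press: PRESS (predicted residual sum of squares) values from
        cross-validation for models with 1, 2, 3, ... components.
        Returns the selected number of components: the first k for which
        PRESS(k+1)/PRESS(k) >= threshold, i.e. adding a component no
        longer improves prediction enough to justify it.
        """
        for k in range(len(press) - 1):
            if press[k + 1] / press[k] >= threshold:
                return k + 1  # component counts are 1-indexed
        return len(press)

    # Hypothetical PRESS curve that flattens after 3 components
    press = [10.0, 6.0, 4.0, 3.9, 3.85]
    n_components = wold_dimensionality(press)
    ```

    The abstract's point is that such simple ratio rules can be misled when the first components absorb atypically large variance, which is where the randomization-test and PoLiSh alternatives come in.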

  6. Alternate methods for high level pyrotechnic shock simulation

    NASA Astrophysics Data System (ADS)

    Gray, Phillip J., Sr.

    Two effective methods to recreate a realistic pyrotechnic shock are presented. The first method employs a resonant beam and is used for SRS levels of 12,000 G or more. The test unit is mounted at one end of the beam and a hammer strikes the opposite end, transmitting a shock through the fixture to the unit. The second method is based on a standard shaker system with a resonant beam to amplify the input signal. The engineer defines the duration of the shock signal fed to the vibration amplifier using the GenRad 2514 controller. The shock signal is then input via the shaker to the resonant beam, which amplifies the signal to produce the desired response at the end of the fixture. The shock response spectrum stays within a +/-6 dB tolerance at levels as high as 3000 G peak. These methods are repeatable, reliable, cost-effective, and consistent with a real pyrotechnic event.
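
    The +/-6 dB tolerance mentioned above is a point-by-point comparison of the achieved shock response spectrum against the specified one, expressed in decibels. A minimal sketch with hypothetical spectra follows; the function name and values are illustrative only.

    ```python
    import math

    def within_srs_tolerance(measured, reference, tol_db=6.0):
        """Check a measured shock response spectrum (G, per natural
        frequency) against a reference SRS using a symmetric dB band."""
        for m, r in zip(measured, reference):
            deviation_db = 20.0 * math.log10(m / r)  # amplitude ratio in dB
            if abs(deviation_db) > tol_db:
                return False
        return True

    # Hypothetical reference and achieved SRS values at three frequencies
    reference = [500.0, 1200.0, 3000.0]
    measured = [520.0, 1100.0, 2800.0]
    ok = within_srs_tolerance(measured, reference)
    ```

    A +/-6 dB band corresponds to the achieved level lying between half and twice the specified level at every frequency point, since 20·log10(2) ≈ 6 dB.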

  7. Scoring and setting pass/fail standards for an essay certification examination in nurse-midwifery.

    PubMed

    Fullerton, J T; Greener, D L; Gross, L J

    1992-03-01

    Examination for certification or licensure of health professionals (credentialing) in the United States is almost exclusively of the multiple choice format. The certification examination for entry into the practice of the profession of nurse-midwifery has, however, used a modified essay format throughout its twenty-year history. The examination has recently undergone a revision in the method for score interpretation and for pass/fail decision-making. The revised method, described in this paper, has important implications for all health professional credentialing agencies which use modified essay, oral or practical methods of competency assessment. This paper describes criterion-referenced scoring, the process of constructing the essay items, the methods for assuring validity and reliability for the examination, and the manner of standard setting. In addition, two alternative methods for increasing the validity of the pass/fail decision are evaluated, and the rationale for decision-making about marginal candidates is described.

  8. An assessment of the Nguyen and Pinder method for slug test analysis. [In situ estimates of ground water contamination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butler, J.J. Jr.; Hyder, Z.

    The Nguyen and Pinder method is one of four techniques commonly used for analysis of response data from slug tests. Limited field research has raised questions about the reliability of the parameter estimates obtained with this method. A theoretical evaluation of this technique reveals that errors were made in the derivation of the analytical solution upon which the technique is based. Simulation and field examples show that the errors result in parameter estimates that can differ from actual values by orders of magnitude. These findings indicate that the Nguyen and Pinder method should no longer be a tool in the repertoire of the field hydrogeologist. If data from a slug test performed in a partially penetrating well in a confined aquifer need to be analyzed, recent work has shown that the Hvorslev method is the best alternative among the commonly used techniques.
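
    For reference, the Hvorslev method recommended above reduces to a closed-form estimate of hydraulic conductivity from the basic time lag of the head recovery, K = r_c² ln(L_e/R) / (2 L_e t_37), valid for screen geometries with L_e/R > 8. The sketch below is a minimal illustration of that standard formula, not the cited study's analysis; parameter names and the example values are hypothetical.

    ```python
    import math

    def hvorslev_k(r_casing, screen_length, r_screen, t37):
        """Hvorslev slug-test estimate of hydraulic conductivity K.

        r_casing: well casing radius (m)
        screen_length: effective screen length L_e (m)
        r_screen: screen radius R (m)
        t37: basic time lag (s), the time for the head to recover to
             37% of the initial displacement
        Applicable when L_e / R > 8. Returns K in m/s.
        """
        return (r_casing ** 2 * math.log(screen_length / r_screen)) / (
            2.0 * screen_length * t37
        )

    # Hypothetical well: 5 cm casing and screen radii, 3 m screen, t37 = 100 s
    k = hvorslev_k(0.05, 3.0, 0.05, 100.0)
    ```

    In practice t_37 is read from a semi-log plot of normalized head versus time, which is also where departures from the assumed quasi-steady response become visible.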

  9. Portfolio assessment and evaluation: implications and guidelines for clinical nursing education.

    PubMed

    Chabeli, M M

    2002-08-01

    With the advent of Outcomes-Based Education in South Africa, the quality of nursing education is debatable, especially with regard to the assessment and evaluation of clinical nursing education, which is complex and renders the validity and reliability of the methods used questionable. This paper seeks to explore and describe the use of portfolio assessment and evaluation, its implications, and guidelines for its effective use in nursing education. Firstly, the concepts of assessment, evaluation, portfolio and alternative methods of evaluation are defined. Secondly, a comparison of the characteristics of the old (traditional) methods and the new alternative methods of evaluation is made. Thirdly, through deductive analysis, synthesis and inference, implications and guidelines for the effective use of portfolio assessment and evaluation are described. In view of the qualitative, descriptive and exploratory nature of the study, a focus group interview was conducted with twenty students following a post-basic degree at a university in Gauteng regarding their perceptions of the use of the portfolio assessment and evaluation method in clinical nursing education. A descriptive method of qualitative data analysis using open coding in accordance with Tesch's protocol (in Creswell 1994:155) was used. The resultant implications and guidelines were conceptualised and described within the existing theoretical framework. Principles of trustworthiness were maintained as described by Lincoln and Guba (1985:290-327). Ethical considerations were in accordance with DENOSA's standards of research (1998:7).

  10. [Classical and molecular methods for identification and quantification of domestic moulds].

    PubMed

    Fréalle, E; Bex, V; Reboux, G; Roussel, S; Bretagne, S

    2017-12-01

    To study the impact of the constant and inevitable inhalation of moulds, it is necessary to sample, identify and count the spores. Environmental sampling methods fall into three categories: surface sampling, which is easy to perform but non-quantitative; air sampling, which is easy to calibrate but provides only time-limited information; and dust sampling, which is more representative of long-term exposure to moulds. The sampling strategy depends on the objectives (evaluation of the risk of exposure for individuals; quantification of household contamination; evaluation of the efficacy of remediation). The mould colonies obtained in culture are identified using microscopy, MALDI-TOF mass spectrometry, and/or DNA sequencing. Electrostatic dust collectors are an alternative to older methods for identifying and quantifying household mould spores; they are easy to use and relatively cheap. Colony counting should be progressively replaced by quantitative real-time PCR, which is already validated, while waiting for more standardised high-throughput sequencing methods to assess mould contamination without technical bias. Despite some technical recommendations for obtaining reliable and comparable results, the huge diversity of environmental moulds, the variable quantity of spores inhaled and the association with other allergens (mites, plants) make the evaluation of their impact on human health difficult. Hence there is a need for reliable and generally applicable quantitative methods. Copyright © 2017 SPLF. Published by Elsevier Masson SAS. All rights reserved.

  11. Application and evaluation of electromagnetic methods for imaging saltwater intrusion in coastal aquifers: Seaside Groundwater Basin, California

    USGS Publications Warehouse

    Nenna, Vanessa; Herckenrather, Daan; Knight, Rosemary; Odlum, Nick; McPhee, Darcy

    2013-01-01

    Developing effective resource management strategies to limit or prevent saltwater intrusion as a result of increasing demands on coastal groundwater resources requires reliable information about the geologic structure and hydrologic state of an aquifer system. A common strategy for acquiring such information is to drill sentinel wells near the coast to monitor changes in water salinity with time. However, installation and operation of sentinel wells is costly and provides limited spatial coverage. We studied the use of noninvasive electromagnetic (EM) geophysical methods as an alternative to installation of monitoring wells for characterizing coastal aquifers. We tested the feasibility of using EM methods at a field site in northern California to identify the potential for and/or presence of hydraulic communication between an unconfined saline aquifer and a confined freshwater aquifer. One-dimensional soundings were acquired using the time-domain electromagnetic (TDEM) and audiomagnetotelluric (AMT) methods. We compared inverted resistivity models of TDEM and AMT data obtained from several inversion algorithms. We found that multiple interpretations of inverted models can be supported by the same data set, but that there were consistencies between all data sets and inversion algorithms. Results from all collected data sets suggested that EM methods are capable of reliably identifying a saltwater-saturated zone in the unconfined aquifer. Geophysical data indicated that the impermeable clay between aquifers may be more continuous than is supported by current models.

  12. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be present in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization.
However, the technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of future events, and their effects on method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, under the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.

  13. Improving the complementary methods to estimate evapotranspiration under diverse climatic and physical conditions

    NASA Astrophysics Data System (ADS)

    Anayah, F. M.; Kaluarachchi, J. J.

    2014-06-01

    Reliable estimation of evapotranspiration (ET) is important for water resources planning and management. Complementary methods, including complementary relationship areal evapotranspiration (CRAE), advection aridity (AA), and Granger and Gray (GG), have been used to estimate ET because these methods are simple and practical, estimating regional ET using meteorological data only. However, prior studies have found limitations in these methods, especially in contrasting climates. This study aims to develop a calibration-free universal method using the complementary relationships to compute regional ET under contrasting climatic and physical conditions with meteorological data only. The proposed methodology consists of a systematic sensitivity analysis using the existing complementary methods. This work used 34 global FLUXNET sites where eddy covariance (EC) fluxes of ET are available for validation. A total of 33 alternative model variations on the original complementary methods were proposed. Further analysis using statistical methods and simplified climatic class definitions produced one distinctly improved GG-model-based alternative. The proposed model produced a single-step ET formulation with results equal to or better than those of recent studies using data-intensive, classical methods. Average root mean square error (RMSE), mean absolute bias (BIAS), and R2 (coefficient of determination) across the 34 global sites were 20.57 mm month-1, 10.55 mm month-1, and 0.64, respectively. The proposed model represents a step forward toward predicting ET in large river basins with limited data and no calibration requirement.
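    The evaluation statistics quoted above can be reproduced with a short sketch. This is a generic illustration, not the study's code: the pairing of observed eddy-covariance ET with modeled monthly ET is assumed, and BIAS is computed here as a mean absolute difference, which is an assumption about the study's exact definition.

```python
import numpy as np

def evaluate_et(observed, predicted):
    """RMSE, mean absolute bias, and R^2 for monthly ET series (mm/month)."""
    obs = np.asarray(observed, dtype=float)
    pred = np.asarray(predicted, dtype=float)
    rmse = float(np.sqrt(np.mean((pred - obs) ** 2)))
    bias = float(np.mean(np.abs(pred - obs)))   # absolute bias (assumed)
    ss_res = np.sum((obs - pred) ** 2)          # residual sum of squares
    ss_tot = np.sum((obs - obs.mean()) ** 2)    # total sum of squares
    r2 = float(1.0 - ss_res / ss_tot)
    return rmse, bias, r2
```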

  14. 10 CFR 503.32 - Lack of alternate fuel supply at a cost which does not substantially exceed the cost of using...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... reliable supply of an alternate fuel for use as a primary energy source of the quality and quantity... 10 Energy 4 2012-01-01 2012-01-01 false Lack of alternate fuel supply at a cost which does not substantially exceed the cost of using imported petroleum. 503.32 Section 503.32 Energy DEPARTMENT OF ENERGY...

  15. 10 CFR 503.32 - Lack of alternate fuel supply at a cost which does not substantially exceed the cost of using...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... reliable supply of an alternate fuel for use as a primary energy source of the quality and quantity... 10 Energy 4 2014-01-01 2014-01-01 false Lack of alternate fuel supply at a cost which does not substantially exceed the cost of using imported petroleum. 503.32 Section 503.32 Energy DEPARTMENT OF ENERGY...

  16. 10 CFR 503.32 - Lack of alternate fuel supply at a cost which does not substantially exceed the cost of using...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... reliable supply of an alternate fuel for use as a primary energy source of the quality and quantity... 10 Energy 4 2013-01-01 2013-01-01 false Lack of alternate fuel supply at a cost which does not substantially exceed the cost of using imported petroleum. 503.32 Section 503.32 Energy DEPARTMENT OF ENERGY...

  17. Distributed capillary adiabatic tissue homogeneity model in parametric multi-channel blind AIF estimation using DCE-MRI.

    PubMed

    Kratochvíla, Jiří; Jiřík, Radovan; Bartoš, Michal; Standara, Michal; Starčuk, Zenon; Taxt, Torfinn

    2016-03-01

    One of the main challenges in quantitative dynamic contrast-enhanced (DCE) MRI is estimation of the arterial input function (AIF). Usually, the signal from a single artery (ignoring contrast dispersion, partial volume effects and flow artifacts) or a population average of such signals (also ignoring variability between patients) is used. Multi-channel blind deconvolution is an alternative approach avoiding most of these problems. The AIF is estimated directly from the measured tracer concentration curves in several tissues. This contribution extends the published methods of multi-channel blind deconvolution by applying a more realistic model of the impulse residue function, the distributed capillary adiabatic tissue homogeneity model (DCATH). In addition, an alternative AIF model is used and several AIF-scaling methods are tested. The proposed method is evaluated on synthetic data with respect to the number of tissue regions and to the signal-to-noise ratio. Evaluation on clinical data (renal cell carcinoma patients before and after the beginning of the treatment) gave consistent results. An initial evaluation on clinical data indicates more reliable and less noise sensitive perfusion parameter estimates. Blind multi-channel deconvolution using the DCATH model might be a method of choice for AIF estimation in a clinical setup. © 2015 Wiley Periodicals, Inc.

  18. Random-effects meta-analysis: the number of studies matters.

    PubMed

    Guolo, Annamaria; Varin, Cristiano

    2017-06-01

    This paper investigates the impact of the number of studies on meta-analysis and meta-regression within the random-effects model framework. It is frequently neglected that inference in random-effects models requires a substantial number of studies included in meta-analysis to guarantee reliable conclusions. Several authors warn about the risk of inaccurate results of the traditional DerSimonian and Laird approach especially in the common case of meta-analysis involving a limited number of studies. This paper presents a selection of likelihood and non-likelihood methods for inference in meta-analysis proposed to overcome the limitations of the DerSimonian and Laird procedure, with a focus on the effect of the number of studies. The applicability and the performance of the methods are investigated in terms of Type I error rates and empirical power to detect effects, according to scenarios of practical interest. Simulation studies and applications to real meta-analyses highlight that it is not possible to identify an approach uniformly superior to alternatives. The overall recommendation is to avoid the DerSimonian and Laird method when the number of meta-analysis studies is modest and prefer a more comprehensive procedure that compares alternative inferential approaches. R code for meta-analysis according to all of the inferential methods examined in the paper is provided.
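    For context on the procedure this abstract critiques, a minimal sketch of the classic DerSimonian and Laird method-of-moments estimator follows. The paper itself supplies R code; this Python version is illustrative only and omits the likelihood-based alternatives the authors recommend for small numbers of studies.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooling: method-of-moments
    estimate of the between-study variance tau^2, then
    inverse-variance weighting with (v_i + tau^2)."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                           # fixed-effect weights
    mu_fe = np.sum(w * y) / np.sum(w)     # fixed-effect pooled estimate
    q = np.sum(w * (y - mu_fe) ** 2)      # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)   # truncated at zero
    w_re = 1.0 / (v + tau2)               # random-effects weights
    mu_re = float(np.sum(w_re * y) / np.sum(w_re))
    se_re = float(np.sqrt(1.0 / np.sum(w_re)))
    return mu_re, se_re, tau2
```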

  19. 30 CFR 285.527 - May I demonstrate financial strength and reliability to meet the financial assurance requirement...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, REGULATION, AND ENFORCEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE RENEWABLE ENERGY ALTERNATE USES OF EXISTING FACILITIES ON THE OUTER CONTINENTAL SHELF... operation and generation of renewable energy on the OCS or onshore; (3) Evidence that shows reliability in...

  20. Value-Added Models for Teacher Preparation Programs: Validity and Reliability Threats, and a Manageable Alternative

    ERIC Educational Resources Information Center

    Brady, Michael P.; Heiser, Lawrence A.; McCormick, Jazarae K.; Forgan, James

    2016-01-01

    High-stakes standardized student assessments are increasingly used in value-added evaluation models to connect teacher performance to P-12 student learning. These assessments are also being used to evaluate teacher preparation programs, despite validity and reliability threats. A more rational model linking student performance to candidates who…

  1. Reliability, Compliance, and Security in Web-Based Course Assessments

    ERIC Educational Resources Information Center

    Bonham, Scott

    2008-01-01

    Pre- and postcourse assessment has become a very important tool for education research in physics and other areas. The web offers an attractive alternative to in-class paper administration, but concerns about web-based administration include reliability due to changes in medium, student compliance rates, and test security, both question leakage…

  2. Evaluation of General Classes of Reliability Estimators Often Used in Statistical Analyses of Quasi-Experimental Designs

    NASA Astrophysics Data System (ADS)

    Saini, K. K.; Sehgal, R. K.; Sethi, B. L.

    2008-10-01

    In this paper, major reliability estimators are analyzed and their comparative results are discussed; their strengths and weaknesses are evaluated in this case study. Each of the reliability estimators has certain advantages and disadvantages. Inter-rater reliability is one of the best ways to estimate reliability when the measure is an observation; however, it requires multiple raters or observers. As an alternative, one can look at the correlation of ratings of the same single observer repeated on two different occasions. Each of the reliability estimators will give a different value for reliability. In general, the test-retest and inter-rater reliability estimates will be lower in value than the parallel-forms and internal-consistency ones, because they involve measuring at different times or with different raters. This matters because reliability estimates are often used in statistical analyses of quasi-experimental designs.

  3. Parameter estimation for the 4-parameter Asymmetric Exponential Power distribution by the method of L-moments using R

    USGS Publications Warehouse

    Asquith, William H.

    2014-01-01

    The implementation characteristics of two method of L-moments (MLM) algorithms for parameter estimation of the 4-parameter Asymmetric Exponential Power (AEP4) distribution are studied using the R environment for statistical computing. The objective is to validate the algorithms for general application of the AEP4 using R. An algorithm was introduced in the original study of the L-moments for the AEP4. A second or alternative algorithm is shown to have a larger L-moment-parameter domain than the original. The alternative algorithm is shown to provide reliable parameter production and recovery of L-moments from fitted parameters. A proposal is made for AEP4 implementation in conjunction with the 4-parameter Kappa distribution to create a mixed-distribution framework encompassing the joint L-skew and L-kurtosis domains. The example application provides a demonstration of pertinent algorithms with L-moment statistics and two 4-parameter distributions (AEP4 and the Generalized Lambda) for MLM fitting to a modestly asymmetric and heavy-tailed dataset using R.
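    The study works in R; as background, the first two sample L-moments that underlie any MLM fit can be sketched in a few lines using Hosking's unbiased probability-weighted-moment estimators (the AEP4-specific parameter solution is far more involved and is not shown).

```python
import numpy as np

def sample_l_moments(data):
    """First two sample L-moments via unbiased probability-weighted
    moments: lambda_1 = b0 (location), lambda_2 = 2*b1 - b0 (scale)."""
    x = np.sort(np.asarray(data, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)                   # ranks of the sorted sample
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n    # first probability-weighted moment
    return float(b0), float(2.0 * b1 - b0)
```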

  4. Sensitivity of two noninvasive blood pressure measurement techniques compared to telemetry in cynomolgus monkeys and beagle dogs.

    PubMed

    Mitchell, Andrea Z; McMahon, Carrie; Beck, Tom W; Sarazan, R Dustan

    2010-01-01

    Animals are commonly used in toxicological research for the evaluation of drug effects on the cardiovascular system. Accurate and reproducible determination of blood pressure (BP) in conscious, manually restrained monkeys and dogs is a challenge with current non-invasive cuff techniques. The High Definition Oscillometry (HDO) technique enables real-time measurements with immediate visual feedback on data validity via a PC screen. HDO measurements, at approximately 8 to 15 s, are considerably faster than conventional cuff methods, which can take several minutes. HDO Memo Diagnostic Model Science and Cardell BP Monitor Model 9401 measurements were compared for accuracy and reliability with simultaneously recorded direct blood pressure data captured via radiotelemetry. Six monkeys and six dogs implanted with DSI PCT telemetry transmitters were used; BP data were collected by all methods under manual restraint and compared. Measurements were performed with HDO and Cardell in the presence of a BP-lowering drug (hexamethonium bromide). Systolic, diastolic, mean arterial pressure, and pulse rate were determined before, during, and following administration of up to 10 mg/kg hexamethonium via slow intravenous bolus injection. Drug-induced hemodynamic changes could be detected in monkeys and dogs with the HDO method but only in dogs with the Cardell method. Correlation coefficients were generally higher for HDO versus telemetry than for Cardell versus telemetry comparisons, indicating that this novel, non-invasive technique produces reliable blood pressure data and is able to detect drug-induced hemodynamic changes. HDO provides an alternative to invasive telemetry surgery for obtaining reliable hemodynamic data in animal models for cardiovascular research when invasive techniques are not warranted. Copyright 2010 Elsevier Inc. All rights reserved.

  5. Electronic Quality of Life Assessment Using Computer-Adaptive Testing

    PubMed Central

    2016-01-01

    Background Quality of life (QoL) questionnaires are desirable for clinical practice but can be time-consuming to administer and interpret, making their widespread adoption difficult. Objective Our aim was to assess the performance of the World Health Organization Quality of Life (WHOQOL)-100 questionnaire as four item banks to facilitate adaptive testing using simulated computer adaptive tests (CATs) for physical, psychological, social, and environmental QoL. Methods We used data from the UK WHOQOL-100 questionnaire (N=320) to calibrate item banks using item response theory, which included psychometric assessments of differential item functioning, local dependency, unidimensionality, and reliability. We simulated CATs to assess the number of items administered before prespecified levels of reliability was met. Results The item banks (40 items) all displayed good model fit (P>.01) and were unidimensional (fewer than 5% of t tests significant), reliable (Person Separation Index>.70), and free from differential item functioning (no significant analysis of variance interaction) or local dependency (residual correlations < +.20). When matched for reliability, the item banks were between 45% and 75% shorter than paper-based WHOQOL measures. Across the four domains, a high standard of reliability (alpha>.90) could be gained with a median of 9 items. Conclusions Using CAT, simulated assessments were as reliable as paper-based forms of the WHOQOL with a fraction of the number of items. These properties suggest that these item banks are suitable for computerized adaptive assessment. These item banks have the potential for international development using existing alternative language versions of the WHOQOL items. PMID:27694100
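    The core step of the simulated CATs described above is selecting, at each turn, the unadministered item that is most informative at the current ability estimate. The sketch below assumes a two-parameter logistic (2PL) item model and maximum-information selection, which are common CAT choices; the study's actual IRT calibration of the WHOQOL banks is not reproduced here.

```python
import math

def item_information(theta, a, b):
    """Fisher information of a 2PL item (discrimination a, difficulty b)
    at ability level theta: a^2 * p * (1 - p)."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)

def select_next_item(theta, items, administered):
    """Maximum-information item selection over unadministered items;
    items is a list of (a, b) tuples, administered a set of indices."""
    candidates = [i for i in range(len(items)) if i not in administered]
    return max(candidates, key=lambda i: item_information(theta, *items[i]))
```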

  6. Universal first-order reliability concept applied to semistatic structures

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1994-01-01

    A reliability design concept was developed for semistatic structures which combines the prevailing deterministic method with the first-order reliability method. The proposed method surmounts deterministic deficiencies in providing uniformly reliable structures and improved safety audits. It supports risk analyses and a reliability selection criterion. The method provides a reliability design factor, derived from the reliability criterion, which is analogous to the current safety factor for sizing structures and verifying reliability response. The universal first-order reliability method should also be applicable to the semistatic structures of air and surface vehicles.
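    The first-order reliability method referenced here reduces, in the simplest textbook case of independent normal resistance R and load S, to a single index. This sketch shows that standard formulation only; the report's specific design-factor derivation is not reproduced.

```python
import math

def reliability_index(mu_r, sigma_r, mu_s, sigma_s):
    """First-order reliability index for independent normal resistance R
    and load S: beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2)."""
    return (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)

def failure_probability(beta):
    """Failure probability Phi(-beta) for a normal safety margin,
    computed via the complementary error function."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))
```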

  7. Universal first-order reliability concept applied to semistatic structures

    NASA Astrophysics Data System (ADS)

    Verderaime, V.

    1994-07-01

    A reliability design concept was developed for semistatic structures which combines the prevailing deterministic method with the first-order reliability method. The proposed method surmounts deterministic deficiencies in providing uniformly reliable structures and improved safety audits. It supports risk analyses and a reliability selection criterion. The method provides a reliability design factor, derived from the reliability criterion, which is analogous to the current safety factor for sizing structures and verifying reliability response. The universal first-order reliability method should also be applicable to the semistatic structures of air and surface vehicles.

  8. Allowable SEM noise for unbiased LER measurement

    NASA Astrophysics Data System (ADS)

    Papavieros, George; Constantoudis, Vassilios; Gogolides, Evangelos

    2018-03-01

    Recently, a novel method for the calculation of unbiased Line Edge Roughness (LER) based on Power Spectral Density (PSD) analysis has been proposed. In this paper, an alternative method is first discussed and investigated, one that utilizes the Height-Height Correlation Function (HHCF) of edges. The HHCF-based method enables unbiased determination of the whole triplet of LER parameters: besides the rms value, the correlation length and the roughness exponent. The key to both methods is the sensitivity of the PSD and HHCF to noise at high frequencies and short distances, respectively. Secondly, we elaborate a testbed of synthesized SEM images with controlled LER and noise to justify the effectiveness of the proposed unbiased methods. Our main objective is to find the boundaries, with respect to noise levels and roughness characteristics, within which the method remains reliable, i.e., the maximum amount of noise for which the output results agree with the controlled, known inputs. At the same time, we also set the extremes of the roughness parameters for which the methods retain their accuracy.
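    The HHCF itself is straightforward to compute from a detected edge profile. The sketch below gives only the standard definition; the unbiasing step (separating the noise contribution, which for uncorrelated edge-detection noise appears as a roughly constant offset at short distances) is the paper's contribution and is not reproduced here.

```python
import numpy as np

def hhcf(edge):
    """Height-height correlation function of a detected line edge:
    H(r) = <(h(x + r) - h(x))^2>, averaged over all positions x,
    for lags r = 1 .. n//2 - 1 (in pixels)."""
    h = np.asarray(edge, dtype=float)
    n = len(h)
    return np.array([np.mean((h[r:] - h[:n - r]) ** 2)
                     for r in range(1, n // 2)])
```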

  9. Determination of the post mortem interval in skeletal remains by the comparative use of different physico-chemical methods: Are they reliable as an alternative to 14C?

    PubMed

    Amadasi, Alberto; Cappella, Annalisa; Cattaneo, Cristina; Cofrancesco, Pacifico; Cucca, Lucia; Merli, Daniele; Milanese, Chiara; Pinto, Andrea; Profumo, Antonella; Scarpulla, Valentina; Sguazza, Emanuela

    2017-05-01

    The determination of the post-mortem interval (PMI) of skeletal remains is a challenging aspect of the forensic field. Previous studies focused their attention on different macroscopic and morphological aspects, but a thorough and complete evaluation of the potential of chemical and physical analyses in this field of research has not been performed. In addition to the luminol test and Oxford histology index (OHI) reported in a recent paper, widely available and accessible methods based on the physical aspect and chemical characteristics of skeletal remains have been investigated as potential alternatives to 14C dating. The investigation was performed on a total of 24 archeological and forensic bone samples with known PMI, with inductively coupled plasma optical emission spectrometry (ICP-OES), inductively coupled plasma quadrupole mass spectrometry (ICP-MS), Fourier transform infrared (FT-IR) spectroscopy, energy dispersive X-ray analysis (EDX), powder X-ray diffraction analysis (XRPD) and scanning electron microscopy (SEM). Finally, the feasibility of such alternative methods was discussed. Some results, such as the carbonates/phosphates ratio from FT-IR, the amounts of organic and inorganic matter by EDX, crystallite sizes from XRPD, and surface morphology obtained by SEM, showed significant trends with PMI. However, from a chemical point of view, cut-off values and gold-standard methods still present challenges; rather, different techniques used together can provide useful information toward the assessment of the PMI of skeletal remains. It is nevertheless clear that in a hypothetical flowchart these methods may be placed at practically the same level, and a choice should always consider the evaluation of results by each technique, execution times, and the cost/benefit relationship. Copyright © 2017 Elsevier GmbH. All rights reserved.

  10. Interference-free spectrofluorometric quantification of aristolochic acid I and aristololactam I in five Chinese herbal medicines using chemical derivatization enhancement and second-order calibration methods

    NASA Astrophysics Data System (ADS)

    Hu, Yong; Wu, Hai-Long; Yin, Xiao-Li; Gu, Hui-Wen; Xiao, Rong; Wang, Li; Fang, Huan; Yu, Ru-Qin

    2017-03-01

    A rapid interference-free spectrofluorometric method, combining excitation-emission matrix fluorescence with second-order calibration methods based on the alternating penalty trilinear decomposition (APTLD) and self-weighted alternating trilinear decomposition (SWATLD) algorithms, was proposed for the simultaneous determination of nephrotoxic aristolochic acid I (AA-I) and aristololactam I (AL-I) in five Chinese herbal medicines. The method was based on a chemical derivatization that converts the non-fluorescent AA-I to highly fluorescent AL-I, achieving sensitive, simultaneous quantification of the analytes. The variables of the derivatization reaction, which was conducted using zinc powder in an acetous aqueous methanol solution, were studied and optimized for the best quantification of AA-I and AL-I. Satisfactory results for AA-I and AL-I in the spiked recovery assay were achieved, with average recoveries in the range of 100.4-103.8% and RMSEPs < 0.78 ng mL-1, which validate the accuracy and reliability of the proposed method. The contents of AA-I and AL-I in the five herbal medicines obtained from the proposed method were also in good accordance with those of the validated LC-MS/MS method. Owing to the sensitive fluorescence detection, the limits of detection (LODs) of AA-I and AL-I for the proposed method compare favorably with those of the LC-MS/MS method, with LODs < 0.35 and 0.29 ng mL-1, respectively. The proposed strategy based on the APTLD and SWATLD algorithms, by virtue of the "second-order advantage", can be considered an attractive and green alternative for the quantification of AA-I and AL-I in complex herbal medicine matrices without any prior separation or clean-up steps.

  11. Reverse Transrectal Stapling Technique Using the EEA Stapler: An Alternative Approach in Difficult Reversal of Hartmann’s Procedure

    PubMed Central

    Zachariah, Sanoop K.

    2010-01-01

    The introduction of circular end-to-end stapling devices (CEEA or EEA staplers) into colorectal surgery has revolutionised anastomotic techniques. The EEA stapler is generally regarded as an instrument that is safe, reliable, and simple to operate. Despite its popularity, very little information is available regarding the technical difficulties encountered during surgery. The routine technique for performing an end-to-end circular colonic anastomosis is to introduce the instrument distally through the anus (transrectal/transanal approach) and attach it to the anvil, which is purse-stringed at the distal end of the proximal bowel to be anastomosed. Two cases of reversal of Hartmann's procedure for perforated diverticulitis are described in the present study, in which difficulty was experienced while using the EEA stapler in the routine manner. Hence, the alternative reverse technique that was used is presented. PMID:22091338

  12. Surrogate measures: A proposed alternative in human factors assessment of operational measures of performance

    NASA Technical Reports Server (NTRS)

    Kennedy, Robert S.; Lane, Norman E.; Kuntz, Lois A.

    1987-01-01

    Surrogate measures are proposed as an alternative to direct assessment of operational performance for purposes of screening agents who may have to work under unusual stresses or in exotic environments. Such measures are particularly proposed when the surrogate can be empirically validated against the operational criterion. The focus is on cognitive (or throughput) performances in humans as opposed to sensory (input) or motor (output) measures, but the methods should be applicable for development of batteries which will tap input/output functions. A menu of performance tasks is under development for implementation on a battery-operated portable microcomputer, with 21 tests currently available. The tasks are reliable and become stable in minimum amounts of time; appear sensitive to some agents; comprise constructs related to actual job tasks; and are easily administered in most environments. Implications for human factors engineering studies in environmental stress are discussed.

  13. A forward view on reliable computers for flight control

    NASA Technical Reports Server (NTRS)

    Goldberg, J.; Wensley, J. H.

    1976-01-01

    The requirements for fault-tolerant computers for flight control of commercial aircraft are examined; it is concluded that the reliability requirements far exceed those typically quoted for space missions. Examination of circuit technology and alternative computer architectures indicates that the desired reliability can be achieved with several different computer structures, though there are obvious advantages to those that are more economic, more reliable, and, very importantly, more certifiable as to fault tolerance. Progress in this field is expected to bring about better computer systems that are more rigorously designed and analyzed even though computational requirements are expected to increase significantly.

  14. Adaptive particle swarm optimization for optimal orbital elements of binary stars

    NASA Astrophysics Data System (ADS)

    Attia, Abdel-Fattah

    2016-12-01

    The paper presents an adaptive particle swarm optimization (APSO) as an alternative method to determine the optimal orbital elements of the star η Bootis of MK type G0 IV. The proposed algorithm transforms the problem of finding periodic orbits into the problem of detecting the global minimizers of a function, in order to obtain a best fit of the Keplerian and phase curves. The experimental results demonstrate that the proposed APSO approach is generally more accurate than standard particle swarm optimization (PSO) and other published optimization algorithms, in terms of solution accuracy, convergence speed, and algorithm reliability.
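    For reference, the standard PSO that the adaptive variant builds on can be sketched in a few lines for a 1-D objective. The adaptation of the inertia and acceleration coefficients during the run, which is the paper's contribution, is not shown; the parameter values below are conventional defaults, not the paper's.

```python
import numpy as np

def pso_minimize(f, bounds, n_particles=30, n_iter=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal standard PSO for a 1-D objective f on [lo, hi]."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, n_particles)          # positions
    v = np.zeros(n_particles)                     # velocities
    pbest = x.copy()                              # personal bests
    pbest_val = np.array([f(xi) for xi in x])
    gbest = pbest[pbest_val.argmin()]             # global best
    for _ in range(n_iter):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(xi) for xi in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()]
    return float(gbest), float(pbest_val.min())
```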

  15. A new approach for cancelable iris recognition

    NASA Astrophysics Data System (ADS)

    Yang, Kai; Sui, Yan; Zhou, Zhi; Du, Yingzi; Zou, Xukai

    2010-04-01

    The iris is a stable and reliable biometric for positive human identification. However, the traditional iris recognition scheme raises several privacy concerns. One's iris pattern is permanently bound with him and cannot be changed. Hence, once it is stolen, this biometric is lost forever as well as all the applications where this biometric is used. Thus, new methods are desirable to secure the original pattern and ensure its revocability and alternatives when compromised. In this paper, we propose a novel scheme which incorporates iris features, non-invertible transformation and data encryption to achieve "cancelability" and at the same time increases iris recognition accuracy.

  16. The gas chromatographic determination of volatile fatty acids in wastewater samples: evaluation of experimental biases in direct injection method against thermal desorption method.

    PubMed

    Ullah, Md Ahsan; Kim, Ki-Hyun; Szulejko, Jan E; Cho, Jinwoo

    2014-04-11

    The production of short-chained volatile fatty acids (VFAs) by the anaerobic bacterial digestion of sewage (wastewater) affords an excellent opportunity to produce greener, viable bio-energy fuels (e.g., via microbial fuel cells). VFAs in wastewater (sewage) samples are commonly quantified through direct injection (DI) into a gas chromatograph with a flame ionization detector (GC-FID). In this study, the reliability of VFA analysis by the DI-GC method has been examined against a thermal desorption (TD-GC) method. The results indicate that the VFA concentrations determined from an aliquot of each wastewater sample by the DI-GC method were generally underestimated, e.g., reductions of 7% (acetic acid) to 93.4% (hexanoic acid) relative to the TD-GC method. The observed differences between the two methods suggest that matrix effects likely play an important role in giving rise to the negative biases in DI-GC analysis. To further explore this possibility, an ancillary experiment was performed to examine the bias patterns of three DI-GC approaches. For instance, the results of the standard addition (SA) method confirm the definite role of the matrix effect when analyzing wastewater samples by DI-GC. More importantly, the biases tend to increase systematically with increasing molecular weight and decreasing VFA concentrations. As such, the DI-GC method, if applied to the analysis of samples with a complicated matrix, needs thorough validation to improve the reliability of data acquisition. Copyright © 2014 Elsevier B.V. All rights reserved.
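    The standard addition (SA) approach mentioned above has a simple generic form: the sample is spiked with known analyte amounts, response is regressed on added concentration, and the unknown concentration is read off as the negative x-intercept. A minimal sketch, assuming a linear detector response:

```python
import numpy as np

def standard_addition(added_conc, responses):
    """Standard-addition quantification: fit response = slope * c + intercept
    over the spiked concentrations; the unknown concentration equals
    intercept / slope (the magnitude of the x-intercept)."""
    x = np.asarray(added_conc, dtype=float)
    y = np.asarray(responses, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)   # highest degree first
    return float(intercept / slope)
```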

  17. Hungry pigeons make suboptimal choices, less hungry pigeons do not.

    PubMed

    Laude, Jennifer R; Pattison, Kristina F; Zentall, Thomas R

    2012-10-01

    Hungry animals will often choose suboptimally by being attracted to reliable signals for food that occur infrequently (they gamble) over less reliable signals for food that occur more often. That is, pigeons prefer an option that 50 % of the time provides them with a reliable signal for the appearance of food but 50 % of the time provides them with a reliable signal for the absence of food (overall 50 % reinforcement) over an alternative that always provides them with a signal for the appearance of food 75 % of the time (overall 75 % reinforcement). The pigeons appear to choose impulsively for the possibility of obtaining the reliable signal for reinforcement. There is evidence that greater hunger is associated with greater impulsivity. We tested the hypothesis that if the pigeons were less hungry, they would be less impulsive and, thus, would choose more optimally (i.e., on the basis of the overall probability of reinforcement). We found that hungry pigeons choose the 50 % reinforcement alternative suboptimally but less hungry pigeons prefer the more optimal 75 % reinforcement. Paradoxically, pigeons that needed the food more received less of it. These findings have implications for how level of motivation may also affect human suboptimal choice (e.g., purchase of lottery tickets and playing slot machines).

  18. Evaluating the safety risk of roadside features for rural two-lane roads using reliability analysis.

    PubMed

    Jalayer, Mohammad; Zhou, Huaguo

    2016-08-01

    The severity of roadway departure crashes mainly depends on the roadside features, including the sideslope, fixed-object density, offset from fixed objects, and shoulder width. Common engineering countermeasures to improve roadside safety include cross section improvements, hazard removal or modification, and delineation. It is not always feasible to maintain an object-free and smooth roadside clear zone as recommended in design guidelines. Currently, clear zone width and sideslope are used to determine roadside hazard ratings (RHRs) to quantify the roadside safety of rural two-lane roadways on a seven-point pictorial scale. Since these two variables are continuous and can be treated as random, probabilistic analysis can be applied as an alternative method to address existing uncertainties. Specifically, using reliability analysis, it is possible to quantify roadside safety levels by treating the clear zone width and sideslope as two continuous, rather than discrete, variables. The objective of this manuscript is to present a new approach for defining the reliability index for measuring roadside safety on rural two-lane roads. To evaluate the proposed approach, we gathered five years (2009-2013) of Illinois run-off-road (ROR) crash data and identified the roadside features (i.e., clear zone widths and sideslopes) of 4500 300-ft roadway segments. Based on the obtained results, we confirm that reliability indices can serve as indicators to gauge safety levels, such that the greater the reliability index value, the lower the ROR crash rate. Copyright © 2016 Elsevier Ltd. All rights reserved.
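    The idea of treating clear zone width as a continuous random variable can be illustrated with a toy calculation. This sketch assumes a normal distribution for the width and a single required minimum, neither of which is specified in the abstract; the paper's actual limit-state formulation also involves sideslope.

```python
import numpy as np

def clear_zone_failure(mean_width, sd_width, required_width,
                       n=100_000, seed=1):
    """Monte Carlo estimate of P(clear-zone width < required minimum),
    alongside the matching first-order reliability index beta:
    larger beta implies a safer roadside."""
    rng = np.random.default_rng(seed)
    widths = rng.normal(mean_width, sd_width, n)   # sampled widths (ft)
    pf = float(np.mean(widths < required_width))   # failure probability
    beta = (mean_width - required_width) / sd_width
    return pf, beta
```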

  19. Reliability Analysis of a Green Roof Under Different Storm Scenarios

    NASA Astrophysics Data System (ADS)

    William, R. K.; Stillwell, A. S.

    2015-12-01

    Urban environments continue to face the challenges of localized flooding and decreased water quality brought on by the increasing amount of impervious area in the built environment. Green infrastructure provides an alternative to conventional storm sewer design by using natural processes to filter and store stormwater at its source. However, there are currently few consistent standards available in North America to ensure that installed green infrastructure is performing as expected. This analysis offers a method for characterizing green roof failure using a visual aid commonly used in earthquake engineering: fragility curves. We adapted the concept of the fragility curve based on the efficiency in runoff reduction provided by a green roof, compared to a conventional roof, under different storm scenarios. We used the 2D distributed surface water-groundwater coupled model MIKE SHE to model the impact that a real green roof might have on runoff in different storm events, and then employed a multiple regression analysis to generate an algebraic demand model, which was input into the Matlab-based reliability analysis model FERUM to calculate the probability of failure. The use of reliability analysis as a part of green infrastructure design code can provide insights into green roof weaknesses and areas for improvement. It also supports the design of code that is more resilient than current standards and is easily testable for failure. Finally, the understanding of reliability of a single green roof module under different scenarios can support holistic testing of system reliability.
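    A fragility curve of the kind described can be sketched as a lognormal CDF of storm intensity, giving the probability of unacceptable performance at each intensity level. The median capacity and dispersion below are illustrative, not values from the study:

```python
import numpy as np
from scipy.stats import norm

def fragility(intensity, median, beta):
    """Lognormal fragility curve: probability that runoff-reduction
    performance falls below an acceptable threshold at a given storm
    intensity. 'median' is the intensity with 50% failure probability;
    'beta' is the lognormal dispersion (both illustrative)."""
    return norm.cdf(np.log(np.asarray(intensity, float) / median) / beta)

# Hypothetical storm intensities (e.g., mm/h for several return periods)
storms = [10, 25, 50, 100]
probs = fragility(storms, median=50.0, beta=0.5)
```

    By construction the curve passes through 0.5 at the median intensity and rises monotonically with storm severity.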

  20. Diagnosis of cystic fibrosis with chloride meter (Sherwood M926S chloride analyzer®) and sweat test analysis system (CFΔ collection system®) compared to the Gibson Cooke method.

    PubMed

    Emiralioğlu, Nagehan; Özçelik, Uğur; Yalçın, Ebru; Doğru, Deniz; Kiper, Nural

    2016-01-01

    Sweat test with Gibson Cooke (GC) method is the diagnostic gold standard for cystic fibrosis (CF). Recently, alternative methods have been introduced to simplify both the collection and analysis of sweat samples. Our aim was to compare sweat chloride values obtained by GC method with other sweat test methods in patients diagnosed with CF and patients in whom CF diagnosis had been ruled out. We wanted to determine if the other sweat test methods could reliably identify patients with CF and differentiate them from healthy subjects. Chloride concentration was measured with GC method, chloride meter and sweat test analysis system; also conductivity was determined with sweat test analysis system. Forty-eight patients with CF and 82 patients without CF underwent the sweat test, showing median sweat chloride values of 98.9 mEq/L with the GC method, 101 mmol/L with the chloride meter, and 87.8 mmol/L with the sweat test analysis system. In the non-CF group, median sweat chloride values were 16.8 mEq/L with GC method, 10.5 mmol/L with chloride meter, and 15.6 mmol/L with sweat test analysis system. Median conductivity value was 107.3 mmol/L in the CF group and 32.1 mmol/L in the non-CF group. There was a strong, statistically significant positive correlation between the GC method and the other sweat test methods (r=0.85) in all subjects. Sweat chloride concentration and conductivity by other sweat test methods highly correlate with the GC method. We think that the other sweat test equipment can be used as reliably as the classic GC method to diagnose or exclude CF.
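    Method agreement of this kind is commonly summarized with a Pearson correlation on paired measurements. The synthetic data below only mimic the two-cluster (CF vs non-CF) structure of such a comparison; the numbers are not the study's:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(5)
# Synthetic paired chloride measurements: 48 'CF-like' and 82
# 'non-CF-like' subjects measured by a reference method (gc) and a
# correlated alternative method (alt). All values are illustrative.
gc = np.concatenate([rng.normal(99, 15, 48), rng.normal(17, 8, 82)])
alt = gc + rng.normal(0, 8, gc.size)     # alternative = reference + noise
r, p = pearsonr(gc, alt)
```

    With clearly separated diagnostic groups, the between-group spread dominates the measurement noise, so the correlation is high; a Bland-Altman analysis would add the agreement information that a correlation alone cannot give.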

  1. 8760-Based Method for Representing Variable Generation Capacity Value in Capacity Expansion Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frew, Bethany A

    Capacity expansion models (CEMs) are widely used to evaluate the least-cost portfolio of electricity generators, transmission, and storage needed to reliably serve load over many years or decades. CEMs can be computationally complex and are often forced to estimate key parameters using simplified methods to achieve acceptable solve times or for other reasons. In this paper, we discuss one of these parameters -- capacity value (CV). We first provide a high-level motivation for and overview of CV. We next describe existing modeling simplifications and an alternate approach for estimating CV that utilizes hourly '8760' data of load and variable generation (VG) resources. We then apply this 8760 method to an established CEM, the National Renewable Energy Laboratory's (NREL's) Regional Energy Deployment System (ReEDS) model (Eurek et al. 2016). While this alternative approach for CV is not itself novel, it contributes to the broader CEM community by (1) demonstrating how a simplified 8760 hourly method, which can be easily implemented in other power sector models when data are available, more accurately captures CV trends than a statistical method within the ReEDS CEM, and (2) providing a flexible modeling framework from which other 8760-based system elements (e.g., demand response, storage, and transmission) can be added to further capture important dynamic interactions, such as curtailment.
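    A minimal sketch of an 8760-style CV estimate, assuming hourly load and VG generation arrays are available: CV is taken as the fleet's average output, as a fraction of nameplate, over the highest-demand hours. The hour count, profiles, and capacity below are illustrative, not the paper's implementation:

```python
import numpy as np

def capacity_value(load, vg_gen, vg_cap, top_hours=100):
    """Approximate capacity value as average VG output, relative to
    nameplate capacity, during the hours of highest load. A simplified
    sketch of an '8760' hourly method (all inputs illustrative)."""
    idx = np.argsort(load)[-top_hours:]   # indices of the peak-load hours
    return vg_gen[idx].mean() / vg_cap

rng = np.random.default_rng(1)
load = 50 + 10 * rng.random(8760)         # hourly system load, GW
vg_gen = 5 * rng.random(8760)             # hourly wind output, GW
cv = capacity_value(load, vg_gen, vg_cap=5.0)
```

    A fuller treatment would use net load (load minus VG) to capture the interaction between VG output and the hours that remain critical after VG is added.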

  2. Fast and simultaneous determination of 12 polyphenols in apple peel and pulp by using chemometrics-assisted high-performance liquid chromatography with diode array detection.

    PubMed

    Wang, Tong; Wu, Hai-Long; Xie, Li-Xia; Zhu, Li; Liu, Zhi; Sun, Xiao-Dong; Xiao, Rong; Yu, Ru-Qin

    2017-04-01

    In this work, a smart chemometrics-enhanced strategy, high-performance liquid chromatography and diode array detection coupled with a second-order calibration method based on the alternating trilinear decomposition algorithm, was proposed to simultaneously quantify 12 polyphenols in different kinds of apple peel and pulp samples. The proposed strategy proved to be a powerful tool to solve the problems of coelution, unknown interferences, and chromatographic shifts in the process of high-performance liquid chromatography analysis, making possible the determination of 12 polyphenols in complex apple matrices within 10 min under simple elution conditions. The average recoveries with standard deviations, and figures of merit including sensitivity, selectivity, limit of detection, and limit of quantitation, were calculated to validate the accuracy of the proposed method. Compared to the quantitative analysis results from the classic high-performance liquid chromatography method, the statistical and graphical analysis showed that our proposed strategy obtained more reliable results. All results indicated that our proposed method for the quantitative analysis of apple polyphenols is accurate, fast, universal, simple, and green, and it is expected to be developed as an attractive alternative method for the simultaneous determination of multitargeted analytes in complex matrices. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekmekcioglu, Mehmet, E-mail: meceng3584@yahoo.co; Kaya, Tolga; Kahraman, Cengiz

    The use of fuzzy multiple criteria analysis (MCA) in solid waste management has the advantage of rendering subjective and implicit decision making more objective and analytical, with its ability to accommodate both quantitative and qualitative data. In this paper a modified fuzzy TOPSIS methodology is proposed for the selection of an appropriate disposal method and site for municipal solid waste (MSW). Our method is superior to existing methods since it is capable of representing vague qualitative data and presenting all possible results with different degrees of membership. In the first stage of the proposed methodology, a set of criteria comprising cost, reliability, feasibility, pollution and emission levels, and waste and energy recovery is optimized to determine the best MSW disposal method. Landfilling, composting, conventional incineration, and refuse-derived fuel (RDF) combustion are the alternatives considered. The weights of the selection criteria are determined by fuzzy pairwise comparison matrices of the Analytic Hierarchy Process (AHP). It is found that RDF combustion is the best disposal method alternative for Istanbul. In the second stage, the same methodology is used to determine the optimum RDF combustion plant location using adjacent land use, climate, road access and cost as the criteria. The results of this study illustrate the importance of the weights on the various factors in deciding the optimized location, with the best site located in Catalca. A sensitivity analysis is also conducted to monitor how sensitive our model is to changes in the various criteria weights.
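    The core ranking step of TOPSIS can be sketched as follows. This is the plain (crisp) version, whereas the paper's variant uses fuzzy numbers and AHP-derived weights; the score matrix, weights, and criterion directions are invented for illustration:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Crisp TOPSIS: rank alternatives (rows) by closeness to the ideal
    solution. 'benefit' marks criteria where larger is better; the
    others are treated as cost criteria."""
    m = matrix / np.linalg.norm(matrix, axis=0)      # vector-normalize columns
    v = m * weights                                  # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)        # distance to ideal
    d_neg = np.linalg.norm(v - worst, axis=1)        # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                   # closeness coefficient

# Hypothetical scores for landfilling, composting, incineration, RDF
# against cost, reliability, emissions (cost and emissions are to be minimized).
scores = np.array([[7, 5, 8.],
                   [6, 6, 4.],
                   [5, 7, 6.],
                   [4, 8, 3.]])
cc = topsis(scores, weights=np.array([0.3, 0.4, 0.3]),
            benefit=np.array([False, True, False]))
best = int(np.argmax(cc))
```

    In this invented matrix the RDF row dominates on every criterion, so it attains a closeness coefficient of 1 and ranks first, mirroring the paper's conclusion in form only.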

  4. A new web-based framework development for fuzzy multi-criteria group decision-making.

    PubMed

    Hanine, Mohamed; Boutkhoum, Omar; Tikniouine, Abdessadek; Agouti, Tarik

    2016-01-01

    Fuzzy multi-criteria group decision making (FMCGDM) is usually used when a group of decision-makers faces imprecise data or linguistic variables in solving a problem. However, the process involves many methods requiring time-consuming calculations that grow with the number of criteria, alternatives and decision-makers needed to reach the optimal solution. In this study, a web-based FMCGDM framework that offers decision-makers a fast and reliable response service is proposed. The proposed framework includes commonly used tools for multi-criteria decision-making problems such as the fuzzy Delphi, fuzzy AHP and fuzzy TOPSIS methods. The integration of these methods enables taking advantage of each method's strengths while compensating for its weaknesses. Finally, a case study of location selection for landfill waste in Morocco is performed to demonstrate how this framework can facilitate the decision-making process. The results demonstrate that the proposed framework can successfully accomplish the goal of this study.

  5. Polyatomic molecular Dirac-Hartree-Fock calculations with Gaussian basis sets

    NASA Technical Reports Server (NTRS)

    Dyall, Kenneth G.; Faegri, Knut, Jr.; Taylor, Peter R.

    1990-01-01

    Numerical methods have been used successfully in atomic Dirac-Hartree-Fock (DHF) calculations for many years. Some DHF calculations using numerical methods have been done on diatomic molecules, but while these serve a useful purpose for calibration, the computational effort in extending this approach to polyatomic molecules is prohibitive. An alternative more in line with traditional quantum chemistry is to use an analytical basis set expansion of the wave function. This approach fell into disrepute in the early 1980s due to problems with variational collapse and intruder states, but has recently been put on firm theoretical foundations. In particular, the problems of variational collapse are well understood, and prescriptions for avoiding the most serious failures have been developed. Consequently, it is now possible to develop reliable molecular programs using basis set methods. This paper describes such a program and reports results of test calculations to demonstrate the convergence and stability of the method.

  6. Markov and semi-Markov processes as a failure rate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabski, Franciszek

    2016-06-08

    In this paper the reliability function is defined by a stochastic failure rate process with non-negative and right-continuous trajectories. Equations for the conditional reliability functions of an object, under the assumption that the failure rate is a semi-Markov process with an at most countable state space, are derived. An appropriate theorem is presented. The linear systems of equations for the corresponding Laplace transforms allow one to find the reliability functions for the alternating, Poisson and Furry-Yule failure rate processes.
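    For the alternating case, the reliability function R(t) = E[exp(-∫λ(s)ds)] can also be approximated by simulation when the Laplace-transform route is not convenient. The two rate levels and mean sojourn times below are arbitrary illustrative values:

```python
import numpy as np

def reliability_alternating(t, lam=(0.02, 0.2), mean_sojourn=(5.0, 1.0),
                            n_paths=20000, rng=None):
    """Monte Carlo sketch of R(t) = E[exp(-integral of lambda(s) ds)]
    when the failure rate alternates between two levels with
    exponentially distributed sojourn times (parameters illustrative)."""
    rng = rng or np.random.default_rng(0)
    total = 0.0
    for _ in range(n_paths):
        clock, hazard, state = 0.0, 0.0, 0
        while clock < t:
            stay = rng.exponential(mean_sojourn[state])
            dt = min(stay, t - clock)       # truncate the last sojourn at t
            hazard += lam[state] * dt       # accumulate integrated hazard
            clock += dt
            state = 1 - state               # alternate between the two states
        total += np.exp(-hazard)
    return total / n_paths

r5 = reliability_alternating(5.0)
r10 = reliability_alternating(10.0)
```

    As expected of a reliability function, the estimate decreases with t; the analytical Laplace-transform solution of the paper would serve as the exact benchmark.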

  7. New analytical exact solutions of time fractional KdV-KZK equation by Kudryashov methods

    NASA Astrophysics Data System (ADS)

    Saha Ray, S.

    2016-04-01

    In this paper, new exact solutions of the time fractional KdV-Khokhlov-Zabolotskaya-Kuznetsov (KdV-KZK) equation are obtained by the classical Kudryashov method and modified Kudryashov method respectively. For this purpose, the modified Riemann-Liouville derivative is used to convert the nonlinear time fractional KdV-KZK equation into the nonlinear ordinary differential equation. In the present analysis, the classical Kudryashov method and modified Kudryashov method are both used successively to compute the analytical solutions of the time fractional KdV-KZK equation. As a result, new exact solutions involving the symmetrical Fibonacci function, hyperbolic function and exponential function are obtained for the first time. The methods under consideration are reliable and efficient, and can be used as an alternative to establish new exact solutions of different types of fractional differential equations arising from mathematical physics. The obtained results are exhibited graphically in order to demonstrate the efficiencies and applicabilities of these proposed methods of solving the nonlinear time fractional KdV-KZK equation.

  8. Alternative Fuels Data Center: Minnesota Transportation Data for

    Science.gov Websites

    Case studies and videos: a Minnesota school district finds cost savings and cold-weather reliability with propane buses (Jan. 26, 2016); Minneapolis makes EV-charging history; GE showcases innovation in alternative fuel vehicles.

  9. Sweet Nanochemistry: A Fast, Reliable Alternative Synthesis of Yellow Colloidal Silver Nanoparticles Using Benign Reagents

    ERIC Educational Resources Information Center

    Cooke, Jason; Hebert, Dominique; Kelly, Joel A.

    2015-01-01

    This work describes a convenient and reliable laboratory experiment in nanochemistry that is flexible and adaptable to a wide range of educational settings. The rapid preparation of yellow colloidal silver nanoparticles is achieved by glucose reduction of silver nitrate in the presence of starch and sodium citrate in gently boiling water, using…

  10. ScoreRel CI: An Excel Program for Computing Confidence Intervals for Commonly Used Score Reliability Coefficients

    ERIC Educational Resources Information Center

    Barnette, J. Jackson

    2005-01-01

    An Excel program developed to assist researchers in the determination and presentation of confidence intervals around commonly used score reliability coefficients is described. The software includes programs to determine confidence intervals for Cronbach's alpha, Pearson r-based coefficients such as those used in test-retest and alternate forms…
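    For Cronbach's alpha specifically, a common construction (due to Feldt) builds the interval from the F distribution. The sketch below is a textbook version, not the Excel program's actual algorithm, and the simulated item data are illustrative:

```python
import numpy as np
from scipy.stats import f

def cronbach_alpha_ci(scores, conf=0.95):
    """Cronbach's alpha with a Feldt-style confidence interval.
    scores: (n_persons, k_items) array. A common textbook construction;
    verify against a dedicated package before relying on it."""
    n, k = scores.shape
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    alpha = k / (k - 1) * (1 - item_var / total_var)
    df1, df2 = n - 1, (n - 1) * (k - 1)
    a = (1 - conf) / 2
    lower = 1 - (1 - alpha) * f.ppf(1 - a, df1, df2)
    upper = 1 - (1 - alpha) * f.ppf(a, df1, df2)
    return alpha, (lower, upper)

rng = np.random.default_rng(2)
true_score = rng.normal(0, 1, (200, 1))
items = true_score + rng.normal(0, 1, (200, 8))   # 8 noisy parallel items
alpha, (lo, hi) = cronbach_alpha_ci(items)
```

    With parallel items of equal true-score and error variance, the population alpha here is about 0.89, and the interval should bracket the sample estimate.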

  11. Increasing the Reliability of Ability-Achievement Difference Scores: An Example Using the Kaufman Assessment Battery for Children.

    ERIC Educational Resources Information Center

    Caruso, John C.; Witkiewitz, Katie

    2002-01-01

    As an alternative to equally weighted difference scores, examined an orthogonal reliable component analysis (RCA) solution and an oblique principal components analysis (PCA) solution for the standardization sample of the Kaufman Assessment Battery for Children (KABC; A. Kaufman and N. Kaufman, 1983). Discusses the practical implications of the…

  12. Psychological Perspectives on Interrogation.

    PubMed

    Vrij, Aldert; Meissner, Christian A; Fisher, Ronald P; Kassin, Saul M; Morgan, Charles A; Kleinman, Steven M

    2017-11-01

    Proponents of "enhanced interrogation techniques" in the United States have claimed that such methods are necessary for obtaining information from uncooperative terrorism subjects. In the present article, we offer an informed, academic perspective on such claims. Psychological theory and research show that harsh interrogation methods are ineffective. First, they are likely to increase resistance by the subject rather than facilitate cooperation. Second, the threatening and adversarial nature of harsh interrogation is often inimical to the goal of facilitating the retrieval of information from memory and therefore reduces the likelihood that a subject will provide reports that are extensive, detailed, and accurate. Third, harsh interrogation methods make lie detection difficult. Analyzing speech content and eliciting verifiable details are the most reliable cues to assessing credibility; however, to elicit such cues subjects must be encouraged to provide extensive narratives, something that does not occur in harsh interrogations. Evidence is accumulating for the effectiveness of rapport-based information-gathering approaches as an alternative to harsh interrogations. Such approaches promote cooperation, enhance recall of relevant and reliable information, and facilitate assessments of credibility. Given the available evidence that torture is ineffective, why might some laypersons, policymakers, and interrogation personnel support the use of torture? We conclude our review by offering a psychological perspective on this important question.

  13. A "shotgun" method for tracing the birth locations of sheep from flock tags, applied to scrapie surveillance in Great Britain.

    PubMed

    Birch, Colin P D; Del Rio Vilas, Victor J; Chikukwa, Ambrose C

    2010-09-01

    Movement records are often used to identify animal sample provenance by retracing the movements of individuals. Here we present an alternative method, which uses the same identity tags and movement records as are used to retrace movements, but ignores individual movement paths. The first step uses a simple query to identify the most likely birth holding for every identity tag included in a database recording departures from agricultural holdings. The second step rejects a proportion of the birth holding locations to leave a list of birth holding locations that are relatively reliable. The method was used to trace the birth locations of sheep sampled for scrapie in abattoirs, or on farm as fallen stock. Over 82% of the sheep sampled in the fallen stock survey died at the holding of birth. This lack of movement may be an important constraint on scrapie transmission. These static sheep provided relatively reliable birth locations, which were used to define criteria for selecting reliable traces. The criteria rejected 16.8% of fallen stock traces and 11.9% of abattoir survey traces. Two tests provided estimates that selection reduced error in fallen stock traces from 11.3% to 3.2%, and in abattoir survey traces from 8.1% to 1.8%. This method generated 14,591 accepted traces of fallen stock from samples taken during 2002-2005 and 83,136 accepted traces from abattoir samples. The absence or ambiguity of flock tag records at the time of slaughter prevented the tracing of 16-24% of abattoir samples during 2002-2004, although flock tag records improved in 2005. The use of internal scoring to generate and evaluate results from the database query, and the confirmation of results by comparison with other database fields, are analogous to methods used in web search engines. Such methods may have wide application in tracing samples and in adding value to biological datasets. Crown Copyright 2010. Published by Elsevier B.V. All rights reserved.
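    The two-step logic described above, picking the most likely birth holding per tag and then rejecting unreliable traces, can be sketched with a simple aggregation. Tag names, holdings, and the acceptance threshold below are invented for illustration:

```python
from collections import defaultdict

def trace_birth_holdings(departures, min_share=0.6):
    """Sketch of the two-step 'shotgun' trace: (1) for each flock tag,
    pick the holding with the most recorded departures as the likely
    birth holding; (2) reject tags whose top holding accounts for less
    than min_share of that tag's records (threshold illustrative)."""
    by_tag = defaultdict(lambda: defaultdict(int))
    for tag, holding in departures:
        by_tag[tag][holding] += 1
    traces = {}
    for tag, counts in by_tag.items():
        holding, n = max(counts.items(), key=lambda kv: kv[1])
        if n / sum(counts.values()) >= min_share:   # keep only reliable traces
            traces[tag] = holding
    return traces

# Hypothetical departure records: (flock tag, holding of departure)
records = [("UK123", "H1"), ("UK123", "H1"), ("UK123", "H2"),
           ("UK999", "H3"), ("UK999", "H4")]
traces = trace_birth_holdings(records)
```

    The ambiguous tag is dropped rather than guessed at, which is the essence of trading coverage for reliability in the second step.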

  14. An Alternative Thiol-Reactive Dye to Analyze Ligand Interactions with the Chemokine Receptor CXCR2 Using a New Thermal Shift Assay Format.

    PubMed

    Bergsdorf, Christian; Fiez-Vandal, Cédric; Sykes, David A; Bernet, Pascal; Aussenac, Sonia; Charlton, Steven J; Schopfer, Ulrich; Ottl, Johannes; Duckely, Myriam

    2016-03-01

    Integral membrane proteins (IMPs) play an important role in many cellular events and are involved in numerous pathological processes. Therefore, understanding the structure and function of IMPs is a crucial prerequisite to enable successful targeting of these proteins with low molecular weight (LMW) ligands early on in the discovery process. To optimize IMP purification/crystallization and to identify/characterize LMW ligand-target interactions, robust, reliable, high-throughput, and sensitive biophysical methods are needed. Here, we describe a differential scanning fluorimetry (DSF) screening method using the thiol-reactive BODIPY FL-cystine dye to monitor thermal unfolding of the G-protein-coupled receptor (GPCR), CXCR2. To validate this method, the seven-transmembrane protein CXCR2 was analyzed with a set of well-characterized antagonists. This study showed that the new DSF assay assessed reliably the stability of CXCR2 in a 384-well format. The analysis of 14 ligands with a potency range over 4 log units demonstrated the detection/characterization of LMW ligands binding to the membrane protein target. Furthermore, DSF results cross-validated with the label-free differential static light scattering (DSLS) thermal denaturation method. These results underline the potential of the BODIPY assay format as a general tool to investigate membrane proteins and their interaction partners. © 2015 Society for Laboratory Automation and Screening.
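    DSF analyses of this kind reduce each melt curve to an apparent melting temperature (Tm). A minimal way to extract Tm from a temperature/fluorescence trace is the point of steepest fluorescence increase; real pipelines usually fit a Boltzmann sigmoid instead, and the synthetic curve below is illustrative:

```python
import numpy as np

def melting_temperature(temps, fluorescence):
    """Estimate apparent Tm from a DSF melt curve as the temperature of
    the steepest fluorescence increase (numerical-derivative sketch)."""
    dF = np.gradient(np.asarray(fluorescence, float),
                     np.asarray(temps, float))
    return temps[int(np.argmax(dF))]

temps = np.arange(25.0, 75.0, 0.5)                 # scan temperatures, deg C
tm_true = 52.0
curve = 1.0 / (1.0 + np.exp(-(temps - tm_true) / 1.5))  # synthetic sigmoid
tm = melting_temperature(temps, curve)
```

    Comparing Tm with and without a bound ligand gives the thermal shift that the assay uses to rank binders.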

  15. Study of complete interconnect reliability for a GaAs MMIC power amplifier

    NASA Astrophysics Data System (ADS)

    Lin, Qian; Wu, Haifeng; Chen, Shan-ji; Jia, Guoqing; Jiang, Wei; Chen, Chao

    2018-05-01

    By combining finite element analysis (FEA) and artificial neural network (ANN) techniques, a complete prediction of interconnect reliability for a monolithic microwave integrated circuit (MMIC) power amplifier (PA) under both direct current (DC) and alternating current (AC) operating conditions is achieved effectively in this article. As an example, an MMIC PA is modelled to study the electromigration failure of its interconnects. This is, to our knowledge, the first study of interconnect reliability for an MMIC PA under DC and AC operation simultaneously. By training on the data from FEA, a high-accuracy ANN model for PA reliability is constructed. The reliability database obtained from the ANN model can then provide important guidance for improving IC reliability design.

  16. Resting State Network Estimation in Individual Subjects

    PubMed Central

    Hacker, Carl D.; Laumann, Timothy O.; Szrama, Nicholas P.; Baldassarre, Antonello; Snyder, Abraham Z.

    2014-01-01

    Resting-state functional magnetic resonance imaging (fMRI) has been used to study brain networks associated with both normal and pathological cognitive function. The objective of this work is to reliably compute resting state network (RSN) topography in single participants. We trained a supervised classifier (multi-layer perceptron; MLP) to associate blood oxygen level dependent (BOLD) correlation maps corresponding to pre-defined seeds with specific RSN identities. Hard classification of maps obtained from a priori seeds was highly reliable across new participants. Interestingly, continuous estimates of RSN membership retained substantial residual error. This result is consistent with the view that RSNs are hierarchically organized, and therefore not fully separable into spatially independent components. After training on a priori seed-based maps, we propagated voxel-wise correlation maps through the MLP to produce estimates of RSN membership throughout the brain. The MLP generated RSN topography estimates in individuals consistent with previous studies, even in brain regions not represented in the training data. This method could be used in future studies to relate RSN topography to other measures of functional brain organization (e.g., task-evoked responses, stimulation mapping, and deficits associated with lesions) in individuals. The multi-layer perceptron was directly compared to two alternative voxel classification procedures, specifically, dual regression and linear discriminant analysis; the perceptron generated more spatially specific RSN maps than either alternative. PMID:23735260

  17. Modeling the X-Ray Process, and X-Ray Flaw Size Parameter for POD Studies

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2014-01-01

    Nondestructive evaluation (NDE) method reliability can be determined by a statistical flaw detection study called a probability of detection (POD) study. In many instances, the NDE flaw detectability is given as a flaw size such as crack length. The flaw is either a crack or behaves like a crack in terms of affecting the structural integrity of the material. An alternate approach is to use a more complex flaw size parameter. The X-ray flaw size parameter, given here, takes into account many setup and geometric factors. The flaw size parameter relates to X-ray image contrast and is intended to have a monotonic correlation with the POD. Some factors, such as set-up parameters including X-ray energy, exposure, detector sensitivity, and material type, that are not accounted for in the flaw size parameter may be accounted for in the technique calibration and controlled to meet certain quality requirements. The proposed flaw size parameter and the computer application described here give an alternate approach to conducting POD studies. Results of the POD study can be applied to reliably detect small flaws through better assessment of the effect of interactions between various geometric parameters on flaw detectability. Moreover, a contrast simulation algorithm for a simple part-source-detector geometry using calibration data is also provided for the POD estimation.
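    POD studies of hit/miss inspection data are often summarized with a log-probit model, POD(a) = Φ((ln a − μ)/σ). The fit below is a generic maximum-likelihood sketch on simulated inspections, not the flaw-size-parameter method of the paper:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def fit_pod(sizes, detected):
    """Fit a log-probit POD model POD(a) = Phi((ln a - mu)/sigma) by
    maximum likelihood to hit/miss data (a standard POD formulation)."""
    x = np.log(sizes)
    def nll(p):
        mu, log_s = p
        pr = norm.cdf((x - mu) / np.exp(log_s)).clip(1e-9, 1 - 1e-9)
        return -(detected * np.log(pr)
                 + (1 - detected) * np.log(1 - pr)).sum()
    res = minimize(nll, x0=[x.mean(), 0.0], method="Nelder-Mead")
    mu, log_s = res.x
    return mu, np.exp(log_s)

rng = np.random.default_rng(3)
a = rng.uniform(0.5, 5.0, 400)                 # simulated flaw sizes (mm)
p_true = norm.cdf((np.log(a) - np.log(2.0)) / 0.4)
hits = (rng.random(400) < p_true).astype(float)
mu, sigma = fit_pod(a, hits)
a50 = np.exp(mu)                               # size detected 50% of the time
```

    Detectability thresholds such as the commonly quoted a90/95 follow from the fitted curve and its confidence bounds.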

  19. Imaging findings in essure related complications.

    PubMed

    Djeffal, Hachem; Blouet, Marie; Pizzoferrato, Anne-Cécile; Vardon, Delphine; Belloy, Frederique; Pelage, Jean-Pierre

    2018-06-21

    Tubal sterilization with Essure inserts has become a prevalent alternative to laparoscopic sterilization because of its minimal invasiveness. It is a well-tolerated ambulatory procedure that provides reliable permanent contraception without the risks associated with laparoscopic surgery and general anesthesia. Correct positioning of the Essure device is necessary to achieve the fibrotic reaction induced by the polyethylene terephthalate fibers, subsequently resulting in tubal occlusion, usually within 3 months. After uneventful procedures with satisfactory bilateral placement, only the correct position of the devices needs to be confirmed at follow-up. The imaging techniques used to assess Essure devices may vary depending on the country and its recommendations. The gold standard test to ascertain tubal occlusion remains hysterosalpingography, but after uneventful procedures, vaginal ultrasound has proved to be a reliable alternative to confirm the proper position of the inserts. Radiologists have been increasingly confronted with post-procedural evaluations, and despite the efficiency rate of the Essure device, its use still exposes patients to a low risk of complications and malfunctions such as unwanted pregnancies, device misplacement, tubal or uterine perforation, and chronic pelvic pain. Unintended pregnancies are mostly due to patient or physician non-compliance and misinterpretation of post-procedural examinations by radiologists, which emphasizes the importance of their training in Essure device assessment. This pictorial review discusses the imaging methods used to assess Essure implants and illustrates the possible complications related to them.

  20. Identifying Complementary and Alternative Medicine Usage Information from Internet Resources. A Systematic Review.

    PubMed

    Sharma, Vivekanand; Holmes, John H; Sarkar, Indra N

    2016-08-05

    Identify and highlight research issues and methods used in studying Complementary and Alternative Medicine (CAM) information needs, access, and exchange over the Internet. A literature search was conducted using Preferred Reporting Items for Systematic Reviews and Meta-Analysis guidelines from PubMed to identify articles that have studied Internet use in the CAM context. Additional searches were conducted at Nature.com and Google Scholar. The Internet provides a major medium for attaining CAM information and can also serve as an avenue for conducting CAM-related surveys. Based on the literature analyzed in this review, there seems to be significant interest in developing methodologies for identifying CAM treatments, including the analysis of search query data and social media platform discussions. Several studies have also underscored the challenges in developing approaches for identifying the reliability of CAM-related information on the Internet, which may not be supported by reliable sources. The overall findings of this review suggest that there are opportunities for developing approaches for making accurate information available and for restricting the spread and sale of potentially harmful CAM products and information. Advances in Internet research are yet to be used in the context of understanding CAM prevalence and perspectives. Such approaches may provide valuable insights into current trends and needs in the context of CAM use and spread.

  1. Design optimization of a fuzzy distributed generation (DG) system with multiple renewable energy sources

    NASA Astrophysics Data System (ADS)

    Ganesan, T.; Elamvazuthi, I.; Shaari, Ku Zilati Ku; Vasant, P.

    2012-09-01

    The global rise in energy demand brings major obstacles to many energy organizations in providing an adequate energy supply. Hence, many techniques to generate cost-effective, reliable and environmentally friendly alternative energy sources are being explored. One such method is the integration of photovoltaic cells, wind turbine generators and fuel-based generators, combined with storage batteries. Such power systems are known as distributed generation (DG) power systems. However, the application of DG power systems raises certain issues such as cost effectiveness, environmental impact and reliability. The modelling as well as the optimization of this DG power system was successfully performed in previous work using Particle Swarm Optimization (PSO). The central idea of that work was to minimize cost, minimize emissions and maximize reliability (a multi-objective (MO) setting) with respect to the power balance and design requirements. In this work, we introduce a fuzzy model that takes into account the uncertain nature of certain variables in the DG system which are dependent on the weather conditions (such as the insolation and wind speed profiles). The MO optimization in a fuzzy environment was performed by applying the Hopfield Recurrent Neural Network (HNN). Analysis of the optimized results was then carried out.

  2. Translation Quality Assessment in Health Research: A Functionalist Alternative to Back-Translation.

    PubMed

    Colina, Sonia; Marrone, Nicole; Ingram, Maia; Sánchez, Daisey

    2017-09-01

    As international research studies become more commonplace, the importance of developing multilingual research instruments continues to increase and with it that of translated materials. It is therefore not unexpected that assessing the quality of translated materials (e.g., research instruments, questionnaires, etc.) has become essential to cross-cultural research, given that the reliability and validity of the research findings crucially depend on the translated instruments. In some fields (e.g., public health and medicine), the quality of translated instruments can also impact the effectiveness and success of interventions and public campaigns. Back-translation (BT) is a commonly used quality assessment tool in cross-cultural research. This quality assurance technique consists of (a) translation (target text [TT1]) of the source text (ST), (b) translation (TT2) of TT1 back into the source language, and (c) comparison of TT2 with the ST to make sure there are no discrepancies. The accuracy of the BT with respect to the source is supposed to reflect the equivalence/accuracy of the TT. This article shows how the use of BT as a translation quality assessment method can have a detrimental effect on a research study and proposes alternatives to BT. One alternative is illustrated on the basis of the translation and quality assessment methods used in a research study on hearing loss carried out in a border community in the southwest of the United States.

  3. Concurrent validity and reliability of using ground reaction force and center of pressure parameters in the determination of leg movement initiation during single leg lift.

    PubMed

    Aldabe, Daniela; de Castro, Marcelo Peduzzi; Milosavljevic, Stephan; Bussey, Melanie Dawn

    2016-09-01

    Evaluation of postural adjustments during single leg lift requires identification of the initiation of heel lift (T1). T1 measured by means of a motion analysis system is the most reliable approach. However, this method requires considerable workspace, expensive cameras, and substantial time for processing data and setting up the laboratory. The use of ground reaction force (GRF) and centre of pressure (COP) data is an alternative method, as its data processing and set-up are less time consuming. Further, kinetic data are normally collected at sample frequencies above 1000 Hz, whereas kinematic data are commonly captured at 50-200 Hz. This study describes the concurrent validity and reliability of GRF and COP measurements in determining T1, using a motion analysis system as the reference standard. Kinematic and kinetic data during single leg lift were collected from ten participants. GRF and COP data were collected using one and two force plates. Displacement of a single heel marker was captured by means of ten Vicon© cameras. Kinetic and kinematic data were collected at a sample frequency of 1000 Hz. Data were analysed in two stages: identification of key events in the kinetic data, and assessment of the concurrent validity of T1 based on the chosen key events against T1 provided by the kinematic data. The key event presenting the least systematic bias, along with a narrow 95% CI and limits of agreement against the reference standard T1, was the Baseline COPy event. The Baseline COPy event was obtained using one force plate and presented excellent between-tester reliability. Copyright © 2016 Elsevier B.V. All rights reserved.
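    The abstract does not spell out how the Baseline COPy event is computed, but a common generic heuristic for detecting movement initiation in a force-plate trace is the first sample deviating from a quiet-standing baseline by more than k standard deviations. The sketch below illustrates that heuristic only; the window length and threshold are assumed values, not the authors' exact algorithm.

```python
def onset_index(signal, fs, baseline_secs=0.5, k=3.0):
    """Index of the first sample deviating from the baseline mean by more
    than k baseline standard deviations; None if no such sample exists.

    signal: uniformly sampled COP (or GRF) trace.
    fs: sampling frequency in Hz (kinetic data are often sampled at 1000 Hz).
    """
    n0 = int(baseline_secs * fs)      # number of quiet-standing samples
    base = signal[:n0]
    mean = sum(base) / n0
    sd = (sum((s - mean) ** 2 for s in base) / n0) ** 0.5
    for i in range(n0, len(signal)):
        if abs(signal[i] - mean) > k * sd:
            return i
    return None
```

A kinematic heel-marker trace processed the same way would give the reference T1 against which such a kinetic event can be validated.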

  4. Modern status of photonuclear data

    NASA Astrophysics Data System (ADS)

    Varlamov, V. V.; Ishkhanov, B. S.

    2017-09-01

    The reliability of experimental cross sections obtained for (γ,1n), (γ,2n), and (γ,3n) partial photoneutron reactions using beams of quasimonoenergetic annihilation photons and bremsstrahlung is analyzed by employing data for a large number of medium-heavy and heavy nuclei, including 63,65Cu, 80Se, 90,91,94Zr, 115In, 112-124Sn, 133Cs, 138Ba, 159Tb, 181Ta, 186-192Os, 197Au, 208Pb, and 209Bi. The ratios of the cross sections of definite partial reactions to the cross section of the neutron-yield reaction, F_i = σ(γ,in)/σ(γ,xn), are used as criteria of experimental-data reliability. By definition, these ratios must be positive and should not exceed the upper limits of 1.00, 0.50, 0.33, ... for i = 1, 2, 3, ..., respectively. For many nuclei, unreliable values of F_i in various photon-energy regions were found to correlate clearly with physically forbidden negative values of partial-reaction cross sections. On this basis, one can conclude that the corresponding experimental data are unreliable. Significant systematic uncertainties of the methods used to determine photoneutron multiplicity are shown to be the main reason for this. New partial-reaction cross sections that satisfy the above data-reliability criteria were evaluated within an experimental-theoretical method [σ_eval(γ,in) = F_i^theor(γ,in) × σ_expt(γ,xn)] by employing the ratios F_i^theor(γ,in) calculated on the basis of a combined photonuclear-reaction model. Cross sections evaluated in this way deviate substantially from the results of many experiments performed via neutron-multiplicity sorting, but agree with the results of alternative activation experiments. Prospects of employing methods that would provide reliable data on cross sections of partial photoneutron reactions without recourse to photoneutron-multiplicity sorting are discussed.
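    The reliability criterion and the evaluation formula above reduce to a few lines of arithmetic. The sketch below is an illustrative reconstruction of those two formulas, not the authors' code, and any cross-section values fed to it are for demonstration only.

```python
# Reliability criterion for partial photoneutron cross sections:
# F_i = sigma(gamma, in) / sigma(gamma, xn) must lie in [0, 1/i].

def f_ratio(sigma_partial, sigma_xn):
    """Ratio of a partial-reaction cross section to the neutron-yield one."""
    return sigma_partial / sigma_xn

def is_reliable(sigma_partial, sigma_xn, i):
    """True if 0 <= F_i <= 1/i, the physically allowed range for (gamma, in)."""
    f = f_ratio(sigma_partial, sigma_xn)
    return 0.0 <= f <= 1.0 / i

def evaluate_partial(f_theor, sigma_xn_expt):
    """Experimental-theoretical evaluation:
    sigma_eval(gamma, in) = F_i^theor(gamma, in) * sigma_expt(gamma, xn)."""
    return f_theor * sigma_xn_expt
```

A negative partial cross section, or an F_2 above 0.50, fails `is_reliable` and flags the underlying multiplicity sorting as suspect, which is exactly the screening logic the abstract describes.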

  5. Analysis of alternative splicing events for cancer diagnosis using a multiplexing nanophotonic biosensor

    PubMed Central

    Huertas, César S.; Domínguez-Zotes, Santos; Lechuga, Laura M.

    2017-01-01

    Personalized medicine is a promising tool not only for prevention, screening and the development of more efficient treatment strategies, but also for diminishing the side effects caused by current therapies. Deciphering gene regulation pathways provides a reliable prognostic analysis to elucidate the origin of grave diseases and facilitates the selection of the most adequate treatment for each individual. Alternative splicing of mRNA precursors is one of these gene regulation pathways and enables cells to generate different protein outputs from the same gene depending on their developmental or homeostatic status. Its deregulation is strongly linked to disease onset and progression, constituting a relevant and innovative class of biomarker. Herein we report a highly selective and sensitive nanophotonic biosensor based on the direct monitoring of the aberrant alternative splicing of the Fas gene. Unlike conventional methods, the nanobiosensor performs real-time detection of the specific isoforms in the fM-pM range without any cDNA synthesis or PCR amplification requirements. The nanobiosensor has been proven isoform-specific with no cross-hybridization, greatly minimizing detection biases. The demonstrated high sensitivity and specificity make our nanobiosensor ideal for examining significant tumor-associated expression shifts of alternatively spliced isoforms for the early and accurate theranostics of cancer. PMID:28120920

  7. A comparison of methods for teaching receptive labeling to children with autism spectrum disorders.

    PubMed

    Grow, Laura L; Carr, James E; Kodak, Tiffany M; Jostad, Candice M; Kisamore, April N

    2011-01-01

    Many early intervention curricular manuals recommend teaching auditory-visual conditional discriminations (i.e., receptive labeling) using the simple-conditional method in which component simple discriminations are taught in isolation and in the presence of a distracter stimulus before the learner is required to respond conditionally. Some have argued that this procedure might be susceptible to faulty stimulus control such as stimulus overselectivity (Green, 2001). Consequently, there has been a call for the use of alternative teaching procedures such as the conditional-only method, which involves conditional discrimination training from the onset of intervention. The purpose of the present study was to compare the simple-conditional and conditional-only methods for teaching receptive labeling to 3 young children diagnosed with autism spectrum disorders. The data indicated that the conditional-only method was a more reliable and efficient teaching procedure. In addition, several error patterns emerged during training using the simple-conditional method. The implications of the results with respect to current teaching practices in early intervention programs are discussed.

  8. Body temperature measurement in mice during acute illness: implantable temperature transponder versus surface infrared thermometry.

    PubMed

    Mei, Jie; Riedel, Nico; Grittner, Ulrike; Endres, Matthias; Banneke, Stefanie; Emmrich, Julius Valentin

    2018-02-23

    Body temperature is a valuable parameter in determining the wellbeing of laboratory animals. However, using body temperature to refine humane endpoints during acute illness generally lacks comprehensiveness and is prone to inter-observer bias. Here we compared two methods of assessing body temperature in mice, namely implanted radio frequency identification (RFID) temperature transponders (method 1) and non-contact infrared thermometry (method 2), in 435 mice for up to 7 days during normothermia and lipopolysaccharide (LPS) endotoxin-induced hypothermia. There was excellent agreement between core and surface temperature as determined by methods 1 and 2, respectively, although the intra- and inter-subject variation was higher for method 2. Nevertheless, using machine learning algorithms to determine temperature-based endpoints, both methods had excellent accuracy in predicting death as an outcome event. Therefore, the less expensive and less cumbersome non-contact infrared thermometry can serve as a reliable alternative to implantable transponder-based systems for hypothermic responses, although it requires standardization between experimenters.

  9. Clinical Application of Pluripotent Stem Cells: An Alternative Cell-Based Therapy for Treating Liver Diseases?

    PubMed

    Tolosa, Laia; Pareja, Eugenia; Gómez-Lechón, Maria José

    2016-12-01

    The worldwide shortage of donor livers for organ and hepatocyte transplantation has prompted the search for alternative therapies for intractable liver diseases. Cell-based therapy is envisaged as a useful therapeutic option to recover and stabilize the lost metabolic function for acute liver failure, end-stage and congenital liver diseases, or for those patients who are not considered eligible for organ transplantation. In recent years, research to identify alternative and reliable cell sources for transplantation that can be derived by reproducible methods has been encouraged. Human pluripotent stem cells (PSCs), which comprise both embryonic and induced PSCs, may offer many advantages as an alternative to hepatocytes for liver cell therapy. Their capacity for expansion, hepatic differentiation and self-renewal make them a promising source of unlimited numbers of hepatocyte-like cells for treating and repairing damaged livers. Immunogenicity and tumorigenicity of human PSCs remain the bottleneck for successful clinical application. However, recent advances made to develop disease-corrected hepatocyte-like cells from patients' human-induced PSCs by gene editing have opened up many potential gateways for the autologous treatment of hereditary liver diseases, which may likely reduce the risk of rejection and the need for lifelong immunosuppression. Well-defined methods to reduce the expression of oncogenic genes in induced PSCs, including protocols for their complete and safe hepatic differentiation, should be established to minimize the tumorigenicity of transplanted cells. On top of this, such new strategies are currently being rigorously tested and validated in preclinical studies before they can be safely transferred to clinical practice with patients.

  10. A new approach to determining net impulse and identification of its characteristics in countermovement jumping: reliability and validity.

    PubMed

    Mizuguchi, Satoshi; Sands, William A; Wassinger, Craig A; Lamont, Hugh S; Stone, Michael H

    2015-06-01

    Examining a countermovement jump (CMJ) force-time curve in relation to net impulse might be useful in monitoring athletes' performance. This study aimed to investigate the reliability of an alternative net impulse calculation and of net impulse characteristics (height, width, rate of force development, shape factor, and proportion), and to validate the alternative against the traditional calculation in the CMJ. Twelve participants performed the CMJ in two sessions (48 hours apart) for test-retest reliability. Twenty participants were involved in the validity assessment. Results indicated an intra-class correlation coefficient (ICC) of ≥ 0.89 and a coefficient of variation (CV) of ≤ 5.1% for all of the variables except rate of force development (ICC = 0.78 and CV = 22.3%). The relationship between the criterion and alternative calculations was r = 1.00. While the difference between them was statistically significant (245.96 ± 63.83 vs. 247.14 ± 64.08 N·s, p < 0.0001), the effect size was trivial and deemed practically minimal (d = 0.02). In conclusion, the variability of rate of force development will pose a greater challenge in detecting performance changes. Also, the alternative calculation can be used practically in place of the traditional calculation to identify net impulse characteristics and to monitor and study athletes' performance in greater depth.
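    Net impulse in a CMJ is conventionally the time integral of the vertical ground reaction force minus body weight over the movement. The abstract does not give the study's exact criterion or alternative formulas, so the following is only a generic sketch of that integral via trapezoidal integration, with assumed parameter names.

```python
def net_impulse(force_n, dt, body_mass_kg, g=9.81):
    """Trapezoidal integral of (F(t) - m*g) over uniformly sampled force data.

    force_n: vertical ground reaction force samples in newtons.
    dt: sampling interval in seconds (e.g. 0.001 for a 1000 Hz force plate).
    Returns net impulse in N*s.
    """
    weight = body_mass_kg * g
    net = [f - weight for f in force_n]          # net (propulsive) force
    return sum((a + b) * dt / 2.0 for a, b in zip(net, net[1:]))
```

For a constant 100 N net force held for 0.5 s the integral is 50 N·s; real jumps, as in the values quoted above, land in the couple-hundred N·s range.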

  11. The Deployment of Carbon Monoxide Wireless Sensor Network (CO-WSN) for Ambient Air Monitoring

    PubMed Central

    Chaiwatpongsakorn, Chaichana; Lu, Mingming; Keener, Tim C.; Khang, Soon-Jai

    2014-01-01

    Wireless sensor networks are becoming increasingly important as an alternative solution for environmental monitoring because they can reduce cost and complexity. They can also improve reliability and data availability in places where traditional monitoring methods are difficult to site. In this study, a carbon monoxide wireless sensor network (CO-WSN) was developed to measure carbon monoxide concentrations at a major traffic intersection near the University of Cincinnati main campus. The system was deployed over two weeks during Fall 2010 and Summer 2011-2012; traffic data were also recorded around the clock using a manual traffic counter and a video camcorder to characterize vehicles at the intersection, particularly during the morning and evening peak hour periods. According to the field test results, the 1-h average CO concentrations ranged from 0.1-1.0 ppm, well below the National Ambient Air Quality Standards (NAAQS) limit of 35 ppm on a one-hour averaging period. During rush hour periods, the traffic volume at the intersection varied from 2,067 to 3,076 vehicles per hour, with 97% being passenger vehicles. Furthermore, the traffic volume based on a 1-h average showed good correlation (R² = 0.87) with the 1-h average CO-WSN concentrations for morning and evening peak periods, whereas CO-WSN results showed only a moderate correlation (R² = 0.42) with 24-hour traffic volume, due to fluctuating meteorological conditions. It is concluded that the performance and reliability of wireless ambient air monitoring networks allow them to serve as an alternative method for real-time air monitoring. PMID:24937527
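    The R² values quoted above are coefficients of determination from a simple linear fit of CO-WSN concentration against traffic volume. A minimal sketch of that computation for paired hourly samples (generic formula, not the study's analysis script) is:

```python
def r_squared(x, y):
    """Coefficient of determination for a simple linear fit of y on x,
    computed as the squared Pearson correlation of the paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))   # covariance sum
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)
```

Perfectly linear data give 1.0; scattered data such as the 24-hour traffic/CO pairs drop toward the 0.4 range reported above.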

  12. Rapid real-time PCR methods to distinguish Salmonella Enteritidis wildtype field isolates from vaccine strains Salmovac SE/Gallivac SE and AviPro SALMONELLA VAC E.

    PubMed

    Maurischat, Sven; Szabo, Istvan; Baumann, Beatrice; Malorny, Burkhard

    2015-05-01

    Salmonella enterica serovar Enteritidis is a major non-typhoid Salmonella serovar causing human salmonellosis, mainly associated with the consumption of poultry and products thereof. To reduce infections in poultry, the S. Enteritidis live vaccine strains AviPro SALMONELLA VAC E and Salmovac SE/Gallivac SE have been licensed and used in several countries worldwide. To definitively diagnose a S. Enteritidis contamination in vaccinated herds, a reliable and fast method for differentiating between vaccine and wildtype field isolates is required. In this study, we developed and validated real-time PCR (qPCR) assays to distinguish those variants genetically. Suitable target sequences were identified by whole genome sequencing (WGS) using the Illumina MiSeq system. SNP regions in kdpA and nhaA proved most useful for differentiating AviPro SALMONELLA VAC E and Salmovac SE/Gallivac SE, respectively, from wildtype strains. For each vaccine strain, one TaqMan-qPCR assay and one alternative approach using High Resolution Melting (HRM) analysis were designed. All 30 Salmovac SE and 7 AviPro SALMONELLA VAC E vaccine strain reisolates tested were correctly identified by both approaches (100% inclusivity). Furthermore, all 137 (TaqMan) and 97 (HRM) Salmonella non-vaccine and related Enterobacteriaceae strains tested were excluded (100% exclusivity). The analytical detection limits were determined to be approximately 10² genome copies/reaction for the TaqMan and 10⁴ genome copies/reaction for the HRM approach. The real-time PCR assays proved to be a reliable and fast alternative to cultural vaccine strain identification tests, helping decision makers in control measures to take action within a shorter period of time. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Human reliability in petrochemical industry: an action research.

    PubMed

    Silva, João Alexandre Pinheiro; Camarotto, João Alberto

    2012-01-01

    This paper aims to identify conflicts and gaps between the operators' strategies and actions and the organizational managerial approach to human reliability. In order to achieve these goals, the research approach adopted combines a literature review with action research methodology and Ergonomic Workplace Analysis in field research. The results suggest that the studied company has a classical, mechanistic point of view, focusing on error identification and on building barriers through procedures, checklists and other prescriptive alternatives to improve performance in the reliability area. However, the fundamental role of the worker as an agent of maintenance and construction of system reliability became evident during the action research cycle.

  14. Self-Cleaning Boudouard Reactor for Full Oxygen Recovery from CO2 Project

    NASA Technical Reports Server (NTRS)

    Zeitlin, Nancy; Muscatello, Anthony

    2015-01-01

    Oxygen recovery from respiratory CO2 is an important aspect of human spaceflight. Methods exist to sequester the CO2, but production of oxygen needs further development. The current ISS Carbon Dioxide Reduction System (CRS) uses the Sabatier reaction to produce water (and ultimately breathing air). Oxygen recovery is limited to 50% because half of the hydrogen used in the Sabatier reactor is lost as methane, which is vented overboard. The Bosch reaction is the only real alternative to the Sabatier reaction, but in the last reaction of the cycle (Boudouard) the resulting carbon buildup will eventually foul the nickel or iron catalyst, reducing reactor life and increasing consumables. To minimize this fouling, find a use for this waste product, and increase efficiency, we propose testing various self-cleaning catalyst designs in an existing MSFC Boudouard reaction test bed and determining which is the most reliable in conversion and resistance to fouling. Challenges include the mechanical reliability of the cleaning method and maintaining high conversion efficiency with a lower catalyst surface area. The underlying chemical reactions are well understood, but the planned implementations are novel (TRL 2) and have not been investigated at any level.

  15. On the probability of exceeding allowable leak rates through degraded steam generator tubes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cizelj, L.; Sorsek, I.; Riesch-Oppermann, H.

    1997-02-01

    This paper discusses some possible ways of predicting the behavior of the total leak rate through damaged steam generator tubes. This failure mode is of special concern in cases where most through-wall defects may remain in operation. A particular example is the application of the alternate (bobbin coil voltage) plugging criterion to Outside Diameter Stress Corrosion Cracking at the tube support plate intersections. It is the authors' aim to discuss some possible modeling options that could be applied to solve the problem formulated as: estimate the probability that the sum of all individual leak rates through degraded tubes exceeds the predefined acceptable value. The probabilistic approach aims at a reliable and computationally tractable estimate of the failure probability. A closed-form solution is given for the special case of exponentially distributed individual leak rates. Some possibilities for the use of computationally efficient First- and Second-Order Reliability Methods (FORM and SORM) are also discussed. The first numerical example compares the results of the approximate methods with the closed-form results; SORM in particular shows acceptable agreement. The second numerical example considers a realistic case for the NPP in Krško, Slovenia.
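    For the special case the authors mention, i.i.d. exponentially distributed individual leak rates, the sum of n Exp(λ) variables is Erlang(n, λ), so the exceedance probability has a standard closed form. The sketch below illustrates that textbook result with a Monte Carlo cross-check; it is our illustration, not the paper's code, and λ, n, and the allowable limit are placeholders.

```python
import math
import random

def erlang_exceedance(n, lam, limit):
    """P(sum of n i.i.d. Exp(lam) leak rates > limit).

    The sum is Erlang(n, lam); its survival function is
    exp(-lam*L) * sum_{k=0}^{n-1} (lam*L)^k / k!.
    """
    x = lam * limit
    return math.exp(-x) * sum(x ** k / math.factorial(k) for k in range(n))

def erlang_exceedance_mc(n, lam, limit, trials=100_000, seed=7):
    """Monte Carlo estimate of the same probability, for cross-checking."""
    rng = random.Random(seed)
    hits = sum(
        sum(rng.expovariate(lam) for _ in range(n)) > limit
        for _ in range(trials)
    )
    return hits / trials
```

FORM/SORM become attractive precisely when the individual distributions are not exponential and no such closed form exists.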

  16. An inexact chance-constrained programming model for water quality management in Binhai New Area of Tianjin, China.

    PubMed

    Xie, Y L; Li, Y P; Huang, G H; Li, Y F; Chen, L R

    2011-04-15

    In this study, an inexact chance-constrained water quality management (ICC-WQM) model is developed for planning regional environmental management under uncertainty. This method is based on an integration of interval linear programming (ILP) and chance-constrained programming (CCP) techniques. ICC-WQM allows uncertainties expressed as both probability distributions and interval values to be incorporated within a general optimization framework. Complexities in environmental management systems can be systematically reflected, greatly enhancing the applicability of the modeling process. The developed method is applied to planning chemical-industry development in Binhai New Area of Tianjin, China. Interval solutions associated with different risk levels of constraint violation have been obtained. They can be used for generating decision alternatives and thus help decision makers identify desired policies under various system-reliability constraints on the water environment's capacity for pollutants. Tradeoffs between system benefits and constraint-violation risks can also be examined. These results are helpful for supporting (a) decisions on wastewater discharge and government investment, (b) formulation of local policies regarding water consumption, economic development and industry structure, and (c) analysis of interactions among economic benefits, system reliability and pollutant discharges. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. Ensemble Flow Forecasts for Risk Based Reservoir Operations of Lake Mendocino in Mendocino County, California: A Framework for Objectively Leveraging Weather and Climate Forecasts in a Decision Support Environment

    NASA Astrophysics Data System (ADS)

    Delaney, C.; Hartman, R. K.; Mendoza, J.; Whitin, B.

    2017-12-01

    Forecast informed reservoir operations (FIRO) is a methodology that incorporates short- to mid-range precipitation and flow forecasts to inform the flood operations of reservoirs. The Ensemble Forecast Operations (EFO) alternative is a probabilistic approach to FIRO that incorporates ensemble streamflow predictions (ESPs) made by NOAA's California-Nevada River Forecast Center (CNRFC). With the EFO approach, release decisions are made to manage the forecasted risk of reaching critical operational thresholds. A water management model was developed for Lake Mendocino, a 111,000 acre-foot reservoir located near Ukiah, California, to evaluate whether the EFO alternative can improve water supply reliability without increasing downstream flood risk. Lake Mendocino is a dual-use reservoir, owned and operated for flood control by the United States Army Corps of Engineers and operated for water supply by the Sonoma County Water Agency. Due to recent changes in the operations of an upstream hydroelectric facility, this reservoir has suffered from water supply reliability issues since 2007. The EFO alternative was simulated using a 26-year (1985-2010) ESP hindcast generated by the CNRFC. The ESP hindcast was developed using Global Ensemble Forecast System version 10 precipitation reforecasts processed with the Hydrologic Ensemble Forecast System to generate daily reforecasts of 61 flow ensemble members over a 15-day forecast horizon. Model simulation results demonstrate that the EFO alternative may improve water supply reliability for Lake Mendocino without increasing flood risk for downstream areas. The developed operations framework can directly leverage improved skill in the second week of the forecast and is extendable into the S2S time domain, given a demonstration of improved skill through a reliable reforecast of adequate historical duration that is consistent with operationally available numerical weather predictions.
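    The EFO decision rule described above (release water when the forecasted risk of reaching a critical threshold exceeds a tolerance) can be sketched as follows. This is a hedged simplification: the real model routes 61-member, 15-day ESP traces through a reservoir simulation, whereas here the member peak values, threshold, and tolerance are hypothetical inputs.

```python
def forecast_risk(member_peaks, threshold):
    """Fraction of ensemble members whose forecasted peak storage
    (or flow) exceeds a critical operational threshold."""
    exceed = sum(1 for p in member_peaks if p > threshold)
    return exceed / len(member_peaks)

def release_required(member_peaks, threshold, risk_tolerance):
    """Trigger a flood-control release when forecasted risk exceeds
    the tolerable risk for this lead time."""
    return forecast_risk(member_peaks, threshold) > risk_tolerance
```

In practice the risk tolerance varies with forecast lead time, so a near-term 20% risk may trigger a release while the same risk two weeks out does not.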

  18. 10 CFR 504.7 - Prohibition against excessive use of petroleum or natural gas in mixtures-electing powerplants.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) ALTERNATE FUELS EXISTING POWERPLANTS § 504.7 Prohibition against excessive use of petroleum or natural gas... technically and financially feasible for a unit to use a mixture of petroleum or natural gas and an alternate... natural gas, or both, in amounts exceeding the minimum amount necessary to maintain reliability of...

  19. Test-Retest Reliability of the Adaptive Chemistry Assessment Survey for Teachers: Measurement Error and Alternatives to Correlation

    ERIC Educational Resources Information Center

    Harshman, Jordan; Yezierski, Ellen

    2016-01-01

    Determining the error of measurement is a necessity for researchers engaged in bench chemistry, chemistry education research (CER), and a multitude of other fields. Discussions regarding what constructs measurement error entails and how to best measure them have occurred, but the critiques about traditional measures have yielded few alternatives.…

  20. Pilot Study for Standardizing Rapid Automatized Naming and Rapid Alternating Stimulus Tests in Arabic

    ERIC Educational Resources Information Center

    Abu-Hamour, Bashir

    2013-01-01

    This study examined the acceptability, reliability, and validity of the Arabic translated version of the Rapid Automatized Naming and Rapid Alternating Stimulus Tests (RAN/RAS; Wolf & Denckla, 2005) for Jordanian students. RAN/RAS tests are a vital assessment tool to distinguish good readers from poor readers. These tests have been…

  1. Reliability and Diagnostic Efficiency of the Diagnostic Inventory for Disharmony (DID) in Youths with Pervasive Developmental Disorder and Multiple Complex Developmental Disorder

    ERIC Educational Resources Information Center

    Xavier, Jean; Vannetzel, Leonard; Viaux, Sylvie; Leroy, Arthur; Plaza, Monique; Tordjman, Sylvie; Mille, Christian; Bursztejn, Claude; Cohen, David; Guile, Jean-Marc

    2011-01-01

    The Pervasive Developmental Disorder-Not Otherwise Specified (PDD-NOS) category is a psychopathological entity few have described and is poorly, and mainly negatively, defined by autism exclusion. In order to limit PDD-NOS heterogeneity, alternative clinical constructs have been developed. This study explored the reliability and the diagnostic…

  2. Equivalence Reliability among the FITNESSGRAM[R] Upper-Body Tests of Muscular Strength and Endurance

    ERIC Educational Resources Information Center

    Sherman, Todd; Barfield, J. P.

    2006-01-01

    This study was designed to investigate the equivalence reliability between the suggested FITNESSGRAM[R] muscular strength and endurance test, the 90[degrees] push-up (PSU), and alternate FITNESSGRAM[R] tests of upper-body strength and endurance (i.e., modified pull-up [MPU], flexed-arm hang [FAH], and pull-up [PU]). Children (N = 383) in Grades 3…

  3. Fuzzy multicriteria disposal method and site selection for municipal solid waste.

    PubMed

    Ekmekçioğlu, Mehmet; Kaya, Tolga; Kahraman, Cengiz

    2010-01-01

    The use of fuzzy multiple criteria analysis (MCA) in solid waste management has the advantage of rendering subjective and implicit decision making more objective and analytical, with its ability to accommodate both quantitative and qualitative data. In this paper a modified fuzzy TOPSIS methodology is proposed for the selection of an appropriate disposal method and site for municipal solid waste (MSW). The method improves on existing methods in its capability of representing vague qualitative data and presenting all possible results with different degrees of membership. In the first stage of the proposed methodology, a set of criteria (cost, reliability, feasibility, pollution and emission levels, and waste and energy recovery) is optimized to determine the best MSW disposal method. Landfilling, composting, conventional incineration, and refuse-derived fuel (RDF) combustion are the alternatives considered. The weights of the selection criteria are determined by fuzzy pairwise comparison matrices of the Analytic Hierarchy Process (AHP). RDF combustion is found to be the best disposal method for Istanbul. In the second stage, the same methodology is used to determine the optimum RDF combustion plant location, using adjacent land use, climate, road access and cost as the criteria, with the best site located in Çatalca; the results illustrate the importance of the criteria weights in deciding the optimal location. A sensitivity analysis is also conducted to monitor how sensitive the model is to changes in the various criteria weights. 2010 Elsevier Ltd. All rights reserved.
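    To make the ranking step concrete, here is a minimal crisp (non-fuzzy) TOPSIS sketch: vector-normalize the decision matrix, apply the criterion weights, and score each alternative by its closeness to the ideal solution. The paper's fuzzy variant additionally carries membership degrees through these steps; the matrix shape and weight values here are illustrative assumptions.

```python
import math

def topsis_scores(matrix, weights, benefit):
    """Crisp TOPSIS: rows of `matrix` are alternatives, columns are criteria.
    `benefit[j]` is True when larger values are better on criterion j.
    Returns one closeness coefficient in [0, 1] per alternative (higher = better)."""
    n_crit = len(weights)
    # Vector normalization, then weighting, of each column.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[row[j] / norms[j] * weights[j] for j in range(n_crit)] for row in matrix]
    cols = list(zip(*v))
    ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    anti = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)   # distance to the ideal solution
        d_neg = math.dist(row, anti)    # distance to the anti-ideal solution
        scores.append(d_neg / (d_pos + d_neg))
    return scores
```

For disposal-method selection, the rows would be landfilling, composting, incineration, and RDF combustion, with cost and pollution marked as non-benefit criteria; the alternative with the highest score wins.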

  4. Validity and reliability of the NAB Naming Test.

    PubMed

    Sachs, Bonnie C; Rush, Beth K; Pedraza, Otto

    2016-05-01

    Confrontation naming is commonly assessed in neuropsychological practice, but few standardized measures of naming exist and those that do are susceptible to the effects of education and culture. The Neuropsychological Assessment Battery (NAB) Naming Test is a 31-item measure used to assess confrontation naming. Despite adequate psychometric information provided by the test publisher, there has been limited independent validation of the test. In this study, we investigated the convergent and discriminant validity, internal consistency, and alternate forms reliability of the NAB Naming Test in a sample of adults (Form 1: n = 247, Form 2: n = 151) clinically referred for neuropsychological evaluation. Results indicate adequate-to-good internal consistency and alternate forms reliability. We also found strong convergent validity as demonstrated by relationships with other neurocognitive measures. We found preliminary evidence that the NAB Naming Test demonstrates a more pronounced ceiling effect than other commonly used measures of naming. To our knowledge, this represents the largest published independent validation study of the NAB Naming Test in a clinical sample. Our findings suggest that the NAB Naming Test demonstrates adequate validity and reliability and merits consideration in the test arsenal of clinical neuropsychologists.

  5. MEMS reliability: coming of age

    NASA Astrophysics Data System (ADS)

    Douglass, Michael R.

    2008-02-01

    In today's high-volume semiconductor world, one could easily take reliability for granted. As the MOEMS/MEMS industry continues to establish itself as a viable alternative to conventional manufacturing in the macro world, reliability can be of high concern. Currently, there are several emerging market opportunities in which MOEMS/MEMS is gaining a foothold. Markets such as mobile media, consumer electronics, biomedical devices, and homeland security are all showing great interest in microfabricated products. At the same time, these markets are among the most demanding when it comes to reliability assurance. To be successful, each company developing a MOEMS/MEMS device must consider reliability on an equal footing with cost, performance and manufacturability. What can this maturing industry learn from the successful development of DLP technology, air bag accelerometers and inkjet printheads? This paper discusses some basic reliability principles which any MOEMS/MEMS device development must use. Examples from the commercially successful and highly reliable Digital Micromirror Device complement the discussion.

  6. Development of a Magnetic Attachment Method for Bionic Eye Applications.

    PubMed

    Fox, Kate; Meffin, Hamish; Burns, Owen; Abbott, Carla J; Allen, Penelope J; Opie, Nicholas L; McGowan, Ceara; Yeoh, Jonathan; Ahnood, Arman; Luu, Chi D; Cicione, Rosemary; Saunders, Alexia L; McPhedran, Michelle; Cardamone, Lisa; Villalobos, Joel; Garrett, David J; Nayagam, David A X; Apollo, Nicholas V; Ganesan, Kumaravelu; Shivdasani, Mohit N; Stacey, Alastair; Escudie, Mathilde; Lichter, Samantha; Shepherd, Robert K; Prawer, Steven

    2016-03-01

    Successful visual prostheses require stable, long-term attachment. Epiretinal prostheses, in particular, require attachment methods to fix the prosthesis onto the retina. The most common method is fixation with a retinal tack; however, tacks cause retinal trauma, and surgical proficiency is important to ensure optimal placement of the prosthesis near the macula. Accordingly, alternate attachment methods are required. In this study, we detail a novel method of magnetic attachment for an epiretinal prosthesis using two prosthesis components positioned on opposing sides of the retina. The magnetic attachment technique was piloted in a feline animal model (chronic, nonrecovery implantation). We also detail a new method to reliably control the magnet coupling force using heat. The force exerted on the tissue separating the two components could be minimized, as the measured force is proportionately smaller at the working distance. We thus detail, for the first time, a surgical method using customized magnets to position and affix an epiretinal prosthesis on the retina. The position of the epiretinal prosthesis is reliable, and its location on the retina is accurately controlled by the placement of a secondary magnet in the suprachoroidal location. The electrode position above the retina is less than 50 µm at the center of the device, although there were pressure points at the two edges due to curvature misalignment. The degree of retinal compression found in this study was unacceptably high; nevertheless, the normal structure of the retina remained intact under the electrodes. Copyright © 2015 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.

  7. Probabilistic risk assessment for a loss of coolant accident in McMaster Nuclear Reactor and application of reliability physics model for modeling human reliability

    NASA Astrophysics Data System (ADS)

    Ha, Taesung

    A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed, including event sequence modeling, system modeling, and quantification. To support the quantification of the identified accident sequences, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors differs significantly from that in power reactors, a time-oriented HRA model (a reliability physics model) was applied to estimate the human error probability (HEP) of the core relocation task. This model is based on two competing random variables: phenomenological time and performance time. Response surfaces and direct Monte Carlo simulation with Latin hypercube sampling were applied to estimate the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests, and the HEP for core relocation was estimated from the two competing quantities. The sensitivity of each probability distribution in the human reliability estimate was investigated. To quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected for its capability of incorporating uncertainties in both the model itself and its parameters. The HEP from the current time-oriented model was compared with that from the ASEP approach, and both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability, and its potential usefulness for quantifying model uncertainty as a sensitivity analysis in the PRA model.
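    The competing-random-variables idea behind a reliability physics model can be sketched with a plain Monte Carlo estimate. The study used response surfaces and Latin hypercube sampling; the lognormal parameters below are hypothetical stand-ins:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical distributions (the actual study fitted these from thermal-hydraulic
# simulation and operator interviews):
#  - phenomenological time: time available before core damage
#  - performance time: time the crew needs to complete core relocation
t_phenom = rng.lognormal(mean=np.log(60.0), sigma=0.3, size=n)   # minutes
t_perform = rng.lognormal(mean=np.log(25.0), sigma=0.5, size=n)  # minutes

# The human error probability is the chance the action is not completed
# before the phenomenological deadline.
hep = np.mean(t_perform > t_phenom)
```

    Swapping in different fitted distributions for either time directly gives the sensitivity study described in the abstract.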

  8. The history and development of FETAX (ASTM standard guide, E-1439 on conducting the frog embryo teratogenesis Assay-Xenopus)

    USGS Publications Warehouse

    Dumont, J.N.; Bantle, J.A.; Linder, G.; ,

    2003-01-01

    The energy crises of the 1970s and 1980s prompted the search for alternative sources of fuel. With the development of alternative energy sources, concern also grew for biological resources potentially impacted by these technologies. For example, few biological tests were available at the time to study the toxic effects of effluents on surface waters likely to serve as receiving streams for energy-production facilities; hence, we began to use Xenopus laevis embryos as test organisms to examine potential toxic effects associated with these effluents upon entering aquatic systems. As studies focused on potential adverse effects on aquatic systems continued, a test procedure was developed that led to the initial standardization of FETAX. Other than a limited number of aquatic toxicity tests that used fathead minnows and cold-water fishes such as rainbow trout, X. laevis represented the only aquatic vertebrate test system readily available to evaluate complex effluents. With numerous laboratories collaborating, the test with X. laevis was refined, improved, and developed as ASTM E-1439, Standard Guide for Conducting the Frog Embryo Teratogenesis Assay-Xenopus (FETAX). Collaborative work in the 1990s yielded procedural enhancements, for example, the development of standard test solutions and exposure methods to handle volatile organics and hydrophobic compounds. As part of the ASTM process, a collaborative interlaboratory study was performed to determine the repeatability and reliability of FETAX. In parallel, methods were developed to test sediments and soils, and in situ test methods were developed to address "lab-to-field extrapolation errors" that could influence the method's use in ecological risk assessments. Additionally, a metabolic activation system composed of rat liver microsomes was developed, which made FETAX more relevant to mammalian studies.

  9. Illustrated structural application of universal first-order reliability method

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1994-01-01

    The general application of the proposed first-order reliability method was achieved through the universal normalization of engineering probability distribution data. The method superimposes prevailing deterministic techniques and practices on the first-order reliability method to surmount deficiencies of the deterministic method and to provide the benefits of reliability techniques and predictions. A reliability design factor, derived from the reliability criterion to satisfy a specified reliability, is analogous to the deterministic safety factor. Its application is numerically illustrated on several practical structural design and verification cases, with interesting results and insights. Two concepts of reliability selection criteria are suggested. Though the method was developed to support affordable structures for access to space, it should also be applicable to most high-performance air and surface transportation systems.
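    For the common special case of normally distributed strength and stress, the first-order reliability index and its relation to the deterministic safety factor can be sketched as follows; the means and standard deviations are hypothetical values, not the report's cases:

```python
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Hypothetical normal strength (resistance) and stress (load) in MPa.
mu_R, sigma_R = 400.0, 30.0
mu_S, sigma_S = 250.0, 40.0

# First-order reliability index for the limit state g = R - S.
beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)
reliability = phi(beta)          # P(g > 0), probability of no failure
safety_factor = mu_R / mu_S      # the analogous deterministic central factor
```

    Specifying a target reliability fixes the required beta, from which a reliability design factor can be backed out in place of a blanket safety factor.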

  10. An Evaluation Method of Equipment Reliability Configuration Management

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Feng, Weijia; Zhang, Wei; Li, Yuan

    2018-01-01

    At present, many equipment development companies are aware of the great significance of reliability in equipment development, but, for lack of an effective management evaluation method, it is very difficult for such a company to manage its own reliability work. An evaluation method for equipment reliability configuration management determines the reliability management capabilities of an equipment development company. Reliability is not only designed in, but must also be managed to be achieved. This paper evaluates reliability management capabilities with a reliability configuration capability maturity model (RCM-CMM) evaluation method.

  11. Gastropod shell size and architecture influence the applicability of methods used to estimate internal volume.

    PubMed

    Ragagnin, Marilia Nagata; Gorman, Daniel; McCarthy, Ian Donald; Sant'Anna, Bruno Sampaio; de Castro, Cláudio Campi; Turra, Alexander

    2018-01-11

    Obtaining accurate and reproducible estimates of internal shell volume is a vital requirement for studies into the ecology of a range of shell-occupying organisms, including hermit crabs. Shell internal volume is usually estimated by filling the shell cavity with water or sand; however, there has been no systematic assessment of the reliability of these methods, nor any comparison with modern alternatives such as computed tomography (CT). This study undertakes the first assessment of the measurement reproducibility of three contrasting approaches across a spectrum of shell architectures and sizes. While our results suggested a certain level of variability inherent to all methods, we conclude that a single measure using sand or water is likely to be sufficient for the majority of studies. However, care must be taken, as precision may decline with increasing shell size and structural complexity. CT provided less variation between repeat measures, but its volume estimates were consistently lower than those from sand or water, and it will need methodological improvements before it can be used as an alternative. CT also indicated that volume may be underestimated using sand or water, owing to air spaces visible in filled shells scanned by CT. Lastly, we encourage authors to describe clearly how volume estimates were obtained.

  12. Dynamic light scattering: A fast and reliable method to analyze bacterial growth during the lag phase.

    PubMed

    Vargas, Susana; Millán-Chiu, Blanca E; Arvizu-Medrano, Sofía M; Loske, Achim M; Rodríguez, Rogelio

    2017-06-01

    A comparison between plate counting (PC) and dynamic light scattering (DLS) is reported. PC is the standard technique to determine bacterial population as a function of time; however, it has drawbacks, such as cumbersome preparation and handling of samples and the long time required to obtain results. Alternative methods based on optical density are faster, but do not distinguish viable from non-viable cells. These inconveniences are overcome by using DLS. Two bacterial strains were considered: Escherichia coli and Staphylococcus aureus. DLS was performed under two illuminating conditions: continuous and intermittent. From the increase in particle size as a function of time, it was possible to observe cell division and the formation of aggregates containing very few bacteria. The scattered-intensity profiles showed the lag phase and the transition to the exponential phase of growth, providing a quantity proportional to the viable bacteria concentration. The results revealed a clear linear correlation, in both the lag and exponential phases, between log10(colony-forming units/mL) from PC and log10 of the scattered intensity Is from DLS. These correlations provide good support for using DLS as an alternative technique to determine bacterial population. Copyright © 2017 Elsevier B.V. All rights reserved.
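    The reported log-log correlation suggests a simple calibration procedure, sketched here on synthetic data; the power-law exponent and noise level are illustrative, not measured values:

```python
import numpy as np

# Synthetic calibration data (illustrative only): scattered-light intensity
# grows as a power law of viable-cell concentration, so the relationship is
# linear on log-log axes, as reported for the lag and exponential phases.
cfu_per_ml = np.array([1e5, 3e5, 1e6, 3e6, 1e7, 3e7, 1e8])
noise = 1 + 0.02 * np.random.default_rng(0).standard_normal(7)
intensity = 2.0 * cfu_per_ml**0.8 * noise

# Fit log10(CFU/mL) against log10(intensity) by least squares.
x = np.log10(intensity)
y = np.log10(cfu_per_ml)
slope, intercept = np.polyfit(x, y, 1)
r = np.corrcoef(x, y)[0, 1]   # close to 1 for a good linear relation

def cfu_from_intensity(i_s):
    """Estimate the viable count from a new DLS intensity reading."""
    return 10 ** (slope * np.log10(i_s) + intercept)
```

    Once calibrated against plate counts, the fitted line converts a fast DLS intensity reading into an estimate of viable concentration.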

  13. Computing Real-time Streamflow Using Emerging Technologies: Non-contact Radars and the Probability Concept

    NASA Astrophysics Data System (ADS)

    Fulton, J. W.; Bjerklie, D. M.; Jones, J. W.; Minear, J. T.

    2015-12-01

    Measuring streamflow and developing and maintaining rating curves at new streamgaging stations is both time-consuming and problematic. Hydro 21 was an initiative by the U.S. Geological Survey to provide vision and leadership in identifying and evaluating new technologies and methods with the potential to change the way streamgaging is conducted. Since 2014, additional trials have been conducted to evaluate some of the methods promoted by the Hydro 21 Committee. Emerging technologies such as continuous-wave radars and computationally efficient methods such as the Probability Concept require significantly less field time, enable real-time velocity and streamflow measurements, and apply to unsteady flow conditions such as looped ratings and unsteady flood flows. Portable and fixed-mount radars have advanced beyond the development phase, are cost-effective, and are readily available in the marketplace. The Probability Concept is based on an alternative velocity-distribution equation developed by C.-L. Chiu, who pioneered the concept. By measuring the surface-water velocity and correcting for environmental influences such as wind drift, radars offer a reliable alternative for measuring and computing real-time streamflow under a variety of hydraulic conditions. If successful, these tools may allow us to establish ratings more efficiently, assess unsteady flow conditions, and report real-time streamflow at new streamgaging stations.
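    Chiu's entropy-based velocity distribution implies a fixed mean-to-maximum velocity ratio φ(M) for a cross section, which is how a radar surface velocity can be converted to discharge. The site parameters below are hypothetical, and the assumption that the maximum velocity occurs at the surface is a simplification:

```python
from math import exp

def chiu_phi(M):
    """Chiu's mean-to-maximum velocity ratio for entropy parameter M."""
    return exp(M) / (exp(M) - 1.0) - 1.0 / M

# Hypothetical site values: the entropy parameter M is calibrated once per
# cross section; u_max is taken from the radar surface velocity after a
# wind-drift correction; A comes from a stage-area relation.
M = 2.1                  # site-calibrated entropy parameter
u_surface = 1.60         # m/s, radar-measured surface velocity
u_max = u_surface * 1.0  # assume maximum velocity occurs at the surface
A = 42.0                 # m^2, cross-sectional area at the current stage

V = chiu_phi(M) * u_max  # mean channel velocity, m/s
Q = V * A                # discharge, m^3/s
```

    Because φ(M) is a property of the cross section rather than of a steady-state rating, the same conversion applies during looped ratings and unsteady flood flows.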

  14. Novel hermetic packaging methods for MOEMS

    NASA Astrophysics Data System (ADS)

    Stark, David

    2003-01-01

    Hermetic packaging of micro-optoelectromechanical systems (MOEMS) is an immature technology, lacking industry-consensus methods and standards. Off-the-shelf, catalog window assemblies are not yet available. Window assemblies are in general custom designed and manufactured for each new product, resulting in longer than acceptable cycle times, high procurement costs and questionable reliability. There are currently two dominant window-manufacturing methods wherein a metal frame is attached to glass, as well as a third, less-used method. The first method creates a glass-to-metal seal by heating the glass above its Tg to fuse it to the frame. The second method involves first metallizing the glass where it is to be attached to the frame, and then soldering the glass to the frame. The third method employs solder-glass to bond the glass to the frame. A novel alternative with superior features compared to the three previously described window-manufacturing methods is proposed. The new approach lends itself to a plurality of glass-to-metal attachment techniques. Benefits include lower temperature processing than two of the current methods and potentially more cost-effective manufacturing than all three of today's attachment methods.

  15. Alternative Splicing May Not Be the Key to Proteome Complexity.

    PubMed

    Tress, Michael L; Abascal, Federico; Valencia, Alfonso

    2017-02-01

    Alternative splicing is commonly believed to be a major source of cellular protein diversity. However, although many thousands of alternatively spliced transcripts are routinely detected in RNA-seq studies, reliable large-scale mass spectrometry-based proteomics analyses identify only a small fraction of annotated alternative isoforms. The clearest finding from proteomics experiments is that most human genes have a single main protein isoform, while those alternative isoforms that are identified tend to be the most biologically plausible: those with the most cross-species conservation and those that do not compromise functional domains. Indeed, most alternative exons do not seem to be under selective pressure, suggesting that a large majority of predicted alternative transcripts may not even be translated into proteins. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. Tumour border configuration in colorectal cancer: proposal for an alternative scoring system based on the percentage of infiltrating margin.

    PubMed

    Karamitopoulou, Eva; Zlobec, Inti; Koelzer, Viktor Hendrik; Langer, Rupert; Dawson, Heather; Lugli, Alessandro

    2015-10-01

    Information on tumour border configuration (TBC) in colorectal cancer (CRC) is currently not included in most pathology reports, owing to a lack of reproducibility and/or established evaluation systems. The aim of this study was to investigate whether an alternative scoring system based on the percentage of the infiltrating component may represent a reliable method for assessing TBC. Two hundred and fifteen CRCs with complete clinicopathological data were evaluated by two independent observers, both 'traditionally', by assigning the tumours to pushing/infiltrating/mixed categories, and alternatively, by scoring the percentage of infiltrating margin. With the pushing/infiltrating/mixed method, interobserver agreement (IOA) was moderate (κ = 0.58), whereas with the percentage-of-infiltrating-margin method, IOA was excellent (intraclass correlation coefficient 0.86). A higher percentage of infiltrating margin correlated with adverse features such as higher grade (P = 0.0025), higher pT (P = 0.0007), pN (P = 0.0001) and pM classification (P = 0.0063), high-grade tumour budding (P < 0.0001), lymphatic invasion (P < 0.0001), vascular invasion (P = 0.0032), and shorter survival (P = 0.0008), and was significantly associated with an increased probability of lymph node metastasis (P < 0.001). Information on TBC adds prognostic value to pathology reports on CRC. The proposed scoring system, using the percentage of infiltrating margin, outperforms the 'traditional' way of reporting TBC. Additionally, it is reproducible and simple to apply, and can therefore be easily integrated into daily diagnostic practice. © 2015 John Wiley & Sons Ltd.

  17. Evaluating the sensitization potential of surfactants: integrating data from the local lymph node assay, guinea pig maximization test, and in vitro methods in a weight-of-evidence approach.

    PubMed

    Ball, Nicholas; Cagen, Stuart; Carrillo, Juan-Carlos; Certa, Hans; Eigler, Dorothea; Emter, Roger; Faulhammer, Frank; Garcia, Christine; Graham, Cynthia; Haux, Carl; Kolle, Susanne N; Kreiling, Reinhard; Natsch, Andreas; Mehling, Annette

    2011-08-01

    An integral part of hazard and safety assessments is the estimation of a chemical's potential to cause skin sensitization. Currently, only animal tests (OECD 406 and 429) are accepted in a regulatory context. Non-animal test methods are being developed and formally validated. In order to gain more insight into the responses induced by eight exemplary surfactants, a battery of in vivo and in vitro tests was conducted using the same batch of chemicals. In general, the surfactants were negative in the GPMT, KeratinoSens and h-CLAT assays, and none formed covalent adducts with test peptides. In contrast, all but one were positive in the LLNA. Most were rated as irritants by the EpiSkin assay with the additional endpoint IL-1alpha. The weight of evidence based on this comprehensive testing indicates that, with one exception, they are non-sensitizing skin irritants, confirming that the LLNA tends to overestimate the sensitization potential of surfactants. As results obtained from LLNAs are considered the gold standard for the development of new non-animal alternative test methods, results such as these highlight the need to carefully evaluate the applicability domains of test methods in order to develop reliable non-animal alternative testing strategies for sensitization testing. Copyright © 2011 Elsevier Inc. All rights reserved.

  18. Report on noninvasive prenatal testing: classical and alternative approaches.

    PubMed

    Pantiukh, Kateryna S; Chekanov, Nikolay N; Zaigrin, Igor V; Zotov, Alexei M; Mazur, Alexander M; Prokhortchouk, Egor B

    2016-01-01

    The shortcomings of traditional prenatal aneuploidy testing methods, such as the low accuracy of noninvasive procedures and the health risks associated with invasive ones, were overcome with the introduction of novel genetics-based noninvasive prenatal testing (NIPT). These methods were rapidly adopted into clinical practice in many countries after a series of successful trials of various independent submethods. Here we present the results of our own NIPT trial carried out in Moscow, Russia. 1012 samples were subjected to a method that measures chromosome coverage by massively parallel sequencing. Two alternative approaches are ascertained: one based on maternal/fetal differential methylation and another based on allelic difference. While the former failed to provide stable results, the latter was found to be promising and worthy of a large-scale trial. One critical point in any NIPT approach is the determination of the fetal cell-free DNA fraction, which dictates the reliability of the result obtained for a given sample. We show that two different measures of chromosome Y representation, by real-time PCR and by whole-genome massively parallel sequencing, are practically interchangeable (r = 0.94). We also propose a novel method based on maternal/fetal allelic difference which is applicable in pregnancies with fetuses of either sex. Even in its pilot form it correlates well with chromosome Y coverage estimates (r = 0.74) and can be further improved by increasing the number of polymorphisms.

  19. Residual uncertainty estimation using instance-based learning with applications to hydrologic forecasting

    NASA Astrophysics Data System (ADS)

    Wani, Omar; Beckers, Joost V. L.; Weerts, Albrecht H.; Solomatine, Dimitri P.

    2017-08-01

    A non-parametric method is applied to quantify residual uncertainty in hydrologic streamflow forecasting. The method acts as a post-processor on deterministic model forecasts and generates a residual uncertainty distribution. Based on instance-based learning, it uses a k-nearest-neighbour (kNN) search for similar historical hydrometeorological conditions to determine uncertainty intervals from a set of historical errors, i.e. discrepancies between past forecasts and observations. The performance of this method is assessed using test cases of hydrologic forecasting in two UK rivers: the Severn and the Brue. Forecasts were made in retrospect and their uncertainties estimated using kNN resampling and two alternative uncertainty estimators: quantile regression (QR) and uncertainty estimation based on local errors and clustering (UNEEC). Results show that kNN uncertainty estimation produces accurate and narrow uncertainty intervals with good probability coverage. Analysis also shows that the performance of the technique depends on the choice of search space. Nevertheless, the accuracy and reliability of uncertainty intervals generated using kNN resampling are at least comparable to those produced by QR and UNEEC. It is concluded that kNN uncertainty estimation is an interesting alternative to other post-processors, like QR and UNEEC, for estimating forecast uncertainty. Apart from its concept being simple and well understood, an advantage of this method is that it is relatively easy to implement.
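    A minimal sketch of the kNN resampling idea, assuming a single predictor (forecast flow) and synthetic heteroscedastic errors; the predictor choice, k, and quantile levels are illustrative:

```python
import numpy as np

def knn_uncertainty(x_hist, err_hist, x_now, k=50, quantiles=(0.05, 0.95)):
    """Estimate a forecast-error interval from the k most similar
    historical hydrometeorological states.

    x_hist  : (n, d) historical predictor states (e.g. forecast flow, rainfall)
    err_hist: (n,) historical errors = observation - deterministic forecast
    x_now   : (d,) current predictor state
    """
    x_hist = np.asarray(x_hist, float)
    # Standardize each predictor so the Euclidean distance is not dominated
    # by one variable's units.
    mu, sd = x_hist.mean(axis=0), x_hist.std(axis=0)
    d = np.linalg.norm((x_hist - mu) / sd - (np.asarray(x_now) - mu) / sd, axis=1)
    nearest = np.argsort(d)[:k]
    return np.quantile(np.asarray(err_hist)[nearest], quantiles)

# Synthetic example: errors grow with forecast magnitude (heteroscedastic).
rng = np.random.default_rng(1)
flow = rng.uniform(10, 500, 2000)
errors = rng.normal(0, 0.1 * flow)          # larger flows -> larger errors
lo, hi = knn_uncertainty(flow[:, None], errors, x_now=[400.0])
forecast = 400.0
interval = (forecast + lo, forecast + hi)   # residual interval around forecast
```

    Because the interval is resampled locally, it widens automatically in regimes where the model has historically erred more, with no distributional assumption on the residuals.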

  20. Optical and electrical nano eco-sensors using alternative deposition of charged layer

    NASA Astrophysics Data System (ADS)

    Ahmed, Syed Rahin; Hong, Seong Cheol; Lee, Jaebeom

    2011-03-01

    This review focuses on nano ecological sensors (hereafter, eco-sensors) for pesticide detection built by layer-by-layer (LBL) assembly, one of the most versatile fabrication methods. The effects of pesticides on human health and on the environment (air, water, soil, plants, and animals) are of great concern owing to their increasing use. We highlight the two most popular detection methods, i.e., fluorescence and electrochemical detection of pesticides on an LBL assembly. Fluorescent materials are of great interest among researchers for their sensitivity and reliable detection, and electrochemical processes allow us to investigate synergistic interactions among film components through charge-transfer mechanisms in the LBL film at the molecular level. Finally, we note some prospective directions for the development of different types of sensing systems.

  1. Projecting the potential evapotranspiration by coupling different formulations and input data reliabilities: The possible uncertainty source for climate change impacts on hydrological regime

    NASA Astrophysics Data System (ADS)

    Wang, Weiguang; Li, Changni; Xing, Wanqiu; Fu, Jianyu

    2017-12-01

    Representing the atmospheric evaporating capability for a hypothetical reference surface, potential evapotranspiration (PET) determines the upper limit of actual evapotranspiration and is an important input to hydrological models. Because present climate models do not give direct estimates of PET when simulating the hydrological response to future climate change, PET must be estimated first and is subject to uncertainty arising from the many existing formulae and the different reliabilities of their input data. Using four different PET estimation approaches, i.e., the more physically based Penman (PN) equation with less reliable input variables, the more empirical, radiation-based Priestley-Taylor (PT) equation with relatively dependable downscaled data, the simplest, temperature-based Hamon (HM) equation with the most reliable downscaled variable, and downscaling PET directly with a statistical downscaling model, this paper investigated the differences in runoff projection caused by the alternative PET methods through a well-calibrated abcd monthly hydrological model. Three catchments representing a large climatic diversity, the Luanhe River Basin, the Source Region of the Yellow River, and the Ganjiang River Basin, were chosen as examples to illustrate this issue. The results indicated that although the four methods provided similar monthly patterns of PET over the period 2021-2050 for each catchment, the magnitudes of PET still differed slightly, especially for spring and summer months in the Luanhe River Basin and the Source Region of the Yellow River, which have a relatively dry climate. The apparent discrepancy in the magnitude of change in future runoff, and even the divergent direction of change for summer months in the Luanhe River Basin and spring months in the Source Region of the Yellow River, indicated that PET-method-related uncertainty occurred, especially in the Luanhe River Basin and the Source Region of the Yellow River with their smaller aridity indices. Moreover, the possible reason for the discrepancies in uncertainty between the three catchments was quantitatively discussed through a contribution analysis based on the climatic elasticity method. This study can provide a beneficial reference for comprehensively understanding the impacts of climate change on the hydrological regime and thus improve regional strategies for future water resource management.
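    Two of the formulations can be sketched directly. The Hamon and Priestley-Taylor forms below follow common hydrology-textbook versions (not necessarily the exact variants calibrated in the paper), and the forcing values are hypothetical:

```python
from math import exp

def sat_vapour_pressure(t_c):
    """Saturation vapour pressure (kPa) at air temperature t_c (deg C)."""
    return 0.611 * exp(17.27 * t_c / (t_c + 237.3))

def pet_hamon(t_c, daylength_h):
    """Hamon PET (mm/day): temperature-based, after Hamon (1961)."""
    return 29.8 * daylength_h * sat_vapour_pressure(t_c) / (t_c + 273.2)

def pet_priestley_taylor(t_c, rn, g=0.0, alpha=1.26, gamma=0.066, lam=2.45):
    """Priestley-Taylor PET (mm/day): radiation-based.
    rn, g in MJ m-2 day-1; lam is the latent heat of vaporization (MJ kg-1)."""
    es = sat_vapour_pressure(t_c)
    delta = 4098.0 * es / (t_c + 237.3) ** 2   # slope of e_s curve, kPa/degC
    return alpha * (delta / (delta + gamma)) * (rn - g) / lam

# Same day, two formulations: the spread between them is one source of the
# projection uncertainty discussed above.
hm = pet_hamon(20.0, daylength_h=12.0)
pt = pet_priestley_taylor(20.0, rn=15.0)
```

    Feeding each PET series into the same calibrated monthly water-balance model isolates how much of the runoff-projection spread is attributable to the PET formulation alone.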

  2. A Bayesian approach to estimating variance components within a multivariate generalizability theory framework.

    PubMed

    Jiang, Zhehan; Skorupski, William

    2017-12-12

    In many behavioral research areas, multivariate generalizability theory (mG theory) has typically been used to investigate the reliability of multidimensional assessments. However, traditional mG-theory estimation, which relies on frequentist approaches, has limits that lead researchers to fail to take full advantage of the information that mG theory can offer about the reliability of measurements. Bayesian methods, by contrast, provide more information than frequentist approaches can. This article presents instructional guidelines on how to implement mG-theory analyses in a Bayesian framework; in particular, BUGS code is presented to fit commonly seen designs from mG theory, including single-facet designs, two-facet crossed designs, and two-facet nested designs. In addition to concrete examples closely tied to the selected designs and the corresponding BUGS code, a simulated dataset is provided to demonstrate the utility and advantages of the Bayesian approach. This article is intended to serve as a tutorial reference for applied researchers and methodologists conducting mG-theory studies.
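    The article's BUGS code is not reproduced here, but the same kind of single-facet model can be sketched with a small hand-rolled Gibbs sampler on simulated data; the priors, dimensions, and true variance components below are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate a single-facet p x i design (persons crossed with items) with
# known variance components, then recover them by Gibbs sampling. This
# stands in for the article's BUGS code, which plays the same role.
n_p, n_i = 100, 10
true_sp2, true_si2, true_se2 = 4.0, 1.0, 1.0
p_true = rng.normal(0, np.sqrt(true_sp2), n_p)
i_true = rng.normal(0, np.sqrt(true_si2), n_i)
X = 50 + p_true[:, None] + i_true[None, :] \
    + rng.normal(0, np.sqrt(true_se2), (n_p, n_i))

def inv_gamma(shape, scale):
    return scale / rng.gamma(shape)

mu, p, i = X.mean(), np.zeros(n_p), np.zeros(n_i)
sp2 = si2 = se2 = 1.0
a0 = b0 = 0.001                      # weak inverse-gamma hyperpriors
keep = []
for step in range(3000):
    # Conjugate normal updates for the person and item effects.
    r_p = (X - mu - i[None, :]).sum(axis=1)
    prec_p = n_i / se2 + 1.0 / sp2
    p = rng.normal(r_p / se2 / prec_p, np.sqrt(1.0 / prec_p))
    r_i = (X - mu - p[:, None]).sum(axis=0)
    prec_i = n_p / se2 + 1.0 / si2
    i = rng.normal(r_i / se2 / prec_i, np.sqrt(1.0 / prec_i))
    mu = rng.normal((X - p[:, None] - i[None, :]).mean(),
                    np.sqrt(se2 / (n_p * n_i)))
    # Conjugate inverse-gamma updates for the variance components.
    sp2 = inv_gamma(a0 + n_p / 2, b0 + (p ** 2).sum() / 2)
    si2 = inv_gamma(a0 + n_i / 2, b0 + (i ** 2).sum() / 2)
    resid = X - mu - p[:, None] - i[None, :]
    se2 = inv_gamma(a0 + n_p * n_i / 2, b0 + (resid ** 2).sum() / 2)
    if step >= 1000:                 # discard burn-in
        keep.append((sp2, si2, se2))

sp2_m, si2_m, se2_m = np.mean(keep, axis=0)
# Posterior mean of the generalizability coefficient for a 10-item form.
g_coef = np.mean([s / (s + e / n_i) for s, _, e in keep])
```

    Unlike a point estimate from expected mean squares, the retained draws give a full posterior for the variance components, so credible intervals for the generalizability coefficient come for free.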

  3. Multi-detector row computed tomography angiography of peripheral arterial disease

    PubMed Central

    Dijkshoorn, Marcel L.; Pattynama, Peter M. T.; Myriam Hunink, M. G.

    2007-01-01

    With the introduction of multi-detector row computed tomography (MDCT), scan speed and image quality have improved considerably. Since longitudinal coverage is no longer a limitation, multi-detector row computed tomography angiography (MDCTA) is increasingly used to depict the peripheral arterial runoff. Hence, it is important to know the advantages and limitations of this new non-invasive alternative to the reference test, digital subtraction angiography. Optimization of the acquisition parameters and the contrast delivery is important to achieve reliable enhancement of the entire arterial runoff in patients with peripheral arterial disease (PAD) using fast CT scanners. The purpose of this review is to discuss the different scanning and injection protocols using 4-, 16-, and 64-detector row CT scanners, to propose effective methods to evaluate and present large data sets, to discuss its clinical value and major limitations, and to review the literature on the validity, reliability, and cost-effectiveness of multi-detector row CT in the evaluation of PAD. PMID:17882427

  4. Fabrication of piezoelectric ceramic micro-actuator and its reliability for hard disk drives.

    PubMed

    Jing, Yang; Luo, Jianbin; Yang, Wenyan; Ju, Guoxian

    2004-11-01

    A new U-type micro-actuator for precisely positioning a magnetic head in high-density hard disk drives was proposed and developed. The micro-actuator is composed of a U-type stainless steel substrate and two piezoelectric ceramic elements. The piezoelectric elements were fabricated from a PMN-PZT ceramic plate with a high piezoelectric coefficient d31, using a reactive ion etching process. Reliability against temperature was investigated to ensure practical application in drive products. The U-type substrate, with a piezoelectric element attached to each side, was also simulated by the finite-element method and measured with a laser Doppler vibrometer to verify its driving mechanics. The micro-actuator coupled with two piezoelectric elements featured a large displacement of 0.875 microm and a high resonance frequency of over 22 kHz. The novel piezoelectric micro-actuators thus offer a useful compromise among displacement, resonance frequency, and generative force. The results reveal that the new design concept provides a valuable alternative to multilayer piezoelectric micro-actuators.

  5. The Face-Symbol Test and the Symbol-Digit Test are not reliable surrogates for the Paced Auditory Serial Addition Test in multiple sclerosis.

    PubMed

    Williams, J; O'Rourke, K; Hutchinson, M; Tubridy, N

    2006-10-01

    The Paced Auditory Serial Addition Test (PASAT) is the chosen task for cognitive assessment in the multiple sclerosis functional composite (MSFC) and a widely used task in neuropsychological studies of people with multiple sclerosis (MS), but it is unpopular with patients. The Face-Symbol Test (FST) and Symbol-Digit Test (SDT) are alternative methods of cognitive testing in MS that are easily administered and patient-friendly. To evaluate the potential of the FST as a possible surrogate for the PASAT, we directly compared the FST with the PASAT and the SDT in a cohort of 50 MS patients with varying levels of disability. There was a significant correlation between SDT and FST scores (Spearman's rho 0.80, 95% CI 0.66-0.88; R(2) 65%), with moderate inter-test agreement (k = 0.52). In contrast, SDT and FST scores were less predictive of PASAT scores. We conclude that neither the FST nor the SDT is a reliable surrogate for the PASAT.

  6. Paleointensity study of the historical andesitic lava flows: LTD-DHT Shaw and Thellier paleointensities from the Sakurajima 1914 and 1946 lavas in Japan

    NASA Astrophysics Data System (ADS)

    Yamamoto, Y.; Hoshi, H.

    2005-12-01

    Correct determination of absolute paleointensities is essential to investigate the past geomagnetic field. There are two types of methods for obtaining paleointensities: Thellier-type and Shaw-type methods. Many paleomagnetists have so far regarded the former as reliable. However, there is increasing evidence that it is sometimes not robust for basaltic lavas, resulting in systematically high paleointensities (e.g. Calvo et al., 2002; Yamamoto et al., 2003). Alternatively, the double heating technique of the Shaw method combined with low-temperature demagnetization (LTD-DHT Shaw method; Tsunakawa et al., 1997; Yamamoto et al., 2003), a recently developed paleointensity technique in Japan, can yield reliable answers even from such basaltic samples (e.g. Yamamoto et al., 2003; Mochizuki et al., 2004; Oishi et al., 2005). In the Japanese archipelago there are not only basaltic lavas but also andesitic lavas, which are important candidates for absolute paleointensity determination in Japan. As a case study, we sampled oriented paleomagnetic cores from three sites of the Sakurajima 1914 (TS01 and TS02) and 1946 (SW01) lavas in Japan. Several rock magnetic experiments revealed that the main magnetic carriers of the present samples are titanomagnetites with Curie temperatures of about 300-550 C, and that high-temperature oxidation progresses in the order SW01, TS01, TS02. The LTD-DHT Shaw and Coe-Thellier experiments were conducted on 72 and 63 specimens, respectively, yielding 64 and 60 successful determinations. When the results are normalized by the expected field intensities calculated from IGRF-9 (Macmillan et al., 2003) and grouped into LTD-DHT Shaw and Thellier datasets, their averages and standard deviations (1 sigma) are 0.98+/-0.11 (LTD-DHT Shaw) and 1.13+/-0.13 (Thellier). Considering the standard deviations, both paleointensity methods recovered the correct geomagnetic field; however, the LTD-DHT Shaw method shows higher reliability than the Thellier method.

  7. A 96-well-plate-based optical method for the quantitative and qualitative evaluation of Pseudomonas aeruginosa biofilm formation and its application to susceptibility testing.

    PubMed

    Müsken, Mathias; Di Fiore, Stefano; Römling, Ute; Häussler, Susanne

    2010-08-01

    A major reason for bacterial persistence during chronic infections is the survival of bacteria within biofilm structures, which protect cells from environmental stresses, host immune responses and antimicrobial therapy. Thus, there is concern that laboratory methods developed to measure the antibiotic susceptibility of planktonic bacteria may not be relevant to chronic biofilm infections, and it has been suggested that alternative methods should test antibiotic susceptibility within a biofilm. In this paper, we describe a fast and reliable protocol for using 96-well microtiter plates for the formation of Pseudomonas aeruginosa biofilms; the method is easily adaptable for antimicrobial susceptibility testing. This method is based on bacterial viability staining in combination with automated confocal laser scanning microscopy. The procedure simplifies qualitative and quantitative evaluation of biofilms and has proven to be effective for standardized determination of antibiotic efficiency on P. aeruginosa biofilms. The protocol can be performed within approximately 60 h.

  8. Correcting for Sample Contamination in Genotype Calling of DNA Sequence Data

    PubMed Central

    Flickinger, Matthew; Jun, Goo; Abecasis, Gonçalo R.; Boehnke, Michael; Kang, Hyun Min

    2015-01-01

    DNA sample contamination is a frequent problem in DNA sequencing studies and can result in genotyping errors and reduced power for association testing. We recently described methods to identify within-species DNA sample contamination based on sequencing read data, showed that our methods can reliably detect and estimate contamination levels as low as 1%, and suggested strategies to identify and remove contaminated samples from sequencing studies. Here we propose methods to model contamination during genotype calling as an alternative to removal of contaminated samples from further analyses. We compare our contamination-adjusted calls to calls that ignore contamination and to calls based on uncontaminated data. We demonstrate that, for moderate contamination levels (5%–20%), contamination-adjusted calls eliminate 48%–77% of the genotyping errors. For lower levels of contamination, our contamination correction methods produce genotypes nearly as accurate as those based on uncontaminated data. Our contamination correction methods are useful generally, but are particularly helpful for sample contamination levels from 2% to 20%. PMID:26235984
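The idea of modeling contamination during genotype calling can be sketched as a simple mixture: each read is drawn from the true sample with probability 1 - alpha and from a contaminant with probability alpha. The toy function below is an illustration of that principle only, not the authors' implementation; the parameter names and defaults are assumptions.

```python
import math

def genotype_likelihoods(n_ref, n_alt, alpha=0.0, f=0.5, eps=0.01):
    """Log-likelihood of each diploid genotype (0, 1, or 2 alt alleles)
    at a biallelic site, given ref/alt read counts, contamination
    fraction alpha, contaminant alt-allele frequency f, and per-base
    sequencing error rate eps (all illustrative assumptions)."""
    lls = []
    for g in (0, 1, 2):
        p_alt = (1 - alpha) * (g / 2.0) + alpha * f    # sample/contaminant mixture
        p_alt = p_alt * (1 - eps) + (1 - p_alt) * eps  # symmetric base error
        lls.append(n_alt * math.log(p_alt) + n_ref * math.log(1 - p_alt))
    return lls

# 10% alt reads under 10% contamination: best explained as hom-ref, not het
ll = genotype_likelihoods(n_ref=90, n_alt=10, alpha=0.10)
best = ll.index(max(ll))
```

With alpha set to the estimated contamination level, a low fraction of alt reads is attributed to the contaminant rather than forced into a heterozygous call, which is the intuition behind the error reductions the abstract reports.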

  9. Electromagnetic pulse excitation of finite- and infinitely-long lossy conductors over a lossy ground plane

    DOE PAGES

    Campione, Salvatore; Warne, Larry K.; Basilio, Lorena I.; ...

    2017-01-13

    This study details a model for the response of a finite- or an infinite-length wire interacting with a conducting ground to an electromagnetic pulse excitation. We develop a frequency-domain method based on transmission line theory that we name ATLOG – Analytic Transmission Line Over Ground. This method is developed as an alternative to full-wave methods, as it delivers a fast and reliable solution. It allows for the treatment of finite or infinite lossy, coated wires and lossy grounds. The cases of a wire above the ground, resting on the ground, and buried beneath the ground are treated. The reported method is general, and the time response of the induced current is obtained using an inverse Fourier transform of the current in the frequency domain. The focus is on the characteristics and propagation of the transmission line mode. Comparisons with full-wave simulations strengthen the validity of the proposed method.
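The final step the abstract describes, recovering the time response by inverse Fourier transform of the frequency-domain current, can be sketched generically. The single-pole transfer function and pulse constants below are illustrative assumptions, not the ATLOG model itself.

```python
import numpy as np

# Hypothetical stand-in for a frequency-domain line current: drive a
# single-pole low-pass response with a double-exponential EMP-like pulse
# and recover the time-domain current by inverse Fourier transform.
# All constants are assumed for illustration.
n, dt = 4096, 1e-9                               # samples, time step (s)
t = np.arange(n) * dt
pulse = np.exp(-t / 250e-9) - np.exp(-t / 5e-9)  # EMP-like excitation waveform
spec = np.fft.rfft(pulse)                        # excitation spectrum
f = np.fft.rfftfreq(n, dt)
h = 1.0 / (1.0 + 1j * f / 50e6)                  # assumed low-pass line response
current = np.fft.irfft(spec * h, n)              # time response of induced current
```

In the actual method, h would be replaced by the analytic transmission-line solution for the wire-over-ground geometry evaluated at each frequency.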

  10. Electromagnetic pulse excitation of finite- and infinitely-long lossy conductors over a lossy ground plane

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campione, Salvatore; Warne, Larry K.; Basilio, Lorena I.

    This study details a model for the response of a finite- or an infinite-length wire interacting with a conducting ground to an electromagnetic pulse excitation. We develop a frequency-domain method based on transmission line theory that we name ATLOG – Analytic Transmission Line Over Ground. This method is developed as an alternative to full-wave methods, as it delivers a fast and reliable solution. It allows for the treatment of finite or infinite lossy, coated wires and lossy grounds. The cases of a wire above the ground, resting on the ground, and buried beneath the ground are treated. The reported method is general, and the time response of the induced current is obtained using an inverse Fourier transform of the current in the frequency domain. The focus is on the characteristics and propagation of the transmission line mode. Comparisons with full-wave simulations strengthen the validity of the proposed method.

  11. Determination of volatile marker compounds in raw ham using headspace-trap gas chromatography.

    PubMed

    Bosse Née Danz, Ramona; Wirth, Melanie; Konstanz, Annette; Becker, Thomas; Weiss, Jochen; Gibis, Monika

    2017-03-15

    A simple, reliable, and automated method was developed and optimized for the qualification and quantification of aroma-relevant volatile marker compounds of North European raw ham using headspace (HS)-trap gas chromatography-mass spectrometry (GC-MS) and GC-flame ionization detection (FID). A total of 38 volatile compounds were detected with this HS-trap GC-MS method, among which the largest groups were ketones (12), alcohols (8), hydrocarbons (7), aldehydes (6), and esters (3). The HS-trap GC-FID method was optimized with respect to thermostatting time and temperature, vial and desorption pressure, number of extraction cycles, and salt addition. A validation for 13 volatile marker compounds with limits of detection in the ng/g range was carried out. The optimized method can serve as an alternative to conventional headspace and solid-phase microextraction methods and allows users to determine volatile compounds in raw hams, making it of interest to industrial and academic meat scientists. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. DNA-based identification of spices: DNA isolation, whole genome amplification, and polymerase chain reaction.

    PubMed

    Focke, Felix; Haase, Ilka; Fischer, Markus

    2011-01-26

    Usually, spices are identified morphologically using simple tools such as magnifying glasses or microscopes. Molecular biological methods such as the polymerase chain reaction (PCR), on the other hand, enable accurate and specific detection even in complex matrices. Generally, the plants from which spices originate have diverse genetic backgrounds and relationships, and the processing methods used in spice production are complex and product-specific. Consequently, developing a reliable DNA-based method for spice analysis is a challenging undertaking. Once established, however, such a method is easily adapted to less difficult food matrices. In the current study, several alternative methods for the isolation of DNA from spices were developed and evaluated in detail with regard to (i) purity (photometric), (ii) yield (fluorimetric), and (iii) amplifiability (PCR). Whole genome amplification methods were used to preamplify isolates to improve the ratio between amplifiable DNA and inhibiting substances. Specific primer sets were designed, and the PCR conditions were optimized to detect 18 spices selectively. Assays of self-made spice mixtures were performed to prove the applicability of the developed methods.
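The photometric purity check mentioned in point (i) conventionally uses the A260/A280 absorbance ratio. A small sketch of that routine check follows; the helper name is hypothetical, and the ~50 ng/uL-per-A260-unit conversion and the 1.7-2.0 acceptance band are standard rules of thumb rather than values from this study.

```python
def dna_quality(a260, a280, dilution=1.0):
    """Photometric DNA check (illustrative helper): concentration from
    the standard ~50 ng/uL-per-A260-unit rule for double-stranded DNA,
    plus the A260/A280 purity ratio with a commonly used acceptance band."""
    conc_ng_per_ul = a260 * 50.0 * dilution
    ratio = a260 / a280
    acceptable = 1.7 <= ratio <= 2.0   # below ~1.7 suggests protein/phenol carryover
    return conc_ng_per_ul, ratio, acceptable

conc, ratio, ok = dna_quality(0.5, 0.27)
```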

  13. Agrobacterium tumefaciens-mediated transformation of oleaginous yeast Lipomyces species

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Ziyu; Deng, Shuang; Culley, David E.

    Background: Because of interest in the production of renewable bio-hydrocarbon fuels, various living organisms have been explored for their potential use in producing fuels and chemicals. The oil-producing (oleaginous) yeast Lipomyces starkeyi is the subject of active research regarding the production of lipids using a wide variety of carbon and nutrient sources. The genome of L. starkeyi has been published, which opens the door to production strain improvements using the tools of synthetic biology and metabolic engineering. However, using these tools for strain improvement requires the establishment of effective and reliable transformation methods with suitable selectable markers (antibiotic resistance or auxotrophic marker genes) and the necessary genetic elements (promoters and terminators) for expression of introduced genes. Chemical-based methods have been published, but they suffer from low efficiency or the requirement for targeting to rRNA loci. To address these problems, Agrobacterium-mediated transformation was investigated as an alternative method for L. starkeyi and other Lipomyces species. Results: In this study, Agrobacterium-mediated transformation was demonstrated to be effective for both L. starkeyi and other Lipomyces species, and the introduced DNA was reliably integrated into the chromosomes of these species. Deletion of the Ku70 and Pex10 genes was also demonstrated in L. starkeyi. In addition to the bacterial antibiotic selection marker gene hygromycin B phosphotransferase, the bacterial β-glucuronidase reporter gene under the control of the L. starkeyi translation elongation factor 1 promoter was stably expressed in seven different Lipomyces species. Conclusion: The results from this study clearly demonstrate that Agrobacterium-mediated transformation is a reliable genetic tool for gene deletion, integration, and expression of heterologous genes in L. starkeyi and other Lipomyces species.

  14. The correct estimate of the probability of false detection of the matched filter in weak-signal detection problems . II. Further results with application to a set of ALMA and ATCA data

    NASA Astrophysics Data System (ADS)

    Vio, R.; Vergès, C.; Andreani, P.

    2017-08-01

    The matched filter (MF) is one of the most popular and reliable techniques for detecting signals of known structure whose amplitude is smaller than the level of the contaminating noise. Under the assumption of stationary Gaussian noise, the MF maximizes the probability of detection subject to a constant probability of false detection or false alarm (PFA). This property relies upon a priori knowledge of the position of the searched signals, which is usually not available. Recently, it has been shown that when applied in its standard form, the MF may severely underestimate the PFA. As a consequence, the statistical significance of features that belong to noise is overestimated and the resulting detections are actually spurious. For this reason, an alternative method of computing the PFA has been proposed that is based on the probability density function (PDF) of the peaks of an isotropic Gaussian random field. In this paper we further develop this method. In particular, we discuss the statistical meaning of the PFA and show that, although useful as a preliminary step in a detection procedure, it is not able to quantify the actual reliability of a specific detection. For this reason, a new quantity called the specific probability of false alarm (SPFA) is introduced, which can carry out this computation. We show how this method works in targeted simulations and apply it to a few interferometric maps taken with the Atacama Large Millimeter/submillimeter Array (ALMA) and the Australia Telescope Compact Array (ATCA). We select a few potential new point sources and assign an accurate detection reliability to these sources.
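The core point, that a PFA computed for a known position no longer holds once one searches a map for peaks, can be demonstrated with a toy one-dimensional matched filter on pure noise (all parameters below are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
tmpl = np.exp(-0.5 * (np.arange(-8, 9) / 2.0) ** 2)
tmpl /= np.sqrt((tmpl ** 2).sum())       # unit-norm Gaussian template
thresh = 3.0                             # nominal one-sided PFA ~1.35e-3

trials = 2000
hits_fixed = hits_peak = 0
for _ in range(trials):
    noise = rng.standard_normal(512)     # pure noise: any detection is false
    mf = np.convolve(noise, tmpl[::-1], mode="valid")  # unit-variance MF output
    hits_fixed += mf[100] > thresh       # test at one known, fixed position
    hits_peak += mf.max() > thresh       # search the whole map for peaks

rate_fixed = hits_fixed / trials         # close to the nominal PFA
rate_peak = hits_peak / trials           # strongly inflated by the search
```

Because the template is unit-norm, each filter output is standard normal, so the fixed-position false-alarm rate matches the Gaussian tail probability, while the rate for the maximum over the map is far larger, which is why peak-based PFA corrections are needed.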

  15. Quantifying the pattern of beta/A4 amyloid protein distribution in Alzheimer's disease by image analysis.

    PubMed

    Bruce, C V; Clinton, J; Gentleman, S M; Roberts, G W; Royston, M C

    1992-04-01

    We have undertaken a study of the distribution of beta/A4 amyloid deposited in the cerebral cortex in Alzheimer's disease. Previous studies that examined the differential distribution of amyloid in the cortex in order to determine the laminar pattern of cortical pathology have not proved conclusive. We have developed an alternative method to address this problem: immunostaining of sections followed by computer-enhanced image analysis, with a mathematical model used to describe both the amount and the pattern of amyloid across the cortex. This method is accurate and reliable, and it removes many of the problems of inter- and intra-rater variability in measurement. It will provide the basis for further quantitative studies of the differential distribution of amyloid in Alzheimer's disease and other dementias in which cerebral amyloidosis occurs.

  16. Moles: Tool-Assisted Environment Isolation with Closures

    NASA Astrophysics Data System (ADS)

    de Halleux, Jonathan; Tillmann, Nikolai

    Isolating test cases from environment dependencies is often desirable, as it increases test reliability and reduces test execution time. However, code that calls non-virtual methods or consumes sealed classes is often impossible to test in isolation. Moles is a new lightweight framework which addresses this problem. For any .NET method, Moles allows test-code to provide alternative implementations, given as .NET delegates, for which C# provides very concise syntax while capturing local variables in a closure object. Using code instrumentation, the Moles framework will redirect calls to provided delegates instead of the original methods. The Moles framework is designed to work together with the dynamic symbolic execution tool Pex to enable automated test generation. In a case study, testing code programmed against the Microsoft SharePoint Foundation API, we achieved full code coverage while running tests in isolation without an actual SharePoint server. The Moles framework integrates with .NET and Visual Studio.

  17. Chemometric analysis of attenuated total reflectance infrared spectra of Proteus mirabilis strains with defined structures of LPS.

    PubMed

    Zarnowiec, Paulina; Mizera, Andrzej; Chrapek, Magdalena; Urbaniak, Mariusz; Kaca, Wieslaw

    2016-07-01

    Proteus spp. strains are some of the most important pathogens associated with complicated urinary tract infections and bacteremia affecting patients with immunodeficiency and long-term urinary catheterization. For epidemiological purposes, various molecular typing methods have been developed for this pathogen. However, these methods are labor intensive and time consuming. We evaluated a new method of differentiation between strains. A collection of Proteus spp. strains was analyzed by attenuated total reflectance Fourier transform infrared (ATR FT-IR) spectroscopy in the mid-infrared region. ATR FT-IR spectroscopy used in conjunction with a diamond ATR accessory directly produced the biochemical profile of the surface chemistry of bacteria. We conclude that a combination of ATR FT-IR spectroscopy and mathematical modeling provides a fast and reliable alternative for discrimination between Proteus isolates, contributing to epidemiological research. © The Author(s) 2016.

  18. Fast tomographic methods for the tokamak ISTTOK

    NASA Astrophysics Data System (ADS)

    Carvalho, P. J.; Thomsen, H.; Gori, S.; Toussaint, U. v.; Weller, A.; Coelho, R.; Neto, A.; Pereira, T.; Silva, C.; Fernandes, H.

    2008-04-01

    The achievement of long-duration, alternating-current discharges on the tokamak ISTTOK requires a real-time plasma position control system. Plasma position determination based on the magnetic probe system has been found to be inadequate during the current inversion because of the reduced plasma current. A tomography diagnostic has therefore been installed to supply the required feedback to the control system. Several tomographic methods are available for soft X-ray or bolometric tomography, among which the Cormack and neural network methods stand out due to their inherent speed of up to 1000 reconstructions per second with currently available technology. This paper discusses the application of these algorithms to fusion devices while comparing the performance and reliability of the results. It has been found that although the Cormack-based inversion is faster, the neural network reconstruction has fewer artifacts and is more accurate.

  19. [Usefulness of molecular techniques for detecting and/or identifying parasites and fungi in humans and animals or pathogens transmitted by ticks (Part I)].

    PubMed

    Myjak, P; Majewska, A C; Bajer, A; Siński, E; Wedrychowicz, H; Gołab, E; Budak, A; Stańczak, J

    2001-01-01

    After a long period in which basic microscopic, immunological, and biochemical methods were used for diagnosis, the rapid development of nucleic acid research enabled the introduction of specific and sensitive methods for detecting pathogenic agents at the molecular level. Among others, the polymerase chain reaction (PCR), discovered in the mid-1980s and later automated, offered an attractive alternative to conventional testing systems. In this paper we describe reliable diagnostic tests widely used around the world, including in Poland, capable of detecting disease agents such as parasites and fungi in clinical specimens and pathogens of emerging zoonotic diseases in ticks. The possibility of using molecular methods to determine Plasmodium falciparum drug resistance is also discussed. Moreover, the report offers information on the kinds of molecular tests available and the institutions in which they are performed.

  20. An alternate approach to the production of radioisotopes for nuclear medicine applications

    NASA Astrophysics Data System (ADS)

    D'Auria, John M.; Keller, Roderich; Ladouceur, Keith; Lapi, Suzanne E.; Ruth, Thomas J.; Schmor, Paul

    2013-03-01

    There is a growing need for the production of radioisotopes for both diagnostic and therapeutic medical applications. Radioisotopes that are produced using the (n,γ) or (γ,n) reactions, however, typically result in samples with low specific activity (radioactivity/gram) due to the high abundance of target material of the same element. One method to effectively remove the isotopic impurity is electromagnetic mass separation. An Ion Source Test Facility has been constructed at TRIUMF to develop high-intensity, high-efficiency, reliable ion sources for the purification of radioactive isotopes, particularly those used in nuclear medicine. Studies in progress are presented.

  1. Microgravity

    NASA Image and Video Library

    1992-06-25

    Zeolites are crystalline aluminosilicates that have complex framework structures. However, there are several features of zeolite crystals that make unequivocal structure determinations difficult. The acquisition of reliable structural information on zeolites is greatly facilitated by the availability of high-quality specimens. For structure determinations by conventional diffraction techniques, large single-crystal specimens are essential. Alternatively, structural determinations by powder profile refinement methods relax the constraints on crystal size, but still require materials with a high degree of crystalline perfection. Studies conducted at CAMMP (Center for Advanced Microgravity Materials Processing) have demonstrated that microgravity processing can produce larger crystal sizes and fewer structural defects relative to terrestrial crystal growth. Principal Investigator: Dr. Albert Sacco

  2. An alternate approach to the production of radioisotopes for nuclear medicine applications.

    PubMed

    D'Auria, John M; Keller, Roderich; Ladouceur, Keith; Lapi, Suzanne E; Ruth, Thomas J; Schmor, Paul

    2013-03-01

    There is a growing need for the production of radioisotopes for both diagnostic and therapeutic medical applications. Radioisotopes that are produced using the (n,γ) or (γ,n) reactions, however, typically result in samples with low specific activity (radioactivity/gram) due to the high abundance of target material of the same element. One method to effectively remove the isotopic impurity is electromagnetic mass separation. An Ion Source Test Facility has been constructed at TRIUMF to develop high-intensity, high-efficiency, reliable ion sources for the purification of radioactive isotopes, particularly those used in nuclear medicine. Studies in progress are presented.

  3. Zeolites

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Zeolites are crystalline aluminosilicates that have complex framework structures. However, there are several features of zeolite crystals that make unequivocal structure determinations difficult. The acquisition of reliable structural information on zeolites is greatly facilitated by the availability of high-quality specimens. For structure determinations by conventional diffraction techniques, large single-crystal specimens are essential. Alternatively, structural determinations by powder profile refinement methods relax the constraints on crystal size, but still require materials with a high degree of crystalline perfection. Studies conducted at CAMMP (Center for Advanced Microgravity Materials Processing) have demonstrated that microgravity processing can produce larger crystal sizes and fewer structural defects relative to terrestrial crystal growth. Principal Investigator: Dr. Albert Sacco

  4. Autogenic-feedback training - A treatment for motion and space sickness

    NASA Technical Reports Server (NTRS)

    Cowings, Patricia S.

    1990-01-01

    A training method for preventing the occurrence of motion sickness in humans, called autogenic-feedback training (AFT), is described. AFT is based on a combination of biofeedback and autogenic therapy which involves training physiological self-regulation as an alternative to pharmacological management. AFT was used to reliably increase tolerance to motion-sickness-inducing tests in both men and women ranging in age from 18 to 54 years. The effectiveness of AFT is found to be significantly higher than that of protective adaptation training. Data obtained show that there is no apparent effect from AFT on measures of vestibular perception and no side effects.

  5. Comparative Proteomics Reveals a Significant Bias Toward Alternative Protein Isoforms with Conserved Structure and Function

    PubMed Central

    Ezkurdia, Iakes; del Pozo, Angela; Frankish, Adam; Rodriguez, Jose Manuel; Harrow, Jennifer; Ashman, Keith; Valencia, Alfonso; Tress, Michael L.

    2012-01-01

    Advances in high-throughput mass spectrometry are making proteomics an increasingly important tool in genome annotation projects. Peptides detected in mass spectrometry experiments can be used to validate gene models and verify the translation of putative coding sequences (CDSs). Here, we have identified peptides that cover 35% of the genes annotated by the GENCODE consortium for the human genome as part of a comprehensive analysis of experimental spectra from two large publicly available mass spectrometry databases. We detected the translation to protein of “novel” and “putative” protein-coding transcripts as well as transcripts annotated as pseudogenes and nonsense-mediated decay targets. We provide a detailed overview of the population of alternatively spliced protein isoforms that are detectable by peptide identification methods. We found that 150 genes expressed multiple alternative protein isoforms. This constitutes the largest set of reliably confirmed alternatively spliced proteins yet discovered. Three groups of genes were highly overrepresented. We detected alternative isoforms for 10 of the 25 possible heterogeneous nuclear ribonucleoproteins, proteins with a key role in the splicing process. Alternative isoforms generated from interchangeable homologous exons and from short indels were also significantly enriched, both in human experiments and in parallel analyses of mouse and Drosophila proteomics experiments. Our results show that a surprisingly high proportion (almost 25%) of the detected alternative isoforms are only subtly different from their constitutive counterparts. Many of the alternative splicing events that give rise to these alternative isoforms are conserved in mouse. It was striking that very few of these conserved splicing events broke Pfam functional domains or would damage globular protein structures. 
This evidence of a strong bias toward subtle differences in CDS and likely conserved cellular function and structure is remarkable and strongly suggests that the translation of alternative transcripts may be subject to selective constraints. PMID:22446687

  6. Relationships between In-Course Alignment Indicators and Post-Course Criteria of Quality Teaching and Learning in Higher Education.

    ERIC Educational Resources Information Center

    Bastick, Tony

    The research literature on student evaluation of teaching (SET) is filled with criticisms of the process, its applications, and the student feedback questionnaire it uses. SETs are still used, however, because there has seemed to be no economical, valid, and reliable alternative. This paper reports on an alternative alignment process for…

  7. Regional analyses of streamflow characteristics

    USGS Publications Warehouse

    Riggs, H.C.

    1973-01-01

    This manual describes various ways of generalizing streamflow characteristics and evaluates the applicability and reliability of each under various hydrologic conditions. Several alternatives to regionalization are briefly described.

  8. Probabilistic Design Storm Method for Improved Flood Estimation in Ungauged Catchments

    NASA Astrophysics Data System (ADS)

    Berk, Mario; Špačková, Olga; Straub, Daniel

    2017-12-01

    The design storm approach with event-based rainfall-runoff models is a standard method for design flood estimation in ungauged catchments. The approach is conceptually simple and computationally inexpensive, but the underlying assumptions can lead to flawed design flood estimations. In particular, the implied average recurrence interval (ARI) neutrality between rainfall and runoff neglects uncertainty in other important parameters, leading to an underestimation of design floods. The selection of a single representative critical rainfall duration in the analysis leads to an additional underestimation of design floods. One way to overcome these nonconservative approximations is the use of a continuous rainfall-runoff model, which is associated with significant computational cost and requires rainfall input data that are often not readily available. As an alternative, we propose a novel Probabilistic Design Storm method that combines event-based flood modeling with basic probabilistic models and concepts from reliability analysis, in particular the First-Order Reliability Method (FORM). The proposed methodology overcomes the limitations of the standard design storm approach, while utilizing the same input information and models without excessive computational effort. Additionally, the Probabilistic Design Storm method allows deriving so-called design charts, which summarize representative design storm events (combinations of rainfall intensity and other relevant parameters) for floods with different return periods. These can be used to study the relationship between rainfall and runoff return periods. We demonstrate, investigate, and validate the method by means of an example catchment located in the Bavarian Pre-Alps, in combination with a simple hydrological model commonly used in practice.
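For the simplest possible limit state, the First-Order Reliability Method the abstract builds on reduces to a closed form. The sketch below uses a generic linear limit state g = R - S with independent normal capacity R and demand S (a textbook illustration of FORM, not the paper's rainfall-runoff model):

```python
import math

def form_beta_linear(mu_r, sd_r, mu_s, sd_s):
    """FORM for the linear limit state g = R - S with independent normal
    capacity R and demand S. In this special case the reliability index
    beta and the failure probability pf = Phi(-beta) are closed-form;
    for nonlinear limit states FORM finds beta iteratively."""
    beta = (mu_r - mu_s) / math.sqrt(sd_r ** 2 + sd_s ** 2)
    pf = 0.5 * math.erfc(beta / math.sqrt(2))   # standard normal Phi(-beta)
    return beta, pf

beta, pf = form_beta_linear(mu_r=10.0, sd_r=2.0, mu_s=5.0, sd_s=1.0)
```

In the Probabilistic Design Storm setting, the limit-state function would instead compare a flood threshold with the runoff produced by the event-based model, with rainfall intensity and the other uncertain parameters as the random inputs.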

  9. Accurate paleointensities - the multi-method approach

    NASA Astrophysics Data System (ADS)

    de Groot, Lennart

    2016-04-01

    The accuracy of models describing rapid changes in the geomagnetic field over the past millennia critically depends on the availability of reliable paleointensity estimates. Over the past decade, methods to derive paleointensities from lavas (the only recorder of the geomagnetic field that is available all over the globe and through geologic times) have seen significant improvements, and various alternative techniques have been proposed. The 'classical' Thellier-style approach was optimized and selection criteria were defined in the 'Standard Paleointensity Definitions' (Paterson et al., 2014). The Multispecimen approach was validated, underscoring the importance of additional tests and criteria for assessing Multispecimen results. Recently, a non-heating, relative paleointensity technique was proposed, the pseudo-Thellier protocol, which shows great potential in both accuracy and efficiency but currently lacks a solid theoretical underpinning. Here I present work using all three of the aforementioned paleointensity methods on suites of young lavas taken from the volcanic islands of Hawaii, La Palma, Gran Canaria, Tenerife, and Terceira. Many of the sampled cooling units are <100 years old, so the actual field strength at the time of cooling is reasonably well known. Rather intuitively, flows that produce coherent results from two or more different paleointensity methods yield the most accurate estimates of the paleofield. Furthermore, the results for some flows pass the selection criteria for one method but fail in other techniques. Scrutinizing and combining all acceptable results yielded reliable paleointensity estimates for 60-70% of all sampled cooling units - an exceptionally high success rate. This 'multi-method paleointensity approach' therefore has high potential to provide the much-needed paleointensities to improve geomagnetic field models for the Holocene.

  10. Application of gas chromatography/flame ionization detector-based metabolite fingerprinting for authentication of Asian palm civet coffee (Kopi Luwak).

    PubMed

    Jumhawan, Udi; Putri, Sastia Prama; Yusianto; Bamba, Takeshi; Fukusaki, Eiichiro

    2015-11-01

    Development of authenticity screening for Asian palm civet coffee, the world-renowned priciest coffee, was previously reported using metabolite profiling through gas chromatography/mass spectrometry (GC/MS). However, a major drawback of this approach is the high cost of the instrument and its maintenance. Therefore, an alternative method is needed for quality and authenticity evaluation of civet coffee. A rapid, reliable and cost-effective analysis employing a universal detector, GC coupled with a flame ionization detector (FID), and metabolite fingerprinting has been established for discrimination analysis of 37 commercial and non-commercial coffee bean extracts. Gas chromatography/flame ionization detection (GC/FID) provided higher sensitivity than GC/MS over a similar range of detected compounds. In combination with multivariate analysis, GC/FID could successfully reproduce quality prediction from GC/MS for differentiation of commercial civet coffee, regular coffee and a coffee blend with 50 wt% civet coffee content, without prior metabolite details. Our study demonstrated that GC/FID-based metabolite fingerprinting can be effectively applied as an alternative method for coffee authenticity screening in industry. Copyright © 2015. Published by Elsevier B.V.

  11. Patterns of Cognitive Strengths and Weaknesses: Identification Rates, Agreement, and Validity for Learning Disabilities Identification

    PubMed Central

    Miciak, Jeremy; Fletcher, Jack M.; Stuebing, Karla; Vaughn, Sharon; Tolar, Tammy D.

    2014-01-01

    Purpose Few empirical investigations have evaluated LD identification methods based on a pattern of cognitive strengths and weaknesses (PSW). This study investigated the reliability and validity of two proposed PSW methods: the concordance/discordance method (C/DM) and the cross-battery assessment (XBA) method. Methods Cognitive assessment data for 139 adolescents demonstrating inadequate response to intervention were used to empirically classify participants as meeting or not meeting PSW LD identification criteria under the two approaches, permitting an analysis of: (1) LD identification rates; (2) agreement between methods; and (3) external validity. Results LD identification rates varied between the two methods depending upon the cut point for low achievement, with low agreement for LD identification decisions. Comparisons of groups that met and did not meet LD identification criteria on external academic variables were largely null, raising questions of external validity. Conclusions This study found low agreement and little evidence of validity for LD identification decisions based on PSW methods. An alternative may be to use multiple measures of academic achievement to guide intervention. PMID:24274155

  12. Identification and validation of nebulized aerosol devices for sputum induction

    PubMed Central

    Davidson, Warren J; Dennis, John; The, Stephanie; Litoski, Belinda; Pieron, Cora; Leigh, Richard

    2014-01-01

    Induced sputum cell count measurement has proven reliability for evaluating airway inflammation in patients with asthma and other airway diseases. Although the use of nebulizer devices for sputum induction is commonplace, they are generally labelled as single-patient devices by the manufacturer and, therefore, cannot be used for multiple patients in large clinical sputum induction programs due to infection-control requirements. Accordingly, this study investigated the aerosol characteristics of alternative devices that could be used in such programs. BACKGROUND: Induced sputum cell counts are a noninvasive and reliable method for evaluating the presence, type and degree of airway inflammation in patients with asthma. Currently, standard nebulizer devices used for sputum induction in multiple patients are labelled as single-patient devices by the manufacturer, which conflicts with infection prevention and control requirements. As such, these devices cannot feasibly be used in a clinical sputum induction program. Therefore, there is a need to identify alternative nebulizer devices that are either disposable or labelled for multi-patient use. OBJECTIVE: To apply validated rigorous, scientific testing methods to identify and validate commercially available nebulizer devices appropriate for use in a clinical sputum induction program. METHODS: Measurement of nebulized aerosol output and size for the selected nebulizer designs followed robust International Organization for Standardization methods. Sputum induction using two of these nebulizers was successfully performed on 10 healthy adult subjects. The cytotechnologist performing sputum cell counts was blinded to the type of nebulizer used. RESULTS: The studied nebulizers had variable aerosol outputs. The AeroNeb Solo (Aerogen, Ireland), Omron NE-U17 (Omron, Japan) and EASYneb II (Flaem Nuova, Italy) systems were found to have similar measurements of aerosol size. 
There was no significant difference in induced sputum cell results between the AeroNeb Solo and EASYneb II devices. DISCUSSION: There is a need for rigorous, scientific evaluation of nebulizer devices for clinical applications, including sputum induction, for measurement of cell counts. CONCLUSION: The present study was the most comprehensive analysis of different nebulizer devices for sputum induction to measure cell counts, and provides a framework for appropriate evaluation of nebulizer devices for induced sputum testing. PMID:24288700

  13. First international collaborative study to evaluate rabies antibody detection method for use in monitoring the effectiveness of oral vaccination programmes in fox and raccoon dog in Europe.

    PubMed

    Wasniewski, M; Almeida, I; Baur, A; Bedekovic, T; Boncea, D; Chaves, L B; David, D; De Benedictis, P; Dobrostana, M; Giraud, P; Hostnik, P; Jaceviciene, I; Kenklies, S; König, M; Mähar, K; Mojzis, M; Moore, S; Mrenoski, S; Müller, T; Ngoepe, E; Nishimura, M; Nokireki, T; Pejovic, N; Smreczak, M; Strandbygaard, B; Wodak, E; Cliquet, F

    2016-12-01

    The most effective and sustainable method to control and eliminate rabies in wildlife is the oral rabies vaccination (ORV) of target species, namely foxes and raccoon dogs in Europe. According to WHO and OIE, the effectiveness of oral vaccination campaigns should be regularly assessed via disease surveillance and ORV antibody monitoring. Rabies antibodies are generally screened for in field animal cadavers, whose body fluids are often of poor quality. Therefore, the use of alternative methods such as the enzyme-linked immunosorbent assay (ELISA) has been proposed to improve reliability of serological results obtained on wildlife samples. We undertook an international collaborative study to determine if the commercial BioPro ELISA Rabies Ab kit is a reliable and reproducible tool for rabies serological testing. Our results reveal that the overall specificity evaluated on naive samples reached 96.7%, and the coefficients of concordance obtained for fox and raccoon dog samples were 97.2% and 97.5%, respectively. The overall agreement values obtained for the four marketed oral vaccines used in Europe were all equal to or greater than 95%. The coefficients of concordance obtained by laboratories ranged from 87.2% to 100%. The results of this collaborative study show good robustness and reproducibility of the BioPro ELISA Rabies Ab kit. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. PCA leverage: outlier detection for high-dimensional functional magnetic resonance imaging data.

    PubMed

    Mejia, Amanda F; Nebel, Mary Beth; Eloyan, Ani; Caffo, Brian; Lindquist, Martin A

    2017-07-01

    Outlier detection for high-dimensional (HD) data is a popular topic in modern statistical research. However, one source of HD data that has received relatively little attention is functional magnetic resonance imaging (fMRI), which consists of hundreds of thousands of measurements sampled at hundreds of time points. At a time when the availability of fMRI data is rapidly growing, primarily through large, publicly available grassroots datasets, automated quality control and outlier detection methods are greatly needed. We propose principal components analysis (PCA) leverage and demonstrate how it can be used to identify outlying time points in an fMRI run. Furthermore, PCA leverage is a measure of the influence of each observation on the estimation of principal components, which are often of interest in fMRI data. We also propose an alternative measure, PCA robust distance, which is less sensitive to outliers and has controllable statistical properties. The proposed methods are validated through simulation studies and are shown to be highly accurate. We also conduct a reliability study using resting-state fMRI data from the Autism Brain Imaging Data Exchange and find that removal of outliers using the proposed methods results in more reliable estimation of subject-level resting-state networks using independent components analysis. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
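The leverage idea above is simple to compute: the leverage of a time point is the row sum of its squared scores on the first k principal components. A minimal sketch on synthetic data (the injected spike, component count, and 3x-median threshold are illustrative assumptions, not the authors' settings):

```python
import numpy as np

def pca_leverage(X, k):
    """Leverage of each time point from the first k principal components.

    X : (time points x voxels) data matrix, e.g. one fMRI run.
    Returns values in [0, 1]; the leverages of all time points sum to k,
    and unusually large values flag influential (potentially outlying)
    time points.
    """
    Xc = X - X.mean(axis=0)                    # center each voxel series
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return np.sum(U[:, :k] ** 2, axis=1)       # row sums of squared scores

# synthetic run: 200 time points x 500 voxels, with a spike at t = 50
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 500))
X[50] += 20.0                                  # injected motion-like artifact
lev = pca_leverage(X, k=5)
outliers = np.where(lev > 3 * np.median(lev))[0]
```

The spiked time point dominates the first component and receives a leverage near 1, well above the bulk of the run.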

  15. NURD: an implementation of a new method to estimate isoform expression from non-uniform RNA-seq data

    PubMed Central

    2013-01-01

    Background RNA-Seq technology has been used widely in transcriptome study, and one of the most important applications is to estimate the expression level of genes and their alternative splicing isoforms. There have been several algorithms published to estimate the expression based on different models. Recently Wu et al. published a method that can accurately estimate isoform level expression by considering position-related sequencing biases using nonparametric models. The method has advantages in handling different read distributions, but there hasn’t been an efficient program to implement this algorithm. Results We developed an efficient implementation of the algorithm in the program NURD. It uses a binary interval search algorithm. The program can correct both the global tendency of sequencing bias in the data and local sequencing bias specific to each gene. The correction makes the isoform expression estimation more reliable under various read distributions. And the implementation is computationally efficient in both the memory cost and running time and can be readily scaled up for huge datasets. Conclusion NURD is an efficient and reliable tool for estimating the isoform expression level. Given the reads mapping result and gene annotation file, NURD will output the expression estimation result. The package is freely available for academic use at http://bioinfo.au.tsinghua.edu.cn/software/NURD/. PMID:23837734

  16. Reliability and comparison of Kinect-based methods for estimating spatiotemporal gait parameters of healthy and post-stroke individuals.

    PubMed

    Latorre, Jorge; Llorens, Roberto; Colomer, Carolina; Alcañiz, Mariano

    2018-04-27

    Different studies have analyzed the potential of the off-the-shelf Microsoft Kinect, in its different versions, to estimate spatiotemporal gait parameters as a portable markerless low-cost alternative to laboratory grade systems. However, variability in populations, measures, and methodologies prevents accurate comparison of the results. The objective of this study was to determine and compare the reliability of the existing Kinect-based methods to estimate spatiotemporal gait parameters in healthy and post-stroke adults. Forty-five healthy individuals and thirty-eight stroke survivors participated in this study. Participants walked five meters at a comfortable speed and their spatiotemporal gait parameters were estimated from the data retrieved by a Kinect v2, using the most common methods in the literature, and by visual inspection of the videotaped performance. Errors between both estimations were computed. For both healthy and post-stroke participants, highest accuracy was obtained when using the speed of the ankles to estimate gait speed (3.6-5.5 cm/s), stride length (2.5-5.5 cm), and stride time (about 45 ms), and when using the distance between the sacrum and the ankles and toes to estimate double support time (about 65 ms) and swing time (60-90 ms). Although the accuracy of these methods is limited, these measures could occasionally complement traditional tools. Copyright © 2018 Elsevier Ltd. All rights reserved.
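The ankle-based estimates described above reduce to event detection on joint trajectories: strides are intervals between successive gait events of the same foot. A toy sketch, assuming a synthetic sacrum-relative ankle signal rather than real Kinect output:

```python
import numpy as np

def stride_times(ankle_x, fps):
    """Estimate stride times from one ankle's forward-position signal.

    Gait events are approximated as local maxima of the sacrum-relative
    forward position; a stride is the interval between successive events
    of the same foot. Simplified sketch of the distance-based methods
    compared in the Kinect gait literature.
    """
    # local maxima: greater than the previous sample, not less than the next
    peaks = [i for i in range(1, len(ankle_x) - 1)
             if ankle_x[i] > ankle_x[i - 1] and ankle_x[i] >= ankle_x[i + 1]]
    return np.diff(peaks) / fps                # seconds between events

t = np.arange(0, 5, 1 / 30)                    # 5 s at 30 fps (Kinect v2 rate)
signal = np.sin(2 * np.pi * t / 1.1)           # synthetic 1.1 s gait cycle
stride = stride_times(signal, fps=30)
```

On the synthetic signal the recovered stride times match the 1.1 s cycle; real skeletal data would additionally need smoothing and occlusion handling.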

  17. Serbian translation of the 20-item Toronto Alexithymia Scale: psychometric properties and the new methodological approach in translating scales.

    PubMed

    Trajanović, Nikola N; Djurić, Vladimir; Latas, Milan; Milovanović, Srdjan; Jovanović, Aleksandar A; Djurić, Dusan

    2013-01-01

    Since the inception of the alexithymia construct in the 1970s, there has been a continuous effort to improve both its theoretical postulates and its clinical utility through the development, standardization and validation of assessment scales. The aim of this study was to validate the Serbian translation of the 20-item Toronto Alexithymia Scale (TAS-20) and to propose a new method of translating scales with the property of temporal stability. The scale was expertly translated by bilingual medical professionals and a linguist, and given to a sample of bilingual participants from the general population who completed both the English and the Serbian version of the scale one week apart. The findings showed that the Serbian version of the TAS-20 had good internal consistency reliability for the total scale (alpha=0.86) and acceptable reliability of the three factors (alpha=0.71-0.79). The analysis confirmed the validity and consistency of the Serbian translation of the scale, with observed weaknesses of the factorial structure consistent with studies in other languages. The results also showed that the method of utilizing a self-controlled bilingual subject is a useful alternative to the back-translation method, particularly for linguistically and structurally sensitive scales, or where a larger sample is not available. This method, dubbed 'forth-translation', could be used to translate psychometric scales measuring properties that have temporal stability over a period of at least several weeks.

  18. Establishing Reliability and Validity of the Criterion Referenced Exam of GeoloGy Standards EGGS

    NASA Astrophysics Data System (ADS)

    Guffey, S. K.; Slater, S. J.; Slater, T. F.; Schleigh, S.; Burrows, A. C.

    2016-12-01

    Discipline-based geoscience education researchers have considerable need for a criterion-referenced, easy-to-administer and -score conceptual diagnostic survey for undergraduates taking introductory science survey courses, so that faculty can better monitor the learning impacts of various interactive teaching approaches. To support ongoing education research across the geosciences, we are continuing rigorous and systematic work to firmly establish the reliability and validity of the recently released Exam of GeoloGy Standards, EGGS. In educational testing, reliability refers to the consistency or stability of test scores, whereas validity refers to the accuracy of the inferences or interpretations one makes from test scores. Several types of reliability measures are being applied to the iterative refinement of the EGGS survey, including test-retest, alternate-form, split-half, internal consistency, and interrater reliability measures. EGGS rates strongly on most measures of reliability. For one, Cronbach's alpha provides a quantitative index of the extent to which students answer items consistently throughout the test, based on inter-item correlations. Traditional item analysis methods, including item difficulty and item discrimination, further quantify how reliably a particular item assesses students. Validity, on the other hand, is perhaps best described by the word accuracy. For example, content validity is the extent to which a measurement reflects the specific intended domain of the content, stemming from the judgments of people who are either experts in testing that particular content area or content experts. 
    Perhaps more importantly, face validity is a judgment of how well an instrument reflects the science "at face value" and refers to the extent to which a test appears to measure the targeted scientific domain as viewed by laypersons, examinees, test users, the public, and other invested stakeholders.
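The internal-consistency and item statistics named in this record follow directly from classical test theory. A sketch on synthetic 0/1 item scores (the simulated data and parameters are illustrative, not EGGS results):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (students x items) score matrix."""
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var.sum() / total_var)

def item_stats(scores):
    """Classical item difficulty (proportion correct) and discrimination
    (correlation of each item with the rest-of-test score)."""
    difficulty = scores.mean(axis=0)
    disc = np.empty(scores.shape[1])
    for j in range(scores.shape[1]):
        rest = scores.sum(axis=1) - scores[:, j]  # exclude the item itself
        disc[j] = np.corrcoef(scores[:, j], rest)[0, 1]
    return difficulty, disc

# synthetic responses: 200 students, 8 items driven by a common ability
rng = np.random.default_rng(1)
ability = rng.normal(size=(200, 1))
scores = (ability + 0.5 * rng.normal(size=(200, 8)) > 0).astype(float)
alpha = cronbach_alpha(scores)
difficulty, disc = item_stats(scores)
```

Because the simulated items share a common ability factor, alpha comes out high and every item discriminates positively; real item analysis would flag items whose discrimination is near zero or negative.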

  19. Reliability and Validity Evidence of Multiple Balance Assessments in Athletes With a Concussion

    PubMed Central

    Murray, Nicholas; Salvatore, Anthony; Powell, Douglas; Reed-Jones, Rebecca

    2014-01-01

    Context: An estimated 300 000 sport-related concussion injuries occur in the United States annually. Approximately 30% of individuals with concussions experience balance disturbances. Common methods of balance assessment include the Clinical Test of Sensory Organization and Balance (CTSIB), the Sensory Organization Test (SOT), the Balance Error Scoring System (BESS), and the Romberg test; however, the National Collegiate Athletic Association recommended the Wii Fit as an alternative measure of balance in athletes with a concussion. A central concern regarding the implementation of the Wii Fit is whether it is reliable and valid for measuring balance disturbance in athletes with concussion. Objective: To examine the reliability and validity evidence for the CTSIB, SOT, BESS, Romberg test, and Wii Fit for detecting balance disturbance in athletes with a concussion. Data Sources: Literature considered for review included publications with reliability and validity data for the assessments of balance (CTSIB, SOT, BESS, Romberg test, and Wii Fit) from PubMed, PsycINFO, and CINAHL. Data Extraction: We identified 63 relevant articles for consideration in the review. Of the 63 articles, 28 were considered appropriate for inclusion and 35 were excluded. Data Synthesis: No current reliability or validity information supports the use of the CTSIB, SOT, Romberg test, or Wii Fit for balance assessment in athletes with a concussion. The BESS demonstrated moderate to high reliability (intraclass correlation coefficient = 0.87) and low to moderate validity (sensitivity = 34%, specificity = 87%). However, the Romberg test and Wii Fit have been shown to be reliable tools in the assessment of balance in patients with Parkinson disease. Conclusions: The BESS can evaluate balance problems after a concussion. However, it lacks the ability to detect balance problems after the third day of recovery. 
Further investigation is needed to establish the use of the CTSIB, SOT, Romberg test, and Wii Fit for assessing balance in athletes with concussions. PMID:24933431

  20. New Rock-Drilling Method in 'Mars Yard' Test

    NASA Image and Video Library

    2017-10-23

    This photo taken in the "Mars Yard" at NASA's Jet Propulsion Laboratory, Pasadena, California, on Aug. 1, 2017, shows a step in development of possible alternative techniques that NASA's Curiosity Mars rover might be able to use to resume drilling into rocks on Mars. In late 2016, after Curiosity's drill had collected sample material from 15 Martian rocks in four years, the drill's feed mechanism ceased working reliably. That motorized mechanism moved the bit forward or back with relation to stabilizer posts on either side of the bit. In normal drilling by Curiosity, the stabilizers were positioned on the target rock first, and then the feed mechanism extended the rotation-percussion bit into the rock. In the alternative technique seen here, called "feed-extended drilling," the test rover's stabilizers are not used to touch the rock. The bit is advanced into the rock by motion of the robotic arm rather than the drill's feed mechanism. https://photojournal.jpl.nasa.gov/catalog/PIA22062

  1. Development and ultra-structure of an ultra-thin silicone epidermis of bioengineered alternative tissue.

    PubMed

    Wessels, Quenton; Pretorius, Etheresia

    2015-08-01

    Burn wound care today has a primary objective of temporary or permanent wound closure. Commercially available engineered alternative tissues have become a valuable adjunct to the treatment of burn injuries. Their constituents can be biological, alloplastic or a combination of both. Here the authors describe aspects of the development of a siloxane epidermis for collagen-glycosaminoglycan and nylon-based artificial skin replacement products. A method to fabricate an ultra-thin epidermal equivalent is described. Pores, to allow the escape of wound exudate, were punched, and a tri-filament nylon mesh or collagen scaffold was embedded; silicone polymerisation followed at 120°C for 5 minutes. The ultra-structure of these bilaminates was assessed through scanning electron microscopy. An ultra-thin biomedical grade siloxane film was reliably created through precision coating on a pre-treated polyethylene terephthalate carrier. © 2013 The Authors. International Wound Journal © 2013 Medicalhelplines.com Inc and John Wiley & Sons Ltd.

  2. t4 Workshop Report

    PubMed Central

    Silbergeld, Ellen K.; Contreras, Elizabeth Q.; Hartung, Thomas; Hirsch, Cordula; Hogberg, Helena; Jachak, Ashish C.; Jordan, William; Landsiedel, Robert; Morris, Jeffery; Patri, Anil; Pounds, Joel G.; de Vizcaya Ruiz, Andrea; Shvedova, Anna; Tanguay, Robert; Tatarazako, Norihasa; van Vliet, Erwin; Walker, Nigel J.; Wiesner, Mark; Wilcox, Neil; Zurlo, Joanne

    2014-01-01

    Summary In October 2010, a group of experts met as part of the transatlantic think tank for toxicology (t4) to exchange ideas about the current status and future of safety testing of nanomaterials. At present, there is no widely accepted path forward to assure appropriate and effective hazard identification for engineered nanomaterials. The group discussed needs for characterization of nanomaterials and identified testing protocols that incorporate the use of innovative alternative whole models such as zebrafish or C. elegans, as well as in vitro or alternative methods to examine specific functional pathways and modes of action. The group proposed elements of a potential testing scheme for nanomaterials that works towards an integrated testing strategy, incorporating the goals of the NRC report Toxicity Testing in the 21st Century: A Vision and a Strategy by focusing on pathways of toxic response, and utilizing an evidence-based strategy for developing the knowledge base for safety assessment. Finally, the group recommended that a reliable, open, curated database be developed that interfaces with existing databases to enable sharing of information. PMID:21993959

  3. Accuracy of a Classical Test Theory-Based Procedure for Estimating the Reliability of a Multistage Test. Research Report. ETS RR-17-02

    ERIC Educational Resources Information Center

    Kim, Sooyeon; Livingston, Samuel A.

    2017-01-01

    The purpose of this simulation study was to assess the accuracy of a classical test theory (CTT)-based procedure for estimating the alternate-forms reliability of scores on a multistage test (MST) having 3 stages. We generated item difficulty and discrimination parameters for 10 parallel, nonoverlapping forms of the complete 3-stage test and…

  4. Objectivity, Reliability, and Validity of the Bent-Knee Push-Up for College-Age Women

    ERIC Educational Resources Information Center

    Wood, Heather M.; Baumgartner, Ted A.

    2004-01-01

    The revised push-up test has been found to have good validity but it produces many zero scores for women. Maybe there should be an alternative to the revised push-up test for college-age women. The purpose of this study was to determine the objectivity, reliability, and validity for the bent-knee push-up test (executed on hands and knees) for…

  5. Applicability and Limitations of Reliability Allocation Methods

    NASA Technical Reports Server (NTRS)

    Cruz, Jose A.

    2016-01-01

    The reliability allocation process may be described as the process of assigning reliability requirements to individual components within a system to attain the specified system reliability. For large systems, the allocation is often performed at different stages of system design, beginning at the conceptual stage. As the system design develops and more information about components and the operating environment becomes available, different allocation methods can be considered. Reliability allocation methods are usually divided into two categories: weighting factors and optimal reliability allocation. When properly applied, these methods can produce reasonable approximations. Reliability allocation techniques have limitations and implied assumptions that need to be understood by system engineers; applying them without understanding those limitations and assumptions can produce unrealistic results. This report addresses weighting factors and optimal reliability allocation techniques, and identifies the applicability and limitations of each reliability allocation technique.
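As an illustration of the weighting-factor category, one common apportionment assigns each component of a series system an exponent proportional to its weight, so the product of the allocated reliabilities recovers the system target. The three-component system and its complexity weights below are hypothetical:

```python
import math

def allocate_reliability(r_system, weights):
    """Weighting-factor reliability allocation for a series system.

    Component i receives R_i = R_sys ** (w_i / sum(w)), so the product
    of the allocated reliabilities equals the system requirement.
    Larger weights (e.g. greater complexity) receive laxer targets.
    """
    total = sum(weights)
    return [r_system ** (w / total) for w in weights]

# hypothetical three-component series system, complexity weights 3:1:1
alloc = allocate_reliability(0.95, [3, 1, 1])
product = math.prod(alloc)  # recovers the 0.95 system requirement
```

Note how the heaviest-weighted (most complex) component is assigned the lowest reliability target, which is exactly the trade-off weighting-factor schemes are meant to encode.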

  6. A 'feather-trap' for collecting DNA samples from birds.

    PubMed

    Maurer, Golo; Beck, Nadeena; Double, Michael C

    2010-01-01

    Genetic analyses of birds are usually based on DNA extracted from a blood sample. For some species, however, obtaining blood samples is difficult because they are sensitive to handling, pose a conservation or animal welfare concern, or evade capture. In such cases, feathers obtained from live birds in the wild can provide an alternative source of DNA. Here, we provide the first description and evaluation of a 'feather-trap', consisting of small strips of double-sided adhesive tape placed close to a nest with chicks, as a simple, inexpensive and minimally invasive method to collect feathers. The feather-trap was tested in tropical conditions on the Australian pheasant coucal (Centropus phasianinus). None of the 12 pairs of coucals on which the feather-trap was used abandoned the nest, and feeding rates did not differ from those of birds not exposed to a feather-trap. On average, 4.2 feathers were collected per trap over 2-5 days and, despite exposure to monsoonal rain, DNA was extracted from 71.4% of samples, albeit at low concentrations. The amount of genomic DNA extracted from each feather was sufficient to reliably genotype individuals at up to five microsatellite loci for parentage analysis. We show that a feather-trap can provide a reliable alternative for obtaining DNA in species where taking blood is difficult. It may also prove useful for collecting feather samples for other purposes, e.g. stable-isotope analysis. © 2009 Blackwell Publishing Ltd.

  7. Alternative magnetic flux leakage modalities for pipeline inspection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katragadda, G.; Lord, W.; Sun, Y.S.

    1996-05-01

    Increasing quality consciousness is placing higher demands on the accuracy and reliability of inspection systems used in defect detection and characterization. Nondestructive testing techniques often rely on using multi-transducer approaches to obtain greater defect sensitivity. This paper investigates the possibility of taking advantage of alternative modalities associated with the standard magnetic flux leakage tool to obtain additional defect information, while still using a single excitation source.

  8. High Energy Astronomy Observatory, Mission C, Phase A. Volume 3: Appendices

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Technical data, and experiment and spacecraft alternatives are presented in support of the HEAO-C, whose primary objective is a detailed study of the more interesting high energy sources, using grazing incidence X-ray telescopes and a spacecraft pointing accuracy of ±1 arc minute. The analyses presented cover the mission analysis and launch vehicle; thermal control trade studies and supporting analyses; attitude sensing and control analyses; electrical systems; and reliability analysis. The alternate experiments which were considered are listed, and the advantages and disadvantages of several alternate observatory configurations are assessed.

  9. Quantitative metabolomics of the thermophilic methylotroph Bacillus methanolicus.

    PubMed

    Carnicer, Marc; Vieira, Gilles; Brautaset, Trygve; Portais, Jean-Charles; Heux, Stephanie

    2016-06-01

    The gram-positive bacterium Bacillus methanolicus MGA3 is a promising candidate for methanol-based biotechnologies. Accurate determination of intracellular metabolites is crucial for engineering this bacterium into an efficient microbial cell factory. Due to the diversity of chemical and cell properties, an experimental protocol validated on B. methanolicus is needed. Here, a systematic evaluation of different techniques for establishing a reliable basis for metabolome investigations is presented. Metabolome analysis focused on metabolites closely linked with the central methanol metabolism of B. methanolicus. As an alternative to cold-solvent-based procedures, a solvent-free quenching strategy using stainless steel beads cooled to -20 °C was assessed. The precision and consistency of the measurements, and the extent of metabolite leakage from quenched cells, were evaluated in procedures with and without cell separation. The most accurate and reliable performance was provided by the method without cell separation, as significant metabolite leakage occurred in the procedures based on fast filtration. As a biological test case, the best protocol was used to assess the metabolome of B. methanolicus grown in chemostat culture on methanol at two different growth rates, and its validity was demonstrated. The presented protocol is a first and helpful step towards developing reliable metabolomics data for the thermophilic methylotroph B. methanolicus and will help in designing an efficient methylotrophic cell factory.

  10. Reliable oligonucleotide conformational ensemble generation in explicit solvent for force field assessment using reservoir replica exchange molecular dynamics simulations

    PubMed Central

    Henriksen, Niel M.; Roe, Daniel R.; Cheatham, Thomas E.

    2013-01-01

    Molecular dynamics force field development and assessment requires a reliable means for obtaining a well-converged conformational ensemble of a molecule in both a time-efficient and cost-effective manner. This remains a challenge for RNA because its rugged energy landscape results in slow conformational sampling and accurate results typically require explicit solvent which increases computational cost. To address this, we performed both traditional and modified replica exchange molecular dynamics simulations on a test system (alanine dipeptide) and an RNA tetramer known to populate A-form-like conformations in solution (single-stranded rGACC). A key focus is on providing the means to demonstrate that convergence is obtained, for example by investigating replica RMSD profiles and/or detailed ensemble analysis through clustering. We found that traditional replica exchange simulations still require prohibitive time and resource expenditures, even when using GPU accelerated hardware, and our results are not well converged even at 2 microseconds of simulation time per replica. In contrast, a modified version of replica exchange, reservoir replica exchange in explicit solvent, showed much better convergence and proved to be both a cost-effective and reliable alternative to the traditional approach. We expect this method will be attractive for future research that requires quantitative conformational analysis from explicitly solvated simulations. PMID:23477537

  11. Reliable oligonucleotide conformational ensemble generation in explicit solvent for force field assessment using reservoir replica exchange molecular dynamics simulations.

    PubMed

    Henriksen, Niel M; Roe, Daniel R; Cheatham, Thomas E

    2013-04-18

    Molecular dynamics force field development and assessment requires a reliable means for obtaining a well-converged conformational ensemble of a molecule in both a time-efficient and cost-effective manner. This remains a challenge for RNA because its rugged energy landscape results in slow conformational sampling and accurate results typically require explicit solvent which increases computational cost. To address this, we performed both traditional and modified replica exchange molecular dynamics simulations on a test system (alanine dipeptide) and an RNA tetramer known to populate A-form-like conformations in solution (single-stranded rGACC). A key focus is on providing the means to demonstrate that convergence is obtained, for example, by investigating replica RMSD profiles and/or detailed ensemble analysis through clustering. We found that traditional replica exchange simulations still require prohibitive time and resource expenditures, even when using GPU accelerated hardware, and our results are not well converged even at 2 μs of simulation time per replica. In contrast, a modified version of replica exchange, reservoir replica exchange in explicit solvent, showed much better convergence and proved to be both a cost-effective and reliable alternative to the traditional approach. We expect this method will be attractive for future research that requires quantitative conformational analysis from explicitly solvated simulations.
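
    The exchange step underlying both variants can be illustrated with the standard temperature replica-exchange Metropolis criterion. This is a minimal sketch, not the paper's implementation; in reservoir replica exchange an analogous acceptance rule couples the highest-temperature replica to a pre-generated structure reservoir.

```python
import math

def swap_accept(E_i, E_j, T_i, T_j, kB=0.0019872041):
    """Metropolis acceptance probability for swapping configurations between
    replicas at temperatures T_i and T_j with potential energies E_i and E_j
    (kB in kcal/mol/K, matching common MD force-field units)."""
    delta = (1.0 / (kB * T_i) - 1.0 / (kB * T_j)) * (E_i - E_j)
    if delta >= 0.0:
        return 1.0
    return math.exp(delta)

# A swap between configurations of equal energy is always accepted:
print(swap_accept(-100.0, -100.0, 300.0, 350.0))  # 1.0
```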

  12. PWSCC Assessment by Using Extended Finite Element Method

    NASA Astrophysics Data System (ADS)

    Lee, Sung-Jun; Lee, Sang-Hwan; Chang, Yoon-Suk

    2015-12-01

    The head penetration nozzle of the control rod driving mechanism (CRDM) is known to be susceptible to primary water stress corrosion cracking (PWSCC) due to welding-induced residual stress. In particular, the J-groove dissimilar metal weld regions have received much attention in previous studies. However, even though several advanced techniques such as the weight function and finite element alternating methods have been introduced to predict the occurrence of PWSCC, difficulties remain with respect to applicability and efficiency. In this study, the extended finite element method (XFEM), which allows convenient crack modeling by enriching degrees of freedom with special displacement functions, was employed to evaluate the structural integrity of the CRDM head penetration nozzle. The resulting stress intensity factors of surface cracks were compared with those suggested in the American Society of Mechanical Engineers (ASME) code to verify the reliability of the proposed method. The detailed results from the FE analyses are fully discussed in the manuscript.

  13. Robust Coefficients Alpha and Omega and Confidence Intervals With Outlying Observations and Missing Data: Methods and Software.

    PubMed

    Zhang, Zhiyong; Yuan, Ke-Hai

    2016-06-01

    Cronbach's coefficient alpha is a widely used reliability measure in social, behavioral, and education sciences. It is reported in nearly every study that involves measuring a construct through multiple items. With non-tau-equivalent items, McDonald's omega has been used as a popular alternative to alpha in the literature. Traditional estimation methods for alpha and omega often implicitly assume that data are complete and normally distributed. This study proposes robust procedures to estimate both alpha and omega as well as corresponding standard errors and confidence intervals from samples that may contain potential outlying observations and missing values. The influence of outlying observations and missing data on the estimates of alpha and omega is investigated through two simulation studies. Results show that the newly developed robust method yields substantially improved alpha and omega estimates as well as better coverage rates of confidence intervals than the conventional nonrobust method. An R package coefficientalpha is developed and demonstrated to obtain robust estimates of alpha and omega.
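
    The conventional (nonrobust) alpha estimator that the proposed robust procedure improves on can be computed directly from item scores. This is a generic illustration of the textbook formula, not the paper's coefficientalpha R package; omega additionally requires a fitted factor model (loadings and uniquenesses) and is omitted here.

```python
import numpy as np

def cronbach_alpha(X):
    """Conventional Cronbach's alpha for an (n_subjects x k_items) score
    matrix: alpha = k/(k-1) * (1 - sum of item variances / total variance)."""
    X = np.asarray(X, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Three identical items (perfectly correlated) yield alpha = 1:
X = np.tile(np.array([[1.0], [2.0], [3.0], [4.0]]), (1, 3))
print(round(cronbach_alpha(X), 3))  # 1.0
```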

  14. Robust estimation procedure in panel data model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shariff, Nurul Sima Mohamad; Hamzah, Nor Aishah

    2014-06-19

    Panel data modeling has received great attention in econometric research recently. This is due to the availability of data sources and the interest in studying cross sections of individuals observed over time. However, problems may arise in modeling the panel in the presence of cross sectional dependence and outliers. Even though there are a few methods that take into consideration the presence of cross sectional dependence in the panel, these methods may provide inconsistent parameter estimates and inferences when outliers occur in the panel. As such, an alternative method that is robust to outliers and cross sectional dependence is introduced in this paper. The properties and construction of the confidence interval for the parameter estimates are also considered. The robustness of the procedure is investigated and comparisons are made to the existing method via simulation studies. Our results show that the robust approach is able to produce accurate and reliable parameter estimates under the conditions considered.

  15. Comparison of manual and automatic techniques for substriatal segmentation in 11C-raclopride high-resolution PET studies.

    PubMed

    Johansson, Jarkko; Alakurtti, Kati; Joutsa, Juho; Tohka, Jussi; Ruotsalainen, Ulla; Rinne, Juha O

    2016-10-01

    The striatum is the primary target in regional 11C-raclopride PET studies, and despite its small volume, it contains several functional and anatomical subregions. The outcome of a quantitative dopamine receptor study using 11C-raclopride PET depends heavily on the quality of the region-of-interest (ROI) definition of these subregions. The aim of this study was to evaluate subregional analysis techniques because new approaches have emerged but have not yet been compared directly. In this paper, we compared manual ROI delineation with several automatic methods. The automatic methods used either direct clustering of the PET image or individualization of chosen brain atlases on the basis of MRI or PET image normalization. State-of-the-art normalization methods and atlases were applied, including those provided in the FreeSurfer, Statistical Parametric Mapping 8 (SPM8), and FSL software packages. Evaluation of the automatic methods was based on voxel-wise congruity with the manual delineations and on the test-retest variability and reliability of the outcome measures, using data from seven healthy male participants who were scanned twice with 11C-raclopride PET on the same day. The results show that both manual and automatic methods can be used to define striatal subregions. Although most of the methods performed well with respect to the test-retest variability and reliability of binding potential, the smallest average test-retest variability and SEM were obtained using a connectivity-based atlas and PET normalization (test-retest variability = 4.5%, SEM = 0.17). The current state-of-the-art automatic ROI methods can be considered good alternatives to subjective and laborious manual segmentation in 11C-raclopride PET studies.

  16. CB4-03: An Eye on the Future: A Review of Data Virtualization Techniques to Improve Research Analytics

    PubMed Central

    Richter, Jack; McFarland, Lela; Bredfeldt, Christine

    2012-01-01

    Background/Aims Integrating data across systems can be a daunting process. The traditional method of moving data to a common location, mapping fields with different formats and meanings, and performing data cleaning activities to ensure valid and reliable integration across systems can be both expensive and extremely time consuming. As the scope of needed research data increases, the traditional methodology may not be sustainable. Data Virtualization provides an alternative to traditional methods that may reduce the effort required to integrate data across disparate systems. Objective Our goal was to survey new methods in data integration, cloud computing, enterprise data management and virtual data management for opportunities to increase the efficiency of producing VDW and similar data sets. Methods Kaiser Permanente Information Technology (KPIT), in collaboration with the Mid-Atlantic Permanente Research Institute (MAPRI), reviewed methodologies in the burgeoning field of Data Virtualization. We identified potential strengths and weaknesses of new approaches to data integration. For each method, we evaluated its potential application for producing effective research data sets. Results Data Virtualization provides opportunities to reduce the amount of data movement required to integrate data sources on different platforms in order to produce research data sets. Data Virtualization also includes methods for managing “fuzzy” matching, used to match fields known to have poor reliability such as names, addresses and social security numbers. These methods could improve the efficiency of integrating state and federal data such as patient race, death, and tumors with internal electronic health record data. Discussion The emerging field of Data Virtualization has considerable potential for increasing the efficiency of producing research data sets. An important next step will be to develop a proof-of-concept project that will help us understand the benefits and drawbacks of these techniques.

  17. Multi-targeted interference-free determination of ten β-blockers in human urine and plasma samples by alternating trilinear decomposition algorithm-assisted liquid chromatography-mass spectrometry in full scan mode: comparison with multiple reaction monitoring.

    PubMed

    Gu, Hui-Wen; Wu, Hai-Long; Yin, Xiao-Li; Li, Yong; Liu, Ya-Juan; Xia, Hui; Zhang, Shu-Rong; Jin, Yi-Feng; Sun, Xiao-Dong; Yu, Ru-Qin; Yang, Peng-Yuan; Lu, Hao-Jie

    2014-10-27

    β-blockers are the first-line therapeutic agents for treating cardiovascular diseases and also a class of prohibited substances in athletic competitions. In this work, a strategy that combines three-way liquid chromatography-mass spectrometry (LC-MS) data with a second-order calibration method based on the alternating trilinear decomposition (ATLD) algorithm was developed for the simultaneous determination of ten β-blockers in human urine and plasma samples. This flexible strategy proved to be a useful tool for solving the problems of overlapped peaks and uncalibrated interferences encountered in quantitative LC-MS, and made multi-targeted interference-free qualitative and quantitative analysis of β-blockers in complex matrices possible. The limits of detection were in the range of 2.0×10⁻⁵ to 6.2×10⁻³ μg mL⁻¹, and the average recoveries were between 90% and 110%, with standard deviations and average relative prediction errors less than 10%, indicating that the strategy could provide satisfactory prediction results for ten β-blockers in human urine and plasma samples using only a liquid chromatograph hyphenated to a single-quadrupole mass spectrometer in full scan mode. To further confirm the feasibility and reliability of the proposed method, the same batch of samples was analyzed by the multiple reaction monitoring (MRM) method. A t-test demonstrated no significant differences between the prediction results of the two methods. Considering its advantages of speed, low cost, high sensitivity, and no need for complicated optimization of chromatographic and tandem mass spectrometric conditions, the proposed strategy is expected to be extended as an attractive alternative method to quantify analytes of interest in complex systems such as cells, biological fluids, food, the environment, pharmaceuticals and other complex samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Principal Component Analysis of Thermographic Data

    NASA Technical Reports Server (NTRS)

    Winfree, William P.; Cramer, K. Elliott; Zalameda, Joseph N.; Howell, Patricia A.; Burke, Eric R.

    2015-01-01

    Principal Component Analysis (PCA) has been shown to be effective for reducing thermographic NDE data. While it is a reliable technique for enhancing the visibility of defects in thermal data, PCA can be computationally intense and time-consuming when applied to the large data sets typical in thermography. Additionally, PCA can experience problems when very large defects are present (defects that dominate the field of view), since the calculation of the eigenvectors is then governed by the presence of the defect, not the "good" material. To increase processing speed and to minimize the negative effects of large defects, an alternative method of PCA is being pursued in which a fixed set of eigenvectors, generated from an analytic model of the thermal response of the material under examination, is used to process the thermal data from composite materials. This method has been applied to the characterization of flaws.
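
    The fixed-eigenvector idea can be sketched as projecting each pixel's thermal history onto a basis computed once from modeled responses, so no per-image eigendecomposition is needed. Names, shapes, and the random stand-in for the analytic model are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def project_fixed_basis(frames, basis):
    """Project mean-subtracted thermal data onto a fixed eigenvector basis.
    frames: (n_frames, n_pixels) measured pixel histories.
    basis:  (n_frames, n_components) eigenvectors from modeled responses.
    Returns (n_components, n_pixels) component images."""
    centered = frames - frames.mean(axis=0, keepdims=True)
    return basis.T @ centered

# Build the fixed basis once from an analytic-model response matrix
# (random stand-in here), then reuse it for every inspection data set:
rng = np.random.default_rng(0)
model_responses = rng.random((50, 10))   # 50 frames x 10 modeled curves
U, _, _ = np.linalg.svd(model_responses, full_matrices=False)
basis = U[:, :2]                         # keep the first two eigenvectors
frames = rng.random((50, 1000))          # 50 frames x 1000 pixels
images = project_fixed_basis(frames, basis)
print(images.shape)  # (2, 1000)
```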

  19. Establishing Reliable miRNA-Cancer Association Network Based on Text-Mining Method

    PubMed Central

    Yang, Zhaowan; Fang, Ming; Zhang, Libin; Zhou, Yanhong

    2014-01-01

    Associating microRNAs (miRNAs) with cancers is an important step in understanding the mechanisms of cancer pathogenesis and finding novel biomarkers for cancer therapies. In this study, we constructed a miRNA-cancer association network (miCancerna) based on more than 1,000 miRNA-cancer associations detected from millions of abstracts with a text-mining method, covering 226 miRNA families and 20 common cancers. We further prioritized cancer-related miRNAs at the network level with the random-walk algorithm, achieving relatively higher performance than previous miRNA-disease networks. Finally, we examined the top five candidate miRNAs for each kind of cancer and found that 71% of them have been confirmed experimentally. miCancerna offers an alternative resource for cancer-related miRNA identification. PMID:24895499
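
    The network-level prioritization step can be sketched as a random walk with restart on the association network. This is a generic sketch: the toy network, seed choice, and restart probability of 0.15 are assumptions, not the miCancerna parameters.

```python
import numpy as np

def random_walk_with_restart(W, seed, restart=0.15, tol=1e-10):
    """Rank network nodes by random walk with restart: p = (1-r)*W*p + r*p0.
    W is a column-normalized adjacency matrix; seed marks the query nodes.
    Returns the stationary visiting probabilities."""
    p0 = seed / seed.sum()
    p = p0.copy()
    while True:
        p_next = (1.0 - restart) * (W @ p) + restart * p0
        if np.abs(p_next - p).sum() < tol:
            return p_next
        p = p_next

# Toy 3-node network: node 0 linked to nodes 1 and 2 (hypothetical):
A = np.array([[0., 1., 1.], [1., 0., 0.], [1., 0., 0.]])
W = A / A.sum(axis=0, keepdims=True)     # column-normalize
scores = random_walk_with_restart(W, np.array([1., 0., 0.]))
print(round(float(scores.sum()), 6))  # 1.0 (a probability distribution)
```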

  20. [Perioperative use of medical hypnosis. Therapy options for anaesthetists and surgeons].

    PubMed

    Hermes, D; Trübger, D; Hakim, S G; Sieg, P

    2004-04-01

    Surgical treatment of patients under local anaesthesia is quite commonly restricted by limited compliance from the patient. An alternative to treatment under pharmacological sedation or general anaesthesia could be the application of medical hypnosis. With this method, both suggestive and autosuggestive procedures are used for anxiolysis, relaxation, sedation and analgesia of the patient. During a 1-year period of first clinical application, a total of 207 surgical procedures on a non-selected collective of 174 patients were carried out under combined local anaesthesia and medical hypnosis. Medical hypnosis proved to be a standardisable and reliable method by which remarkable improvements in treatment conditions for both patients and surgeons were achievable. Medical hypnosis is not considered a substitute for conscious sedation or general anaesthesia but a therapeutic option equally interesting for anaesthetists and surgeons.

  1. A method to establish stimulus control and compliance with instructions.

    PubMed

    Borgen, John G; Charles Mace, F; Cavanaugh, Brenna M; Shamlian, Kenneth; Lit, Keith R; Wilson, Jillian B; Trauschke, Stephanie L

    2017-10-01

    We evaluated a unique procedure to establish compliance with instructions in four young children diagnosed with autism spectrum disorder (ASD) who had low levels of compliance. Our procedure included methods to establish a novel therapist as a source of positive reinforcement, reliably evoke orienting responses to the therapist, increase the number of exposures to instruction-compliance-reinforcer contingencies, and minimize the number of exposures to instruction-noncompliance-no reinforcer contingencies. We further alternated between instructions with a high probability of compliance (high-p instructions) and instructions that had a prior low probability of compliance (low-p instructions) as soon as low-p instructions lost stimulus control. The intervention is discussed in relation to the conditions necessary for the development of stimulus control and as an example of a variation of translational research. © 2017 Society for the Experimental Analysis of Behavior.

  2. A Psychometric Analysis of the Italian Version of the eHealth Literacy Scale Using Item Response and Classical Test Theory Methods

    PubMed Central

    Dima, Alexandra Lelia; Schulz, Peter Johannes

    2017-01-01

    Background The eHealth Literacy Scale (eHEALS) is a tool to assess consumers’ comfort and skills in using information technologies for health. Although evidence exists of reliability and construct validity of the scale, less agreement exists on structural validity. Objective The aim of this study was to validate the Italian version of the eHealth Literacy Scale (I-eHEALS) in a community sample with a focus on its structural validity, by applying psychometric techniques that account for item difficulty. Methods Two Web-based surveys were conducted among a total of 296 people living in the Italian-speaking region of Switzerland (Ticino). After examining the latent variables underlying the observed variables of the Italian scale via principal component analysis (PCA), fit indices for two alternative models were calculated using confirmatory factor analysis (CFA). The scale structure was examined via parametric and nonparametric item response theory (IRT) analyses accounting for differences between items regarding the proportion of answers indicating high ability. Convergent validity was assessed by correlations with theoretically related constructs. Results CFA showed a suboptimal model fit for both models. IRT analyses confirmed all items measure a single dimension as intended. Reliability and construct validity of the final scale were also confirmed. The contrasting results of factor analysis (FA) and IRT analyses highlight the importance of considering differences in item difficulty when examining health literacy scales. Conclusions The findings support the reliability and validity of the translated scale and its use for assessing Italian-speaking consumers’ eHealth literacy. PMID:28400356

  3. confFuse: High-Confidence Fusion Gene Detection across Tumor Entities.

    PubMed

    Huang, Zhiqin; Jones, David T W; Wu, Yonghe; Lichter, Peter; Zapatka, Marc

    2017-01-01

    Background: Fusion genes play an important role in the tumorigenesis of many cancers. Next-generation sequencing (NGS) technologies have been successfully applied to fusion gene detection for the last several years, and a number of NGS-based tools have been developed for identifying fusion genes during this period. Most fusion gene detection tools based on RNA-seq data report a large number of candidates (mostly false positives), making it hard to prioritize candidates for experimental validation and further analysis. Selection of reliable fusion genes for downstream analysis is therefore very important in cancer research. We developed confFuse, a scoring algorithm to reliably select high-confidence fusion genes which are likely to be biologically relevant. Results: confFuse takes multiple parameters into account in order to assign each fusion candidate a confidence score, where a score ≥8 indicates a high-confidence fusion gene prediction. These parameters were manually curated based on our experience and on certain structural motifs of fusion genes. Compared with alternative tools, based on 96 published RNA-seq samples from different tumor entities, our method significantly reduces the number of fusion candidates (301 high-confidence out of 8,083 total predicted fusion genes) while keeping high detection accuracy (recovery rate 85.7%). Validation of 18 novel, high-confidence fusions detected in three breast tumor samples resulted in a 100% validation rate. Conclusions: confFuse is a novel downstream filtering method that allows selection of highly reliable fusion gene candidates for further downstream analysis and experimental validation. confFuse is available at https://github.com/Zhiqin-HUANG/confFuse.

  4. The STAR score: a method for auditing clinical records

    PubMed Central

    Tuffaha, H

    2012-01-01

    INTRODUCTION Adequate medical note keeping is critical in delivering high quality healthcare. However, there are few robust tools available for the auditing of notes. The aim of this paper was to describe the design, validation and implementation of a novel scoring tool to objectively assess surgical notes. METHODS An initial ‘path finding’ study was performed to evaluate the quality of note keeping using the CRABEL scoring tool. The findings prompted the development of the Surgical Tool for Auditing Records (STAR) as an alternative. STAR was validated using inter-rater reliability analysis. An audit cycle of surgical notes using STAR was performed. The results were analysed and a structured form for the completion of surgical notes was introduced to see if the quality improved in the next audit cycle using STAR. An education exercise was conducted and all participants said the exercise would change their practice, with 25% implementing major changes. RESULTS Statistical analysis of STAR showed that it is reliable (Cronbach’s α = 0.959). On completing the audit cycle, there was an overall increase in the STAR score from 83.344% to 97.675% (p<0.001) with significant improvements in the documentation of the initial clerking from 59.0% to 96.5% (p<0.001) and subsequent entries from 78.4% to 96.1% (p<0.001). CONCLUSIONS The authors believe in the value of STAR as an effective, reliable and reproducible tool. Coupled with the application of structured forms to note keeping, it can significantly improve the quality of surgical documentation and can be implemented universally. PMID:22613300

  5. Alternative Methods for Assessing Contaminant Transport from the Vadose Zone to Indoor Air

    NASA Astrophysics Data System (ADS)

    Baylor, K. J.; Lee, A.; Reddy, P.; Plate, M.

    2010-12-01

    Vapor intrusion, which is the transport of contaminant vapors from groundwater and the vadose zone to indoor air, has emerged as a significant human health risk near hazardous waste sites. Volatile organic compounds (VOCs) such as trichloroethylene (TCE) and tetrachloroethylene (PCE) can volatilize from groundwater and from residual sources in the vadose zone and enter homes and commercial buildings through cracks in the slab, plumbing conduits, or other preferential pathways. Assessment of the vapor intrusion pathway typically requires collection of groundwater, soil gas, and indoor air samples, a process which can be expensive and time-consuming. We evaluated three alternative vapor intrusion assessment methods, including 1) use of radon as a surrogate for vapor intrusion, 2) use of pressure differential measurements between indoor/outdoor and indoor/subslab to assess the potential for vapor intrusion, and 3) use of passive, longer-duration sorbent methods to measure indoor air VOC concentrations. The primary test site, located approximately 30 miles south of San Francisco, was selected due to the presence of TCE (10 - 300 ug/L) in shallow groundwater (5 to 10 feet bgs). At this test site, we found that radon was not a suitable surrogate to assess vapor intrusion and that pressure differential measurements are challenging to implement and equipment-intensive. More significantly, we found that the passive, longer-duration sorbent methods are easy to deploy and compared well quantitatively with standard indoor air sampling methods. The sorbent technique is less than half the cost of typical indoor air methods, and also provides a longer duration sample, typically 3 to 14 days rather than 8 to 24 hours for standard methods. The passive sorbent methods can be a reliable, cost-effective, and easy way to sample for TCE, PCE and other VOCs as part of a vapor intrusion investigation.

  6. Noninvasive aortic bloodflow by Pulsed Doppler Echocardiography (PDE) compared to cardiac output by the direct Fick procedure

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Left ventricular stroke volume was estimated from the systolic velocity integral in the ascending aorta by pulsed Doppler Echocardiography (PDE) and the cross sectional area of the aorta estimated by M mode echocardiography in 15 patients with coronary disease undergoing right heart catheterization for diagnostic purposes. Cardiac output was calculated from stroke volume and heart rate using the PDE method, with the Fick procedure for comparison. The mean value for the cardiac output via the PDE method (4.42 L/min) was only 6% lower than the cardiac output obtained from the Fick procedure (4.69 L/min), and the correlation between the two methods was excellent (r = 0.967, p < 0.01). The good agreement between the two methods demonstrates that the PDE technique offers a reliable noninvasive alternative for estimating cardiac output, requiring no active cooperation by the subject. It was concluded that the Doppler method is superior to the Fick method in that it provides beat-by-beat information on cardiac performance.
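
    The arithmetic behind the PDE estimate is simple: stroke volume is the systolic velocity integral times the aortic cross-sectional area, and cardiac output is stroke volume times heart rate. A sketch with illustrative input values, not the paper's measurements:

```python
import math

def doppler_cardiac_output(vti_cm, aortic_diameter_cm, heart_rate_bpm):
    """Cardiac output (L/min) from the Doppler velocity-time integral (cm),
    aortic diameter (cm), and heart rate (beats/min)."""
    area_cm2 = math.pi * (aortic_diameter_cm / 2.0) ** 2
    stroke_volume_ml = vti_cm * area_cm2           # 1 cm^3 = 1 mL
    return stroke_volume_ml * heart_rate_bpm / 1000.0

# Illustrative inputs: VTI 20 cm, aortic diameter 2 cm, HR 70 bpm:
print(round(doppler_cardiac_output(20.0, 2.0, 70.0), 2))  # 4.4
```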

  7. A COMPARISON OF METHODS FOR TEACHING RECEPTIVE LABELING TO CHILDREN WITH AUTISM SPECTRUM DISORDERS

    PubMed Central

    Grow, Laura L; Carr, James E; Kodak, Tiffany M; Jostad, Candice M; Kisamore, April N

    2011-01-01

    Many early intervention curricular manuals recommend teaching auditory-visual conditional discriminations (i.e., receptive labeling) using the simple-conditional method in which component simple discriminations are taught in isolation and in the presence of a distracter stimulus before the learner is required to respond conditionally. Some have argued that this procedure might be susceptible to faulty stimulus control such as stimulus overselectivity (Green, 2001). Consequently, there has been a call for the use of alternative teaching procedures such as the conditional-only method, which involves conditional discrimination training from the onset of intervention. The purpose of the present study was to compare the simple-conditional and conditional-only methods for teaching receptive labeling to 3 young children diagnosed with autism spectrum disorders. The data indicated that the conditional-only method was a more reliable and efficient teaching procedure. In addition, several error patterns emerged during training using the simple-conditional method. The implications of the results with respect to current teaching practices in early intervention programs are discussed. PMID:21941380

  8. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 4: HARP Output (HARPO) graphics display user's guide

    NASA Technical Reports Server (NTRS)

    Sproles, Darrell W.; Bavuso, Salvatore J.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of highly reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical postprocessor program HARPO (HARP Output). HARPO reads ASCII files generated by HARP. It provides an interactive plotting capability that can be used to display alternate model data for trade-off analyses. File data can also be imported to other commercial software programs.

  9. A novel interacting multiple model based network intrusion detection scheme

    NASA Astrophysics Data System (ADS)

    Xin, Ruichi; Venkatasubramanian, Vijay; Leung, Henry

    2006-04-01

    In today's information age, information and network security are of primary importance to any organization. Network intrusion is a serious threat to the security of computers and data networks. In internet protocol (IP) based networks, intrusions originate in different kinds of packets/messages contained in open system interconnection (OSI) layer 3 or higher layers. Network intrusion detection and prevention systems observe the layer 3 packets (or layer 4 to 7 messages) to screen for intrusions and security threats. Signature-based methods use a pre-existing database that documents intrusion patterns as perceived in the layer 3 to 7 protocol traffic, and match the incoming traffic against it for potential intrusion attacks. Alternatively, network traffic data can be modeled and any large deviation from the established traffic pattern can be detected as a network intrusion. The latter method, also known as anomaly-based detection, is gaining popularity for its versatility in learning new patterns and discovering new attacks. It is apparent that for reliable performance, an accurate model of the network data needs to be established. In this paper, we illustrate using collected data that network traffic is seldom stationary. We propose the use of multiple models to accurately represent the traffic data. The improvement in reliability of the proposed model is verified by measuring the detection and false alarm rates on several datasets.
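
    The idea of scoring traffic against several models of normal behavior can be sketched with a simple Gaussian mixture likelihood. This is a simplified stand-in for the paper's interacting multiple model estimator, with hypothetical traffic figures.

```python
import math

def anomaly_score(x, models):
    """Negative log-likelihood of observation x under a mixture of Gaussian
    traffic models given as (mean, std, weight) triples; a higher score means
    the observation fits none of the 'normal' traffic models."""
    likelihood = sum(
        w * math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))
        for mu, sd, w in models
    )
    return -math.log(likelihood + 1e-300)

# Two regimes of normal traffic (hypothetical packet rates per second):
models = [(100.0, 10.0, 0.7), (500.0, 50.0, 0.3)]
print(anomaly_score(1000.0, models) > anomaly_score(100.0, models))  # True
```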

  10. Comparing Alternative Methods of Measuring Skin Color and Damage

    PubMed Central

    Daniel, Lauren C.; Heckman, Carolyn J.; Kloss, Jacqueline D.; Manne, Sharon L.

    2009-01-01

    Objective: The current study investigated the reliability and validity of several skin color and damage measurement strategies and explored their applicability among participants of different races, skin types, and sexes. Methods: One hundred college-aged participants completed an online survey about their perceived skin damage and skin protection. They also attended an in-person session in which an observer rated their skin color; additionally, UV photos and spectrophotometry readings were taken. Results: Trained research assistants rated the damage depicted in the UV photos reliably. Moderate to high correlations emerged between skin color self-report and spectrophotometry readings. Observer rating correlated with spectrophotometry rating of current but not natural skin color. Lighter-skinned individuals reported more cumulative skin damage, which was supported by UV photography. Although women's current skin color was lighter and their UV photos showed similar damage to men's, women reported significantly more damaged skin than men did. Conclusions: These findings suggest that self-report continues to be a valuable measurement strategy when skin reflectance measurement is not feasible or appropriate and that UV photos and observer ratings may be useful but need to be tested further. The results also suggest that young women and men may benefit from different types of skin cancer prevention interventions. PMID:18931926

  11. Robust Unit Commitment Considering Uncertain Demand Response

    DOE PAGES

    Liu, Guodong; Tomsovic, Kevin

    2014-09-28

Although price responsive demand response has been widely accepted as playing an important role in the reliable and economic operation of power systems, the real response from the demand side can be highly uncertain due to limited understanding of consumers' response to pricing signals. To model the behavior of consumers, the price elasticity of demand has been explored and utilized in both research and real practice. However, the price elasticity of demand is not precisely known and may vary greatly with operating conditions and types of customers. To accommodate the uncertainty of demand response, alternative unit commitment methods robust to the uncertainty of the demand response require investigation. In this paper, a robust unit commitment model to minimize the generalized social cost is proposed for the optimal unit commitment decision taking into account uncertainty of the price elasticity of demand. By optimizing the worst case under a proper robustness level, the unit commitment solution of the proposed model is robust against all possible realizations of the modeled uncertain demand response. Numerical simulations on the IEEE Reliability Test System show the effectiveness of the method. Finally, compared to unit commitment with deterministic price elasticity of demand, the proposed robust model can reduce the average Locational Marginal Prices (LMPs) as well as the price volatility.
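The worst-case optimization idea behind robust unit commitment can be shown with a toy example: enumerate commitments for a tiny system and keep the one that minimizes cost under the most adverse demand realization. This is a brute-force sketch only, not the paper's formulation; the unit costs, capacities, demand scenarios, and value of lost load are all invented for illustration.

```python
from itertools import product

# Hypothetical two-unit system: (fixed commitment cost $, marginal cost $/MWh, capacity MW)
UNITS = [(500.0, 20.0, 80.0), (200.0, 35.0, 60.0)]
VOLL = 1000.0  # value of lost load, $/MWh, penalizes unserved demand

def dispatch_cost(committed, demand):
    """Cheapest dispatch of the committed units for one demand realization."""
    cost = sum(UNITS[i][0] for i in committed)
    remaining = demand
    for i in sorted(committed, key=lambda i: UNITS[i][1]):  # merit order
        gen = min(remaining, UNITS[i][2])
        cost += gen * UNITS[i][1]
        remaining -= gen
    return cost + remaining * VOLL  # penalize any unserved load

def robust_commitment(demand_scenarios):
    """Commitment minimizing the worst-case cost over the demand uncertainty set."""
    best = None
    for flags in product([False, True], repeat=len(UNITS)):
        committed = [i for i, on in enumerate(flags) if on]
        worst = max(dispatch_cost(committed, d) for d in demand_scenarios)
        if best is None or worst < best[1]:
            best = (committed, worst)
    return best

# Uncertain demand response: realized demand could be 70, 95, or 120 MW.
committed, worst_cost = robust_commitment([70.0, 95.0, 120.0])
print(committed, worst_cost)  # committing both units hedges the high-demand case
```

Committing only the cheap unit looks attractive for low demand but incurs a large lost-load penalty in the 120 MW realization, so the min-max criterion commits both units.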

  12. Advancing methods for reliably assessing motivational interviewing fidelity using the Motivational Interviewing Skills Code

    PubMed Central

    Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W.; Imel, Zac E.; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C.

    2014-01-01

The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. PMID:25242192

  13. Advancing methods for reliably assessing motivational interviewing fidelity using the motivational interviewing skills code.

    PubMed

    Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W; Imel, Zac E; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C

    2015-02-01

The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Sex genes for genomic analysis in human brain: internal controls for comparison of probe level data extraction.

    PubMed Central

    Galfalvy, Hanga C; Erraji-Benchekroun, Loubna; Smyrniotopoulos, Peggy; Pavlidis, Paul; Ellis, Steven P; Mann, J John; Sibille, Etienne; Arango, Victoria

    2003-01-01

Background Genomic studies of complex tissues pose unique analytical challenges for assessment of data quality, performance of statistical methods used for data extraction, and detection of differentially expressed genes. Ideally, to assess the accuracy of gene expression analysis methods, one needs a set of genes which are known to be differentially expressed in the samples and which can be used as a "gold standard". We introduce the idea of using sex-chromosome genes as an alternative to spiked-in control genes or simulations for assessment of microarray data and analysis methods. Results Expression of sex-chromosome genes was used as a true internal biological control to compare alternate probe-level data extraction algorithms (Microarray Suite 5.0 [MAS5.0], Model Based Expression Index [MBEI] and Robust Multi-array Average [RMA]), to assess microarray data quality and to establish some statistical guidelines for analyzing large-scale gene expression. These approaches were implemented on a large new dataset of human brain samples. RMA-generated gene expression values were markedly less variable and more reliable than MAS5.0 and MBEI-derived values. A statistical technique controlling the false discovery rate was applied to adjust for multiple testing, as an alternative to the Bonferroni method, and showed no evidence of false negative results. Fourteen probesets, representing nine Y- and two X-chromosome linked genes, displayed significant sex differences in brain prefrontal cortex gene expression. Conclusion In this study, we have demonstrated the use of sex genes as true biological internal controls for genomic analysis of complex tissues, and suggested analytical guidelines for testing alternate oligonucleotide microarray data extraction protocols and for adjusting multiple statistical analyses of differentially expressed genes.
Our results also provided evidence for sex differences in gene expression in the brain prefrontal cortex, supporting the notion of a putative direct role of sex-chromosome genes in differentiation and maintenance of sexual dimorphism of the central nervous system. Importantly, these analytical approaches are applicable to all microarray studies that include male and female human or animal subjects. PMID:12962547
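The FDR-controlling adjustment this record contrasts with Bonferroni is most commonly the Benjamini-Hochberg step-up procedure; the abstract does not name the exact variant used, so the following is a generic sketch with made-up p-values standing in for per-probeset tests.

```python
def benjamini_hochberg(p_values, alpha=0.05):
    """Return sorted indices of hypotheses rejected at FDR level alpha
    using the Benjamini-Hochberg step-up procedure."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank / m * alpha:
            k_max = rank  # largest rank whose p-value clears the BH line
    return sorted(order[:k_max])

# Fourteen hypothetical p-values; the first few are strong signals
# (in the study's setting, e.g. sex-chromosome-linked probesets).
pvals = [1e-6, 1e-5, 3e-4, 0.002, 0.01, 0.03, 0.04,
         0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 0.9]
print(benjamini_hochberg(pvals, alpha=0.05))  # -> [0, 1, 2, 3, 4]
```

With Bonferroni, only p-values below 0.05/14 ≈ 0.0036 would survive; the BH step-up rule additionally admits the 0.01 test, illustrating its higher power at a controlled false discovery rate.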

  15. Sex genes for genomic analysis in human brain: internal controls for comparison of probe level data extraction.

    PubMed

    Galfalvy, Hanga C; Erraji-Benchekroun, Loubna; Smyrniotopoulos, Peggy; Pavlidis, Paul; Ellis, Steven P; Mann, J John; Sibille, Etienne; Arango, Victoria

    2003-09-08

Genomic studies of complex tissues pose unique analytical challenges for assessment of data quality, performance of statistical methods used for data extraction, and detection of differentially expressed genes. Ideally, to assess the accuracy of gene expression analysis methods, one needs a set of genes which are known to be differentially expressed in the samples and which can be used as a "gold standard". We introduce the idea of using sex-chromosome genes as an alternative to spiked-in control genes or simulations for assessment of microarray data and analysis methods. Expression of sex-chromosome genes was used as a true internal biological control to compare alternate probe-level data extraction algorithms (Microarray Suite 5.0 [MAS5.0], Model Based Expression Index [MBEI] and Robust Multi-array Average [RMA]), to assess microarray data quality and to establish some statistical guidelines for analyzing large-scale gene expression. These approaches were implemented on a large new dataset of human brain samples. RMA-generated gene expression values were markedly less variable and more reliable than MAS5.0 and MBEI-derived values. A statistical technique controlling the false discovery rate was applied to adjust for multiple testing, as an alternative to the Bonferroni method, and showed no evidence of false negative results. Fourteen probesets, representing nine Y- and two X-chromosome linked genes, displayed significant sex differences in brain prefrontal cortex gene expression. In this study, we have demonstrated the use of sex genes as true biological internal controls for genomic analysis of complex tissues, and suggested analytical guidelines for testing alternate oligonucleotide microarray data extraction protocols and for adjusting multiple statistical analyses of differentially expressed genes.
Our results also provided evidence for sex differences in gene expression in the brain prefrontal cortex, supporting the notion of a putative direct role of sex-chromosome genes in differentiation and maintenance of sexual dimorphism of the central nervous system. Importantly, these analytical approaches are applicable to all microarray studies that include male and female human or animal subjects.

  16. Assessing the Potential Environmental Consequences of a New Energetic Material: A Phased Approach, September 2005

    DTIC Science & Technology

    2007-12-01

    there are no reliable alternatives to animal testing in the determination of toxicity. QSARs are only as reliable as the corroborating toxicological ...2) QSAR approaches can also be used to estimate toxicological impact. Toxicity QSAR models can often predict many toxicity parameters without... Toxicology Study No. 87-XE-03N3-05, Assessing the Potential Environmental Consequences of a New Energetic Material: A Phased Approach, September 2005 1

  17. Reliability study of refractory gate gallium arsenide MESFETS

    NASA Technical Reports Server (NTRS)

    Yin, J. C. W.; Portnoy, W. M.

    1981-01-01

Refractory gate MESFETs were fabricated as an alternative to aluminum gate devices, which have been found to be unreliable as RF power amplifiers. In order to determine the reliability of the new structures, statistics of failure and information about failure mechanisms in refractory gate MESFETs are given. Test transistors were stressed under conditions of high temperature and forward gate current to enhance failure. Results of work at 150 °C and 275 °C are reported.

  18. Reliability study of refractory gate gallium arsenide MESFETS

    NASA Astrophysics Data System (ADS)

    Yin, J. C. W.; Portnoy, W. M.

Refractory gate MESFETs were fabricated as an alternative to aluminum gate devices, which have been found to be unreliable as RF power amplifiers. In order to determine the reliability of the new structures, statistics of failure and information about failure mechanisms in refractory gate MESFETs are given. Test transistors were stressed under conditions of high temperature and forward gate current to enhance failure. Results of work at 150 °C and 275 °C are reported.

  19. Reliability of cognitive tests of ELSA-Brasil, the brazilian longitudinal study of adult health

    PubMed Central

    Batista, Juliana Alves; Giatti, Luana; Barreto, Sandhi Maria; Galery, Ana Roscoe Papini; Passos, Valéria Maria de Azeredo

    2013-01-01

Cognitive function evaluation entails the use of neuropsychological tests, applied exclusively or in sequence. The results of these tests may be influenced by factors related to the environment, the interviewer or the interviewee. OBJECTIVES We examined the test-retest reliability of some tests of the Brazilian version from the Consortium to Establish a Registry for Alzheimer's disease. METHODS The ELSA-Brasil is a multicentre study of civil servants (35-74 years of age) from public institutions across six Brazilian States. The same tests were applied, in different order of appearance, by the same trained and certified interviewer, with an approximate 20-day interval, to 160 adults (51% men, mean age 52 years). The Intraclass Correlation Coefficient (ICC) was used to assess the reliability of the measures; and a dispersion graph was used to examine the patterns of agreement between them. RESULTS We observed higher retest scores in all tests as well as a shorter test completion time for the Trail Making Test B. ICC values for each test were as follows: Word List Learning Test (0.56), Word Recall (0.50), Word Recognition (0.35), Phonemic Verbal Fluency Test (VFT, 0.61), Semantic VFT (0.53) and Trail B (0.91). The Bland-Altman plot showed better correlation of executive function (VFT and Trail B) than of memory tests. CONCLUSIONS Better performance in retest may reflect a learning effect, and suggests that retest should be repeated using alternate forms or after longer periods. In this sample of adults with high schooling level, reliability was only moderate for memory tests whereas the measurement of executive function proved more reliable. PMID:29213860
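The test-retest ICCs reported above can be computed from paired scores. The abstract does not state which ICC form was used; shown here, as one common choice for test-retest designs, is the two-way random effects, absolute agreement, single-measures form ICC(2,1), with hypothetical score pairs.

```python
def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.
    data: list of [test, retest] score pairs, one row per subject."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)    # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)    # between sessions
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Perfectly reproduced scores give ICC = 1.0
print(icc_2_1([[1, 1], [2, 2], [3, 3]]))  # -> 1.0
# A noisier retest lowers the ICC
print(round(icc_2_1([[10, 12], [14, 11], [18, 17], [22, 24]]), 2))  # -> 0.93
```

A systematic retest improvement (the learning effect the authors describe) inflates the session mean square and therefore lowers ICC(2,1), which penalizes absolute disagreement, not just poor rank ordering.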

  20. Construct validity and test–retest reliability of the International Fitness Scale (IFIS) in Colombian children and adolescents aged 9–17.9 years: the FUPRECOL study

    PubMed Central

    Correa-Bautista, Jorge E.; Izquierdo, Mikel

    2017-01-01

    Background There is a lack of instruments and studies written in Spanish evaluating physical fitness, impeding the determination of the current status of this important health indicator in the Latin population, especially in Colombia. The aim of the study was two-fold: to examine the validity of the International Fitness Scale (IFIS) with a population-based sample of schoolchildren from Bogota, Colombia and to examine the reliability of the IFIS with children and adolescents from Engativa, Colombia. Methods The sample comprised 1,873 Colombian youths (54.5% girls) aged 9–17.9 years. We measured their adiposity markers (waist-to-height ratio, skinfold thickness, percentage of body fat and body mass index), blood pressure, lipids profile, fasting glucose, and physical fitness level (self-reported and measured). A validated cardiometabolic risk index score was also used. An age- and sex-matched subsample of 229 schoolchildren who were not originally included in the sample completed the IFIS twice for reliability purposes. Results Our data suggest that both measured and self-reported overall physical fitness levels were inversely associated with percentage of body fat indicators and the cardiometabolic risk index score. Overall, schoolchildren who self-reported “good” or “very good” fitness had better measured fitness levels than those who reported “very poor/poor” fitness (all p < 0.001). The test-retest reliability of the IFIS items was also good, with an average weighted kappa of 0.811. Discussion Our findings suggest that self-reported fitness, as assessed by the IFIS, is a valid, reliable, and health-related measure. Furthermore, it can be a good alternative for future use in large studies with Latin schoolchildren from Colombia. PMID:28560104
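The weighted kappa used for the IFIS test-retest reliability above can be sketched as follows. The rating pairs are invented and quadratic weights are assumed (the abstract says "average weighted kappa" without specifying the weighting scheme), so this is illustrative only.

```python
def weighted_kappa(ratings, categories, weight="quadratic"):
    """Cohen's weighted kappa for paired ordinal ratings (e.g. test vs retest).
    Assumes marginals are not fully concentrated on one category."""
    k = len(categories)
    index = {c: i for i, c in enumerate(categories)}
    n = len(ratings)
    # Observed joint distribution and its marginals
    obs = [[0.0] * k for _ in range(k)]
    for a, b in ratings:
        obs[index[a]][index[b]] += 1.0 / n
    row = [sum(obs[i][j] for j in range(k)) for i in range(k)]
    col = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    def w(i, j):  # disagreement weight, scaled to [0, 1]
        d = abs(i - j) / (k - 1)
        return d * d if weight == "quadratic" else d
    disagree_obs = sum(w(i, j) * obs[i][j] for i in range(k) for j in range(k))
    disagree_exp = sum(w(i, j) * row[i] * col[j] for i in range(k) for j in range(k))
    return 1.0 - disagree_obs / disagree_exp

# Hypothetical IFIS-style 5-point responses on two occasions
scale = ["very poor", "poor", "average", "good", "very good"]
pairs = [("good", "good"), ("average", "good"), ("very good", "very good"),
         ("poor", "poor"), ("good", "average"), ("average", "average")]
print(round(weighted_kappa(pairs, scale), 3))  # -> 0.818
```

Because the weights grow with the distance between categories, near-miss disagreements ("average" vs "good") are penalized far less than distant ones, which suits ordinal self-report scales like the IFIS.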

  1. Improving 1D Site Specific Velocity Profiles for the Kik-Net Network

    NASA Astrophysics Data System (ADS)

    Holt, James; Edwards, Benjamin; Pilz, Marco; Fäh, Donat; Rietbrock, Andreas

    2017-04-01

Ground motion prediction equations (GMPEs) form the cornerstone of modern seismic hazard assessments. When produced to a high standard they provide reliable estimates of ground motion/spectral acceleration for a given site and earthquake scenario. This information is crucial for engineers to optimise design and for regulators who enforce legal minimum safe design capacities. Classically, GMPEs were built upon the assumption that variability around the median model could be treated as aleatory. As understanding improved, it was noted that the propagation could be segregated into the response of the average path from the source and the response of the site. This is because the heterogeneity of the near-surface lithology is significantly different from that of the bulk path. It was then suggested that a semi-ergodic approach could be taken if the site response could be determined, moving uncertainty from aleatory to epistemic. The determination of reliable site-specific response models is therefore becoming increasingly critical for ground motion models used in engineering practice. Today it is common practice to include proxies for site response within the scope of a GMPE, such as Vs30 or site classification, in an effort to reduce the overall uncertainty of the prediction at a given site. However, these proxies are not always reliable enough to give confident ground motion estimates, due to the complexity of the near-surface. Other approaches to quantifying the response of the site include detailed numerical simulations (1/2/3D - linear, EQL, non-linear etc.). However, in order to be reliable, they require highly detailed and accurate velocity and, for non-linear analyses, material property models. It is possible to obtain this information through invasive methods, but this is expensive and not feasible for most projects. 
Here we propose an alternative method to derive reliable velocity profiles (and their uncertainty), calibrated using almost 20 years of recorded data from the Kik-Net network. First, using a reliable subset of sites, the empirical surface to borehole (S/B) ratio is calculated in the frequency domain using all events recorded at that site. In a subsequent step, we use numerical simulation to produce 1D SH transfer function curves using a suite of stochastic velocity models. Comparing the resulting amplification with the empirical S/B ratio we find optimal 1D velocity models and their uncertainty. The method will be tested to determine the level of initial information required to obtain a reliable Vs profile (e.g., starting Vs model, only Vs30, site-class, H/V ratio etc.) and then applied and tested against data from other regions using site-to-reference or empirical spectral model amplification.
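The first step above, the empirical surface-to-borehole (S/B) ratio, can be sketched as a spectral amplitude ratio averaged over recorded events. This toy version uses a naive DFT on short synthetic records and omits the windowing, smoothing, and event selection a real Kik-Net analysis would need; the records below are fabricated.

```python
import cmath

def amplitude_spectrum(signal):
    """Naive DFT amplitude spectrum (adequate for short illustrative records)."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) for k in range(n // 2)]

def sb_ratio(surface_records, borehole_records):
    """Average surface-to-borehole spectral ratio over all recorded events.
    Assumes the borehole spectra are nonzero in every bin (real data would
    be smoothed before dividing)."""
    ratios = []
    for surf, bore in zip(surface_records, borehole_records):
        s, b = amplitude_spectrum(surf), amplitude_spectrum(bore)
        ratios.append([si / bi for si, bi in zip(s, b)])
    n_bins = len(ratios[0])
    return [sum(r[k] for r in ratios) / len(ratios) for k in range(n_bins)]

# Synthetic event: the surface record is the borehole record amplified by 2,
# so the S/B ratio should be ~2 in every frequency bin.
borehole = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
surface = [2.0 * x for x in borehole]
print(sb_ratio([surface], [borehole]))  # each bin ~ 2.0
```

The study's subsequent step then searches a suite of stochastic 1D velocity models for those whose SH transfer functions best match this empirical ratio.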

  2. Estimating Conditional Distributions of Scores on an Alternate Form of a Test. Research Report. ETS RR-15-18

    ERIC Educational Resources Information Center

    Livingston, Samuel A.; Chen, Haiwen H.

    2015-01-01

    Quantitative information about test score reliability can be presented in terms of the distribution of equated scores on an alternate form of the test for test takers with a given score on the form taken. In this paper, we describe a procedure for estimating that distribution, for any specified score on the test form taken, by estimating the joint…

  3. What should students learn about complementary and alternative medicine?

    PubMed

    Gaster, Barak; Unterborn, John N; Scott, Richard B; Schneeweiss, Ronald

    2007-10-01

    With thousands of complementary and alternative medicine (CAM) treatments currently being used in the United States today, it is challenging to design a concise body of CAM content which will fit into already overly full curricula for health care students. The purpose of this article is to outline key principles which 15 National Center for Complementary and Alternative Medicine-funded education programs found useful when developing CAM course-work and selecting CAM content. Three key guiding principles are discussed: teach foundational CAM competencies to give students a framework for learning about CAM; choose specific content on the basis of evidence, demographics and condition (what conditions are most appropriate for CAM therapies?); and finally, provide students with skills for future learning, including where to find reliable information about CAM and how to search the scientific literature and assess the results of CAM research. Most of the programs developed evidence-based guides to help students find reliable CAM resources. The cumulative experiences of the 15 programs have been compiled, and an annotated table outlining the most highly recommended resources about CAM is presented.

  4. Smartphones as experimental tools to measure acoustical and mechanical properties of vibrating rods

    NASA Astrophysics Data System (ADS)

    González, Manuel Á.; González, Miguel Á.

    2016-07-01

    Modern smartphones have calculation and sensor capabilities that make them suitable for use as versatile and reliable measurement devices in simple teaching experiments. In this work a smartphone is used, together with low cost materials, in an experiment to measure the frequencies emitted by vibrating rods of different materials, shapes and lengths. The results obtained with the smartphone have been compared with theoretical calculations and the agreement is good. Alternatively, physics students can perform the experiment described here and use their results to determine the dependencies of the obtained frequencies on the rod characteristics. In this way they will also practice research methods that they will probably use in their professional life.

  5. Sub-30 nm patterning of molecular resists based on crosslinking through tip based oxidation

    NASA Astrophysics Data System (ADS)

    Lorenzoni, Matteo; Wagner, Daniel; Neuber, Christian; Schmidt, Hans-Werner; Perez-Murano, Francesc

    2018-06-01

Oxidation Scanning Probe Lithography (o-SPL) is an established method employed for device patterning at the nanometer scale. It represents a feasible and inexpensive alternative to standard lithographic techniques such as electron beam lithography (EBL) and nanoimprint lithography (NIL). In this work we applied non-contact o-SPL to an engineered class of molecular resists in order to obtain crosslinking by electrochemically driven oxidation. By patterning and developing various resist formulas we were able to obtain a reliable negative tone resist behavior based on local oxidation. Under optimal conditions, directly written patterns can routinely reach sub-30 nm lateral resolution, while the final developed features are wider, approaching 50 nm in width.

  6. The Pot Calling the Kettle Black? A Comparison of Measures of Current Tobacco Use

    PubMed Central

    ROSENMAN, ROBERT

    2014-01-01

    Researchers often use the discrepancy between self-reported and biochemically assessed active smoking status to argue that self-reported smoking status is not reliable, ignoring the limitations of biochemically assessed measures and treating it as the gold standard in their comparisons. Here, we employ econometric techniques to compare the accuracy of self-reported and biochemically assessed current tobacco use, taking into account measurement errors with both methods. Our approach allows estimating and comparing the sensitivity and specificity of each measure without directly observing true smoking status. The results, robust to several alternative specifications, suggest that there is no clear reason to think that one measure dominates the other in accuracy. PMID:25587199

  7. Intelligent failure-tolerant control

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.

    1991-01-01

    An overview of failure-tolerant control is presented, beginning with robust control, progressing through parallel and analytical redundancy, and ending with rule-based systems and artificial neural networks. By design or implementation, failure-tolerant control systems are 'intelligent' systems. All failure-tolerant systems require some degrees of robustness to protect against catastrophic failure; failure tolerance often can be improved by adaptivity in decision-making and control, as well as by redundancy in measurement and actuation. Reliability, maintainability, and survivability can be enhanced by failure tolerance, although each objective poses different goals for control system design. Artificial intelligence concepts are helpful for integrating and codifying failure-tolerant control systems, not as alternatives but as adjuncts to conventional design methods.

  8. Preliminary evaluation of several nondestructive-evaluation techniques for silicon nitride gas-turbine rotors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kupperman, D. S.; Sciammarella, C.; Lapinski, N. P.

    1978-01-01

Several nondestructive-evaluation (NDE) techniques have been examined to establish their effectiveness for detecting critically sized flaws in silicon nitride gas-turbine rotors. Preliminary results have been obtained for holographic interferometry, acoustic microscopy, dye-enhanced radiography, acoustic emission, and acoustic-impact testing techniques. This report discusses the relative effectiveness of these techniques in terms of their applicability to the rotor geometry and ability to detect critically sized flaws. Where feasible, flaw indications were verified by alternative NDE techniques or destructive examination. This study has indicated that, since the various techniques have different advantages, ultimately a reliable interrogation of ceramic rotors may require the application of several NDE methods.

  9. Optically controlled phased-array antenna technology for space communication systems

    NASA Technical Reports Server (NTRS)

    Kunath, Richard R.; Bhasin, Kul B.

    1988-01-01

    Using MMICs in phased-array applications above 20 GHz requires complex RF and control signal distribution systems. Conventional waveguide, coaxial cable, and microstrip methods are undesirable due to their high weight, high loss, limited mechanical flexibility and large volume. An attractive alternative to these transmission media, for RF and control signal distribution in MMIC phased-array antennas, is optical fiber. Presented are potential system architectures and their associated characteristics. The status of high frequency opto-electronic components needed to realize the potential system architectures is also discussed. It is concluded that an optical fiber network will reduce weight and complexity, and increase reliability and performance, but may require higher power.

  10. [Assessment of the right ventricular anatomy and function by advanced echocardiography: pathological and physiological insights].

    PubMed

    Lakatos, Bálint; Kovács, Attila; Tokodi, Márton; Doronina, Alexandra; Merkely, Béla

    2016-07-01

    Accurate assessment of right ventricular geometry and function is of high clinical importance. However, several limitations have to be taken into consideration if using conventional echocardiographic parameters. Advanced echocardiographic techniques, such as speckle-tracking analysis or 3D echocardiography are reliable and simple tools providing a cost-effective and non-invasive alternative of current modalities used to characterize the right ventricle. There is a growing interest in the diagnostic and prognostic value of these methods regarding pathological (right ventricular infarction, pulmonary hypertension, arrhythmogenic right ventricular dysplasia, follow-up of heart transplantation) and even physiological (athlete's heart) alterations of the right ventricle. Orv. Hetil., 2016, 157(29), 1139-1146.

  11. Using quasars as standard clocks for measuring cosmological redshift.

    PubMed

    Dai, De-Chang; Starkman, Glenn D; Stojkovic, Branislav; Stojkovic, Dejan; Weltman, Amanda

    2012-06-08

    We report hitherto unnoticed patterns in quasar light curves. We characterize segments of the quasar's light curves with the slopes of the straight lines fit through them. These slopes appear to be directly related to the quasars' redshifts. Alternatively, using only global shifts in time and flux, we are able to find significant overlaps between the light curves of different pairs of quasars by fitting the ratio of their redshifts. We are then able to reliably determine the redshift of one quasar from another. This implies that one can use quasars as standard clocks, as we explicitly demonstrate by constructing two independent methods of finding the redshift of a quasar from its light curve.
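The slope-redshift connection described above rests on cosmological time dilation: stretching the time axis of a light curve by (1 + z) shrinks every fitted segment slope by the same factor. The toy light curve below is invented purely to illustrate that scaling, not to reproduce the paper's fits.

```python
def segment_slopes(times, fluxes, seg_len):
    """Least-squares slope of each consecutive segment of a light curve."""
    slopes = []
    for start in range(0, len(times) - seg_len + 1, seg_len):
        t = times[start:start + seg_len]
        f = fluxes[start:start + seg_len]
        n = len(t)
        tm, fm = sum(t) / n, sum(f) / n
        num = sum((ti - tm) * (fi - fm) for ti, fi in zip(t, f))
        den = sum((ti - tm) ** 2 for ti in t)
        slopes.append(num / den)
    return slopes

# Time dilation stretches the observed time axis by (1 + z),
# shrinking every fitted slope by that same factor.
t_rest = [float(i) for i in range(12)]
flux = [0.0, 1.0, 3.0, 2.0, 4.0, 6.0, 5.0, 7.0, 9.0, 8.0, 10.0, 12.0]
z = 1.0
t_obs = [(1 + z) * ti for ti in t_rest]
rest_slopes = segment_slopes(t_rest, flux, 4)
obs_slopes = segment_slopes(t_obs, flux, 4)
print([round(r / o, 3) for r, o in zip(rest_slopes, obs_slopes)])  # -> [2.0, 2.0, 2.0]
```

Comparing segment slopes between two quasars therefore constrains the ratio of their (1 + z) factors, which is the basis of the "standard clock" idea.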

  12. [Ultrasound-guided rectus sheath block for upper abdominal surgery].

    PubMed

    Osaka, Yoshimune; Kashiwagi, Masanori; Nagatsuka, Yukio; Oosaku, Masayoshi; Hirose, Chikako

    2010-08-01

Upper abdominal surgery leads to severe postoperative pain, and insufficient postoperative analgesia is accompanied by a high incidence of complications; postoperative analgesia is therefore very important. Epidural analgesia has many advantages but carries a high risk of epidural hematoma in anticoagulated patients. In recent years, the development of ultrasound tools has made rectus sheath block a safer and more reliable form of analgesia. We experienced two cases of ultrasound-guided rectus sheath block in upper abdominal surgery. Ultrasound guidance can reduce the risk of peritoneal puncture, bleeding, and other complications. Rectus sheath block is very effective in reducing postoperative pain in upper abdominal surgery and is an alternative to epidural anesthesia in anticoagulated patients.

  13. Semi-empirical quantum evaluation of peptide - MHC class II binding

    NASA Astrophysics Data System (ADS)

    González, Ronald; Suárez, Carlos F.; Bohórquez, Hugo J.; Patarroyo, Manuel A.; Patarroyo, Manuel E.

    2017-01-01

Peptide presentation by the major histocompatibility complex (MHC) is a key process for triggering a specific immune response. Studying peptide-MHC (pMHC) binding from a structure-based approach has potential for reducing the costs of investigation into vaccine development. This study involved using two semi-empirical quantum chemistry methods (PM7 and FMO-DFTB) for computing the binding energies of peptides bound to HLA-DR1 and HLA-DR2. We found that key stabilising water molecules involved in the peptide binding mechanism were required for finding high correlation with IC50 experimental values. Our proposal is computationally non-intensive and is a reliable alternative for studying pMHC binding interactions.

  14. Rapid, sensitive and reproducible method for point-of-collection screening of liquid milk for adulterants using a portable Raman spectrometer with novel optimized sample well

    NASA Astrophysics Data System (ADS)

    Nieuwoudt, Michel K.; Holroyd, Steve E.; McGoverin, Cushla M.; Simpson, M. Cather; Williams, David E.

    2017-02-01

    Point-of-care diagnostics are of interest in the medical, security and food industries, the latter particularly for screening food adulterated for economic gain. Milk adulteration continues to be a major problem worldwide, and different methods to detect fraudulent additives have been investigated for over a century. Laboratory-based methods are limited in their application to point-of-collection diagnosis and also require expensive instrumentation, chemicals and skilled technicians. This has encouraged exploration of spectroscopic methods as more rapid and inexpensive alternatives. Raman spectroscopy has excellent potential for screening of milk because of the rich complexity inherent in its signals. Rapid advances in photonic technologies and fabrication methods are bringing increasingly sensitive portable mini-Raman systems to the market that are affordable and feasible for both point-of-care and point-of-collection applications. We have developed a powerful spectroscopic method for rapidly screening liquid milk for sucrose and four nitrogen-rich adulterants (dicyandiamide (DCD), ammonium sulphate, melamine, urea), using a combined system: a small, portable Raman spectrometer with a focusing fibre-optic probe and optimized reflective focusing wells, simply fabricated in aluminium. The reliable sample presentation of this system enabled high reproducibility of 8% RSD (relative standard deviation) within four minutes. Limits of detection for PLS calibrations ranged from 140 to 520 ppm for the four N-rich compounds and from 0.7 to 3.6% for sucrose. The portability of the system and the reliability and reproducibility of the technique open opportunities for general, reagentless adulteration screening of biological fluids as well as milk at point-of-collection.
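
    The %RSD reproducibility figure quoted above is a standard calculation; a minimal sketch in pure Python follows. The replicate intensities are invented for illustration, not measured values from the study.

```python
import statistics

def percent_rsd(values):
    """Percent relative standard deviation: 100 * sample stdev / mean."""
    return 100.0 * statistics.stdev(values) / statistics.fmean(values)

# Hypothetical replicate peak intensities from repeated presentations of
# the same milk sample (arbitrary units):
replicates = [100.0, 108.0, 95.0, 102.0, 97.0]
print(round(percent_rsd(replicates), 2))  # 5.01
```

    A lower %RSD over replicate presentations indicates more reproducible sample presentation, which is the property the optimized reflective wells are credited with here.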

  15. Satellite Power System (SPS): an Overview of Prospective Organizational Structures in the Solar Satellite Field

    NASA Technical Reports Server (NTRS)

    Edler, H. G.

    1978-01-01

    This report presents a literature survey; interviews with acknowledged experts in the fields of organizational entities, space, solar energy, and the SPS concept; and an analysis of these inputs to identify organizational alternatives and judge their feasibility as patterns for a future SPS entity. Selection and evaluation criteria included: timeliness, reliability, and adequacy to contribute meaningfully to the U.S. energy supply; political feasibility (both national and international); and cost-effectiveness (including environmental and other external costs). Based on these criteria, four organizational alternatives are discussed which offer reasonable promise as potential options for SPS: three domestic alternatives and one international alternative.

  16. Gathering opinion leader data for a tailored implementation intervention in secondary healthcare: a randomised trial.

    PubMed

    Farley, Katherine; Hanbury, Andria; Thompson, Carl

    2014-03-10

    Health professionals' behaviour is a key component of compliance with evidence-based recommendations. Opinion leaders are often used to influence such behaviours in implementation studies, but identifying them reliably and cost-effectively is not straightforward. Survey- and questionnaire-based data collection methods have potential, and carefully chosen items can, in theory, both aid identification of opinion leaders and help in the design of an implementation strategy itself. This study compares two methods of identifying opinion leaders for behaviour-change interventions. Healthcare professionals working in a single UK mental health NHS Foundation Trust were randomly allocated to one of two questionnaires. The first, slightly longer questionnaire asked for multiple nominations of opinion leaders, with specific information about the nature of the relationship with each nominee. The second, shorter version asked simply for a list of named "champions" with no additional information. We used chi-square statistics to compare the questionnaire response rates and the number of health professionals likely to be influenced by the opinion leaders (i.e. the "coverage" rates) for both questionnaire conditions. Both questionnaire versions had low response rates: only 15% of health professionals named colleagues in the longer questionnaire and 13% in the shorter version. The opinion leaders identified by both methods had a low number of contacts (range of coverage, 2-6 each). There were no significant differences in response rates or coverage between the two identification methods. The low response and population coverage rates for both questionnaire versions suggest that alternative methods of identifying opinion leaders for implementation studies may be more effective. Future research should seek to identify and evaluate alternative, non-questionnaire-based methods of identifying opinion leaders in order to maximise their potential in organisational behaviour-change interventions.
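
    The chi-square comparison of response rates described above can be sketched in pure Python. The counts below are invented for illustration (roughly matching the reported 15% vs 13% response rates) and are not the study's data.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (observed - expected) ** 2 / expected
    return chi2

# Rows: questionnaire version; columns: [responded, did not respond].
table = [[30, 170],   # longer version: 15% of 200 respond
         [26, 174]]   # shorter version: 13% of 200 respond
print(round(chi_square_2x2(table), 3))  # 0.332
```

    With 1 degree of freedom the 5% critical value is 3.84, so a statistic this small is consistent with the study's finding of no significant difference between the two questionnaire versions.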

  17. Thermo-Mechanical Analysis for John Deere Electronics Solutions | Advanced

    Science.gov Websites

    Impacts of alternative manufacturing processes; die, package, and interface material analysis for power module reliability; manufacturing process impacts versus thermal cycling impacts on power module

  18. Air Starters for Transit Buses

    DOT National Transportation Integrated Search

    1983-05-01

    This study was conducted to familiarize transit agencies with the potential benefits gained by utilizing air starting systems as an alternative to electrical starting systems. The potential benefits include improved starting reliability under hot and...

  19. Meta-analytic guidelines for evaluating single-item reliabilities of personality instruments.

    PubMed

    Spörrle, Matthias; Bekk, Magdalena

    2014-06-01

    Personality is an important predictor of various outcomes in many social science disciplines. However, when personality traits are not the principal focus of research, for example, in global comparative surveys, it is often not possible to assess them extensively. In this article, we first provide an overview of the advantages and challenges of single-item measures of personality, a rationale for their construction, and a summary of alternative ways of assessing their reliability. Second, using seven diverse samples (Ntotal = 4,263), we develop the SIMP-G, the German adaptation of the Single-Item Measures of Personality, an instrument assessing the Big Five with one item per trait, and evaluate its validity and reliability. Third, we integrate previous research and our data into a first meta-analysis of single-item reliabilities of personality measures, and provide researchers with guidelines and recommendations for the evaluation of single-item reliabilities.

  20. Probability techniques for reliability analysis of composite materials

    NASA Technical Reports Server (NTRS)

    Wetherhold, Robert C.; Ucci, Anthony M.

    1994-01-01

    Traditional design approaches for composite materials have employed deterministic criteria for failure analysis. New approaches are required to predict the reliability of composite structures, since strengths and stresses may be random variables. This report examines and compares methods used to evaluate the reliability of composite laminae: fast probability integration (FPI) methods and Monte Carlo methods. In these methods, reliability is formulated as the probability that an explicit function of random variables is less than a given constant. Using failure criteria developed for composite materials, a function of the design variables can be generated which defines a 'failure surface' in probability space. A number of methods are available to evaluate the integration over the probability space bounded by this surface; this integration delivers the required reliability. The methods evaluated are: first-order, second-moment FPI; second-order, second-moment FPI; simple Monte Carlo; and an advanced Monte Carlo technique which utilizes importance sampling. The methods are compared for accuracy, efficiency, and the conservatism of the reliability estimate. The methodology for determining the sensitivity of the reliability estimate to the design variables (strength distributions) and importance factors is also presented.
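
    As a minimal sketch of the two approaches compared above, the following assumes a single limit-state function g = strength - stress with hypothetical independent normal distributions, and contrasts a first-order, second-moment estimate of the failure probability with a simple Monte Carlo estimate. The distribution parameters are invented for illustration.

```python
import math
import random

# Hypothetical normal distributions (mean, stdev) in MPa:
STRENGTH = (600.0, 40.0)
STRESS = (450.0, 50.0)

def fosm_failure_probability():
    """First-order, second-moment estimate: for g = strength - stress with
    independent normals, beta = mu_g / sigma_g and Pf = Phi(-beta)."""
    mu_g = STRENGTH[0] - STRESS[0]
    sigma_g = math.hypot(STRENGTH[1], STRESS[1])
    beta = mu_g / sigma_g
    return 0.5 * math.erfc(beta / math.sqrt(2))  # standard normal CDF at -beta

def mc_failure_probability(n_samples=100_000, seed=42):
    """Simple Monte Carlo estimate of Pf = P(g < 0) by direct sampling."""
    rng = random.Random(seed)
    failures = sum(
        1
        for _ in range(n_samples)
        if rng.gauss(*STRENGTH) - rng.gauss(*STRESS) < 0.0
    )
    return failures / n_samples

print(f"FOSM Pf ~ {fosm_failure_probability():.4f}")
print(f"MC   Pf ~ {mc_failure_probability():.4f}")
```

    Because g is linear in normal variables here, the FOSM result is exact and the two estimates agree; for nonlinear failure surfaces they diverge, which is what motivates the accuracy and efficiency comparison in the report. Importance sampling, the advanced variant mentioned above, concentrates samples near the failure surface to reduce the variance of the Monte Carlo estimate.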

Top